Compare commits


No commits in common. "main" and "0.3.2" have entirely different histories.
main ... 0.3.2

10 changed files with 231 additions and 377 deletions


@@ -1,153 +0,0 @@
# Changelog
## [Unreleased]
- Your change here.
[Unreleased]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.4.0...HEAD
## [0.4.0] - 2025-09-10
### Fixed
- Fixed building with Xcode 26 RC
[0.4.0]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.9...0.4.0
## [0.3.9] - 2025-05-25
### Fixed
- Fixed crash on iOS 17 by using a new task instead of assumeIsolated
[0.3.9]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.8...0.3.9
## [0.3.8] - 2025-04-04
### Fixed
- Fixed crash when cancelled while writing samples
- Fixed tests with Swift 6.1 on macOS
- Fixed tests in Xcode 16.4 on macOS 15.5
- Fixed warnings in tests in Xcode 16.3
### Changed
- Stopped relying on specific delay in cancellation test
- Updated readme for 0.3.8
[0.3.8]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.7...0.3.8
## [0.3.7] - 2025-01-19
### Fixed
- Simplified cancellation and fixed memory leak
[0.3.7]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.6...0.3.7
## [0.3.6] - 2025-01-19
### Fixed
- Attempted to fix possible retain cycle
[0.3.6]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.5...0.3.6
## [0.3.5] - 2025-01-19
### Fixed
- Improved cancellation response (potential memory leak issue)
### Removed
- Deleted dead code
### Changed
- Extracted BaseTests class for better test organization
[0.3.5]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.4...0.3.5
## [0.3.4] - 2024-11-08
### Fixed
- [#3](https://github.com/samsonjs/SJSAssetExportSession/pull/3): Fixed encoding stalling by interleaving audio and video samples - [@samsonjs](https://github.com/samsonjs).
### Changed
- Updated readme with additional documentation
[0.3.4]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.3...0.3.4
## [0.3.3] - 2024-10-19
### Changed
- Made AudioOutputSettings and VideoOutputSettings properties public
### Fixed
- Made tests work on iOS 18.0 and iOS 18.1
- Fixed progress test
### Removed
- Removed SampleWriter.duration property
[0.3.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.2...0.3.3
## [0.3.2] - 2024-10-19
### Fixed
- Fixed release builds by using makeStream for SampleWriter's progress
### Changed
- Updated example in readme to version 0.3.2
[0.3.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.1...0.3.2
## [0.3.1] - 2024-10-19
### Fixed
- Removed unnecessary Task.yield() to fix intermittent hang
### Changed
- Improved code style and debuggability
- Updated version in readme to 0.3.1
[0.3.1]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3...0.3.1
## [0.3] - 2024-10-18
### Added
- Made audio/video settings Hashable, Sendable, and Codable
### Changed
- Updated readme for version 0.3
- Fixed SwiftPM instructions in readme
[0.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.2...0.3
## [0.2] - 2024-10-04
### Fixed
- [#2](https://github.com/samsonjs/SJSAssetExportSession/pull/2): Fixed spatial audio handling by dropping spatial audio tracks to fix encoding iPhone 16 videos - [@samsonjs](https://github.com/samsonjs).
### Changed
- Code style improvements
- Updated version in readme's SPM example
[0.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.1...0.2
## [0.1] - 2024-09-18
### Added
- Initial release as Swift Package
- Alternative to AVAssetExportSession with custom audio/video settings
- Builder pattern API for AudioOutputSettings and VideoOutputSettings
- Flexible raw dictionary API for maximum control
- Progress reporting via AsyncStream
- Support for iOS 17.0+, macOS 14.0+, and visionOS 1.3+
- Swift 6 strict concurrency support
- Comprehensive test suite with multiple video formats
### Changed
- Converted from Xcode project to Swift package
- Made yielding last progress value more reliable
- Set deployment targets to iOS 17, macOS 14, and visionOS 1.3
### Added
- Support for writing metadata on assets
- Documentation for most public API
- README and license files
[0.1]: https://github.com/samsonjs/SJSAssetExportSession/releases/tag/0.1
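
The 0.1 entry above mentions progress reporting via AsyncStream. A minimal, self-contained sketch of how such a stream can be produced and consumed follows; the names here are illustrative, not the library's public API:

```swift
import Foundation

// Illustrative sketch of AsyncStream-based progress reporting (these names are
// not the library's public API). The producer yields Float values and finishes;
// the consumer iterates until the stream ends.
let (progressStream, progressContinuation) = AsyncStream<Float>.makeStream()

let consumer = Task {
    var values: [Float] = []
    for await progress in progressStream {
        values.append(progress)
    }
    return values
}

// The exporter side yields 0.0 up front, intermediate values as video samples
// are written, and 1.0 at the end before finishing the stream.
progressContinuation.yield(0.0)
progressContinuation.yield(0.5)
progressContinuation.yield(1.0)
progressContinuation.finish()
```

Because the default buffering policy is unbounded, values yielded before the consumer starts iterating are not lost, which is what makes yielding the last progress value reliable.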


@@ -24,7 +24,6 @@ let package = Package(
                 .process("Resources/test-no-audio.mp4"),
                 .process("Resources/test-no-video.m4a"),
                 .process("Resources/test-spatial-audio.mov"),
-                .process("Resources/test-x264-1080p-h264-60fps.mp4"),
             ]
         ),
     ]


@@ -1,9 +1,5 @@
 # SJSAssetExportSession

-[![0 dependencies!](https://0dependencies.dev/0dependencies.svg)](https://0dependencies.dev)
-[![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fsamsonjs%2FSJSAssetExportSession%2Fbadge%3Ftype%3Dswift-versions)](https://swiftpackageindex.com/samsonjs/SJSAssetExportSession)
-[![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fsamsonjs%2FSJSAssetExportSession%2Fbadge%3Ftype%3Dplatforms)](https://swiftpackageindex.com/samsonjs/SJSAssetExportSession)
-
 ## Overview

 `SJSAssetExportSession` is an alternative to [`AVAssetExportSession`][AV] that lets you provide custom audio and video settings, without dropping down into the world of `AVAssetReader` and `AVAssetWriter`. It has similar capabilities to [SDAVAssetExportSession][SDAV] but the API is completely different, the code is written in Swift, and it's ready for the world of strict concurrency.
@@ -34,7 +30,7 @@ When you're integrating this into an app with Xcode then go to your project's Pa
 When you're integrating this using SPM on its own then add this to the list of dependencies in your Package.swift file:

 ```swift
-.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0"))
+.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.2"))
 ```

 and then add `"SJSAssetExportSession"` to the list of dependencies in your target as well.
@@ -198,6 +194,6 @@ try await exporter.export(
 ## License

-Copyright © 2024-2025 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
+Copyright © 2024 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].

 [MIT]: https://sjs.mit-license.org
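
For context, the one-line dependency change above sits inside a full manifest. A minimal sketch of how it slots in, where the package name, platforms, and target name are hypothetical:

```swift
// swift-tools-version: 5.10
import PackageDescription

let package = Package(
    name: "MyVideoApp", // hypothetical package name
    platforms: [.iOS(.v17), .macOS(.v14)],
    dependencies: [
        // The dependency line from the README, pinned to 0.3.2 on this side of the diff.
        .package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.2")),
    ],
    targets: [
        .target(
            name: "MyVideoApp",
            dependencies: ["SJSAssetExportSession"]
        ),
    ]
)
```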


@@ -26,9 +26,9 @@ public struct AudioOutputSettings: Hashable, Sendable, Codable {
         }
     }

-    public let format: AudioFormatID
-    public let channels: Int
-    public let sampleRate: Int?
+    let format: AudioFormatID
+    let channels: Int
+    let sampleRate: Int?

     /// Specifies the AAC format with 2 channels at a 44.1 KHz sample rate.
     public static var `default`: AudioOutputSettings {


@@ -37,16 +37,18 @@ actor SampleWriter {
     private let audioMix: AVAudioMix?
     private let videoOutputSettings: [String: any Sendable]
     private let videoComposition: AVVideoComposition?
+    private let duration: CMTime
     private let timeRange: CMTimeRange

     // MARK: Internal state

-    private var reader: AVAssetReader?
-    private var writer: AVAssetWriter?
+    private let reader: AVAssetReader
+    private let writer: AVAssetWriter
     private var audioOutput: AVAssetReaderAudioMixOutput?
     private var audioInput: AVAssetWriterInput?
     private var videoOutput: AVAssetReaderVideoCompositionOutput?
     private var videoInput: AVAssetWriterInput?
+    private var isCancelled = false

     nonisolated init(
         asset: sending AVAsset,
@@ -64,26 +66,18 @@
         (progressStream, progressContinuation) = AsyncStream<Float>.makeStream()

+        let duration = if let timeRange {
+            timeRange.duration
+        } else {
+            try await asset.load(.duration)
+        }
+
         let reader = try AVAssetReader(asset: asset)
         if let timeRange {
             reader.timeRange = timeRange
         }
-        self.reader = reader

         let writer = try AVAssetWriter(outputURL: outputURL, fileType: fileType)
         writer.shouldOptimizeForNetworkUse = optimizeForNetworkUse
         writer.metadata = metadata
-        self.writer = writer
-
-        self.audioOutputSettings = audioOutputSettings
-        self.audioMix = audioMix
-        self.videoOutputSettings = videoOutputSettings
-        self.videoComposition = videoComposition
-        self.timeRange = if let timeRange {
-            timeRange
-        } else {
-            try await CMTimeRange(start: .zero, duration: asset.load(.duration))
-        }

         // Filter out disabled tracks to avoid problems encoding spatial audio. Ideally this would
         // preserve track groups and make that all configurable.
@@ -92,24 +86,7 @@
         // Audio is optional so only validate output settings when it's applicable.
         if !audioTracks.isEmpty {
             try Self.validateAudio(outputSettings: audioOutputSettings, writer: writer)
-            let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
-            audioOutput.alwaysCopiesSampleData = false
-            audioOutput.audioMix = audioMix
-            guard reader.canAdd(audioOutput) else {
-                throw Error.setupFailure(.cannotAddAudioOutput)
-            }
-            reader.add(audioOutput)
-            self.audioOutput = audioOutput
-
-            let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
-            audioInput.expectsMediaDataInRealTime = false
-            guard writer.canAdd(audioInput) else {
-                throw Error.setupFailure(.cannotAddAudioInput)
-            }
-            writer.add(audioInput)
-            self.audioInput = audioInput
         }

         let videoTracks = try await asset.loadTracks(withMediaType: .video)
             .filterAsync { try await $0.load(.isEnabled) }
         guard !videoTracks.isEmpty else { throw Error.setupFailure(.videoTracksEmpty) }
@@ -118,61 +95,39 @@
             renderSize: videoComposition.renderSize,
             settings: videoOutputSettings
         )
-        let videoOutput = AVAssetReaderVideoCompositionOutput(
-            videoTracks: videoTracks,
-            videoSettings: nil
-        )
-        videoOutput.alwaysCopiesSampleData = false
-        videoOutput.videoComposition = videoComposition
-        guard reader.canAdd(videoOutput) else {
-            throw Error.setupFailure(.cannotAddVideoOutput)
-        }
-        reader.add(videoOutput)
-        self.videoOutput = videoOutput
-
-        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
-        videoInput.expectsMediaDataInRealTime = false
-        guard writer.canAdd(videoInput) else {
-            throw Error.setupFailure(.cannotAddVideoInput)
-        }
-        writer.add(videoInput)
-        self.videoInput = videoInput
+        self.audioOutputSettings = audioOutputSettings
+        self.audioMix = audioMix
+        self.videoOutputSettings = videoOutputSettings
+        self.videoComposition = videoComposition
+        self.reader = reader
+        self.writer = writer
+        self.duration = duration
+        self.timeRange = timeRange ?? CMTimeRange(start: .zero, duration: duration)
+
+        try await setUpAudio(audioTracks: audioTracks)
+        try await setUpVideo(videoTracks: videoTracks)
     }

     func writeSamples() async throws {
-        guard let reader, let writer else { throw CancellationError() }
         try Task.checkCancellation()

-        // Clear all of these properties otherwise when we get cancelled then we leak a bunch of
-        // pixel buffers.
-        defer {
-            if Task.isCancelled {
-                reader.cancelReading()
-                writer.cancelWriting()
-            }
-            self.reader = nil
-            self.writer = nil
-            audioInput = nil
-            audioOutput = nil
-            videoInput = nil
-            videoOutput = nil
-        }

         progressContinuation.yield(0.0)
         writer.startWriting()
         writer.startSession(atSourceTime: timeRange.start)
         reader.startReading()
         try Task.checkCancellation()
-        startEncodingAudioTracks()
-        startEncodingVideoTracks()
-        while reader.status == .reading, writer.status == .writing {
-            try await Task.sleep(for: .milliseconds(10))
-            guard reader.status != .cancelled && writer.status != .cancelled else {
-                throw CancellationError()
-            }
-        }
+        await encodeAudioTracks()
+        try Task.checkCancellation()
+        await encodeVideoTracks()
+        try Task.checkCancellation()

         guard writer.status != .failed else {
             reader.cancelReading()
             throw Error.writeFailure(writer.error)
@@ -199,40 +154,132 @@
         }
     }

+    // MARK: - Setup
+
+    private func setUpAudio(audioTracks: [AVAssetTrack]) throws {
+        guard !audioTracks.isEmpty else { return }
+
+        let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
+        audioOutput.alwaysCopiesSampleData = false
+        audioOutput.audioMix = audioMix
+        guard reader.canAdd(audioOutput) else {
+            throw Error.setupFailure(.cannotAddAudioOutput)
+        }
+        reader.add(audioOutput)
+        self.audioOutput = audioOutput
+
+        let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
+        audioInput.expectsMediaDataInRealTime = false
+        guard writer.canAdd(audioInput) else {
+            throw Error.setupFailure(.cannotAddAudioInput)
+        }
+        writer.add(audioInput)
+        self.audioInput = audioInput
+    }
+
+    private func setUpVideo(videoTracks: [AVAssetTrack]) throws {
+        precondition(!videoTracks.isEmpty, "Video tracks must be provided")
+
+        let videoOutput = AVAssetReaderVideoCompositionOutput(
+            videoTracks: videoTracks,
+            videoSettings: nil
+        )
+        videoOutput.alwaysCopiesSampleData = false
+        videoOutput.videoComposition = videoComposition
+        guard reader.canAdd(videoOutput) else {
+            throw Error.setupFailure(.cannotAddVideoOutput)
+        }
+        reader.add(videoOutput)
+        self.videoOutput = videoOutput
+
+        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
+        videoInput.expectsMediaDataInRealTime = false
+        guard writer.canAdd(videoInput) else {
+            throw Error.setupFailure(.cannotAddVideoInput)
+        }
+        writer.add(videoInput)
+        self.videoInput = videoInput
+    }
+
+    func cancel() async {
+        isCancelled = true
+    }
+
     // MARK: - Encoding

-    private func startEncodingAudioTracks() {
+    private func encodeAudioTracks() async {
         // Don't do anything when we have no audio to encode.
-        guard let audioInput, audioOutput != nil else {
+        guard audioInput != nil, audioOutput != nil else {
             return
         }
-        audioInput.requestMediaDataWhenReady(on: queue) {
-            Task { await self.writeAllReadySamples() }
-        }
-    }
+        await withTaskCancellationHandler {
+            await withCheckedContinuation { continuation in
+                self.audioInput!.requestMediaDataWhenReady(on: queue) {
+                    self.assumeIsolated { _self in
+                        guard !_self.isCancelled else {
+                            log.debug("Cancelled while encoding audio")
+                            _self.reader.cancelReading()
+                            _self.writer.cancelWriting()
+                            continuation.resume()
+                            return
+                        }
+                        let hasMoreSamples = _self.writeReadySamples(
+                            output: _self.audioOutput!,
+                            input: _self.audioInput!
+                        )
+                        if !hasMoreSamples {
+                            log.debug("Finished encoding audio")
+                            continuation.resume()
+                        }
+                    }
+                }
+            }
+        } onCancel: {
+            log.debug("Task cancelled while encoding audio")
+            Task {
+                await self.cancel()
+            }
+        }
+    }

-    private func startEncodingVideoTracks() {
-        videoInput?.requestMediaDataWhenReady(on: queue) {
-            Task { await self.writeAllReadySamples() }
-        }
-    }
-
-    private func writeAllReadySamples() {
-        if let audioInput, let audioOutput {
-            let hasMoreAudio = writeReadySamples(output: audioOutput, input: audioInput)
-            if !hasMoreAudio { log.debug("Finished encoding audio") }
-        }
-
-        if let videoInput, let videoOutput {
-            let hasMoreVideo = writeReadySamples(output: videoOutput, input: videoInput)
-            if !hasMoreVideo { log.debug("Finished encoding video") }
-        }
-    }
+    private func encodeVideoTracks() async {
+        await withTaskCancellationHandler {
+            await withCheckedContinuation { continuation in
+                self.videoInput!.requestMediaDataWhenReady(on: queue) {
+                    // NOTE: assumeIsolated crashes on macOS at the moment
+                    self.assumeIsolated { _self in
+                        guard !_self.isCancelled else {
+                            log.debug("Cancelled while encoding video")
+                            _self.reader.cancelReading()
+                            _self.writer.cancelWriting()
+                            continuation.resume()
+                            return
+                        }
+                        let hasMoreSamples = _self.writeReadySamples(
+                            output: _self.videoOutput!,
+                            input: _self.videoInput!
+                        )
+                        if !hasMoreSamples {
+                            log.debug("Finished encoding video")
+                            continuation.resume()
+                        }
+                    }
+                }
+            }
+        } onCancel: {
+            log.debug("Task cancelled while encoding video")
+            Task {
+                await self.cancel()
+            }
+        }
+    }

     private func writeReadySamples(output: AVAssetReaderOutput, input: AVAssetWriterInput) -> Bool {
         while input.isReadyForMoreMediaData {
-            guard reader?.status == .reading && writer?.status == .writing,
+            guard reader.status == .reading && writer.status == .writing,
                   let sampleBuffer = output.copyNextSampleBuffer() else {
                 input.markAsFinished()
                 return false
@@ -240,9 +287,8 @@
             // Only yield progress values for video. Audio is insignificant in comparison.
             if output == videoOutput {
-                let endTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
-                let samplePresentationTime = endTime - timeRange.start
-                let progress = Float(samplePresentationTime.seconds / timeRange.duration.seconds)
+                let samplePresentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer) - timeRange.start
+                let progress = Float(samplePresentationTime.seconds / duration.seconds)
                 progressContinuation.yield(progress)
             }
@@ -291,7 +337,7 @@
         let renderWidth = Int(renderSize.width)
         let renderHeight = Int(renderSize.height)
         if renderWidth != settingsWidth || renderHeight != settingsHeight {
-            log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overridden by video output settings (\(settingsWidth)\(settingsHeight))")
+            log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overriden by video output settings (\(settingsWidth)\(settingsHeight))")
         }
     }
 }
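
The right-hand side of the diff above bridges the callback-driven `requestMediaDataWhenReady(on:using:)` API into async/await with a checked continuation plus a task-cancellation hook. A stripped-down sketch of that control flow, with a hypothetical `requestWork` function standing in for the AVFoundation callback API:

```swift
import Foundation

// Hypothetical stand-in for AVAssetWriterInput.requestMediaDataWhenReady(on:using:),
// which schedules its callback on a dispatch queue.
func requestWork(on queue: DispatchQueue, using block: @escaping @Sendable () -> Void) {
    queue.async(execute: block)
}

// Bridge the callback into async/await: resume a checked continuation exactly once
// when the work finishes, and observe task cancellation via onCancel.
func encode() async -> String {
    let queue = DispatchQueue(label: "encoder")
    return await withTaskCancellationHandler {
        await withCheckedContinuation { continuation in
            requestWork(on: queue) {
                // The real code drains ready samples here and only resumes
                // once there are no more samples, or it was cancelled.
                continuation.resume(returning: "finished")
            }
        }
    } onCancel: {
        // SampleWriter flips an isCancelled flag on the actor here so the
        // in-flight callback can stop the reader and writer before resuming.
    }
}
```

The invariant visible in the diff is that every path through the callback resumes the continuation exactly once; resuming twice or never would trap or hang the export.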


@@ -88,11 +88,11 @@ public struct VideoOutputSettings: Hashable, Sendable, Codable {
         }
     }

-    public let codec: Codec
-    public let size: CGSize
-    public let fps: Int?
-    public let bitrate: Int?
-    public let color: Color?
+    let codec: Codec
+    let size: CGSize
+    let fps: Int?
+    let bitrate: Int?
+    let color: Color?

     public static func codec(_ codec: Codec, size: CGSize) -> VideoOutputSettings {
         .init(codec: codec, size: size, fps: nil, bitrate: nil, color: nil)


@@ -0,0 +1,14 @@
//
// AVAsset+sending.swift
// SJSAssetExportSessionTests
//
// Created by Sami Samhuri on 2024-07-07.
//
import AVFoundation
extension AVAsset {
func sendTracks(withMediaType mediaType: AVMediaType) async throws -> sending [AVAssetTrack] {
try await loadTracks(withMediaType: mediaType)
}
}


@@ -1,50 +0,0 @@
//
// BaseTests.swift
// SJSAssetExportSession
//
// Created by Sami Samhuri on 2025-01-19.
//
import AVFoundation
import Foundation
import Testing
class BaseTests {
func resourceURL(named name: String) -> URL {
Bundle.module.resourceURL!.appending(component: name)
}
func makeAsset(url: URL) -> sending AVAsset {
AVURLAsset(url: url, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
}
func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
let timestamp = Int(Date.now.timeIntervalSince1970)
let f = function.replacing(/[\(\)]/, with: { _ in "" })
let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
let url = URL.temporaryDirectory.appending(component: filename)
return AutoDestructingURL(url: url)
}
func makeVideoComposition(
assetURL: URL,
size: CGSize? = nil,
fps: Int? = nil
) async throws -> sending AVMutableVideoComposition {
let asset = makeAsset(url: assetURL)
let videoComposition = try await AVMutableVideoComposition.videoComposition(
withPropertiesOf: asset
)
if let size {
videoComposition.renderSize = size
}
if let fps {
let seconds = 1.0 / TimeInterval(fps)
videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
}
return videoComposition
}
}


@@ -10,7 +10,45 @@ import CoreLocation
 import SJSAssetExportSession
 import Testing

-final class ExportSessionTests: BaseTests {
+final class ExportSessionTests {
+    private func resourceURL(named name: String) -> URL {
+        Bundle.module.resourceURL!.appending(component: name)
+    }
+
+    private func makeAsset(url: URL) -> sending AVAsset {
+        AVURLAsset(url: url, options: [
+            AVURLAssetPreferPreciseDurationAndTimingKey: true,
+        ])
+    }
+
+    private func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
+        let timestamp = Int(Date.now.timeIntervalSince1970)
+        let f = function.replacing(/[\(\)]/, with: { _ in "" })
+        let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
+        let url = URL.temporaryDirectory.appending(component: filename)
+        return AutoDestructingURL(url: url)
+    }
+
+    private func makeVideoComposition(
+        assetURL: URL,
+        size: CGSize? = nil,
+        fps: Int? = nil
+    ) async throws -> sending AVMutableVideoComposition {
+        let asset = makeAsset(url: assetURL)
+        let videoComposition = try await AVMutableVideoComposition.videoComposition(
+            withPropertiesOf: asset
+        )
+        if let size {
+            videoComposition.renderSize = size
+        }
+        if let fps {
+            let seconds = 1.0 / TimeInterval(fps)
+            videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
+            videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
+        }
+        return videoComposition
+    }
+
     @Test func test_sugary_export_720p_h264_24fps() async throws {
         let sourceURL = resourceURL(named: "test-4k-hdr-hevc-30fps.mov")
         let destinationURL = makeTemporaryURL()
@@ -30,20 +68,19 @@
         let exportedAsset = AVURLAsset(url: destinationURL.url)
         #expect(try await exportedAsset.load(.duration) == .seconds(1))

         // Audio
-        try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
-        let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
+        try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
+        let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
         let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
         #expect(audioFormat.mediaType == .audio)
         #expect(audioFormat.mediaSubType == .mpeg4AAC)
         #expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
         #expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)

         // Video
-        try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
-        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
+        try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
+        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
         #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
         #expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
-        let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((900_000 ... 1_130_000).contains(dataRate))
+        #expect(try await videoTrack.load(.estimatedDataRate) == 1_036_128)
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
         #expect(videoFormat.mediaType == .video)
         #expect(videoFormat.mediaSubType == .h264)
@@ -79,20 +116,19 @@
         let exportedAsset = AVURLAsset(url: destinationURL.url)
         #expect(try await exportedAsset.load(.duration) == .seconds(1))

         // Audio
-        try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
-        let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
+        try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
+        let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
         let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
         #expect(audioFormat.mediaType == .audio)
         #expect(audioFormat.mediaSubType == .mpeg4AAC)
         #expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
         #expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)

         // Video
-        try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
-        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
+        try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
+        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
         #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
         #expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
-        let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((900_000 ... 1_130_000).contains(dataRate))
+        #expect(try await videoTrack.load(.estimatedDataRate) == 1_036_128)
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
         #expect(videoFormat.mediaType == .video)
         #expect(videoFormat.mediaSubType == .h264)
@@ -101,7 +137,7 @@
         #expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
     }

-    @Test func test_export_default_time_range() async throws {
+    @Test func test_export_default_timerange() async throws {
         let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
         let destinationURL = makeTemporaryURL()
@@ -154,36 +190,6 @@
         #expect(try await exportedTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
     }

-    @Test func test_export_x264_60fps() async throws {
-        let sourceURL = resourceURL(named: "test-x264-1080p-h264-60fps.mp4")
-        let destinationURL = makeTemporaryURL()
-        let subject = ExportSession()
-        try await subject.export(
-            asset: makeAsset(url: sourceURL),
-            video: .codec(.h264, width: 1920, height: 1080)
-                .bitrate(2_500_000)
-                .fps(30),
-            to: destinationURL.url,
-            as: .mp4
-        )
-
-        let exportedAsset = AVURLAsset(url: destinationURL.url)
-        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
-        let naturalSize = try await videoTrack.load(.naturalSize)
-        #expect(naturalSize == CGSize(width: 1920, height: 1080))
-        let fps = try await videoTrack.load(.nominalFrameRate)
-        #expect(Int(fps.rounded()) == 30)
-        let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((2_400_000 ... 2_700_000).contains(dataRate))
-        let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
-        #expect(videoFormat.mediaType == .video)
-        #expect(videoFormat.mediaSubType == .h264)
-        #expect(videoFormat.extensions[.colorPrimaries] == .colorPrimaries(.itu_R_709_2))
-        #expect(videoFormat.extensions[.transferFunction] == .transferFunction(.itu_R_709_2))
-        #expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
-    }
-
     @Test func test_export_progress() async throws {
         let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
         let progressValues = SendableWrapper<[Float]>([])
@@ -201,8 +207,6 @@
             as: .mov
         )

-        // Wait for last progress value to be yielded.
-        try await Task.sleep(for: .milliseconds(10))
-
         #expect(progressValues.value.count > 2, "There should be intermediate progress updates")
         #expect(progressValues.value.first == 0.0)
         #expect(progressValues.value.last == 1.0)
@@ -226,17 +230,17 @@
     @Test func test_export_throws_with_empty_audio_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsEmpty)) {
-            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
-            let videoComposition = try await self.makeVideoComposition(assetURL: sourceURL)
+            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
+            let videoComposition = try await makeVideoComposition(assetURL: sourceURL)
             let subject = ExportSession()
             try await subject.export(
-                asset: self.makeAsset(url: sourceURL),
+                asset: makeAsset(url: sourceURL),
                 audioOutputSettings: [:], // Here it matters because there's an audio track
                 videoOutputSettings: VideoOutputSettings
                     .codec(.h264, size: videoComposition.renderSize).settingsDictionary,
                 composition: videoComposition,
-                to: self.makeTemporaryURL().url,
+                to: makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -244,18 +248,18 @@ final class ExportSessionTests: BaseTests {
    @Test func test_export_throws_with_invalid_audio_settings() async throws {
        try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsInvalid)) {
-           let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
+           let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
            let subject = ExportSession()
            try await subject.export(
-               asset: self.makeAsset(url: sourceURL),
+               asset: makeAsset(url: sourceURL),
                audioOutputSettings: [
                    AVFormatIDKey: kAudioFormatMPEG4AAC,
                    AVNumberOfChannelsKey: NSNumber(value: -1), // invalid number of channels
                ],
                videoOutputSettings: VideoOutputSettings
                    .codec(.h264, size: CGSize(width: 1280, height: 720)).settingsDictionary,
-               to: self.makeTemporaryURL().url,
+               to: makeTemporaryURL().url,
                as: .mov
            )
        }
@@ -263,12 +267,12 @@ final class ExportSessionTests: BaseTests {
    @Test func test_export_throws_with_invalid_video_settings() async throws {
        try await #require(throws: ExportSession.Error.setupFailure(.videoSettingsInvalid)) {
-           let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
+           let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
            let size = CGSize(width: 1280, height: 720)
            let subject = ExportSession()
            try await subject.export(
-               asset: self.makeAsset(url: sourceURL),
+               asset: makeAsset(url: sourceURL),
                audioOutputSettings: AudioOutputSettings.default.settingsDictionary,
                videoOutputSettings: [
                    // missing codec
@@ -276,7 +280,7 @@ final class ExportSessionTests: BaseTests {
                    AVVideoHeightKey: NSNumber(value: Int(size.height)),
                ],
                composition: nil,
-               to: self.makeTemporaryURL().url,
+               to: makeTemporaryURL().url,
                as: .mov
            )
        }
@@ -284,12 +288,12 @@ final class ExportSessionTests: BaseTests {
    @Test func test_export_throws_with_no_video_track() async throws {
        try await #require(throws: ExportSession.Error.setupFailure(.videoTracksEmpty)) {
-           let sourceURL = self.resourceURL(named: "test-no-video.m4a")
+           let sourceURL = resourceURL(named: "test-no-video.m4a")
            let subject = ExportSession()
            try await subject.export(
-               asset: self.makeAsset(url: sourceURL),
+               asset: makeAsset(url: sourceURL),
                video: .codec(.h264, width: 1280, height: 720),
-               to: self.makeTemporaryURL().url,
+               to: makeTemporaryURL().url,
                as: .mov
            )
        }
@@ -298,11 +302,11 @@ final class ExportSessionTests: BaseTests {
    @Test func test_export_cancellation() async throws {
        let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
        let destinationURL💥 = makeTemporaryURL()
-       let subject = ExportSession()
        let task = Task {
            let sourceAsset = AVURLAsset(url: sourceURL, options: [
                AVURLAssetPreferPreciseDurationAndTimingKey: true,
            ])
+           let subject = ExportSession()
            try await subject.export(
                asset: sourceAsset,
                video: .codec(.h264, width: 1280, height: 720),
@@ -311,10 +315,8 @@ final class ExportSessionTests: BaseTests {
            )
            Issue.record("Task should be cancelled long before we get here")
        }
-       NSLog("Waiting for encoding to begin...")
-       for await progress in subject.progressStream where progress > 0 {
-           break
-       }
+       NSLog("Sleeping for 0.3s")
+       try await Task.sleep(for: .milliseconds(300))
        NSLog("Cancelling task")
        task.cancel()
        try? await task.value // Wait for task to complete
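
Not part of the diff above: a minimal, self-contained sketch of the deterministic-cancellation pattern the `main` side of this test uses. Instead of sleeping for a fixed 300 ms and hoping the export has started, it waits for the first non-zero value on a progress stream before cancelling. `cancelOnceStarted` and the simulated work loop are hypothetical stand-ins, not part of SJSAssetExportSession's API.

```swift
import Foundation

func cancelOnceStarted() async {
    // A stream the worker reports progress on, analogous to `progressStream`.
    let (progress, continuation) = AsyncStream.makeStream(of: Float.self)

    let task = Task {
        // Stand-in for the long-running export: report progress, honor cancellation.
        for step in 1...100 {
            try Task.checkCancellation()
            continuation.yield(Float(step) / 100)
            try await Task.sleep(for: .milliseconds(10))
        }
        continuation.finish()
    }

    // Wait until the work has demonstrably begun...
    for await value in progress where value > 0 { break }
    // ...then cancel, so the test never races the export's startup.
    task.cancel()
    try? await task.value // Wait for the task to wind down
}
```

Keying the cancellation off observed progress rather than a fixed delay is what lets the changelog entry "Stopped relying on specific delay in cancellation test" hold on slow CI machines.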