Compare commits


19 commits
0.3.5 ... main

Author SHA1 Message Date
9c9e24d2bc
Inline all setup code in SampleWriter initializer
It seems like a bug that only one of these warns: when swapping the order, whichever comes first is the one that warns. Both must be present to trigger a warning at all, and that's the buggy part.

Anyway, since it seems like a legitimate error, inline all the code so things should work fine.
2025-10-30 19:06:45 -07:00
93806c5ed0
Update copyright in Readme 2025-09-10 14:11:20 -07:00
c77c3cacd7
Bump version to 0.4.0 2025-09-10 14:10:39 -07:00
7dee5ab772
Merge pull request #5 from samsonjs/xcode26
Make it work with Xcode 26 RC
2025-09-10 14:08:10 -07:00
1d4e486041
Make it work with Xcode 26 RC 2025-09-10 14:05:25 -07:00
f60f5a9035
Add a changelog 2025-06-08 21:01:33 -07:00
0eefb949e2
Use a new task instead of assumeIsolated to try to fix crash on iOS 17 2025-05-25 18:02:35 -07:00
34c374d914
Fix tests in Xcode 16.4 on macOS 15.5 2025-05-21 13:22:26 -07:00
b627e9bf50
Fix warnings in tests in Xcode 16.3 2025-04-06 23:18:32 -07:00
7b7891ce14
Update readme for 0.3.8 2025-04-04 10:29:11 -07:00
1e768033a1
Fix a crash when cancelled while writing samples
Now we never force-unwrap videoInput or videoOutput. Or anything else
for that matter.
2025-04-04 10:26:09 -07:00
2f1b859a03
Stop relying on a specific delay in cancellation test 2025-04-04 10:12:48 -07:00
10c717ab99
Fix a typo 2025-04-04 09:46:55 -07:00
62a7a375c0
Fix tests with Swift 6.1 on macOS, which finally works! 2025-02-23 09:20:15 -08:00
49d41080bb
Simplify cancellation and fix memory leak 2025-01-19 16:17:50 -08:00
865e524be6
Revert "Try to fix a possible retain cycle"
This reverts commit 2dac7d58dc.
2025-01-19 15:19:21 -08:00
2dac7d58dc
Try to fix a possible retain cycle 2025-01-19 15:01:41 -08:00
f72a073b36
Delete dead code 2025-01-19 14:55:06 -08:00
3be5b7f28e
Extract BaseTests class 2025-01-19 14:54:53 -08:00
6 changed files with 315 additions and 190 deletions

Changelog.md (new file)

@@ -0,0 +1,153 @@
# Changelog
## [Unreleased]
- Your change here.
[Unreleased]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.4.0...HEAD
## [0.4.0] - 2025-09-10
### Fixed
- Fixed building with Xcode 26 RC
[0.4.0]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.9...0.4.0
## [0.3.9] - 2025-05-25
### Fixed
- Fixed crash on iOS 17 by using a new task instead of assumeIsolated
[0.3.9]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.8...0.3.9
## [0.3.8] - 2025-04-04
### Fixed
- Fixed crash when cancelled while writing samples
- Fixed tests with Swift 6.1 on macOS
- Fixed tests in Xcode 16.4 on macOS 15.5
- Fixed warnings in tests in Xcode 16.3
### Changed
- Stopped relying on specific delay in cancellation test
- Updated readme for 0.3.8
[0.3.8]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.7...0.3.8
## [0.3.7] - 2025-01-19
### Fixed
- Simplified cancellation and fixed memory leak
[0.3.7]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.6...0.3.7
## [0.3.6] - 2025-01-19
### Fixed
- Attempted to fix possible retain cycle
[0.3.6]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.5...0.3.6
## [0.3.5] - 2025-01-19
### Fixed
- Improved cancellation response (potential memory leak issue)
### Removed
- Deleted dead code
### Changed
- Extracted BaseTests class for better test organization
[0.3.5]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.4...0.3.5
## [0.3.4] - 2024-11-08
### Fixed
- [#3](https://github.com/samsonjs/SJSAssetExportSession/pull/3): Fixed encoding stalling by interleaving audio and video samples - [@samsonjs](https://github.com/samsonjs).
### Changed
- Updated readme with additional documentation
[0.3.4]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.3...0.3.4
## [0.3.3] - 2024-10-19
### Changed
- Made AudioOutputSettings and VideoOutputSettings properties public
### Fixed
- Made tests work on iOS 18.0 and iOS 18.1
- Fixed progress test
### Removed
- Removed SampleWriter.duration property
[0.3.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.2...0.3.3
## [0.3.2] - 2024-10-19
### Fixed
- Fixed release builds by using makeStream for SampleWriter's progress
### Changed
- Updated example in readme to version 0.3.2
[0.3.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.1...0.3.2
## [0.3.1] - 2024-10-19
### Fixed
- Removed unnecessary Task.yield() to fix intermittent hang
### Changed
- Improved code style and debuggability
- Updated version in readme to 0.3.1
[0.3.1]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3...0.3.1
## [0.3] - 2024-10-18
### Added
- Made audio/video settings Hashable, Sendable, and Codable
### Changed
- Updated readme for version 0.3
- Fixed SwiftPM instructions in readme
[0.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.2...0.3
## [0.2] - 2024-10-04
### Fixed
- [#2](https://github.com/samsonjs/SJSAssetExportSession/pull/2): Fixed spatial audio handling by dropping spatial audio tracks to fix encoding iPhone 16 videos - [@samsonjs](https://github.com/samsonjs).
### Changed
- Code style improvements
- Updated version in readme's SPM example
[0.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.1...0.2
## [0.1] - 2024-09-18
### Added
- Initial release as Swift Package
- Alternative to AVAssetExportSession with custom audio/video settings
- Builder pattern API for AudioOutputSettings and VideoOutputSettings
- Flexible raw dictionary API for maximum control
- Progress reporting via AsyncStream
- Support for iOS 17.0+, macOS 14.0+, and visionOS 1.3+
- Swift 6 strict concurrency support
- Comprehensive test suite with multiple video formats
- Support for writing metadata on assets
- Documentation for most public API
- README and license files
### Changed
- Converted from Xcode project to Swift package
- Made yielding last progress value more reliable
- Set deployment targets to iOS 17, macOS 14, and visionOS 1.3
[0.1]: https://github.com/samsonjs/SJSAssetExportSession/releases/tag/0.1
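The 0.1 feature list above is easier to picture with a short usage sketch. The names below (`ExportSession`, `progressStream`, the `export(asset:video:to:as:)` overload, and the `.codec(.h264, width:height:)` builder) are taken from the test code later in this compare view; treat the exact signatures as illustrative rather than authoritative:

```swift
import AVFoundation
import SJSAssetExportSession

// Hypothetical sketch assembled from the tests in this compare view.
func exportClip(from sourceURL: URL, to destinationURL: URL) async throws {
    let asset = AVURLAsset(url: sourceURL, options: [
        AVURLAssetPreferPreciseDurationAndTimingKey: true,
    ])
    let session = ExportSession()

    // Progress reporting via AsyncStream, as listed under 0.1.
    Task {
        for await progress in session.progressStream {
            print("Progress: \(Int(progress * 100))%")
        }
    }

    // Builder-pattern video settings: codec plus output dimensions.
    try await session.export(
        asset: asset,
        video: .codec(.h264, width: 1280, height: 720),
        to: destinationURL,
        as: .mov
    )
}
```

Cancellation works through structured concurrency: cancelling the surrounding `Task` aborts the export, which is exactly what `test_export_cancellation` below exercises.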

Readme.md

@@ -34,7 +34,7 @@ When you're integrating this into an app with Xcode then go to your project's Pa
 When you're integrating this using SPM on its own then add this to the list of dependencies your Package.swift file:

 ```swift
-.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.5"))
+.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0"))
 ```

 and then add `"SJSAssetExportSession"` to the list of dependencies in your target as well.
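For readers unfamiliar with SPM, the dependency line from the README slots into a manifest roughly like this. This is a minimal sketch: the package name `MyApp` and target layout are illustrative, not from this repository, while the platform floors match the ones listed in the changelog:

```swift
// swift-tools-version: 5.10
// Hypothetical Package.swift showing where the README's dependency line goes.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v17), .macOS(.v14), .visionOS(.v1)],
    dependencies: [
        // The line from the README, bumped to 0.4.0 by this compare:
        .package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0")),
    ],
    targets: [
        // The product name goes in the target's dependencies as well.
        .target(name: "MyApp", dependencies: ["SJSAssetExportSession"]),
    ]
)
```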
@@ -198,6 +198,6 @@ try await exporter.export(
 ## License

-Copyright © 2024 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
+Copyright © 2024-2025 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].

 [MIT]: https://sjs.mit-license.org

SampleWriter.swift

@@ -41,13 +41,12 @@ actor SampleWriter {
     // MARK: Internal state

-    private let reader: AVAssetReader
-    private let writer: AVAssetWriter
+    private var reader: AVAssetReader?
+    private var writer: AVAssetWriter?
     private var audioOutput: AVAssetReaderAudioMixOutput?
     private var audioInput: AVAssetWriterInput?
     private var videoOutput: AVAssetReaderVideoCompositionOutput?
     private var videoInput: AVAssetWriterInput?
-    private var isCancelled = false

     nonisolated init(
         asset: sending AVAsset,
@@ -69,9 +68,22 @@ actor SampleWriter {
         if let timeRange {
             reader.timeRange = timeRange
         }
+        self.reader = reader

         let writer = try AVAssetWriter(outputURL: outputURL, fileType: fileType)
         writer.shouldOptimizeForNetworkUse = optimizeForNetworkUse
         writer.metadata = metadata
+        self.writer = writer
+
+        self.audioOutputSettings = audioOutputSettings
+        self.audioMix = audioMix
+        self.videoOutputSettings = videoOutputSettings
+        self.videoComposition = videoComposition
+        self.timeRange = if let timeRange {
+            timeRange
+        } else {
+            try await CMTimeRange(start: .zero, duration: asset.load(.duration))
+        }

         // Filter out disabled tracks to avoid problems encoding spatial audio. Ideally this would
         // preserve track groups and make that all configurable.
@@ -80,7 +92,24 @@ actor SampleWriter {
         // Audio is optional so only validate output settings when it's applicable.
         if !audioTracks.isEmpty {
             try Self.validateAudio(outputSettings: audioOutputSettings, writer: writer)
+
+            let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
+            audioOutput.alwaysCopiesSampleData = false
+            audioOutput.audioMix = audioMix
+            guard reader.canAdd(audioOutput) else {
+                throw Error.setupFailure(.cannotAddAudioOutput)
+            }
+            reader.add(audioOutput)
+            self.audioOutput = audioOutput
+
+            let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
+            audioInput.expectsMediaDataInRealTime = false
+            guard writer.canAdd(audioInput) else {
+                throw Error.setupFailure(.cannotAddAudioInput)
+            }
+            writer.add(audioInput)
+            self.audioInput = audioInput
         }

         let videoTracks = try await asset.loadTracks(withMediaType: .video)
             .filterAsync { try await $0.load(.isEnabled) }
         guard !videoTracks.isEmpty else { throw Error.setupFailure(.videoTracksEmpty) }
@@ -89,52 +118,61 @@ actor SampleWriter {
             renderSize: videoComposition.renderSize,
             settings: videoOutputSettings
         )

-        self.audioOutputSettings = audioOutputSettings
-        self.audioMix = audioMix
-        self.videoOutputSettings = videoOutputSettings
-        self.videoComposition = videoComposition
-        self.reader = reader
-        self.writer = writer
-        self.timeRange = if let timeRange {
-            timeRange
-        } else {
-            try await CMTimeRange(start: .zero, duration: asset.load(.duration))
-        }
-
-        try await setUpAudio(audioTracks: audioTracks)
-        try await setUpVideo(videoTracks: videoTracks)
+        let videoOutput = AVAssetReaderVideoCompositionOutput(
+            videoTracks: videoTracks,
+            videoSettings: nil
+        )
+        videoOutput.alwaysCopiesSampleData = false
+        videoOutput.videoComposition = videoComposition
+        guard reader.canAdd(videoOutput) else {
+            throw Error.setupFailure(.cannotAddVideoOutput)
+        }
+        reader.add(videoOutput)
+        self.videoOutput = videoOutput
+
+        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
+        videoInput.expectsMediaDataInRealTime = false
+        guard writer.canAdd(videoInput) else {
+            throw Error.setupFailure(.cannotAddVideoInput)
+        }
+        writer.add(videoInput)
+        self.videoInput = videoInput
     }

     func writeSamples() async throws {
+        guard let reader, let writer else { throw CancellationError() }
         try Task.checkCancellation()

+        // Clear all of these properties otherwise when we get cancelled then we leak a bunch of
+        // pixel buffers.
+        defer {
+            if Task.isCancelled {
+                reader.cancelReading()
+                writer.cancelWriting()
+            }
+            self.reader = nil
+            self.writer = nil
+            audioInput = nil
+            audioOutput = nil
+            videoInput = nil
+            videoOutput = nil
+        }
+
         progressContinuation.yield(0.0)
         writer.startWriting()
         writer.startSession(atSourceTime: timeRange.start)
         reader.startReading()
         try Task.checkCancellation()
         startEncodingAudioTracks()
         startEncodingVideoTracks()

         while reader.status == .reading, writer.status == .writing {
-            guard !Task.isCancelled else {
-                // Flag so that we stop writing samples
-                isCancelled = true
-                throw CancellationError()
-            }
             try await Task.sleep(for: .milliseconds(10))
         }

-        guard !isCancelled, reader.status != .cancelled, writer.status != .cancelled else {
-            log.debug("Cancelled before writing samples")
-            reader.cancelReading()
-            writer.cancelWriting()
-            throw CancellationError()
-        }
         guard writer.status != .failed else {
             reader.cancelReading()
             throw Error.writeFailure(writer.error)
@@ -161,57 +199,6 @@ actor SampleWriter {
         }
     }

-    // MARK: - Setup
-
-    private func setUpAudio(audioTracks: [AVAssetTrack]) throws {
-        guard !audioTracks.isEmpty else { return }
-
-        let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
-        audioOutput.alwaysCopiesSampleData = false
-        audioOutput.audioMix = audioMix
-        guard reader.canAdd(audioOutput) else {
-            throw Error.setupFailure(.cannotAddAudioOutput)
-        }
-        reader.add(audioOutput)
-        self.audioOutput = audioOutput
-
-        let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
-        audioInput.expectsMediaDataInRealTime = false
-        guard writer.canAdd(audioInput) else {
-            throw Error.setupFailure(.cannotAddAudioInput)
-        }
-        writer.add(audioInput)
-        self.audioInput = audioInput
-    }
-
-    private func setUpVideo(videoTracks: [AVAssetTrack]) throws {
-        precondition(!videoTracks.isEmpty, "Video tracks must be provided")
-
-        let videoOutput = AVAssetReaderVideoCompositionOutput(
-            videoTracks: videoTracks,
-            videoSettings: nil
-        )
-        videoOutput.alwaysCopiesSampleData = false
-        videoOutput.videoComposition = videoComposition
-        guard reader.canAdd(videoOutput) else {
-            throw Error.setupFailure(.cannotAddVideoOutput)
-        }
-        reader.add(videoOutput)
-        self.videoOutput = videoOutput
-
-        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
-        videoInput.expectsMediaDataInRealTime = false
-        guard writer.canAdd(videoInput) else {
-            throw Error.setupFailure(.cannotAddVideoInput)
-        }
-        writer.add(videoInput)
-        self.videoInput = videoInput
-    }
-
-    func cancel() async {
-        isCancelled = true
-    }
-
     // MARK: - Encoding

     private func startEncodingAudioTracks() {
@@ -221,48 +208,31 @@ actor SampleWriter {
         }

         audioInput.requestMediaDataWhenReady(on: queue) {
-            // NOTE: assumeIsolated crashes on macOS at the moment
-            self.assumeIsolated { _self in
-                _self.writeAllReadySamples()
-            }
+            Task { await self.writeAllReadySamples() }
         }
     }

     private func startEncodingVideoTracks() {
-        videoInput!.requestMediaDataWhenReady(on: queue) {
-            // NOTE: assumeIsolated crashes on macOS at the moment
-            self.assumeIsolated { _self in
-                _self.writeAllReadySamples()
-            }
+        videoInput?.requestMediaDataWhenReady(on: queue) {
+            Task { await self.writeAllReadySamples() }
         }
     }

     private func writeAllReadySamples() {
-        guard !isCancelled else {
-            log.debug("Cancelled while writing samples")
-            reader.cancelReading()
-            writer.cancelWriting()
-            return
-        }
         if let audioInput, let audioOutput {
             let hasMoreAudio = writeReadySamples(output: audioOutput, input: audioInput)
             if !hasMoreAudio { log.debug("Finished encoding audio") }
         }
-        let hasMoreVideo = writeReadySamples(output: videoOutput!, input: videoInput!)
-        if !hasMoreVideo { log.debug("Finished encoding video") }
+        if let videoInput, let videoOutput {
+            let hasMoreVideo = writeReadySamples(output: videoOutput, input: videoInput)
+            if !hasMoreVideo { log.debug("Finished encoding video") }
+        }
     }

     private func writeReadySamples(output: AVAssetReaderOutput, input: AVAssetWriterInput) -> Bool {
         while input.isReadyForMoreMediaData {
-            guard !isCancelled else {
-                log.debug("Cancelled while writing samples")
-                reader.cancelReading()
-                writer.cancelWriting()
-                return false
-            }
-            guard reader.status == .reading && writer.status == .writing,
+            guard reader?.status == .reading && writer?.status == .writing,
                   let sampleBuffer = output.copyNextSampleBuffer() else {
                 input.markAsFinished()
                 return false
@@ -321,7 +291,7 @@ actor SampleWriter {
             let renderWidth = Int(renderSize.width)
             let renderHeight = Int(renderSize.height)
             if renderWidth != settingsWidth || renderHeight != settingsHeight {
-                log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overriden by video output settings (\(settingsWidth)\(settingsHeight))")
+                log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overridden by video output settings (\(settingsWidth)\(settingsHeight))")
             }
         }
     }
} }

AVAsset+sending.swift (deleted file)

@@ -1,14 +0,0 @@
//
// AVAsset+sending.swift
// SJSAssetExportSessionTests
//
// Created by Sami Samhuri on 2024-07-07.
//
import AVFoundation
extension AVAsset {
func sendTracks(withMediaType mediaType: AVMediaType) async throws -> sending [AVAssetTrack] {
try await loadTracks(withMediaType: mediaType)
}
}

BaseTests.swift (new file)

@@ -0,0 +1,50 @@
//
// BaseTests.swift
// SJSAssetExportSession
//
// Created by Sami Samhuri on 2025-01-19.
//
import AVFoundation
import Foundation
import Testing
class BaseTests {
func resourceURL(named name: String) -> URL {
Bundle.module.resourceURL!.appending(component: name)
}
func makeAsset(url: URL) -> sending AVAsset {
AVURLAsset(url: url, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
}
func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
let timestamp = Int(Date.now.timeIntervalSince1970)
let f = function.replacing(/[\(\)]/, with: { _ in "" })
let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
let url = URL.temporaryDirectory.appending(component: filename)
return AutoDestructingURL(url: url)
}
func makeVideoComposition(
assetURL: URL,
size: CGSize? = nil,
fps: Int? = nil
) async throws -> sending AVMutableVideoComposition {
let asset = makeAsset(url: assetURL)
let videoComposition = try await AVMutableVideoComposition.videoComposition(
withPropertiesOf: asset
)
if let size {
videoComposition.renderSize = size
}
if let fps {
let seconds = 1.0 / TimeInterval(fps)
videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
}
return videoComposition
}
}

ExportSessionTests.swift

@@ -10,45 +10,7 @@ import CoreLocation
 import SJSAssetExportSession
 import Testing

-final class ExportSessionTests {
-    private func resourceURL(named name: String) -> URL {
-        Bundle.module.resourceURL!.appending(component: name)
-    }
-
-    private func makeAsset(url: URL) -> sending AVAsset {
-        AVURLAsset(url: url, options: [
-            AVURLAssetPreferPreciseDurationAndTimingKey: true,
-        ])
-    }
-
-    private func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
-        let timestamp = Int(Date.now.timeIntervalSince1970)
-        let f = function.replacing(/[\(\)]/, with: { _ in "" })
-        let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
-        let url = URL.temporaryDirectory.appending(component: filename)
-        return AutoDestructingURL(url: url)
-    }
-
-    private func makeVideoComposition(
-        assetURL: URL,
-        size: CGSize? = nil,
-        fps: Int? = nil
-    ) async throws -> sending AVMutableVideoComposition {
-        let asset = makeAsset(url: assetURL)
-        let videoComposition = try await AVMutableVideoComposition.videoComposition(
-            withPropertiesOf: asset
-        )
-        if let size {
-            videoComposition.renderSize = size
-        }
-        if let fps {
-            let seconds = 1.0 / TimeInterval(fps)
-            videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
-            videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
-        }
-        return videoComposition
-    }
+final class ExportSessionTests: BaseTests {

     @Test func test_sugary_export_720p_h264_24fps() async throws {
         let sourceURL = resourceURL(named: "test-4k-hdr-hevc-30fps.mov")
         let destinationURL = makeTemporaryURL()
@@ -68,20 +30,20 @@ final class ExportSessionTests {
         let exportedAsset = AVURLAsset(url: destinationURL.url)
         #expect(try await exportedAsset.load(.duration) == .seconds(1))

         // Audio
-        try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
-        let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
+        try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
+        let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
         let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
         #expect(audioFormat.mediaType == .audio)
         #expect(audioFormat.mediaSubType == .mpeg4AAC)
         #expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
         #expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)

         // Video
-        try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
-        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
+        try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
+        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
         #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
         #expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
         let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((1_000_000 ... 1_100_000).contains(dataRate))
+        #expect((900_000 ... 1_130_000).contains(dataRate))
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
         #expect(videoFormat.mediaType == .video)
         #expect(videoFormat.mediaSubType == .h264)
@@ -117,20 +79,20 @@ final class ExportSessionTests {
         let exportedAsset = AVURLAsset(url: destinationURL.url)
         #expect(try await exportedAsset.load(.duration) == .seconds(1))

         // Audio
-        try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
-        let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
+        try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
+        let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
         let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
         #expect(audioFormat.mediaType == .audio)
         #expect(audioFormat.mediaSubType == .mpeg4AAC)
         #expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
         #expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)

         // Video
-        try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
-        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
+        try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
+        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
         #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
         #expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
         let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((1_000_000 ... 1_100_000).contains(dataRate))
+        #expect((900_000 ... 1_130_000).contains(dataRate))
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
         #expect(videoFormat.mediaType == .video)
         #expect(videoFormat.mediaSubType == .h264)
@@ -139,7 +101,7 @@ final class ExportSessionTests {
         #expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
     }

-    @Test func test_export_default_timerange() async throws {
+    @Test func test_export_default_time_range() async throws {
         let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
         let destinationURL = makeTemporaryURL()
@@ -207,9 +169,11 @@ final class ExportSessionTests {
         )

         let exportedAsset = AVURLAsset(url: destinationURL.url)
-        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
-        #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1920, height: 1080))
-        #expect(try await videoTrack.load(.nominalFrameRate) == 30.0)
+        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
+        let naturalSize = try await videoTrack.load(.naturalSize)
+        #expect(naturalSize == CGSize(width: 1920, height: 1080))
+        let fps = try await videoTrack.load(.nominalFrameRate)
+        #expect(Int(fps.rounded()) == 30)
         let dataRate = try await videoTrack.load(.estimatedDataRate)
         #expect((2_400_000 ... 2_700_000).contains(dataRate))
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
@@ -262,17 +226,17 @@ final class ExportSessionTests {
     @Test func test_export_throws_with_empty_audio_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsEmpty)) {
-            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
-            let videoComposition = try await makeVideoComposition(assetURL: sourceURL)
+            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
+            let videoComposition = try await self.makeVideoComposition(assetURL: sourceURL)
             let subject = ExportSession()
             try await subject.export(
-                asset: makeAsset(url: sourceURL),
+                asset: self.makeAsset(url: sourceURL),
                 audioOutputSettings: [:], // Here it matters because there's an audio track
                 videoOutputSettings: VideoOutputSettings
                     .codec(.h264, size: videoComposition.renderSize).settingsDictionary,
                 composition: videoComposition,
-                to: makeTemporaryURL().url,
+                to: self.makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -280,18 +244,18 @@ final class ExportSessionTests {
     @Test func test_export_throws_with_invalid_audio_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsInvalid)) {
-            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
+            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
             let subject = ExportSession()
             try await subject.export(
-                asset: makeAsset(url: sourceURL),
+                asset: self.makeAsset(url: sourceURL),
                 audioOutputSettings: [
                     AVFormatIDKey: kAudioFormatMPEG4AAC,
                     AVNumberOfChannelsKey: NSNumber(value: -1), // invalid number of channels
                 ],
                 videoOutputSettings: VideoOutputSettings
                     .codec(.h264, size: CGSize(width: 1280, height: 720)).settingsDictionary,
-                to: makeTemporaryURL().url,
+                to: self.makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -299,12 +263,12 @@ final class ExportSessionTests {
     @Test func test_export_throws_with_invalid_video_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.videoSettingsInvalid)) {
-            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
+            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
             let size = CGSize(width: 1280, height: 720)
             let subject = ExportSession()
             try await subject.export(
-                asset: makeAsset(url: sourceURL),
+                asset: self.makeAsset(url: sourceURL),
                 audioOutputSettings: AudioOutputSettings.default.settingsDictionary,
                 videoOutputSettings: [
                     // missing codec
@@ -312,7 +276,7 @@ final class ExportSessionTests {
                     AVVideoHeightKey: NSNumber(value: Int(size.height)),
                 ],
                 composition: nil,
-                to: makeTemporaryURL().url,
+                to: self.makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -320,12 +284,12 @@ final class ExportSessionTests {
     @Test func test_export_throws_with_no_video_track() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.videoTracksEmpty)) {
-            let sourceURL = resourceURL(named: "test-no-video.m4a")
+            let sourceURL = self.resourceURL(named: "test-no-video.m4a")
             let subject = ExportSession()
             try await subject.export(
-                asset: makeAsset(url: sourceURL),
+                asset: self.makeAsset(url: sourceURL),
                 video: .codec(.h264, width: 1280, height: 720),
-                to: makeTemporaryURL().url,
+                to: self.makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -334,11 +298,11 @@ final class ExportSessionTests {
     @Test func test_export_cancellation() async throws {
         let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
         let destinationURL💥 = makeTemporaryURL()
+        let subject = ExportSession()
         let task = Task {
             let sourceAsset = AVURLAsset(url: sourceURL, options: [
                 AVURLAssetPreferPreciseDurationAndTimingKey: true,
             ])
-            let subject = ExportSession()
             try await subject.export(
                 asset: sourceAsset,
                 video: .codec(.h264, width: 1280, height: 720),
@@ -347,8 +311,10 @@ final class ExportSessionTests {
             )
             Issue.record("Task should be cancelled long before we get here")
         }

-        NSLog("Sleeping for 0.3s")
-        try await Task.sleep(for: .milliseconds(300))
+        NSLog("Waiting for encoding to begin...")
+        for await progress in subject.progressStream where progress > 0 {
+            break
+        }
         NSLog("Cancelling task")
         task.cancel()
         try? await task.value // Wait for task to complete