Compare commits


No commits in common. "main" and "0.3.7" have entirely different histories.
main ... 0.3.7

5 changed files with 118 additions and 244 deletions


@@ -1,153 +0,0 @@
# Changelog
## [Unreleased]
- Your change here.
[Unreleased]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.4.0...HEAD
## [0.4.0] - 2025-09-10
### Fixed
- Fixed building with Xcode 26 RC
[0.4.0]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.9...0.4.0
## [0.3.9] - 2025-05-25
### Fixed
- Fixed crash on iOS 17 by using a new task instead of assumeIsolated
[0.3.9]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.8...0.3.9
## [0.3.8] - 2025-04-04
### Fixed
- Fixed crash when cancelled while writing samples
- Fixed tests with Swift 6.1 on macOS
- Fixed tests in Xcode 16.4 on macOS 15.5
- Fixed warnings in tests in Xcode 16.3
### Changed
- Stopped relying on specific delay in cancellation test
- Updated readme for 0.3.8
[0.3.8]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.7...0.3.8
## [0.3.7] - 2025-01-19
### Fixed
- Simplified cancellation and fixed memory leak
[0.3.7]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.6...0.3.7
## [0.3.6] - 2025-01-19
### Fixed
- Attempted to fix possible retain cycle
[0.3.6]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.5...0.3.6
## [0.3.5] - 2025-01-19
### Fixed
- Improved cancellation response (potential memory leak issue)
### Removed
- Deleted dead code
### Changed
- Extracted BaseTests class for better test organization
[0.3.5]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.4...0.3.5
## [0.3.4] - 2024-11-08
### Fixed
- [#3](https://github.com/samsonjs/SJSAssetExportSession/pull/3): Fixed encoding stalling by interleaving audio and video samples - [@samsonjs](https://github.com/samsonjs).
### Changed
- Updated readme with additional documentation
[0.3.4]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.3...0.3.4
## [0.3.3] - 2024-10-19
### Changed
- Made AudioOutputSettings and VideoOutputSettings properties public
### Fixed
- Made tests work on iOS 18.0 and iOS 18.1
- Fixed progress test
### Removed
- Removed SampleWriter.duration property
[0.3.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.2...0.3.3
## [0.3.2] - 2024-10-19
### Fixed
- Fixed release builds by using makeStream for SampleWriter's progress
### Changed
- Updated example in readme to version 0.3.2
[0.3.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.1...0.3.2
## [0.3.1] - 2024-10-19
### Fixed
- Removed unnecessary Task.yield() to fix intermittent hang
### Changed
- Improved code style and debuggability
- Updated version in readme to 0.3.1
[0.3.1]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3...0.3.1
## [0.3] - 2024-10-18
### Added
- Made audio/video settings Hashable, Sendable, and Codable
### Changed
- Updated readme for version 0.3
- Fixed SwiftPM instructions in readme
[0.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.2...0.3
## [0.2] - 2024-10-04
### Fixed
- [#2](https://github.com/samsonjs/SJSAssetExportSession/pull/2): Fixed spatial audio handling by dropping spatial audio tracks to fix encoding iPhone 16 videos - [@samsonjs](https://github.com/samsonjs).
### Changed
- Code style improvements
- Updated version in readme's SPM example
[0.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.1...0.2
## [0.1] - 2024-09-18
### Added
- Initial release as Swift Package
- Alternative to AVAssetExportSession with custom audio/video settings
- Builder pattern API for AudioOutputSettings and VideoOutputSettings
- Flexible raw dictionary API for maximum control
- Progress reporting via AsyncStream
- Support for iOS 17.0+, macOS 14.0+, and visionOS 1.3+
- Swift 6 strict concurrency support
- Comprehensive test suite with multiple video formats
### Changed
- Converted from Xcode project to Swift package
- Made yielding last progress value more reliable
- Set deployment targets to iOS 17, macOS 14, and visionOS 1.3
### Added
- Support for writing metadata on assets
- Documentation for most public API
- README and license files
[0.1]: https://github.com/samsonjs/SJSAssetExportSession/releases/tag/0.1


@@ -34,7 +34,7 @@ When you're integrating this into an app with Xcode then go to your project's Pa
 When you're integrating this using SPM on its own then add this to the list of dependencies in your Package.swift file:
 ```swift
-.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0"))
+.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.7"))
 ```
 and then add `"SJSAssetExportSession"` to the list of dependencies in your target as well.
@@ -198,6 +198,6 @@ try await exporter.export(
 ## License
-Copyright © 2024-2025 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
+Copyright © 2024 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
 [MIT]: https://sjs.mit-license.org
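Taken together, the two readme steps in this hunk (declare the package, then add the product to a target) amount to a manifest like the following minimal sketch. Only the SJSAssetExportSession URL, version, and product name come from the readme; the package name `MyApp` and platform list are illustrative placeholders:

```swift
// swift-tools-version:5.9
// Hypothetical consumer Package.swift; only the SJSAssetExportSession
// entries are taken from the readme above, everything else is illustrative.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v17), .macOS(.v14)],
    dependencies: [
        // Step 1: declare the package dependency.
        .package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.7")),
    ],
    targets: [
        // Step 2: add "SJSAssetExportSession" to the target's dependencies.
        .target(name: "MyApp", dependencies: ["SJSAssetExportSession"]),
    ]
)
```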


@@ -68,22 +68,9 @@ actor SampleWriter {
         if let timeRange {
             reader.timeRange = timeRange
         }
-        self.reader = reader
         let writer = try AVAssetWriter(outputURL: outputURL, fileType: fileType)
         writer.shouldOptimizeForNetworkUse = optimizeForNetworkUse
         writer.metadata = metadata
-        self.writer = writer
-        self.audioOutputSettings = audioOutputSettings
-        self.audioMix = audioMix
-        self.videoOutputSettings = videoOutputSettings
-        self.videoComposition = videoComposition
-        self.timeRange = if let timeRange {
-            timeRange
-        } else {
-            try await CMTimeRange(start: .zero, duration: asset.load(.duration))
-        }
         // Filter out disabled tracks to avoid problems encoding spatial audio. Ideally this would
         // preserve track groups and make that all configurable.
@@ -92,24 +79,7 @@ actor SampleWriter {
         // Audio is optional so only validate output settings when it's applicable.
         if !audioTracks.isEmpty {
             try Self.validateAudio(outputSettings: audioOutputSettings, writer: writer)
-            let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
-            audioOutput.alwaysCopiesSampleData = false
-            audioOutput.audioMix = audioMix
-            guard reader.canAdd(audioOutput) else {
-                throw Error.setupFailure(.cannotAddAudioOutput)
-            }
-            reader.add(audioOutput)
-            self.audioOutput = audioOutput
-            let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
-            audioInput.expectsMediaDataInRealTime = false
-            guard writer.canAdd(audioInput) else {
-                throw Error.setupFailure(.cannotAddAudioInput)
-            }
-            writer.add(audioInput)
-            self.audioInput = audioInput
         }
         let videoTracks = try await asset.loadTracks(withMediaType: .video)
             .filterAsync { try await $0.load(.isEnabled) }
         guard !videoTracks.isEmpty else { throw Error.setupFailure(.videoTracksEmpty) }
@@ -118,25 +88,21 @@ actor SampleWriter {
             renderSize: videoComposition.renderSize,
             settings: videoOutputSettings
         )
-        let videoOutput = AVAssetReaderVideoCompositionOutput(
-            videoTracks: videoTracks,
-            videoSettings: nil
-        )
-        videoOutput.alwaysCopiesSampleData = false
-        videoOutput.videoComposition = videoComposition
-        guard reader.canAdd(videoOutput) else {
-            throw Error.setupFailure(.cannotAddVideoOutput)
-        }
-        reader.add(videoOutput)
-        self.videoOutput = videoOutput
-        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
-        videoInput.expectsMediaDataInRealTime = false
-        guard writer.canAdd(videoInput) else {
-            throw Error.setupFailure(.cannotAddVideoInput)
-        }
-        writer.add(videoInput)
-        self.videoInput = videoInput
+        self.audioOutputSettings = audioOutputSettings
+        self.audioMix = audioMix
+        self.videoOutputSettings = videoOutputSettings
+        self.videoComposition = videoComposition
+        self.reader = reader
+        self.writer = writer
+        self.timeRange = if let timeRange {
+            timeRange
+        } else {
+            try await CMTimeRange(start: .zero, duration: asset.load(.duration))
+        }
+        try await setUpAudio(audioTracks: audioTracks)
+        try await setUpVideo(videoTracks: videoTracks)
     }

     func writeSamples() async throws {
@@ -199,6 +165,53 @@ actor SampleWriter {
         }
     }

+    // MARK: - Setup
+
+    private func setUpAudio(audioTracks: [AVAssetTrack]) throws {
+        guard !audioTracks.isEmpty else { return }
+        let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
+        audioOutput.alwaysCopiesSampleData = false
+        audioOutput.audioMix = audioMix
+        guard let reader, reader.canAdd(audioOutput) else {
+            throw Error.setupFailure(.cannotAddAudioOutput)
+        }
+        reader.add(audioOutput)
+        self.audioOutput = audioOutput
+
+        let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
+        audioInput.expectsMediaDataInRealTime = false
+        guard let writer, writer.canAdd(audioInput) else {
+            throw Error.setupFailure(.cannotAddAudioInput)
+        }
+        writer.add(audioInput)
+        self.audioInput = audioInput
+    }
+
+    private func setUpVideo(videoTracks: [AVAssetTrack]) throws {
+        precondition(!videoTracks.isEmpty, "Video tracks must be provided")
+        let videoOutput = AVAssetReaderVideoCompositionOutput(
+            videoTracks: videoTracks,
+            videoSettings: nil
+        )
+        videoOutput.alwaysCopiesSampleData = false
+        videoOutput.videoComposition = videoComposition
+        guard let reader, reader.canAdd(videoOutput) else {
+            throw Error.setupFailure(.cannotAddVideoOutput)
+        }
+        reader.add(videoOutput)
+        self.videoOutput = videoOutput
+
+        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
+        videoInput.expectsMediaDataInRealTime = false
+        guard let writer, writer.canAdd(videoInput) else {
+            throw Error.setupFailure(.cannotAddVideoInput)
+        }
+        writer.add(videoInput)
+        self.videoInput = videoInput
+    }
+
     // MARK: - Encoding

     private func startEncodingAudioTracks() {
@@ -208,13 +221,19 @@ actor SampleWriter {
         }

         audioInput.requestMediaDataWhenReady(on: queue) {
-            Task { await self.writeAllReadySamples() }
+            // NOTE: assumeIsolated crashes on macOS at the moment
+            self.assumeIsolated { _self in
+                _self.writeAllReadySamples()
+            }
         }
     }

     private func startEncodingVideoTracks() {
-        videoInput?.requestMediaDataWhenReady(on: queue) {
-            Task { await self.writeAllReadySamples() }
+        videoInput!.requestMediaDataWhenReady(on: queue) {
+            // NOTE: assumeIsolated crashes on macOS at the moment
+            self.assumeIsolated { _self in
+                _self.writeAllReadySamples()
+            }
         }
     }
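In this hunk, 0.3.7 (the `+` side) drives the writer callback synchronously through `assumeIsolated`, while main spawns a fresh `Task`; the 0.3.9 changelog entry above notes the `assumeIsolated` approach later crashed on iOS 17. A rough sketch of the two patterns, assuming an actor whose executor is the same serial queue passed to `requestMediaDataWhenReady` (all names here are hypothetical, not the library's actual API):

```swift
import AVFoundation

actor SketchWriter {
    private let queue = DispatchSerialQueue(label: "sketch.writer")

    // Running the actor on this queue is what makes assumeIsolated
    // sound inside the queue's synchronous callback below.
    nonisolated var unownedExecutor: UnownedSerialExecutor {
        queue.asUnownedSerialExecutor()
    }

    private func drain() { /* write all ready samples */ }

    func start(_ input: AVAssetWriterInput) {
        input.requestMediaDataWhenReady(on: queue) {
            // 0.3.7 style: stay synchronous, no executor hop, but only
            // valid because the callback already runs on the actor's queue.
            self.assumeIsolated { _self in
                _self.drain()
            }
            // main's style: always valid, at the cost of an extra hop.
            // Task { await self.drain() }
        }
    }
}
```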
@@ -224,10 +243,8 @@ actor SampleWriter {
             if !hasMoreAudio { log.debug("Finished encoding audio") }
         }
-        if let videoInput, let videoOutput {
-            let hasMoreVideo = writeReadySamples(output: videoOutput, input: videoInput)
-            if !hasMoreVideo { log.debug("Finished encoding video") }
-        }
+        let hasMoreVideo = writeReadySamples(output: videoOutput!, input: videoInput!)
+        if !hasMoreVideo { log.debug("Finished encoding video") }
     }

     private func writeReadySamples(output: AVAssetReaderOutput, input: AVAssetWriterInput) -> Bool {
@@ -291,7 +308,7 @@ actor SampleWriter {
         let renderWidth = Int(renderSize.width)
         let renderHeight = Int(renderSize.height)
         if renderWidth != settingsWidth || renderHeight != settingsHeight {
-            log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overridden by video output settings (\(settingsWidth)\(settingsHeight))")
+            log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overriden by video output settings (\(settingsWidth)\(settingsHeight))")
         }
     }
 }


@@ -0,0 +1,14 @@
+//
+//  AVAsset+sending.swift
+//  SJSAssetExportSessionTests
+//
+//  Created by Sami Samhuri on 2024-07-07.
+//
+
+import AVFoundation
+
+extension AVAsset {
+    func sendTracks(withMediaType mediaType: AVMediaType) async throws -> sending [AVAssetTrack] {
+        try await loadTracks(withMediaType: mediaType)
+    }
+}
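The `sending` return type is the point of this helper: the loaded `[AVAssetTrack]` can cross an isolation boundary in strict-concurrency test code even though `AVAssetTrack` itself is not Sendable. A hypothetical call site, assuming the `sendTracks` extension above is in scope (the function name and `@MainActor` context are illustrative):

```swift
import AVFoundation

// Hypothetical caller: the `sending` result of sendTracks can be
// consumed inside a MainActor-isolated function without Sendable
// warnings under Swift 6 strict concurrency.
@MainActor
func countVideoTracks(in url: URL) async throws -> Int {
    let asset = AVURLAsset(url: url)
    let tracks = try await asset.sendTracks(withMediaType: .video)
    return tracks.count
}
```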


@@ -30,20 +30,20 @@ final class ExportSessionTests: BaseTests {
         let exportedAsset = AVURLAsset(url: destinationURL.url)
         #expect(try await exportedAsset.load(.duration) == .seconds(1))
         // Audio
-        try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
-        let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
+        try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
+        let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
         let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
         #expect(audioFormat.mediaType == .audio)
         #expect(audioFormat.mediaSubType == .mpeg4AAC)
         #expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
         #expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
         // Video
-        try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
-        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
+        try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
+        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
         #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
         #expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
         let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((900_000 ... 1_130_000).contains(dataRate))
+        #expect((1_000_000 ... 1_100_000).contains(dataRate))
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
         #expect(videoFormat.mediaType == .video)
         #expect(videoFormat.mediaSubType == .h264)
@@ -79,20 +79,20 @@ final class ExportSessionTests: BaseTests {
         let exportedAsset = AVURLAsset(url: destinationURL.url)
         #expect(try await exportedAsset.load(.duration) == .seconds(1))
         // Audio
-        try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
-        let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
+        try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
+        let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
         let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
         #expect(audioFormat.mediaType == .audio)
         #expect(audioFormat.mediaSubType == .mpeg4AAC)
         #expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
         #expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
         // Video
-        try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
-        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
+        try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
+        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
         #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
         #expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
         let dataRate = try await videoTrack.load(.estimatedDataRate)
-        #expect((900_000 ... 1_130_000).contains(dataRate))
+        #expect((1_000_000 ... 1_100_000).contains(dataRate))
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
         #expect(videoFormat.mediaType == .video)
         #expect(videoFormat.mediaSubType == .h264)
@@ -101,7 +101,7 @@ final class ExportSessionTests: BaseTests {
         #expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
     }

-    @Test func test_export_default_time_range() async throws {
+    @Test func test_export_default_timerange() async throws {
         let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
         let destinationURL = makeTemporaryURL()
@@ -169,11 +169,9 @@ final class ExportSessionTests: BaseTests {
         )

         let exportedAsset = AVURLAsset(url: destinationURL.url)
-        let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
-        let naturalSize = try await videoTrack.load(.naturalSize)
-        #expect(naturalSize == CGSize(width: 1920, height: 1080))
-        let fps = try await videoTrack.load(.nominalFrameRate)
-        #expect(Int(fps.rounded()) == 30)
+        let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
+        #expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1920, height: 1080))
+        #expect(try await videoTrack.load(.nominalFrameRate) == 30.0)
         let dataRate = try await videoTrack.load(.estimatedDataRate)
         #expect((2_400_000 ... 2_700_000).contains(dataRate))
         let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
@@ -226,17 +224,17 @@ final class ExportSessionTests: BaseTests {
     @Test func test_export_throws_with_empty_audio_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsEmpty)) {
-            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
-            let videoComposition = try await self.makeVideoComposition(assetURL: sourceURL)
+            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
+            let videoComposition = try await makeVideoComposition(assetURL: sourceURL)
             let subject = ExportSession()
             try await subject.export(
-                asset: self.makeAsset(url: sourceURL),
+                asset: makeAsset(url: sourceURL),
                 audioOutputSettings: [:], // Here it matters because there's an audio track
                 videoOutputSettings: VideoOutputSettings
                     .codec(.h264, size: videoComposition.renderSize).settingsDictionary,
                 composition: videoComposition,
-                to: self.makeTemporaryURL().url,
+                to: makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -244,18 +242,18 @@ final class ExportSessionTests: BaseTests {
     @Test func test_export_throws_with_invalid_audio_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsInvalid)) {
-            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
+            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
             let subject = ExportSession()
             try await subject.export(
-                asset: self.makeAsset(url: sourceURL),
+                asset: makeAsset(url: sourceURL),
                 audioOutputSettings: [
                     AVFormatIDKey: kAudioFormatMPEG4AAC,
                     AVNumberOfChannelsKey: NSNumber(value: -1), // invalid number of channels
                 ],
                 videoOutputSettings: VideoOutputSettings
                     .codec(.h264, size: CGSize(width: 1280, height: 720)).settingsDictionary,
-                to: self.makeTemporaryURL().url,
+                to: makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -263,12 +261,12 @@ final class ExportSessionTests: BaseTests {
     @Test func test_export_throws_with_invalid_video_settings() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.videoSettingsInvalid)) {
-            let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
+            let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
             let size = CGSize(width: 1280, height: 720)
             let subject = ExportSession()
             try await subject.export(
-                asset: self.makeAsset(url: sourceURL),
+                asset: makeAsset(url: sourceURL),
                 audioOutputSettings: AudioOutputSettings.default.settingsDictionary,
                 videoOutputSettings: [
                     // missing codec
@@ -276,7 +274,7 @@ final class ExportSessionTests: BaseTests {
                     AVVideoHeightKey: NSNumber(value: Int(size.height)),
                 ],
                 composition: nil,
-                to: self.makeTemporaryURL().url,
+                to: makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -284,12 +282,12 @@ final class ExportSessionTests: BaseTests {
     @Test func test_export_throws_with_no_video_track() async throws {
         try await #require(throws: ExportSession.Error.setupFailure(.videoTracksEmpty)) {
-            let sourceURL = self.resourceURL(named: "test-no-video.m4a")
+            let sourceURL = resourceURL(named: "test-no-video.m4a")
             let subject = ExportSession()
             try await subject.export(
-                asset: self.makeAsset(url: sourceURL),
+                asset: makeAsset(url: sourceURL),
                 video: .codec(.h264, width: 1280, height: 720),
-                to: self.makeTemporaryURL().url,
+                to: makeTemporaryURL().url,
                 as: .mov
             )
         }
@@ -298,11 +296,11 @@ final class ExportSessionTests: BaseTests {
     @Test func test_export_cancellation() async throws {
         let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
         let destinationURL💥 = makeTemporaryURL()
-        let subject = ExportSession()
         let task = Task {
             let sourceAsset = AVURLAsset(url: sourceURL, options: [
                 AVURLAssetPreferPreciseDurationAndTimingKey: true,
             ])
+            let subject = ExportSession()
             try await subject.export(
                 asset: sourceAsset,
                 video: .codec(.h264, width: 1280, height: 720),
@@ -311,10 +309,8 @@ final class ExportSessionTests: BaseTests {
             )
             Issue.record("Task should be cancelled long before we get here")
         }
-        NSLog("Waiting for encoding to begin...")
-        for await progress in subject.progressStream where progress > 0 {
-            break
-        }
+        NSLog("Sleeping for 0.3s")
+        try await Task.sleep(for: .milliseconds(300))
         NSLog("Cancelling task")
         task.cancel()
         try? await task.value // Wait for task to complete