Compare commits


No commits in common. "main" and "0.3.4" have entirely different histories.
main ... 0.3.4

6 changed files with 181 additions and 315 deletions


@ -1,153 +0,0 @@
# Changelog
## [Unreleased]
- Your change here.
[Unreleased]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.4.0...HEAD
## [0.4.0] - 2025-09-10
### Fixed
- Fixed building with Xcode 26 RC
[0.4.0]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.9...0.4.0
## [0.3.9] - 2025-05-25
### Fixed
- Fixed crash on iOS 17 by using a new task instead of assumeIsolated
[0.3.9]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.8...0.3.9
## [0.3.8] - 2025-04-04
### Fixed
- Fixed crash when cancelled while writing samples
- Fixed tests with Swift 6.1 on macOS
- Fixed tests in Xcode 16.4 on macOS 15.5
- Fixed warnings in tests in Xcode 16.3
### Changed
- Stopped relying on specific delay in cancellation test
- Updated readme for 0.3.8
[0.3.8]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.7...0.3.8
## [0.3.7] - 2025-01-19
### Fixed
- Simplified cancellation and fixed memory leak
[0.3.7]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.6...0.3.7
## [0.3.6] - 2025-01-19
### Fixed
- Attempted to fix possible retain cycle
[0.3.6]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.5...0.3.6
## [0.3.5] - 2025-01-19
### Fixed
- Improved cancellation response (potential memory leak issue)
### Removed
- Deleted dead code
### Changed
- Extracted BaseTests class for better test organization
[0.3.5]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.4...0.3.5
## [0.3.4] - 2024-11-08
### Fixed
- [#3](https://github.com/samsonjs/SJSAssetExportSession/pull/3): Fixed encoding stalling by interleaving audio and video samples - [@samsonjs](https://github.com/samsonjs).
### Changed
- Updated readme with additional documentation
[0.3.4]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.3...0.3.4
## [0.3.3] - 2024-10-19
### Changed
- Made AudioOutputSettings and VideoOutputSettings properties public
### Fixed
- Made tests work on iOS 18.0 and iOS 18.1
- Fixed progress test
### Removed
- Removed SampleWriter.duration property
[0.3.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.2...0.3.3
## [0.3.2] - 2024-10-19
### Fixed
- Fixed release builds by using makeStream for SampleWriter's progress
### Changed
- Updated example in readme to version 0.3.2
[0.3.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.1...0.3.2
## [0.3.1] - 2024-10-19
### Fixed
- Removed unnecessary Task.yield() to fix intermittent hang
### Changed
- Improved code style and debuggability
- Updated version in readme to 0.3.1
[0.3.1]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3...0.3.1
## [0.3] - 2024-10-18
### Added
- Made audio/video settings Hashable, Sendable, and Codable
### Changed
- Updated readme for version 0.3
- Fixed SwiftPM instructions in readme
[0.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.2...0.3
## [0.2] - 2024-10-04
### Fixed
- [#2](https://github.com/samsonjs/SJSAssetExportSession/pull/2): Fixed spatial audio handling by dropping spatial audio tracks to fix encoding iPhone 16 videos - [@samsonjs](https://github.com/samsonjs).
### Changed
- Code style improvements
- Updated version in readme's SPM example
[0.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.1...0.2
## [0.1] - 2024-09-18
### Added
- Initial release as Swift Package
- Alternative to AVAssetExportSession with custom audio/video settings
- Builder pattern API for AudioOutputSettings and VideoOutputSettings
- Flexible raw dictionary API for maximum control
- Progress reporting via AsyncStream
- Support for iOS 17.0+, macOS 14.0+, and visionOS 1.3+
- Swift 6 strict concurrency support
- Comprehensive test suite with multiple video formats
### Changed
- Converted from Xcode project to Swift package
- Made yielding last progress value more reliable
- Set deployment targets to iOS 17, macOS 14, and visionOS 1.3
### Added
- Support for writing metadata on assets
- Documentation for most public API
- README and license files
[0.1]: https://github.com/samsonjs/SJSAssetExportSession/releases/tag/0.1
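The 0.1 feature list above (builder-pattern settings, progress via `AsyncStream`) can be illustrated with a short sketch. This is a hypothetical usage example pieced together from the test code elsewhere in this compare; `sourceURL` and `destinationURL` are placeholders, and exact signatures may differ between versions:

```swift
import AVFoundation
import SJSAssetExportSession

// Placeholder URLs for illustration only.
let sourceURL = URL(filePath: "input.mov")
let destinationURL = URL.temporaryDirectory.appending(component: "output.mp4")

let asset = AVURLAsset(url: sourceURL, options: [
    AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
let exporter = ExportSession()

// Observe progress via the AsyncStream mentioned in the changelog.
Task {
    for await progress in exporter.progressStream {
        print("Progress: \(progress)")
    }
}

// Export 720p H.264 using the builder-style video settings,
// as seen in the test suite in this compare.
try await exporter.export(
    asset: asset,
    video: .codec(.h264, width: 1280, height: 720),
    to: destinationURL,
    as: .mov
)
```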


@ -34,7 +34,7 @@ When you're integrating this into an app with Xcode then go to your project's Pa
When you're integrating this using SPM on its own then add this to the list of dependencies in your Package.swift file:
```swift
.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0"))
.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.3"))
```
and then add `"SJSAssetExportSession"` to the list of dependencies in your target as well.
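Put together, a minimal Package.swift using this dependency might look like the following sketch. The package and target names are placeholders; the platform versions match the deployment targets listed in the changelog (iOS 17, macOS 14, visionOS 1.3):

```swift
// swift-tools-version: 5.10
import PackageDescription

let package = Package(
    name: "MyApp", // placeholder package name
    platforms: [.iOS(.v17), .macOS(.v14), .visionOS("1.3")],
    dependencies: [
        // Pin to the 0.x line with .upToNextMajor, per the readme.
        .package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0")),
    ],
    targets: [
        .target(
            name: "MyApp", // placeholder target name
            dependencies: ["SJSAssetExportSession"]
        ),
    ]
)
```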
@ -198,6 +198,6 @@ try await exporter.export(
## License
Copyright © 2024-2025 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
Copyright © 2024 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
[MIT]: https://sjs.mit-license.org


@ -41,12 +41,13 @@ actor SampleWriter {
// MARK: Internal state
private var reader: AVAssetReader?
private var writer: AVAssetWriter?
private let reader: AVAssetReader
private let writer: AVAssetWriter
private var audioOutput: AVAssetReaderAudioMixOutput?
private var audioInput: AVAssetWriterInput?
private var videoOutput: AVAssetReaderVideoCompositionOutput?
private var videoInput: AVAssetWriterInput?
private var isCancelled = false
nonisolated init(
asset: sending AVAsset,
@ -68,22 +69,9 @@ actor SampleWriter {
if let timeRange {
reader.timeRange = timeRange
}
self.reader = reader
let writer = try AVAssetWriter(outputURL: outputURL, fileType: fileType)
writer.shouldOptimizeForNetworkUse = optimizeForNetworkUse
writer.metadata = metadata
self.writer = writer
self.audioOutputSettings = audioOutputSettings
self.audioMix = audioMix
self.videoOutputSettings = videoOutputSettings
self.videoComposition = videoComposition
self.timeRange = if let timeRange {
timeRange
} else {
try await CMTimeRange(start: .zero, duration: asset.load(.duration))
}
// Filter out disabled tracks to avoid problems encoding spatial audio. Ideally this would
// preserve track groups and make that all configurable.
@ -92,24 +80,7 @@ actor SampleWriter {
// Audio is optional so only validate output settings when it's applicable.
if !audioTracks.isEmpty {
try Self.validateAudio(outputSettings: audioOutputSettings, writer: writer)
let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
audioOutput.alwaysCopiesSampleData = false
audioOutput.audioMix = audioMix
guard reader.canAdd(audioOutput) else {
throw Error.setupFailure(.cannotAddAudioOutput)
}
reader.add(audioOutput)
self.audioOutput = audioOutput
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = false
guard writer.canAdd(audioInput) else {
throw Error.setupFailure(.cannotAddAudioInput)
}
writer.add(audioInput)
self.audioInput = audioInput
}
let videoTracks = try await asset.loadTracks(withMediaType: .video)
.filterAsync { try await $0.load(.isEnabled) }
guard !videoTracks.isEmpty else { throw Error.setupFailure(.videoTracksEmpty) }
@ -118,61 +89,49 @@ actor SampleWriter {
renderSize: videoComposition.renderSize,
settings: videoOutputSettings
)
let videoOutput = AVAssetReaderVideoCompositionOutput(
videoTracks: videoTracks,
videoSettings: nil
)
videoOutput.alwaysCopiesSampleData = false
videoOutput.videoComposition = videoComposition
guard reader.canAdd(videoOutput) else {
throw Error.setupFailure(.cannotAddVideoOutput)
}
reader.add(videoOutput)
self.videoOutput = videoOutput
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = false
guard writer.canAdd(videoInput) else {
throw Error.setupFailure(.cannotAddVideoInput)
self.audioOutputSettings = audioOutputSettings
self.audioMix = audioMix
self.videoOutputSettings = videoOutputSettings
self.videoComposition = videoComposition
self.reader = reader
self.writer = writer
self.timeRange = if let timeRange {
timeRange
} else {
try await CMTimeRange(start: .zero, duration: asset.load(.duration))
}
writer.add(videoInput)
self.videoInput = videoInput
try await setUpAudio(audioTracks: audioTracks)
try await setUpVideo(videoTracks: videoTracks)
}
func writeSamples() async throws {
guard let reader, let writer else { throw CancellationError() }
try Task.checkCancellation()
// Clear all of these properties otherwise when we get cancelled then we leak a bunch of
// pixel buffers.
defer {
if Task.isCancelled {
reader.cancelReading()
writer.cancelWriting()
}
self.reader = nil
self.writer = nil
audioInput = nil
audioOutput = nil
videoInput = nil
videoOutput = nil
}
progressContinuation.yield(0.0)
writer.startWriting()
writer.startSession(atSourceTime: timeRange.start)
reader.startReading()
try Task.checkCancellation()
startEncodingAudioTracks()
startEncodingVideoTracks()
while reader.status == .reading, writer.status == .writing {
guard !Task.isCancelled else {
// Flag so that we stop writing samples
isCancelled = true
throw CancellationError()
}
try await Task.sleep(for: .milliseconds(10))
}
guard reader.status != .cancelled && writer.status != .cancelled else {
throw CancellationError()
}
guard writer.status != .failed else {
reader.cancelReading()
throw Error.writeFailure(writer.error)
@ -199,6 +158,57 @@ actor SampleWriter {
}
}
// MARK: - Setup
private func setUpAudio(audioTracks: [AVAssetTrack]) throws {
guard !audioTracks.isEmpty else { return }
let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
audioOutput.alwaysCopiesSampleData = false
audioOutput.audioMix = audioMix
guard reader.canAdd(audioOutput) else {
throw Error.setupFailure(.cannotAddAudioOutput)
}
reader.add(audioOutput)
self.audioOutput = audioOutput
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = false
guard writer.canAdd(audioInput) else {
throw Error.setupFailure(.cannotAddAudioInput)
}
writer.add(audioInput)
self.audioInput = audioInput
}
private func setUpVideo(videoTracks: [AVAssetTrack]) throws {
precondition(!videoTracks.isEmpty, "Video tracks must be provided")
let videoOutput = AVAssetReaderVideoCompositionOutput(
videoTracks: videoTracks,
videoSettings: nil
)
videoOutput.alwaysCopiesSampleData = false
videoOutput.videoComposition = videoComposition
guard reader.canAdd(videoOutput) else {
throw Error.setupFailure(.cannotAddVideoOutput)
}
reader.add(videoOutput)
self.videoOutput = videoOutput
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = false
guard writer.canAdd(videoInput) else {
throw Error.setupFailure(.cannotAddVideoInput)
}
writer.add(videoInput)
self.videoInput = videoInput
}
func cancel() async {
isCancelled = true
}
// MARK: - Encoding
private func startEncodingAudioTracks() {
@ -208,31 +218,42 @@ actor SampleWriter {
}
audioInput.requestMediaDataWhenReady(on: queue) {
Task { await self.writeAllReadySamples() }
// NOTE: assumeIsolated crashes on macOS at the moment
self.assumeIsolated { _self in
_self.writeAllReadySamples()
}
}
}
private func startEncodingVideoTracks() {
videoInput?.requestMediaDataWhenReady(on: queue) {
Task { await self.writeAllReadySamples() }
videoInput!.requestMediaDataWhenReady(on: queue) {
// NOTE: assumeIsolated crashes on macOS at the moment
self.assumeIsolated { _self in
_self.writeAllReadySamples()
}
}
}
private func writeAllReadySamples() {
guard !isCancelled else {
log.debug("Cancelled while writing samples")
reader.cancelReading()
writer.cancelWriting()
return
}
if let audioInput, let audioOutput {
let hasMoreAudio = writeReadySamples(output: audioOutput, input: audioInput)
if !hasMoreAudio { log.debug("Finished encoding audio") }
}
if let videoInput, let videoOutput {
let hasMoreVideo = writeReadySamples(output: videoOutput, input: videoInput)
if !hasMoreVideo { log.debug("Finished encoding video") }
}
let hasMoreVideo = writeReadySamples(output: videoOutput!, input: videoInput!)
if !hasMoreVideo { log.debug("Finished encoding video") }
}
private func writeReadySamples(output: AVAssetReaderOutput, input: AVAssetWriterInput) -> Bool {
while input.isReadyForMoreMediaData {
guard reader?.status == .reading && writer?.status == .writing,
guard reader.status == .reading && writer.status == .writing,
let sampleBuffer = output.copyNextSampleBuffer() else {
input.markAsFinished()
return false
@ -291,7 +312,7 @@ actor SampleWriter {
let renderWidth = Int(renderSize.width)
let renderHeight = Int(renderSize.height)
if renderWidth != settingsWidth || renderHeight != settingsHeight {
log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overridden by video output settings (\(settingsWidth)\(settingsHeight))")
log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overriden by video output settings (\(settingsWidth)\(settingsHeight))")
}
}
}


@ -0,0 +1,14 @@
//
// AVAsset+sending.swift
// SJSAssetExportSessionTests
//
// Created by Sami Samhuri on 2024-07-07.
//
import AVFoundation
extension AVAsset {
func sendTracks(withMediaType mediaType: AVMediaType) async throws -> sending [AVAssetTrack] {
try await loadTracks(withMediaType: mediaType)
}
}


@ -1,50 +0,0 @@
//
// BaseTests.swift
// SJSAssetExportSession
//
// Created by Sami Samhuri on 2025-01-19.
//
import AVFoundation
import Foundation
import Testing
class BaseTests {
func resourceURL(named name: String) -> URL {
Bundle.module.resourceURL!.appending(component: name)
}
func makeAsset(url: URL) -> sending AVAsset {
AVURLAsset(url: url, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
}
func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
let timestamp = Int(Date.now.timeIntervalSince1970)
let f = function.replacing(/[\(\)]/, with: { _ in "" })
let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
let url = URL.temporaryDirectory.appending(component: filename)
return AutoDestructingURL(url: url)
}
func makeVideoComposition(
assetURL: URL,
size: CGSize? = nil,
fps: Int? = nil
) async throws -> sending AVMutableVideoComposition {
let asset = makeAsset(url: assetURL)
let videoComposition = try await AVMutableVideoComposition.videoComposition(
withPropertiesOf: asset
)
if let size {
videoComposition.renderSize = size
}
if let fps {
let seconds = 1.0 / TimeInterval(fps)
videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
}
return videoComposition
}
}


@ -10,7 +10,45 @@ import CoreLocation
import SJSAssetExportSession
import Testing
final class ExportSessionTests: BaseTests {
final class ExportSessionTests {
private func resourceURL(named name: String) -> URL {
Bundle.module.resourceURL!.appending(component: name)
}
private func makeAsset(url: URL) -> sending AVAsset {
AVURLAsset(url: url, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
}
private func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
let timestamp = Int(Date.now.timeIntervalSince1970)
let f = function.replacing(/[\(\)]/, with: { _ in "" })
let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
let url = URL.temporaryDirectory.appending(component: filename)
return AutoDestructingURL(url: url)
}
private func makeVideoComposition(
assetURL: URL,
size: CGSize? = nil,
fps: Int? = nil
) async throws -> sending AVMutableVideoComposition {
let asset = makeAsset(url: assetURL)
let videoComposition = try await AVMutableVideoComposition.videoComposition(
withPropertiesOf: asset
)
if let size {
videoComposition.renderSize = size
}
if let fps {
let seconds = 1.0 / TimeInterval(fps)
videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
}
return videoComposition
}
@Test func test_sugary_export_720p_h264_24fps() async throws {
let sourceURL = resourceURL(named: "test-4k-hdr-hevc-30fps.mov")
let destinationURL = makeTemporaryURL()
@ -30,20 +68,20 @@ final class ExportSessionTests: BaseTests {
let exportedAsset = AVURLAsset(url: destinationURL.url)
#expect(try await exportedAsset.load(.duration) == .seconds(1))
// Audio
try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
#expect(audioFormat.mediaType == .audio)
#expect(audioFormat.mediaSubType == .mpeg4AAC)
#expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
#expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
// Video
try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
#expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((900_000 ... 1_130_000).contains(dataRate))
#expect((1_000_000 ... 1_100_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
@ -79,20 +117,20 @@ final class ExportSessionTests: BaseTests {
let exportedAsset = AVURLAsset(url: destinationURL.url)
#expect(try await exportedAsset.load(.duration) == .seconds(1))
// Audio
try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
#expect(audioFormat.mediaType == .audio)
#expect(audioFormat.mediaSubType == .mpeg4AAC)
#expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
#expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
// Video
try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
#expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((900_000 ... 1_130_000).contains(dataRate))
#expect((1_000_000 ... 1_100_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
@ -101,7 +139,7 @@ final class ExportSessionTests: BaseTests {
#expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
}
@Test func test_export_default_time_range() async throws {
@Test func test_export_default_timerange() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let destinationURL = makeTemporaryURL()
@ -169,11 +207,9 @@ final class ExportSessionTests: BaseTests {
)
let exportedAsset = AVURLAsset(url: destinationURL.url)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
let naturalSize = try await videoTrack.load(.naturalSize)
#expect(naturalSize == CGSize(width: 1920, height: 1080))
let fps = try await videoTrack.load(.nominalFrameRate)
#expect(Int(fps.rounded()) == 30)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1920, height: 1080))
#expect(try await videoTrack.load(.nominalFrameRate) == 30.0)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((2_400_000 ... 2_700_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
@ -226,17 +262,17 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_empty_audio_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsEmpty)) {
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let videoComposition = try await self.makeVideoComposition(assetURL: sourceURL)
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let videoComposition = try await makeVideoComposition(assetURL: sourceURL)
let subject = ExportSession()
try await subject.export(
asset: self.makeAsset(url: sourceURL),
asset: makeAsset(url: sourceURL),
audioOutputSettings: [:], // Here it matters because there's an audio track
videoOutputSettings: VideoOutputSettings
.codec(.h264, size: videoComposition.renderSize).settingsDictionary,
composition: videoComposition,
to: self.makeTemporaryURL().url,
to: makeTemporaryURL().url,
as: .mov
)
}
@ -244,18 +280,18 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_invalid_audio_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsInvalid)) {
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let subject = ExportSession()
try await subject.export(
asset: self.makeAsset(url: sourceURL),
asset: makeAsset(url: sourceURL),
audioOutputSettings: [
AVFormatIDKey: kAudioFormatMPEG4AAC,
AVNumberOfChannelsKey: NSNumber(value: -1), // invalid number of channels
],
videoOutputSettings: VideoOutputSettings
.codec(.h264, size: CGSize(width: 1280, height: 720)).settingsDictionary,
to: self.makeTemporaryURL().url,
to: makeTemporaryURL().url,
as: .mov
)
}
@ -263,12 +299,12 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_invalid_video_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.videoSettingsInvalid)) {
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let size = CGSize(width: 1280, height: 720)
let subject = ExportSession()
try await subject.export(
asset: self.makeAsset(url: sourceURL),
asset: makeAsset(url: sourceURL),
audioOutputSettings: AudioOutputSettings.default.settingsDictionary,
videoOutputSettings: [
// missing codec
@ -276,7 +312,7 @@ final class ExportSessionTests: BaseTests {
AVVideoHeightKey: NSNumber(value: Int(size.height)),
],
composition: nil,
to: self.makeTemporaryURL().url,
to: makeTemporaryURL().url,
as: .mov
)
}
@ -284,12 +320,12 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_no_video_track() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.videoTracksEmpty)) {
let sourceURL = self.resourceURL(named: "test-no-video.m4a")
let sourceURL = resourceURL(named: "test-no-video.m4a")
let subject = ExportSession()
try await subject.export(
asset: self.makeAsset(url: sourceURL),
asset: makeAsset(url: sourceURL),
video: .codec(.h264, width: 1280, height: 720),
to: self.makeTemporaryURL().url,
to: makeTemporaryURL().url,
as: .mov
)
}
@ -298,11 +334,11 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_cancellation() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let destinationURL = makeTemporaryURL()
let subject = ExportSession()
let task = Task {
let sourceAsset = AVURLAsset(url: sourceURL, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
let subject = ExportSession()
try await subject.export(
asset: sourceAsset,
video: .codec(.h264, width: 1280, height: 720),
@ -311,10 +347,8 @@ final class ExportSessionTests: BaseTests {
)
Issue.record("Task should be cancelled long before we get here")
}
NSLog("Waiting for encoding to begin...")
for await progress in subject.progressStream where progress > 0 {
break
}
NSLog("Sleeping for 0.3s")
try await Task.sleep(for: .milliseconds(300))
NSLog("Cancelling task")
task.cancel()
try? await task.value // Wait for task to complete