Compare commits


30 commits
0.3.3 ... main

Author SHA1 Message Date
9c9e24d2bc
Inline all setup code in SampleWriter initializer
It seems like a bug that only one of these warns: when the order is swapped, whichever comes first is the one that warns. Both must be present to trigger a warning at all, which is the buggy part.

Anyway, since it seems like a legitimate error, inline all the code so things should work fine.
2025-10-30 19:06:45 -07:00
93806c5ed0
Update copyright in Readme 2025-09-10 14:11:20 -07:00
c77c3cacd7
Bump version to 0.4.0 2025-09-10 14:10:39 -07:00
7dee5ab772
Merge pull request #5 from samsonjs/xcode26
Make it work with Xcode 26 RC
2025-09-10 14:08:10 -07:00
1d4e486041
Make it work with Xcode 26 RC 2025-09-10 14:05:25 -07:00
f60f5a9035
Add a changelog 2025-06-08 21:01:33 -07:00
0eefb949e2
Use a new task instead of assumeIsolated to try to fix crash on iOS 17 2025-05-25 18:02:35 -07:00
34c374d914
Fix tests in Xcode 16.4 on macOS 15.5 2025-05-21 13:22:26 -07:00
b627e9bf50
Fix warnings in tests in Xcode 16.3 2025-04-06 23:18:32 -07:00
7b7891ce14
Update readme for 0.3.8 2025-04-04 10:29:11 -07:00
1e768033a1
Fix a crash when cancelled while writing samples
Now we never force-unwrap videoInput or videoOutput. Or anything else
for that matter.
2025-04-04 10:26:09 -07:00
2f1b859a03
Stop relying on a specific delay in cancellation test 2025-04-04 10:12:48 -07:00
10c717ab99
Fix a typo 2025-04-04 09:46:55 -07:00
62a7a375c0
Fix tests with Swift 6.1 on macOS, which finally works! 2025-02-23 09:20:15 -08:00
49d41080bb
Simplify cancellation and fix memory leak 2025-01-19 16:17:50 -08:00
865e524be6
Revert "Try to fix a possible retain cycle"
This reverts commit 2dac7d58dc.
2025-01-19 15:19:21 -08:00
2dac7d58dc
Try to fix a possible retain cycle 2025-01-19 15:01:41 -08:00
f72a073b36
Delete dead code 2025-01-19 14:55:06 -08:00
3be5b7f28e
Extract BaseTests class 2025-01-19 14:54:53 -08:00
4c7b64f045
Bump version to 0.3.5 2025-01-19 14:06:14 -08:00
33152e4e44
Try to improve cancellation response, memory might be leaking 2025-01-19 14:05:46 -08:00
c5b127c702
Update Readme.md 2024-11-08 17:08:42 -08:00
e1a9f38d5a
Merge pull request #3 from samsonjs/fix/audio-sample-stall
Fix encoding stalling by interleaving audio and video
2024-11-08 17:06:53 -08:00
9297a14920
Fix encoding stalling by interleaving audio and video
Thanks to the AVFoundation team I learned that audio and video
samples are supposed to be interleaved whenever media data is ready
from either call to encode ready samples, and that fixes encoding
this video that was encoded with x264 and ffmpeg.
2024-11-08 17:06:26 -08:00
e7fbbacd30
Update Readme.md 2024-11-02 09:05:59 -07:00
e4b0671475
Add 0dependencies.dev badge 2024-10-31 11:51:57 -07:00
f9bacbe9be
Make tests work on iOS 18.0 and iOS 18.1
Estimated data rate changes a bit so check against a range instead of
a specific value.
2024-10-24 18:34:11 -07:00
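The range check this commit describes can be sketched as below; a minimal sketch assuming Swift Testing, with the tolerance band taken from the updated tests in this comparison:

```swift
import AVFoundation
import Testing

// Sketch of the tolerance-band assertion: the encoder's estimated data rate
// drifts slightly between OS releases, so the test asserts a range around
// the target bitrate instead of comparing against one exact value.
func expectDataRateInRange(of videoTrack: AVAssetTrack) async throws {
    let dataRate = try await videoTrack.load(.estimatedDataRate)
    #expect((900_000 ... 1_130_000).contains(dataRate))
}
```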
63dc13d316
Update Readme.md with Swift Package Index shields 2024-10-21 17:11:18 -07:00
979a8e23ee
Fix progress test 2024-10-20 17:56:05 -07:00
d82bd64635
Remove SampleWriter.duration 2024-10-20 17:22:04 -07:00
8 changed files with 369 additions and 223 deletions

Changelog.md (new file, 153 lines)

@@ -0,0 +1,153 @@
# Changelog
## [Unreleased]
- Your change here.
[Unreleased]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.4.0...HEAD
## [0.4.0] - 2025-09-10
### Fixed
- Fixed building with Xcode 26 RC
[0.4.0]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.9...0.4.0
## [0.3.9] - 2025-05-25
### Fixed
- Fixed crash on iOS 17 by using a new task instead of assumeIsolated
[0.3.9]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.8...0.3.9
## [0.3.8] - 2025-04-04
### Fixed
- Fixed crash when cancelled while writing samples
- Fixed tests with Swift 6.1 on macOS
- Fixed tests in Xcode 16.4 on macOS 15.5
- Fixed warnings in tests in Xcode 16.3
### Changed
- Stopped relying on specific delay in cancellation test
- Updated readme for 0.3.8
[0.3.8]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.7...0.3.8
## [0.3.7] - 2025-01-19
### Fixed
- Simplified cancellation and fixed memory leak
[0.3.7]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.6...0.3.7
## [0.3.6] - 2025-01-19
### Fixed
- Attempted to fix possible retain cycle
[0.3.6]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.5...0.3.6
## [0.3.5] - 2025-01-19
### Fixed
- Improved cancellation response (potential memory leak issue)
### Removed
- Deleted dead code
### Changed
- Extracted BaseTests class for better test organization
[0.3.5]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.4...0.3.5
## [0.3.4] - 2024-11-08
### Fixed
- [#3](https://github.com/samsonjs/SJSAssetExportSession/pull/3): Fixed encoding stalling by interleaving audio and video samples - [@samsonjs](https://github.com/samsonjs).
### Changed
- Updated readme with additional documentation
[0.3.4]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.3...0.3.4
## [0.3.3] - 2024-10-19
### Changed
- Made AudioOutputSettings and VideoOutputSettings properties public
### Fixed
- Made tests work on iOS 18.0 and iOS 18.1
- Fixed progress test
### Removed
- Removed SampleWriter.duration property
[0.3.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.2...0.3.3
## [0.3.2] - 2024-10-19
### Fixed
- Fixed release builds by using makeStream for SampleWriter's progress
### Changed
- Updated example in readme to version 0.3.2
[0.3.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.1...0.3.2
## [0.3.1] - 2024-10-19
### Fixed
- Removed unnecessary Task.yield() to fix intermittent hang
### Changed
- Improved code style and debuggability
- Updated version in readme to 0.3.1
[0.3.1]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3...0.3.1
## [0.3] - 2024-10-18
### Added
- Made audio/video settings Hashable, Sendable, and Codable
### Changed
- Updated readme for version 0.3
- Fixed SwiftPM instructions in readme
[0.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.2...0.3
## [0.2] - 2024-10-04
### Fixed
- [#2](https://github.com/samsonjs/SJSAssetExportSession/pull/2): Fixed spatial audio handling by dropping spatial audio tracks to fix encoding iPhone 16 videos - [@samsonjs](https://github.com/samsonjs).
### Changed
- Code style improvements
- Updated version in readme's SPM example
[0.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.1...0.2
## [0.1] - 2024-09-18
### Added
- Initial release as Swift Package
- Alternative to AVAssetExportSession with custom audio/video settings
- Builder pattern API for AudioOutputSettings and VideoOutputSettings
- Flexible raw dictionary API for maximum control
- Progress reporting via AsyncStream
- Support for iOS 17.0+, macOS 14.0+, and visionOS 1.3+
- Swift 6 strict concurrency support
- Comprehensive test suite with multiple video formats
- Support for writing metadata on assets
- Documentation for most public API
- README and license files
### Changed
- Converted from Xcode project to Swift package
- Made yielding last progress value more reliable
- Set deployment targets to iOS 17, macOS 14, and visionOS 1.3
[0.1]: https://github.com/samsonjs/SJSAssetExportSession/releases/tag/0.1


@@ -24,6 +24,7 @@ let package = Package(
.process("Resources/test-no-audio.mp4"),
.process("Resources/test-no-video.m4a"),
.process("Resources/test-spatial-audio.mov"),
.process("Resources/test-x264-1080p-h264-60fps.mp4"),
]
),
]


@@ -1,5 +1,9 @@
# SJSAssetExportSession
[![0 dependencies!](https://0dependencies.dev/0dependencies.svg)](https://0dependencies.dev)
[![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fsamsonjs%2FSJSAssetExportSession%2Fbadge%3Ftype%3Dswift-versions)](https://swiftpackageindex.com/samsonjs/SJSAssetExportSession)
[![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fsamsonjs%2FSJSAssetExportSession%2Fbadge%3Ftype%3Dplatforms)](https://swiftpackageindex.com/samsonjs/SJSAssetExportSession)
## Overview
`SJSAssetExportSession` is an alternative to [`AVAssetExportSession`][AV] that lets you provide custom audio and video settings, without dropping down into the world of `AVAssetReader` and `AVAssetWriter`. It has similar capabilities to [SDAVAssetExportSession][SDAV] but the API is completely different, the code is written in Swift, and it's ready for the world of strict concurrency.
@@ -30,7 +34,7 @@ When you're integrating this into an app with Xcode then go to your project's Pa
When you're integrating this using SPM on its own then add this to the list of dependencies in your Package.swift file:
```swift
.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.2"))
.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0"))
```
and then add `"SJSAssetExportSession"` to the list of dependencies in your target as well.
@@ -194,6 +198,6 @@ try await exporter.export(
## License
Copyright © 2024 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
Copyright © 2024-2025 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
[MIT]: https://sjs.mit-license.org


@@ -37,18 +37,16 @@ actor SampleWriter {
private let audioMix: AVAudioMix?
private let videoOutputSettings: [String: any Sendable]
private let videoComposition: AVVideoComposition?
private let duration: CMTime
private let timeRange: CMTimeRange
// MARK: Internal state
private let reader: AVAssetReader
private let writer: AVAssetWriter
private var reader: AVAssetReader?
private var writer: AVAssetWriter?
private var audioOutput: AVAssetReaderAudioMixOutput?
private var audioInput: AVAssetWriterInput?
private var videoOutput: AVAssetReaderVideoCompositionOutput?
private var videoInput: AVAssetWriterInput?
private var isCancelled = false
nonisolated init(
asset: sending AVAsset,
@@ -66,18 +64,26 @@ actor SampleWriter {
(progressStream, progressContinuation) = AsyncStream<Float>.makeStream()
let duration = if let timeRange {
timeRange.duration
} else {
try await asset.load(.duration)
}
let reader = try AVAssetReader(asset: asset)
if let timeRange {
reader.timeRange = timeRange
}
self.reader = reader
let writer = try AVAssetWriter(outputURL: outputURL, fileType: fileType)
writer.shouldOptimizeForNetworkUse = optimizeForNetworkUse
writer.metadata = metadata
self.writer = writer
self.audioOutputSettings = audioOutputSettings
self.audioMix = audioMix
self.videoOutputSettings = videoOutputSettings
self.videoComposition = videoComposition
self.timeRange = if let timeRange {
timeRange
} else {
try await CMTimeRange(start: .zero, duration: asset.load(.duration))
}
// Filter out disabled tracks to avoid problems encoding spatial audio. Ideally this would
// preserve track groups and make that all configurable.
@@ -86,7 +92,24 @@ actor SampleWriter {
// Audio is optional so only validate output settings when it's applicable.
if !audioTracks.isEmpty {
try Self.validateAudio(outputSettings: audioOutputSettings, writer: writer)
let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
audioOutput.alwaysCopiesSampleData = false
audioOutput.audioMix = audioMix
guard reader.canAdd(audioOutput) else {
throw Error.setupFailure(.cannotAddAudioOutput)
}
reader.add(audioOutput)
self.audioOutput = audioOutput
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = false
guard writer.canAdd(audioInput) else {
throw Error.setupFailure(.cannotAddAudioInput)
}
writer.add(audioInput)
self.audioInput = audioInput
}
let videoTracks = try await asset.loadTracks(withMediaType: .video)
.filterAsync { try await $0.load(.isEnabled) }
guard !videoTracks.isEmpty else { throw Error.setupFailure(.videoTracksEmpty) }
@@ -95,39 +118,61 @@ actor SampleWriter {
renderSize: videoComposition.renderSize,
settings: videoOutputSettings
)
let videoOutput = AVAssetReaderVideoCompositionOutput(
videoTracks: videoTracks,
videoSettings: nil
)
videoOutput.alwaysCopiesSampleData = false
videoOutput.videoComposition = videoComposition
guard reader.canAdd(videoOutput) else {
throw Error.setupFailure(.cannotAddVideoOutput)
}
reader.add(videoOutput)
self.videoOutput = videoOutput
self.audioOutputSettings = audioOutputSettings
self.audioMix = audioMix
self.videoOutputSettings = videoOutputSettings
self.videoComposition = videoComposition
self.reader = reader
self.writer = writer
self.duration = duration
self.timeRange = timeRange ?? CMTimeRange(start: .zero, duration: duration)
try await setUpAudio(audioTracks: audioTracks)
try await setUpVideo(videoTracks: videoTracks)
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = false
guard writer.canAdd(videoInput) else {
throw Error.setupFailure(.cannotAddVideoInput)
}
writer.add(videoInput)
self.videoInput = videoInput
}
func writeSamples() async throws {
guard let reader, let writer else { throw CancellationError() }
try Task.checkCancellation()
// Clear all of these properties, otherwise we leak a bunch of pixel buffers
// when we get cancelled.
defer {
if Task.isCancelled {
reader.cancelReading()
writer.cancelWriting()
}
self.reader = nil
self.writer = nil
audioInput = nil
audioOutput = nil
videoInput = nil
videoOutput = nil
}
progressContinuation.yield(0.0)
writer.startWriting()
writer.startSession(atSourceTime: timeRange.start)
reader.startReading()
try Task.checkCancellation()
await encodeAudioTracks()
try Task.checkCancellation()
startEncodingAudioTracks()
startEncodingVideoTracks()
await encodeVideoTracks()
try Task.checkCancellation()
guard reader.status != .cancelled && writer.status != .cancelled else {
throw CancellationError()
while reader.status == .reading, writer.status == .writing {
try await Task.sleep(for: .milliseconds(10))
}
guard writer.status != .failed else {
reader.cancelReading()
throw Error.writeFailure(writer.error)
@@ -154,132 +199,40 @@ actor SampleWriter {
}
}
// MARK: - Setup
private func setUpAudio(audioTracks: [AVAssetTrack]) throws {
guard !audioTracks.isEmpty else { return }
let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
audioOutput.alwaysCopiesSampleData = false
audioOutput.audioMix = audioMix
guard reader.canAdd(audioOutput) else {
throw Error.setupFailure(.cannotAddAudioOutput)
}
reader.add(audioOutput)
self.audioOutput = audioOutput
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = false
guard writer.canAdd(audioInput) else {
throw Error.setupFailure(.cannotAddAudioInput)
}
writer.add(audioInput)
self.audioInput = audioInput
}
private func setUpVideo(videoTracks: [AVAssetTrack]) throws {
precondition(!videoTracks.isEmpty, "Video tracks must be provided")
let videoOutput = AVAssetReaderVideoCompositionOutput(
videoTracks: videoTracks,
videoSettings: nil
)
videoOutput.alwaysCopiesSampleData = false
videoOutput.videoComposition = videoComposition
guard reader.canAdd(videoOutput) else {
throw Error.setupFailure(.cannotAddVideoOutput)
}
reader.add(videoOutput)
self.videoOutput = videoOutput
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = false
guard writer.canAdd(videoInput) else {
throw Error.setupFailure(.cannotAddVideoInput)
}
writer.add(videoInput)
self.videoInput = videoInput
}
func cancel() async {
isCancelled = true
}
// MARK: - Encoding
private func encodeAudioTracks() async {
private func startEncodingAudioTracks() {
// Don't do anything when we have no audio to encode.
guard audioInput != nil, audioOutput != nil else {
guard let audioInput, audioOutput != nil else {
return
}
await withTaskCancellationHandler {
await withCheckedContinuation { continuation in
self.audioInput!.requestMediaDataWhenReady(on: queue) {
self.assumeIsolated { _self in
guard !_self.isCancelled else {
log.debug("Cancelled while encoding audio")
_self.reader.cancelReading()
_self.writer.cancelWriting()
continuation.resume()
return
}
let hasMoreSamples = _self.writeReadySamples(
output: _self.audioOutput!,
input: _self.audioInput!
)
if !hasMoreSamples {
log.debug("Finished encoding audio")
continuation.resume()
}
}
}
}
} onCancel: {
log.debug("Task cancelled while encoding audio")
Task {
await self.cancel()
}
audioInput.requestMediaDataWhenReady(on: queue) {
Task { await self.writeAllReadySamples() }
}
}
private func encodeVideoTracks() async {
await withTaskCancellationHandler {
await withCheckedContinuation { continuation in
self.videoInput!.requestMediaDataWhenReady(on: queue) {
// NOTE: assumeIsolated crashes on macOS at the moment
self.assumeIsolated { _self in
guard !_self.isCancelled else {
log.debug("Cancelled while encoding video")
_self.reader.cancelReading()
_self.writer.cancelWriting()
continuation.resume()
return
}
private func startEncodingVideoTracks() {
videoInput?.requestMediaDataWhenReady(on: queue) {
Task { await self.writeAllReadySamples() }
}
}
let hasMoreSamples = _self.writeReadySamples(
output: _self.videoOutput!,
input: _self.videoInput!
)
if !hasMoreSamples {
log.debug("Finished encoding video")
continuation.resume()
}
}
}
}
} onCancel: {
log.debug("Task cancelled while encoding video")
Task {
await self.cancel()
}
private func writeAllReadySamples() {
if let audioInput, let audioOutput {
let hasMoreAudio = writeReadySamples(output: audioOutput, input: audioInput)
if !hasMoreAudio { log.debug("Finished encoding audio") }
}
if let videoInput, let videoOutput {
let hasMoreVideo = writeReadySamples(output: videoOutput, input: videoInput)
if !hasMoreVideo { log.debug("Finished encoding video") }
}
}
private func writeReadySamples(output: AVAssetReaderOutput, input: AVAssetWriterInput) -> Bool {
while input.isReadyForMoreMediaData {
guard reader.status == .reading && writer.status == .writing,
guard reader?.status == .reading && writer?.status == .writing,
let sampleBuffer = output.copyNextSampleBuffer() else {
input.markAsFinished()
return false
@@ -287,8 +240,9 @@ actor SampleWriter {
// Only yield progress values for video. Audio is insignificant in comparison.
if output == videoOutput {
let samplePresentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer) - timeRange.start
let progress = Float(samplePresentationTime.seconds / duration.seconds)
let endTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
let samplePresentationTime = endTime - timeRange.start
let progress = Float(samplePresentationTime.seconds / timeRange.duration.seconds)
progressContinuation.yield(progress)
}
@@ -337,7 +291,7 @@ actor SampleWriter {
let renderWidth = Int(renderSize.width)
let renderHeight = Int(renderSize.height)
if renderWidth != settingsWidth || renderHeight != settingsHeight {
log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overriden by video output settings (\(settingsWidth)\(settingsHeight))")
log.warning("Video composition's render size (\(renderWidth)\(renderHeight)) will be overridden by video output settings (\(settingsWidth)\(settingsHeight))")
}
}
}


@@ -1,14 +0,0 @@
//
// AVAsset+sending.swift
// SJSAssetExportSessionTests
//
// Created by Sami Samhuri on 2024-07-07.
//
import AVFoundation
extension AVAsset {
func sendTracks(withMediaType mediaType: AVMediaType) async throws -> sending [AVAssetTrack] {
try await loadTracks(withMediaType: mediaType)
}
}


@@ -0,0 +1,50 @@
//
// BaseTests.swift
// SJSAssetExportSession
//
// Created by Sami Samhuri on 2025-01-19.
//
import AVFoundation
import Foundation
import Testing
class BaseTests {
func resourceURL(named name: String) -> URL {
Bundle.module.resourceURL!.appending(component: name)
}
func makeAsset(url: URL) -> sending AVAsset {
AVURLAsset(url: url, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
}
func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
let timestamp = Int(Date.now.timeIntervalSince1970)
let f = function.replacing(/[\(\)]/, with: { _ in "" })
let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
let url = URL.temporaryDirectory.appending(component: filename)
return AutoDestructingURL(url: url)
}
func makeVideoComposition(
assetURL: URL,
size: CGSize? = nil,
fps: Int? = nil
) async throws -> sending AVMutableVideoComposition {
let asset = makeAsset(url: assetURL)
let videoComposition = try await AVMutableVideoComposition.videoComposition(
withPropertiesOf: asset
)
if let size {
videoComposition.renderSize = size
}
if let fps {
let seconds = 1.0 / TimeInterval(fps)
videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
}
return videoComposition
}
}


@@ -10,45 +10,7 @@ import CoreLocation
import SJSAssetExportSession
import Testing
final class ExportSessionTests {
private func resourceURL(named name: String) -> URL {
Bundle.module.resourceURL!.appending(component: name)
}
private func makeAsset(url: URL) -> sending AVAsset {
AVURLAsset(url: url, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
}
private func makeTemporaryURL(function: String = #function) -> AutoDestructingURL {
let timestamp = Int(Date.now.timeIntervalSince1970)
let f = function.replacing(/[\(\)]/, with: { _ in "" })
let filename = "\(Self.self)_\(f)_\(timestamp).mp4"
let url = URL.temporaryDirectory.appending(component: filename)
return AutoDestructingURL(url: url)
}
private func makeVideoComposition(
assetURL: URL,
size: CGSize? = nil,
fps: Int? = nil
) async throws -> sending AVMutableVideoComposition {
let asset = makeAsset(url: assetURL)
let videoComposition = try await AVMutableVideoComposition.videoComposition(
withPropertiesOf: asset
)
if let size {
videoComposition.renderSize = size
}
if let fps {
let seconds = 1.0 / TimeInterval(fps)
videoComposition.sourceTrackIDForFrameTiming = kCMPersistentTrackID_Invalid
videoComposition.frameDuration = CMTime(seconds: seconds, preferredTimescale: 600)
}
return videoComposition
}
final class ExportSessionTests: BaseTests {
@Test func test_sugary_export_720p_h264_24fps() async throws {
let sourceURL = resourceURL(named: "test-4k-hdr-hevc-30fps.mov")
let destinationURL = makeTemporaryURL()
@@ -68,19 +30,20 @@ final class ExportSessionTests {
let exportedAsset = AVURLAsset(url: destinationURL.url)
#expect(try await exportedAsset.load(.duration) == .seconds(1))
// Audio
try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
#expect(audioFormat.mediaType == .audio)
#expect(audioFormat.mediaSubType == .mpeg4AAC)
#expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
#expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
// Video
try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
#expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
#expect(try await videoTrack.load(.estimatedDataRate) == 1_036_128)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((900_000 ... 1_130_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
@@ -116,19 +79,20 @@ final class ExportSessionTests {
let exportedAsset = AVURLAsset(url: destinationURL.url)
#expect(try await exportedAsset.load(.duration) == .seconds(1))
// Audio
try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
#expect(audioFormat.mediaType == .audio)
#expect(audioFormat.mediaSubType == .mpeg4AAC)
#expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
#expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
// Video
try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
#expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
#expect(try await videoTrack.load(.estimatedDataRate) == 1_036_128)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((900_000 ... 1_130_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
@@ -137,7 +101,7 @@ final class ExportSessionTests {
#expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
}
@Test func test_export_default_timerange() async throws {
@Test func test_export_default_time_range() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let destinationURL = makeTemporaryURL()
@@ -190,6 +154,36 @@ final class ExportSessionTests {
#expect(try await exportedTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
}
@Test func test_export_x264_60fps() async throws {
let sourceURL = resourceURL(named: "test-x264-1080p-h264-60fps.mp4")
let destinationURL = makeTemporaryURL()
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
video: .codec(.h264, width: 1920, height: 1080)
.bitrate(2_500_000)
.fps(30),
to: destinationURL.url,
as: .mp4
)
let exportedAsset = AVURLAsset(url: destinationURL.url)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
let naturalSize = try await videoTrack.load(.naturalSize)
#expect(naturalSize == CGSize(width: 1920, height: 1080))
let fps = try await videoTrack.load(.nominalFrameRate)
#expect(Int(fps.rounded()) == 30)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((2_400_000 ... 2_700_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
#expect(videoFormat.extensions[.colorPrimaries] == .colorPrimaries(.itu_R_709_2))
#expect(videoFormat.extensions[.transferFunction] == .transferFunction(.itu_R_709_2))
#expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
}
@Test func test_export_progress() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let progressValues = SendableWrapper<[Float]>([])
@@ -207,6 +201,8 @@ final class ExportSessionTests {
as: .mov
)
// Wait for last progress value to be yielded.
try await Task.sleep(for: .milliseconds(10))
#expect(progressValues.value.count > 2, "There should be intermediate progress updates")
#expect(progressValues.value.first == 0.0)
#expect(progressValues.value.last == 1.0)
@@ -230,17 +226,17 @@ final class ExportSessionTests {
@Test func test_export_throws_with_empty_audio_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsEmpty)) {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let videoComposition = try await makeVideoComposition(assetURL: sourceURL)
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let videoComposition = try await self.makeVideoComposition(assetURL: sourceURL)
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
audioOutputSettings: [:], // Here it matters because there's an audio track
videoOutputSettings: VideoOutputSettings
.codec(.h264, size: videoComposition.renderSize).settingsDictionary,
composition: videoComposition,
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -248,18 +244,18 @@ final class ExportSessionTests {
@Test func test_export_throws_with_invalid_audio_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsInvalid)) {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
audioOutputSettings: [
AVFormatIDKey: kAudioFormatMPEG4AAC,
AVNumberOfChannelsKey: NSNumber(value: -1), // invalid number of channels
],
videoOutputSettings: VideoOutputSettings
.codec(.h264, size: CGSize(width: 1280, height: 720)).settingsDictionary,
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -267,12 +263,12 @@ final class ExportSessionTests {
@Test func test_export_throws_with_invalid_video_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.videoSettingsInvalid)) {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let size = CGSize(width: 1280, height: 720)
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
audioOutputSettings: AudioOutputSettings.default.settingsDictionary,
videoOutputSettings: [
// missing codec
@@ -280,7 +276,7 @@ final class ExportSessionTests {
AVVideoHeightKey: NSNumber(value: Int(size.height)),
],
composition: nil,
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -288,12 +284,12 @@ final class ExportSessionTests {
@Test func test_export_throws_with_no_video_track() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.videoTracksEmpty)) {
let sourceURL = resourceURL(named: "test-no-video.m4a")
let sourceURL = self.resourceURL(named: "test-no-video.m4a")
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
video: .codec(.h264, width: 1280, height: 720),
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -302,11 +298,11 @@ final class ExportSessionTests {
@Test func test_export_cancellation() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let destinationURL = makeTemporaryURL()
let subject = ExportSession()
let task = Task {
let sourceAsset = AVURLAsset(url: sourceURL, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
let subject = ExportSession()
try await subject.export(
asset: sourceAsset,
video: .codec(.h264, width: 1280, height: 720),
@@ -315,8 +311,10 @@ final class ExportSessionTests {
)
Issue.record("Task should be cancelled long before we get here")
}
NSLog("Sleeping for 0.3s")
try await Task.sleep(for: .milliseconds(300))
NSLog("Waiting for encoding to begin...")
for await progress in subject.progressStream where progress > 0 {
break
}
NSLog("Cancelling task")
task.cancel()
try? await task.value // Wait for task to complete