Compare commits


14 commits
0.3.7 ... main

Author SHA1 Message Date
9c9e24d2bc
Inline all setup code in SampleWriter initializer
It seems like a bug that only one of these warns: when swapping the order,
whichever comes first warns. Both must be present to trigger a warning at
all, which is the buggy part.

Anyway, since it seems like a legit error, inline all the code so things
should work fine.
2025-10-30 19:06:45 -07:00
93806c5ed0
Update copyright in Readme 2025-09-10 14:11:20 -07:00
c77c3cacd7
Bump version to 0.4.0 2025-09-10 14:10:39 -07:00
7dee5ab772
Merge pull request #5 from samsonjs/xcode26
Make it work with Xcode 26 RC
2025-09-10 14:08:10 -07:00
1d4e486041
Make it work with Xcode 26 RC 2025-09-10 14:05:25 -07:00
f60f5a9035
Add a changelog 2025-06-08 21:01:33 -07:00
0eefb949e2
Use a new task instead of assumeIsolated to try to fix crash on iOS 17 2025-05-25 18:02:35 -07:00
34c374d914
Fix tests in Xcode 16.4 on macOS 15.5 2025-05-21 13:22:26 -07:00
b627e9bf50
Fix warnings in tests in Xcode 16.3 2025-04-06 23:18:32 -07:00
7b7891ce14
Update readme for 0.3.8 2025-04-04 10:29:11 -07:00
1e768033a1
Fix a crash when cancelled while writing samples
Now we never force-unwrap videoInput or videoOutput. Or anything else
for that matter.
2025-04-04 10:26:09 -07:00
2f1b859a03
Stop relying on a specific delay in cancellation test 2025-04-04 10:12:48 -07:00
10c717ab99
Fix a typo 2025-04-04 09:46:55 -07:00
62a7a375c0
Fix tests with Swift 6.1 on macOS, which finally works! 2025-02-23 09:20:15 -08:00
5 changed files with 244 additions and 118 deletions

Changelog.md (new file, +153)

@@ -0,0 +1,153 @@
# Changelog

## [Unreleased]

- Your change here.

[Unreleased]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.4.0...HEAD

## [0.4.0] - 2025-09-10

### Fixed

- Fixed building with Xcode 26 RC

[0.4.0]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.9...0.4.0

## [0.3.9] - 2025-05-25

### Fixed

- Fixed crash on iOS 17 by using a new task instead of assumeIsolated

[0.3.9]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.8...0.3.9

## [0.3.8] - 2025-04-04

### Fixed

- Fixed crash when cancelled while writing samples
- Fixed tests with Swift 6.1 on macOS
- Fixed tests in Xcode 16.4 on macOS 15.5
- Fixed warnings in tests in Xcode 16.3

### Changed

- Stopped relying on specific delay in cancellation test
- Updated readme for 0.3.8

[0.3.8]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.7...0.3.8

## [0.3.7] - 2025-01-19

### Fixed

- Simplified cancellation and fixed memory leak

[0.3.7]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.6...0.3.7

## [0.3.6] - 2025-01-19

### Fixed

- Attempted to fix possible retain cycle

[0.3.6]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.5...0.3.6

## [0.3.5] - 2025-01-19

### Fixed

- Improved cancellation response (potential memory leak issue)

### Removed

- Deleted dead code

### Changed

- Extracted BaseTests class for better test organization

[0.3.5]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.4...0.3.5

## [0.3.4] - 2024-11-08

### Fixed

- [#3](https://github.com/samsonjs/SJSAssetExportSession/pull/3): Fixed encoding stalling by interleaving audio and video samples - [@samsonjs](https://github.com/samsonjs).

### Changed

- Updated readme with additional documentation

[0.3.4]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.3...0.3.4

## [0.3.3] - 2024-10-19

### Changed

- Made AudioOutputSettings and VideoOutputSettings properties public

### Fixed

- Made tests work on iOS 18.0 and iOS 18.1
- Fixed progress test

### Removed

- Removed SampleWriter.duration property

[0.3.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.2...0.3.3

## [0.3.2] - 2024-10-19

### Fixed

- Fixed release builds by using makeStream for SampleWriter's progress

### Changed

- Updated example in readme to version 0.3.2

[0.3.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3.1...0.3.2

## [0.3.1] - 2024-10-19

### Fixed

- Removed unnecessary Task.yield() to fix intermittent hang

### Changed

- Improved code style and debuggability
- Updated version in readme to 0.3.1

[0.3.1]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.3...0.3.1

## [0.3] - 2024-10-18

### Added

- Made audio/video settings Hashable, Sendable, and Codable

### Changed

- Updated readme for version 0.3
- Fixed SwiftPM instructions in readme

[0.3]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.2...0.3

## [0.2] - 2024-10-04

### Fixed

- [#2](https://github.com/samsonjs/SJSAssetExportSession/pull/2): Fixed spatial audio handling by dropping spatial audio tracks to fix encoding iPhone 16 videos - [@samsonjs](https://github.com/samsonjs).

### Changed

- Code style improvements
- Updated version in readme's SPM example

[0.2]: https://github.com/samsonjs/SJSAssetExportSession/compare/0.1...0.2

## [0.1] - 2024-09-18

### Added

- Initial release as Swift Package
- Alternative to AVAssetExportSession with custom audio/video settings
- Builder pattern API for AudioOutputSettings and VideoOutputSettings
- Flexible raw dictionary API for maximum control
- Progress reporting via AsyncStream
- Support for iOS 17.0+, macOS 14.0+, and visionOS 1.3+
- Swift 6 strict concurrency support
- Comprehensive test suite with multiple video formats

### Changed

- Converted from Xcode project to Swift package
- Made yielding last progress value more reliable
- Set deployment targets to iOS 17, macOS 14, and visionOS 1.3

### Added

- Support for writing metadata on assets
- Documentation for most public API
- README and license files

[0.1]: https://github.com/samsonjs/SJSAssetExportSession/releases/tag/0.1


@@ -34,7 +34,7 @@ When you're integrating this into an app with Xcode then go to your project's Pa
When you're integrating this using SPM on its own then add this to the list of dependencies in your Package.swift file:
```swift
.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.3.7"))
.package(url: "https://github.com/samsonjs/SJSAssetExportSession.git", .upToNextMajor(from: "0.4.0"))
```
and then add `"SJSAssetExportSession"` to the list of dependencies in your target as well.
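For context, the updated dependency line lands in a manifest shaped roughly like this (a minimal sketch; the package and target name `MyApp` and the platform list are assumptions, not taken from this repo):

```swift
// swift-tools-version: 5.10
import PackageDescription

let package = Package(
    name: "MyApp", // hypothetical app or package name
    platforms: [.iOS(.v17), .macOS(.v14), .visionOS("1.3")],
    dependencies: [
        // The updated declaration from the readme diff above.
        .package(
            url: "https://github.com/samsonjs/SJSAssetExportSession.git",
            .upToNextMajor(from: "0.4.0")
        ),
    ],
    targets: [
        .target(
            name: "MyApp",
            // The product name added to the target's own dependencies.
            dependencies: ["SJSAssetExportSession"]
        ),
    ]
)
```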
@@ -198,6 +198,6 @@ try await exporter.export(
## License
Copyright © 2024 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
Copyright © 2024-2025 [Sami Samhuri](https://samhuri.net) <sami@samhuri.net>. Released under the terms of the [MIT License][MIT].
[MIT]: https://sjs.mit-license.org


@@ -68,9 +68,22 @@ actor SampleWriter {
if let timeRange {
reader.timeRange = timeRange
}
self.reader = reader
let writer = try AVAssetWriter(outputURL: outputURL, fileType: fileType)
writer.shouldOptimizeForNetworkUse = optimizeForNetworkUse
writer.metadata = metadata
self.writer = writer
self.audioOutputSettings = audioOutputSettings
self.audioMix = audioMix
self.videoOutputSettings = videoOutputSettings
self.videoComposition = videoComposition
self.timeRange = if let timeRange {
timeRange
} else {
try await CMTimeRange(start: .zero, duration: asset.load(.duration))
}
// Filter out disabled tracks to avoid problems encoding spatial audio. Ideally this would
// preserve track groups and make that all configurable.
@@ -79,7 +92,24 @@ actor SampleWriter {
// Audio is optional so only validate output settings when it's applicable.
if !audioTracks.isEmpty {
try Self.validateAudio(outputSettings: audioOutputSettings, writer: writer)
let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
audioOutput.alwaysCopiesSampleData = false
audioOutput.audioMix = audioMix
guard reader.canAdd(audioOutput) else {
throw Error.setupFailure(.cannotAddAudioOutput)
}
reader.add(audioOutput)
self.audioOutput = audioOutput
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = false
guard writer.canAdd(audioInput) else {
throw Error.setupFailure(.cannotAddAudioInput)
}
writer.add(audioInput)
self.audioInput = audioInput
}
let videoTracks = try await asset.loadTracks(withMediaType: .video)
.filterAsync { try await $0.load(.isEnabled) }
guard !videoTracks.isEmpty else { throw Error.setupFailure(.videoTracksEmpty) }
@@ -88,21 +118,25 @@ actor SampleWriter {
renderSize: videoComposition.renderSize,
settings: videoOutputSettings
)
self.audioOutputSettings = audioOutputSettings
self.audioMix = audioMix
self.videoOutputSettings = videoOutputSettings
self.videoComposition = videoComposition
self.reader = reader
self.writer = writer
self.timeRange = if let timeRange {
timeRange
} else {
try await CMTimeRange(start: .zero, duration: asset.load(.duration))
let videoOutput = AVAssetReaderVideoCompositionOutput(
videoTracks: videoTracks,
videoSettings: nil
)
videoOutput.alwaysCopiesSampleData = false
videoOutput.videoComposition = videoComposition
guard reader.canAdd(videoOutput) else {
throw Error.setupFailure(.cannotAddVideoOutput)
}
reader.add(videoOutput)
self.videoOutput = videoOutput
try await setUpAudio(audioTracks: audioTracks)
try await setUpVideo(videoTracks: videoTracks)
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = false
guard writer.canAdd(videoInput) else {
throw Error.setupFailure(.cannotAddVideoInput)
}
writer.add(videoInput)
self.videoInput = videoInput
}
func writeSamples() async throws {
@@ -165,53 +199,6 @@ actor SampleWriter {
}
}
// MARK: - Setup
private func setUpAudio(audioTracks: [AVAssetTrack]) throws {
guard !audioTracks.isEmpty else { return }
let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: audioTracks, audioSettings: nil)
audioOutput.alwaysCopiesSampleData = false
audioOutput.audioMix = audioMix
guard let reader, reader.canAdd(audioOutput) else {
throw Error.setupFailure(.cannotAddAudioOutput)
}
reader.add(audioOutput)
self.audioOutput = audioOutput
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = false
guard let writer, writer.canAdd(audioInput) else {
throw Error.setupFailure(.cannotAddAudioInput)
}
writer.add(audioInput)
self.audioInput = audioInput
}
private func setUpVideo(videoTracks: [AVAssetTrack]) throws {
precondition(!videoTracks.isEmpty, "Video tracks must be provided")
let videoOutput = AVAssetReaderVideoCompositionOutput(
videoTracks: videoTracks,
videoSettings: nil
)
videoOutput.alwaysCopiesSampleData = false
videoOutput.videoComposition = videoComposition
guard let reader, reader.canAdd(videoOutput) else {
throw Error.setupFailure(.cannotAddVideoOutput)
}
reader.add(videoOutput)
self.videoOutput = videoOutput
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = false
guard let writer, writer.canAdd(videoInput) else {
throw Error.setupFailure(.cannotAddVideoInput)
}
writer.add(videoInput)
self.videoInput = videoInput
}
// MARK: - Encoding
private func startEncodingAudioTracks() {
@@ -221,19 +208,13 @@ actor SampleWriter {
}
audioInput.requestMediaDataWhenReady(on: queue) {
// NOTE: assumeIsolated crashes on macOS at the moment
self.assumeIsolated { _self in
_self.writeAllReadySamples()
}
Task { await self.writeAllReadySamples() }
}
}
private func startEncodingVideoTracks() {
videoInput!.requestMediaDataWhenReady(on: queue) {
// NOTE: assumeIsolated crashes on macOS at the moment
self.assumeIsolated { _self in
_self.writeAllReadySamples()
}
videoInput?.requestMediaDataWhenReady(on: queue) {
Task { await self.writeAllReadySamples() }
}
}
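The replacement shown here trades `assumeIsolated` for a freshly spawned `Task` that awaits back onto the actor. A self-contained sketch of that pattern, with a hypothetical `Recorder` actor and `requestMediaData` helper standing in for `SampleWriter` and `requestMediaDataWhenReady(on:using:)`:

```swift
import Dispatch

actor Recorder {
    private(set) var samplesWritten = 0
    func writeAllReadySamples() { samplesWritten += 1 }
}

// Stand-in for AVAssetWriterInput.requestMediaDataWhenReady(on:using:),
// which invokes a plain closure on the given queue, outside the actor.
func requestMediaData(on queue: DispatchQueue, _ body: @escaping @Sendable () -> Void) {
    queue.async(execute: body)
}

let recorder = Recorder()
let queue = DispatchQueue(label: "media-callbacks")
let done = DispatchSemaphore(value: 0)

requestMediaData(on: queue) {
    // assumeIsolated would trap here unless this queue happens to be the
    // actor's executor; a new Task hops onto the actor safely instead.
    Task {
        await recorder.writeAllReadySamples()
        done.signal()
    }
}

done.wait() // block until the actor-hopping task has run
```

The trade-off is that the callback no longer runs the work synchronously, which is why the diff only uses this where `assumeIsolated` was crashing.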
@@ -243,8 +224,10 @@ actor SampleWriter {
if !hasMoreAudio { log.debug("Finished encoding audio") }
}
let hasMoreVideo = writeReadySamples(output: videoOutput!, input: videoInput!)
if !hasMoreVideo { log.debug("Finished encoding video") }
if let videoInput, let videoOutput {
let hasMoreVideo = writeReadySamples(output: videoOutput, input: videoInput)
if !hasMoreVideo { log.debug("Finished encoding video") }
}
}
private func writeReadySamples(output: AVAssetReaderOutput, input: AVAssetWriterInput) -> Bool {
@@ -308,7 +291,7 @@ actor SampleWriter {
let renderWidth = Int(renderSize.width)
let renderHeight = Int(renderSize.height)
if renderWidth != settingsWidth || renderHeight != settingsHeight {
log.warning("Video composition's render size (\(renderWidth)x\(renderHeight)) will be overriden by video output settings (\(settingsWidth)x\(settingsHeight))")
log.warning("Video composition's render size (\(renderWidth)x\(renderHeight)) will be overridden by video output settings (\(settingsWidth)x\(settingsHeight))")
}
}
}


@@ -1,14 +0,0 @@
//
// AVAsset+sending.swift
// SJSAssetExportSessionTests
//
// Created by Sami Samhuri on 2024-07-07.
//
import AVFoundation
extension AVAsset {
func sendTracks(withMediaType mediaType: AVMediaType) async throws -> sending [AVAssetTrack] {
try await loadTracks(withMediaType: mediaType)
}
}


@@ -30,20 +30,20 @@ final class ExportSessionTests: BaseTests {
let exportedAsset = AVURLAsset(url: destinationURL.url)
#expect(try await exportedAsset.load(.duration) == .seconds(1))
// Audio
try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
#expect(audioFormat.mediaType == .audio)
#expect(audioFormat.mediaSubType == .mpeg4AAC)
#expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
#expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
// Video
try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
#expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((1_000_000 ... 1_100_000).contains(dataRate))
#expect((900_000 ... 1_130_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
@@ -79,20 +79,20 @@ final class ExportSessionTests: BaseTests {
let exportedAsset = AVURLAsset(url: destinationURL.url)
#expect(try await exportedAsset.load(.duration) == .seconds(1))
// Audio
try #require(try await exportedAsset.sendTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.sendTracks(withMediaType: .audio).first)
try #require(try await exportedAsset.loadTracks(withMediaType: .audio).count == 1)
let audioTrack = try #require(await exportedAsset.loadTracks(withMediaType: .audio).first)
let audioFormat = try #require(await audioTrack.load(.formatDescriptions).first)
#expect(audioFormat.mediaType == .audio)
#expect(audioFormat.mediaSubType == .mpeg4AAC)
#expect(audioFormat.audioChannelLayout?.numberOfChannels == 2)
#expect(audioFormat.audioStreamBasicDescription?.mSampleRate == 44_100)
// Video
try #require(await exportedAsset.sendTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
try #require(await exportedAsset.loadTracks(withMediaType: .video).count == 1)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1280, height: 720))
#expect(try await videoTrack.load(.nominalFrameRate) == 24.0)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((1_000_000 ... 1_100_000).contains(dataRate))
#expect((900_000 ... 1_130_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
#expect(videoFormat.mediaType == .video)
#expect(videoFormat.mediaSubType == .h264)
@@ -101,7 +101,7 @@ final class ExportSessionTests: BaseTests {
#expect(videoFormat.extensions[.yCbCrMatrix] == .yCbCrMatrix(.itu_R_709_2))
}
@Test func test_export_default_timerange() async throws {
@Test func test_export_default_time_range() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let destinationURL = makeTemporaryURL()
@@ -169,9 +169,11 @@ final class ExportSessionTests: BaseTests {
)
let exportedAsset = AVURLAsset(url: destinationURL.url)
let videoTrack = try #require(await exportedAsset.sendTracks(withMediaType: .video).first)
#expect(try await videoTrack.load(.naturalSize) == CGSize(width: 1920, height: 1080))
#expect(try await videoTrack.load(.nominalFrameRate) == 30.0)
let videoTrack = try #require(await exportedAsset.loadTracks(withMediaType: .video).first)
let naturalSize = try await videoTrack.load(.naturalSize)
#expect(naturalSize == CGSize(width: 1920, height: 1080))
let fps = try await videoTrack.load(.nominalFrameRate)
#expect(Int(fps.rounded()) == 30)
let dataRate = try await videoTrack.load(.estimatedDataRate)
#expect((2_400_000 ... 2_700_000).contains(dataRate))
let videoFormat = try #require(await videoTrack.load(.formatDescriptions).first)
@@ -224,17 +226,17 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_empty_audio_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsEmpty)) {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let videoComposition = try await makeVideoComposition(assetURL: sourceURL)
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let videoComposition = try await self.makeVideoComposition(assetURL: sourceURL)
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
audioOutputSettings: [:], // Here it matters because there's an audio track
videoOutputSettings: VideoOutputSettings
.codec(.h264, size: videoComposition.renderSize).settingsDictionary,
composition: videoComposition,
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -242,18 +244,18 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_invalid_audio_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.audioSettingsInvalid)) {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
audioOutputSettings: [
AVFormatIDKey: kAudioFormatMPEG4AAC,
AVNumberOfChannelsKey: NSNumber(value: -1), // invalid number of channels
],
videoOutputSettings: VideoOutputSettings
.codec(.h264, size: CGSize(width: 1280, height: 720)).settingsDictionary,
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -261,12 +263,12 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_invalid_video_settings() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.videoSettingsInvalid)) {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let sourceURL = self.resourceURL(named: "test-720p-h264-24fps.mov")
let size = CGSize(width: 1280, height: 720)
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
audioOutputSettings: AudioOutputSettings.default.settingsDictionary,
videoOutputSettings: [
// missing codec
@@ -274,7 +276,7 @@ final class ExportSessionTests: BaseTests {
AVVideoHeightKey: NSNumber(value: Int(size.height)),
],
composition: nil,
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -282,12 +284,12 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_throws_with_no_video_track() async throws {
try await #require(throws: ExportSession.Error.setupFailure(.videoTracksEmpty)) {
let sourceURL = resourceURL(named: "test-no-video.m4a")
let sourceURL = self.resourceURL(named: "test-no-video.m4a")
let subject = ExportSession()
try await subject.export(
asset: makeAsset(url: sourceURL),
asset: self.makeAsset(url: sourceURL),
video: .codec(.h264, width: 1280, height: 720),
to: makeTemporaryURL().url,
to: self.makeTemporaryURL().url,
as: .mov
)
}
@@ -296,11 +298,11 @@ final class ExportSessionTests: BaseTests {
@Test func test_export_cancellation() async throws {
let sourceURL = resourceURL(named: "test-720p-h264-24fps.mov")
let destinationURL = makeTemporaryURL()
let subject = ExportSession()
let task = Task {
let sourceAsset = AVURLAsset(url: sourceURL, options: [
AVURLAssetPreferPreciseDurationAndTimingKey: true,
])
let subject = ExportSession()
try await subject.export(
asset: sourceAsset,
video: .codec(.h264, width: 1280, height: 720),
@@ -309,8 +311,10 @@ final class ExportSessionTests: BaseTests {
)
Issue.record("Task should be cancelled long before we get here")
}
NSLog("Sleeping for 0.3s")
try await Task.sleep(for: .milliseconds(300))
NSLog("Waiting for encoding to begin...")
for await progress in subject.progressStream where progress > 0 {
break
}
NSLog("Cancelling task")
task.cancel()
try? await task.value // Wait for task to complete
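The fixed test above synchronizes on `subject.progressStream` rather than a hard-coded 300 ms sleep. The same idea with a plain `AsyncStream` (the producer loop is illustrative, not the library's code):

```swift
// Producer: a worker task reports progress in 0...1 as it goes.
let (progressStream, continuation) = AsyncStream.makeStream(of: Double.self)

let work = Task {
    for step in 1...100 {
        try await Task.sleep(for: .milliseconds(5))
        continuation.yield(Double(step) / 100)
    }
    continuation.finish()
}

// Consumer: rather than sleeping for an arbitrary delay and hoping the
// work has started, wait for the first nonzero progress value, then cancel.
for await progress in progressStream where progress > 0 {
    break
}
work.cancel()
```

Keying off observed progress makes the test deterministic on slow CI machines, where a fixed delay can fire before encoding has begun.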