Compare commits


5 commits

ivanbuper · 6604afcb3a · 2024-09-06 15:07:08 +01:00
Add Kotlin dependencies to missing_aar_type_workaround.gradle
#cherrypick

PiperOrigin-RevId: 671736852
(cherry picked from commit a1357befff)

aquilescanta · afc72880e9 · 2024-09-05 19:04:22 +01:00
Populate DeviceInfo in CastPlayer using MediaRouter2 info
This enables linking the media session to a routing session.

Issue: androidx/media#1056
PiperOrigin-RevId: 671425490
(cherry picked from commit 4ea58a133e)

aquilescanta · 4e8f17a7cb · 2024-09-05 19:02:28 +01:00
Do not clear the timeline after the Cast receiver disconnects
The goal is to enable the app to fetch the timeline after a
disconnection in order to prepare and resume local playback.

PiperOrigin-RevId: 671022044
(cherry picked from commit a00c446529)

ivanbuper · 4e305729a0 · 2024-09-03 14:55:00 +01:00
Bump Media3 version to 1.5.0-alpha01
PiperOrigin-RevId: 670535221
(cherry picked from commit 9562c976a9)

ivanbuper · 9c13301b0a · 2024-09-03 14:31:59 +01:00
Update release notes for Media3 1.5.0-alpha01 release
PiperOrigin-RevId: 670523759
(cherry picked from commit e16b4fff8d)
864 changed files with 11676 additions and 43391 deletions


@@ -19,8 +19,6 @@ body:
options:
- Media3 main branch
- Media3 pre-release (alpha, beta or RC not in this list)
- Media3 1.5.1
- Media3 1.5.0
- Media3 1.4.1
- Media3 1.4.0
- Media3 1.3.1

.gitignore

@@ -52,31 +52,30 @@ tmp
# External native builds
.externalNativeBuild
.cxx
# VP9 decoder extension
libraries/decoder_vp9/src/main/jni/libvpx
libraries/decoder_vp9/src/main/jni/libvpx_android_configs
libraries/decoder_vp9/src/main/jni/libyuv
# VP9 extension
extensions/vp9/src/main/jni/libvpx
extensions/vp9/src/main/jni/libvpx_android_configs
extensions/vp9/src/main/jni/libyuv
# AV1 decoder extension
libraries/decoder_av1/src/main/jni/cpu_features
libraries/decoder_av1/src/main/jni/libgav1
# AV1 extension
extensions/av1/src/main/jni/cpu_features
extensions/av1/src/main/jni/libgav1
# Opus decoder extension
libraries/decoder_opus/src/main/jni/libopus
# Opus extension
extensions/opus/src/main/jni/libopus
# FLAC decoder extension
libraries/decoder_flac/src/main/jni/flac
# FLAC extension
extensions/flac/src/main/jni/flac
# FFmpeg decoder extension
libraries/decoder_ffmpeg/src/main/jni/ffmpeg
# FFmpeg extension
extensions/ffmpeg/src/main/jni/ffmpeg
# Cronet datasource extension
libraries/datasource_cronet/jniLibs/*
!libraries/datasource_cronet/jniLibs/README.md
libraries/datasource_cronet/libs/*
!libraries/datasource_cronet/libs/README.md
# Cronet extension
extensions/cronet/jniLibs/*
!extensions/cronet/jniLibs/README.md
extensions/cronet/libs/*
!extensions/cronet/libs/README.md
# MIDI decoder extension
libraries/decoder_midi/lib
# MIDI extension
extensions/midi/lib


@@ -2,33 +2,7 @@
## 1.5
### 1.5.1 (2024-12-19)
This release includes the following changes since the
[1.5.0 release](#150-2024-11-27):
* ExoPlayer:
* Disable use of asynchronous decryption in MediaCodec to avoid reported
codec timeout issues with this platform API
([#1641](https://github.com/androidx/media/issues/1641)).
* Extractors:
* MP3: Don't stop playback early when a `VBRI` frame's table of contents
doesn't cover all the MP3 data in a file
([#1904](https://github.com/androidx/media/issues/1904)).
* Video:
* Rollback of using `MediaCodecAdapter` supplied pixel aspect ratio values
when provided while processing `onOutputFormatChanged`
([#1371](https://github.com/androidx/media/pull/1371)).
* Text:
* Fix bug in `ReplacingCuesResolver.discardCuesBeforeTimeUs` where the cue
active at `timeUs` (started before but not yet ended) was incorrectly
discarded ([#1939](https://github.com/androidx/media/issues/1939)).
* Metadata:
* Extract disc/track numbering and genre from Vorbis comments into
`MediaMetadata`
([#1958](https://github.com/androidx/media/issues/1958)).
### 1.5.0 (2024-11-27)
### 1.5.0-alpha01 (2024-09-06)
This release includes the following changes since the
[1.4.1 release](#141-2024-08-23):
@@ -48,20 +22,6 @@ This release includes the following changes since the
Kotlin-specific functionality built on top of the Common library
* Add `Player.listen` suspending extension function to spin a coroutine to
listen to `Player.Events` to the `media3-common-ktx` library.
* Remove `@DoNotInline` annotations from manually out-of-lined inner
classes designed to avoid
[runtime class verification failures](https://chromium.googlesource.com/chromium/src/+/HEAD/build/android/docs/class_verification_failures.md).
Recent versions of [R8](https://developer.android.com/build/shrink-code)
now automatically out-of-line calls like these to avoid the runtime
failures (so the manual out-of-lining is no longer required). All Gradle
users of the library must already be using a version of the Android
Gradle Plugin that uses a version of R8 which does this,
[due to `compileSdk = 35`](https://issuetracker.google.com/345472586#comment7).
Users of the library with non-Gradle build systems will need to ensure
their R8-equivalent shrinking/obfuscating step does a similar automatic
out-of-lining process in order to avoid runtime class verification
failures. This change has
[already been done in other AndroidX libraries](http://r.android.com/3156141).
* ExoPlayer:
* `MediaCodecRenderer.onProcessedStreamChange()` can now be called for
every media item. Previously it was not called for the first one. Use
@@ -86,78 +46,11 @@ This release includes the following changes since the
([#1571](https://github.com/androidx/media/issues/1571)).
* Add `AnalyticsListener.onRendererReadyChanged()` to signal when
individual renderers allow playback to be ready.
* Fix `MediaCodec.CryptoException` sometimes being reported as an
"unexpected runtime error" when `MediaCodec` is operated in asynchronous
mode (default behaviour on API 31+).
* Pass `bufferedDurationUs` instead of `bufferedPositionUs` with
`PreloadMediaSource.PreloadControl.onContinueLoadingRequested()`. Also
rename `DefaultPreloadManager.Status.STAGE_LOADED_TO_POSITION_MS` to
`DefaultPreloadManager.Status.STAGE_LOADED_FOR_DURATION_MS`. With this
IntDef, apps now pass a duration, measured from the default start
position, for which the corresponding media source should be preloaded,
instead of a position.
* Add `ForwardingRenderer` implementation that forwards all method calls
to another renderer
([#1703](https://github.com/androidx/media/pull/1703)).
* Add playlist preloading for the next item in the playlist. Apps can
enable preloading by calling
`ExoPlayer.setPreloadConfiguration(PreloadConfiguration)`.
Preloading is disabled by default. When opted in, `DefaultLoadControl`
avoids interfering with playback by restricting preloading to start and
continue only while the player is not loading for playback. Apps can
change this behaviour by implementing
`LoadControl.shouldContinuePreloading()` (for example by overriding this
method in `DefaultLoadControl`). The default implementation of
`LoadControl` disables preloading when an app uses a custom
`LoadControl` implementation.
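The preloading opt-in above can be sketched as follows. This is an illustrative sketch, not code from this diff; it assumes an Android `Context` is available and that `ExoPlayer.PreloadConfiguration` takes the target preload duration in microseconds.

```java
import android.content.Context;
import androidx.media3.exoplayer.ExoPlayer;

public final class PreloadSetup {
  public static ExoPlayer createPlayer(Context context) {
    ExoPlayer player = new ExoPlayer.Builder(context).build();
    // Opt in to preloading the next playlist item, targeting ~5 seconds of media.
    player.setPreloadConfiguration(
        new ExoPlayer.PreloadConfiguration(/* targetPreloadDurationUs= */ 5_000_000L));
    return player;
  }
}
```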
* Add method `MediaSourceEventListener.EventDispatcher.dispatchEvent()` to
allow invoking events of subclass listeners
([#1736](https://github.com/androidx/media/pull/1736)).
* Add `DefaultPreloadManager.Builder` that builds the
`DefaultPreloadManager` and `ExoPlayer` instances with consistently
shared configurations.
* Remove `Renderer[]` parameter from `LoadControl.onTracksSelected()` as
`DefaultLoadControl` implementation can retrieve the stream types from
`ExoTrackSelection[]`.
* Deprecated `DefaultLoadControl.calculateTargetBufferBytes(Renderer[],
ExoTrackSelection[])` and marked method as final to prevent overrides.
The new
`DefaultLoadControl.calculateTargetBufferBytes(ExoTrackSelection[])`
should be used instead.
* Report `MediaSourceEventListener` events from secondary sources in
`MergingMediaSource`. This will result in load
start/error/cancelled/completed events being reported for sideloaded
subtitles (those added with
`MediaItem.LocalConfiguration.subtitleConfigurations`), which may appear
as duplicate load events emitted from `AnalyticsListener`.
* Prevent subtitle & metadata errors from completely stopping playback.
Instead the problematic track is disabled and playback of the remaining
tracks continues
([#1722](https://github.com/google/ExoPlayer/issues/1722)).
* In new subtitle handling (during extraction), associated parse (e.g.
invalid subtitle data) and load errors (e.g. HTTP 404) are emitted
via `onLoadError` callbacks.
* In legacy subtitle handling (during rendering), only associated load
errors are emitted via `onLoadError` callbacks while parse errors
are silently ignored (this is pre-existing behaviour).
* Fix bug where playlist items or periods in multi-period DASH streams
with durations that don't match the actual content could cause frame
freezes at the end of the item
([#1698](https://github.com/androidx/media/issues/1698)).
* Add a setter to `SntpClient` to set the max elapsed time since the last
update after which the client is re-initialized
([#1794](https://github.com/androidx/media/pull/1794)).
* Transformer:
* Add `SurfaceAssetLoader`, which supports queueing video data to
Transformer via a `Surface`.
* `ImageAssetLoader` reports unsupported input via `AssetLoader.onError`
instead of throwing an `IllegalStateException`.
* Make setting the image duration using
`MediaItem.Builder.setImageDurationMs` mandatory for image export.
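A minimal sketch of the now-mandatory image duration; the URI and duration below are made-up example values, not taken from the diff.

```java
import androidx.media3.common.MediaItem;

public final class ImageItemFactory {
  public static MediaItem createImageItem() {
    return new MediaItem.Builder()
        .setUri("https://example.com/picture.png") // made-up URI
        .setImageDurationMs(3_000) // now mandatory for image export
        .build();
  }
}
```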
* Add export support for gaps in sequences of audio EditedMediaItems.
* Track Selection:
* `DefaultTrackSelector`: Prefer object-based audio over channel-based
audio when other factors are equal.
* Extractors:
* Allow `Mp4Extractor` and `FragmentedMp4Extractor` to identify H264
samples that are not used as reference by subsequent samples.
@@ -168,50 +61,14 @@ This release includes the following changes since the
playback at the end of the MP3 data instead of failing with
`ParserException: Searched too many bytes.{contentIsMalformed=true,
dataType=1}` ([#1563](https://github.com/androidx/media/issues/1563)).
* Fix preroll sample handling for non-keyframe media start positions when
processing edit lists in MP4 files
([#1659](https://github.com/google/ExoPlayer/issues/1659)).
* Improved frame rate calculation by using media duration from the `mdhd`
box in `Mp4Extractor` and `FragmentedMp4Extractor`
([#1531](https://github.com/androidx/media/issues/1531)).
* Fix incorrect scaling of `media_time` in MP4 edit lists. While
`segment_duration` was already correctly scaled using the movie
timescale, `media_time` is now properly scaled using the track
timescale, as specified by the MP4 format standard
([#1792](https://github.com/androidx/media/issues/1792)).
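The timescale fix above can be illustrated with plain arithmetic; the timescales below are made-up examples, not values from the diff.

```java
// Illustration (made-up numbers): why the timescale choice matters when
// converting an MP4 edit-list media_time to microseconds.
public final class EditListScaling {
  public static long toMicroseconds(long mediaTime, long timescale) {
    return mediaTime * 1_000_000L / timescale;
  }

  public static void main(String[] args) {
    long mediaTime = 44_100;      // edit-list media_time, in track timescale units
    long trackTimescale = 44_100; // e.g. an audio track ticking at its sample rate
    long movieTimescale = 1_000;  // movie header often ticks in milliseconds

    // Correct per the MP4 spec: track timescale -> 1 second.
    System.out.println(toMicroseconds(mediaTime, trackTimescale)); // 1000000
    // Incorrect: movie timescale -> 44.1 seconds, a large sync error.
    System.out.println(toMicroseconds(mediaTime, movieTimescale)); // 44100000
  }
}
```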
* Handle out-of-order frames in `endIndices` calculation for MP4 with edit
list ([#1797](https://github.com/androidx/media/issues/1797)).
* Fix media duration parsing in `mdhd` box of MP4 files to handle `-1`
values ([#1819](https://github.com/androidx/media/issues/1819)).
* Add support for identifying `h263` box in MP4 files for H.263 video
([#1821](https://github.com/androidx/media/issues/1821)).
* Add AC-4 Level-4 ISO base media file format support
([#1265](https://github.com/androidx/media/pull/1265)).
* DataSource:
* Update `HttpEngineDataSource` to allow use starting at version S
extension 7 instead of API level 34
([#1262](https://github.com/androidx/media/issues/1262)).
* `DataSourceContractTest`: Assert that `DataSource.getUri()` returns the
resolved URI (as documented). Where this is different to the requested
URI, tests can indicate this using the new
`DataSourceContractTest.TestResource.Builder.setResolvedUri()` method.
* `DataSourceContractTest`: Assert that `DataSource.getUri()` and
`getResponseHeaders()` return their 'open' value after a failed call to
`open()` (due to a 'not found' resource) and before a subsequent
`close()` call.
* Overriding `DataSourceContractTest.getNotFoundResources()` allows
test sub-classes to provide multiple 'not found' resources, and to
provide any expected headers too. This makes it possible to distinguish
between HTTP 404 (with headers) and "server not found" (no headers).
* Audio:
* Automatically configure CTA-2075 loudness metadata on the codec if
present in the media.
* Ensure smooth volume ramp down when seeking.
* Fix pop sounds that may occur during seeks.
* Fix truncation error accumulation for Sonic's
time-stretching/pitch-shifting algorithm.
* Fix bug in `SpeedChangingAudioProcessor` that causes dropped output
frames.
* Video:
* `MediaCodecVideoRenderer` avoids decoding samples that are neither
rendered nor used as reference by other samples.
@@ -222,26 +79,10 @@ This release includes the following changes since the
* Use `MediaCodecAdapter` supplied pixel aspect ratio values if provided
when processing `onOutputFormatChanged`
([#1371](https://github.com/androidx/media/pull/1371)).
* Add workaround for a device issue on Galaxy Tab S7 FE that causes 60fps
secure H264 streams to be marked as unsupported
([#1619](https://github.com/androidx/media/issues/1619)).
* Add workaround for codecs that get stuck after the last sample without
returning an end-of-stream signal.
* Text:
* Add a custom `VoiceSpan` and populate it for
[WebVTT voice spans](https://www.w3.org/TR/webvtt1/#webvtt-cue-voice-span)
([#1632](https://github.com/androidx/media/issues/1632)).
* Ensure WebVTT in HLS with very large subtitle timestamps (which overflow
a 64-bit `long` when represented as microseconds and multiplied by the
`90,000` MPEG timebase) are displayed
([#1763](https://github.com/androidx/media/issues/1763)).
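The overflow described above can be checked with plain arithmetic; the large timestamp below is a made-up example.

```java
// Illustration: a WebVTT timestamp that fits in a 64-bit microsecond
// representation can still overflow when multiplied by the 90,000 MPEG timebase.
public final class TimestampOverflow {
  public static boolean overflowsWhenScaledBy90k(long timestampUs) {
    return timestampUs > Long.MAX_VALUE / 90_000;
  }

  public static void main(String[] args) {
    long hugeTimestampUs = 200_000_000_000_000L; // ~6.3 years, representable as long
    System.out.println(overflowsWhenScaledBy90k(hugeTimestampUs)); // true
    System.out.println(overflowsWhenScaledBy90k(3_600_000_000L));  // 1 hour: false
  }
}
```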
* Support CEA-608 subtitles in Dolby Vision content
([#1820](https://github.com/androidx/media/issues/1820)).
* Fix playback hanging on DASH multi-period streams when CEA-608 subtitles
are enabled ([#1863](https://github.com/androidx/media/issues/1863)).
* Metadata:
* Assign the `C.TRACK_TYPE_METADATA` type to tracks containing icy or
vnd.dvb.ait content.
* Image:
* Add `ExternallyLoadedImageDecoder` for simplified integration with
external image loading libraries like Glide or Coil.
@@ -260,9 +101,6 @@ This release includes the following changes since the
* Fix bug where clearing the playlist may cause an
`ArrayIndexOutOfBoundsException` in
`ImaServerSideAdInsertionMediaSource`.
* Fix bug where server-side inserted DAI streams without a preroll can
result in an `ArrayIndexOutOfBoundsException` when playing past the last
midroll ([#1741](https://github.com/androidx/media/issues/1741)).
* Session:
* Add `MediaButtonReceiver.shouldStartForegroundService(Intent)` to allow
apps to suppress a play command coming in for playback resumption by
@@ -270,60 +108,9 @@ This release includes the following changes since the
playback can't be suppressed without the system crashing the service
with a `ForegroundServiceDidNotStartInTimeException`
([#1528](https://github.com/google/ExoPlayer/issues/1528)).
* Fix bug that caused custom commands sent from a `MediaBrowser` to be
dispatched to the `MediaSessionCompat.Callback` instead of the
`MediaBrowserServiceCompat` variant of the method when connected to a
legacy service. This prevented the `MediaBrowser` from receiving the
actual return value sent back by the legacy service
([#1474](https://github.com/androidx/media/issues/1474)).
* Handle `IllegalArgumentException` thrown by devices of certain
manufacturers when setting the broadcast receiver for media button
intents ([#1730](https://github.com/androidx/media/issues/1730)).
* Add command buttons for media items. This adds the Media3 API for what
was known as `Custom browse actions` in the legacy library with
`MediaBrowserCompat`. Note that with Media3, command buttons for media
items are available for both `MediaBrowser` and `MediaController`. See
[Custom Browse actions of AAOS](https://developer.android.com/training/cars/media#custom_browse_actions).
* Fix bug where a Media3 controller was sometimes unable to let a session
app start a foreground service after requesting `play()`.
* Restrict `CommandButton.Builder.setIconUri` to only accept content Uris.
* Pass connection hints of a Media3 browser to the initial
`MediaBrowserCompat` when connecting to a legacy `MediaBrowserCompat`.
The service can receive the connection hints passed in as root hints
with the first call to `onGetRoot()`.
* Fix bug where a `MediaBrowser` connected to a legacy browser service
didn't receive an error sent by the service after the browser had
subscribed to a `parentId`.
* Improve interoperability behavior, so that a Media3 browser that is
connected to a legacy `MediaBrowserService` doesn't request the children
of a `parentId` twice when subscribing to a parent.
* UI:
* Make the stretched/cropped video in
`PlayerView`-in-Compose-`AndroidView` workaround opt-in, due to issues
with XML-based shared transitions. Apps using `PlayerView` inside
`AndroidView` need to call
`PlayerView.setEnableComposeSurfaceSyncWorkaround` in order to opt-in
([#1237](https://github.com/androidx/media/issues/1237),
[#1594](https://github.com/androidx/media/issues/1594)).
* Add `setFullscreenButtonState` to `PlayerView` to allow on-demand
updates of the fullscreen button's icon, i.e. out-of-band rather than
reactively to a click interaction
([#1590](https://github.com/androidx/media/issues/1590),
[#184](https://github.com/androidx/media/issues/184)).
* Fix bug where the "None" choice in the text selection was not working
if there are app-defined text track selection preferences.
* DASH Extension:
* Add support for periods starting in the middle of a segment
([#1440](https://github.com/androidx/media/issues/1440)).
* Smooth Streaming Extension:
* Fix a `Bad magic number for Bundle` error when playing SmoothStreaming
streams with text tracks
([#1779](https://github.com/androidx/media/issues/1779)).
* RTSP Extension:
* Fix user info removal for URLs that contain encoded @ characters
([#1138](https://github.com/androidx/media/pull/1138)).
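An illustrative helper, not ExoPlayer's actual implementation: because an '@' inside credentials is percent-encoded (`%40`), the user-info delimiter is the last literal '@' before the path, which is what this sketch locates.

```java
// Illustrative helper (not ExoPlayer's code): strip user info from an RTSP URI.
public final class RtspUriUtil {
  public static String removeUserInfo(String uri) {
    int schemeEnd = uri.indexOf("://");
    if (schemeEnd == -1) {
      return uri;
    }
    int pathStart = uri.indexOf('/', schemeEnd + 3);
    int authorityEnd = pathStart == -1 ? uri.length() : pathStart;
    // '@' inside credentials is encoded as %40, so the last literal '@'
    // in the authority is the user-info delimiter.
    int at = uri.lastIndexOf('@', authorityEnd - 1);
    if (at <= schemeEnd) {
      return uri; // No user info present.
    }
    return uri.substring(0, schemeEnd + 3) + uri.substring(at + 1);
  }
}
```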
* Fix crash when parsing RTP packets with header extensions
([#1225](https://github.com/androidx/media/pull/1225)).
* Decoder Extensions (FFmpeg, VP9, AV1, etc.):
* Add the IAMF decoder module, which provides support for playback of MP4
files containing IAMF tracks using the libiamf native library to
@@ -331,10 +118,8 @@ This release includes the following changes since the
* Playback is enabled with a stereo layout as well as 5.1 with
spatialization together with optional head tracking enabled, but
binaural playback support is currently not available.
* Add 16 KB page support for decoder extensions on Android 15
([#1685](https://github.com/androidx/media/issues/1685)).
* Cast Extension:
* Stop cleaning the timeline after the CastSession disconnects, which
enables the sender app to resume playback locally after a disconnection.
* Populate CastPlayer's `DeviceInfo` when a `Context` is provided. This
enables linking the `MediaSession` to a `RoutingSession`, which is
@@ -344,33 +129,12 @@ This release includes the following changes since the
* `DataSourceContractTest` now includes tests to verify:
* Input stream `read position` is updated.
* Output buffer `offset` is applied correctly.
* Demo app:
* Resolve the memory leaks in the demo short-form app
([#1839](https://github.com/androidx/media/issues/1839)).
* Remove deprecated symbols:
* Remove deprecated `Player.hasPrevious`, `Player.hasPreviousWindow()`.
Use `Player.hasPreviousMediaItem()` instead.
* Remove deprecated `Player.previous()` method. Use
`Player.seekToPreviousMediaItem()` instead.
* Remove deprecated `DrmSessionEventListener.onDrmSessionAcquired` method.
* Remove deprecated `DefaultEncoderFactory` constructors. Use
`DefaultEncoderFactory.Builder` instead.
### 1.5.0-rc02 (2024-11-19)
Use the 1.5.0 [stable version](#150-2024-11-27).
### 1.5.0-rc01 (2024-11-13)
Use the 1.5.0 [stable version](#150-2024-11-27).
### 1.5.0-beta01 (2024-10-30)
Use the 1.5.0 [stable version](#150-2024-11-27).
### 1.5.0-alpha01 (2024-09-06)
Use the 1.5.0 [stable version](#150-2024-11-27).
## 1.4


@@ -12,8 +12,8 @@
// See the License for the specific language governing permissions and
// limitations under the License.
project.ext {
releaseVersion = '1.5.1'
releaseVersionCode = 1_005_001_3_00
releaseVersion = '1.5.0-alpha01'
releaseVersionCode = 1_005_000_0_01
minSdkVersion = 21
// See https://developer.android.com/training/cars/media/automotive-os#automotive-module
automotiveMinSdkVersion = 28
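The `releaseVersionCode` values above suggest a digit-group scheme (major/minor/bugfix/release-stage/build). The decoding below is an assumption inferred purely from the underscore grouping `1_005_001_3_00`; it is not documented in this diff.

```java
// Assumed scheme (inferred from the grouping, not documented here):
// MAJOR(1 digit) MINOR(3) BUGFIX(3) RELEASE-STAGE(1) BUILD(2).
public final class VersionCode {
  public static int[] decode(long code) {
    return new int[] {
      (int) (code / 1_000_000_000L),       // major
      (int) ((code / 1_000_000L) % 1_000), // minor
      (int) ((code / 1_000L) % 1_000),     // bugfix
      (int) ((code / 100L) % 10),          // release stage (assumed alpha/beta/rc/stable)
      (int) (code % 100)                   // build within the stage
    };
  }

  public static void main(String[] args) {
    // 1_005_001_3_00 -> 1.5.1, stage 3, build 00
    System.out.println(java.util.Arrays.toString(decode(1_005_001_3_00L)));
  }
}
```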
@@ -28,7 +28,7 @@ project.ext {
junitVersion = '4.13.2'
// Use the same Guava version as the Android repo:
// https://cs.android.com/android/platform/superproject/main/+/main:external/guava/METADATA
guavaVersion = '33.3.1-android'
guavaVersion = '33.0.0-android'
glideVersion = '4.14.2'
kotlinxCoroutinesVersion = '1.8.1'
leakCanaryVersion = '2.10'


@@ -54,7 +54,6 @@ dependencies {
implementation project(modulePrefix + 'lib-effect')
implementation project(modulePrefix + 'lib-exoplayer')
implementation project(modulePrefix + 'lib-exoplayer-dash')
implementation project(modulePrefix + 'lib-muxer')
implementation project(modulePrefix + 'lib-transformer')
implementation project(modulePrefix + 'lib-ui')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion


@@ -15,19 +15,9 @@
*/
package androidx.media3.demo.composition;
import static androidx.media3.transformer.Composition.HDR_MODE_EXPERIMENTAL_FORCE_INTERPRET_HDR_AS_SDR;
import static androidx.media3.transformer.Composition.HDR_MODE_KEEP_HDR;
import static androidx.media3.transformer.Composition.HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_MEDIACODEC;
import static androidx.media3.transformer.Composition.HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_OPEN_GL;
import android.app.Activity;
import android.content.DialogInterface;
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.CheckBox;
import android.widget.Spinner;
import android.widget.Toast;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AlertDialog;
@@ -37,16 +27,12 @@ import androidx.appcompat.widget.AppCompatCheckBox;
import androidx.appcompat.widget.AppCompatTextView;
import androidx.media3.common.Effect;
import androidx.media3.common.MediaItem;
import androidx.media3.common.MimeTypes;
import androidx.media3.common.PlaybackException;
import androidx.media3.common.Player;
import androidx.media3.common.audio.SonicAudioProcessor;
import androidx.media3.common.util.Clock;
import androidx.media3.common.util.Log;
import androidx.media3.common.util.Util;
import androidx.media3.effect.DebugTraceUtil;
import androidx.media3.effect.LanczosResample;
import androidx.media3.effect.Presentation;
import androidx.media3.effect.RgbFilter;
import androidx.media3.transformer.Composition;
import androidx.media3.transformer.CompositionPlayer;
@@ -55,7 +41,6 @@ import androidx.media3.transformer.EditedMediaItemSequence;
import androidx.media3.transformer.Effects;
import androidx.media3.transformer.ExportException;
import androidx.media3.transformer.ExportResult;
import androidx.media3.transformer.InAppMuxer;
import androidx.media3.transformer.JsonUtil;
import androidx.media3.transformer.Transformer;
import androidx.media3.ui.PlayerView;
@@ -65,7 +50,6 @@ import androidx.recyclerview.widget.RecyclerView;
import com.google.common.base.Stopwatch;
import com.google.common.base.Ticker;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
@@ -82,17 +66,6 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
private static final String TAG = "CompPreviewActivity";
private static final String AUDIO_URI =
"https://storage.googleapis.com/exoplayer-test-media-0/play.mp3";
private static final String SAME_AS_INPUT_OPTION = "same as input";
private static final ImmutableMap<String, @Composition.HdrMode Integer> HDR_MODE_DESCRIPTIONS =
new ImmutableMap.Builder<String, @Composition.HdrMode Integer>()
.put("Keep HDR", HDR_MODE_KEEP_HDR)
.put("MediaCodec tone-map HDR to SDR", HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_MEDIACODEC)
.put("OpenGL tone-map HDR to SDR", HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_OPEN_GL)
.put("Force Interpret HDR as SDR", HDR_MODE_EXPERIMENTAL_FORCE_INTERPRET_HDR_AS_SDR)
.build();
private static final ImmutableList<String> RESOLUTION_HEIGHTS =
ImmutableList.of(
SAME_AS_INPUT_OPTION, "144", "240", "360", "480", "720", "1080", "1440", "2160");
private ArrayList<String> sequenceAssetTitles;
private boolean[] selectedMediaItems;
@@ -124,25 +97,12 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
exportInformationTextView = findViewById(R.id.export_information_text);
exportButton = findViewById(R.id.composition_export_button);
exportButton.setOnClickListener(view -> showExportSettings());
exportButton.setOnClickListener(view -> exportComposition());
AppCompatCheckBox backgroundAudioCheckBox = findViewById(R.id.background_audio_checkbox);
backgroundAudioCheckBox.setOnCheckedChangeListener(
(compoundButton, checked) -> includeBackgroundAudioTrack = checked);
ArrayAdapter<String> resolutionHeightAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
resolutionHeightAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner resolutionHeightSpinner = findViewById(R.id.resolution_height_spinner);
resolutionHeightSpinner.setAdapter(resolutionHeightAdapter);
resolutionHeightAdapter.addAll(RESOLUTION_HEIGHTS);
ArrayAdapter<String> hdrModeAdapter = new ArrayAdapter<>(this, R.layout.spinner_item);
hdrModeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner hdrModeSpinner = findViewById(R.id.hdr_mode_spinner);
hdrModeSpinner.setAdapter(hdrModeAdapter);
hdrModeAdapter.addAll(HDR_MODE_DESCRIPTIONS.keySet());
AppCompatCheckBox applyVideoEffectsCheckBox = findViewById(R.id.apply_video_effects_checkbox);
applyVideoEffectsCheckBox.setOnCheckedChangeListener(
((compoundButton, checked) -> appliesVideoEffects = checked));
@@ -190,19 +150,12 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
String[] presetUris = getResources().getStringArray(/* id= */ R.array.preset_uris);
int[] presetDurationsUs = getResources().getIntArray(/* id= */ R.array.preset_durations);
List<EditedMediaItem> mediaItems = new ArrayList<>();
ImmutableList.Builder<Effect> videoEffectsBuilder = new ImmutableList.Builder<>();
if (appliesVideoEffects) {
videoEffectsBuilder.add(MatrixTransformationFactory.createDizzyCropEffect());
videoEffectsBuilder.add(RgbFilter.createGrayscaleFilter());
}
Spinner resolutionHeightSpinner = findViewById(R.id.resolution_height_spinner);
String selectedResolutionHeight = String.valueOf(resolutionHeightSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedResolutionHeight)) {
int resolutionHeight = Integer.parseInt(selectedResolutionHeight);
videoEffectsBuilder.add(LanczosResample.scaleToFit(10000, resolutionHeight));
videoEffectsBuilder.add(Presentation.createForHeight(resolutionHeight));
}
ImmutableList<Effect> videoEffects = videoEffectsBuilder.build();
ImmutableList<Effect> videoEffects =
appliesVideoEffects
? ImmutableList.of(
MatrixTransformationFactory.createDizzyCropEffect(),
RgbFilter.createGrayscaleFilter())
: ImmutableList.of();
// Preview requires all sequences to be the same duration, so calculate main sequence duration
// and limit background sequence duration to match.
long videoSequenceDurationUs = 0;
@@ -226,7 +179,7 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
mediaItems.add(itemBuilder.build());
}
}
EditedMediaItemSequence videoSequence = new EditedMediaItemSequence.Builder(mediaItems).build();
EditedMediaItemSequence videoSequence = new EditedMediaItemSequence(mediaItems);
List<EditedMediaItemSequence> compositionSequences = new ArrayList<>();
compositionSequences.add(videoSequence);
if (includeBackgroundAudioTrack) {
@@ -234,15 +187,11 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
}
SonicAudioProcessor sampleRateChanger = new SonicAudioProcessor();
sampleRateChanger.setOutputSampleRateHz(8_000);
Spinner hdrModeSpinner = findViewById(R.id.hdr_mode_spinner);
int selectedHdrMode =
HDR_MODE_DESCRIPTIONS.get(String.valueOf(hdrModeSpinner.getSelectedItem()));
return new Composition.Builder(compositionSequences)
return new Composition.Builder(/* sequences= */ compositionSequences)
.setEffects(
new Effects(
/* audioProcessors= */ ImmutableList.of(sampleRateChanger),
/* videoEffects= */ ImmutableList.of()))
.setHdrMode(selectedHdrMode)
.build();
}
@@ -258,7 +207,7 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
.build();
EditedMediaItem audioItem =
new EditedMediaItem.Builder(audioMediaItem).setDurationUs(59_000_000).build();
return new EditedMediaItemSequence.Builder(audioItem).build();
return new EditedMediaItemSequence(audioItem);
}
private void previewComposition() {
@@ -289,7 +238,7 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
new AlertDialog.Builder(/* context= */ this)
.setTitle(R.string.select_preset_title)
.setMultiChoiceItems(presetDescriptions, selectedMediaItems, this::selectPresetInDialog)
.setPositiveButton(R.string.ok, /* listener= */ null)
.setPositiveButton(android.R.string.ok, /* listener= */ null)
.setCancelable(false)
.create()
.show();
@@ -308,67 +257,7 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
}
}
private void showExportSettings() {
AlertDialog.Builder alertDialogBuilder = new AlertDialog.Builder(this);
LayoutInflater inflater = this.getLayoutInflater();
View exportSettingsDialogView = inflater.inflate(R.layout.export_settings, null);
alertDialogBuilder
.setView(exportSettingsDialogView)
.setTitle(R.string.export_settings)
.setPositiveButton(
R.string.export, (dialog, id) -> exportComposition(exportSettingsDialogView))
.setNegativeButton(R.string.cancel, (dialog, id) -> dialog.dismiss());
ArrayAdapter<String> audioMimeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
audioMimeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner audioMimeSpinner = exportSettingsDialogView.findViewById(R.id.audio_mime_spinner);
audioMimeSpinner.setAdapter(audioMimeAdapter);
audioMimeAdapter.addAll(
SAME_AS_INPUT_OPTION, MimeTypes.AUDIO_AAC, MimeTypes.AUDIO_AMR_NB, MimeTypes.AUDIO_AMR_WB);
ArrayAdapter<String> videoMimeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
videoMimeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner videoMimeSpinner = exportSettingsDialogView.findViewById(R.id.video_mime_spinner);
videoMimeSpinner.setAdapter(videoMimeAdapter);
videoMimeAdapter.addAll(
SAME_AS_INPUT_OPTION,
MimeTypes.VIDEO_H263,
MimeTypes.VIDEO_H264,
MimeTypes.VIDEO_H265,
MimeTypes.VIDEO_MP4V,
MimeTypes.VIDEO_AV1);
CheckBox enableDebugTracingCheckBox =
exportSettingsDialogView.findViewById(R.id.enable_debug_tracing_checkbox);
enableDebugTracingCheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> DebugTraceUtil.enableTracing = isChecked);
// Connect producing fragmented MP4 to using Media3 Muxer
CheckBox useMedia3MuxerCheckBox =
exportSettingsDialogView.findViewById(R.id.use_media3_muxer_checkbox);
CheckBox produceFragmentedMp4CheckBox =
exportSettingsDialogView.findViewById(R.id.produce_fragmented_mp4_checkbox);
useMedia3MuxerCheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (!isChecked) {
produceFragmentedMp4CheckBox.setChecked(false);
}
});
produceFragmentedMp4CheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (isChecked) {
useMedia3MuxerCheckBox.setChecked(true);
}
});
AlertDialog dialog = alertDialogBuilder.create();
dialog.show();
}
private void exportComposition(View exportSettingsDialogView) {
private void exportComposition() {
// Cancel and clean up files from any ongoing export.
cancelExport();
@@ -389,33 +278,8 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
}
String filePath = outputFile.getAbsolutePath();
Transformer.Builder transformerBuilder = new Transformer.Builder(/* context= */ this);
Spinner audioMimeTypeSpinner = exportSettingsDialogView.findViewById(R.id.audio_mime_spinner);
String selectedAudioMimeType = String.valueOf(audioMimeTypeSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedAudioMimeType)) {
transformerBuilder.setAudioMimeType(selectedAudioMimeType);
}
Spinner videoMimeTypeSpinner = exportSettingsDialogView.findViewById(R.id.video_mime_spinner);
String selectedVideoMimeType = String.valueOf(videoMimeTypeSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedVideoMimeType)) {
transformerBuilder.setVideoMimeType(selectedVideoMimeType);
}
CheckBox useMedia3MuxerCheckBox =
exportSettingsDialogView.findViewById(R.id.use_media3_muxer_checkbox);
CheckBox produceFragmentedMp4CheckBox =
exportSettingsDialogView.findViewById(R.id.produce_fragmented_mp4_checkbox);
if (useMedia3MuxerCheckBox.isChecked()) {
transformerBuilder.setMuxerFactory(
new InAppMuxer.Factory.Builder()
.setOutputFragmentedMp4(produceFragmentedMp4CheckBox.isChecked())
.build());
}
transformer =
transformerBuilder
new Transformer.Builder(/* context= */ this)
.addListener(
new Transformer.Listener() {
@Override
@@ -424,7 +288,6 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
long elapsedTimeMs = exportStopwatch.elapsed(TimeUnit.MILLISECONDS);
String details =
getString(R.string.export_completed, elapsedTimeMs / 1000.f, filePath);
Log.d(TAG, DebugTraceUtil.generateTraceSummary());
Log.i(TAG, details);
exportInformationTextView.setText(details);
@@ -453,7 +316,6 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
Toast.LENGTH_LONG)
.show();
Log.e(TAG, "Export error", exportException);
Log.d(TAG, DebugTraceUtil.generateTraceSummary());
exportInformationTextView.setText(R.string.export_error);
}
})

View file

@@ -111,49 +111,7 @@
android:text="@string/add_background_audio"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintBottom_toTopOf="@id/resolution_height_setting" />
<LinearLayout
android:id="@+id/resolution_height_setting"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="8dp"
app:layout_constraintBottom_toTopOf="@id/hdr_mode_setting">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/output_video_resolution"/>
<Spinner
android:id="@+id/resolution_height_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:id="@+id/hdr_mode_setting"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="12dp"
app:layout_constraintBottom_toTopOf="@id/preview_button">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/hdr_mode" />
<Spinner
android:id="@+id/hdr_mode_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
app:layout_constraintBottom_toTopOf="@id/preview_button" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/composition_export_button"

View file

@@ -1,110 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright 2024 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/export_settings_list"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
android:padding="8dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="12dp"
android:layout_marginTop="12dp">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/output_audio_mime_type"/>
<Spinner
android:id="@+id/audio_mime_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="12dp">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/output_video_mime_type"/>
<Spinner
android:id="@+id/video_mime_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/enable_debug_tracing"/>
<CheckBox
android:id="@+id/enable_debug_tracing_checkbox"
android:layout_gravity="end"
android:checked="false"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical">
<TextView
android:text="@string/use_media3_muxer"
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1" />
<CheckBox
android:id="@+id/use_media3_muxer_checkbox"
android:layout_gravity="end"
android:checked="false"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical">
<TextView
android:text="@string/produce_fragmented_mp4"
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1" />
<CheckBox
android:id="@+id/produce_fragmented_mp4_checkbox"
android:layout_gravity="end"
android:checked="false"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
</LinearLayout>

View file

@@ -1,25 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright 2024 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<TextView
xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="wrap_content"
android:layout_height="32dp"
android:gravity="start|center_vertical"
android:paddingLeft="4dp"
android:paddingRight="4dp"
android:layout_marginLeft="4dp"
android:layout_marginRight="4dp"
android:textIsSelectable="false" />

View file

@@ -26,14 +26,4 @@
<string name="export_error" translatable="false">Export error</string>
<string name="export_started" translatable="false">Export started</string>
<string name="add_background_audio" translatable="false">Add background audio</string>
<string name="output_video_resolution" translatable="false">Output video resolution</string>
<string name="hdr_mode" translatable="false">HDR mode</string>
<string name="ok" translatable="false">OK</string>
<string name="cancel" translatable="false">Cancel</string>
<string name="export_settings" translatable="false">Export Settings</string>
<string name="output_audio_mime_type" translatable="false">Output audio MIME type</string>
<string name="output_video_mime_type" translatable="false">Output video MIME type</string>
<string name="enable_debug_tracing" translatable="false">Enable debug tracing</string>
<string name="use_media3_muxer" translatable="false">Use Media3 muxer</string>
<string name="produce_fragmented_mp4" translatable="false">Produce fragmented MP4</string>
</resources>

View file

@@ -50,6 +50,16 @@ public final class DemoUtil {
public static final String DOWNLOAD_NOTIFICATION_CHANNEL_ID = "download_channel";
/**
* Whether the demo application uses Cronet for networking when {@link HttpEngine} is not
* supported. Note that Cronet does not provide automatic support for cookies
* (https://github.com/google/ExoPlayer/issues/5975).
*
* <p>If set to false, the {@link DefaultHttpDataSource} is used with a {@link CookieManager}
* configured in {@link #getHttpDataSourceFactory} when {@link HttpEngine} is not supported.
*/
private static final boolean ALLOW_CRONET_FOR_NETWORKING = true;
private static final String TAG = "DemoUtil";
private static final String DOWNLOAD_CONTENT_DIRECTORY = "downloads";
@@ -104,13 +114,16 @@ public final class DemoUtil {
new HttpEngineDataSource.Factory(httpEngine, Executors.newSingleThreadExecutor());
return httpDataSourceFactory;
}
@Nullable CronetEngine cronetEngine = CronetUtil.buildCronetEngine(context);
if (cronetEngine != null) {
httpDataSourceFactory =
new CronetDataSource.Factory(cronetEngine, Executors.newSingleThreadExecutor());
return httpDataSourceFactory;
if (ALLOW_CRONET_FOR_NETWORKING) {
@Nullable CronetEngine cronetEngine = CronetUtil.buildCronetEngine(context);
if (cronetEngine != null) {
httpDataSourceFactory =
new CronetDataSource.Factory(cronetEngine, Executors.newSingleThreadExecutor());
return httpDataSourceFactory;
}
}
// The device doesn't support HttpEngine and we failed to instantiate a CronetEngine.
// The device doesn't support HttpEngine or we don't want to allow Cronet, or we failed to
// instantiate a CronetEngine.
CookieManager cookieManager = new CookieManager();
cookieManager.setCookiePolicy(CookiePolicy.ACCEPT_ORIGINAL_SERVER);
CookieHandler.setDefault(cookieManager);
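The selection order in the hunk above — HttpEngine when the platform supports it, then Cronet when allowed and available, then the default HTTP stack with an explicit `CookieManager` — can be modeled in plain Java. This is an illustrative sketch; the class and method names below are not Media3 API:

```java
/**
 * Minimal model of the data-source selection order sketched above: HttpEngine when
 * supported, Cronet when allowed and available, otherwise the default HTTP stack
 * (which needs an explicit CookieManager because it has no automatic cookie support).
 */
public class HttpStackSelection {
  public static String chooseHttpStack(
      boolean httpEngineSupported, boolean allowCronet, boolean cronetAvailable) {
    if (httpEngineSupported) {
      return "HttpEngineDataSource";
    }
    if (allowCronet && cronetAvailable) {
      return "CronetDataSource";
    }
    // Neither engine is usable: fall back to DefaultHttpDataSource plus a CookieManager.
    return "DefaultHttpDataSource";
  }
}
```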

View file

@@ -45,6 +45,7 @@ import android.widget.ExpandableListView.OnChildClickListener;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;
import androidx.annotation.DoNotInline;
import androidx.annotation.Nullable;
import androidx.annotation.OptIn;
import androidx.annotation.RequiresApi;
@@ -666,6 +667,7 @@ public class SampleChooserActivity extends AppCompatActivity
@RequiresApi(33)
private static class Api33 {
@DoNotInline
public static String getPostNotificationPermissionString() {
return Manifest.permission.POST_NOTIFICATIONS;
}

View file

@@ -15,13 +15,16 @@
*/
package androidx.media3.demo.shortform
import android.content.Context
import android.os.Handler
import android.os.Looper
import androidx.annotation.OptIn
import androidx.media3.common.Player
import androidx.media3.common.util.UnstableApi
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager.Builder
import androidx.media3.exoplayer.LoadControl
import androidx.media3.exoplayer.RenderersFactory
import androidx.media3.exoplayer.upstream.BandwidthMeter
import androidx.media3.exoplayer.util.EventLogger
import com.google.common.collect.BiMap
import com.google.common.collect.HashBiMap
@@ -31,7 +34,14 @@ import java.util.LinkedList
import java.util.Queue
@OptIn(UnstableApi::class)
class PlayerPool(private val numberOfPlayers: Int, preloadManagerBuilder: Builder) {
class PlayerPool(
private val numberOfPlayers: Int,
context: Context,
playbackLooper: Looper,
loadControl: LoadControl,
renderersFactory: RenderersFactory,
bandwidthMeter: BandwidthMeter,
) {
/** Creates a player instance to be used by the pool. */
interface PlayerFactory {
@@ -42,7 +52,8 @@ class PlayerPool(private val numberOfPlayers: Int, preloadManagerBuilder: Builde
private val availablePlayerQueue: Queue<Int> = LinkedList()
private val playerMap: BiMap<Int, ExoPlayer> = Maps.synchronizedBiMap(HashBiMap.create())
private val playerRequestTokenSet: MutableSet<Int> = Collections.synchronizedSet(HashSet<Int>())
private val playerFactory: PlayerFactory = DefaultPlayerFactory(preloadManagerBuilder)
private val playerFactory: PlayerFactory =
DefaultPlayerFactory(context, playbackLooper, loadControl, renderersFactory, bandwidthMeter)
fun acquirePlayer(token: Int, callback: (ExoPlayer) -> Unit) {
synchronized(playerMap) {
@@ -115,11 +126,23 @@ class PlayerPool(private val numberOfPlayers: Int, preloadManagerBuilde
}
@OptIn(UnstableApi::class)
private class DefaultPlayerFactory(private val preloadManagerBuilder: Builder) : PlayerFactory {
private class DefaultPlayerFactory(
private val context: Context,
private val playbackLooper: Looper,
private val loadControl: LoadControl,
private val renderersFactory: RenderersFactory,
private val bandwidthMeter: BandwidthMeter,
) : PlayerFactory {
private var playerCounter = 0
override fun createPlayer(): ExoPlayer {
val player = preloadManagerBuilder.buildExoPlayer()
val player =
ExoPlayer.Builder(context)
.setPlaybackLooper(playbackLooper)
.setLoadControl(loadControl)
.setRenderersFactory(renderersFactory)
.setBandwidthMeter(bandwidthMeter)
.build()
player.addAnalyticsListener(EventLogger("player-$playerCounter"))
playerCounter++
player.repeatMode = ExoPlayer.REPEAT_MODE_ONE
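The pool that `DefaultPlayerFactory` feeds can be approximated by a small bounded pool. The plain-Java sketch below is illustrative bookkeeping only: the real `PlayerPool` queues a callback and a request token when exhausted rather than returning null, and it creates `ExoPlayer` instances instead of arbitrary objects:

```java
import java.util.ArrayDeque;
import java.util.function.Supplier;

/** Bounded object pool: at most `capacity` instances exist at once. */
public class BoundedPool<T> {
  private final int capacity;
  private final Supplier<T> factory;
  private final ArrayDeque<T> free = new ArrayDeque<>();
  private int created = 0;

  public BoundedPool(int capacity, Supplier<T> factory) {
    this.capacity = capacity;
    this.factory = factory;
  }

  /** Returns a free instance, creating one up to the cap; null when exhausted. */
  public synchronized T acquire() {
    if (!free.isEmpty()) {
      return free.poll();
    }
    if (created < capacity) {
      created++;
      return factory.get();
    }
    return null;
  }

  /** Returns an instance to the pool for reuse. */
  public synchronized void release(T instance) {
    free.push(instance);
  }
}
```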

View file

@@ -25,7 +25,7 @@ import androidx.viewpager2.widget.ViewPager2
class ViewPagerActivity : AppCompatActivity() {
private lateinit var viewPagerView: ViewPager2
private lateinit var onPageChangeCallback: ViewPager2.OnPageChangeCallback
private lateinit var adapter: ViewPagerMediaAdapter
private var numberOfPlayers = 3
private var mediaItemDatabase = MediaItemDatabase()
@@ -40,24 +40,23 @@
Log.d(TAG, "Using a pool of $numberOfPlayers players")
viewPagerView = findViewById(R.id.viewPager)
viewPagerView.offscreenPageLimit = 1
}
override fun onStart() {
super.onStart()
val adapter = ViewPagerMediaAdapter(mediaItemDatabase, numberOfPlayers, applicationContext)
viewPagerView.adapter = adapter
onPageChangeCallback =
viewPagerView.registerOnPageChangeCallback(
object : ViewPager2.OnPageChangeCallback() {
override fun onPageSelected(position: Int) {
adapter.onPageSelected(position)
}
}
viewPagerView.registerOnPageChangeCallback(onPageChangeCallback)
)
}
override fun onStart() {
super.onStart()
adapter = ViewPagerMediaAdapter(mediaItemDatabase, numberOfPlayers, this)
viewPagerView.adapter = adapter
}
override fun onStop() {
viewPagerView.unregisterOnPageChangeCallback(onPageChangeCallback)
viewPagerView.adapter = null
adapter.onDestroy()
super.onStop()
}
}

View file

@@ -16,6 +16,8 @@
package androidx.media3.demo.shortform.viewpager
import android.content.Context
import android.os.HandlerThread
import android.os.Process
import android.view.LayoutInflater
import android.view.ViewGroup
import androidx.annotation.OptIn
@@ -27,9 +29,14 @@ import androidx.media3.demo.shortform.MediaItemDatabase
import androidx.media3.demo.shortform.PlayerPool
import androidx.media3.demo.shortform.R
import androidx.media3.exoplayer.DefaultLoadControl
import androidx.media3.exoplayer.DefaultRendererCapabilitiesList
import androidx.media3.exoplayer.DefaultRenderersFactory
import androidx.media3.exoplayer.source.DefaultMediaSourceFactory
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager.Status.STAGE_LOADED_FOR_DURATION_MS
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager.Status.STAGE_LOADED_TO_POSITION_MS
import androidx.media3.exoplayer.source.preload.TargetPreloadStatusControl
import androidx.media3.exoplayer.trackselection.DefaultTrackSelector
import androidx.media3.exoplayer.upstream.DefaultBandwidthMeter
import androidx.recyclerview.widget.RecyclerView
import kotlin.math.abs
@@ -39,11 +46,13 @@ class ViewPagerMediaAdapter(
numberOfPlayers: Int,
context: Context,
) : RecyclerView.Adapter<ViewPagerMediaHolder>() {
private val playbackThread: HandlerThread =
HandlerThread("playback-thread", Process.THREAD_PRIORITY_AUDIO)
private val preloadManager: DefaultPreloadManager
private val currentMediaItemsAndIndexes: ArrayDeque<Pair<MediaItem, Int>> = ArrayDeque()
private var playerPool: PlayerPool
private val holderMap: MutableMap<Int, ViewPagerMediaHolder>
private val preloadControl: DefaultPreloadControl
private var currentPlayingIndex: Int = C.INDEX_UNSET
companion object {
private const val TAG = "ViewPagerMediaAdapter"
@@ -55,6 +64,7 @@
}
init {
playbackThread.start()
val loadControl =
DefaultLoadControl.Builder()
.setBufferDurationsMs(
@@ -65,26 +75,35 @@
)
.setPrioritizeTimeOverSizeThresholds(true)
.build()
preloadControl = DefaultPreloadControl()
val preloadManagerBuilder =
DefaultPreloadManager.Builder(context.applicationContext, preloadControl)
.setLoadControl(loadControl)
playerPool = PlayerPool(numberOfPlayers, preloadManagerBuilder)
val renderersFactory = DefaultRenderersFactory(context)
playerPool =
PlayerPool(
numberOfPlayers,
context,
playbackThread.looper,
loadControl,
renderersFactory,
DefaultBandwidthMeter.getSingletonInstance(context),
)
holderMap = mutableMapOf()
preloadManager = preloadManagerBuilder.build()
val trackSelector = DefaultTrackSelector(context)
trackSelector.init({}, DefaultBandwidthMeter.getSingletonInstance(context))
preloadManager =
DefaultPreloadManager(
DefaultPreloadControl(),
DefaultMediaSourceFactory(context),
trackSelector,
DefaultBandwidthMeter.getSingletonInstance(context),
DefaultRendererCapabilitiesList.Factory(renderersFactory),
loadControl.allocator,
playbackThread.looper,
)
for (i in 0 until MANAGED_ITEM_COUNT) {
addMediaItem(index = i, isAddingToRightEnd = true)
}
preloadManager.invalidate()
}
override fun onDetachedFromRecyclerView(recyclerView: RecyclerView) {
playerPool.destroyPlayers()
preloadManager.release()
holderMap.clear()
super.onDetachedFromRecyclerView(recyclerView)
}
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): ViewPagerMediaHolder {
val view =
LayoutInflater.from(parent.context).inflate(R.layout.media_item_view_pager, parent, false)
@@ -137,9 +156,15 @@
return Int.MAX_VALUE
}
fun onDestroy() {
preloadManager.release()
playerPool.destroyPlayers()
playbackThread.quit()
}
fun onPageSelected(position: Int) {
currentPlayingIndex = position
holderMap[position]?.playIfPossible()
preloadControl.currentPlayingIndex = position
preloadManager.setCurrentPlayingIndex(position)
preloadManager.invalidate()
}
@@ -172,14 +197,12 @@
preloadManager.remove(itemAndIndex.first)
}
inner class DefaultPreloadControl(var currentPlayingIndex: Int = C.INDEX_UNSET) :
TargetPreloadStatusControl<Int> {
inner class DefaultPreloadControl : TargetPreloadStatusControl<Int> {
override fun getTargetPreloadStatus(rankingData: Int): DefaultPreloadManager.Status? {
if (abs(rankingData - currentPlayingIndex) == 2) {
return DefaultPreloadManager.Status(STAGE_LOADED_FOR_DURATION_MS, 500L)
return DefaultPreloadManager.Status(STAGE_LOADED_TO_POSITION_MS, 500L)
} else if (abs(rankingData - currentPlayingIndex) == 1) {
return DefaultPreloadManager.Status(STAGE_LOADED_FOR_DURATION_MS, 1000L)
return DefaultPreloadManager.Status(STAGE_LOADED_TO_POSITION_MS, 1000L)
}
return null
}
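The ranking logic above preloads by distance from the current page: items two positions away get a ~500 ms target, immediate neighbors ~1000 ms, and everything else is skipped. A standalone sketch of that policy (the `Long` return value stands in for the target duration; null means do not preload):

```java
public class PreloadPolicy {
  /** Target preload duration in ms for an item, or null to skip preloading it. */
  public static Long targetPreloadMs(int rankingData, int currentPlayingIndex) {
    int distance = Math.abs(rankingData - currentPlayingIndex);
    if (distance == 2) {
      return 500L; // second-nearest items: small head start
    }
    if (distance == 1) {
      return 1000L; // immediate neighbors: preload further
    }
    return null; // everything else is left alone
  }
}
```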

View file

@@ -0,0 +1 @@
../../proguard-rules.txt

View file

@@ -42,7 +42,7 @@
android:background="@color/purple_700"
android:gravity="center"
android:hint="@string/num_of_players"
android:inputType="number"
android:inputType="numberDecimal"
android:textColorHint="@color/grey" />
</LinearLayout>

View file

@@ -62,8 +62,8 @@ public final class MainActivity extends Activity {
private boolean isOwner;
@Nullable private LegacyPlayerControlView playerControlView;
@Nullable private SurfaceView fullscreenView;
@Nullable private SurfaceView nonFullscreenView;
@Nullable private SurfaceView fullScreenView;
@Nullable private SurfaceView nonFullScreenView;
@Nullable private SurfaceView currentOutputView;
@Nullable private static ExoPlayer player;
@@ -75,13 +75,13 @@
super.onCreate(savedInstanceState);
setContentView(R.layout.main_activity);
playerControlView = findViewById(R.id.player_control_view);
fullscreenView = findViewById(R.id.full_screen_view);
fullscreenView.setOnClickListener(
fullScreenView = findViewById(R.id.full_screen_view);
fullScreenView.setOnClickListener(
v -> {
setCurrentOutputView(nonFullscreenView);
Assertions.checkNotNull(fullscreenView).setVisibility(View.GONE);
setCurrentOutputView(nonFullScreenView);
Assertions.checkNotNull(fullScreenView).setVisibility(View.GONE);
});
attachSurfaceListener(fullscreenView);
attachSurfaceListener(fullScreenView);
isOwner = getIntent().getBooleanExtra(OWNER_EXTRA, /* defaultValue= */ true);
GridLayout gridLayout = findViewById(R.id.grid_layout);
for (int i = 0; i < 9; i++) {
@@ -97,8 +97,8 @@
button.setText(getString(R.string.full_screen_label));
button.setOnClickListener(
v -> {
setCurrentOutputView(fullscreenView);
Assertions.checkNotNull(fullscreenView).setVisibility(View.VISIBLE);
setCurrentOutputView(fullScreenView);
Assertions.checkNotNull(fullScreenView).setVisibility(View.VISIBLE);
});
} else if (i == 2) {
Button button = new Button(/* context= */ this);
@@ -116,10 +116,10 @@
surfaceView.setOnClickListener(
v -> {
setCurrentOutputView(surfaceView);
nonFullscreenView = surfaceView;
nonFullScreenView = surfaceView;
});
if (nonFullscreenView == null) {
nonFullscreenView = surfaceView;
if (nonFullScreenView == null) {
nonFullScreenView = surfaceView;
}
}
gridLayout.addView(view);
@@ -144,7 +144,7 @@
initializePlayer();
}
setCurrentOutputView(nonFullscreenView);
setCurrentOutputView(nonFullScreenView);
LegacyPlayerControlView playerControlView = Assertions.checkNotNull(this.playerControlView);
playerControlView.setPlayer(player);

View file

@@ -257,12 +257,13 @@ public final class ConfigurationActivity extends AppCompatActivity {
videoMimeSpinner = findViewById(R.id.video_mime_spinner);
videoMimeSpinner.setAdapter(videoMimeAdapter);
videoMimeAdapter.addAll(
SAME_AS_INPUT_OPTION,
MimeTypes.VIDEO_H263,
MimeTypes.VIDEO_H264,
MimeTypes.VIDEO_H265,
MimeTypes.VIDEO_MP4V,
MimeTypes.VIDEO_AV1);
SAME_AS_INPUT_OPTION, MimeTypes.VIDEO_H263, MimeTypes.VIDEO_H264, MimeTypes.VIDEO_MP4V);
if (SDK_INT >= 24) {
videoMimeAdapter.add(MimeTypes.VIDEO_H265);
}
if (SDK_INT >= 34) {
videoMimeAdapter.add(MimeTypes.VIDEO_AV1);
}
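The SDK-gated list built above can be modeled as a pure function of the API level. The MIME string values below assume the usual Media3 `MimeTypes` constants (e.g. `VIDEO_H265` is "video/hevc" and `VIDEO_AV1` is "video/av01"); treat them as illustrative:

```java
import java.util.ArrayList;
import java.util.List;

/** Model of the SDK-gated video MIME type options above. */
public class VideoMimeOptions {
  public static List<String> forSdk(int sdkInt) {
    List<String> options = new ArrayList<>();
    options.add("Same as input");
    options.add("video/3gpp"); // H.263
    options.add("video/avc"); // H.264
    options.add("video/mp4v-es"); // MPEG-4 Part 2
    if (sdkInt >= 24) {
      options.add("video/hevc"); // H.265 is only offered from API 24
    }
    if (sdkInt >= 34) {
      options.add("video/av01"); // AV1 is only offered from API 34
    }
    return options;
  }
}
```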
ArrayAdapter<String> resolutionHeightAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
@@ -301,18 +302,6 @@
abortSlowExportCheckBox = findViewById(R.id.abort_slow_export_checkbox);
useMedia3Muxer = findViewById(R.id.use_media3_muxer_checkbox);
produceFragmentedMp4CheckBox = findViewById(R.id.produce_fragmented_mp4_checkbox);
useMedia3Muxer.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (!isChecked) {
produceFragmentedMp4CheckBox.setChecked(false);
}
});
produceFragmentedMp4CheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (isChecked) {
useMedia3Muxer.setChecked(true);
}
});
ArrayAdapter<String> hdrModeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);

View file

@@ -84,8 +84,10 @@ import androidx.media3.exoplayer.DefaultLoadControl;
import androidx.media3.exoplayer.ExoPlayer;
import androidx.media3.exoplayer.audio.SilenceSkippingAudioProcessor;
import androidx.media3.exoplayer.util.DebugTextViewHelper;
import androidx.media3.muxer.Muxer;
import androidx.media3.transformer.Composition;
import androidx.media3.transformer.DefaultEncoderFactory;
import androidx.media3.transformer.DefaultMuxer;
import androidx.media3.transformer.EditedMediaItem;
import androidx.media3.transformer.EditedMediaItemSequence;
import androidx.media3.transformer.Effects;
@@ -119,8 +121,6 @@ import org.json.JSONObject;
/** An {@link Activity} that exports and plays media using {@link Transformer}. */
public final class TransformerActivity extends AppCompatActivity {
private static final String TAG = "TransformerActivity";
private static final int IMAGE_DURATION_MS = 5_000;
private static final int IMAGE_FRAME_RATE_FPS = 30;
private static int LOAD_CONTROL_MIN_BUFFER_MS = 5_000;
private static int LOAD_CONTROL_MAX_BUFFER_MS = 5_000;
@@ -267,8 +267,7 @@
}
private MediaItem createMediaItem(@Nullable Bundle bundle, Uri uri) {
MediaItem.Builder mediaItemBuilder =
new MediaItem.Builder().setUri(uri).setImageDurationMs(IMAGE_DURATION_MS);
MediaItem.Builder mediaItemBuilder = new MediaItem.Builder().setUri(uri);
if (bundle != null) {
long trimStartMs =
bundle.getLong(ConfigurationActivity.TRIM_START_MS, /* defaultValue= */ C.TIME_UNSET);
@@ -323,13 +322,14 @@
transformerBuilder.setMaxDelayBetweenMuxerSamplesMs(C.TIME_UNSET);
}
Muxer.Factory muxerFactory = new DefaultMuxer.Factory();
if (bundle.getBoolean(ConfigurationActivity.USE_MEDIA3_MUXER)) {
transformerBuilder.setMuxerFactory(
new InAppMuxer.Factory.Builder()
.setOutputFragmentedMp4(
bundle.getBoolean(ConfigurationActivity.PRODUCE_FRAGMENTED_MP4))
.build());
muxerFactory = new InAppMuxer.Factory.Builder().build();
}
if (bundle.getBoolean(ConfigurationActivity.PRODUCE_FRAGMENTED_MP4)) {
muxerFactory = new InAppMuxer.Factory.Builder().setOutputFragmentedMp4(true).build();
}
transformerBuilder.setMuxerFactory(muxerFactory);
if (bundle.getBoolean(ConfigurationActivity.ENABLE_DEBUG_PREVIEW)) {
transformerBuilder.setDebugViewProvider(new DemoDebugViewProvider());
@@ -359,7 +359,7 @@
private Composition createComposition(MediaItem mediaItem, @Nullable Bundle bundle) {
EditedMediaItem.Builder editedMediaItemBuilder = new EditedMediaItem.Builder(mediaItem);
// For image inputs. Automatically ignored if input is audio/video.
editedMediaItemBuilder.setFrameRate(IMAGE_FRAME_RATE_FPS);
editedMediaItemBuilder.setDurationUs(5_000_000).setFrameRate(30);
if (bundle != null) {
ImmutableList<AudioProcessor> audioProcessors = createAudioProcessorsFromBundle(bundle);
ImmutableList<Effect> videoEffects = createVideoEffectsFromBundle(bundle);
@@ -371,8 +371,7 @@
.setEffects(new Effects(audioProcessors, videoEffects));
}
Composition.Builder compositionBuilder =
new Composition.Builder(
new EditedMediaItemSequence.Builder(editedMediaItemBuilder.build()).build());
new Composition.Builder(new EditedMediaItemSequence(editedMediaItemBuilder.build()));
if (bundle != null) {
compositionBuilder
.setHdrMode(bundle.getInt(ConfigurationActivity.HDR_MODE))

View file

@@ -33,6 +33,7 @@ import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.TextureView;
import androidx.annotation.DoNotInline;
import androidx.annotation.IntRange;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
@@ -1585,6 +1586,7 @@ public final class CastPlayer extends BasePlayer {
}
/** Acquires necessary resources and registers callbacks. */
@DoNotInline
public void initialize() {
mediaRouter2.registerTransferCallback(handler::post, transferCallback);
// We need at least one route callback registered in order to get transfer callback updates.
@@ -1599,6 +1601,7 @@
* Releases any resources acquired in {@link #initialize()} and unregisters any registered
* callbacks.
*/
@DoNotInline
public void release() {
mediaRouter2.unregisterTransferCallback(transferCallback);
mediaRouter2.unregisterRouteCallback(emptyRouteCallback);
@@ -1606,6 +1609,7 @@
}
/** Updates the device info with an up-to-date value and notifies the listeners. */
@DoNotInline
private void updateDeviceInfo() {
DeviceInfo oldDeviceInfo = deviceInfo;
DeviceInfo newDeviceInfo = fetchDeviceInfo();
@@ -1620,6 +1624,7 @@
* Returns a {@link DeviceInfo} with the {@link RoutingController#getId() id} that corresponds
* to the Cast session, or {@link #DEVICE_INFO_REMOTE_EMPTY} if not available.
*/
@DoNotInline
public DeviceInfo fetchDeviceInfo() {
// TODO: b/364833997 - Fetch this information from the AndroidX MediaRouter selected route
// once the selected route id matches the controller id.

View file

@@ -16,6 +16,7 @@
package androidx.media3.common;
import android.os.Bundle;
import androidx.annotation.DoNotInline;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import androidx.media3.common.util.UnstableApi;
@@ -239,6 +240,7 @@ public final class AudioAttributes {
@RequiresApi(29)
private static final class Api29 {
@DoNotInline
public static void setAllowedCapturePolicy(
android.media.AudioAttributes.Builder builder,
@C.AudioAllowedCapturePolicy int allowedCapturePolicy) {
@@ -248,6 +250,7 @@
@RequiresApi(32)
private static final class Api32 {
@DoNotInline
public static void setSpatializationBehavior(
android.media.AudioAttributes.Builder builder,
@C.SpatializationBehavior int spatializationBehavior) {

View file

@@ -203,7 +203,7 @@ public final class ColorInfo {
/**
* Returns the {@link C.ColorSpace} corresponding to the given ISO color primary code, as per
* table A.7.21.1 in Rec. ITU-T T.832 (06/2019), or {@link Format#NO_VALUE} if no mapping can be
* table A.7.21.1 in Rec. ITU-T T.832 (03/2009), or {@link Format#NO_VALUE} if no mapping can be
* made.
*/
@Pure
@@ -219,52 +219,13 @@
case 9:
return C.COLOR_SPACE_BT2020;
default:
// Remaining color primaries are either reserved or unspecified.
return Format.NO_VALUE;
}
}
/**
* Returns the ISO color primary code corresponding to the given {@link C.ColorSpace}, as per
* table A.7.21.1 in Rec. ITU-T T.832 (06/2019).
*/
public static int colorSpaceToIsoColorPrimaries(@C.ColorSpace int colorSpace) {
switch (colorSpace) {
// Default to BT.709 SDR as per the <a
// href="https://www.webmproject.org/vp9/mp4/#optional-fields">recommendation</a>.
case Format.NO_VALUE:
case C.COLOR_SPACE_BT709:
return 1;
case C.COLOR_SPACE_BT601:
return 5;
case C.COLOR_SPACE_BT2020:
return 9;
}
return 1;
}
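The method removed here is a plain table lookup. As a standalone sketch of the same mapping (the constants below are illustrative stand-ins, not the real `C.COLOR_SPACE_*` or `Format.NO_VALUE` values): BT.601 maps to ISO code 5, BT.2020 to 9, and both BT.709 and the unset case default to code 1, the BT.709 SDR code recommended for VP9-in-MP4:

```java
/** Plain-Java model of colorSpaceToIsoColorPrimaries; constants are stand-ins. */
public class ColorPrimariesTable {
  public static final int NO_VALUE = -1; // stand-in for Format.NO_VALUE
  public static final int COLOR_SPACE_BT709 = 1; // stand-ins for C.COLOR_SPACE_*
  public static final int COLOR_SPACE_BT601 = 2;
  public static final int COLOR_SPACE_BT2020 = 6;

  /** C.ColorSpace -> ISO color primaries code, per table A.7.21.1 in Rec. ITU-T T.832. */
  public static int colorSpaceToIsoColorPrimaries(int colorSpace) {
    switch (colorSpace) {
      case COLOR_SPACE_BT601:
        return 5;
      case COLOR_SPACE_BT2020:
        return 9;
      case NO_VALUE: // default to BT.709 SDR
      case COLOR_SPACE_BT709:
      default:
        return 1;
    }
  }
}
```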
/**
* Returns the ISO matrix coefficients code corresponding to the given {@link C.ColorSpace}, as
* per table A.7.21.3 in Rec. ITU-T T.832 (06/2019).
*/
public static int colorSpaceToIsoMatrixCoefficients(@C.ColorSpace int colorSpace) {
switch (colorSpace) {
// Default to BT.709 SDR as per the <a
// href="https://www.webmproject.org/vp9/mp4/#optional-fields">recommendation</a>.
case Format.NO_VALUE:
case C.COLOR_SPACE_BT709:
return 1;
case C.COLOR_SPACE_BT601:
return 6;
case C.COLOR_SPACE_BT2020:
return 9;
}
return 1;
}
/**
* Returns the {@link C.ColorTransfer} corresponding to the given ISO transfer characteristics
* code, as per table A.7.21.2 in Rec. ITU-T T.832 (06/2019), or {@link Format#NO_VALUE} if no
* code, as per table A.7.21.2 in Rec. ITU-T T.832 (03/2009), or {@link Format#NO_VALUE} if no
* mapping can be made.
*/
@Pure
@ -288,31 +249,6 @@ public final class ColorInfo {
}
}
/**
* Returns the ISO transfer characteristics code corresponding to the given {@link
* C.ColorTransfer}, as per table A.7.21.2 in Rec. ITU-T T.832 (06/2019).
*/
public static int colorTransferToIsoTransferCharacteristics(@C.ColorTransfer int colorTransfer) {
switch (colorTransfer) {
// Default to BT.709 SDR as per the <a
// href="https://www.webmproject.org/vp9/mp4/#optional-fields">recommendation</a>.
case C.COLOR_TRANSFER_LINEAR:
return 8;
case C.COLOR_TRANSFER_SRGB:
return 13;
case Format.NO_VALUE:
case C.COLOR_TRANSFER_SDR:
return 1;
case C.COLOR_TRANSFER_ST2084:
return 16;
case C.COLOR_TRANSFER_HLG:
return 18;
case C.COLOR_TRANSFER_GAMMA_2_2:
return 4;
}
return 1;
}
/**
* Returns whether the {@code ColorInfo} uses an HDR {@link C.ColorTransfer}.
*


@ -16,7 +16,6 @@
package androidx.media3.common;
import static androidx.media3.common.util.Assertions.checkState;
import static com.google.common.math.DoubleMath.fuzzyEquals;
import static java.lang.annotation.ElementType.TYPE_USE;
import android.os.Bundle;
@ -28,7 +27,6 @@ import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import com.google.common.base.Joiner;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.Lists;
import com.google.errorprone.annotations.CanIgnoreReturnValue;
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
@ -167,7 +165,6 @@ public final class Format {
@Nullable private List<byte[]> initializationData;
@Nullable private DrmInitData drmInitData;
private long subsampleOffsetUs;
private boolean hasPrerollSamples;
// Video specific.
@ -258,7 +255,6 @@ public final class Format {
this.initializationData = format.initializationData;
this.drmInitData = format.drmInitData;
this.subsampleOffsetUs = format.subsampleOffsetUs;
this.hasPrerollSamples = format.hasPrerollSamples;
// Video specific.
this.width = format.width;
this.height = format.height;
@ -546,18 +542,6 @@ public final class Format {
return this;
}
/**
* Sets {@link Format#hasPrerollSamples}. The default value is {@code false}.
*
* @param hasPrerollSamples The {@link Format#hasPrerollSamples}.
* @return The builder.
*/
@CanIgnoreReturnValue
public Builder setHasPrerollSamples(boolean hasPrerollSamples) {
this.hasPrerollSamples = hasPrerollSamples;
return this;
}
// Video specific.
/**
@ -750,7 +734,7 @@ public final class Format {
/**
* Sets {@link Format#tileCountHorizontal}. The default value is {@link #NO_VALUE}.
*
* @param tileCountHorizontal The {@link Format#tileCountHorizontal}.
* @param tileCountHorizontal The {@link Format#accessibilityChannel}.
* @return The builder.
*/
@CanIgnoreReturnValue
@ -762,7 +746,7 @@ public final class Format {
/**
* Sets {@link Format#tileCountVertical}. The default value is {@link #NO_VALUE}.
*
* @param tileCountVertical The {@link Format#tileCountVertical}.
* @param tileCountVertical The {@link Format#accessibilityChannel}.
* @return The builder.
*/
@CanIgnoreReturnValue
@ -967,15 +951,6 @@ public final class Format {
*/
@UnstableApi public final long subsampleOffsetUs;
/**
* Indicates whether the stream contains preroll samples.
*
* <p>When this field is set to {@code true}, it means that the stream includes decode-only
* samples that occur before the intended playback start position. These samples are necessary for
* decoding but are not meant to be rendered and should be skipped after decoding.
*/
@UnstableApi public final boolean hasPrerollSamples;
// Video specific.
/** The width of the video in pixels, or {@link #NO_VALUE} if unknown or not applicable. */
@ -1116,7 +1091,6 @@ public final class Format {
builder.initializationData == null ? Collections.emptyList() : builder.initializationData;
drmInitData = builder.drmInitData;
subsampleOffsetUs = builder.subsampleOffsetUs;
hasPrerollSamples = builder.hasPrerollSamples;
// Video specific.
width = builder.width;
height = builder.height;
@ -1406,7 +1380,6 @@ public final class Format {
if (format == null) {
return "null";
}
Joiner commaJoiner = Joiner.on(',');
StringBuilder builder = new StringBuilder();
builder.append("id=").append(format.id).append(", mimeType=").append(format.sampleMimeType);
if (format.containerMimeType != null) {
@ -1437,15 +1410,12 @@ public final class Format {
}
}
builder.append(", drm=[");
commaJoiner.appendTo(builder, schemes);
Joiner.on(',').appendTo(builder, schemes);
builder.append(']');
}
if (format.width != NO_VALUE && format.height != NO_VALUE) {
builder.append(", res=").append(format.width).append("x").append(format.height);
}
if (!fuzzyEquals(format.pixelWidthHeightRatio, 1, 0.001)) {
builder.append(", par=").append(Util.formatInvariant("%.3f", format.pixelWidthHeightRatio));
}
if (format.colorInfo != null && format.colorInfo.isValid()) {
builder.append(", color=").append(format.colorInfo.toLogString());
}
@ -1463,18 +1433,17 @@ public final class Format {
}
if (!format.labels.isEmpty()) {
builder.append(", labels=[");
commaJoiner.appendTo(
builder, Lists.transform(format.labels, l -> l.language + ": " + l.value));
Joiner.on(',').appendTo(builder, format.labels);
builder.append("]");
}
if (format.selectionFlags != 0) {
builder.append(", selectionFlags=[");
commaJoiner.appendTo(builder, Util.getSelectionFlagStrings(format.selectionFlags));
Joiner.on(',').appendTo(builder, Util.getSelectionFlagStrings(format.selectionFlags));
builder.append("]");
}
if (format.roleFlags != 0) {
builder.append(", roleFlags=[");
commaJoiner.appendTo(builder, Util.getRoleFlagStrings(format.roleFlags));
Joiner.on(',').appendTo(builder, Util.getRoleFlagStrings(format.roleFlags));
builder.append("]");
}
if (format.customData != null) {


@ -30,25 +30,6 @@ import java.util.List;
/**
* A {@link Player} that forwards method calls to another {@link Player}. Applications can use this
* class to suppress or modify specific operations, by overriding the respective methods.
*
* <p>Subclasses must ensure they maintain consistency with the {@link Player} interface, including
* interactions with {@link Player.Listener}, which can be quite fiddly. For example, if removing an
* available {@link Player.Command} and disabling the corresponding method, subclasses need to:
*
* <ul>
* <li>Override {@link #isCommandAvailable(int)} and {@link #getAvailableCommands()}
* <li>Override and no-op the method itself
* <li>Override {@link #addListener(Listener)} and wrap the provided {@link Player.Listener} with
* an implementation that drops calls to {@link
* Player.Listener#onAvailableCommandsChanged(Commands)} and {@link
* Player.Listener#onEvents(Player, Events)} if they were only triggered by a change in
* command availability that is 'invisible' after the command removal.
* </ul>
*
* <p>Many customization use-cases are instead better served by {@link ForwardingSimpleBasePlayer},
* which allows subclasses to more concisely modify the behavior of an operation, or disallow a
* {@link Player.Command}. In many cases {@link ForwardingSimpleBasePlayer} should be used in
* preference to {@code ForwardingPlayer}.
*/
@UnstableApi
public class ForwardingPlayer implements Player {
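The listener-wrapping step in the list above can be sketched with simplified stand-in types. This is only an illustration of the pattern: the Listener interface and command id here are hypothetical stand-ins, not the real Player.Listener or Player.Command constants.

```java
import java.util.HashSet;
import java.util.Set;

// Minimal stand-ins illustrating the listener-wrapping step described in the
// ForwardingPlayer javadoc; these are NOT the real Media3 types.
public class CommandFilteringDemo {
  interface Listener {
    void onAvailableCommandsChanged(Set<Integer> availableCommands);
  }

  static final int COMMAND_SEEK_BACK = 11; // hypothetical command id

  /** Removes the disabled command before reporting availability downstream. */
  static Set<Integer> filter(Set<Integer> commands, int removedCommand) {
    Set<Integer> filtered = new HashSet<>(commands);
    filtered.remove(removedCommand);
    return filtered;
  }

  /** Wraps a listener so the removed command is never reported as available. */
  static Listener dropCommand(Listener delegate, int removedCommand) {
    return commands -> delegate.onAvailableCommandsChanged(filter(commands, removedCommand));
  }

  public static void main(String[] args) {
    Listener wrapped = dropCommand(System.out::println, COMMAND_SEEK_BACK);
    // The wrapped listener only ever sees {1}; the removed command is dropped.
    wrapped.onAvailableCommandsChanged(new HashSet<>(Set.of(1, COMMAND_SEEK_BACK)));
  }
}
```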


@ -29,11 +29,11 @@ public final class MediaLibraryInfo {
/** The version of the library expressed as a string, for example "1.2.3" or "1.2.0-beta01". */
// Intentionally hardcoded. Do not derive from other constants (e.g. VERSION_INT) or vice versa.
public static final String VERSION = "1.5.1";
public static final String VERSION = "1.5.0-alpha01";
/** The version of the library expressed as {@code TAG + "/" + VERSION}. */
// Intentionally hardcoded. Do not derive from other constants (e.g. VERSION) or vice versa.
public static final String VERSION_SLASHY = "AndroidXMedia3/1.5.1";
public static final String VERSION_SLASHY = "AndroidXMedia3/1.5.0-alpha01";
/**
* The version of the library expressed as an integer, for example 1002003300.
@ -47,7 +47,7 @@ public final class MediaLibraryInfo {
* (123-045-006-3-00).
*/
// Intentionally hardcoded. Do not derive from other constants (e.g. VERSION) or vice versa.
public static final int VERSION_INT = 1_005_001_3_00;
public static final int VERSION_INT = 1_005_000_0_01;
/** Whether the library was compiled with {@link Assertions} checks enabled. */
public static final boolean ASSERTIONS_ENABLED = true;
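The integer layout described by the comment above (major digit, 3-digit minor, 3-digit bugfix, phase digit, 2-digit iteration, e.g. 123-045-006-3-00) can be encoded mechanically. encode() below is a hypothetical helper for illustration, not part of MediaLibraryInfo:

```java
public class VersionIntDemo {
  /** Packs version components as major | minor(3) | bugfix(3) | phase(1) | iteration(2). */
  static int encode(int major, int minor, int bugfix, int phase, int iteration) {
    return major * 1_000_000_000 + minor * 1_000_000 + bugfix * 1_000 + phase * 100 + iteration;
  }

  public static void main(String[] args) {
    System.out.println(encode(1, 5, 1, 3, 0)); // 1005001300, the stable 1.5.1 value
    System.out.println(encode(1, 5, 0, 0, 1)); // 1005000001, the 1.5.0-alpha01 value
  }
}
```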


@ -30,13 +30,11 @@ import androidx.annotation.Nullable;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import com.google.common.base.Objects;
import com.google.common.collect.ImmutableList;
import com.google.errorprone.annotations.CanIgnoreReturnValue;
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
@ -87,11 +85,8 @@ public final class MediaMetadata {
@Nullable private CharSequence station;
@Nullable private @MediaType Integer mediaType;
@Nullable private Bundle extras;
private ImmutableList<String> supportedCommands;
public Builder() {
supportedCommands = ImmutableList.of();
}
public Builder() {}
@SuppressWarnings("deprecation") // Assigning from deprecated fields.
private Builder(MediaMetadata mediaMetadata) {
@ -128,7 +123,6 @@ public final class MediaMetadata {
this.compilation = mediaMetadata.compilation;
this.station = mediaMetadata.station;
this.mediaType = mediaMetadata.mediaType;
this.supportedCommands = mediaMetadata.supportedCommands;
this.extras = mediaMetadata.extras;
}
@ -446,17 +440,6 @@ public final class MediaMetadata {
return this;
}
/**
* Sets the IDs of the supported commands (see for instance {@code
* CommandButton.sessionCommand.customAction} of the Media3 session module).
*/
@CanIgnoreReturnValue
@UnstableApi
public Builder setSupportedCommands(List<String> supportedCommands) {
this.supportedCommands = ImmutableList.copyOf(supportedCommands);
return this;
}
/**
* Sets all fields supported by the {@link Metadata.Entry entries} within the {@link Metadata}.
*
@ -613,10 +596,6 @@ public final class MediaMetadata {
setExtras(mediaMetadata.extras);
}
if (!mediaMetadata.supportedCommands.isEmpty()) {
setSupportedCommands(mediaMetadata.supportedCommands);
}
return this;
}
@ -1144,12 +1123,6 @@ public final class MediaMetadata {
*/
@Nullable public final Bundle extras;
/**
* The IDs of the supported commands of this media item (see for instance {@code
* CommandButton.sessionCommand.customAction} of the Media3 session module).
*/
@UnstableApi public final ImmutableList<String> supportedCommands;
@SuppressWarnings("deprecation") // Assigning deprecated fields.
private MediaMetadata(Builder builder) {
// Handle compatibility for deprecated fields.
@ -1202,7 +1175,6 @@ public final class MediaMetadata {
this.compilation = builder.compilation;
this.station = builder.station;
this.mediaType = mediaType;
this.supportedCommands = builder.supportedCommands;
this.extras = builder.extras;
}
@ -1255,7 +1227,6 @@ public final class MediaMetadata {
&& Util.areEqual(compilation, that.compilation)
&& Util.areEqual(station, that.station)
&& Util.areEqual(mediaType, that.mediaType)
&& Util.areEqual(supportedCommands, that.supportedCommands)
&& ((extras == null) == (that.extras == null));
}
@ -1296,8 +1267,7 @@ public final class MediaMetadata {
compilation,
station,
mediaType,
extras == null,
supportedCommands);
extras == null);
}
private static final String FIELD_TITLE = Util.intToStringMaxRadix(0);
@ -1334,7 +1304,6 @@ public final class MediaMetadata {
private static final String FIELD_MEDIA_TYPE = Util.intToStringMaxRadix(31);
private static final String FIELD_IS_BROWSABLE = Util.intToStringMaxRadix(32);
private static final String FIELD_DURATION_MS = Util.intToStringMaxRadix(33);
private static final String FIELD_SUPPORTED_COMMANDS = Util.intToStringMaxRadix(34);
private static final String FIELD_EXTRAS = Util.intToStringMaxRadix(1000);
@SuppressWarnings("deprecation") // Bundling deprecated fields.
@ -1440,9 +1409,6 @@ public final class MediaMetadata {
if (mediaType != null) {
bundle.putInt(FIELD_MEDIA_TYPE, mediaType);
}
if (!supportedCommands.isEmpty()) {
bundle.putStringArrayList(FIELD_SUPPORTED_COMMANDS, new ArrayList<>(supportedCommands));
}
if (extras != null) {
bundle.putBundle(FIELD_EXTRAS, extras);
}
@ -1533,11 +1499,6 @@ public final class MediaMetadata {
if (bundle.containsKey(FIELD_MEDIA_TYPE)) {
builder.setMediaType(bundle.getInt(FIELD_MEDIA_TYPE));
}
@Nullable
ArrayList<String> supportedCommands = bundle.getStringArrayList(FIELD_SUPPORTED_COMMANDS);
if (supportedCommands != null) {
builder.setSupportedCommands(supportedCommands);
}
return builder.build();
}


@ -601,9 +601,7 @@ public final class MimeTypes {
return C.TRACK_TYPE_IMAGE;
} else if (APPLICATION_ID3.equals(mimeType)
|| APPLICATION_EMSG.equals(mimeType)
|| APPLICATION_SCTE35.equals(mimeType)
|| APPLICATION_ICY.equals(mimeType)
|| APPLICATION_AIT.equals(mimeType)) {
|| APPLICATION_SCTE35.equals(mimeType)) {
return C.TRACK_TYPE_METADATA;
} else if (APPLICATION_CAMERA_MOTION.equals(mimeType)) {
return C.TRACK_TYPE_CAMERA_MOTION;
@ -685,9 +683,6 @@ public final class MimeTypes {
}
mimeType = Ascii.toLowerCase(mimeType);
switch (mimeType) {
// Normalize uncommon versions of some video MIME types to their standard equivalent.
case BASE_TYPE_VIDEO + "/x-mvhevc":
return VIDEO_MV_HEVC;
// Normalize uncommon versions of some audio MIME types to their standard equivalent.
case BASE_TYPE_AUDIO + "/x-flac":
return AUDIO_FLAC;


@ -143,8 +143,8 @@ public interface VideoFrameProcessor {
* @param effects The list of {@link Effect effects} to apply to the new input stream.
* @param frameInfo The {@link FrameInfo} of the new input stream.
*/
default void onInputStreamRegistered(
@InputType int inputType, List<Effect> effects, FrameInfo frameInfo) {}
void onInputStreamRegistered(
@InputType int inputType, List<Effect> effects, FrameInfo frameInfo);
/**
* Called when the output size changes.
@ -155,7 +155,7 @@ public interface VideoFrameProcessor {
* <p>The output size may differ from the size specified using {@link
* #setOutputSurfaceInfo(SurfaceInfo)}.
*/
default void onOutputSizeChanged(int width, int height) {}
void onOutputSizeChanged(int width, int height);
/**
* Called when an output frame with the given {@code presentationTimeUs} becomes available for
@ -163,7 +163,7 @@ public interface VideoFrameProcessor {
*
* @param presentationTimeUs The presentation time of the frame, in microseconds.
*/
default void onOutputFrameAvailableForRendering(long presentationTimeUs) {}
void onOutputFrameAvailableForRendering(long presentationTimeUs);
/**
* Called when an exception occurs during asynchronous video frame processing.
@ -171,10 +171,10 @@ public interface VideoFrameProcessor {
* <p>If this is called, the calling {@link VideoFrameProcessor} must immediately be {@linkplain
* VideoFrameProcessor#release() released}.
*/
default void onError(VideoFrameProcessingException exception) {}
void onError(VideoFrameProcessingException exception);
/** Called after the {@link VideoFrameProcessor} has rendered its final output frame. */
default void onEnded() {}
void onEnded();
}
/**


@ -33,7 +33,7 @@ public interface VideoGraph {
* @param width The new output width in pixels.
* @param height The new output height in pixels.
*/
default void onOutputSizeChanged(int width, int height) {}
void onOutputSizeChanged(int width, int height);
/**
* Called when an output frame with the given {@code framePresentationTimeUs} becomes available
@ -41,14 +41,14 @@ public interface VideoGraph {
*
* @param framePresentationTimeUs The presentation time of the frame, in microseconds.
*/
default void onOutputFrameAvailableForRendering(long framePresentationTimeUs) {}
void onOutputFrameAvailableForRendering(long framePresentationTimeUs);
/**
* Called after the {@link VideoGraph} has rendered its final output frame.
*
* @param finalFramePresentationTimeUs The timestamp of the last output frame, in microseconds.
*/
default void onEnded(long finalFramePresentationTimeUs) {}
void onEnded(long finalFramePresentationTimeUs);
/**
* Called when an exception occurs during video frame processing.
@ -56,7 +56,7 @@ public interface VideoGraph {
* <p>If this is called, the calling {@link VideoGraph} must immediately be {@linkplain
* #release() released}.
*/
default void onError(VideoFrameProcessingException exception) {}
void onError(VideoFrameProcessingException exception);
}
/**


@ -16,9 +16,9 @@
*/
package androidx.media3.common.audio;
import static androidx.media3.common.util.Assertions.checkState;
import static java.lang.Math.min;
import androidx.media3.common.util.Assertions;
import java.nio.ShortBuffer;
import java.util.Arrays;
@ -52,23 +52,11 @@ import java.util.Arrays;
private int pitchFrameCount;
private int oldRatePosition;
private int newRatePosition;
/**
* Number of frames pending to be copied from {@link #inputBuffer} directly to {@link
* #outputBuffer}.
*
* <p>This field is only relevant to time-stretching or pitch-shifting in {@link
* #changeSpeed(double)}, particularly when more frames need to be copied to the {@link
* #outputBuffer} than are available in {@link #inputBuffer} and Sonic must wait until the next
* buffer (or EOS) is queued.
*/
private int remainingInputToCopyFrameCount;
private int prevPeriod;
private int prevMinDiff;
private int minDiff;
private int maxDiff;
private double accumulatedSpeedAdjustmentError;
/**
* Creates a new Sonic audio stream processor.
@ -142,26 +130,10 @@ import java.util.Arrays;
*/
public void queueEndOfStream() {
int remainingFrameCount = inputFrameCount;
double s = speed / pitch;
double r = rate * pitch;
// If there are frames to be copied directly onto the output buffer, we should not count those
// as "input frames" because Sonic is not applying any processing on them.
int adjustedRemainingFrames = remainingFrameCount - remainingInputToCopyFrameCount;
// We add directly to the output the number of frames in remainingInputToCopyFrameCount.
// Otherwise, expectedOutputFrames will be off and will make Sonic output an incorrect number of
// frames.
float s = speed / pitch;
float r = rate * pitch;
int expectedOutputFrames =
outputFrameCount
+ (int)
((adjustedRemainingFrames / s
+ remainingInputToCopyFrameCount
+ accumulatedSpeedAdjustmentError
+ pitchFrameCount)
/ r
+ 0.5);
accumulatedSpeedAdjustmentError = 0;
outputFrameCount + (int) ((remainingFrameCount / s + pitchFrameCount) / r + 0.5f);
// Add enough silence to flush both input and pitch buffers.
inputBuffer =
@ -194,7 +166,6 @@ import java.util.Arrays;
prevMinDiff = 0;
minDiff = 0;
maxDiff = 0;
accumulatedSpeedAdjustmentError = 0;
}
/** Returns the size of output that can be read with {@link #getOutput(ShortBuffer)}, in bytes. */
@ -384,14 +355,14 @@ import java.util.Arrays;
pitchFrameCount -= frameCount;
}
private short interpolate(short[] in, int inPos, long oldSampleRate, long newSampleRate) {
private short interpolate(short[] in, int inPos, int oldSampleRate, int newSampleRate) {
short left = in[inPos];
short right = in[inPos + channelCount];
long position = newRatePosition * oldSampleRate;
long leftPosition = oldRatePosition * newSampleRate;
long rightPosition = (oldRatePosition + 1) * newSampleRate;
long ratio = rightPosition - position;
long width = rightPosition - leftPosition;
int position = newRatePosition * oldSampleRate;
int leftPosition = oldRatePosition * newSampleRate;
int rightPosition = (oldRatePosition + 1) * newSampleRate;
int ratio = rightPosition - position;
int width = rightPosition - leftPosition;
return (short) ((ratio * left + (width - ratio) * right) / width);
}
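The int-to-long change in interpolate() matters because the position products can exceed 32 bits once sample rates are no longer clamped below 1 << 14. A minimal standalone demonstration, with an illustrative sample rate:

```java
public class InterpolateOverflowDemo {
  public static void main(String[] args) {
    int oldSampleRate = 192_000;
    int newRatePosition = 192_000; // positions can grow up to the (reduced) sample rate
    int asInt = newRatePosition * oldSampleRate;          // overflows and wraps negative
    long asLong = (long) newRatePosition * oldSampleRate; // 36_864_000_000, correct
    System.out.println(asInt + " vs " + asLong);
  }
}
```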
@ -399,23 +370,16 @@ import java.util.Arrays;
if (outputFrameCount == originalOutputFrameCount) {
return;
}
// Use long to avoid overflow in int-int multiplications. The actual values of newSampleRate and
// oldSampleRate should always be comfortably within the int range.
long newSampleRate = (long) (inputSampleRateHz / rate);
long oldSampleRate = inputSampleRateHz;
int newSampleRate = (int) (inputSampleRateHz / rate);
int oldSampleRate = inputSampleRateHz;
// Set these values to help with the integer math.
while (newSampleRate != 0
&& oldSampleRate != 0
&& newSampleRate % 2 == 0
&& oldSampleRate % 2 == 0) {
while (newSampleRate > (1 << 14) || oldSampleRate > (1 << 14)) {
newSampleRate /= 2;
oldSampleRate /= 2;
}
moveNewSamplesToPitchBuffer(originalOutputFrameCount);
// Leave at least one pitch sample in the buffer.
for (int position = 0; position < pitchFrameCount - 1; position++) {
// Cast to long to avoid overflow.
while ((oldRatePosition + 1) * newSampleRate > newRatePosition * oldSampleRate) {
outputBuffer =
ensureSpaceForAdditionalFrames(
@ -430,26 +394,21 @@ import java.util.Arrays;
oldRatePosition++;
if (oldRatePosition == oldSampleRate) {
oldRatePosition = 0;
checkState(newRatePosition == newSampleRate);
Assertions.checkState(newRatePosition == newSampleRate);
newRatePosition = 0;
}
}
removePitchFrames(pitchFrameCount - 1);
}
private int skipPitchPeriod(short[] samples, int position, double speed, int period) {
private int skipPitchPeriod(short[] samples, int position, float speed, int period) {
// Skip over a pitch period, and copy period/speed samples to the output.
int newFrameCount;
if (speed >= 2.0f) {
double expectedFrameCount = period / (speed - 1.0) + accumulatedSpeedAdjustmentError;
newFrameCount = (int) Math.round(expectedFrameCount);
accumulatedSpeedAdjustmentError = expectedFrameCount - newFrameCount;
newFrameCount = (int) (period / (speed - 1.0f));
} else {
newFrameCount = period;
double expectedInputToCopy =
period * (2.0f - speed) / (speed - 1.0f) + accumulatedSpeedAdjustmentError;
remainingInputToCopyFrameCount = (int) Math.round(expectedInputToCopy);
accumulatedSpeedAdjustmentError = expectedInputToCopy - remainingInputToCopyFrameCount;
remainingInputToCopyFrameCount = (int) (period * (2.0f - speed) / (speed - 1.0f));
}
outputBuffer = ensureSpaceForAdditionalFrames(outputBuffer, outputFrameCount, newFrameCount);
overlapAdd(
@ -465,19 +424,14 @@ import java.util.Arrays;
return newFrameCount;
}
private int insertPitchPeriod(short[] samples, int position, double speed, int period) {
private int insertPitchPeriod(short[] samples, int position, float speed, int period) {
// Insert a pitch period, and determine how much input to copy directly.
int newFrameCount;
if (speed < 0.5f) {
double expectedFrameCount = period * speed / (1.0f - speed) + accumulatedSpeedAdjustmentError;
newFrameCount = (int) Math.round(expectedFrameCount);
accumulatedSpeedAdjustmentError = expectedFrameCount - newFrameCount;
newFrameCount = (int) (period * speed / (1.0f - speed));
} else {
newFrameCount = period;
double expectedInputToCopy =
period * (2.0f * speed - 1.0f) / (1.0f - speed) + accumulatedSpeedAdjustmentError;
remainingInputToCopyFrameCount = (int) Math.round(expectedInputToCopy);
accumulatedSpeedAdjustmentError = expectedInputToCopy - remainingInputToCopyFrameCount;
remainingInputToCopyFrameCount = (int) (period * (2.0f * speed - 1.0f) / (1.0f - speed));
}
outputBuffer =
ensureSpaceForAdditionalFrames(outputBuffer, outputFrameCount, period + newFrameCount);
@ -500,7 +454,7 @@ import java.util.Arrays;
return newFrameCount;
}
private void changeSpeed(double speed) {
private void changeSpeed(float speed) {
if (inputFrameCount < maxRequiredFrameCount) {
return;
}
@ -524,7 +478,7 @@ import java.util.Arrays;
private void processStreamInput() {
// Resample as many pitch periods as we have buffered on the input.
int originalOutputFrameCount = outputFrameCount;
double s = speed / pitch;
float s = speed / pitch;
float r = rate * pitch;
if (s > 1.00001 || s < 0.99999) {
changeSpeed(s);


@ -28,6 +28,7 @@ import androidx.media3.common.util.SpeedProviderUtil;
import androidx.media3.common.util.TimestampConsumer;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.util.ArrayDeque;
import java.util.Queue;
@ -114,39 +115,27 @@ public final class SpeedChangingAudioProcessor extends BaseAudioProcessor {
@Override
public void queueInput(ByteBuffer inputBuffer) {
long currentTimeUs =
long timeUs =
Util.scaleLargeTimestamp(
/* timestamp= */ bytesRead,
/* multiplier= */ C.MICROS_PER_SECOND,
/* divisor= */ (long) inputAudioFormat.sampleRate * inputAudioFormat.bytesPerFrame);
float newSpeed = speedProvider.getSpeed(currentTimeUs);
long nextSpeedChangeTimeUs = speedProvider.getNextSpeedChangeTimeUs(currentTimeUs);
long sampleRateAlignedNextSpeedChangeTimeUs =
getSampleRateAlignedTimestamp(nextSpeedChangeTimeUs, inputAudioFormat.sampleRate);
float newSpeed = speedProvider.getSpeed(timeUs);
// If next speed change falls between the current sample position and the next sample, then get
// the next speed and next speed change from the following sample. If needed, this will ignore
// one or more mid-sample speed changes.
if (sampleRateAlignedNextSpeedChangeTimeUs == currentTimeUs) {
long sampleDuration =
Util.sampleCountToDurationUs(/* sampleCount= */ 1, inputAudioFormat.sampleRate);
newSpeed = speedProvider.getSpeed(currentTimeUs + sampleDuration);
nextSpeedChangeTimeUs =
speedProvider.getNextSpeedChangeTimeUs(currentTimeUs + sampleDuration);
}
updateSpeed(newSpeed, currentTimeUs);
updateSpeed(newSpeed, timeUs);
int inputBufferLimit = inputBuffer.limit();
long nextSpeedChangeTimeUs = speedProvider.getNextSpeedChangeTimeUs(timeUs);
int bytesToNextSpeedChange;
if (nextSpeedChangeTimeUs != C.TIME_UNSET) {
bytesToNextSpeedChange =
(int)
Util.scaleLargeTimestamp(
/* timestamp= */ nextSpeedChangeTimeUs - currentTimeUs,
Util.scaleLargeValue(
/* timestamp= */ nextSpeedChangeTimeUs - timeUs,
/* multiplier= */ (long) inputAudioFormat.sampleRate
* inputAudioFormat.bytesPerFrame,
/* divisor= */ C.MICROS_PER_SECOND);
/* divisor= */ C.MICROS_PER_SECOND,
RoundingMode.CEILING);
int bytesToNextFrame =
inputAudioFormat.bytesPerFrame - bytesToNextSpeedChange % inputAudioFormat.bytesPerFrame;
if (bytesToNextFrame != inputAudioFormat.bytesPerFrame) {
@ -421,15 +410,4 @@ public final class SpeedChangingAudioProcessor extends BaseAudioProcessor {
// because some clients register callbacks with getSpeedAdjustedTimeAsync before this audio
// processor is flushed.
}
/**
* Returns the timestamp in microseconds of the sample defined by {@code sampleRate} that is
* closest to {@code timestampUs}, using the rounding mode specified in {@link
* Util#scaleLargeTimestamp}.
*/
private static long getSampleRateAlignedTimestamp(long timestampUs, int sampleRate) {
long exactSamplePosition =
Util.scaleLargeTimestamp(timestampUs, sampleRate, C.MICROS_PER_SECOND);
return Util.scaleLargeTimestamp(exactSamplePosition, C.MICROS_PER_SECOND, sampleRate);
}
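The removed helper snaps a microsecond timestamp onto the sample grid. The standalone sketch below uses plain truncating long division; the real Util.scaleLargeTimestamp may round differently, so treat the rounding behavior here as an assumption:

```java
public class SampleAlignDemo {
  /** Snaps timestampUs to the sample boundary of sampleRate, truncating toward zero. */
  static long align(long timestampUs, int sampleRate) {
    long samplePosition = timestampUs * sampleRate / 1_000_000L; // exact sample index
    return samplePosition * 1_000_000L / sampleRate;             // back to microseconds
  }

  public static void main(String[] args) {
    System.out.println(align(1_001, 48_000)); // 1000: 48 whole samples at 48 kHz
  }
}
```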
}


@ -17,23 +17,16 @@ package androidx.media3.common.util;
import static androidx.media3.common.util.Assertions.checkArgument;
import android.annotation.SuppressLint;
import android.media.MediaCodecInfo;
import android.util.Pair;
import androidx.annotation.Nullable;
import androidx.media3.common.C;
import androidx.media3.common.ColorInfo;
import androidx.media3.common.Format;
import androidx.media3.common.MimeTypes;
import com.google.common.collect.ImmutableList;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
/** Provides utilities for handling various types of codec-specific data. */
@SuppressLint("InlinedApi")
@UnstableApi
public final class CodecSpecificDataUtil {
@ -47,26 +40,6 @@ public final class CodecSpecificDataUtil {
private static final int EXTENDED_PAR = 0x0F;
private static final int RECTANGULAR = 0x00;
// Codecs to constant mappings.
// H263
private static final String CODEC_ID_H263 = "s263";
// AVC.
private static final String CODEC_ID_AVC1 = "avc1";
private static final String CODEC_ID_AVC2 = "avc2";
// VP9
private static final String CODEC_ID_VP09 = "vp09";
// HEVC.
private static final String CODEC_ID_HEV1 = "hev1";
private static final String CODEC_ID_HVC1 = "hvc1";
// AV1.
private static final String CODEC_ID_AV01 = "av01";
// MP4A AAC.
private static final String CODEC_ID_MP4A = "mp4a";
private static final Pattern PROFILE_PATTERN = Pattern.compile("^\\D?(\\d+)$");
private static final String TAG = "CodecSpecificDataUtil";
/**
* Parses an ALAC AudioSpecificConfig (i.e. an <a
* href="https://github.com/macosforge/alac/blob/master/ALACMagicCookieDescription.txt">ALACSpecificConfig</a>).
@ -261,103 +234,6 @@ public final class CodecSpecificDataUtil {
return builder.toString();
}
/** Builds an RFC 6381 H263 codec string using profile and level. */
public static String buildH263CodecString(int profile, int level) {
return Util.formatInvariant("s263.%d.%d", profile, level);
}
/**
* Returns profile and level (as defined by {@link MediaCodecInfo.CodecProfileLevel})
* corresponding to the codec description string (as defined by RFC 6381) of the given format.
*
* @param format Media format with a codec description string, as defined by RFC 6381.
* @return A pair (profile constant, level constant) if the codec of the {@code format} is
* well-formed and recognized, or null otherwise.
*/
@Nullable
public static Pair<Integer, Integer> getCodecProfileAndLevel(Format format) {
if (format.codecs == null) {
return null;
}
String[] parts = format.codecs.split("\\.");
// Dolby Vision can use DV, AVC or HEVC codec IDs, so check the MIME type first.
if (MimeTypes.VIDEO_DOLBY_VISION.equals(format.sampleMimeType)) {
return getDolbyVisionProfileAndLevel(format.codecs, parts);
}
switch (parts[0]) {
case CODEC_ID_H263:
return getH263ProfileAndLevel(format.codecs, parts);
case CODEC_ID_AVC1:
case CODEC_ID_AVC2:
return getAvcProfileAndLevel(format.codecs, parts);
case CODEC_ID_VP09:
return getVp9ProfileAndLevel(format.codecs, parts);
case CODEC_ID_HEV1:
case CODEC_ID_HVC1:
return getHevcProfileAndLevel(format.codecs, parts, format.colorInfo);
case CODEC_ID_AV01:
return getAv1ProfileAndLevel(format.codecs, parts, format.colorInfo);
case CODEC_ID_MP4A:
return getAacCodecProfileAndLevel(format.codecs, parts);
default:
return null;
}
}
/**
* Returns HEVC profile and level corresponding to the codec description string (as defined by RFC
* 6381) and its {@link ColorInfo}.
*
* @param codec The codec description string (as defined by RFC 6381).
* @param parts The codec string split by ".".
* @param colorInfo The {@link ColorInfo}.
* @return A pair (profile constant, level constant) if profile and level are recognized, or
* {@code null} otherwise.
*/
@Nullable
public static Pair<Integer, Integer> getHevcProfileAndLevel(
String codec, String[] parts, @Nullable ColorInfo colorInfo) {
if (parts.length < 4) {
// The codec has fewer parts than required by the HEVC codec string format.
Log.w(TAG, "Ignoring malformed HEVC codec string: " + codec);
return null;
}
// The profile_space gets ignored.
Matcher matcher = PROFILE_PATTERN.matcher(parts[1]);
if (!matcher.matches()) {
Log.w(TAG, "Ignoring malformed HEVC codec string: " + codec);
return null;
}
@Nullable String profileString = matcher.group(1);
int profile;
if ("1".equals(profileString)) {
profile = MediaCodecInfo.CodecProfileLevel.HEVCProfileMain;
} else if ("2".equals(profileString)) {
if (colorInfo != null && colorInfo.colorTransfer == C.COLOR_TRANSFER_ST2084) {
profile = MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10HDR10;
} else {
// For all other cases, we map to the Main10 profile. Note that this includes HLG
// HDR. On Android 13+, the platform guarantees that a decoder that advertises
// HEVCProfileMain10 will be able to decode HLG. This is not guaranteed for older
// Android versions, but we still map to Main10 for backwards compatibility.
profile = MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10;
}
} else if ("6".equals(profileString)) {
// Framework does not have profileLevel.HEVCProfileMultiviewMain defined.
profile = 6;
} else {
Log.w(TAG, "Unknown HEVC profile string: " + profileString);
return null;
}
@Nullable String levelString = parts[3];
@Nullable Integer level = hevcCodecStringToProfileLevel(levelString);
if (level == null) {
Log.w(TAG, "Unknown HEVC level string: " + levelString);
return null;
}
return new Pair<>(profile, level);
}
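To illustrate the string shape handled above: a minimal standalone sketch (a hypothetical helper, not the Media3 implementation) showing how an RFC 6381 HEVC codec string such as `hev1.1.6.L93.B0` splits into its profile indication (part 1) and its tier-plus-level part (part 3), where the tier letter is 'L' (main) or 'H' (high) and the number is general_level_idc, i.e. 30 times the level:

```java
// Hypothetical sketch; the real parsing above additionally maps the parsed
// values to MediaCodecInfo.CodecProfileLevel constants.
public final class HevcCodecStringDemo {

  /** Returns {profileIdc, generalLevelIdc}, or null if the string is malformed. */
  public static int[] parse(String codec) {
    String[] parts = codec.split("\\.");
    if (parts.length < 4) {
      return null; // Fewer parts than the HEVC codec string format requires.
    }
    // parts[1] is general_profile_idc, optionally prefixed by a profile_space letter (A-C).
    int profileIdc = Integer.parseInt(parts[1].replaceFirst("^[A-C]", ""));
    // parts[3] is the tier letter ('L' = main, 'H' = high) followed by general_level_idc.
    int levelIdc = Integer.parseInt(parts[3].substring(1));
    return new int[] {profileIdc, levelIdc};
  }

  public static void main(String[] args) {
    int[] result = parse("hev1.1.6.L93.B0");
    // general_level_idc 93 = 30 * 3.1, i.e. HEVC level 3.1.
    System.out.println(result[0] + " " + result[1]); // 1 93
  }
}
```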
/**
* Constructs a NAL unit consisting of the NAL start code followed by the specified data.
*
@ -443,528 +319,5 @@ public final class CodecSpecificDataUtil {
return true;
}
@Nullable
private static Pair<Integer, Integer> getDolbyVisionProfileAndLevel(
String codec, String[] parts) {
if (parts.length < 3) {
// The codec has fewer parts than required by the Dolby Vision codec string format.
Log.w(TAG, "Ignoring malformed Dolby Vision codec string: " + codec);
return null;
}
// The profile_space gets ignored.
Matcher matcher = PROFILE_PATTERN.matcher(parts[1]);
if (!matcher.matches()) {
Log.w(TAG, "Ignoring malformed Dolby Vision codec string: " + codec);
return null;
}
@Nullable String profileString = matcher.group(1);
@Nullable Integer profile = dolbyVisionStringToProfile(profileString);
if (profile == null) {
Log.w(TAG, "Unknown Dolby Vision profile string: " + profileString);
return null;
}
String levelString = parts[2];
@Nullable Integer level = dolbyVisionStringToLevel(levelString);
if (level == null) {
Log.w(TAG, "Unknown Dolby Vision level string: " + levelString);
return null;
}
return new Pair<>(profile, level);
}
/** Returns H263 profile and level from codec string. */
private static Pair<Integer, Integer> getH263ProfileAndLevel(String codec, String[] parts) {
Pair<Integer, Integer> defaultProfileAndLevel =
new Pair<>(
MediaCodecInfo.CodecProfileLevel.H263ProfileBaseline,
MediaCodecInfo.CodecProfileLevel.H263Level10);
if (parts.length < 3) {
Log.w(TAG, "Ignoring malformed H263 codec string: " + codec);
return defaultProfileAndLevel;
}
try {
int profile = Integer.parseInt(parts[1]);
int level = Integer.parseInt(parts[2]);
return new Pair<>(profile, level);
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed H263 codec string: " + codec);
return defaultProfileAndLevel;
}
}
@Nullable
private static Pair<Integer, Integer> getAvcProfileAndLevel(String codec, String[] parts) {
if (parts.length < 2) {
// The codec has fewer parts than required by the AVC codec string format.
Log.w(TAG, "Ignoring malformed AVC codec string: " + codec);
return null;
}
int profileInteger;
int levelInteger;
try {
if (parts[1].length() == 6) {
// Format: avc1.xxccyy, where xx is profile and yy level, both hexadecimal.
profileInteger = Integer.parseInt(parts[1].substring(0, 2), 16);
levelInteger = Integer.parseInt(parts[1].substring(4), 16);
} else if (parts.length >= 3) {
// Format: avc1.xx.[y]yy where xx is profile and [y]yy level, both decimal.
profileInteger = Integer.parseInt(parts[1]);
levelInteger = Integer.parseInt(parts[2]);
} else {
// We don't recognize the format.
Log.w(TAG, "Ignoring malformed AVC codec string: " + codec);
return null;
}
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed AVC codec string: " + codec);
return null;
}
int profile = avcProfileNumberToConst(profileInteger);
if (profile == -1) {
Log.w(TAG, "Unknown AVC profile: " + profileInteger);
return null;
}
int level = avcLevelNumberToConst(levelInteger);
if (level == -1) {
Log.w(TAG, "Unknown AVC level: " + levelInteger);
return null;
}
return new Pair<>(profile, level);
}
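As a worked example of the six-character hexadecimal form handled above (a hypothetical standalone helper, not Media3 code): in `avc1.640028`, the three bytes are profile_idc 0x64 = 100 (High), the constraint flags 0x00, and level_idc 0x28 = 40 (level 4.0):

```java
// Hypothetical sketch of the "avc1.xxccyy" hex form; the decimal "avc1.xx.yy"
// form handled above is a plain Integer.parseInt of each part.
public final class AvcCodecStringDemo {

  /** Returns {profileIdc, levelIdc} from a 6-hex-digit AVC suffix, e.g. "avc1.640028". */
  public static int[] parseHexForm(String codec) {
    String suffix = codec.split("\\.")[1]; // e.g. "640028"
    int profileIdc = Integer.parseInt(suffix.substring(0, 2), 16); // 0x64 = 100 (High)
    // suffix.substring(2, 4) holds the constraint_set flags; ignored for profile/level.
    int levelIdc = Integer.parseInt(suffix.substring(4), 16); // 0x28 = 40 (level 4.0)
    return new int[] {profileIdc, levelIdc};
  }

  public static void main(String[] args) {
    int[] r = parseHexForm("avc1.640028");
    System.out.println(r[0] + " " + r[1]); // 100 40
  }
}
```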
@Nullable
private static Pair<Integer, Integer> getVp9ProfileAndLevel(String codec, String[] parts) {
if (parts.length < 3) {
Log.w(TAG, "Ignoring malformed VP9 codec string: " + codec);
return null;
}
int profileInteger;
int levelInteger;
try {
profileInteger = Integer.parseInt(parts[1]);
levelInteger = Integer.parseInt(parts[2]);
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed VP9 codec string: " + codec);
return null;
}
int profile = vp9ProfileNumberToConst(profileInteger);
if (profile == -1) {
Log.w(TAG, "Unknown VP9 profile: " + profileInteger);
return null;
}
int level = vp9LevelNumberToConst(levelInteger);
if (level == -1) {
Log.w(TAG, "Unknown VP9 level: " + levelInteger);
return null;
}
return new Pair<>(profile, level);
}
@Nullable
private static Pair<Integer, Integer> getAv1ProfileAndLevel(
String codec, String[] parts, @Nullable ColorInfo colorInfo) {
if (parts.length < 4) {
Log.w(TAG, "Ignoring malformed AV1 codec string: " + codec);
return null;
}
int profileInteger;
int levelInteger;
int bitDepthInteger;
try {
profileInteger = Integer.parseInt(parts[1]);
levelInteger = Integer.parseInt(parts[2].substring(0, 2));
bitDepthInteger = Integer.parseInt(parts[3]);
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed AV1 codec string: " + codec);
return null;
}
if (profileInteger != 0) {
Log.w(TAG, "Unknown AV1 profile: " + profileInteger);
return null;
}
if (bitDepthInteger != 8 && bitDepthInteger != 10) {
Log.w(TAG, "Unknown AV1 bit depth: " + bitDepthInteger);
return null;
}
int profile;
if (bitDepthInteger == 8) {
profile = MediaCodecInfo.CodecProfileLevel.AV1ProfileMain8;
} else if (colorInfo != null
&& (colorInfo.hdrStaticInfo != null
|| colorInfo.colorTransfer == C.COLOR_TRANSFER_HLG
|| colorInfo.colorTransfer == C.COLOR_TRANSFER_ST2084)) {
profile = MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10HDR10;
} else {
profile = MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10;
}
int level = av1LevelNumberToConst(levelInteger);
if (level == -1) {
Log.w(TAG, "Unknown AV1 level: " + levelInteger);
return null;
}
return new Pair<>(profile, level);
}
@Nullable
private static Pair<Integer, Integer> getAacCodecProfileAndLevel(String codec, String[] parts) {
if (parts.length != 3) {
Log.w(TAG, "Ignoring malformed MP4A codec string: " + codec);
return null;
}
try {
// Get the object type indication, which is a hexadecimal value (see RFC 6381/ISO 14496-1).
int objectTypeIndication = Integer.parseInt(parts[1], 16);
@Nullable String mimeType = MimeTypes.getMimeTypeFromMp4ObjectType(objectTypeIndication);
if (MimeTypes.AUDIO_AAC.equals(mimeType)) {
// For MPEG-4 audio this is followed by an audio object type indication as a decimal number.
int audioObjectTypeIndication = Integer.parseInt(parts[2]);
int profile = mp4aAudioObjectTypeToProfile(audioObjectTypeIndication);
if (profile != -1) {
// Level is set to zero in AAC decoder CodecProfileLevels.
return new Pair<>(profile, 0);
}
}
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed MP4A codec string: " + codec);
}
return null;
}
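For reference, the shape of the string handled above, as a hypothetical standalone sketch: in `mp4a.40.2`, part 1 is the hexadecimal object type indication (0x40, MPEG-4 audio) and part 2 is the decimal audio object type (2, AAC-LC):

```java
// Hypothetical sketch; the real method above additionally maps the audio object
// type to a MediaCodecInfo.CodecProfileLevel.AACObject* constant.
public final class Mp4aCodecStringDemo {

  /** Returns {objectTypeIndication, audioObjectType}, or null if malformed. */
  public static int[] parse(String codec) {
    String[] parts = codec.split("\\.");
    if (parts.length != 3) {
      return null;
    }
    int objectTypeIndication = Integer.parseInt(parts[1], 16); // hexadecimal, per RFC 6381
    int audioObjectType = Integer.parseInt(parts[2]); // decimal
    return new int[] {objectTypeIndication, audioObjectType};
  }

  public static void main(String[] args) {
    int[] r = parse("mp4a.40.2");
    System.out.println(r[0] + " " + r[1]); // 64 2 (0x40 = MPEG-4 audio, AOT 2 = AAC-LC)
  }
}
```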
private static int avcProfileNumberToConst(int profileNumber) {
switch (profileNumber) {
case 66:
return MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline;
case 77:
return MediaCodecInfo.CodecProfileLevel.AVCProfileMain;
case 88:
return MediaCodecInfo.CodecProfileLevel.AVCProfileExtended;
case 100:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh;
case 110:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh10;
case 122:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh422;
case 244:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh444;
default:
return -1;
}
}
private static int avcLevelNumberToConst(int levelNumber) {
// TODO: Find int for CodecProfileLevel.AVCLevel1b.
switch (levelNumber) {
case 10:
return MediaCodecInfo.CodecProfileLevel.AVCLevel1;
case 11:
return MediaCodecInfo.CodecProfileLevel.AVCLevel11;
case 12:
return MediaCodecInfo.CodecProfileLevel.AVCLevel12;
case 13:
return MediaCodecInfo.CodecProfileLevel.AVCLevel13;
case 20:
return MediaCodecInfo.CodecProfileLevel.AVCLevel2;
case 21:
return MediaCodecInfo.CodecProfileLevel.AVCLevel21;
case 22:
return MediaCodecInfo.CodecProfileLevel.AVCLevel22;
case 30:
return MediaCodecInfo.CodecProfileLevel.AVCLevel3;
case 31:
return MediaCodecInfo.CodecProfileLevel.AVCLevel31;
case 32:
return MediaCodecInfo.CodecProfileLevel.AVCLevel32;
case 40:
return MediaCodecInfo.CodecProfileLevel.AVCLevel4;
case 41:
return MediaCodecInfo.CodecProfileLevel.AVCLevel41;
case 42:
return MediaCodecInfo.CodecProfileLevel.AVCLevel42;
case 50:
return MediaCodecInfo.CodecProfileLevel.AVCLevel5;
case 51:
return MediaCodecInfo.CodecProfileLevel.AVCLevel51;
case 52:
return MediaCodecInfo.CodecProfileLevel.AVCLevel52;
default:
return -1;
}
}
private static int vp9ProfileNumberToConst(int profileNumber) {
switch (profileNumber) {
case 0:
return MediaCodecInfo.CodecProfileLevel.VP9Profile0;
case 1:
return MediaCodecInfo.CodecProfileLevel.VP9Profile1;
case 2:
return MediaCodecInfo.CodecProfileLevel.VP9Profile2;
case 3:
return MediaCodecInfo.CodecProfileLevel.VP9Profile3;
default:
return -1;
}
}
private static int vp9LevelNumberToConst(int levelNumber) {
switch (levelNumber) {
case 10:
return MediaCodecInfo.CodecProfileLevel.VP9Level1;
case 11:
return MediaCodecInfo.CodecProfileLevel.VP9Level11;
case 20:
return MediaCodecInfo.CodecProfileLevel.VP9Level2;
case 21:
return MediaCodecInfo.CodecProfileLevel.VP9Level21;
case 30:
return MediaCodecInfo.CodecProfileLevel.VP9Level3;
case 31:
return MediaCodecInfo.CodecProfileLevel.VP9Level31;
case 40:
return MediaCodecInfo.CodecProfileLevel.VP9Level4;
case 41:
return MediaCodecInfo.CodecProfileLevel.VP9Level41;
case 50:
return MediaCodecInfo.CodecProfileLevel.VP9Level5;
case 51:
return MediaCodecInfo.CodecProfileLevel.VP9Level51;
case 60:
return MediaCodecInfo.CodecProfileLevel.VP9Level6;
case 61:
return MediaCodecInfo.CodecProfileLevel.VP9Level61;
case 62:
return MediaCodecInfo.CodecProfileLevel.VP9Level62;
default:
return -1;
}
}
@Nullable
private static Integer hevcCodecStringToProfileLevel(@Nullable String codecString) {
if (codecString == null) {
return null;
}
switch (codecString) {
case "L30":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel1;
case "L60":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel2;
case "L63":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel21;
case "L90":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel3;
case "L93":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel31;
case "L120":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel4;
case "L123":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel41;
case "L150":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel5;
case "L153":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel51;
case "L156":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel52;
case "L180":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel6;
case "L183":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel61;
case "L186":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel62;
case "H30":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel1;
case "H60":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel2;
case "H63":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel21;
case "H90":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel3;
case "H93":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel31;
case "H120":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel4;
case "H123":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel41;
case "H150":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel5;
case "H153":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel51;
case "H156":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel52;
case "H180":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel6;
case "H183":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel61;
case "H186":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel62;
default:
return null;
}
}
@Nullable
private static Integer dolbyVisionStringToProfile(@Nullable String profileString) {
if (profileString == null) {
return null;
}
switch (profileString) {
case "00":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvavPer;
case "01":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvavPen;
case "02":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDer;
case "03":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDen;
case "04":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDtr;
case "05":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheStn;
case "06":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDth;
case "07":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDtb;
case "08":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheSt;
case "09":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvavSe;
case "10":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvav110;
default:
return null;
}
}
@Nullable
private static Integer dolbyVisionStringToLevel(@Nullable String levelString) {
if (levelString == null) {
return null;
}
// TODO (Internal: b/179261323): use framework constant for level 13.
switch (levelString) {
case "01":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelHd24;
case "02":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelHd30;
case "03":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd24;
case "04":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd30;
case "05":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd60;
case "06":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd24;
case "07":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd30;
case "08":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd48;
case "09":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd60;
case "10":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd120;
case "11":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevel8k30;
case "12":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevel8k60;
case "13":
return 0x1000;
default:
return null;
}
}
private static int av1LevelNumberToConst(int levelNumber) {
// See https://aomediacodec.github.io/av1-spec/av1-spec.pdf Annex A: Profiles and levels for
// more information on mapping AV1 codec strings to levels.
switch (levelNumber) {
case 0:
return MediaCodecInfo.CodecProfileLevel.AV1Level2;
case 1:
return MediaCodecInfo.CodecProfileLevel.AV1Level21;
case 2:
return MediaCodecInfo.CodecProfileLevel.AV1Level22;
case 3:
return MediaCodecInfo.CodecProfileLevel.AV1Level23;
case 4:
return MediaCodecInfo.CodecProfileLevel.AV1Level3;
case 5:
return MediaCodecInfo.CodecProfileLevel.AV1Level31;
case 6:
return MediaCodecInfo.CodecProfileLevel.AV1Level32;
case 7:
return MediaCodecInfo.CodecProfileLevel.AV1Level33;
case 8:
return MediaCodecInfo.CodecProfileLevel.AV1Level4;
case 9:
return MediaCodecInfo.CodecProfileLevel.AV1Level41;
case 10:
return MediaCodecInfo.CodecProfileLevel.AV1Level42;
case 11:
return MediaCodecInfo.CodecProfileLevel.AV1Level43;
case 12:
return MediaCodecInfo.CodecProfileLevel.AV1Level5;
case 13:
return MediaCodecInfo.CodecProfileLevel.AV1Level51;
case 14:
return MediaCodecInfo.CodecProfileLevel.AV1Level52;
case 15:
return MediaCodecInfo.CodecProfileLevel.AV1Level53;
case 16:
return MediaCodecInfo.CodecProfileLevel.AV1Level6;
case 17:
return MediaCodecInfo.CodecProfileLevel.AV1Level61;
case 18:
return MediaCodecInfo.CodecProfileLevel.AV1Level62;
case 19:
return MediaCodecInfo.CodecProfileLevel.AV1Level63;
case 20:
return MediaCodecInfo.CodecProfileLevel.AV1Level7;
case 21:
return MediaCodecInfo.CodecProfileLevel.AV1Level71;
case 22:
return MediaCodecInfo.CodecProfileLevel.AV1Level72;
case 23:
return MediaCodecInfo.CodecProfileLevel.AV1Level73;
default:
return -1;
}
}
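The switch above enumerates the mapping case by case; numerically, AV1's seq_level_idx encodes the level as major = 2 + idx / 4 and minor = idx % 4 (AV1 spec, Annex A). A hypothetical standalone sketch of that arithmetic:

```java
// Hypothetical helper computing the human-readable AV1 level name from seq_level_idx.
public final class Av1LevelDemo {

  /** Returns "major.minor" for an AV1 seq_level_idx, per AV1 spec Annex A. */
  public static String levelName(int seqLevelIdx) {
    int major = 2 + seqLevelIdx / 4;
    int minor = seqLevelIdx % 4;
    return major + "." + minor;
  }

  public static void main(String[] args) {
    System.out.println(levelName(0)); // 2.0 -> AV1Level2
    System.out.println(levelName(9)); // 4.1 -> AV1Level41
    System.out.println(levelName(23)); // 7.3 -> AV1Level73
  }
}
```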
private static int mp4aAudioObjectTypeToProfile(int profileNumber) {
switch (profileNumber) {
case 1:
return MediaCodecInfo.CodecProfileLevel.AACObjectMain;
case 2:
return MediaCodecInfo.CodecProfileLevel.AACObjectLC;
case 3:
return MediaCodecInfo.CodecProfileLevel.AACObjectSSR;
case 4:
return MediaCodecInfo.CodecProfileLevel.AACObjectLTP;
case 5:
return MediaCodecInfo.CodecProfileLevel.AACObjectHE;
case 6:
return MediaCodecInfo.CodecProfileLevel.AACObjectScalable;
case 17:
return MediaCodecInfo.CodecProfileLevel.AACObjectERLC;
case 20:
return MediaCodecInfo.CodecProfileLevel.AACObjectERScalable;
case 23:
return MediaCodecInfo.CodecProfileLevel.AACObjectLD;
case 29:
return MediaCodecInfo.CodecProfileLevel.AACObjectHE_PS;
case 39:
return MediaCodecInfo.CodecProfileLevel.AACObjectELD;
case 42:
return MediaCodecInfo.CodecProfileLevel.AACObjectXHE;
default:
return -1;
}
}
private CodecSpecificDataUtil() {}
}


@ -473,7 +473,7 @@ public final class GlProgram {
? GLES20.GL_TEXTURE_2D
: GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
texIdValue,
type == GLES20.GL_SAMPLER_2D || !externalTexturesRequireNearestSampling
type == GLES20.GL_SAMPLER_2D && !externalTexturesRequireNearestSampling
? GLES20.GL_LINEAR
: GLES20.GL_NEAREST);
GLES20.glUniform1i(location, texUnitIndex);


@ -28,7 +28,6 @@ import androidx.media3.common.MimeTypes;
import com.google.common.collect.ImmutableList;
import java.nio.ByteBuffer;
import java.util.List;
import java.util.Objects;
/** Helper class containing utility methods for managing {@link MediaFormat} instances. */
@UnstableApi
@ -80,7 +79,7 @@ public final class MediaFormatUtil {
.setAverageBitrate(
getInteger(
mediaFormat, MediaFormat.KEY_BIT_RATE, /* defaultValue= */ Format.NO_VALUE))
.setCodecs(getCodecString(mediaFormat))
.setCodecs(mediaFormat.getString(MediaFormat.KEY_CODECS_STRING))
.setFrameRate(getFrameRate(mediaFormat, /* defaultValue= */ Format.NO_VALUE))
.setWidth(
getInteger(mediaFormat, MediaFormat.KEY_WIDTH, /* defaultValue= */ Format.NO_VALUE))
@ -96,7 +95,8 @@ public final class MediaFormatUtil {
/* defaultValue= */ Format.NO_VALUE))
.setRotationDegrees(
getInteger(mediaFormat, MediaFormat.KEY_ROTATION, /* defaultValue= */ 0))
.setColorInfo(getColorInfo(mediaFormat))
// TODO(b/278101856): Disallow invalid values after confirming.
.setColorInfo(getColorInfo(mediaFormat, /* allowInvalidValues= */ true))
.setSampleRate(
getInteger(
mediaFormat, MediaFormat.KEY_SAMPLE_RATE, /* defaultValue= */ Format.NO_VALUE))
@ -269,6 +269,13 @@ public final class MediaFormatUtil {
*/
@Nullable
public static ColorInfo getColorInfo(MediaFormat mediaFormat) {
return getColorInfo(mediaFormat, /* allowInvalidValues= */ false);
}
// Internal methods.
@Nullable
private static ColorInfo getColorInfo(MediaFormat mediaFormat, boolean allowInvalidValues) {
if (SDK_INT < 24) {
// MediaFormat KEY_COLOR_TRANSFER and other KEY_COLOR values available from API 24.
return null;
@ -286,17 +293,21 @@ public final class MediaFormatUtil {
@Nullable
byte[] hdrStaticInfo =
hdrStaticInfoByteBuffer != null ? getArray(hdrStaticInfoByteBuffer) : null;
// Some devices may produce invalid values from MediaFormat#getInteger.
// See b/239435670 for more information.
if (!isValidColorSpace(colorSpace)) {
colorSpace = Format.NO_VALUE;
}
if (!isValidColorRange(colorRange)) {
colorRange = Format.NO_VALUE;
}
if (!isValidColorTransfer(colorTransfer)) {
colorTransfer = Format.NO_VALUE;
if (!allowInvalidValues) {
// Some devices may produce invalid values from MediaFormat#getInteger.
// See b/239435670 for more information.
if (!isValidColorSpace(colorSpace)) {
colorSpace = Format.NO_VALUE;
}
if (!isValidColorRange(colorRange)) {
colorRange = Format.NO_VALUE;
}
if (!isValidColorTransfer(colorTransfer)) {
colorTransfer = Format.NO_VALUE;
}
}
if (colorSpace != Format.NO_VALUE
|| colorRange != Format.NO_VALUE
|| colorTransfer != Format.NO_VALUE
@ -321,32 +332,6 @@ public final class MediaFormatUtil {
return mediaFormat.containsKey(name) ? mediaFormat.getFloat(name) : defaultValue;
}
/** Supports {@link MediaFormat#getString(String, String)} for {@code API < 29}. */
@Nullable
public static String getString(
MediaFormat mediaFormat, String name, @Nullable String defaultValue) {
return mediaFormat.containsKey(name) ? mediaFormat.getString(name) : defaultValue;
}
/**
* Returns a {@code Codecs string} of {@link MediaFormat}. In case of an H263 codec string, builds
* and returns an RFC 6381 H263 codec string using profile and level.
*/
@Nullable
@SuppressLint("InlinedApi") // Inlined MediaFormat keys.
private static String getCodecString(MediaFormat mediaFormat) {
// Add H263 profile and level to codec string as per RFC 6381.
if (Objects.equals(mediaFormat.getString(MediaFormat.KEY_MIME), MimeTypes.VIDEO_H263)
&& mediaFormat.containsKey(MediaFormat.KEY_PROFILE)
&& mediaFormat.containsKey(MediaFormat.KEY_LEVEL)) {
return CodecSpecificDataUtil.buildH263CodecString(
mediaFormat.getInteger(MediaFormat.KEY_PROFILE),
mediaFormat.getInteger(MediaFormat.KEY_LEVEL));
} else {
return getString(mediaFormat, MediaFormat.KEY_CODECS_STRING, /* defaultValue= */ null);
}
}
/**
* Returns the frame rate from a {@link MediaFormat}.
*


@ -63,12 +63,12 @@ public final class RepeatModeUtil {
/**
* Gets the next repeat mode out of {@code enabledModes} starting from {@code currentMode}.
*
* @param currentMode The current {@link Player.RepeatMode}.
* @param enabledModes The bitmask of enabled {@link RepeatToggleModes}.
* @param currentMode The current repeat mode.
* @param enabledModes Bitmask of enabled modes.
* @return The next repeat mode.
*/
public static @Player.RepeatMode int getNextRepeatMode(
@Player.RepeatMode int currentMode, @RepeatToggleModes int enabledModes) {
@Player.RepeatMode int currentMode, int enabledModes) {
for (int offset = 1; offset <= 2; offset++) {
@Player.RepeatMode int proposedMode = (currentMode + offset) % 3;
if (isRepeatModeEnabled(proposedMode, enabledModes)) {
@ -79,15 +79,13 @@ public final class RepeatModeUtil {
}
/**
* Verifies whether a given {@link Player.RepeatMode} is enabled in the bitmask of {@link
* RepeatToggleModes}.
* Verifies whether a given {@code repeatMode} is enabled in the bitmask {@code enabledModes}.
*
* @param repeatMode The {@link Player.RepeatMode} to check.
* @param enabledModes The bitmask of enabled {@link RepeatToggleModes}.
* @param repeatMode The mode to check.
* @param enabledModes The bitmask representing the enabled modes.
* @return {@code true} if enabled.
*/
public static boolean isRepeatModeEnabled(
@Player.RepeatMode int repeatMode, @RepeatToggleModes int enabledModes) {
public static boolean isRepeatModeEnabled(@Player.RepeatMode int repeatMode, int enabledModes) {
switch (repeatMode) {
case Player.REPEAT_MODE_OFF:
return true;


@ -271,7 +271,7 @@ public final class TimestampAdjuster {
* @return The corresponding value in microseconds.
*/
public static long ptsToUs(long pts) {
return Util.scaleLargeTimestamp(pts, C.MICROS_PER_SECOND, 90000);
return (pts * C.MICROS_PER_SECOND) / 90000;
}
/**
@ -295,6 +295,6 @@ public final class TimestampAdjuster {
* @return The corresponding value as a 90 kHz clock timestamp.
*/
public static long usToNonWrappedPts(long us) {
return Util.scaleLargeTimestamp(us, 90000, C.MICROS_PER_SECOND);
return (us * 90000) / C.MICROS_PER_SECOND;
}
}
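A quick sanity check of the 90 kHz clock to microsecond scaling above, as standalone arithmetic (note that the plain long multiply overflows for inputs above Long.MAX_VALUE / 1,000,000, which is why overflow-safe scaling helpers exist elsewhere in the library):

```java
// Standalone check of the 90 kHz clock <-> microsecond conversions used by
// MPEG-TS style presentation timestamps.
public final class PtsConversionDemo {
  private static final long MICROS_PER_SECOND = 1_000_000L;

  /** Converts a 90 kHz clock timestamp to microseconds. */
  public static long ptsToUs(long pts) {
    return (pts * MICROS_PER_SECOND) / 90_000;
  }

  /** Converts microseconds to a 90 kHz clock timestamp. */
  public static long usToPts(long us) {
    return (us * 90_000) / MICROS_PER_SECOND;
  }

  public static void main(String[] args) {
    // 5 seconds of media time is 450,000 ticks of the 90 kHz clock.
    System.out.println(ptsToUs(450_000)); // 5000000
    System.out.println(usToPts(5_000_000)); // 450000
  }
}
```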


@ -80,6 +80,7 @@ import android.view.Display;
import android.view.SurfaceView;
import android.view.WindowManager;
import androidx.annotation.ChecksSdkIntAtLeast;
import androidx.annotation.DoNotInline;
import androidx.annotation.DrawableRes;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
@ -1600,7 +1601,7 @@ public final class Util {
*/
@UnstableApi
public static long sampleCountToDurationUs(long sampleCount, int sampleRate) {
return scaleLargeValue(sampleCount, C.MICROS_PER_SECOND, sampleRate, RoundingMode.DOWN);
return scaleLargeValue(sampleCount, C.MICROS_PER_SECOND, sampleRate, RoundingMode.FLOOR);
}
/**
@ -1617,7 +1618,7 @@ public final class Util {
*/
@UnstableApi
public static long durationUsToSampleCount(long durationUs, int sampleRate) {
return scaleLargeValue(durationUs, sampleRate, C.MICROS_PER_SECOND, RoundingMode.UP);
return scaleLargeValue(durationUs, sampleRate, C.MICROS_PER_SECOND, RoundingMode.CEILING);
}
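The rounding pairing above is deliberate: rounding the duration down and the sample count back up guarantees a round trip never drops audio. A minimal standalone sketch of the arithmetic (the real methods delegate to an overflow-safe scaleLargeValue):

```java
// Standalone sketch of sample-count <-> duration conversion with down/up rounding,
// valid while the intermediate products fit in a long.
public final class SampleDurationDemo {
  private static final long MICROS_PER_SECOND = 1_000_000L;

  /** Rounds down, like sampleCountToDurationUs. */
  public static long sampleCountToDurationUs(long sampleCount, int sampleRate) {
    return Math.floorDiv(sampleCount * MICROS_PER_SECOND, sampleRate);
  }

  /** Rounds up, like durationUsToSampleCount, so the round trip never drops samples. */
  public static long durationUsToSampleCount(long durationUs, int sampleRate) {
    // Ceiling division expressed via floorDiv on the negated numerator.
    return -Math.floorDiv(-(durationUs * sampleRate), MICROS_PER_SECOND);
  }

  public static void main(String[] args) {
    long durationUs = sampleCountToDurationUs(1, 48_000); // floor(1e6 / 48000) = 20
    long samples = durationUsToSampleCount(durationUs, 48_000); // ceil(20 * 48000 / 1e6) = 1
    System.out.println(durationUs + " " + samples); // 20 1
  }
}
```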
/**
@ -1902,18 +1903,16 @@ public final class Util {
* Scales a large timestamp.
*
* <p>Equivalent to {@link #scaleLargeValue(long, long, long, RoundingMode)} with {@link
* RoundingMode#DOWN}.
* RoundingMode#FLOOR}.
*
* @param timestamp The timestamp to scale.
* @param multiplier The multiplier.
* @param divisor The divisor.
* @return The scaled timestamp.
*/
// TODO: b/372204124 - Consider switching this (and impls below) to HALF_UP rounding to reduce
// round-trip errors when switching between time bases with different resolutions.
@UnstableApi
public static long scaleLargeTimestamp(long timestamp, long multiplier, long divisor) {
return scaleLargeValue(timestamp, multiplier, divisor, RoundingMode.DOWN);
return scaleLargeValue(timestamp, multiplier, divisor, RoundingMode.FLOOR);
}
/**
@ -1926,7 +1925,7 @@ public final class Util {
*/
@UnstableApi
public static long[] scaleLargeTimestamps(List<Long> timestamps, long multiplier, long divisor) {
return scaleLargeValues(timestamps, multiplier, divisor, RoundingMode.DOWN);
return scaleLargeValues(timestamps, multiplier, divisor, RoundingMode.FLOOR);
}
/**
@ -1938,7 +1937,7 @@ public final class Util {
*/
@UnstableApi
public static void scaleLargeTimestampsInPlace(long[] timestamps, long multiplier, long divisor) {
scaleLargeValuesInPlace(timestamps, multiplier, divisor, RoundingMode.DOWN);
scaleLargeValuesInPlace(timestamps, multiplier, divisor, RoundingMode.FLOOR);
}
/**
@ -2250,24 +2249,6 @@ public final class Util {
}
case 12:
return AudioFormat.CHANNEL_OUT_7POINT1POINT4;
case 24:
if (Util.SDK_INT >= 32) {
return AudioFormat.CHANNEL_OUT_7POINT1POINT4
| AudioFormat.CHANNEL_OUT_FRONT_LEFT_OF_CENTER
| AudioFormat.CHANNEL_OUT_FRONT_RIGHT_OF_CENTER
| AudioFormat.CHANNEL_OUT_BACK_CENTER
| AudioFormat.CHANNEL_OUT_TOP_CENTER
| AudioFormat.CHANNEL_OUT_TOP_FRONT_CENTER
| AudioFormat.CHANNEL_OUT_TOP_BACK_CENTER
| AudioFormat.CHANNEL_OUT_TOP_SIDE_LEFT
| AudioFormat.CHANNEL_OUT_TOP_SIDE_RIGHT
| AudioFormat.CHANNEL_OUT_BOTTOM_FRONT_LEFT
| AudioFormat.CHANNEL_OUT_BOTTOM_FRONT_RIGHT
| AudioFormat.CHANNEL_OUT_BOTTOM_FRONT_CENTER
| AudioFormat.CHANNEL_OUT_LOW_FREQUENCY_2;
} else {
return AudioFormat.CHANNEL_INVALID;
}
default:
return AudioFormat.CHANNEL_INVALID;
}
@ -3937,6 +3918,7 @@ public final class Util {
@RequiresApi(29)
private static class Api29 {
@DoNotInline
public static void startForeground(
Service mediaSessionService,
int notificationId,


@ -20,7 +20,6 @@ import static com.google.common.truth.Truth.assertThat;
import android.net.Uri;
import android.os.Bundle;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.common.collect.ImmutableList;
import org.junit.Test;
import org.junit.runner.RunWith;
@ -69,7 +68,6 @@ public class MediaMetadataTest {
assertThat(mediaMetadata.compilation).isNull();
assertThat(mediaMetadata.station).isNull();
assertThat(mediaMetadata.mediaType).isNull();
assertThat(mediaMetadata.supportedCommands).isEmpty();
assertThat(mediaMetadata.extras).isNull();
}
@ -280,7 +278,6 @@ public class MediaMetadataTest {
.setCompilation("Amazing songs.")
.setStation("radio station")
.setMediaType(MediaMetadata.MEDIA_TYPE_MIXED)
.setSupportedCommands(ImmutableList.of("command1", "command2"))
.setExtras(extras)
.build();
}


@ -148,7 +148,6 @@ public final class MimeTypesTest {
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_CEA608)).isEqualTo(C.TRACK_TYPE_TEXT);
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_EMSG)).isEqualTo(C.TRACK_TYPE_METADATA);
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_AIT)).isEqualTo(C.TRACK_TYPE_METADATA);
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_CAMERA_MOTION))
.isEqualTo(C.TRACK_TYPE_CAMERA_MOTION);
assertThat(MimeTypes.getTrackType("application/custom")).isEqualTo(C.TRACK_TYPE_UNKNOWN);


@ -1,227 +0,0 @@
/*
* Copyright (C) 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.common.audio;
import static androidx.media3.common.audio.SonicTestingUtils.calculateAccumulatedTruncationErrorForResampling;
import static androidx.media3.test.utils.TestUtil.generateFloatInRange;
import static com.google.common.truth.Truth.assertThat;
import static java.lang.Math.max;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Range;
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;
import java.util.Random;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.ParameterizedRobolectricTestRunner;
import org.robolectric.ParameterizedRobolectricTestRunner.Parameter;
import org.robolectric.ParameterizedRobolectricTestRunner.Parameters;
/** Parameterized robolectric test for {@link Sonic}. */
@RunWith(ParameterizedRobolectricTestRunner.class)
public final class RandomParameterizedSonicTest {
private static final int BLOCK_SIZE = 4096;
private static final int BYTES_PER_SAMPLE = 2;
private static final int SAMPLE_RATE = 48000;
// Max 10 min streams.
private static final long MAX_LENGTH_SAMPLES = 10 * 60 * SAMPLE_RATE;
/** Defines how many random instances of each parameter the test runner should generate. */
private static final int PARAM_COUNT = 5;
private static final int SPEED_DECIMAL_PRECISION = 2;
/**
* Allowed error tolerance ratio for number of output samples for Sonic's time stretching
* algorithm.
*
 * <p>The actual tolerance is calculated as {@code expectedOutputSampleCount *
 * TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE}, rounded to the nearest integer value. However, we
 * always allow a minimum tolerance of ±1 sample.
*
* <p>This tolerance is roughly equal to an error of 900us/~44 samples/0.000017% for a 90 min mono
* stream @48KHz. To obtain the value, we ran 100 iterations of {@link
* #timeStretching_returnsExpectedNumberOfSamples()} (by setting {@link #PARAM_COUNT} to 10) and
* we calculated the average delta percentage between expected number of samples and actual number
* of samples (b/366169590).
*/
private static final BigDecimal TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE =
new BigDecimal("0.00000017");
private static final ImmutableList<Range<Float>> SPEED_RANGES =
ImmutableList.of(
Range.closedOpen(0f, 0.5f),
Range.closedOpen(0.5f, 1f),
Range.closedOpen(1f, 2f),
Range.closedOpen(2f, 20f));
private static final Random random = new Random(/* seed */ 0);
private static final ImmutableList<Object[]> sParams = initParams();
@Parameters(name = "speed={0}, streamLength={1}")
public static ImmutableList<Object[]> params() {
// params() is called multiple times, so return cached parameters to avoid regenerating
// different random parameter values.
return sParams;
}
/**
* Returns a list of random parameter combinations with which to run the tests in this class.
*
* <p>Each list item contains a value for {{@link #speed}, {@link #streamLength}} stored within an
* Object array.
*
* <p>The method generates {@link #PARAM_COUNT} random {@link #speed} values and {@link
* #PARAM_COUNT} random {@link #streamLength} values. These generated values are then grouped into
* all possible combinations, and every combination is passed as parameters to a test run.
*/
private static ImmutableList<Object[]> initParams() {
ImmutableSet.Builder<Object[]> paramsBuilder = new ImmutableSet.Builder<>();
ImmutableSet.Builder<BigDecimal> speedsBuilder = new ImmutableSet.Builder<>();
for (int i = 0; i < PARAM_COUNT; i++) {
Range<Float> range = SPEED_RANGES.get(i % SPEED_RANGES.size());
BigDecimal speed =
BigDecimal.valueOf(generateFloatInRange(random, range))
.setScale(SPEED_DECIMAL_PRECISION, RoundingMode.HALF_EVEN);
speedsBuilder.add(speed);
}
ImmutableSet<BigDecimal> speeds = speedsBuilder.build();
ImmutableSet<Long> lengths =
new ImmutableSet.Builder<Long>()
.addAll(
random
.longs(/* min */ 0, MAX_LENGTH_SAMPLES)
.distinct()
.limit(PARAM_COUNT)
.iterator())
.build();
for (long length : lengths) {
for (BigDecimal speed : speeds) {
paramsBuilder.add(new Object[] {speed, length});
}
}
return paramsBuilder.build().asList();
}
@Parameter(0)
public BigDecimal speed;
@Parameter(1)
public long streamLength;
@Test
public void resampling_returnsExpectedNumberOfSamples() {
byte[] inputBuffer = new byte[BLOCK_SIZE * BYTES_PER_SAMPLE];
ShortBuffer outBuffer = ShortBuffer.allocate(BLOCK_SIZE);
// Use the same speed and pitch values so that Sonic resamples the stream.
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ SAMPLE_RATE,
/* channelCount= */ 1,
/* speed= */ speed.floatValue(),
/* pitch= */ speed.floatValue(),
/* outputSampleRateHz= */ SAMPLE_RATE);
long readSampleCount = 0;
for (long samplesLeft = streamLength; samplesLeft > 0; samplesLeft -= BLOCK_SIZE) {
random.nextBytes(inputBuffer);
if (samplesLeft >= BLOCK_SIZE) {
sonic.queueInput(ByteBuffer.wrap(inputBuffer).asShortBuffer());
} else {
// The last buffer to queue might have fewer samples than BLOCK_SIZE, so we should only queue
// the remaining number of samples (samplesLeft).
sonic.queueInput(
ByteBuffer.wrap(inputBuffer, 0, (int) (samplesLeft * BYTES_PER_SAMPLE))
.asShortBuffer());
sonic.queueEndOfStream();
}
while (sonic.getOutputSize() > 0) {
sonic.getOutput(outBuffer);
readSampleCount += outBuffer.position();
outBuffer.clear();
}
}
sonic.flush();
BigDecimal bigLength = new BigDecimal(String.valueOf(streamLength));
// divide(BigDecimal, RoundingMode) returns a value whose scale equals bigLength.scale() (0), so
// expectedSize always has an integer representation.
BigDecimal expectedSize = bigLength.divide(speed, RoundingMode.HALF_EVEN);
long accumulatedTruncationError =
calculateAccumulatedTruncationErrorForResampling(
bigLength, new BigDecimal(SAMPLE_RATE), speed);
assertThat(readSampleCount)
.isWithin(1)
.of(expectedSize.longValueExact() - accumulatedTruncationError);
}
@Test
public void timeStretching_returnsExpectedNumberOfSamples() {
byte[] buf = new byte[BLOCK_SIZE * BYTES_PER_SAMPLE];
ShortBuffer outBuffer = ShortBuffer.allocate(BLOCK_SIZE);
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ SAMPLE_RATE,
/* channelCount= */ 1,
speed.floatValue(),
/* pitch= */ 1,
/* outputSampleRateHz= */ SAMPLE_RATE);
long readSampleCount = 0;
for (long samplesLeft = streamLength; samplesLeft > 0; samplesLeft -= BLOCK_SIZE) {
random.nextBytes(buf);
if (samplesLeft >= BLOCK_SIZE) {
sonic.queueInput(ByteBuffer.wrap(buf).asShortBuffer());
} else {
sonic.queueInput(
ByteBuffer.wrap(buf, 0, (int) (samplesLeft * BYTES_PER_SAMPLE)).asShortBuffer());
sonic.queueEndOfStream();
}
while (sonic.getOutputSize() > 0) {
sonic.getOutput(outBuffer);
readSampleCount += outBuffer.position();
outBuffer.clear();
}
}
sonic.flush();
BigDecimal bigLength = new BigDecimal(String.valueOf(streamLength));
// divide(BigDecimal, RoundingMode) returns a value whose scale equals bigLength.scale() (0), so
// expectedSampleCount always has an integer representation.
BigDecimal expectedSampleCount = bigLength.divide(speed, RoundingMode.HALF_EVEN);
// Calculate allowed tolerance and round to nearest integer.
BigDecimal allowedTolerance =
TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE
.multiply(expectedSampleCount)
.setScale(/* newScale= */ 0, RoundingMode.HALF_EVEN);
// Always allow at least 1 sample of tolerance.
long tolerance = max(allowedTolerance.longValue(), 1);
assertThat(readSampleCount).isWithin(tolerance).of(expectedSampleCount.longValueExact());
}
}
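The tolerance arithmetic described in the `TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE` Javadoc can be checked with a small standalone sketch. The 90 min mono stream @48kHz is the example the Javadoc itself uses; the class name here is hypothetical:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Standalone sketch of the tolerance calculation used by
// timeStretching_returnsExpectedNumberOfSamples(). Class name is hypothetical.
public class ToleranceSketch {
  public static void main(String[] args) {
    // 90 min mono stream @48kHz, the example from the Javadoc.
    BigDecimal expectedOutputSampleCount = new BigDecimal(90 * 60 * 48_000);
    BigDecimal driftTolerance = new BigDecimal("0.00000017");
    // Multiply the ratio by the expected sample count and round half-even to an integer.
    BigDecimal allowedTolerance =
        driftTolerance.multiply(expectedOutputSampleCount).setScale(0, RoundingMode.HALF_EVEN);
    // Always allow at least one sample of tolerance.
    long tolerance = Math.max(allowedTolerance.longValue(), 1);
    System.out.println(tolerance); // 44, matching the Javadoc's "~44 samples" estimate.
  }
}
```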

View file

@ -1,110 +0,0 @@
/*
* Copyright (C) 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.common.audio;
import static com.google.common.truth.Truth.assertThat;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import java.nio.ShortBuffer;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.Timeout;
import org.junit.runner.RunWith;
/** Unit test for {@link Sonic}. */
@RunWith(AndroidJUnit4.class)
public class SonicTest {
@Rule public final Timeout globalTimeout = Timeout.millis(1000);
@Test
public void resample_toDoubleRate_linearlyInterpolatesSamples() {
ShortBuffer inputBuffer = ShortBuffer.wrap(new short[] {0, 10, 20, 30, 40, 50});
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ 44100,
/* channelCount= */ 1,
/* speed= */ 1,
/* pitch= */ 1,
/* outputSampleRateHz= */ 88200);
sonic.queueInput(inputBuffer);
sonic.queueEndOfStream();
ShortBuffer outputBuffer = ShortBuffer.allocate(sonic.getOutputSize() / 2);
sonic.getOutput(outputBuffer);
// End of stream is padded with silence, so last sample will be interpolated between (50; 0).
assertThat(outputBuffer.array())
.isEqualTo(new short[] {0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 25});
}
@Test
public void resample_toHalfRate_linearlyInterpolatesSamples() {
ShortBuffer inputBuffer =
ShortBuffer.wrap(new short[] {-40, -30, -20, -10, 0, 10, 20, 30, 40, 50});
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ 44100,
/* channelCount= */ 1,
/* speed= */ 1,
/* pitch= */ 1,
/* outputSampleRateHz= */ 22050);
sonic.queueInput(inputBuffer);
sonic.queueEndOfStream();
ShortBuffer outputBuffer = ShortBuffer.allocate(sonic.getOutputSize() / 2);
sonic.getOutput(outputBuffer);
// TODO (b/361768785): Remove this unexpected last sample when Sonic's resampler returns the
// right number of samples.
assertThat(outputBuffer.array()).isEqualTo(new short[] {-40, -20, 0, 20, 40, 0});
}
@Test
public void resample_withOneSample_doesNotHang() {
ShortBuffer inputBuffer = ShortBuffer.wrap(new short[] {10});
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ 44100,
/* channelCount= */ 1,
/* speed= */ 1,
/* pitch= */ 1,
/* outputSampleRateHz= */ 88200);
sonic.queueInput(inputBuffer);
sonic.queueEndOfStream();
ShortBuffer outputBuffer = ShortBuffer.allocate(sonic.getOutputSize() / 2);
sonic.getOutput(outputBuffer);
// End of stream is padded with silence, so last sample will be interpolated between (10; 0).
assertThat(outputBuffer.array()).isEqualTo(new short[] {10, 5});
}
@Test
public void resample_withFractionalOutputSampleCount_roundsNumberOfOutputSamples() {
ShortBuffer inputBuffer = ShortBuffer.wrap(new short[] {0, 2, 4, 6, 8});
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ 44100,
/* channelCount= */ 1,
/* speed= */ 1,
/* pitch= */ 1,
/* outputSampleRateHz= */ 22050);
sonic.queueInput(inputBuffer);
sonic.queueEndOfStream();
ShortBuffer outputBuffer = ShortBuffer.allocate(sonic.getOutputSize() / 2);
sonic.getOutput(outputBuffer);
assertThat(outputBuffer.array()).isEqualTo(new short[] {0, 4, 8});
}
}
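The 2x linear interpolation pattern these tests assert can be reproduced with a short standalone sketch. This mirrors the expected output of `resample_toDoubleRate_linearlyInterpolatesSamples` only; it is not Sonic's actual resampler:

```java
import java.util.Arrays;

// Sketch of 2x upsampling by midpoint interpolation, matching the test's expectations.
public class LinearInterpolationSketch {
  public static void main(String[] args) {
    short[] input = {0, 10, 20, 30, 40, 50};
    // End of stream is padded with one silent sample, so the final midpoint is between (50; 0).
    short[] padded = Arrays.copyOf(input, input.length + 1);
    short[] output = new short[input.length * 2];
    for (int i = 0; i < input.length; i++) {
      output[2 * i] = padded[i]; // Original sample.
      output[2 * i + 1] = (short) ((padded[i] + padded[i + 1]) / 2); // Interpolated midpoint.
    }
    System.out.println(Arrays.toString(output));
    // [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 25]
  }
}
```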

View file

@ -1,66 +0,0 @@
/*
* Copyright (C) 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.common.audio;
import java.math.BigDecimal;
import java.math.RoundingMode;
/** Testing utility methods related to {@link Sonic}. */
/* package */ final class SonicTestingUtils {
/**
* Returns expected accumulated truncation error for {@link Sonic}'s resampling algorithm, given
* an input length, input sample rate, and resampling rate.
*
* <p><b>Note:</b> This method is only necessary until we address b/361768785 and fix the
* underlying truncation issue.
*
* <p>The accumulated truncation error is calculated as follows:
*
* <ol>
* <li>Individual truncation error: Divide sample rate by resampling rate, and calculate delta
* between floating point result and truncated int representation.
* <li>Truncation accumulation count: Divide length by sample rate to obtain number of times
* that truncation error accumulates.
* <li>Accumulated truncation error: Multiply results of 1 and 2.
* </ol>
*
* @param length Length of input in frames.
* @param sampleRate Input sample rate of {@link Sonic} instance.
* @param resamplingRate Resampling rate given by {@code pitch * (inputSampleRate /
* outputSampleRate)}.
*/
public static long calculateAccumulatedTruncationErrorForResampling(
BigDecimal length, BigDecimal sampleRate, BigDecimal resamplingRate) {
// Calculate number of times that Sonic accumulates truncation error. Set scale to 20 decimal
// places, so that division doesn't return an integer.
BigDecimal errorCount = length.divide(sampleRate, /* scale= */ 20, RoundingMode.HALF_EVEN);
// Calculate what truncation error Sonic is accumulating, calculated as:
// inputSampleRate / resamplingRate - (int) inputSampleRate / resamplingRate. Set scale to 20
// decimal places, so that division doesn't return an integer.
BigDecimal individualError =
sampleRate.divide(resamplingRate, /* scale= */ 20, RoundingMode.HALF_EVEN);
individualError =
individualError.subtract(individualError.setScale(/* newScale= */ 0, RoundingMode.FLOOR));
// Calculate total accumulated error = (int) floor(errorCount * individualError).
BigDecimal accumulatedError =
errorCount.multiply(individualError).setScale(/* newScale= */ 0, RoundingMode.FLOOR);
return accumulatedError.longValueExact();
}
private SonicTestingUtils() {}
}
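The three steps in the Javadoc above can be worked through numerically. The values here are hypothetical (a 10 min mono stream @48kHz and a resampling rate of 2.2), chosen so the division produces a fractional part:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Worked example of calculateAccumulatedTruncationErrorForResampling with hypothetical inputs.
public class TruncationErrorSketch {
  public static void main(String[] args) {
    BigDecimal length = new BigDecimal(10 * 60 * 48_000); // 28,800,000 frames.
    BigDecimal sampleRate = new BigDecimal(48_000);
    BigDecimal resamplingRate = new BigDecimal("2.2");
    // 1. Individual truncation error: fractional part of sampleRate / resamplingRate.
    //    48000 / 2.2 = 21818.1818..., so the individual error is ~0.1818.
    BigDecimal ratio = sampleRate.divide(resamplingRate, /* scale= */ 20, RoundingMode.HALF_EVEN);
    BigDecimal individualError = ratio.subtract(ratio.setScale(0, RoundingMode.FLOOR));
    // 2. Accumulation count: length / sampleRate = 600.
    BigDecimal errorCount = length.divide(sampleRate, /* scale= */ 20, RoundingMode.HALF_EVEN);
    // 3. Accumulated error: floor(600 * 0.1818...) = 109.
    long accumulated =
        errorCount.multiply(individualError).setScale(0, RoundingMode.FLOOR).longValueExact();
    System.out.println(accumulated); // 109
  }
}
```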

View file

@ -240,9 +240,9 @@ public class SpeedChangingAudioProcessorTest {
}
@Test
public void queueInput_multipleSpeedsInBufferWithLimitVeryClose_doesNotHang() throws Exception {
public void queueInput_multipleSpeedsInBufferWithLimitVeryClose_readsDataUntilSpeedLimit()
throws Exception {
long speedChangeTimeUs = 1; // Change speed very close to current position at 1us.
int outputFrames = 0;
SpeedProvider speedProvider =
TestSpeedProvider.createWithStartTimes(
/* startTimesUs= */ new long[] {0L, speedChangeTimeUs},
@ -250,14 +250,12 @@ public class SpeedChangingAudioProcessorTest {
SpeedChangingAudioProcessor speedChangingAudioProcessor =
getConfiguredSpeedChangingAudioProcessor(speedProvider);
ByteBuffer inputBuffer = getInputBuffer(/* frameCount= */ 5);
int inputBufferLimit = inputBuffer.limit();
speedChangingAudioProcessor.queueInput(inputBuffer);
outputFrames +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
speedChangingAudioProcessor.queueEndOfStream();
outputFrames +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
assertThat(outputFrames).isEqualTo(3);
assertThat(inputBuffer.position()).isEqualTo(AUDIO_FORMAT.bytesPerFrame);
assertThat(inputBuffer.limit()).isEqualTo(inputBufferLimit);
}
@Test
@ -533,68 +531,6 @@ public class SpeedChangingAudioProcessorTest {
.isEqualTo(40_000);
}
@Test
public void queueInput_exactlyUpToSpeedBoundary_outputsExpectedNumberOfSamples()
throws AudioProcessor.UnhandledAudioFormatException {
int outputFrameCount = 0;
SpeedProvider speedProvider =
TestSpeedProvider.createWithFrameCounts(
AUDIO_FORMAT,
/* frameCounts= */ new int[] {1000, 1000, 1000},
/* speeds= */ new float[] {2, 4, 2}); // 500, 250, 500 = 1250
SpeedChangingAudioProcessor speedChangingAudioProcessor =
getConfiguredSpeedChangingAudioProcessor(speedProvider);
ByteBuffer input = getInputBuffer(1000);
speedChangingAudioProcessor.queueInput(input);
outputFrameCount +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
input.rewind();
speedChangingAudioProcessor.queueInput(input);
outputFrameCount +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
input.rewind();
speedChangingAudioProcessor.queueInput(input);
outputFrameCount +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
speedChangingAudioProcessor.queueEndOfStream();
outputFrameCount +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
assertThat(outputFrameCount).isWithin(2).of(1250);
}
@Test
public void queueInput_withUnalignedSpeedStartTimes_skipsMidSampleSpeedChanges()
throws AudioProcessor.UnhandledAudioFormatException {
int outputFrameCount = 0;
// Sample duration @44.1KHz is 22.67573696145125us. The last three speed changes fall between
// samples 4 and 5, so only the speed change at 105us should be used. We expect an output of
// 4 / 2 + 8 / 4 = 4 samples.
SpeedProvider speedProvider =
TestSpeedProvider.createWithStartTimes(
/* startTimesUs= */ new long[] {0, 95, 100, 105},
/* speeds= */ new float[] {2, 3, 8, 4});
SpeedChangingAudioProcessor speedChangingAudioProcessor =
getConfiguredSpeedChangingAudioProcessor(speedProvider);
ByteBuffer input = getInputBuffer(12);
while (input.hasRemaining()) {
speedChangingAudioProcessor.queueInput(input);
outputFrameCount +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
}
speedChangingAudioProcessor.queueEndOfStream();
outputFrameCount +=
speedChangingAudioProcessor.getOutput().remaining() / AUDIO_FORMAT.bytesPerFrame;
// Allow one sample of tolerance per effectively applied speed change.
assertThat(outputFrameCount).isWithin(1).of(4);
}
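One way to see why only the 105us speed change is effective is to map each start time to the first whole-sample boundary at or after it: the last three changes all land on sample 5, and the last one wins. This is a sketch of the comment's arithmetic only, not the processor's internal logic:

```java
// Sketch: map unaligned speed-change start times to sample indices @44.1kHz.
public class SpeedAlignmentSketch {
  public static void main(String[] args) {
    double sampleDurationUs = 1_000_000 / 44_100.0; // ~22.6757us per sample.
    long[] startTimesUs = {0, 95, 100, 105};
    for (long startTimeUs : startTimesUs) {
      // The change can only take effect at the first full sample at or after its start time.
      long effectiveSampleIndex = (long) Math.ceil(startTimeUs / sampleDurationUs);
      System.out.println(startTimeUs + "us -> sample " + effectiveSampleIndex);
    }
    // 0us -> sample 0; 95us, 100us and 105us all -> sample 5.
  }
}
```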
private static SpeedChangingAudioProcessor getConfiguredSpeedChangingAudioProcessor(
SpeedProvider speedProvider) throws AudioProcessor.UnhandledAudioFormatException {
SpeedChangingAudioProcessor speedChangingAudioProcessor =

View file

@ -15,16 +15,9 @@
*/
package androidx.media3.common.util;
import static androidx.media3.common.util.CodecSpecificDataUtil.getCodecProfileAndLevel;
import static com.google.common.truth.Truth.assertThat;
import android.media.MediaCodecInfo;
import android.util.Pair;
import androidx.annotation.Nullable;
import androidx.media3.common.C;
import androidx.media3.common.ColorInfo;
import androidx.media3.common.Format;
import androidx.media3.common.MimeTypes;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
@ -52,163 +45,4 @@ public class CodecSpecificDataUtilTest {
assertThat(sampleRateAndChannelCount.first).isEqualTo(96000);
assertThat(sampleRateAndChannelCount.second).isEqualTo(2);
}
@Test
public void getCodecProfileAndLevel_handlesH263CodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_H263,
"s263.1.1",
MediaCodecInfo.CodecProfileLevel.H263ProfileBaseline,
MediaCodecInfo.CodecProfileLevel.H263Level10);
}
@Test
public void getCodecProfileAndLevel_handlesVp9Profile1CodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_VP9,
"vp09.01.51",
MediaCodecInfo.CodecProfileLevel.VP9Profile1,
MediaCodecInfo.CodecProfileLevel.VP9Level51);
}
@Test
public void getCodecProfileAndLevel_handlesVp9Profile2CodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_VP9,
"vp09.02.10",
MediaCodecInfo.CodecProfileLevel.VP9Profile2,
MediaCodecInfo.CodecProfileLevel.VP9Level1);
}
@Test
public void getCodecProfileAndLevel_handlesFullVp9CodecString() {
// Example from https://www.webmproject.org/vp9/mp4/#codecs-parameter-string.
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_VP9,
"vp09.02.10.10.01.09.16.09.01",
MediaCodecInfo.CodecProfileLevel.VP9Profile2,
MediaCodecInfo.CodecProfileLevel.VP9Level1);
}
@Test
public void getCodecProfileAndLevel_handlesDolbyVisionCodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_DOLBY_VISION,
"dvh1.05.05",
MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheStn,
MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd60);
}
@Test
public void getCodecProfileAndLevel_handlesDolbyVisionProfile10CodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_DOLBY_VISION,
"dav1.10.09",
MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvav110,
MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd60);
}
@Test
public void getCodecProfileAndLevel_handlesAv1ProfileMain8CodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_AV1,
"av01.0.10M.08",
MediaCodecInfo.CodecProfileLevel.AV1ProfileMain8,
MediaCodecInfo.CodecProfileLevel.AV1Level42);
}
@Test
public void getCodecProfileAndLevel_handlesAv1ProfileMain10CodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_AV1,
"av01.0.20M.10",
MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10,
MediaCodecInfo.CodecProfileLevel.AV1Level7);
}
@Test
public void getCodecProfileAndLevel_handlesAv1ProfileMain10HDRWithHdrInfoSet() {
ColorInfo colorInfo =
new ColorInfo.Builder()
.setColorSpace(C.COLOR_SPACE_BT709)
.setColorRange(C.COLOR_RANGE_LIMITED)
.setColorTransfer(C.COLOR_TRANSFER_SDR)
.setHdrStaticInfo(new byte[] {1, 2, 3, 4, 5, 6, 7})
.build();
Format format =
new Format.Builder()
.setSampleMimeType(MimeTypes.VIDEO_AV1)
.setCodecs("av01.0.21M.10")
.setColorInfo(colorInfo)
.build();
assertCodecProfileAndLevelForFormat(
format,
MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10HDR10,
MediaCodecInfo.CodecProfileLevel.AV1Level71);
}
@Test
public void getCodecProfileAndLevel_handlesAv1ProfileMain10HDRWithoutHdrInfoSet() {
ColorInfo colorInfo =
new ColorInfo.Builder()
.setColorSpace(C.COLOR_SPACE_BT709)
.setColorRange(C.COLOR_RANGE_LIMITED)
.setColorTransfer(C.COLOR_TRANSFER_HLG)
.build();
Format format =
new Format.Builder()
.setSampleMimeType(MimeTypes.VIDEO_AV1)
.setCodecs("av01.0.21M.10")
.setColorInfo(colorInfo)
.build();
assertCodecProfileAndLevelForFormat(
format,
MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10HDR10,
MediaCodecInfo.CodecProfileLevel.AV1Level71);
}
@Test
public void getCodecProfileAndLevel_handlesFullAv1CodecString() {
// Example from https://aomediacodec.github.io/av1-isobmff/#codecsparam.
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_AV1,
"av01.0.04M.10.0.112.09.16.09.0",
MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10,
MediaCodecInfo.CodecProfileLevel.AV1Level3);
}
@Test
public void getCodecProfileAndLevel_rejectsNullCodecString() {
Format format = new Format.Builder().setCodecs(null).build();
assertThat(getCodecProfileAndLevel(format)).isNull();
}
@Test
public void getCodecProfileAndLevel_rejectsEmptyCodecString() {
Format format = new Format.Builder().setCodecs("").build();
assertThat(getCodecProfileAndLevel(format)).isNull();
}
@Test
public void getCodecProfileAndLevel_handlesMvHevcCodecString() {
assertCodecProfileAndLevelForCodecsString(
MimeTypes.VIDEO_MV_HEVC,
"hvc1.6.40.L120.BF.80",
/* profile= */ 6,
MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel4);
}
private static void assertCodecProfileAndLevelForCodecsString(
String sampleMimeType, String codecs, int profile, int level) {
Format format =
new Format.Builder().setSampleMimeType(sampleMimeType).setCodecs(codecs).build();
assertCodecProfileAndLevelForFormat(format, profile, level);
}
private static void assertCodecProfileAndLevelForFormat(Format format, int profile, int level) {
@Nullable Pair<Integer, Integer> codecProfileAndLevel = getCodecProfileAndLevel(format);
assertThat(codecProfileAndLevel).isNotNull();
assertThat(codecProfileAndLevel.first).isEqualTo(profile);
assertThat(codecProfileAndLevel.second).isEqualTo(level);
}
}

View file

@ -240,10 +240,4 @@ public class TimestampAdjusterTest {
assertThat(secondAdjustedTimestampUs - firstAdjustedTimestampUs).isGreaterThan(0x100000000L);
}
// https://github.com/androidx/media/issues/1763
@Test
public void usToWrappedPts_usTimestampCloseToOverflow_doesntOverflow() {
assertThat(TimestampAdjuster.usToNonWrappedPts(1L << 52)).isEqualTo(405323966463344L);
}
}

View file

@ -116,9 +116,6 @@ public abstract class Mp4Box {
public static final int TYPE_H263 = 0x48323633;
@SuppressWarnings("ConstantCaseForConstants")
public static final int TYPE_h263 = 0x68323633;
@SuppressWarnings("ConstantCaseForConstants")
public static final int TYPE_d263 = 0x64323633;

View file

@ -15,7 +15,6 @@
*/
package androidx.media3.container;
import static androidx.media3.common.MimeTypes.containsCodecsCorrespondingToMimeType;
import static com.google.common.math.DoubleMath.log2;
import static java.lang.Math.max;
import static java.lang.Math.min;
@ -34,7 +33,6 @@ import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
/** Utility methods for handling H.264/AVC and H.265/HEVC NAL units. */
@UnstableApi
@ -616,24 +614,6 @@ public final class NalUnitUtil {
&& ((nalUnitHeaderFirstByte & 0x7E) >> 1) == H265_NAL_UNIT_TYPE_PREFIX_SEI);
}
/**
* Returns whether the NAL unit with the specified header contains supplemental enhancement
* information.
*
* @param format The sample {@link Format}.
* @param nalUnitHeaderFirstByte The first byte of nal_unit().
* @return Whether the NAL unit with the specified header is an SEI NAL unit. False is returned if
* the {@code MimeType} is {@code null}.
*/
public static boolean isNalUnitSei(Format format, byte nalUnitHeaderFirstByte) {
return ((Objects.equals(format.sampleMimeType, MimeTypes.VIDEO_H264)
|| containsCodecsCorrespondingToMimeType(format.codecs, MimeTypes.VIDEO_H264))
&& (nalUnitHeaderFirstByte & 0x1F) == H264_NAL_UNIT_TYPE_SEI)
|| ((Objects.equals(format.sampleMimeType, MimeTypes.VIDEO_H265)
|| containsCodecsCorrespondingToMimeType(format.codecs, MimeTypes.VIDEO_H265))
&& ((nalUnitHeaderFirstByte & 0x7E) >> 1) == H265_NAL_UNIT_TYPE_PREFIX_SEI);
}
/**
* Returns the type of the NAL unit in {@code data} that starts at {@code offset}.
*

View file

@ -16,19 +16,17 @@
package androidx.media3.container;
import static androidx.annotation.RestrictTo.Scope.LIBRARY_GROUP;
import static androidx.media3.common.util.Assertions.checkArgument;
import static androidx.media3.common.util.Assertions.checkState;
import static androidx.media3.common.util.Util.castNonNull;
import androidx.annotation.Nullable;
import androidx.annotation.RestrictTo;
import androidx.media3.common.C;
import androidx.media3.common.util.ParsableByteArray;
import androidx.media3.common.util.UnstableApi;
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Deque;
import java.util.PriorityQueue;
import java.util.concurrent.atomic.AtomicLong;
/** A queue of SEI messages, ordered by presentation timestamp. */
@UnstableApi
@ -42,17 +40,18 @@ public final class ReorderingSeiMessageQueue {
}
private final SeiConsumer seiConsumer;
private final AtomicLong tieBreakGenerator = new AtomicLong();
/** Pool of re-usable {@link ParsableByteArray} objects to avoid repeated allocations. */
private final ArrayDeque<ParsableByteArray> unusedParsableByteArrays;
/**
* Pool of re-usable {@link SeiMessage} objects to avoid repeated allocations. Elements should be
* added and removed from the 'tail' of the queue (with {@link Deque#push(Object)} and {@link
* Deque#pop()}), to avoid unnecessary array copying.
*/
private final ArrayDeque<SeiMessage> unusedSeiMessages;
/** Pool of re-usable {@link SampleSeiMessages} objects to avoid repeated allocations. */
private final ArrayDeque<SampleSeiMessages> unusedSampleSeiMessages;
private final PriorityQueue<SampleSeiMessages> pendingSeiMessages;
private final PriorityQueue<SeiMessage> pendingSeiMessages;
private int reorderingQueueSize;
@Nullable private SampleSeiMessages lastQueuedMessage;
/**
* Creates an instance, initially with no max size.
@ -63,8 +62,7 @@ public final class ReorderingSeiMessageQueue {
*/
public ReorderingSeiMessageQueue(SeiConsumer seiConsumer) {
this.seiConsumer = seiConsumer;
unusedParsableByteArrays = new ArrayDeque<>();
unusedSampleSeiMessages = new ArrayDeque<>();
unusedSeiMessages = new ArrayDeque<>();
pendingSeiMessages = new PriorityQueue<>();
reorderingQueueSize = C.LENGTH_UNSET;
}
@ -72,15 +70,8 @@ public final class ReorderingSeiMessageQueue {
/**
* Sets the max size of the re-ordering queue.
*
* <p>The size is defined in terms of the number of unique presentation timestamps, rather than
* the number of messages. This ensures that properties like H.264's {@code
* max_number_reorder_frames} can be used to set this max size in the case of multiple SEI
* messages per sample (where multiple SEI messages therefore have the same presentation
* timestamp).
*
* <p>When the queue exceeds this size during a call to {@link #add(long, ParsableByteArray)}, the
* messages associated with the least timestamp are passed to the {@link SeiConsumer} provided
* during construction.
* least message is passed to the {@link SeiConsumer} provided during construction.
*
* <p>If the new size is larger than the number of elements currently in the queue, items are
* removed from the head of the queue (least first) and passed to the {@link SeiConsumer} provided
@ -95,7 +86,7 @@ public final class ReorderingSeiMessageQueue {
/**
* Returns the maximum size of this queue, or {@link C#LENGTH_UNSET} if it is unbounded.
*
* <p>See {@link #setMaxSize(int)} for details on how size is defined.
* <p>See {@link #setMaxSize(int)}.
*/
public int getMaxSize() {
return reorderingQueueSize;
@ -104,16 +95,12 @@ public final class ReorderingSeiMessageQueue {
/**
* Adds a message to the queue.
*
* <p>If this causes the queue to exceed its {@linkplain #setMaxSize(int) max size}, messages
* associated with the least timestamp (which may be the message passed to this method) are passed
* to the {@link SeiConsumer} provided during construction.
*
* <p>Messages with matching timestamps must be added consecutively (this will naturally happen
* when parsing messages from a container).
* <p>If this causes the queue to exceed its {@linkplain #setMaxSize(int) max size}, the least
* message (which may be the one passed to this method) is passed to the {@link SeiConsumer}
* provided during construction.
*
* @param presentationTimeUs The presentation time of the SEI message.
* @param seiBuffer The SEI data. The data will be copied, so the provided object can be re-used
* after this method returns.
* @param seiBuffer The SEI data. The data will be copied, so the provided object can be re-used.
*/
public void add(long presentationTimeUs, ParsableByteArray seiBuffer) {
if (reorderingQueueSize == 0
@ -123,42 +110,15 @@ public final class ReorderingSeiMessageQueue {
seiConsumer.consume(presentationTimeUs, seiBuffer);
return;
}
// Make a local copy of the SEI data so we can store it in the queue and allow the seiBuffer
// parameter to be safely re-used after this add() method returns.
ParsableByteArray seiBufferCopy = copy(seiBuffer);
if (lastQueuedMessage != null && presentationTimeUs == lastQueuedMessage.presentationTimeUs) {
lastQueuedMessage.nalBuffers.add(seiBufferCopy);
return;
}
SampleSeiMessages sampleSeiMessages =
unusedSampleSeiMessages.isEmpty() ? new SampleSeiMessages() : unusedSampleSeiMessages.pop();
sampleSeiMessages.init(presentationTimeUs, seiBufferCopy);
pendingSeiMessages.add(sampleSeiMessages);
lastQueuedMessage = sampleSeiMessages;
SeiMessage seiMessage =
unusedSeiMessages.isEmpty() ? new SeiMessage() : unusedSeiMessages.poll();
seiMessage.reset(presentationTimeUs, tieBreakGenerator.getAndIncrement(), seiBuffer);
pendingSeiMessages.add(seiMessage);
if (reorderingQueueSize != C.LENGTH_UNSET) {
flushQueueDownToSize(reorderingQueueSize);
}
}
/**
* Copies {@code input} into a {@link ParsableByteArray} instance from {@link
* #unusedParsableByteArrays}, or a new instance if that is empty.
*/
private ParsableByteArray copy(ParsableByteArray input) {
ParsableByteArray result =
unusedParsableByteArrays.isEmpty()
? new ParsableByteArray()
: unusedParsableByteArrays.pop();
result.reset(input.bytesLeft());
System.arraycopy(
/* src= */ input.getData(),
/* srcPos= */ input.getPosition(),
/* dest= */ result.getData(),
/* destPos= */ 0,
/* length= */ result.bytesLeft());
return result;
}
/**
* Empties the queue, passing all messages (least first) to the {@link SeiConsumer} provided
* during construction.
@ -169,42 +129,47 @@ public final class ReorderingSeiMessageQueue {
private void flushQueueDownToSize(int targetSize) {
while (pendingSeiMessages.size() > targetSize) {
SampleSeiMessages sampleSeiMessages = castNonNull(pendingSeiMessages.poll());
for (int i = 0; i < sampleSeiMessages.nalBuffers.size(); i++) {
seiConsumer.consume(
sampleSeiMessages.presentationTimeUs, sampleSeiMessages.nalBuffers.get(i));
unusedParsableByteArrays.push(sampleSeiMessages.nalBuffers.get(i));
}
sampleSeiMessages.nalBuffers.clear();
if (lastQueuedMessage != null
&& lastQueuedMessage.presentationTimeUs == sampleSeiMessages.presentationTimeUs) {
lastQueuedMessage = null;
}
unusedSampleSeiMessages.push(sampleSeiMessages);
SeiMessage seiMessage = castNonNull(pendingSeiMessages.poll());
seiConsumer.consume(seiMessage.presentationTimeUs, seiMessage.data);
unusedSeiMessages.push(seiMessage);
}
}
/** Holds the presentation timestamp of a sample and the data from associated SEI messages. */
private static final class SampleSeiMessages implements Comparable<SampleSeiMessages> {
/** Holds data from a SEI sample with its presentation timestamp. */
private static final class SeiMessage implements Comparable<SeiMessage> {
public final List<ParsableByteArray> nalBuffers;
public long presentationTimeUs;
private final ParsableByteArray data;
public SampleSeiMessages() {
private long presentationTimeUs;
/**
* {@link PriorityQueue} breaks ties arbitrarily. This field ensures that insertion order is
* preserved when messages have the same {@link #presentationTimeUs}.
*/
private long tieBreak;
public SeiMessage() {
presentationTimeUs = C.TIME_UNSET;
nalBuffers = new ArrayList<>();
data = new ParsableByteArray();
}
public void init(long presentationTimeUs, ParsableByteArray nalBuffer) {
checkArgument(presentationTimeUs != C.TIME_UNSET);
checkState(this.nalBuffers.isEmpty());
public void reset(long presentationTimeUs, long tieBreak, ParsableByteArray nalBuffer) {
checkState(presentationTimeUs != C.TIME_UNSET);
this.presentationTimeUs = presentationTimeUs;
this.nalBuffers.add(nalBuffer);
this.tieBreak = tieBreak;
this.data.reset(nalBuffer.bytesLeft());
System.arraycopy(
/* src= */ nalBuffer.getData(),
/* srcPos= */ nalBuffer.getPosition(),
/* dest= */ data.getData(),
/* destPos= */ 0,
/* length= */ nalBuffer.bytesLeft());
}
@Override
public int compareTo(SampleSeiMessages other) {
return Long.compare(this.presentationTimeUs, other.presentationTimeUs);
public int compareTo(SeiMessage other) {
int timeComparison = Long.compare(this.presentationTimeUs, other.presentationTimeUs);
return timeComparison != 0 ? timeComparison : Long.compare(this.tieBreak, other.tieBreak);
}
}
}
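The tie-break pattern used by `SeiMessage.compareTo` above can be demonstrated in isolation: `PriorityQueue` breaks ties arbitrarily, so a monotonically increasing counter is compared second to preserve insertion order for equal timestamps. The `Message` class here is a hypothetical stand-in, not the queue's actual `SeiMessage`:

```java
import java.util.PriorityQueue;
import java.util.concurrent.atomic.AtomicLong;

// Standalone sketch of stable ordering in a PriorityQueue via a tie-break counter.
public class TieBreakSketch {
  static final class Message implements Comparable<Message> {
    final long timeUs;
    final long tieBreak;
    final String label;

    Message(long timeUs, long tieBreak, String label) {
      this.timeUs = timeUs;
      this.tieBreak = tieBreak;
      this.label = label;
    }

    @Override
    public int compareTo(Message other) {
      int byTime = Long.compare(timeUs, other.timeUs);
      // Fall back to insertion order when timestamps match.
      return byTime != 0 ? byTime : Long.compare(tieBreak, other.tieBreak);
    }
  }

  public static void main(String[] args) {
    AtomicLong tieBreakGenerator = new AtomicLong();
    PriorityQueue<Message> queue = new PriorityQueue<>();
    // Two messages share timeUs == 345; the counter keeps them in insertion order.
    queue.add(new Message(345, tieBreakGenerator.getAndIncrement(), "first@345"));
    queue.add(new Message(345, tieBreakGenerator.getAndIncrement(), "second@345"));
    queue.add(new Message(-123, tieBreakGenerator.getAndIncrement(), "only@-123"));
    while (!queue.isEmpty()) {
      System.out.println(queue.poll().label);
    }
    // Prints: only@-123, first@345, second@345.
  }
}
```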

View file

@ -115,41 +115,6 @@ public final class ReorderingSeiMessageQueueTest {
.containsExactly(new SeiMessage(/* presentationTimeUs= */ -123, data2));
}
@Test
public void withMaxSize_addEmitsWhenQueueIsFull_handlesDuplicateTimestamps() {
ArrayList<SeiMessage> emittedMessages = new ArrayList<>();
ReorderingSeiMessageQueue reorderingQueue =
new ReorderingSeiMessageQueue(
(presentationTimeUs, seiBuffer) ->
emittedMessages.add(new SeiMessage(presentationTimeUs, seiBuffer)));
reorderingQueue.setMaxSize(1);
// Deliberately re-use a single ParsableByteArray instance to ensure the implementation is
// copying as required.
ParsableByteArray scratchData = new ParsableByteArray();
byte[] data1 = TestUtil.buildTestData(20);
scratchData.reset(data1);
reorderingQueue.add(/* presentationTimeUs= */ 345, scratchData);
// Add a message with a repeated timestamp which should not trigger the max size.
byte[] data2 = TestUtil.buildTestData(15);
scratchData.reset(data2);
reorderingQueue.add(/* presentationTimeUs= */ 345, scratchData);
byte[] data3 = TestUtil.buildTestData(10);
scratchData.reset(data3);
reorderingQueue.add(/* presentationTimeUs= */ -123, scratchData);
// Add another message to flush out the two t=345 messages.
byte[] data4 = TestUtil.buildTestData(5);
scratchData.reset(data4);
reorderingQueue.add(/* presentationTimeUs= */ 456, scratchData);
assertThat(emittedMessages)
.containsExactly(
new SeiMessage(/* presentationTimeUs= */ -123, data3),
new SeiMessage(/* presentationTimeUs= */ 345, data1),
new SeiMessage(/* presentationTimeUs= */ 345, data2))
.inOrder();
}
/**
* Tests that if a message smaller than all current queue items is added when the queue is full,
* the same {@link ParsableByteArray} instance is passed straight to the output to avoid

View file

@ -31,7 +31,6 @@ import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.MoreExecutors;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.concurrent.ExecutionException;
import okhttp3.mockwebserver.MockResponse;
@ -216,28 +215,6 @@ public class DataSourceBitmapLoaderTest {
assertThat(bitmap.isMutable()).isTrue();
}
@Test
public void loadBitmap_withFileUriAndMaxOutputDimension_loadsDataWithSmallerSize()
throws Exception {
byte[] imageData =
TestUtil.getByteArray(ApplicationProvider.getApplicationContext(), TEST_IMAGE_PATH);
File file = tempFolder.newFile();
Files.write(Path.of(file.getAbsolutePath()), imageData);
Uri uri = Uri.fromFile(file);
int maximumOutputDimension = 2000;
DataSourceBitmapLoader bitmapLoader =
new DataSourceBitmapLoader(
MoreExecutors.newDirectExecutorService(),
dataSourceFactory,
/* options= */ null,
maximumOutputDimension);
Bitmap bitmap = bitmapLoader.loadBitmap(uri).get();
assertThat(bitmap.getWidth()).isAtMost(maximumOutputDimension);
assertThat(bitmap.getHeight()).isAtMost(maximumOutputDimension);
}
@Test
public void loadBitmap_fileUriWithFileNotExisting_throws() {
DataSourceBitmapLoader bitmapLoader =

View file

@ -15,11 +15,11 @@
*/
package androidx.media3.datasource;
import android.net.Uri;
import androidx.media3.test.utils.DataSourceContractTest;
import androidx.media3.test.utils.HttpDataSourceTestEnv;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.common.collect.ImmutableList;
import java.util.List;
import org.junit.Rule;
import org.junit.runner.RunWith;
@ -40,7 +40,7 @@ public class DefaultHttpDataSourceContractTest extends DataSourceContractTest {
}
@Override
protected List<TestResource> getNotFoundResources() {
return httpDataSourceTestEnv.getNotFoundResources();
protected Uri getNotFoundUri() {
return Uri.parse(httpDataSourceTestEnv.getNonexistentUrl());
}
}

View file

@ -15,13 +15,13 @@
*/
package androidx.media3.datasource;
import android.net.Uri;
import android.net.http.HttpEngine;
import androidx.media3.test.utils.DataSourceContractTest;
import androidx.media3.test.utils.HttpDataSourceTestEnv;
import androidx.test.core.app.ApplicationProvider;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.common.collect.ImmutableList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.junit.After;
@ -53,7 +53,7 @@ public class HttpEngineDataSourceContractTest extends DataSourceContractTest {
}
@Override
protected List<TestResource> getNotFoundResources() {
return httpDataSourceTestEnv.getNotFoundResources();
protected Uri getNotFoundUri() {
return Uri.parse(httpDataSourceTestEnv.getNonexistentUrl());
}
}

View file

@ -15,14 +15,11 @@
*/
package androidx.media3.datasource;
import static java.lang.Math.max;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Matrix;
import androidx.annotation.Nullable;
import androidx.exifinterface.media.ExifInterface;
import androidx.media3.common.C;
import androidx.media3.common.ParserException;
import androidx.media3.common.util.UnstableApi;
import java.io.ByteArrayInputStream;
@ -41,37 +38,14 @@ public final class BitmapUtil {
*
* @param data Byte array of compressed image data.
* @param length The number of bytes to parse.
* @param options The {@link BitmapFactory.Options} to decode the {@code data} with.
* @param maximumOutputDimension The largest output Bitmap dimension that can be returned by this
* method, or {@link C#LENGTH_UNSET} if no limits are enforced.
* @param options the {@link BitmapFactory.Options} to decode the {@code data} with.
* @throws ParserException if the {@code data} could not be decoded.
*/
// BitmapFactory's options parameter is null-ok.
@SuppressWarnings("nullness:argument.type.incompatible")
public static Bitmap decode(
byte[] data, int length, @Nullable BitmapFactory.Options options, int maximumOutputDimension)
public static Bitmap decode(byte[] data, int length, @Nullable BitmapFactory.Options options)
throws IOException {
if (maximumOutputDimension != C.LENGTH_UNSET) {
if (options == null) {
options = new BitmapFactory.Options();
}
options.inJustDecodeBounds = true;
BitmapFactory.decodeByteArray(data, /* offset= */ 0, length, options);
int largerDimensions = max(options.outWidth, options.outHeight);
options.inJustDecodeBounds = false;
options.inSampleSize = 1;
// Only scaling by 2x is supported.
while (largerDimensions > maximumOutputDimension) {
options.inSampleSize *= 2;
largerDimensions /= 2;
}
}
@Nullable Bitmap bitmap = BitmapFactory.decodeByteArray(data, /* offset= */ 0, length, options);
if (options != null) {
options.inSampleSize = 1;
}
if (bitmap == null) {
throw ParserException.createForMalformedContainer(
"Could not decode image data", new IllegalStateException());

View file

@ -23,7 +23,6 @@ import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import androidx.annotation.Nullable;
import androidx.media3.common.C;
import androidx.media3.common.util.BitmapLoader;
import androidx.media3.common.util.UnstableApi;
import com.google.common.base.Supplier;
@ -52,7 +51,6 @@ public final class DataSourceBitmapLoader implements BitmapLoader {
private final ListeningExecutorService listeningExecutorService;
private final DataSource.Factory dataSourceFactory;
@Nullable private final BitmapFactory.Options options;
private final int maximumOutputDimension;
/**
* Creates an instance that uses a {@link DefaultHttpDataSource} for image loading and delegates
@ -86,29 +84,9 @@ public final class DataSourceBitmapLoader implements BitmapLoader {
ListeningExecutorService listeningExecutorService,
DataSource.Factory dataSourceFactory,
@Nullable BitmapFactory.Options options) {
this(listeningExecutorService, dataSourceFactory, options, C.LENGTH_UNSET);
}
/**
* Creates an instance that delegates loading tasks to the {@link ListeningExecutorService}.
*
* <p>Use {@code maximumOutputDimension} to limit memory usage when loading large Bitmaps.
*
* @param listeningExecutorService The {@link ListeningExecutorService}.
* @param dataSourceFactory The {@link DataSource.Factory} that creates the {@link DataSource}
* used to load the image.
* @param options The {@link BitmapFactory.Options} the image should be loaded with.
* @param maximumOutputDimension The maximum dimension of the output Bitmap.
*/
public DataSourceBitmapLoader(
ListeningExecutorService listeningExecutorService,
DataSource.Factory dataSourceFactory,
@Nullable BitmapFactory.Options options,
int maximumOutputDimension) {
this.listeningExecutorService = listeningExecutorService;
this.dataSourceFactory = dataSourceFactory;
this.options = options;
this.maximumOutputDimension = maximumOutputDimension;
}
@Override
@ -118,27 +96,22 @@ public final class DataSourceBitmapLoader implements BitmapLoader {
@Override
public ListenableFuture<Bitmap> decodeBitmap(byte[] data) {
return listeningExecutorService.submit(
() -> BitmapUtil.decode(data, data.length, options, maximumOutputDimension));
return listeningExecutorService.submit(() -> BitmapUtil.decode(data, data.length, options));
}
@Override
public ListenableFuture<Bitmap> loadBitmap(Uri uri) {
return listeningExecutorService.submit(
() -> load(dataSourceFactory.createDataSource(), uri, options, maximumOutputDimension));
() -> load(dataSourceFactory.createDataSource(), uri, options));
}
private static Bitmap load(
DataSource dataSource,
Uri uri,
@Nullable BitmapFactory.Options options,
int maximumOutputDimension)
throws IOException {
DataSource dataSource, Uri uri, @Nullable BitmapFactory.Options options) throws IOException {
try {
DataSpec dataSpec = new DataSpec(uri);
dataSource.open(dataSpec);
byte[] readData = DataSourceUtil.readToEnd(dataSource);
return BitmapUtil.decode(readData, readData.length, options, maximumOutputDimension);
return BitmapUtil.decode(readData, readData.length, options);
} finally {
dataSource.close();
}

View file

@ -87,7 +87,7 @@ public final class DefaultDataSource implements DataSource {
*/
public Factory(Context context, DataSource.Factory baseDataSourceFactory) {
this.context = context.getApplicationContext();
this.baseDataSourceFactory = Assertions.checkNotNull(baseDataSourceFactory);
this.baseDataSourceFactory = baseDataSourceFactory;
}
/**

View file

@ -261,7 +261,7 @@ public class DefaultHttpDataSource extends BaseDataSource implements HttpDataSou
@Nullable private DataSpec dataSpec;
@Nullable private HttpURLConnection connection;
@Nullable private InputStream inputStream;
private boolean transferStarted;
private boolean opened;
private int responseCode;
private long bytesToRead;
private long bytesRead;
@ -296,13 +296,7 @@ public class DefaultHttpDataSource extends BaseDataSource implements HttpDataSou
@Override
@Nullable
public Uri getUri() {
if (connection != null) {
return Uri.parse(connection.getURL().toString());
} else if (dataSpec != null) {
return dataSpec.uri;
} else {
return null;
}
return connection == null ? null : Uri.parse(connection.getURL().toString());
}
@UnstableApi
@ -378,7 +372,7 @@ public class DefaultHttpDataSource extends BaseDataSource implements HttpDataSou
long documentSize =
HttpUtil.getDocumentSize(connection.getHeaderField(HttpHeaders.CONTENT_RANGE));
if (dataSpec.position == documentSize) {
transferStarted = true;
opened = true;
transferStarted(dataSpec);
return dataSpec.length != C.LENGTH_UNSET ? dataSpec.length : 0;
}
@ -448,7 +442,7 @@ public class DefaultHttpDataSource extends BaseDataSource implements HttpDataSou
HttpDataSourceException.TYPE_OPEN);
}
transferStarted = true;
opened = true;
transferStarted(dataSpec);
try {
@ -499,12 +493,10 @@ public class DefaultHttpDataSource extends BaseDataSource implements HttpDataSou
} finally {
inputStream = null;
closeConnectionQuietly();
if (transferStarted) {
transferStarted = false;
if (opened) {
opened = false;
transferEnded();
}
connection = null;
dataSpec = null;
}
}
@ -795,6 +787,7 @@ public class DefaultHttpDataSource extends BaseDataSource implements HttpDataSou
} catch (Exception e) {
Log.e(TAG, "Unexpected error while disconnecting", e);
}
connection = null;
}
}

View file

@ -38,6 +38,9 @@ import androidx.media3.common.util.Clock;
import androidx.media3.common.util.ConditionVariable;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import androidx.media3.datasource.HttpDataSource.CleartextNotPermittedException;
import androidx.media3.datasource.HttpDataSource.HttpDataSourceException;
import androidx.media3.datasource.HttpDataSource.InvalidResponseCodeException;
import com.google.common.base.Ascii;
import com.google.common.base.Predicate;
import com.google.common.net.HttpHeaders;
@ -338,7 +341,7 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
private final boolean keepPostFor302Redirects;
// Accessed by the calling thread only.
private boolean transferStarted;
private boolean opened;
private long bytesRemaining;
@Nullable private DataSpec currentDataSpec;
@ -427,20 +430,14 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
@Override
@Nullable
public Uri getUri() {
if (responseInfo != null) {
return Uri.parse(responseInfo.getUrl());
} else if (currentDataSpec != null) {
return currentDataSpec.uri;
} else {
return null;
}
return responseInfo == null ? null : Uri.parse(responseInfo.getUrl());
}
@UnstableApi
@Override
public long open(DataSpec dataSpec) throws HttpDataSourceException {
Assertions.checkNotNull(dataSpec);
Assertions.checkState(!transferStarted);
Assertions.checkState(!opened);
operation.close();
resetConnectTimeout();
@ -502,7 +499,7 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
long documentSize =
HttpUtil.getDocumentSize(getFirstHeader(responseHeaders, HttpHeaders.CONTENT_RANGE));
if (dataSpec.position == documentSize) {
transferStarted = true;
opened = true;
transferStarted(dataSpec);
return dataSpec.length != C.LENGTH_UNSET ? dataSpec.length : 0;
}
@ -561,7 +558,7 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
bytesRemaining = dataSpec.length;
}
transferStarted = true;
opened = true;
transferStarted(dataSpec);
skipFully(bytesToSkip, dataSpec);
@ -571,7 +568,7 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
@UnstableApi
@Override
public int read(byte[] buffer, int offset, int length) throws HttpDataSourceException {
Assertions.checkState(transferStarted);
Assertions.checkState(opened);
if (length == 0) {
return 0;
@ -642,7 +639,7 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
*/
@UnstableApi
public int read(ByteBuffer buffer) throws HttpDataSourceException {
Assertions.checkState(transferStarted);
Assertions.checkState(opened);
if (!buffer.isDirect()) {
throw new IllegalArgumentException("Passed buffer is not a direct ByteBuffer");
@ -699,8 +696,8 @@ public final class HttpEngineDataSource extends BaseDataSource implements HttpDa
responseInfo = null;
exception = null;
finished = false;
if (transferStarted) {
transferStarted = false;
if (opened) {
opened = false;
transferEnded();
}
}

View file

@ -83,18 +83,10 @@ public final class StatsDataSource implements DataSource {
// Reassign defaults in case dataSource.open throws an exception.
lastOpenedUri = dataSpec.uri;
lastResponseHeaders = Collections.emptyMap();
try {
return dataSource.open(dataSpec);
} finally {
// TODO: b/373321956 - Remove this null-tolerance when we've fixed all DataSource
// implementations to return a non-null URI after a failed open() call and before close()
// (and updated the DataSourceContractTest to enforce this).
Uri upstreamUri = getUri();
if (upstreamUri != null) {
lastOpenedUri = upstreamUri;
}
lastResponseHeaders = getResponseHeaders();
}
long availableBytes = dataSource.open(dataSpec);
lastOpenedUri = Assertions.checkNotNull(getUri());
lastResponseHeaders = getResponseHeaders();
return availableBytes;
}
@Override
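The try/finally variant in the StatsDataSource diff above exists so that a redirected URI observed during `open()` is still recorded when the delegate throws. That pattern can be sketched independently of the `DataSource` interface; the `Source` interface and class names below are hypothetical stand-ins, not the Media3 API:

```java
import java.io.IOException;

/**
 * Sketch: a wrapper that records the delegate's reported URI in a finally
 * block, so a redirect seen during a failed open() is still captured.
 */
final class StatsWrapper {
  interface Source {
    void open(String uri) throws IOException;

    String getUri(); // may return a redirected URI, or null before any open
  }

  private final Source delegate;
  String lastOpenedUri = "";

  StatsWrapper(Source delegate) {
    this.delegate = delegate;
  }

  void open(String uri) throws IOException {
    lastOpenedUri = uri; // default in case the delegate reports nothing
    try {
      delegate.open(uri);
    } finally {
      // Runs whether or not open() threw, capturing any redirected URI.
      String reported = delegate.getUri();
      if (reported != null) {
        lastOpenedUri = reported;
      }
    }
  }
}
```

The finally block is what distinguishes this from simply assigning after a successful `open()` call: the recorded URI is updated on both the success and failure paths.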

View file

@ -32,28 +32,17 @@ import org.junit.runner.RunWith;
@RunWith(AndroidJUnit4.class)
public class ResolvingDataSourceContractTest extends DataSourceContractTest {
private static final String REQUESTED_URI = "test://simple.test";
private static final String URI = "test://simple.test";
private static final String RESOLVED_URI = "resolved://simple.resolved";
private static final String REQUESTED_URI_WITH_DIFFERENT_REPORTED =
"test://different.report.test";
private static final String RESOLVED_URI_WITH_DIFFERENT_REPORTED =
"resolved://different.report.test";
private static final String REPORTED_URI = "reported://reported.test";
private byte[] simpleData;
private byte[] differentReportedData;
private FakeDataSet fakeDataSet;
private FakeDataSource fakeDataSource;
@Before
public void setUp() {
simpleData = TestUtil.buildTestData(/* length= */ 20);
differentReportedData = TestUtil.buildTestData(/* length= */ 15);
fakeDataSet =
new FakeDataSet()
.setData(RESOLVED_URI, simpleData)
.setData(RESOLVED_URI_WITH_DIFFERENT_REPORTED, differentReportedData);
fakeDataSet = new FakeDataSet().newData(RESOLVED_URI).appendReadData(simpleData).endData();
}
@Override
@ -61,15 +50,8 @@ public class ResolvingDataSourceContractTest extends DataSourceContractTest {
return ImmutableList.of(
new TestResource.Builder()
.setName("simple")
.setUri(REQUESTED_URI)
.setResolvedUri(RESOLVED_URI)
.setUri(URI)
.setExpectedBytes(simpleData)
.build(),
new TestResource.Builder()
.setName("different-reported")
.setUri(REQUESTED_URI_WITH_DIFFERENT_REPORTED)
.setResolvedUri(REPORTED_URI)
.setExpectedBytes(differentReportedData)
.build());
}
@ -86,21 +68,9 @@ public class ResolvingDataSourceContractTest extends DataSourceContractTest {
new Resolver() {
@Override
public DataSpec resolveDataSpec(DataSpec dataSpec) throws IOException {
switch (dataSpec.uri.normalizeScheme().toString()) {
case REQUESTED_URI:
return dataSpec.buildUpon().setUri(RESOLVED_URI).build();
case REQUESTED_URI_WITH_DIFFERENT_REPORTED:
return dataSpec.buildUpon().setUri(RESOLVED_URI_WITH_DIFFERENT_REPORTED).build();
default:
return dataSpec;
}
}
@Override
public Uri resolveReportedUri(Uri uri) {
return uri.normalizeScheme().toString().equals(RESOLVED_URI_WITH_DIFFERENT_REPORTED)
? Uri.parse(REPORTED_URI)
: uri;
return URI.equals(dataSpec.uri.normalizeScheme().toString())
? dataSpec.buildUpon().setUri(RESOLVED_URI).build()
: dataSpec;
}
});
}

View file

@ -1,63 +0,0 @@
/*
* Copyright 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.datasource;
import static com.google.common.truth.Truth.assertThat;
import static org.junit.Assert.assertThrows;
import android.net.Uri;
import androidx.media3.test.utils.FakeDataSet;
import androidx.media3.test.utils.FakeDataSource;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import java.io.IOException;
import org.junit.Test;
import org.junit.runner.RunWith;
@RunWith(AndroidJUnit4.class)
public final class StatsDataSourceTest {
@Test
public void getLastOpenedUri_openSucceeds_returnsRedirectedUriAfterClosure() throws Exception {
Uri redirectedUri = Uri.parse("bar");
FakeDataSet fakeDataSet = new FakeDataSet();
fakeDataSet.setRandomData(redirectedUri, /* length= */ 10);
StatsDataSource statsDataSource =
new StatsDataSource(
new ResolvingDataSource(
new FakeDataSource(fakeDataSet),
dataSpec -> dataSpec.buildUpon().setUri(redirectedUri).build()));
statsDataSource.open(new DataSpec(Uri.parse("foo")));
statsDataSource.close();
assertThat(statsDataSource.getLastOpenedUri()).isEqualTo(redirectedUri);
}
@Test
public void getLastOpenedUri_openFails_returnsRedirectedUriAfterClosure() throws Exception {
Uri redirectedUri = Uri.parse("bar");
StatsDataSource statsDataSource =
new StatsDataSource(
new ResolvingDataSource(
new FakeDataSource(),
dataSpec -> dataSpec.buildUpon().setUri(redirectedUri).build()));
assertThrows(IOException.class, () -> statsDataSource.open(new DataSpec(Uri.parse("foo"))));
statsDataSource.close();
assertThat(statsDataSource.getLastOpenedUri()).isEqualTo(redirectedUri);
}
}

View file

@ -15,6 +15,7 @@
*/
package androidx.media3.datasource.cronet;
import android.net.Uri;
import androidx.media3.datasource.DataSource;
import androidx.media3.test.utils.DataSourceContractTest;
import androidx.media3.test.utils.HttpDataSourceTestEnv;
@ -65,7 +66,7 @@ public class CronetDataSourceContractTest extends DataSourceContractTest {
}
@Override
protected List<TestResource> getNotFoundResources() {
return httpDataSourceTestEnv.getNotFoundResources();
protected Uri getNotFoundUri() {
return Uri.parse(httpDataSourceTestEnv.getNonexistentUrl());
}
}

View file

@ -463,7 +463,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
private final boolean keepPostFor302Redirects;
// Accessed by the calling thread only.
private boolean transferStarted;
private boolean opened;
private long bytesRemaining;
// Written from the calling thread only. currentUrlRequest.start() calls ensure writes are visible
@ -555,20 +555,14 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
@Override
@Nullable
public Uri getUri() {
if (responseInfo != null) {
return Uri.parse(responseInfo.getUrl());
} else if (currentDataSpec != null) {
return currentDataSpec.uri;
} else {
return null;
}
return responseInfo == null ? null : Uri.parse(responseInfo.getUrl());
}
@UnstableApi
@Override
public long open(DataSpec dataSpec) throws HttpDataSourceException {
Assertions.checkNotNull(dataSpec);
Assertions.checkState(!transferStarted);
Assertions.checkState(!opened);
operation.close();
resetConnectTimeout();
@ -630,7 +624,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
long documentSize =
HttpUtil.getDocumentSize(getFirstHeader(responseHeaders, HttpHeaders.CONTENT_RANGE));
if (dataSpec.position == documentSize) {
transferStarted = true;
opened = true;
transferStarted(dataSpec);
return dataSpec.length != C.LENGTH_UNSET ? dataSpec.length : 0;
}
@ -689,7 +683,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
bytesRemaining = dataSpec.length;
}
transferStarted = true;
opened = true;
transferStarted(dataSpec);
skipFully(bytesToSkip, dataSpec);
@ -699,7 +693,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
@UnstableApi
@Override
public int read(byte[] buffer, int offset, int length) throws HttpDataSourceException {
Assertions.checkState(transferStarted);
Assertions.checkState(opened);
if (length == 0) {
return 0;
@ -770,7 +764,7 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
*/
@UnstableApi
public int read(ByteBuffer buffer) throws HttpDataSourceException {
Assertions.checkState(transferStarted);
Assertions.checkState(opened);
if (!buffer.isDirect()) {
throw new IllegalArgumentException("Passed buffer is not a direct ByteBuffer");
@ -824,8 +818,8 @@ public class CronetDataSource extends BaseDataSource implements HttpDataSource {
responseInfo = null;
exception = null;
finished = false;
if (transferStarted) {
transferStarted = false;
if (opened) {
opened = false;
transferEnded();
}
}

View file

@ -192,7 +192,7 @@ public class OkHttpDataSource extends BaseDataSource implements HttpDataSource {
@Nullable private DataSpec dataSpec;
@Nullable private Response response;
@Nullable private InputStream responseByteStream;
private boolean connectionEstablished;
private boolean opened;
private long bytesToRead;
private long bytesRead;
@ -215,13 +215,7 @@ public class OkHttpDataSource extends BaseDataSource implements HttpDataSource {
@Override
@Nullable
public Uri getUri() {
if (response != null) {
return Uri.parse(response.request().url().toString());
} else if (dataSpec != null) {
return dataSpec.uri;
} else {
return null;
}
return response == null ? null : Uri.parse(response.request().url().toString());
}
@UnstableApi
@ -287,7 +281,7 @@ public class OkHttpDataSource extends BaseDataSource implements HttpDataSource {
long documentSize =
HttpUtil.getDocumentSize(response.headers().get(HttpHeaders.CONTENT_RANGE));
if (dataSpec.position == documentSize) {
connectionEstablished = true;
opened = true;
transferStarted(dataSpec);
return dataSpec.length != C.LENGTH_UNSET ? dataSpec.length : 0;
}
@ -331,7 +325,7 @@ public class OkHttpDataSource extends BaseDataSource implements HttpDataSource {
bytesToRead = contentLength != -1 ? (contentLength - bytesToSkip) : C.LENGTH_UNSET;
}
connectionEstablished = true;
opened = true;
transferStarted(dataSpec);
try {
@ -358,13 +352,11 @@ public class OkHttpDataSource extends BaseDataSource implements HttpDataSource {
@UnstableApi
@Override
public void close() {
if (connectionEstablished) {
connectionEstablished = false;
if (opened) {
opened = false;
transferEnded();
closeConnectionQuietly();
}
response = null;
dataSpec = null;
}
/** Establishes a connection. */
@ -532,6 +524,7 @@ public class OkHttpDataSource extends BaseDataSource implements HttpDataSource {
private void closeConnectionQuietly() {
if (response != null) {
Assertions.checkNotNull(response.body()).close();
response = null;
}
responseByteStream = null;
}

View file

@ -15,12 +15,12 @@
*/
package androidx.media3.datasource.okhttp;
import android.net.Uri;
import androidx.media3.datasource.DataSource;
import androidx.media3.test.utils.DataSourceContractTest;
import androidx.media3.test.utils.HttpDataSourceTestEnv;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.common.collect.ImmutableList;
import java.util.List;
import okhttp3.OkHttpClient;
import org.junit.Rule;
import org.junit.runner.RunWith;
@ -42,7 +42,7 @@ public class OkHttpDataSourceContractTest extends DataSourceContractTest {
}
@Override
protected List<TestResource> getNotFoundResources() {
return httpDataSourceTestEnv.getNotFoundResources();
protected Uri getNotFoundUri() {
return Uri.parse(httpDataSourceTestEnv.getNonexistentUrl());
}
}

View file

@ -27,14 +27,6 @@ android {
}
}
}
// TODO(Internal: b/372449691): Remove packagingOptions once AGP is updated
// to version 8.5.1 or higher.
packagingOptions {
jniLibs {
useLegacyPackaging true
}
}
}
// Configure the native build only if libgav1 is present to avoid gradle sync
@ -59,9 +51,4 @@ dependencies {
implementation project(modulePrefix + 'lib-exoplayer')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
compileOnly 'org.jetbrains.kotlin:kotlin-annotations-jvm:' + kotlinAnnotationsVersion
testImplementation project(modulePrefix + 'test-utils')
testImplementation 'org.robolectric:robolectric:' + robolectricVersion
androidTestImplementation project(modulePrefix + 'test-utils')
androidTestImplementation 'androidx.test:runner:' + androidxTestRunnerVersion
androidTestImplementation 'androidx.test.ext:junit:' + androidxTestJUnitVersion
}

View file

@ -61,7 +61,3 @@ target_link_libraries(gav1JNI
PRIVATE libgav1_static
PRIVATE ${android_log_lib})
# Enable 16 KB ELF alignment.
target_link_options(gav1JNI
PRIVATE "-Wl,-z,max-page-size=16384")

View file

@ -19,23 +19,12 @@
#include <android/native_window_jni.h>
#include "cpu_features_macros.h" // NOLINT
// For ARMv7, we use `cpu_feature` to detect availability of NEON at runtime.
#ifdef CPU_FEATURES_ARCH_ARM
#include "cpuinfo_arm.h" // NOLINT
#endif // CPU_FEATURES_ARCH_ARM
// For ARM in general (v7/v8) we detect compile time availability of NEON.
#ifdef CPU_FEATURES_ARCH_ANY_ARM
#if CPU_FEATURES_COMPILED_ANY_ARM_NEON // always defined to 0 or 1.
#define HAS_COMPILE_TIME_NEON_SUPPORT
#endif // CPU_FEATURES_COMPILED_ANY_ARM_NEON
#endif // CPU_FEATURES_ARCH_ANY_ARM
#ifdef HAS_COMPILE_TIME_NEON_SUPPORT
#ifdef CPU_FEATURES_COMPILED_ANY_ARM_NEON
#include <arm_neon.h>
#endif
#endif // CPU_FEATURES_COMPILED_ANY_ARM_NEON
#include <jni.h>
#include <cstdint>
@ -411,7 +400,7 @@ void Convert10BitFrameTo8BitDataBuffer(
}
}
#ifdef HAS_COMPILE_TIME_NEON_SUPPORT
#ifdef CPU_FEATURES_COMPILED_ANY_ARM_NEON
void Convert10BitFrameTo8BitDataBufferNeon(
const libgav1::DecoderBuffer* decoder_buffer, jbyte* data) {
uint32x2_t lcg_value = vdup_n_u32(random());
@ -508,7 +497,7 @@ void Convert10BitFrameTo8BitDataBufferNeon(
}
}
}
#endif // HAS_COMPILE_TIME_NEON_SUPPORT
#endif // CPU_FEATURES_COMPILED_ANY_ARM_NEON
} // namespace
@ -518,19 +507,20 @@ DECODER_FUNC(jlong, gav1Init, jint threads) {
return kStatusError;
}
#ifdef CPU_FEATURES_ARCH_ANY_ARM // Arm v7/v8
#ifndef HAS_COMPILE_TIME_NEON_SUPPORT // no compile time NEON support
#ifdef CPU_FEATURES_ARCH_ARM // check runtime support for ARMv7
if (cpu_features::GetArmInfo().features.neon == false) {
#ifdef CPU_FEATURES_ARCH_ARM
// Libgav1 requires NEON with arm ABIs.
#ifdef CPU_FEATURES_COMPILED_ANY_ARM_NEON
const cpu_features::ArmFeatures arm_features =
cpu_features::GetArmInfo().features;
if (!arm_features.neon) {
context->jni_status_code = kJniStatusNeonNotSupported;
return reinterpret_cast<jlong>(context);
}
#else // Unexpected case of an ARMv8 with no NEON support.
#else
context->jni_status_code = kJniStatusNeonNotSupported;
return reinterpret_cast<jlong>(context);
#endif // CPU_FEATURES_COMPILED_ANY_ARM_NEON
#endif // CPU_FEATURES_ARCH_ARM
#endif // HAS_COMPILE_TIME_NEON_SUPPORT
#endif // CPU_FEATURES_ARCH_ANY_ARM
libgav1::DecoderSettings settings;
settings.threads = threads;
@ -623,11 +613,11 @@ DECODER_FUNC(jint, gav1GetFrame, jlong jContext, jobject jOutputBuffer,
CopyFrameToDataBuffer(decoder_buffer, data);
break;
case 10:
#ifdef HAS_COMPILE_TIME_NEON_SUPPORT
#ifdef CPU_FEATURES_COMPILED_ANY_ARM_NEON
Convert10BitFrameTo8BitDataBufferNeon(decoder_buffer, data);
#else
Convert10BitFrameTo8BitDataBuffer(decoder_buffer, data);
#endif // HAS_COMPILE_TIME_NEON_SUPPORT
#endif // CPU_FEATURES_COMPILED_ANY_ARM_NEON
break;
default:
context->jni_status_code = kJniStatusBitDepth12NotSupportedWithYuv;

View file

@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright 2024 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<manifest package="androidx.media3.decoder.av1.test">
<uses-sdk/>
</manifest>

View file

@ -1,33 +0,0 @@
/*
* Copyright 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.decoder.av1;
import androidx.media3.common.C;
import androidx.media3.test.utils.DefaultRenderersFactoryAsserts;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
/** Unit test for {@link DefaultRenderersFactoryTest} with {@link Libgav1VideoRenderer}. */
@RunWith(AndroidJUnit4.class)
public final class DefaultRenderersFactoryTest {
@Test
public void createRenderers_instantiatesAv1Renderer() {
DefaultRenderersFactoryAsserts.assertExtensionRendererCreated(
Libgav1VideoRenderer.class, C.TRACK_TYPE_VIDEO);
}
}

View file

@ -13,17 +13,7 @@
// limitations under the License.
apply from: "$gradle.ext.androidxMediaSettingsDir/common_library_config.gradle"
android {
namespace 'androidx.media3.decoder.ffmpeg'
// TODO(Internal: b/372449691): Remove packagingOptions once AGP is updated
// to version 8.5.1 or higher.
packagingOptions {
jniLibs {
useLegacyPackaging true
}
}
}
android.namespace = 'androidx.media3.decoder.ffmpeg'
// Configure the native build only if ffmpeg is present to avoid gradle sync
// failures if ffmpeg hasn't been built according to the README instructions.

View file

@ -56,7 +56,3 @@ target_link_libraries(ffmpegJNI
if(ANDROID_ABI STREQUAL "arm64-v8a")
target_link_options(ffmpegJNI PRIVATE "-Wl,-Bsymbolic")
endif()
# Enable 16 KB ELF alignment.
target_link_options(ffmpegJNI
PRIVATE "-Wl,-z,max-page-size=16384")

View file

@ -17,7 +17,7 @@ To use the module you need to clone this GitHub project and depend on its
modules locally. Instructions for doing this can be found in the
[top level README][].
In addition, it's necessary to fetch libflac as follows:
In addition, it's necessary to build the module's native components as follows:
* Set the following environment variables:
@ -26,24 +26,30 @@ cd "<path to project checkout>"
FLAC_MODULE_PATH="$(pwd)/libraries/decoder_flac/src/main"
```
* Fetch libflac:
* Download the [Android NDK][] and set its location in an environment variable.
This build configuration has been tested on NDK r21.
```
NDK_PATH="<path to Android NDK>"
```
* Download and extract flac-1.3.2 as "${FLAC_MODULE_PATH}/jni/flac" folder:
```
cd "${FLAC_MODULE_PATH}/jni" && \
git clone https://github.com/xiph/flac.git libflac
curl https://ftp.osuosl.org/pub/xiph/releases/flac/flac-1.3.2.tar.xz | tar xJ && \
mv flac-1.3.2 flac
```
* [Install CMake][]
* Build the JNI native libraries from the command line:
Having followed these steps, gradle will build the module automatically when run
on the command line or via Android Studio, using [CMake][] and [Ninja][] to
configure and build libflac and the module's [JNI wrapper library][].
```
cd "${FLAC_MODULE_PATH}"/jni && \
${NDK_PATH}/ndk-build APP_ABI=all -j4
```
[top level README]: ../../README.md
[Install CMake]: https://developer.android.com/studio/projects/install-ndk
[CMake]: https://cmake.org/
[Ninja]: https://ninja-build.org
[JNI wrapper library]: src/main/jni/flac_jni.cc
[Android NDK]: https://developer.android.com/tools/sdk/ndk/index.html
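The fetch step above expects the extracted sources to end up at `${FLAC_MODULE_PATH}/jni/flac` before `ndk-build` runs. A throwaway sketch of that directory layout, using temp directories in place of a real checkout and an empty directory in place of the downloaded tarball:

```shell
FLAC_MODULE_PATH="$(mktemp -d)"   # stands in for libraries/decoder_flac/src/main
mkdir -p "${FLAC_MODULE_PATH}/jni"
cd "${FLAC_MODULE_PATH}/jni"
# Simulate 'curl ... | tar xJ && mv flac-1.3.2 flac' without a network fetch:
mkdir flac-1.3.2 && mv flac-1.3.2 flac
# ndk-build resolves FLAC_SOURCES relative to this folder, so the layout
# must match before building:
test -d "${FLAC_MODULE_PATH}/jni/flac" && echo "flac sources in place"
```

The real steps differ only in that `flac` contains the extracted flac-1.3.2 sources rather than an empty directory.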
## Build instructions (Windows)

View file

@ -17,42 +17,12 @@ android {
namespace 'androidx.media3.decoder.flac'
sourceSets {
main {
jniLibs.srcDir 'src/main/libs'
jni.srcDirs = [] // Disable the automatic ndk-build call by Android Studio.
}
androidTest.assets.srcDir '../test_data/src/test/assets'
}
defaultConfig {
externalNativeBuild {
cmake {
arguments "-DWITH_OGG=OFF"
arguments "-DINSTALL_MANPAGES=OFF"
targets "flacJNI"
}
}
}
// TODO(Internal: b/372449691): Remove packagingOptions once AGP is updated
// to version 8.5.1 or higher.
packagingOptions {
jniLibs {
useLegacyPackaging true
}
}
}
// Configure the native build only if libflac is present to avoid gradle sync
// failures if libflac hasn't been built according to the README instructions.
if (project.file('src/main/jni/libflac').exists()) {
android.externalNativeBuild.cmake {
path = 'src/main/jni/CMakeLists.txt'
version = '3.21.0+'
if (project.hasProperty('externalNativeBuildDir')) {
if (!new File(externalNativeBuildDir).isAbsolute()) {
ext.externalNativeBuildDir =
new File(rootDir, it.externalNativeBuildDir)
}
buildStagingDirectory = "${externalNativeBuildDir}/${project.name}"
}
}
}
dependencies {

View file

@ -29,6 +29,7 @@ import androidx.test.core.app.ApplicationProvider;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import java.io.IOException;
import java.util.List;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
@ -75,6 +76,7 @@ public final class FlacExtractorSeekTest {
fileName, trackOutput, targetSeekTimeUs, extractedFrameIndex);
}
@Ignore("Fix [internal: b/249505968] before enabling this.")
@Test
public void seeking_seekTable_handlesSeekToEoF() throws IOException {
String fileName = TEST_FILE_SEEK_TABLE;
@ -92,6 +94,7 @@ public final class FlacExtractorSeekTest {
fileName, trackOutput, targetSeekTimeUs, extractedFrameIndex);
}
@Ignore("Fix [internal: b/249505968] before enabling this.")
@Test
public void seeking_seekTable_handlesSeekingBackward() throws IOException {
String fileName = TEST_FILE_SEEK_TABLE;
@ -111,6 +114,7 @@ public final class FlacExtractorSeekTest {
fileName, trackOutput, targetSeekTimeUs, extractedFrameIndex);
}
@Ignore("Fix [internal: b/249505968] before enabling this.")
@Test
public void seeking_seekTable_handlesSeekingForward() throws IOException {
String fileName = TEST_FILE_SEEK_TABLE;
@ -158,6 +162,7 @@ public final class FlacExtractorSeekTest {
fileName, trackOutput, targetSeekTimeUs, extractedFrameIndex);
}
@Ignore("Fix [internal: b/249505968] before enabling this.")
@Test
public void seeking_binarySearch_handlesSeekToEoF() throws IOException {
String fileName = TEST_FILE_BINARY_SEARCH;
@ -175,6 +180,7 @@ public final class FlacExtractorSeekTest {
fileName, trackOutput, targetSeekTimeUs, extractedFrameIndex);
}
@Ignore("Fix [internal: b/249505968] before enabling this.")
@Test
public void seeking_binarySearch_handlesSeekingBackward() throws IOException {
String fileName = TEST_FILE_BINARY_SEARCH;
@ -194,6 +200,7 @@ public final class FlacExtractorSeekTest {
fileName, trackOutput, targetSeekTimeUs, extractedFrameIndex);
}
@Ignore("Fix [internal: b/249505968] before enabling this.")
@Test
public void seeking_binarySearch_handlesSeekingForward() throws IOException {
String fileName = TEST_FILE_BINARY_SEARCH;

View file

@ -0,0 +1,38 @@
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
WORKING_DIR := $(call my-dir)
# build libflacJNI.so
include $(CLEAR_VARS)
include $(WORKING_DIR)/flac_sources.mk
LOCAL_PATH := $(WORKING_DIR)
LOCAL_MODULE := libflacJNI
LOCAL_ARM_MODE := arm
LOCAL_CPP_EXTENSION := .cc
LOCAL_C_INCLUDES := \
$(LOCAL_PATH)/flac/include \
$(LOCAL_PATH)/flac/src/libFLAC/include
LOCAL_SRC_FILES := $(FLAC_SOURCES)
LOCAL_CFLAGS += '-DPACKAGE_VERSION="1.3.2"' -DFLAC__NO_MD5 -DFLAC__INTEGER_ONLY_LIBRARY
LOCAL_CFLAGS += -D_REENTRANT -DPIC -DU_COMMON_IMPLEMENTATION -fPIC -DHAVE_SYS_PARAM_H
LOCAL_CFLAGS += -O3 -funroll-loops -finline-functions -DFLAC__NO_ASM '-DFLAC__HAS_OGG=0'
LOCAL_LDLIBS := -llog -lz -lm
include $(BUILD_SHARED_LIBRARY)

View file

@ -0,0 +1,20 @@
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
APP_OPTIM := release
APP_STL := c++_static
APP_CPPFLAGS := -frtti
APP_PLATFORM := android-14

View file

@ -1,51 +0,0 @@
#
# Copyright 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
cmake_minimum_required(VERSION 3.21.0 FATAL_ERROR)
# Enable C++11 features.
set(CMAKE_CXX_STANDARD 11)
# Define project name for your JNI module
project(libflacJNI C CXX)
set(libflac_jni_root "${CMAKE_CURRENT_SOURCE_DIR}")
# Build libflac.
add_subdirectory("${libflac_jni_root}/libflac"
EXCLUDE_FROM_ALL)
# Build libflacJNI.
add_library(flacJNI
SHARED
flac_jni.cc
flac_parser.cc)
# Add the include directory from libflac.
include_directories("${libflac_jni_root}/libflac/include")
# Locate NDK log library.
find_library(android_log_lib log)
# Link libflacJNI against used libraries.
target_link_libraries(flacJNI
PRIVATE android
PRIVATE FLAC
PRIVATE ${android_log_lib})
# Enable 16 KB ELF alignment.
target_link_options(flacJNI
PRIVATE "-Wl,-z,max-page-size=16384")

View file

@ -0,0 +1,45 @@
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
FLAC_SOURCES = \
flac_jni.cc \
flac_parser.cc \
flac/src/libFLAC/bitmath.c \
flac/src/libFLAC/bitreader.c \
flac/src/libFLAC/bitwriter.c \
flac/src/libFLAC/cpu.c \
flac/src/libFLAC/crc.c \
flac/src/libFLAC/fixed.c \
flac/src/libFLAC/fixed_intrin_sse2.c \
flac/src/libFLAC/fixed_intrin_ssse3.c \
flac/src/libFLAC/float.c \
flac/src/libFLAC/format.c \
flac/src/libFLAC/lpc.c \
flac/src/libFLAC/lpc_intrin_avx2.c \
flac/src/libFLAC/lpc_intrin_sse2.c \
flac/src/libFLAC/lpc_intrin_sse41.c \
flac/src/libFLAC/lpc_intrin_sse.c \
flac/src/libFLAC/md5.c \
flac/src/libFLAC/memory.c \
flac/src/libFLAC/metadata_iterators.c \
flac/src/libFLAC/metadata_object.c \
flac/src/libFLAC/stream_decoder.c \
flac/src/libFLAC/stream_encoder.c \
flac/src/libFLAC/stream_encoder_framing.c \
flac/src/libFLAC/stream_encoder_intrin_avx2.c \
flac/src/libFLAC/stream_encoder_intrin_sse2.c \
flac/src/libFLAC/stream_encoder_intrin_ssse3.c \
flac/src/libFLAC/window.c

View file

@ -19,15 +19,15 @@
#include <stdint.h>
// libFLAC parser
#include <FLAC/stream_decoder.h>
#include <array>
#include <cstdlib>
#include <string>
#include <vector>
#include "../include/data_source.h"
// libFLAC parser
#include "FLAC/stream_decoder.h"
#include "include/data_source.h"
typedef int status_t;

View file

@ -1,7 +1,7 @@
# IAMF decoder module
The IAMF module provides `LibiamfAudioRenderer`, which uses the libiamf native
library to decode IAMF audio.
The IAMF module provides `LibiamfAudioRenderer`, which uses libiamf (the IAMF
decoding library) to decode IAMF audio.
## License note
@ -17,33 +17,58 @@ To use the module you need to clone this GitHub project and depend on its
modules locally. Instructions for doing this can be found in the
[top level README][].
In addition, it's necessary to fetch libiamf as follows:
In addition, it's necessary to build the module's native components as follows:
* Set the following environment variables:
```
cd "<path to project checkout>"
IAMF_MODULE_PATH="$(pwd)/libraries/decoder_iamf/src/main"
```
* Fetch libiamf:
* Download the [Android NDK][] and set its location in an environment
variable. This build configuration has been tested on NDK r27.
```
cd "${IAMF_MODULE_PATH}/jni" && \
git clone https://github.com/AOMediaCodec/libiamf.git
NDK_PATH="<path to Android NDK>"
```
* [Install CMake][]
* Fetch libiamf:
Having followed these steps, gradle will build the module automatically when run
on the command line or via Android Studio, using [CMake][] and [Ninja][] to
configure and build libiamf and the module's [JNI wrapper library][].
Clone the repository containing libiamf to a local folder of choice - preferably
outside of the project checkout. Link it to the project's jni folder through
symlink.
```
cd <preferred location for libiamf>
git clone https://github.com/AOMediaCodec/libiamf.git libiamf && \
cd libiamf && \
LIBIAMF_PATH=$(pwd)
```
* Symlink the folder containing libiamf to the project's JNI folder and run
the script to convert libiamf code to NDK compatible format:
```
cd "${IAMF_MODULE_PATH}"/jni && \
ln -s $LIBIAMF_PATH libiamf && \
cd libiamf/code && \
cmake . && \
make
```
* Build the JNI native libraries from the command line:
```
cd "${IAMF_MODULE_PATH}"/jni && \
${NDK_PATH}/ndk-build APP_ABI=all -j4
```
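The symlink step above is what makes the module's build pick up the external libiamf checkout. A minimal sketch of that wiring, using temp directories in place of a real checkout:

```shell
LIBIAMF_PATH="$(mktemp -d)"        # stands in for the libiamf clone location
IAMF_MODULE_PATH="$(mktemp -d)"    # stands in for libraries/decoder_iamf/src/main
mkdir -p "${IAMF_MODULE_PATH}/jni"
# Link the external checkout into the module's jni folder:
ln -s "$LIBIAMF_PATH" "${IAMF_MODULE_PATH}/jni/libiamf"
# The build scripts only see the jni/libiamf path; once the symlink
# resolves, sources under it are picked up like a local copy:
test -d "${IAMF_MODULE_PATH}/jni/libiamf" && echo "libiamf linked"
```

Because the link target lives outside the project checkout, re-cloning or updating libiamf does not require touching the module tree, only a clean rebuild.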
[top level README]: ../../README.md
[Install CMake]: https://developer.android.com/studio/projects/install-ndk
[CMake]: https://cmake.org/
[Ninja]: https://ninja-build.org
[JNI wrapper library]: src/main/jni/iamf_jni.cc
[Android NDK]: https://developer.android.com/tools/sdk/ndk/index.html
## Build instructions (Windows)
@ -52,6 +77,13 @@ be possible to follow the Linux instructions in [Windows PowerShell][].
[Windows PowerShell]: https://docs.microsoft.com/en-us/powershell/scripting/getting-started/getting-started-with-windows-powershell
## Notes
* Every time there is a change to the libiamf checkout clean and re-build the
project.
* If you want to use your own version of libiamf, place it in
`${IAMF_MODULE_PATH}/jni/libiamf`.
## Using the module with ExoPlayer
Once you've followed the instructions above to check out, build and depend on

View file

@ -1,4 +1,4 @@
// Copyright 2024 The Android Open Source Project
// Copyright (C) 2024 The Android Open Source Project
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
@ -17,40 +17,12 @@ android {
namespace 'androidx.media3.decoder.iamf'
sourceSets {
main {
jniLibs.srcDir 'src/main/libs'
jni.srcDirs = [] // Disable the automatic ndk-build call by Android Studio.
}
androidTest.assets.srcDir '../test_data/src/test/assets'
}
defaultConfig {
externalNativeBuild {
cmake {
targets "iamfJNI"
}
}
}
// TODO(Internal: b/372449691): Remove packagingOptions once AGP is updated
// to version 8.5.1 or higher.
packagingOptions {
jniLibs {
useLegacyPackaging true
}
}
}
// Configure the native build only if libiamf is present to avoid gradle sync
// failures if libiamf hasn't been built according to the README instructions.
if (project.file('src/main/jni/libiamf').exists()) {
android.externalNativeBuild.cmake {
path = 'src/main/jni/CMakeLists.txt'
version = '3.21.0+'
if (project.hasProperty('externalNativeBuildDir')) {
if (!new File(externalNativeBuildDir).isAbsolute()) {
ext.externalNativeBuildDir =
new File(rootDir, it.externalNativeBuildDir)
}
buildStagingDirectory = "${externalNativeBuildDir}/${project.name}"
}
}
}
dependencies {

View file

@ -16,7 +16,6 @@
package androidx.media3.decoder.iamf;
import static com.google.common.truth.Truth.assertThat;
import static org.junit.Assume.assumeTrue;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.common.collect.ImmutableList;
@ -41,7 +40,7 @@ public final class IamfDecoderTest {
@Before
public void setUp() {
assumeTrue(IamfLibrary.isAvailable());
assertThat(IamfLibrary.isAvailable()).isTrue();
}
@Test

View file

@ -56,9 +56,6 @@ public final class IamfDecoder
public IamfDecoder(List<byte[]> initializationData, boolean spatializationSupported)
throws IamfDecoderException {
super(new DecoderInputBuffer[1], new SimpleDecoderOutputBuffer[1]);
if (!IamfLibrary.isAvailable()) {
throw new IamfDecoderException("Failed to load decoder native libraries.");
}
if (initializationData.size() != 1) {
throw new IamfDecoderException("Initialization data must contain a single element.");
}

View file

@ -15,12 +15,10 @@
*/
package androidx.media3.decoder.iamf;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.decoder.DecoderException;
/** Thrown when an IAMF decoder error occurs. */
@UnstableApi
public final class IamfDecoderException extends DecoderException {
final class IamfDecoderException extends DecoderException {
/* package */ IamfDecoderException(String message) {
super(message);

View file

@ -15,6 +15,7 @@
*/
package androidx.media3.decoder.iamf;
import androidx.media3.common.C;
import androidx.media3.common.MediaLibraryInfo;
import androidx.media3.common.util.LibraryLoader;
import androidx.media3.common.util.UnstableApi;
@ -42,9 +43,12 @@ public final class IamfLibrary {
* it must do so before calling any other method defined by this class, and before instantiating a
* {@link LibiamfAudioRenderer} instance.
*
* @param cryptoType The {@link C.CryptoType} for which the decoder library supports decrypting
* protected content, or {@link C#CRYPTO_TYPE_UNSUPPORTED} if the library does not support
* decryption.
* @param libraries The names of the IAMF native libraries.
*/
public static void setLibraries(String... libraries) {
public static void setLibraries(@C.CryptoType int cryptoType, String... libraries) {
LOADER.setLibraries(libraries);
}

View file

@ -27,7 +27,6 @@ import androidx.media3.common.C;
import androidx.media3.common.Format;
import androidx.media3.common.MimeTypes;
import androidx.media3.common.util.TraceUtil;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import androidx.media3.decoder.CryptoConfig;
import androidx.media3.decoder.DecoderException;
@ -37,7 +36,6 @@ import androidx.media3.exoplayer.audio.DecoderAudioRenderer;
import java.util.Objects;
/** Decodes and renders audio using the native IAMF decoder. */
@UnstableApi
public class LibiamfAudioRenderer extends DecoderAudioRenderer<IamfDecoder> {
private final Context context;
@ -96,19 +94,18 @@ public class LibiamfAudioRenderer extends DecoderAudioRenderer<IamfDecoder> {
}
AudioManager audioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
AudioFormat.Builder audioFormat =
new AudioFormat.Builder()
.setEncoding(IamfDecoder.OUTPUT_PCM_ENCODING)
.setChannelMask(IamfDecoder.SPATIALIZED_OUTPUT_LAYOUT);
if (audioManager == null) {
return false;
}
AudioFormat audioFormat =
new AudioFormat.Builder()
.setEncoding(IamfDecoder.OUTPUT_PCM_ENCODING)
.setChannelMask(IamfDecoder.SPATIALIZED_OUTPUT_LAYOUT)
.build();
Spatializer spatializer = audioManager.getSpatializer();
return spatializer.getImmersiveAudioLevel() != Spatializer.SPATIALIZER_IMMERSIVE_LEVEL_NONE
&& spatializer.isAvailable()
&& spatializer.isEnabled()
&& spatializer.canBeSpatialized(
AudioAttributes.DEFAULT.getAudioAttributesV21().audioAttributes, audioFormat);
AudioAttributes.DEFAULT.getAudioAttributesV21().audioAttributes, audioFormat.build());
}
}

View file

@ -0,0 +1,32 @@
#
# Copyright (C) 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
WORKING_DIR := $(call my-dir)
include $(CLEAR_VARS)
# build libiamf.a
LOCAL_PATH := $(WORKING_DIR)
include libiamf.mk
# build libiamfJNI.so
include $(CLEAR_VARS)
LOCAL_PATH := $(WORKING_DIR)
LOCAL_MODULE := libiamfJNI
LOCAL_ARM_MODE := arm
LOCAL_CPP_EXTENSION := .cc
LOCAL_SRC_FILES := iamf_jni.cc
LOCAL_LDLIBS := -llog -lz -lm
LOCAL_STATIC_LIBRARIES := libiamf
include $(BUILD_SHARED_LIBRARY)

View file

@ -0,0 +1,18 @@
#
# Copyright (C) 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
APP_ABI := all
APP_PLATFORM := android-21

View file

@ -1,50 +0,0 @@
#
# Copyright 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
cmake_minimum_required(VERSION 3.21.0 FATAL_ERROR)
# Enable C++11 features.
set(CMAKE_CXX_STANDARD 11)
# Define project name for your JNI module
project(libiamfJNI C CXX)
set(libiamf_jni_root "${CMAKE_CURRENT_SOURCE_DIR}")
# Build libiamf.
add_subdirectory("${libiamf_jni_root}/libiamf/code"
EXCLUDE_FROM_ALL)
# Add the include directory from libiamf.
include_directories ("${libiamf_jni_root}/libiamf/code/include")
# Build libiamfJNI.
add_library(iamfJNI
SHARED
iamf_jni.cc)
# Locate NDK log library.
find_library(android_log_lib log)
# Link libiamfJNI against used libraries.
target_link_libraries(iamfJNI
PRIVATE android
PRIVATE iamf
PRIVATE ${android_log_lib})
# Enable 16 KB ELF alignment.
target_link_options(iamfJNI
PRIVATE "-Wl,-z,max-page-size=16384")

View file

@ -0,0 +1,35 @@
#
# Copyright (C) 2024 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
LOCAL_PATH := $(WORKING_DIR)/libiamf
include $(CLEAR_VARS)
LOCAL_MODULE := libiamf
LOCAL_ARM_MODE := arm
LOCAL_C_INCLUDES := $(LOCAL_PATH)/code/include \
$(LOCAL_PATH)/code/src/iamf_dec \
$(LOCAL_PATH)/code/src/common \
$(LOCAL_PATH)/code/dep_codecs/include \
$(LOCAL_PATH)/code/dep_external/include
LOCAL_SRC_FILES := $(shell find $(LOCAL_PATH)/code/src -name "*.c")
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/code/include \
$(LOCAL_PATH)/code/src/iamf_dec \
$(LOCAL_PATH)/code/src/common \
$(LOCAL_PATH)/code/dep_codecs/include \
$(LOCAL_PATH)/code/dep_external/include
include $(BUILD_STATIC_LIBRARY)

View file

@ -1,19 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright 2024 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<manifest package="androidx.media3.decoder.iamf.test">
<uses-sdk/>
</manifest>

View file

@ -1,33 +0,0 @@
/*
* Copyright 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.decoder.iamf;
import androidx.media3.common.C;
import androidx.media3.test.utils.DefaultRenderersFactoryAsserts;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
/** Unit test for {@link DefaultRenderersFactory} with {@link LibiamfAudioRenderer}. */
@RunWith(AndroidJUnit4.class)
public final class DefaultRenderersFactoryTest {
@Test
public void createRenderers_instantiatesIamfRenderer() {
DefaultRenderersFactoryAsserts.assertExtensionRendererCreated(
LibiamfAudioRenderer.class, C.TRACK_TYPE_AUDIO);
}
}

View file

@ -1,33 +0,0 @@
/*
* Copyright 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.decoder.midi;
import androidx.media3.common.C;
import androidx.media3.test.utils.DefaultRenderersFactoryAsserts;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
/** Unit test for {@link DefaultRenderersFactory} with {@link MidiRenderer}. */
@RunWith(AndroidJUnit4.class)
public final class DefaultRenderersFactoryTest {
@Test
public void createRenderers_instantiatesMidiRenderer() {
DefaultRenderersFactoryAsserts.assertExtensionRendererCreated(
MidiRenderer.class, C.TRACK_TYPE_AUDIO);
}
}

View file

@ -1,7 +1,7 @@
# Opus decoder module
The Opus module provides `LibopusAudioRenderer`, which uses the libopus native
library to decode Opus audio.
The Opus module provides `LibopusAudioRenderer`, which uses libopus (the Opus
decoding library) to decode Opus audio.
## License note
@ -17,7 +17,7 @@ To use the module you need to clone this GitHub project and depend on its
modules locally. Instructions for doing this can be found in the
[top level README][].
In addition, it's necessary to fetch libopus as follows:
In addition, it's necessary to build the module's native components as follows:
* Set the following environment variables:
@ -26,6 +26,13 @@ cd "<path to project checkout>"
OPUS_MODULE_PATH="$(pwd)/libraries/decoder_opus/src/main"
```
* Download the [Android NDK][] and set its location in an environment variable.
This build configuration has been tested on NDK r21.
```
NDK_PATH="<path to Android NDK>"
```
* Fetch libopus:
```
@ -33,17 +40,21 @@ cd "${OPUS_MODULE_PATH}/jni" && \
git clone https://gitlab.xiph.org/xiph/opus.git libopus
```
* [Install CMake][]
* Run the script to convert arm assembly to NDK compatible format:
Having followed these steps, gradle will build the module automatically when run
on the command line or via Android Studio, using [CMake][] and [Ninja][] to
configure and build libopus and the module's [JNI wrapper library][].
```
cd ${OPUS_MODULE_PATH}/jni && ./convert_android_asm.sh
```
* Build the JNI native libraries from the command line:
```
cd "${OPUS_MODULE_PATH}"/jni && \
${NDK_PATH}/ndk-build APP_ABI=all -j4
```
[top level README]: ../../README.md
[Install CMake]: https://developer.android.com/studio/projects/install-ndk
[CMake]: https://cmake.org/
[Ninja]: https://ninja-build.org
[JNI wrapper library]: src/main/jni/opus_jni.cc
[Android NDK]: https://developer.android.com/tools/sdk/ndk/index.html
## Build instructions (Windows)
@ -52,6 +63,14 @@ be possible to follow the Linux instructions in [Windows PowerShell][].
[Windows PowerShell]: https://docs.microsoft.com/en-us/powershell/scripting/getting-started/getting-started-with-windows-powershell
## Notes
* Every time there is a change to the libopus checkout:
* Arm assembly should be converted by running `convert_android_asm.sh`
* Clean and re-build the project.
* If you want to use your own version of libopus, place it in
`${OPUS_MODULE_PATH}/jni/libopus`.
## Using the module with ExoPlayer
Once you've followed the instructions above to check out, build and depend on

View file

@ -17,40 +17,12 @@ android {
namespace 'androidx.media3.decoder.opus'
sourceSets {
main {
jniLibs.srcDir 'src/main/libs'
jni.srcDirs = [] // Disable the automatic ndk-build call by Android Studio.
}
androidTest.assets.srcDir '../test_data/src/test/assets'
}
defaultConfig {
externalNativeBuild {
cmake {
targets "opusV2JNI"
}
}
}
// TODO(Internal: b/372449691): Remove packagingOptions once AGP is updated
// to version 8.5.1 or higher.
packagingOptions {
jniLibs {
useLegacyPackaging true
}
}
}
// Configure the native build only if libopus is present to avoid gradle sync
// failures if libopus hasn't been built according to the README instructions.
if (project.file('src/main/jni/libopus').exists()) {
android.externalNativeBuild.cmake {
path = 'src/main/jni/CMakeLists.txt'
version = '3.21.0+'
if (project.hasProperty('externalNativeBuildDir')) {
if (!new File(externalNativeBuildDir).isAbsolute()) {
ext.externalNativeBuildDir =
new File(rootDir, it.externalNativeBuildDir)
}
buildStagingDirectory = "${externalNativeBuildDir}/${project.name}"
}
}
}
dependencies {

View file

@ -20,6 +20,7 @@ import static org.junit.Assume.assumeTrue;
import androidx.annotation.Nullable;
import androidx.media3.common.C;
import androidx.media3.common.util.LibraryLoader;
import androidx.media3.decoder.DecoderInputBuffer;
import androidx.media3.decoder.SimpleDecoderOutputBuffer;
import androidx.test.ext.junit.runners.AndroidJUnit4;
@ -33,6 +34,14 @@ import org.junit.runner.RunWith;
@RunWith(AndroidJUnit4.class)
public final class OpusDecoderTest {
private static final LibraryLoader LOADER =
new LibraryLoader("opusV2JNI") {
@Override
protected void loadLibrary(String name) {
System.loadLibrary(name);
}
};
private static final byte[] HEADER =
new byte[] {79, 112, 117, 115, 72, 101, 97, 100, 0, 2, 1, 56, 0, 0, -69, -128, 0, 0, 0};
@ -106,7 +115,7 @@ public final class OpusDecoderTest {
@Test
public void decode_removesPreSkipFromOutput() throws OpusDecoderException {
assumeTrue(OpusLibrary.isAvailable());
assumeTrue(LOADER.isAvailable());
OpusDecoder decoder =
new OpusDecoder(
/* numInputBuffers= */ 0,
@ -126,7 +135,7 @@ public final class OpusDecoderTest {
@Test
public void decode_whenDiscardPaddingDisabled_returnsDiscardPadding()
throws OpusDecoderException {
assumeTrue(OpusLibrary.isAvailable());
assumeTrue(LOADER.isAvailable());
OpusDecoder decoder =
new OpusDecoder(
/* numInputBuffers= */ 0,
@ -147,7 +156,7 @@ public final class OpusDecoderTest {
@Test
public void decode_whenDiscardPaddingEnabled_removesDiscardPadding() throws OpusDecoderException {
assumeTrue(OpusLibrary.isAvailable());
assumeTrue(LOADER.isAvailable());
OpusDecoder decoder =
new OpusDecoder(
/* numInputBuffers= */ 0,
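The `LibraryLoader` introduced in this test loads the named native libraries on first use and caches whether loading succeeded. A minimal stand-alone sketch of that pattern — the `LazyLoader` class below is illustrative, not media3's `LibraryLoader`:

```java
// Illustrative sketch of a lazy native-library loader: load once on first
// query, cache the result. Not media3's implementation.
public class LazyLoader {
  private final String[] libraries;
  private boolean loadAttempted;
  private boolean isAvailable;

  public LazyLoader(String... libraries) {
    this.libraries = libraries;
  }

  public synchronized boolean isAvailable() {
    if (!loadAttempted) {
      loadAttempted = true;
      try {
        for (String name : libraries) {
          loadLibrary(name); // Delegates to System.loadLibrary in the real subclass.
        }
        isAvailable = true;
      } catch (UnsatisfiedLinkError e) {
        isAvailable = false;
      }
    }
    return isAvailable;
  }

  protected void loadLibrary(String name) {
    System.loadLibrary(name);
  }

  public static void main(String[] args) {
    // "nope" is not a real library, so loading fails and the cached
    // result stays false on repeated queries.
    LazyLoader loader = new LazyLoader("nope");
    System.out.println(loader.isAvailable()); // prints false
    System.out.println(loader.isAvailable()); // prints false (cached)
  }
}
```

Keeping the loader in a static field, as the test does with `LOADER`, means the load attempt happens at most once per process regardless of how many tests query it.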

View file

@ -0,0 +1,33 @@
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
WORKING_DIR := $(call my-dir)
include $(CLEAR_VARS)
# build libopus.a
LOCAL_PATH := $(WORKING_DIR)
include libopus.mk
# build libopusV2JNI.so
include $(CLEAR_VARS)
LOCAL_PATH := $(WORKING_DIR)
LOCAL_MODULE := libopusV2JNI
LOCAL_ARM_MODE := arm
LOCAL_CPP_EXTENSION := .cc
LOCAL_SRC_FILES := opus_jni.cc
LOCAL_LDLIBS := -llog -lz -lm
LOCAL_STATIC_LIBRARIES := libopus
include $(BUILD_SHARED_LIBRARY)

View file

@ -0,0 +1,20 @@
#
# Copyright (C) 2016 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
APP_OPTIM := release
APP_STL := c++_static
APP_CPPFLAGS := -frtti
APP_PLATFORM := android-9

Some files were not shown because too many files have changed in this diff.