Compare commits


No commits in common. "release" and "1.4.0" have entirely different histories.

1,441 changed files with 15,587 additions and 269,400 deletions


@@ -19,9 +19,6 @@ body:
options:
- Media3 main branch
- Media3 pre-release (alpha, beta or RC not in this list)
- Media3 1.5.1
- Media3 1.5.0
- Media3 1.4.1
- Media3 1.4.0
- Media3 1.3.1
- Media3 1.3.0

.gitignore vendored

@@ -52,31 +52,30 @@ tmp
# External native builds
.externalNativeBuild
.cxx
# VP9 decoder extension
libraries/decoder_vp9/src/main/jni/libvpx
libraries/decoder_vp9/src/main/jni/libvpx_android_configs
libraries/decoder_vp9/src/main/jni/libyuv
# VP9 extension
extensions/vp9/src/main/jni/libvpx
extensions/vp9/src/main/jni/libvpx_android_configs
extensions/vp9/src/main/jni/libyuv
# AV1 decoder extension
libraries/decoder_av1/src/main/jni/cpu_features
libraries/decoder_av1/src/main/jni/libgav1
# AV1 extension
extensions/av1/src/main/jni/cpu_features
extensions/av1/src/main/jni/libgav1
# Opus decoder extension
libraries/decoder_opus/src/main/jni/libopus
# Opus extension
extensions/opus/src/main/jni/libopus
# FLAC decoder extension
libraries/decoder_flac/src/main/jni/flac
# FLAC extension
extensions/flac/src/main/jni/flac
# FFmpeg decoder extension
libraries/decoder_ffmpeg/src/main/jni/ffmpeg
# FFmpeg extension
extensions/ffmpeg/src/main/jni/ffmpeg
# Cronet datasource extension
libraries/datasource_cronet/jniLibs/*
!libraries/datasource_cronet/jniLibs/README.md
libraries/datasource_cronet/libs/*
!libraries/datasource_cronet/libs/README.md
# Cronet extension
extensions/cronet/jniLibs/*
!extensions/cronet/jniLibs/README.md
extensions/cronet/libs/*
!extensions/cronet/libs/README.md
# MIDI decoder extension
libraries/decoder_midi/lib
# MIDI extension
extensions/midi/lib


@@ -100,6 +100,12 @@ compileOptions {
}
```
#### 3. Enable multidex
If your Gradle `minSdkVersion` is 20 or lower, you should
[enable multidex](https://developer.android.com/studio/build/multidex) in order
to prevent build errors.
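As a sketch, and assuming a Groovy-DSL module `build.gradle` (the `minSdkVersion` and multidex library version shown are illustrative), enabling multidex typically looks like this:
```groovy
android {
    defaultConfig {
        // Multidex is only needed below API 21; newer API levels dex natively.
        minSdkVersion 19
        multiDexEnabled true
    }
}

dependencies {
    // Runtime support library required when minSdkVersion < 21.
    implementation 'androidx.multidex:multidex:2.0.1'
}
```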
### Locally
Cloning the repository and depending on the modules locally is required when
@@ -110,6 +116,7 @@ First, clone the repository into a local directory:
```sh
git clone https://github.com/androidx/media.git
cd media
```
Next, add the following to your project's `settings.gradle.kts` file, replacing
@@ -123,7 +130,7 @@ apply(from = file("path/to/media/core_settings.gradle"))
Or in Gradle Groovy DSL `settings.gradle`:
```groovy
gradle.ext.androidxMediaModulePrefix = 'media3-'
gradle.ext.androidxMediaModulePrefix = 'media-'
apply from: file("path/to/media/core_settings.gradle")
```
@@ -132,17 +139,17 @@ You can depend on them from `build.gradle.kts` as you would on any other local
module, for example:
```kotlin
implementation(project(":media3-lib-exoplayer"))
implementation(project(":media3-lib-exoplayer-dash"))
implementation(project(":media3-lib-ui"))
implementation(project(":media-lib-exoplayer"))
implementation(project(":media-lib-exoplayer-dash"))
implementation(project(":media-lib-ui"))
```
Or in Gradle Groovy DSL `build.gradle`:
```groovy
implementation project(':media3-lib-exoplayer')
implementation project(':media3-lib-exoplayer-dash')
implementation project(':media3-lib-ui')
implementation project(':media-lib-exoplayer')
implementation project(':media-lib-exoplayer-dash')
implementation project(':media-lib-ui')
```
#### MIDI module


@@ -1,423 +1,7 @@
# Release notes
## 1.5
### 1.5.1 (2024-12-19)
This release includes the following changes since the
[1.5.0 release](#150-2024-11-27):
* ExoPlayer:
* Disable use of asynchronous decryption in MediaCodec to avoid reported
codec timeout issues with this platform API
([#1641](https://github.com/androidx/media/issues/1641)).
* Extractors:
* MP3: Don't stop playback early when a `VBRI` frame's table of contents
doesn't cover all the MP3 data in a file
([#1904](https://github.com/androidx/media/issues/1904)).
* Video:
* Rollback of using `MediaCodecAdapter` supplied pixel aspect ratio values
when provided while processing `onOutputFormatChanged`
([#1371](https://github.com/androidx/media/pull/1371)).
* Text:
* Fix bug in `ReplacingCuesResolver.discardCuesBeforeTimeUs` where the cue
active at `timeUs` (started before but not yet ended) was incorrectly
discarded ([#1939](https://github.com/androidx/media/issues/1939)).
* Metadata:
* Extract disc/track numbering and genre from Vorbis comments into
`MediaMetadata`
([#1958](https://github.com/androidx/media/issues/1958)).
### 1.5.0 (2024-11-27)
This release includes the following changes since the
[1.4.1 release](#141-2024-08-23):
* Common Library:
* Add `ForwardingSimpleBasePlayer` that allows forwarding to another
player with small adjustments while ensuring full consistency and
listener handling
([#1183](https://github.com/androidx/media/issues/1183)).
* Replace `SimpleBasePlayer.State.playlist` by `getPlaylist()` method.
* Add override for `SimpleBasePlayer.State.Builder.setPlaylist()` to
directly specify a `Timeline` and current `Tracks` and `Metadata`
instead of building a playlist structure.
* Increase `minSdk` to 21 (Android Lollipop). This is aligned with all
other AndroidX libraries.
* Add `androidx.media3:media3-common-ktx` artifact, which provides
Kotlin-specific functionality built on top of the Common library.
* Add a `Player.listen` suspending extension function to the
`media3-common-ktx` library that spins up a coroutine to listen to
`Player.Events`.
* Remove `@DoNotInline` annotations from manually out-of-lined inner
classes designed to avoid
[runtime class verification failures](https://chromium.googlesource.com/chromium/src/+/HEAD/build/android/docs/class_verification_failures.md).
Recent versions of [R8](https://developer.android.com/build/shrink-code)
now automatically out-of-line calls like these to avoid the runtime
failures (so the manual out-of-lining is no longer required). All Gradle
users of the library must already be using a version of the Android
Gradle Plugin that uses a version of R8 which does this,
[due to `compileSdk = 35`](https://issuetracker.google.com/345472586#comment7).
Users of the library with non-Gradle build systems will need to ensure
their R8-equivalent shrinking/obfuscating step does a similar automatic
out-of-lining process in order to avoid runtime class verification
failures. This change has
[already been done in other AndroidX libraries](http://r.android.com/3156141).
* ExoPlayer:
* `MediaCodecRenderer.onProcessedStreamChange()` can now be called for
every media item. Previously it was not called for the first one. Use
`MediaCodecRenderer.experimentalEnableProcessedStreamChangedAtStart()`
to enable this.
* Add `PreloadMediaSource.PreloadControl.onPreloadError` to allow
`PreloadMediaSource.PreloadControl` implementations to take action when
an error occurs.
* Add `BasePreloadManager.Listener` to propagate preload events to apps.
* Allow changing SNTP client timeout and retry alternative addresses on
timeout ([#1540](https://github.com/androidx/media/issues/1540)).
* Remove `MediaCodecAdapter.Configuration.flags` as the field was always
zero.
* Allow the user to select the built-in speaker for playback on Wear OS
API 35+ (where the device advertises support for this).
* Defer the blocking call to
`Context.getSystemService(Context.AUDIO_SERVICE)` until audio focus
handling is enabled. This ensures the blocking call isn't done if audio
focus handling is not enabled
([#1616](https://github.com/androidx/media/pull/1616)).
* Allow playback regardless of buffered duration when loading fails
([#1571](https://github.com/androidx/media/issues/1571)).
* Add `AnalyticsListener.onRendererReadyChanged()` to signal when
individual renderers allow playback to be ready.
* Fix `MediaCodec.CryptoException` sometimes being reported as an
"unexpected runtime error" when `MediaCodec` is operated in asynchronous
mode (default behaviour on API 31+).
* Pass `bufferedDurationUs` instead of `bufferedPositionUs` with
`PreloadMediaSource.PreloadControl.onContinueLoadingRequested()`. This
also renames `DefaultPreloadManager.Status.STAGE_LOADED_TO_POSITION_MS`
to `DefaultPreloadManager.Status.STAGE_LOADED_FOR_DURATION_MS`. With
this IntDef, apps now need to pass a duration, measured from the default
start position, for which the corresponding media source has to be
preloaded, instead of a position.
* Add `ForwardingRenderer` implementation that forwards all method calls
to another renderer
([1703](https://github.com/androidx/media/pull/1703)).
* Add playlist preloading for the next item in the playlist. Apps can
enable preloading by calling
`ExoPlayer.setPreloadConfiguration(PreloadConfiguration)` accordingly.
By default preloading is disabled. When opted in, to avoid interfering
with playback, `DefaultLoadControl` restricts preloading to start and
continue only while the player is not loading for playback. Apps can
change this behaviour by implementing
`LoadControl.shouldContinuePreloading()` accordingly (like when
overriding this method in `DefaultLoadControl`). The default
implementation of `LoadControl` disables preloading in case an app is
using a custom implementation of `LoadControl`.
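A minimal sketch of the opt-in described above (the `targetPreloadDurationUs` parameter name and the 5-second value are illustrative assumptions, not recommendations):
```kotlin
// Opt in to preloading of the next playlist item (disabled by default).
val player = ExoPlayer.Builder(context).build()
player.setPreloadConfiguration(
    ExoPlayer.PreloadConfiguration(/* targetPreloadDurationUs= */ 5_000_000)
)
```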
* Add method `MediaSourceEventListener.EventDispatcher.dispatchEvent()` to
allow invoking events of subclass listeners
([1736](https://github.com/androidx/media/pull/1736)).
* Add `DefaultPreloadManager.Builder` that builds the
`DefaultPreloadManager` and `ExoPlayer` instances with consistently
shared configurations.
* Remove `Renderer[]` parameter from `LoadControl.onTracksSelected()` as
`DefaultLoadControl` implementation can retrieve the stream types from
`ExoTrackSelection[]`.
* Deprecate `DefaultLoadControl.calculateTargetBufferBytes(Renderer[],
ExoTrackSelection[])` and mark the method as final to prevent overrides.
The new
`DefaultLoadControl.calculateTargetBufferBytes(ExoTrackSelection[])`
should be used instead.
* Report `MediaSourceEventListener` events from secondary sources in
`MergingMediaSource`. This will result in load
start/error/cancelled/completed events being reported for sideloaded
subtitles (those added with
`MediaItem.LocalConfiguration.subtitleConfigurations`), which may appear
as duplicate load events emitted from `AnalyticsListener`.
* Prevent subtitle & metadata errors from completely stopping playback.
Instead the problematic track is disabled and playback of the remaining
tracks continues
([#1722](https://github.com/google/ExoPlayer/issues/1722)).
* In new subtitle handling (during extraction), associated parse (e.g.
invalid subtitle data) and load errors (e.g. HTTP 404) are emitted
via `onLoadError` callbacks.
* In legacy subtitle handling (during rendering), only associated load
errors are emitted via `onLoadError` callbacks while parse errors
are silently ignored (this is pre-existing behaviour).
* Fix bug where playlist items or periods in multi-period DASH streams
with durations that don't match the actual content could cause frame
freezes at the end of the item
([#1698](https://github.com/androidx/media/issues/1698)).
* Add a setter to `SntpClient` to set the max elapsed time since the last
update after which the client is re-initialized
([#1794](https://github.com/androidx/media/pull/1794)).
* Transformer:
* Add `SurfaceAssetLoader`, which supports queueing video data to
Transformer via a `Surface`.
* `ImageAssetLoader` reports unsupported input via `AssetLoader.onError`
instead of throwing an `IllegalStateException`.
* Make setting the image duration using
`MediaItem.Builder.setImageDurationMs` mandatory for image export.
* Add export support for gaps in sequences of audio EditedMediaItems.
* Track Selection:
* `DefaultTrackSelector`: Prefer object-based audio over channel-based
audio when other factors are equal.
* Extractors:
* Allow `Mp4Extractor` and `FragmentedMp4Extractor` to identify H264
samples that are not used as reference by subsequent samples.
* Add option to enable index-based seeking in `AmrExtractor`.
* Treat MP3 files with more than 128kB between valid frames as truncated
(instead of invalid). This means files with non-MP3 data at the end,
with no other metadata to indicate the length of the MP3 bytes, now stop
playback at the end of the MP3 data instead of failing with
`ParserException: Searched too many bytes.{contentIsMalformed=true,
dataType=1}` ([#1563](https://github.com/androidx/media/issues/1563)).
* Fix preroll sample handling for non-keyframe media start positions when
processing edit lists in MP4 files
([#1659](https://github.com/google/ExoPlayer/issues/1659)).
* Improve frame rate calculation by using the media duration from the `mdhd`
box in `Mp4Extractor` and `FragmentedMp4Extractor`
([#1531](https://github.com/androidx/media/issues/1531)).
* Fix incorrect scaling of `media_time` in MP4 edit lists. While
`segment_duration` was already correctly scaled using the movie
timescale, `media_time` is now properly scaled using the track
timescale, as specified by the MP4 format standard
([#1792](https://github.com/androidx/media/issues/1792)).
* Handle out-of-order frames in `endIndices` calculation for MP4 with edit
list ([#1797](https://github.com/androidx/media/issues/1797)).
* Fix media duration parsing in `mdhd` box of MP4 files to handle `-1`
values ([#1819](https://github.com/androidx/media/issues/1819)).
* Add support for identifying `h263` box in MP4 files for H.263 video
([#1821](https://github.com/androidx/media/issues/1821)).
* Add AC-4 Level-4 ISO base media file format support
([#1265](https://github.com/androidx/media/pull/1265)).
* DataSource:
* Update `HttpEngineDataSource` to allow use starting at version S
extension 7 instead of API level 34
([#1262](https://github.com/androidx/media/issues/1262)).
* `DataSourceContractTest`: Assert that `DataSource.getUri()` returns the
resolved URI (as documented). Where this is different to the requested
URI, tests can indicate this using the new
`DataSourceContractTest.TestResource.Builder.setResolvedUri()` method.
* `DataSourceContractTest`: Assert that `DataSource.getUri()` and
`getResponseHeaders()` return their 'open' value after a failed call to
`open()` (due to a 'not found' resource) and before a subsequent
`close()` call.
* Overriding `DataSourceContractTest.getNotFoundResources()` allows
test subclasses to provide multiple 'not found' resources, and to
provide any expected headers too. This makes it possible to distinguish
between HTTP 404 (with headers) and "server not found" (no headers).
* Audio:
* Automatically configure CTA-2075 loudness metadata on the codec if
present in the media.
* Ensure smooth volume ramp down when seeking.
* Fix pop sounds that may occur during seeks.
* Fix truncation error accumulation for Sonic's
time-stretching/pitch-shifting algorithm.
* Fix bug in `SpeedChangingAudioProcessor` that causes dropped output
frames.
* Video:
* `MediaCodecVideoRenderer` avoids decoding samples that are neither
rendered nor used as reference by other samples.
* On API 35 and above, `MediaCodecAdapter` may now receive a `null`
`Surface` in `configure` and calls to a new method `detachOutputSurface`
to remove a previously set `Surface` if the codec supports this
(`MediaCodecInfo.detachedSurfaceSupported`).
* Use `MediaCodecAdapter` supplied pixel aspect ratio values if provided
when processing `onOutputFormatChanged`
([#1371](https://github.com/androidx/media/pull/1371)).
* Add workaround for a device issue on Galaxy Tab S7 FE that causes 60fps
secure H264 streams to be marked as unsupported
([#1619](https://github.com/androidx/media/issues/1619)).
* Add workaround for codecs that get stuck after the last sample without
returning an end-of-stream signal.
* Text:
* Add a custom `VoiceSpan` and populate it for
[WebVTT voice spans](https://www.w3.org/TR/webvtt1/#webvtt-cue-voice-span)
([#1632](https://github.com/androidx/media/issues/1632)).
* Ensure WebVTT in HLS with very large subtitle timestamps (which overflow
a 64-bit `long` when represented as microseconds and multiplied by the
`90,000` MPEG timebase) are displayed
([#1763](https://github.com/androidx/media/issues/1763)).
* Support CEA-608 subtitles in Dolby Vision content
([#1820](https://github.com/androidx/media/issues/1820)).
* Fix playback hanging on DASH multi-period streams when CEA-608 subtitles
are enabled ([#1863](https://github.com/androidx/media/issues/1863)).
* Metadata:
* Assign the `C.TRACK_TYPE_METADATA` type to tracks containing icy or
vnd.dvb.ait content.
* Image:
* Add `ExternallyLoadedImageDecoder` for simplified integration with
external image loading libraries like Glide or Coil.
* DataSource:
* Add `FileDescriptorDataSource`, a new `DataSource` that can be used to
read from a `FileDescriptor`
([#3757](https://github.com/google/ExoPlayer/issues/3757)).
* Effect:
* Add `DefaultVideoFrameProcessor` workaround for minor `SurfaceTexture`
scaling. `SurfaceTexture` may include a small scaling that cuts off a
1-texel border around the edge of a cropped buffer. This is now handled
such that output is closer to expected.
* Speed up `DefaultVideoFrameProcessor.queueInputBitmap()`. As a result,
exporting images to videos with `Transformer` is faster.
* IMA extension:
* Fix bug where clearing the playlist may cause an
`ArrayIndexOutOfBoundsException` in
`ImaServerSideAdInsertionMediaSource`.
* Fix bug where server-side inserted DAI streams without a preroll can
result in an `ArrayIndexOutOfBoundsException` when playing past the last
midroll ([#1741](https://github.com/androidx/media/issues/1741)).
* Session:
* Add `MediaButtonReceiver.shouldStartForegroundService(Intent)` to allow
apps to suppress a play command coming in for playback resumption by
overriding this method. By default, the service is always started and
playback can't be suppressed without the system crashing the service
with a `ForegroundServiceDidNotStartInTimeException`
([#1528](https://github.com/google/ExoPlayer/issues/1528)).
* Fix bug that caused custom commands sent from a `MediaBrowser` to be
dispatched to the `MediaSessionCompat.Callback` instead of the
`MediaBrowserServiceCompat` variant of the method when connected to a
legacy service. This prevented the `MediaBrowser` from receiving the
actual return value sent back by the legacy service
([#1474](https://github.com/androidx/media/issues/1474)).
* Handle `IllegalArgumentException` thrown by devices of certain
manufacturers when setting the broadcast receiver for media button
intents ([#1730](https://github.com/androidx/media/issues/1730)).
* Add command buttons for media items. This adds the Media3 API for what
was known as 'custom browse actions' in the legacy library with
`MediaBrowserCompat`. Note that with Media3, command buttons for media
items are available for both `MediaBrowser` and `MediaController`. See
[Custom Browse actions of AAOS](https://developer.android.com/training/cars/media#custom_browse_actions).
* Fix bug where a Media3 controller was sometimes unable to let a session
app start a foreground service after requesting `play()`.
* Restrict `CommandButton.Builder.setIconUri` to only accept content Uris.
* Pass connection hints of a Media3 browser to the initial
`MediaBrowserCompat` when connecting to a legacy `MediaBrowserCompat`.
The service can receive the connection hints passed in as root hints
with the first call to `onGetRoot()`.
* Fix bug where a `MediaBrowser` connected to a legacy browser service
didn't receive an error sent by the service after the browser had
subscribed to a `parentId`.
* Improve interoperability behavior, so that a Media3 browser that is
connected to a legacy `MediaBrowserService` doesn't request the children
of a `parentId` twice when subscribing to a parent.
* UI:
* Make the stretched/cropped video in
`PlayerView`-in-Compose-`AndroidView` workaround opt-in, due to issues
with XML-based shared transitions. Apps using `PlayerView` inside
`AndroidView` need to call
`PlayerView.setEnableComposeSurfaceSyncWorkaround` in order to opt in
([#1237](https://github.com/androidx/media/issues/1237),
[#1594](https://github.com/androidx/media/issues/1594)).
* Add `setFullscreenButtonState` to `PlayerView` to allow updating the
fullscreen button's icon on demand, i.e. out-of-band rather than in
reaction to a click interaction
([#1590](https://github.com/androidx/media/issues/1590),
[#184](https://github.com/androidx/media/issues/184)).
* Fix bug where the "None" choice in the text selection is not working if
there are app-defined text track selection preferences.
* DASH Extension:
* Add support for periods starting in the middle of a segment
([#1440](https://github.com/androidx/media/issues/1440)).
* Smooth Streaming Extension:
* Fix a `Bad magic number for Bundle` error when playing SmoothStreaming
streams with text tracks
([#1779](https://github.com/androidx/media/issues/1779)).
* RTSP Extension:
* Fix user info removal for URLs that contain encoded @ characters
([#1138](https://github.com/androidx/media/pull/1138)).
* Fix crash when parsing RTP packets with header extensions
([#1225](https://github.com/androidx/media/pull/1225)).
* Decoder Extensions (FFmpeg, VP9, AV1, etc.):
* Add the IAMF decoder module, which provides support for playback of MP4
files containing IAMF tracks using the libiamf native library to
synthesize audio.
* Playback is enabled for stereo layouts, and for 5.1 layouts with
spatialization and optional head tracking, but binaural playback
support is currently not available.
* Add 16 KB page support for decoder extensions on Android 15
([#1685](https://github.com/androidx/media/issues/1685)).
* Cast Extension:
* Stop cleaning the timeline after the CastSession disconnects, which
enables the sender app to resume playback locally after a disconnection.
* Populate CastPlayer's `DeviceInfo` when a `Context` is provided. This
enables linking the `MediaSession` to a `RoutingSession`, which is
necessary for integrating Output Switcher
([#1056](https://github.com/androidx/media/issues/1056)).
* Test Utilities:
* `DataSourceContractTest` now includes tests to verify:
* Input stream `read position` is updated.
* Output buffer `offset` is applied correctly.
* Demo app:
* Fix memory leaks in the short-form demo app
([#1839](https://github.com/androidx/media/issues/1839)).
* Remove deprecated symbols:
* Remove deprecated `Player.hasPrevious`, `Player.hasPreviousWindow()`.
Use `Player.hasPreviousMediaItem()` instead.
* Remove deprecated `Player.previous()` method. Use
`Player.seekToPreviousMediaItem()` instead.
* Remove deprecated `DrmSessionEventListener.onDrmSessionAcquired` method.
* Remove deprecated `DefaultEncoderFactory` constructors. Use
`DefaultEncoderFactory.Builder` instead.
### 1.5.0-rc02 (2024-11-19)
Use the 1.5.0 [stable version](#150-2024-11-27).
### 1.5.0-rc01 (2024-11-13)
Use the 1.5.0 [stable version](#150-2024-11-27).
### 1.5.0-beta01 (2024-10-30)
Use the 1.5.0 [stable version](#150-2024-11-27).
### 1.5.0-alpha01 (2024-09-06)
Use the 1.5.0 [stable version](#150-2024-11-27).
## 1.4
### 1.4.1 (2024-08-23)
This release includes the following changes since the
[1.4.0 release](#140-2024-07-24):
* ExoPlayer:
* Handle preload callbacks asynchronously in `PreloadMediaSource`
([#1568](https://github.com/androidx/media/issues/1568)).
* Allow playback regardless of buffered duration when loading fails
([#1571](https://github.com/androidx/media/issues/1571)).
* Extractors:
* MP3: Fix `Searched too many bytes` error by correctly ignoring trailing
non-MP3 data based on the length field in an `Info` frame
([#1480](https://github.com/androidx/media/issues/1480)).
* Text:
* TTML: Fix handling of percentage `tts:fontSize` values to ensure they
are correctly inherited from parent nodes with percentage `tts:fontSize`
values.
* Fix `IndexOutOfBoundsException` in `LegacySubtitleUtil` due to
incorrectly handling the case of the requested output start time being
greater than or equal to the final event time in the `Subtitle`
([#1516](https://github.com/androidx/media/issues/1516)).
* DRM:
* Fix `android.media.MediaCodec$CryptoException: Operation not supported
in this configuration: ERROR_DRM_CANNOT_HANDLE` error on API 31+ devices
playing L1 Widevine content. This error is caused by an incomplete
implementation of the framework
[`MediaDrm.requiresSecureDecoder`](https://developer.android.com/reference/android/media/MediaDrm#requiresSecureDecoder\(java.lang.String\))
method ([#1603](https://github.com/androidx/media/issues/1603)).
* Effect:
* Add a `release()` method to `GlObjectsProvider`.
* Session:
* Transform a double-tap of `KEYCODE_HEADSETHOOK` into a 'seek to next'
action, as
[documented](https://developer.android.com/reference/androidx/media3/session/MediaSession#media-key-events-mapping)
([#1493](https://github.com/androidx/media/issues/1493)).
* Handle `KEYCODE_HEADSETHOOK` as a 'play' command in
`MediaButtonReceiver` when deciding whether to ignore it to avoid a
`ForegroundServiceDidNotStartInTimeException`
([#1581](https://github.com/androidx/media/issues/1581)).
* RTSP Extension:
* Skip invalid Media Descriptions in SDP parsing
([#1087](https://github.com/androidx/media/issues/1472)).
### 1.4.0 (2024-07-24)
This release includes the following changes since the

api.txt

@@ -26,7 +26,7 @@ package androidx.media3.common {
}
public final class AudioAttributes {
method public androidx.media3.common.AudioAttributes.AudioAttributesV21 getAudioAttributesV21();
method @RequiresApi(21) public androidx.media3.common.AudioAttributes.AudioAttributesV21 getAudioAttributesV21();
field public static final androidx.media3.common.AudioAttributes DEFAULT;
field @androidx.media3.common.C.AudioAllowedCapturePolicy public final int allowedCapturePolicy;
field @androidx.media3.common.C.AudioContentType public final int contentType;
@@ -35,7 +35,7 @@ package androidx.media3.common {
field @androidx.media3.common.C.AudioUsage public final int usage;
}
public static final class AudioAttributes.AudioAttributesV21 {
@RequiresApi(21) public static final class AudioAttributes.AudioAttributesV21 {
field public final android.media.AudioAttributes audioAttributes;
}
@@ -79,7 +79,6 @@ package androidx.media3.common {
field public static final java.util.UUID PLAYREADY_UUID;
field public static final float RATE_UNSET = -3.4028235E38f;
field public static final int ROLE_FLAG_ALTERNATE = 2; // 0x2
field public static final int ROLE_FLAG_AUXILIARY = 32768; // 0x8000
field public static final int ROLE_FLAG_CAPTION = 64; // 0x40
field public static final int ROLE_FLAG_COMMENTARY = 8; // 0x8
field public static final int ROLE_FLAG_DESCRIBES_MUSIC_AND_SOUND = 1024; // 0x400
@@ -157,7 +156,7 @@ package androidx.media3.common {
@IntDef(open=true, value={androidx.media3.common.C.CRYPTO_TYPE_UNSUPPORTED, androidx.media3.common.C.CRYPTO_TYPE_NONE, androidx.media3.common.C.CRYPTO_TYPE_FRAMEWORK}) @java.lang.annotation.Documented @java.lang.annotation.Retention(java.lang.annotation.RetentionPolicy.SOURCE) @java.lang.annotation.Target(java.lang.annotation.ElementType.TYPE_USE) public static @interface C.CryptoType {
}
@IntDef(flag=true, value={androidx.media3.common.C.ROLE_FLAG_MAIN, androidx.media3.common.C.ROLE_FLAG_ALTERNATE, androidx.media3.common.C.ROLE_FLAG_SUPPLEMENTARY, androidx.media3.common.C.ROLE_FLAG_COMMENTARY, androidx.media3.common.C.ROLE_FLAG_DUB, androidx.media3.common.C.ROLE_FLAG_EMERGENCY, androidx.media3.common.C.ROLE_FLAG_CAPTION, androidx.media3.common.C.ROLE_FLAG_SUBTITLE, androidx.media3.common.C.ROLE_FLAG_SIGN, androidx.media3.common.C.ROLE_FLAG_DESCRIBES_VIDEO, androidx.media3.common.C.ROLE_FLAG_DESCRIBES_MUSIC_AND_SOUND, androidx.media3.common.C.ROLE_FLAG_ENHANCED_DIALOG_INTELLIGIBILITY, androidx.media3.common.C.ROLE_FLAG_TRANSCRIBES_DIALOG, androidx.media3.common.C.ROLE_FLAG_EASY_TO_READ, androidx.media3.common.C.ROLE_FLAG_TRICK_PLAY, androidx.media3.common.C.ROLE_FLAG_AUXILIARY}) @java.lang.annotation.Documented @java.lang.annotation.Retention(java.lang.annotation.RetentionPolicy.SOURCE) @java.lang.annotation.Target({java.lang.annotation.ElementType.FIELD, java.lang.annotation.ElementType.METHOD, java.lang.annotation.ElementType.PARAMETER, java.lang.annotation.ElementType.LOCAL_VARIABLE, java.lang.annotation.ElementType.TYPE_USE}) public static @interface C.RoleFlags {
@IntDef(flag=true, value={androidx.media3.common.C.ROLE_FLAG_MAIN, androidx.media3.common.C.ROLE_FLAG_ALTERNATE, androidx.media3.common.C.ROLE_FLAG_SUPPLEMENTARY, androidx.media3.common.C.ROLE_FLAG_COMMENTARY, androidx.media3.common.C.ROLE_FLAG_DUB, androidx.media3.common.C.ROLE_FLAG_EMERGENCY, androidx.media3.common.C.ROLE_FLAG_CAPTION, androidx.media3.common.C.ROLE_FLAG_SUBTITLE, androidx.media3.common.C.ROLE_FLAG_SIGN, androidx.media3.common.C.ROLE_FLAG_DESCRIBES_VIDEO, androidx.media3.common.C.ROLE_FLAG_DESCRIBES_MUSIC_AND_SOUND, androidx.media3.common.C.ROLE_FLAG_ENHANCED_DIALOG_INTELLIGIBILITY, androidx.media3.common.C.ROLE_FLAG_TRANSCRIBES_DIALOG, androidx.media3.common.C.ROLE_FLAG_EASY_TO_READ, androidx.media3.common.C.ROLE_FLAG_TRICK_PLAY}) @java.lang.annotation.Documented @java.lang.annotation.Retention(java.lang.annotation.RetentionPolicy.SOURCE) @java.lang.annotation.Target({java.lang.annotation.ElementType.FIELD, java.lang.annotation.ElementType.METHOD, java.lang.annotation.ElementType.PARAMETER, java.lang.annotation.ElementType.LOCAL_VARIABLE, java.lang.annotation.ElementType.TYPE_USE}) public static @interface C.RoleFlags {
}
@IntDef(flag=true, value={androidx.media3.common.C.SELECTION_FLAG_DEFAULT, androidx.media3.common.C.SELECTION_FLAG_FORCED, androidx.media3.common.C.SELECTION_FLAG_AUTOSELECT}) @java.lang.annotation.Documented @java.lang.annotation.Retention(java.lang.annotation.RetentionPolicy.SOURCE) @java.lang.annotation.Target({java.lang.annotation.ElementType.FIELD, java.lang.annotation.ElementType.METHOD, java.lang.annotation.ElementType.PARAMETER, java.lang.annotation.ElementType.LOCAL_VARIABLE, java.lang.annotation.ElementType.TYPE_USE}) public static @interface C.SelectionFlags {
@@ -549,7 +548,6 @@ package androidx.media3.common {
field public static final String APPLICATION_PGS = "application/pgs";
field @Deprecated public static final String APPLICATION_RAWCC = "application/x-rawcc";
field public static final String APPLICATION_RTSP = "application/x-rtsp";
field public static final String APPLICATION_SDP = "application/sdp";
field public static final String APPLICATION_SS = "application/vnd.ms-sstr+xml";
field public static final String APPLICATION_SUBRIP = "application/x-subrip";
field public static final String APPLICATION_TTML = "application/ttml+xml";
@@ -1192,7 +1190,7 @@ package androidx.media3.common {
field public static final androidx.media3.common.VideoSize UNKNOWN;
field @IntRange(from=0) public final int height;
field @FloatRange(from=0, fromInclusive=false) public final float pixelWidthHeightRatio;
field @Deprecated @IntRange(from=0, to=359) public final int unappliedRotationDegrees;
field @IntRange(from=0, to=359) public final int unappliedRotationDegrees;
field @IntRange(from=0) public final int width;
}


@@ -19,7 +19,7 @@ buildscript {
dependencies {
classpath 'com.android.tools.build:gradle:8.3.2'
classpath 'com.google.android.gms:strict-version-matcher-plugin:1.2.4'
classpath 'org.jetbrains.kotlin:kotlin-gradle-plugin:1.9.10'
classpath 'org.jetbrains.kotlin:kotlin-gradle-plugin:1.9.0'
}
}
allprojects {


@@ -14,8 +14,6 @@
apply from: "$gradle.ext.androidxMediaSettingsDir/constants.gradle"
apply plugin: 'com.android.library'
group = 'androidx.media3'
android {
compileSdkVersion project.ext.compileSdkVersion
@@ -27,6 +25,7 @@ android {
aarMetadata {
minCompileSdk = project.ext.compileSdkVersion
}
multiDexEnabled true
}
compileOptions {
@@ -41,3 +40,7 @@ android {
unitTests.includeAndroidResources true
}
}
dependencies {
androidTestImplementation 'androidx.multidex:multidex:' + androidxMultidexVersion
}


@@ -12,24 +12,23 @@
// See the License for the specific language governing permissions and
// limitations under the License.
project.ext {
releaseVersion = '1.5.1'
releaseVersionCode = 1_005_001_3_00
minSdkVersion = 21
releaseVersion = '1.4.0'
releaseVersionCode = 1_004_000_3_00
minSdkVersion = 19
// See https://developer.android.com/training/cars/media/automotive-os#automotive-module
automotiveMinSdkVersion = 28
appTargetSdkVersion = 34
// Upgrading this requires [Internal ref: b/193254928] to be fixed, or some
// additional robolectric config.
targetSdkVersion = 30
compileSdkVersion = 35
compileSdkVersion = 34
dexmakerVersion = '2.28.3'
// Use the same JUnit version as the Android repo:
// https://cs.android.com/android/platform/superproject/main/+/main:external/junit/METADATA
junitVersion = '4.13.2'
// Use the same Guava version as the Android repo:
// https://cs.android.com/android/platform/superproject/main/+/main:external/guava/METADATA
guavaVersion = '33.3.1-android'
glideVersion = '4.14.2'
guavaVersion = '33.0.0-android'
kotlinxCoroutinesVersion = '1.8.1'
leakCanaryVersion = '2.10'
mockitoVersion = '3.12.4'
@@ -39,15 +38,18 @@ project.ext {
errorProneVersion = '2.18.0'
jsr305Version = '3.0.2'
kotlinAnnotationsVersion = '1.9.0'
androidxAnnotationVersion = '1.6.0'
// Updating this to 1.4.0+ will import Kotlin stdlib [internal ref: b/277891049].
androidxAnnotationVersion = '1.3.0'
androidxAnnotationExperimentalVersion = '1.3.1'
androidxAppCompatVersion = '1.6.1'
androidxCollectionVersion = '1.2.0'
androidxConstraintLayoutVersion = '2.1.4'
// Updating this to 1.9.0+ will import Kotlin stdlib [internal ref: b/277891049].
androidxCoreVersion = '1.8.0'
androidxExifInterfaceVersion = '1.3.6'
androidxLifecycleVersion = '2.6.0'
androidxMediaVersion = '1.7.0'
androidxMultidexVersion = '2.0.1'
androidxRecyclerViewVersion = '1.3.0'
androidxMaterialVersion = '1.8.0'
androidxTestCoreVersion = '1.5.0'

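The `releaseVersionCode` values above follow a readable digit grouping: judging from the two versions shown (`1.5.1` → `1_005_001_3_00` and `1.4.0` → `1_004_000_3_00`), the code appears to pack the major version, a zero-padded three-digit minor, a zero-padded three-digit patch, and a fixed `300` suffix. A small sketch of that packing (the scheme is inferred from these two data points, not from any documented rule):

```java
public class ReleaseVersionCode {
  // Inferred layout: <major><minor:3 digits><patch:3 digits>300.
  // Matches 1.5.1 -> 1005001300 and 1.4.0 -> 1004000300.
  public static long versionCode(int major, int minor, int patch) {
    return Long.parseLong(String.format("%d%03d%03d300", major, minor, patch));
  }

  public static void main(String[] args) {
    System.out.println(versionCode(1, 5, 1)); // prints 1005001300
    System.out.println(versionCode(1, 4, 0)); // prints 1004000300
  }
}
```

The underscore grouping in the Gradle literals (`1_005_001_3_00`) is only a readability aid; the compiled value is the same plain integer.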
View file

@@ -24,9 +24,6 @@ if (gradle.ext.has('androidxMediaModulePrefix')) {
include modulePrefix + 'lib-common'
project(modulePrefix + 'lib-common').projectDir = new File(rootDir, 'libraries/common')
include modulePrefix + 'lib-common-ktx'
project(modulePrefix + 'lib-common-ktx').projectDir = new File(rootDir, 'libraries/common_ktx')
include modulePrefix + 'lib-container'
project(modulePrefix + 'lib-container').projectDir = new File(rootDir, 'libraries/container')
@@ -75,8 +72,6 @@ include modulePrefix + 'lib-decoder-ffmpeg'
project(modulePrefix + 'lib-decoder-ffmpeg').projectDir = new File(rootDir, 'libraries/decoder_ffmpeg')
include modulePrefix + 'lib-decoder-flac'
project(modulePrefix + 'lib-decoder-flac').projectDir = new File(rootDir, 'libraries/decoder_flac')
include modulePrefix + 'lib-decoder-iamf'
project(modulePrefix + 'lib-decoder-iamf').projectDir = new File(rootDir, 'libraries/decoder_iamf')
if (gradle.ext.has('androidxMediaEnableMidiModule') && gradle.ext.androidxMediaEnableMidiModule) {
include modulePrefix + 'lib-decoder-midi'
project(modulePrefix + 'lib-decoder-midi').projectDir = new File(rootDir, 'libraries/decoder_midi')

View file

@@ -29,6 +29,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@@ -61,6 +62,7 @@ dependencies {
implementation project(modulePrefix + 'lib-ui')
implementation project(modulePrefix + 'lib-cast')
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'androidx.recyclerview:recyclerview:' + androidxRecyclerViewVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
}

View file

@@ -23,6 +23,7 @@
<uses-sdk/>
<application
android:name="androidx.multidex.MultiDexApplication"
android:label="@string/application_name"
android:icon="@mipmap/ic_launcher"
android:largeHeap="true"

View file

@@ -1,11 +1,11 @@
/*
* Copyright 2024 The Android Open Source Project
* Copyright 2020 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,11 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
@NonNullApi
package androidx.media3.decoder.iamf;
package androidx.media3.demo.cast;
import androidx.media3.common.util.NonNullApi;
import androidx.multidex.MultiDexApplication;
// Note: Multidex is enabled in code not AndroidManifest.xml because the internal build system
// doesn't dejetify MultiDexApplication in AndroidManifest.xml.
/** Application for multidex support. */
public final class DemoApplication extends MultiDexApplication {}

View file

@@ -32,7 +32,7 @@ android {
defaultConfig {
versionName project.ext.releaseVersion
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
minSdkVersion 21
targetSdkVersion project.ext.appTargetSdkVersion
}
@@ -56,13 +56,7 @@ android {
compose true
}
composeOptions {
kotlinCompilerExtensionVersion = "1.5.3"
}
testOptions {
unitTests {
includeAndroidResources = true
}
kotlinCompilerExtensionVersion = "1.5.0"
}
}
@@ -79,9 +73,4 @@ dependencies {
// For detecting and debugging leaks only. LeakCanary is not needed for demo app to work.
debugImplementation 'com.squareup.leakcanary:leakcanary-android:' + leakCanaryVersion
testImplementation 'org.jetbrains.kotlinx:kotlinx-coroutines-test:' + kotlinxCoroutinesVersion
testImplementation 'org.robolectric:robolectric:' + robolectricVersion
testImplementation project(modulePrefix + 'test-utils')
}

View file

@@ -13,7 +13,7 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
<resources>
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.Media3ComposeDemo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
@@ -25,7 +25,9 @@
<item name="colorSecondaryVariant">@color/teal_200</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor">?attr/colorPrimaryVariant</item>
<item name="android:statusBarColor" tools:targetApi="l">
?attr/colorPrimaryVariant
</item>
<!-- Customize your theme here. -->
</style>
</resources>

View file

@@ -13,7 +13,7 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
<resources>
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.Media3ComposeDemo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
@@ -25,7 +25,9 @@
<item name="colorSecondaryVariant">@color/teal_700</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor">?attr/colorPrimaryVariant</item>
<item name="android:statusBarColor" tools:targetApi="l">
?attr/colorPrimaryVariant
</item>
<!-- Customize your theme here. -->
</style>
</resources>

View file

@@ -29,8 +29,9 @@ android {
defaultConfig {
versionName project.ext.releaseVersion
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
minSdkVersion 21
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@@ -54,9 +55,9 @@ dependencies {
implementation project(modulePrefix + 'lib-effect')
implementation project(modulePrefix + 'lib-exoplayer')
implementation project(modulePrefix + 'lib-exoplayer-dash')
implementation project(modulePrefix + 'lib-muxer')
implementation project(modulePrefix + 'lib-transformer')
implementation project(modulePrefix + 'lib-ui')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion
}

View file

@@ -15,38 +15,23 @@
*/
package androidx.media3.demo.composition;
import static androidx.media3.transformer.Composition.HDR_MODE_EXPERIMENTAL_FORCE_INTERPRET_HDR_AS_SDR;
import static androidx.media3.transformer.Composition.HDR_MODE_KEEP_HDR;
import static androidx.media3.transformer.Composition.HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_MEDIACODEC;
import static androidx.media3.transformer.Composition.HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_OPEN_GL;
import android.app.Activity;
import android.content.DialogInterface;
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.CheckBox;
import android.widget.Spinner;
import android.widget.Toast;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AlertDialog;
import androidx.appcompat.app.AppCompatActivity;
import androidx.appcompat.widget.AppCompatButton;
import androidx.appcompat.widget.AppCompatCheckBox;
import androidx.appcompat.widget.AppCompatTextView;
import androidx.media3.common.Effect;
import androidx.media3.common.MediaItem;
import androidx.media3.common.MimeTypes;
import androidx.media3.common.PlaybackException;
import androidx.media3.common.Player;
import androidx.media3.common.audio.SonicAudioProcessor;
import androidx.media3.common.util.Clock;
import androidx.media3.common.util.Log;
import androidx.media3.common.util.Util;
import androidx.media3.effect.DebugTraceUtil;
import androidx.media3.effect.LanczosResample;
import androidx.media3.effect.Presentation;
import androidx.media3.effect.RgbFilter;
import androidx.media3.transformer.Composition;
import androidx.media3.transformer.CompositionPlayer;
@@ -55,7 +40,6 @@ import androidx.media3.transformer.EditedMediaItemSequence;
import androidx.media3.transformer.Effects;
import androidx.media3.transformer.ExportException;
import androidx.media3.transformer.ExportResult;
import androidx.media3.transformer.InAppMuxer;
import androidx.media3.transformer.JsonUtil;
import androidx.media3.transformer.Transformer;
import androidx.media3.ui.PlayerView;
@@ -65,7 +49,6 @@ import androidx.recyclerview.widget.RecyclerView;
import com.google.common.base.Stopwatch;
import com.google.common.base.Ticker;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
@@ -80,19 +63,6 @@ import org.json.JSONObject;
*/
public final class CompositionPreviewActivity extends AppCompatActivity {
private static final String TAG = "CompPreviewActivity";
private static final String AUDIO_URI =
"https://storage.googleapis.com/exoplayer-test-media-0/play.mp3";
private static final String SAME_AS_INPUT_OPTION = "same as input";
private static final ImmutableMap<String, @Composition.HdrMode Integer> HDR_MODE_DESCRIPTIONS =
new ImmutableMap.Builder<String, @Composition.HdrMode Integer>()
.put("Keep HDR", HDR_MODE_KEEP_HDR)
.put("MediaCodec tone-map HDR to SDR", HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_MEDIACODEC)
.put("OpenGL tone-map HDR to SDR", HDR_MODE_TONE_MAP_HDR_TO_SDR_USING_OPEN_GL)
.put("Force Interpret HDR as SDR", HDR_MODE_EXPERIMENTAL_FORCE_INTERPRET_HDR_AS_SDR)
.build();
private static final ImmutableList<String> RESOLUTION_HEIGHTS =
ImmutableList.of(
SAME_AS_INPUT_OPTION, "144", "240", "360", "480", "720", "1080", "1440", "2160");
private ArrayList<String> sequenceAssetTitles;
private boolean[] selectedMediaItems;
@@ -105,8 +75,6 @@ public final class CompositionPreviewActivity extends AppCompatActivity {
private AppCompatButton exportButton;
private AppCompatTextView exportInformationTextView;
private Stopwatch exportStopwatch;
private boolean includeBackgroundAudioTrack;
private boolean appliesVideoEffects;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
@@ -124,28 +92,7 @@
exportInformationTextView = findViewById(R.id.export_information_text);
exportButton = findViewById(R.id.composition_export_button);
exportButton.setOnClickListener(view -> showExportSettings());
AppCompatCheckBox backgroundAudioCheckBox = findViewById(R.id.background_audio_checkbox);
backgroundAudioCheckBox.setOnCheckedChangeListener(
(compoundButton, checked) -> includeBackgroundAudioTrack = checked);
ArrayAdapter<String> resolutionHeightAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
resolutionHeightAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner resolutionHeightSpinner = findViewById(R.id.resolution_height_spinner);
resolutionHeightSpinner.setAdapter(resolutionHeightAdapter);
resolutionHeightAdapter.addAll(RESOLUTION_HEIGHTS);
ArrayAdapter<String> hdrModeAdapter = new ArrayAdapter<>(this, R.layout.spinner_item);
hdrModeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner hdrModeSpinner = findViewById(R.id.hdr_mode_spinner);
hdrModeSpinner.setAdapter(hdrModeAdapter);
hdrModeAdapter.addAll(HDR_MODE_DESCRIPTIONS.keySet());
AppCompatCheckBox applyVideoEffectsCheckBox = findViewById(R.id.apply_video_effects_checkbox);
applyVideoEffectsCheckBox.setOnCheckedChangeListener(
((compoundButton, checked) -> appliesVideoEffects = checked));
exportButton.setOnClickListener(view -> exportComposition());
presetDescriptions = getResources().getStringArray(R.array.preset_descriptions);
// Select two media items by default.
@@ -190,22 +137,9 @@
String[] presetUris = getResources().getStringArray(/* id= */ R.array.preset_uris);
int[] presetDurationsUs = getResources().getIntArray(/* id= */ R.array.preset_durations);
List<EditedMediaItem> mediaItems = new ArrayList<>();
ImmutableList.Builder<Effect> videoEffectsBuilder = new ImmutableList.Builder<>();
if (appliesVideoEffects) {
videoEffectsBuilder.add(MatrixTransformationFactory.createDizzyCropEffect());
videoEffectsBuilder.add(RgbFilter.createGrayscaleFilter());
}
Spinner resolutionHeightSpinner = findViewById(R.id.resolution_height_spinner);
String selectedResolutionHeight = String.valueOf(resolutionHeightSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedResolutionHeight)) {
int resolutionHeight = Integer.parseInt(selectedResolutionHeight);
videoEffectsBuilder.add(LanczosResample.scaleToFit(10000, resolutionHeight));
videoEffectsBuilder.add(Presentation.createForHeight(resolutionHeight));
}
ImmutableList<Effect> videoEffects = videoEffectsBuilder.build();
// Preview requires all sequences to be the same duration, so calculate main sequence duration
// and limit background sequence duration to match.
long videoSequenceDurationUs = 0;
ImmutableList<Effect> effects =
ImmutableList.of(
MatrixTransformationFactory.createDizzyCropEffect(), RgbFilter.createGrayscaleFilter());
for (int i = 0; i < selectedMediaItems.length; i++) {
if (selectedMediaItems[i]) {
SonicAudioProcessor pitchChanger = new SonicAudioProcessor();
@@ -220,47 +154,22 @@
.setEffects(
new Effects(
/* audioProcessors= */ ImmutableList.of(pitchChanger),
/* videoEffects= */ videoEffects))
/* videoEffects= */ effects))
.setDurationUs(presetDurationsUs[i]);
videoSequenceDurationUs += presetDurationsUs[i];
mediaItems.add(itemBuilder.build());
}
}
EditedMediaItemSequence videoSequence = new EditedMediaItemSequence.Builder(mediaItems).build();
List<EditedMediaItemSequence> compositionSequences = new ArrayList<>();
compositionSequences.add(videoSequence);
if (includeBackgroundAudioTrack) {
compositionSequences.add(getAudioBackgroundSequence(Util.usToMs(videoSequenceDurationUs)));
}
EditedMediaItemSequence videoSequence = new EditedMediaItemSequence(mediaItems);
SonicAudioProcessor sampleRateChanger = new SonicAudioProcessor();
sampleRateChanger.setOutputSampleRateHz(8_000);
Spinner hdrModeSpinner = findViewById(R.id.hdr_mode_spinner);
int selectedHdrMode =
HDR_MODE_DESCRIPTIONS.get(String.valueOf(hdrModeSpinner.getSelectedItem()));
return new Composition.Builder(compositionSequences)
return new Composition.Builder(/* sequences= */ ImmutableList.of(videoSequence))
.setEffects(
new Effects(
/* audioProcessors= */ ImmutableList.of(sampleRateChanger),
/* videoEffects= */ ImmutableList.of()))
.setHdrMode(selectedHdrMode)
.build();
}
private EditedMediaItemSequence getAudioBackgroundSequence(long durationMs) {
MediaItem audioMediaItem =
new MediaItem.Builder()
.setUri(AUDIO_URI)
.setClippingConfiguration(
new MediaItem.ClippingConfiguration.Builder()
.setStartPositionMs(0)
.setEndPositionMs(durationMs)
.build())
.build();
EditedMediaItem audioItem =
new EditedMediaItem.Builder(audioMediaItem).setDurationUs(59_000_000).build();
return new EditedMediaItemSequence.Builder(audioItem).build();
}
private void previewComposition() {
releasePlayer();
Composition composition = prepareComposition();
@@ -279,7 +188,6 @@
Log.e(TAG, "Preview error", error);
}
});
player.setRepeatMode(Player.REPEAT_MODE_ALL);
player.setComposition(composition);
player.prepare();
player.play();
@@ -289,7 +197,7 @@
new AlertDialog.Builder(/* context= */ this)
.setTitle(R.string.select_preset_title)
.setMultiChoiceItems(presetDescriptions, selectedMediaItems, this::selectPresetInDialog)
.setPositiveButton(R.string.ok, /* listener= */ null)
.setPositiveButton(android.R.string.ok, /* listener= */ null)
.setCancelable(false)
.create()
.show();
@@ -308,67 +216,7 @@
}
}
private void showExportSettings() {
AlertDialog.Builder alertDialogBuilder = new AlertDialog.Builder(this);
LayoutInflater inflater = this.getLayoutInflater();
View exportSettingsDialogView = inflater.inflate(R.layout.export_settings, null);
alertDialogBuilder
.setView(exportSettingsDialogView)
.setTitle(R.string.export_settings)
.setPositiveButton(
R.string.export, (dialog, id) -> exportComposition(exportSettingsDialogView))
.setNegativeButton(R.string.cancel, (dialog, id) -> dialog.dismiss());
ArrayAdapter<String> audioMimeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
audioMimeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner audioMimeSpinner = exportSettingsDialogView.findViewById(R.id.audio_mime_spinner);
audioMimeSpinner.setAdapter(audioMimeAdapter);
audioMimeAdapter.addAll(
SAME_AS_INPUT_OPTION, MimeTypes.AUDIO_AAC, MimeTypes.AUDIO_AMR_NB, MimeTypes.AUDIO_AMR_WB);
ArrayAdapter<String> videoMimeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
videoMimeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
Spinner videoMimeSpinner = exportSettingsDialogView.findViewById(R.id.video_mime_spinner);
videoMimeSpinner.setAdapter(videoMimeAdapter);
videoMimeAdapter.addAll(
SAME_AS_INPUT_OPTION,
MimeTypes.VIDEO_H263,
MimeTypes.VIDEO_H264,
MimeTypes.VIDEO_H265,
MimeTypes.VIDEO_MP4V,
MimeTypes.VIDEO_AV1);
CheckBox enableDebugTracingCheckBox =
exportSettingsDialogView.findViewById(R.id.enable_debug_tracing_checkbox);
enableDebugTracingCheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> DebugTraceUtil.enableTracing = isChecked);
// Connect producing fragmented MP4 to using Media3 Muxer
CheckBox useMedia3MuxerCheckBox =
exportSettingsDialogView.findViewById(R.id.use_media3_muxer_checkbox);
CheckBox produceFragmentedMp4CheckBox =
exportSettingsDialogView.findViewById(R.id.produce_fragmented_mp4_checkbox);
useMedia3MuxerCheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (!isChecked) {
produceFragmentedMp4CheckBox.setChecked(false);
}
});
produceFragmentedMp4CheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (isChecked) {
useMedia3MuxerCheckBox.setChecked(true);
}
});
AlertDialog dialog = alertDialogBuilder.create();
dialog.show();
}
private void exportComposition(View exportSettingsDialogView) {
private void exportComposition() {
// Cancel and clean up files from any ongoing export.
cancelExport();
@@ -389,33 +237,8 @@
}
String filePath = outputFile.getAbsolutePath();
Transformer.Builder transformerBuilder = new Transformer.Builder(/* context= */ this);
Spinner audioMimeTypeSpinner = exportSettingsDialogView.findViewById(R.id.audio_mime_spinner);
String selectedAudioMimeType = String.valueOf(audioMimeTypeSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedAudioMimeType)) {
transformerBuilder.setAudioMimeType(selectedAudioMimeType);
}
Spinner videoMimeTypeSpinner = exportSettingsDialogView.findViewById(R.id.video_mime_spinner);
String selectedVideoMimeType = String.valueOf(videoMimeTypeSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedVideoMimeType)) {
transformerBuilder.setVideoMimeType(selectedVideoMimeType);
}
CheckBox useMedia3MuxerCheckBox =
exportSettingsDialogView.findViewById(R.id.use_media3_muxer_checkbox);
CheckBox produceFragmentedMp4CheckBox =
exportSettingsDialogView.findViewById(R.id.produce_fragmented_mp4_checkbox);
if (useMedia3MuxerCheckBox.isChecked()) {
transformerBuilder.setMuxerFactory(
new InAppMuxer.Factory.Builder()
.setOutputFragmentedMp4(produceFragmentedMp4CheckBox.isChecked())
.build());
}
transformer =
transformerBuilder
new Transformer.Builder(/* context= */ this)
.addListener(
new Transformer.Listener() {
@Override
@@ -424,7 +247,6 @@
long elapsedTimeMs = exportStopwatch.elapsed(TimeUnit.MILLISECONDS);
String details =
getString(R.string.export_completed, elapsedTimeMs / 1000.f, filePath);
Log.d(TAG, DebugTraceUtil.generateTraceSummary());
Log.i(TAG, details);
exportInformationTextView.setText(details);
@@ -453,7 +275,6 @@
Toast.LENGTH_LONG)
.show();
Log.e(TAG, "Export error", exportException);
Log.d(TAG, DebugTraceUtil.generateTraceSummary());
exportInformationTextView.setText(R.string.export_error);
}
})

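A comment in the activity hunks above notes that preview requires all sequences in a composition to have the same duration, which is why the main video sequence's per-item durations are summed and the background audio clip is capped at that total. Stripped of the Media3 types, the duration-matching arithmetic looks like this (the preset durations are made-up values; `usToMs` mirrors what `Util.usToMs` does):

```java
public class SequenceDurations {
  // Microseconds to milliseconds, as Util.usToMs does in Media3.
  public static long usToMs(long us) {
    return us / 1000;
  }

  public static void main(String[] args) {
    // Hypothetical durations of the selected preset items, in microseconds.
    long[] presetDurationsUs = {5_000_000L, 10_000_000L, 3_500_000L};

    // Sum the main video sequence's duration, one item at a time.
    long videoSequenceDurationUs = 0;
    for (long durationUs : presetDurationsUs) {
      videoSequenceDurationUs += durationUs;
    }

    // The background audio clip's end position is limited to this total,
    // so both sequences preview at the same length.
    long clipEndPositionMs = usToMs(videoSequenceDurationUs);
    System.out.println(clipEndPositionMs); // prints 18500
  }
}
```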
View file

@@ -43,7 +43,7 @@
android:layout_marginBottom="8dp"
android:padding="8dp"
android:textAppearance="@style/TextAppearance.AppCompat.Medium"
android:text="@string/preview_composition" />
android:text="@string/preview_single_sequence" />
<FrameLayout
android:layout_width="match_parent"
@@ -64,7 +64,7 @@
android:id="@+id/sequence_header_text"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/video_sequence_items"
android:text="@string/single_sequence_items"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@id/composition_preview_card_view"
app:layout_constraintBottom_toTopOf="@id/composition_preset_list"/>
@@ -75,18 +75,8 @@
android:layout_height="wrap_content"
android:textAppearance="@style/TextAppearance.AppCompat.Small"
android:text="@string/edit"
app:layout_constraintStart_toEndOf="@id/sequence_header_text"
app:layout_constraintTop_toTopOf="@id/sequence_header_text"
app:layout_constraintBottom_toBottomOf="@id/sequence_header_text"/>
<androidx.appcompat.widget.AppCompatCheckBox
android:id="@+id/apply_video_effects_checkbox"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/add_effects"
app:layout_constraintStart_toEndOf="@id/edit_sequence_button"
app:layout_constraintTop_toTopOf="@id/sequence_header_text"
app:layout_constraintBottom_toBottomOf="@id/sequence_header_text" />
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintTop_toBottomOf="@id/composition_preview_card_view"/>
<androidx.recyclerview.widget.RecyclerView
android:id="@+id/composition_preset_list"
@@ -102,58 +92,7 @@
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintBottom_toTopOf="@id/background_audio_checkbox"/>
<androidx.appcompat.widget.AppCompatCheckBox
android:id="@+id/background_audio_checkbox"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text="@string/add_background_audio"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintBottom_toTopOf="@id/resolution_height_setting" />
<LinearLayout
android:id="@+id/resolution_height_setting"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="8dp"
app:layout_constraintBottom_toTopOf="@id/hdr_mode_setting">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/output_video_resolution"/>
<Spinner
android:id="@+id/resolution_height_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:id="@+id/hdr_mode_setting"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="12dp"
app:layout_constraintBottom_toTopOf="@id/preview_button">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/hdr_mode" />
<Spinner
android:id="@+id/hdr_mode_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
app:layout_constraintBottom_toTopOf="@id/composition_export_button"/>
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/composition_export_button"
@@ -161,9 +100,9 @@
android:layout_marginTop="16dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:layout_constraintStart_toEndOf="@id/preview_button"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintBottom_toBottomOf="parent"/>
app:layout_constraintBottom_toTopOf="@id/preview_button"/>
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/preview_button"
@@ -172,7 +111,7 @@
android:layout_width="wrap_content"
android:layout_height="wrap_content"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toStartOf="@id/composition_export_button"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintBottom_toBottomOf="parent"/>
</androidx.constraintlayout.widget.ConstraintLayout>

View file

@@ -1,110 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright 2024 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/export_settings_list"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
android:padding="8dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="12dp"
android:layout_marginTop="12dp">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/output_audio_mime_type"/>
<Spinner
android:id="@+id/audio_mime_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical"
android:layout_marginBottom="12dp">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/output_video_mime_type"/>
<Spinner
android:id="@+id/video_mime_spinner"
android:layout_gravity="end|center_vertical"
android:gravity="end"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical">
<TextView
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1"
android:text="@string/enable_debug_tracing"/>
<CheckBox
android:id="@+id/enable_debug_tracing_checkbox"
android:layout_gravity="end"
android:checked="false"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical">
<TextView
android:text="@string/use_media3_muxer"
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1" />
<CheckBox
android:id="@+id/use_media3_muxer_checkbox"
android:layout_gravity="end"
android:checked="false"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center_vertical">
<TextView
android:text="@string/produce_fragmented_mp4"
android:layout_height="wrap_content"
android:layout_width="0dp"
android:layout_weight="1" />
<CheckBox
android:id="@+id/produce_fragmented_mp4_checkbox"
android:layout_gravity="end"
android:checked="false"
android:layout_height="wrap_content"
android:layout_width="wrap_content"/>
</LinearLayout>
</LinearLayout>

View file

@@ -1,25 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright 2024 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<TextView
xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="wrap_content"
android:layout_height="32dp"
android:gravity="start|center_vertical"
android:paddingLeft="4dp"
android:paddingRight="4dp"
android:layout_marginLeft="4dp"
android:layout_marginRight="4dp"
android:textIsSelectable="false" />

View file

@@ -27,7 +27,7 @@
<item>H264 video and AAC audio (portrait, H &lt; W, 90°)</item>
<item>SEF slow motion with 240 fps</item>
<item>480p DASH (non-square pixels)</item>
<item>HDR (HDR10+) H265 limited range video (encoding may fail)</item>
<item>HDR (HDR10) H265 limited range video (encoding may fail)</item>
<item>HDR (HLG) H265 limited range video (encoding may fail)</item>
<item>720p H264 video with no audio</item>
<item>London JPG image (plays for 5 secs at 30 fps)</item>

View file

@@ -16,24 +16,12 @@
<resources>
<string name="app_name">Composition Demo</string>
<string name="edit">Edit</string>
<string name="add_effects">Add effects</string>
<string name="preview" translatable="false">Preview</string>
<string name="preview_composition" translatable="false">Composition preview</string>
<string name="video_sequence_items" translatable="false">Video sequence items:</string>
<string name="preview_single_sequence" translatable="false">Single sequence preview</string>
<string name="single_sequence_items" translatable="false">Single sequence items:</string>
<string name="select_preset_title" translatable="false">Choose preset input</string>
<string name="export" translatable="false">Export</string>
<string name="export_completed" translatable="false">Export completed in %.3f seconds.\nOutput: %s</string>
<string name="export_error" translatable="false">Export error</string>
<string name="export_started" translatable="false">Export started</string>
<string name="add_background_audio" translatable="false">Add background audio</string>
<string name="output_video_resolution" translatable="false">Output video resolution</string>
<string name="hdr_mode" translatable="false">HDR mode</string>
<string name="ok" translatable="false">OK</string>
<string name="cancel" translatable="false">Cancel</string>
<string name="export_settings" translatable="false">Export Settings</string>
<string name="output_audio_mime_type" translatable="false">Output audio MIME type</string>
<string name="output_video_mime_type" translatable="false">Output video MIME type</string>
<string name="enable_debug_tracing" translatable="false">Enable debug tracing</string>
<string name="use_media3_muxer" translatable="false">Use Media3 muxer</string>
<string name="produce_fragmented_mp4" translatable="false">Produce fragmented MP4</string>
</resources>

View file

@ -29,6 +29,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -54,5 +55,6 @@ dependencies {
implementation project(modulePrefix + 'lib-exoplayer-smoothstreaming')
implementation project(modulePrefix + 'lib-ui')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion
}

View file

@ -22,6 +22,7 @@
<uses-sdk/>
<application
android:name="androidx.multidex.MultiDexApplication"
android:allowBackup="false"
android:icon="@mipmap/ic_launcher"
android:label="@string/application_name">

View file

@ -31,6 +31,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -74,6 +75,7 @@ dependencies {
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
implementation project(modulePrefix + 'lib-exoplayer')
implementation project(modulePrefix + 'lib-exoplayer-dash')
@ -87,7 +89,6 @@ dependencies {
withDecoderExtensionsImplementation project(modulePrefix + 'lib-decoder-ffmpeg')
withDecoderExtensionsImplementation project(modulePrefix + 'lib-decoder-flac')
withDecoderExtensionsImplementation project(modulePrefix + 'lib-decoder-opus')
withDecoderExtensionsImplementation project(modulePrefix + 'lib-decoder-iamf')
withDecoderExtensionsImplementation project(modulePrefix + 'lib-decoder-vp9')
withDecoderExtensionsImplementation project(modulePrefix + 'lib-decoder-midi')
withDecoderExtensionsImplementation project(modulePrefix + 'lib-datasource-rtmp')

View file

@ -40,6 +40,7 @@
android:largeHeap="true"
android:allowBackup="false"
android:supportsRtl="true"
android:name="androidx.multidex.MultiDexApplication"
tools:targetApi="29">
<activity android:name=".SampleChooserActivity"

View file

@ -758,10 +758,6 @@
{
"name": "One hour frame counter (MP4)",
"uri": "https://storage.googleapis.com/exoplayer-test-media-1/mp4/frame-counter-one-hour.mp4"
},
{
"name": "Immersive Audio Format Sample (MP4, IAMF)",
"uri": "https://github.com/AOMediaCodec/libiamf/raw/main/tests/test_000036_s.mp4"
}
]
},

View file

@ -63,7 +63,7 @@ public class DemoDownloadService extends DownloadService {
@Override
protected Scheduler getScheduler() {
return new PlatformScheduler(this, JOB_ID);
return Util.SDK_INT >= 21 ? new PlatformScheduler(this, JOB_ID) : null;
}
@Override

View file

@ -18,7 +18,6 @@ package androidx.media3.demo.main;
import android.content.Context;
import android.net.http.HttpEngine;
import android.os.Build;
import android.os.ext.SdkExtensions;
import androidx.annotation.OptIn;
import androidx.media3.database.DatabaseProvider;
import androidx.media3.database.StandaloneDatabaseProvider;
@ -50,6 +49,16 @@ public final class DemoUtil {
public static final String DOWNLOAD_NOTIFICATION_CHANNEL_ID = "download_channel";
/**
* Whether the demo application uses Cronet for networking when {@link HttpEngine} is not
* supported. Note that Cronet does not provide automatic support for cookies
* (https://github.com/google/ExoPlayer/issues/5975).
*
* <p>If set to false, the {@link DefaultHttpDataSource} is used with a {@link CookieManager}
* configured in {@link #getHttpDataSourceFactory} when {@link HttpEngine} is not supported.
*/
private static final boolean ALLOW_CRONET_FOR_NETWORKING = true;
private static final String TAG = "DemoUtil";
private static final String DOWNLOAD_CONTENT_DIRECTORY = "downloads";
@ -97,20 +106,22 @@ public final class DemoUtil {
return httpDataSourceFactory;
}
context = context.getApplicationContext();
if (Build.VERSION.SDK_INT >= 30
&& SdkExtensions.getExtensionVersion(Build.VERSION_CODES.S) >= 7) {
if (Build.VERSION.SDK_INT >= 34) {
HttpEngine httpEngine = new HttpEngine.Builder(context).build();
httpDataSourceFactory =
new HttpEngineDataSource.Factory(httpEngine, Executors.newSingleThreadExecutor());
return httpDataSourceFactory;
}
@Nullable CronetEngine cronetEngine = CronetUtil.buildCronetEngine(context);
if (cronetEngine != null) {
httpDataSourceFactory =
new CronetDataSource.Factory(cronetEngine, Executors.newSingleThreadExecutor());
return httpDataSourceFactory;
if (ALLOW_CRONET_FOR_NETWORKING) {
@Nullable CronetEngine cronetEngine = CronetUtil.buildCronetEngine(context);
if (cronetEngine != null) {
httpDataSourceFactory =
new CronetDataSource.Factory(cronetEngine, Executors.newSingleThreadExecutor());
return httpDataSourceFactory;
}
}
// The device doesn't support HttpEngine and we failed to instantiate a CronetEngine.
// The device doesn't support HttpEngine or we don't want to allow Cronet, or we failed to
// instantiate a CronetEngine.
CookieManager cookieManager = new CookieManager();
cookieManager.setCookiePolicy(CookiePolicy.ACCEPT_ORIGINAL_SERVER);
CookieHandler.setDefault(cookieManager);
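The hunk above selects an HTTP stack in a fixed fallback order: the platform `HttpEngine` when supported, then Cronet (gated by the `ALLOW_CRONET_FOR_NETWORKING` flag in the 1.4.0 variant), then `DefaultHttpDataSource` with a `CookieManager` installed. As a plain-Java sketch of just that selection order — the class and method names here are hypothetical, not Media3 API:

```java
/** Hypothetical illustration of the fallback order in getHttpDataSourceFactory. */
public class HttpStackChooser {
  public static String choose(
      boolean httpEngineSupported, boolean allowCronet, boolean cronetAvailable) {
    if (httpEngineSupported) {
      return "HttpEngineDataSource"; // Platform HttpEngine takes priority when available.
    }
    if (allowCronet && cronetAvailable) {
      return "CronetDataSource"; // Cronet, if the flag permits and an engine was built.
    }
    return "DefaultHttpDataSource"; // Final fallback, with a CookieManager configured.
  }
}
```

Note that a failed `CronetEngine` instantiation and a disabled flag both land on the same fallback, matching the combined comment in the hunk.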

View file

@ -45,6 +45,7 @@ import android.widget.ExpandableListView.OnChildClickListener;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;
import androidx.annotation.DoNotInline;
import androidx.annotation.Nullable;
import androidx.annotation.OptIn;
import androidx.annotation.RequiresApi;
@ -666,6 +667,7 @@ public class SampleChooserActivity extends AppCompatActivity
@RequiresApi(33)
private static class Api33 {
@DoNotInline
public static String getPostNotificationPermissionString() {
return Manifest.permission.POST_NOTIFICATIONS;
}

View file

@ -34,6 +34,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -64,6 +65,7 @@ dependencies {
implementation 'androidx.lifecycle:lifecycle-common:' + androidxLifecycleVersion
implementation 'androidx.lifecycle:lifecycle-runtime-ktx:' + androidxLifecycleVersion
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-guava:' + kotlinxCoroutinesVersion
implementation project(modulePrefix + 'lib-ui')

View file

@ -14,6 +14,7 @@
limitations under the License.
-->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="androidx.media3.demo.session">
<uses-sdk/>
@ -22,10 +23,12 @@
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PLAYBACK" />
<application
android:name="androidx.multidex.MultiDexApplication"
android:allowBackup="false"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:theme="@style/Theme.Media3Demo">
android:theme="@style/Theme.Media3Demo"
tools:replace="android:name">
<!-- Declare that this session demo supports Android Auto. -->
<meta-data

View file

@ -13,7 +13,7 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
<resources>
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.Media3Demo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
@ -25,7 +25,9 @@
<item name="colorSecondaryVariant">@color/teal_200</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor">?attr/colorPrimaryVariant</item>
<item name="android:statusBarColor" tools:targetApi="l">
?attr/colorPrimaryVariant
</item>
<!-- Customize your theme here. -->
</style>
</resources>

View file

@ -13,7 +13,7 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
<resources>
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.Media3Demo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
@ -25,7 +25,9 @@
<item name="colorSecondaryVariant">@color/teal_700</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor">?attr/colorPrimaryVariant</item>
<item name="android:statusBarColor" tools:targetApi="l">
?attr/colorPrimaryVariant
</item>
<!-- Customize your theme here. -->
</style>
</resources>

View file

@ -34,6 +34,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.automotiveMinSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -59,6 +60,7 @@ android {
dependencies {
implementation 'androidx.core:core-ktx:' + androidxCoreVersion
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
implementation project(modulePrefix + 'lib-session')
implementation project(modulePrefix + 'demo-session-service')

View file

@ -14,6 +14,7 @@
limitations under the License.
-->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="androidx.media3.demo.session.automotive">
<uses-sdk/>
@ -38,11 +39,13 @@
android:resource="@xml/automotive_app_desc"/>
<application
android:name="androidx.multidex.MultiDexApplication"
android:allowBackup="false"
android:taskAffinity=""
android:appCategory="audio"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name">
android:label="@string/app_name"
tools:replace="android:name">
<meta-data
android:name="androidx.car.app.TintableAttributionIcon"

View file

@ -33,6 +33,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -53,6 +54,7 @@ android {
dependencies {
implementation 'androidx.core:core-ktx:' + androidxCoreVersion
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation project(modulePrefix + 'lib-exoplayer')
implementation project(modulePrefix + 'lib-exoplayer-dash')
implementation project(modulePrefix + 'lib-exoplayer-hls')

View file

@ -24,13 +24,11 @@ import android.os.Build
import androidx.annotation.OptIn
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat
import androidx.core.os.bundleOf
import androidx.media3.common.AudioAttributes
import androidx.media3.common.util.UnstableApi
import androidx.media3.demo.session.service.R
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.exoplayer.util.EventLogger
import androidx.media3.session.MediaConstants
import androidx.media3.session.MediaLibraryService
import androidx.media3.session.MediaSession
import androidx.media3.session.MediaSession.ControllerInfo
@ -113,17 +111,6 @@ open class DemoPlaybackService : MediaLibraryService() {
MediaLibrarySession.Builder(this, player, createLibrarySessionCallback())
.also { builder -> getSingleTopActivity()?.let { builder.setSessionActivity(it) } }
.build()
.also { mediaLibrarySession ->
// The media session always supports skip, except at the start and end of the playlist.
// Reserve the space for the skip action in these cases to avoid custom actions jumping
// around when the user skips.
mediaLibrarySession.setSessionExtras(
bundleOf(
MediaConstants.EXTRAS_KEY_SLOT_RESERVATION_SEEK_TO_PREV to true,
MediaConstants.EXTRAS_KEY_SLOT_RESERVATION_SEEK_TO_NEXT to true,
)
)
}
}
@OptIn(UnstableApi::class) // MediaSessionService.Listener

View file

@ -34,6 +34,7 @@ android {
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -79,6 +80,7 @@ dependencies {
implementation 'androidx.core:core-ktx:' + androidxCoreVersion
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
implementation project(modulePrefix + 'lib-exoplayer')
implementation project(modulePrefix + 'lib-exoplayer-dash')

View file

@ -14,14 +14,17 @@
limitations under the License.
-->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="androidx.media3.demo.shortform">
<application
android:allowBackup="false"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:name="androidx.multidex.MultiDexApplication"
android:theme="@style/Theme.MaterialComponents.DayNight.NoActionBar"
android:taskAffinity="">
android:taskAffinity=""
tools:replace="android:name">
<activity
android:exported="true"
android:name=".MainActivity">

View file

@ -15,13 +15,16 @@
*/
package androidx.media3.demo.shortform
import android.content.Context
import android.os.Handler
import android.os.Looper
import androidx.annotation.OptIn
import androidx.media3.common.Player
import androidx.media3.common.util.UnstableApi
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager.Builder
import androidx.media3.exoplayer.LoadControl
import androidx.media3.exoplayer.RenderersFactory
import androidx.media3.exoplayer.upstream.BandwidthMeter
import androidx.media3.exoplayer.util.EventLogger
import com.google.common.collect.BiMap
import com.google.common.collect.HashBiMap
@ -31,7 +34,14 @@ import java.util.LinkedList
import java.util.Queue
@OptIn(UnstableApi::class)
class PlayerPool(private val numberOfPlayers: Int, preloadManagerBuilder: Builder) {
class PlayerPool(
private val numberOfPlayers: Int,
context: Context,
playbackLooper: Looper,
loadControl: LoadControl,
renderersFactory: RenderersFactory,
bandwidthMeter: BandwidthMeter,
) {
/** Creates a player instance to be used by the pool. */
interface PlayerFactory {
@ -42,7 +52,8 @@ class PlayerPool(private val numberOfPlayers: Int, preloadManagerBuilder: Builde
private val availablePlayerQueue: Queue<Int> = LinkedList()
private val playerMap: BiMap<Int, ExoPlayer> = Maps.synchronizedBiMap(HashBiMap.create())
private val playerRequestTokenSet: MutableSet<Int> = Collections.synchronizedSet(HashSet<Int>())
private val playerFactory: PlayerFactory = DefaultPlayerFactory(preloadManagerBuilder)
private val playerFactory: PlayerFactory =
DefaultPlayerFactory(context, playbackLooper, loadControl, renderersFactory, bandwidthMeter)
fun acquirePlayer(token: Int, callback: (ExoPlayer) -> Unit) {
synchronized(playerMap) {
@ -115,11 +126,23 @@ class PlayerPool(private val numberOfPlayers: Int, preloadManagerBuilder: Builde
}
@OptIn(UnstableApi::class)
private class DefaultPlayerFactory(private val preloadManagerBuilder: Builder) : PlayerFactory {
private class DefaultPlayerFactory(
private val context: Context,
private val playbackLooper: Looper,
private val loadControl: LoadControl,
private val renderersFactory: RenderersFactory,
private val bandwidthMeter: BandwidthMeter,
) : PlayerFactory {
private var playerCounter = 0
override fun createPlayer(): ExoPlayer {
val player = preloadManagerBuilder.buildExoPlayer()
val player =
ExoPlayer.Builder(context)
.setPlaybackLooper(playbackLooper)
.setLoadControl(loadControl)
.setRenderersFactory(renderersFactory)
.setBandwidthMeter(bandwidthMeter)
.build()
player.addAnalyticsListener(EventLogger("player-$playerCounter"))
playerCounter++
player.repeatMode = ExoPlayer.REPEAT_MODE_ONE
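The `PlayerPool` refactoring above changes how players are constructed (via a `DefaultPreloadManager.Builder` versus an explicit `ExoPlayer.Builder`), but the pooling idea is the same: hand out a bounded number of reusable player instances. A minimal generic sketch of that bounded-reuse pattern — hypothetical, not the demo's `PlayerPool`, which additionally queues acquisition callbacks and tracks players in a `BiMap`:

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Supplier;

/** Minimal bounded object pool: reuse released instances, create only under the cap. */
public class SimplePool<T> {
  private final int maxSize;
  private final Supplier<T> factory;
  private final Queue<T> available = new ArrayDeque<>();
  private int created = 0;

  public SimplePool(int maxSize, Supplier<T> factory) {
    this.maxSize = maxSize;
    this.factory = factory;
  }

  /** Returns a pooled or newly created instance, or null when the pool is exhausted. */
  public synchronized T acquire() {
    if (!available.isEmpty()) {
      return available.poll();
    }
    if (created < maxSize) {
      created++;
      return factory.get();
    }
    return null; // Exhausted; the real demo queues a callback until a release happens.
  }

  public synchronized void release(T instance) {
    available.add(instance);
  }
}
```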

View file

@ -25,7 +25,7 @@ import androidx.viewpager2.widget.ViewPager2
class ViewPagerActivity : AppCompatActivity() {
private lateinit var viewPagerView: ViewPager2
private lateinit var onPageChangeCallback: ViewPager2.OnPageChangeCallback
private lateinit var adapter: ViewPagerMediaAdapter
private var numberOfPlayers = 3
private var mediaItemDatabase = MediaItemDatabase()
@ -40,24 +40,23 @@ class ViewPagerActivity : AppCompatActivity() {
Log.d(TAG, "Using a pool of $numberOfPlayers players")
viewPagerView = findViewById(R.id.viewPager)
viewPagerView.offscreenPageLimit = 1
}
override fun onStart() {
super.onStart()
val adapter = ViewPagerMediaAdapter(mediaItemDatabase, numberOfPlayers, applicationContext)
viewPagerView.adapter = adapter
onPageChangeCallback =
viewPagerView.registerOnPageChangeCallback(
object : ViewPager2.OnPageChangeCallback() {
override fun onPageSelected(position: Int) {
adapter.onPageSelected(position)
}
}
viewPagerView.registerOnPageChangeCallback(onPageChangeCallback)
)
}
override fun onStart() {
super.onStart()
adapter = ViewPagerMediaAdapter(mediaItemDatabase, numberOfPlayers, this)
viewPagerView.adapter = adapter
}
override fun onStop() {
viewPagerView.unregisterOnPageChangeCallback(onPageChangeCallback)
viewPagerView.adapter = null
adapter.onDestroy()
super.onStop()
}
}

View file

@ -16,6 +16,8 @@
package androidx.media3.demo.shortform.viewpager
import android.content.Context
import android.os.HandlerThread
import android.os.Process
import android.view.LayoutInflater
import android.view.ViewGroup
import androidx.annotation.OptIn
@ -27,9 +29,14 @@ import androidx.media3.demo.shortform.MediaItemDatabase
import androidx.media3.demo.shortform.PlayerPool
import androidx.media3.demo.shortform.R
import androidx.media3.exoplayer.DefaultLoadControl
import androidx.media3.exoplayer.DefaultRendererCapabilitiesList
import androidx.media3.exoplayer.DefaultRenderersFactory
import androidx.media3.exoplayer.source.DefaultMediaSourceFactory
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager.Status.STAGE_LOADED_FOR_DURATION_MS
import androidx.media3.exoplayer.source.preload.DefaultPreloadManager.Status.STAGE_LOADED_TO_POSITION_MS
import androidx.media3.exoplayer.source.preload.TargetPreloadStatusControl
import androidx.media3.exoplayer.trackselection.DefaultTrackSelector
import androidx.media3.exoplayer.upstream.DefaultBandwidthMeter
import androidx.recyclerview.widget.RecyclerView
import kotlin.math.abs
@ -39,11 +46,13 @@ class ViewPagerMediaAdapter(
numberOfPlayers: Int,
context: Context,
) : RecyclerView.Adapter<ViewPagerMediaHolder>() {
private val playbackThread: HandlerThread =
HandlerThread("playback-thread", Process.THREAD_PRIORITY_AUDIO)
private val preloadManager: DefaultPreloadManager
private val currentMediaItemsAndIndexes: ArrayDeque<Pair<MediaItem, Int>> = ArrayDeque()
private var playerPool: PlayerPool
private val holderMap: MutableMap<Int, ViewPagerMediaHolder>
private val preloadControl: DefaultPreloadControl
private var currentPlayingIndex: Int = C.INDEX_UNSET
companion object {
private const val TAG = "ViewPagerMediaAdapter"
@ -55,6 +64,7 @@ class ViewPagerMediaAdapter(
}
init {
playbackThread.start()
val loadControl =
DefaultLoadControl.Builder()
.setBufferDurationsMs(
@ -65,26 +75,35 @@ class ViewPagerMediaAdapter(
)
.setPrioritizeTimeOverSizeThresholds(true)
.build()
preloadControl = DefaultPreloadControl()
val preloadManagerBuilder =
DefaultPreloadManager.Builder(context.applicationContext, preloadControl)
.setLoadControl(loadControl)
playerPool = PlayerPool(numberOfPlayers, preloadManagerBuilder)
val renderersFactory = DefaultRenderersFactory(context)
playerPool =
PlayerPool(
numberOfPlayers,
context,
playbackThread.looper,
loadControl,
renderersFactory,
DefaultBandwidthMeter.getSingletonInstance(context),
)
holderMap = mutableMapOf()
preloadManager = preloadManagerBuilder.build()
val trackSelector = DefaultTrackSelector(context)
trackSelector.init({}, DefaultBandwidthMeter.getSingletonInstance(context))
preloadManager =
DefaultPreloadManager(
DefaultPreloadControl(),
DefaultMediaSourceFactory(context),
trackSelector,
DefaultBandwidthMeter.getSingletonInstance(context),
DefaultRendererCapabilitiesList.Factory(renderersFactory),
loadControl.allocator,
playbackThread.looper,
)
for (i in 0 until MANAGED_ITEM_COUNT) {
addMediaItem(index = i, isAddingToRightEnd = true)
}
preloadManager.invalidate()
}
override fun onDetachedFromRecyclerView(recyclerView: RecyclerView) {
playerPool.destroyPlayers()
preloadManager.release()
holderMap.clear()
super.onDetachedFromRecyclerView(recyclerView)
}
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): ViewPagerMediaHolder {
val view =
LayoutInflater.from(parent.context).inflate(R.layout.media_item_view_pager, parent, false)
@ -137,9 +156,15 @@ class ViewPagerMediaAdapter(
return Int.MAX_VALUE
}
fun onDestroy() {
preloadManager.release()
playerPool.destroyPlayers()
playbackThread.quit()
}
fun onPageSelected(position: Int) {
currentPlayingIndex = position
holderMap[position]?.playIfPossible()
preloadControl.currentPlayingIndex = position
preloadManager.setCurrentPlayingIndex(position)
preloadManager.invalidate()
}
@ -172,14 +197,12 @@ class ViewPagerMediaAdapter(
preloadManager.remove(itemAndIndex.first)
}
inner class DefaultPreloadControl(var currentPlayingIndex: Int = C.INDEX_UNSET) :
TargetPreloadStatusControl<Int> {
inner class DefaultPreloadControl : TargetPreloadStatusControl<Int> {
override fun getTargetPreloadStatus(rankingData: Int): DefaultPreloadManager.Status? {
if (abs(rankingData - currentPlayingIndex) == 2) {
return DefaultPreloadManager.Status(STAGE_LOADED_FOR_DURATION_MS, 500L)
return DefaultPreloadManager.Status(STAGE_LOADED_TO_POSITION_MS, 500L)
} else if (abs(rankingData - currentPlayingIndex) == 1) {
return DefaultPreloadManager.Status(STAGE_LOADED_FOR_DURATION_MS, 1000L)
return DefaultPreloadManager.Status(STAGE_LOADED_TO_POSITION_MS, 1000L)
}
return null
}
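Both variants of `DefaultPreloadControl` above apply the same distance policy, differing only in the preload stage constant: items one position from the current index get a deeper preload target (1000 ms) than items two positions away (500 ms), and everything else returns null (no preload). Reduced to a pure function — a plain-Java sketch, not the Media3 `TargetPreloadStatusControl` API:

```java
/** Distance-based preload policy: returns a target duration in ms, or -1 for "do not preload". */
public class PreloadPolicy {
  public static long targetPreloadMs(int rankingData, int currentPlayingIndex) {
    int distance = Math.abs(rankingData - currentPlayingIndex);
    if (distance == 1) {
      return 1000L; // Immediate neighbors get the deeper preload.
    }
    if (distance == 2) {
      return 500L; // Items two positions away get a shallower preload.
    }
    return -1L; // Everything else is left alone.
  }
}
```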

View file

@ -0,0 +1 @@
../../proguard-rules.txt

View file

@ -42,7 +42,7 @@
android:background="@color/purple_700"
android:gravity="center"
android:hint="@string/num_of_players"
android:inputType="number"
android:inputType="numberDecimal"
android:textColorHint="@color/grey" />
</LinearLayout>

View file

@ -13,7 +13,7 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
<resources>
<resources xmlns:tools="http://schemas.android.com/tools">
<!-- Base application theme. -->
<style name="Theme.Media3Demo" parent="Theme.MaterialComponents.DayNight.DarkActionBar">
<!-- Primary brand color. -->
@ -25,7 +25,9 @@
<item name="colorSecondaryVariant">@color/teal_700</item>
<item name="colorOnSecondary">@color/black</item>
<!-- Status bar color. -->
<item name="android:statusBarColor">?attr/colorPrimaryVariant</item>
<item name="android:statusBarColor" tools:targetApi="l">
?attr/colorPrimaryVariant
</item>
<!-- Customize your theme here. -->
</style>
</resources>

View file

@ -62,8 +62,8 @@ public final class MainActivity extends Activity {
private boolean isOwner;
@Nullable private LegacyPlayerControlView playerControlView;
@Nullable private SurfaceView fullscreenView;
@Nullable private SurfaceView nonFullscreenView;
@Nullable private SurfaceView fullScreenView;
@Nullable private SurfaceView nonFullScreenView;
@Nullable private SurfaceView currentOutputView;
@Nullable private static ExoPlayer player;
@ -75,13 +75,13 @@ public final class MainActivity extends Activity {
super.onCreate(savedInstanceState);
setContentView(R.layout.main_activity);
playerControlView = findViewById(R.id.player_control_view);
fullscreenView = findViewById(R.id.full_screen_view);
fullscreenView.setOnClickListener(
fullScreenView = findViewById(R.id.full_screen_view);
fullScreenView.setOnClickListener(
v -> {
setCurrentOutputView(nonFullscreenView);
Assertions.checkNotNull(fullscreenView).setVisibility(View.GONE);
setCurrentOutputView(nonFullScreenView);
Assertions.checkNotNull(fullScreenView).setVisibility(View.GONE);
});
attachSurfaceListener(fullscreenView);
attachSurfaceListener(fullScreenView);
isOwner = getIntent().getBooleanExtra(OWNER_EXTRA, /* defaultValue= */ true);
GridLayout gridLayout = findViewById(R.id.grid_layout);
for (int i = 0; i < 9; i++) {
@ -97,8 +97,8 @@ public final class MainActivity extends Activity {
button.setText(getString(R.string.full_screen_label));
button.setOnClickListener(
v -> {
setCurrentOutputView(fullscreenView);
Assertions.checkNotNull(fullscreenView).setVisibility(View.VISIBLE);
setCurrentOutputView(fullScreenView);
Assertions.checkNotNull(fullScreenView).setVisibility(View.VISIBLE);
});
} else if (i == 2) {
Button button = new Button(/* context= */ this);
@ -116,10 +116,10 @@ public final class MainActivity extends Activity {
surfaceView.setOnClickListener(
v -> {
setCurrentOutputView(surfaceView);
nonFullscreenView = surfaceView;
nonFullScreenView = surfaceView;
});
if (nonFullscreenView == null) {
nonFullscreenView = surfaceView;
if (nonFullScreenView == null) {
nonFullScreenView = surfaceView;
}
}
gridLayout.addView(view);
@ -144,7 +144,7 @@ public final class MainActivity extends Activity {
initializePlayer();
}
setCurrentOutputView(nonFullscreenView);
setCurrentOutputView(nonFullScreenView);
LegacyPlayerControlView playerControlView = Assertions.checkNotNull(this.playerControlView);
playerControlView.setPlayer(player);

View file

@ -12,8 +12,8 @@ Building the demo app with [MediaPipe][] integration enabled requires some extra
manual steps.
1. Follow the
[instructions](https://ai.google.dev/edge/mediapipe/solutions/guide#get_started)
to get started with MediaPipe.
[instructions](https://google.github.io/mediapipe/getting_started/install.html)
to install MediaPipe.
1. Copy the Transformer demo's build configuration and MediaPipe graph text
protocol buffer under the MediaPipe source tree. This makes it easy to
[build an AAR][] with bazel by reusing MediaPipe's workspace.
@ -62,5 +62,5 @@ manual steps.
app and select a MediaPipe-based effect.
[Transformer]: https://developer.android.com/media/media3/transformer
[MediaPipe]: https://ai.google.dev/edge/mediapipe/solutions/guide
[build an AAR]: https://ai.google.dev/edge/mediapipe/framework/getting_started/android_archive_library
[MediaPipe]: https://google.github.io/mediapipe/
[build an AAR]: https://google.github.io/mediapipe/getting_started/android_archive_library.html

View file

@ -29,8 +29,9 @@ android {
defaultConfig {
versionName project.ext.releaseVersion
versionCode project.ext.releaseVersionCode
minSdkVersion project.ext.minSdkVersion
minSdkVersion 21
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
@ -76,6 +77,7 @@ dependencies {
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.constraintlayout:constraintlayout:' + androidxConstraintLayoutVersion
implementation 'androidx.recyclerview:recyclerview:' + androidxRecyclerViewVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
implementation project(modulePrefix + 'lib-effect')
implementation project(modulePrefix + 'lib-exoplayer')

View file

@ -257,12 +257,13 @@ public final class ConfigurationActivity extends AppCompatActivity {
videoMimeSpinner = findViewById(R.id.video_mime_spinner);
videoMimeSpinner.setAdapter(videoMimeAdapter);
videoMimeAdapter.addAll(
SAME_AS_INPUT_OPTION,
MimeTypes.VIDEO_H263,
MimeTypes.VIDEO_H264,
MimeTypes.VIDEO_H265,
MimeTypes.VIDEO_MP4V,
MimeTypes.VIDEO_AV1);
SAME_AS_INPUT_OPTION, MimeTypes.VIDEO_H263, MimeTypes.VIDEO_H264, MimeTypes.VIDEO_MP4V);
if (SDK_INT >= 24) {
videoMimeAdapter.add(MimeTypes.VIDEO_H265);
}
if (SDK_INT >= 34) {
videoMimeAdapter.add(MimeTypes.VIDEO_AV1);
}
ArrayAdapter<String> resolutionHeightAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
@ -301,18 +302,6 @@ public final class ConfigurationActivity extends AppCompatActivity {
abortSlowExportCheckBox = findViewById(R.id.abort_slow_export_checkbox);
useMedia3Muxer = findViewById(R.id.use_media3_muxer_checkbox);
produceFragmentedMp4CheckBox = findViewById(R.id.produce_fragmented_mp4_checkbox);
useMedia3Muxer.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (!isChecked) {
produceFragmentedMp4CheckBox.setChecked(false);
}
});
produceFragmentedMp4CheckBox.setOnCheckedChangeListener(
(buttonView, isChecked) -> {
if (isChecked) {
useMedia3Muxer.setChecked(true);
}
});
ArrayAdapter<String> hdrModeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
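The checkbox listeners removed in the hunk above encode an invariant between two options: producing a fragmented MP4 requires the Media3 muxer, so checking one forces the other on and unchecking the muxer clears the dependent option. As a plain-Java sketch of that invariant — a hypothetical class, not the demo's actual `ConfigurationActivity` state:

```java
/** Sketch of the two-checkbox invariant: fragmented MP4 implies the Media3 muxer. */
public class MuxerOptions {
  private boolean useMedia3Muxer;
  private boolean produceFragmentedMp4;

  public void setUseMedia3Muxer(boolean checked) {
    useMedia3Muxer = checked;
    if (!checked) {
      produceFragmentedMp4 = false; // Unchecking the muxer clears the dependent option.
    }
  }

  public void setProduceFragmentedMp4(boolean checked) {
    produceFragmentedMp4 = checked;
    if (checked) {
      useMedia3Muxer = true; // Fragmented output forces the Media3 muxer on.
    }
  }

  public boolean isUseMedia3Muxer() {
    return useMedia3Muxer;
  }

  public boolean isProduceFragmentedMp4() {
    return produceFragmentedMp4;
  }
}
```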

View file

@ -20,8 +20,6 @@ import static android.Manifest.permission.READ_MEDIA_VIDEO;
import static androidx.media3.common.util.Assertions.checkNotNull;
import static androidx.media3.common.util.Assertions.checkState;
import static androidx.media3.common.util.Util.SDK_INT;
import static androidx.media3.exoplayer.DefaultLoadControl.DEFAULT_BUFFER_FOR_PLAYBACK_AFTER_REBUFFER_MS;
import static androidx.media3.exoplayer.DefaultLoadControl.DEFAULT_BUFFER_FOR_PLAYBACK_MS;
import static androidx.media3.transformer.Transformer.PROGRESS_STATE_NOT_STARTED;
import android.app.Activity;
@ -80,12 +78,13 @@ import androidx.media3.effect.ScaleAndRotateTransformation;
import androidx.media3.effect.SingleColorLut;
import androidx.media3.effect.TextOverlay;
import androidx.media3.effect.TextureOverlay;
import androidx.media3.exoplayer.DefaultLoadControl;
import androidx.media3.exoplayer.ExoPlayer;
import androidx.media3.exoplayer.audio.SilenceSkippingAudioProcessor;
import androidx.media3.exoplayer.util.DebugTextViewHelper;
import androidx.media3.muxer.Muxer;
import androidx.media3.transformer.Composition;
import androidx.media3.transformer.DefaultEncoderFactory;
import androidx.media3.transformer.DefaultMuxer;
import androidx.media3.transformer.EditedMediaItem;
import androidx.media3.transformer.EditedMediaItemSequence;
import androidx.media3.transformer.Effects;
@ -119,10 +118,6 @@ import org.json.JSONObject;
/** An {@link Activity} that exports and plays media using {@link Transformer}. */
public final class TransformerActivity extends AppCompatActivity {
private static final String TAG = "TransformerActivity";
private static final int IMAGE_DURATION_MS = 5_000;
private static final int IMAGE_FRAME_RATE_FPS = 30;
private static int LOAD_CONTROL_MIN_BUFFER_MS = 5_000;
private static int LOAD_CONTROL_MAX_BUFFER_MS = 5_000;
private Button displayInputButton;
private MaterialCardView inputCardView;
@ -135,7 +130,7 @@ public final class TransformerActivity extends AppCompatActivity {
private TextView informationTextView;
private ViewGroup progressViewGroup;
private LinearProgressIndicator progressIndicator;
private Button pauseButton;
private Button cancelButton;
private Button resumeButton;
private Stopwatch exportStopwatch;
private AspectRatioFrameLayout debugFrame;
@ -162,8 +157,8 @@ public final class TransformerActivity extends AppCompatActivity {
informationTextView = findViewById(R.id.information_text_view);
progressViewGroup = findViewById(R.id.progress_view_group);
progressIndicator = findViewById(R.id.progress_indicator);
pauseButton = findViewById(R.id.pause_button);
pauseButton.setOnClickListener(view -> pauseExport());
cancelButton = findViewById(R.id.cancel_button);
cancelButton.setOnClickListener(view -> cancelExport());
resumeButton = findViewById(R.id.resume_button);
resumeButton.setOnClickListener(view -> startExport());
debugFrame = findViewById(R.id.debug_aspect_ratio_frame_layout);
@ -246,7 +241,7 @@ public final class TransformerActivity extends AppCompatActivity {
debugTextView.setVisibility(View.GONE);
informationTextView.setText(R.string.export_started);
progressViewGroup.setVisibility(View.VISIBLE);
pauseButton.setVisibility(View.VISIBLE);
cancelButton.setVisibility(View.VISIBLE);
resumeButton.setVisibility(View.GONE);
progressIndicator.setProgress(0);
Handler mainHandler = new Handler(getMainLooper());
@ -267,8 +262,7 @@ public final class TransformerActivity extends AppCompatActivity {
}
private MediaItem createMediaItem(@Nullable Bundle bundle, Uri uri) {
MediaItem.Builder mediaItemBuilder =
new MediaItem.Builder().setUri(uri).setImageDurationMs(IMAGE_DURATION_MS);
MediaItem.Builder mediaItemBuilder = new MediaItem.Builder().setUri(uri);
if (bundle != null) {
long trimStartMs =
bundle.getLong(ConfigurationActivity.TRIM_START_MS, /* defaultValue= */ C.TIME_UNSET);
@ -323,13 +317,14 @@ public final class TransformerActivity extends AppCompatActivity {
transformerBuilder.setMaxDelayBetweenMuxerSamplesMs(C.TIME_UNSET);
}
Muxer.Factory muxerFactory = new DefaultMuxer.Factory();
if (bundle.getBoolean(ConfigurationActivity.USE_MEDIA3_MUXER)) {
transformerBuilder.setMuxerFactory(
new InAppMuxer.Factory.Builder()
.setOutputFragmentedMp4(
bundle.getBoolean(ConfigurationActivity.PRODUCE_FRAGMENTED_MP4))
.build());
muxerFactory = new InAppMuxer.Factory.Builder().build();
}
if (bundle.getBoolean(ConfigurationActivity.PRODUCE_FRAGMENTED_MP4)) {
muxerFactory = new InAppMuxer.Factory.Builder().setOutputFragmentedMp4(true).build();
}
transformerBuilder.setMuxerFactory(muxerFactory);
if (bundle.getBoolean(ConfigurationActivity.ENABLE_DEBUG_PREVIEW)) {
transformerBuilder.setDebugViewProvider(new DemoDebugViewProvider());
@ -359,7 +354,7 @@ public final class TransformerActivity extends AppCompatActivity {
private Composition createComposition(MediaItem mediaItem, @Nullable Bundle bundle) {
EditedMediaItem.Builder editedMediaItemBuilder = new EditedMediaItem.Builder(mediaItem);
// For image inputs. Automatically ignored if input is audio/video.
editedMediaItemBuilder.setFrameRate(IMAGE_FRAME_RATE_FPS);
editedMediaItemBuilder.setDurationUs(5_000_000).setFrameRate(30);
if (bundle != null) {
ImmutableList<AudioProcessor> audioProcessors = createAudioProcessorsFromBundle(bundle);
ImmutableList<Effect> videoEffects = createVideoEffectsFromBundle(bundle);
@ -371,8 +366,7 @@ public final class TransformerActivity extends AppCompatActivity {
.setEffects(new Effects(audioProcessors, videoEffects));
}
Composition.Builder compositionBuilder =
new Composition.Builder(
new EditedMediaItemSequence.Builder(editedMediaItemBuilder.build()).build());
new Composition.Builder(new EditedMediaItemSequence(editedMediaItemBuilder.build()));
if (bundle != null) {
compositionBuilder
.setHdrMode(bundle.getInt(ConfigurationActivity.HDR_MODE))
@ -704,17 +698,7 @@ public final class TransformerActivity extends AppCompatActivity {
releasePlayer();
Uri uri = checkNotNull(inputMediaItem.localConfiguration).uri;
ExoPlayer outputPlayer =
new ExoPlayer.Builder(/* context= */ this)
.setLoadControl(
new DefaultLoadControl.Builder()
.setBufferDurationsMs(
LOAD_CONTROL_MIN_BUFFER_MS,
LOAD_CONTROL_MAX_BUFFER_MS,
DEFAULT_BUFFER_FOR_PLAYBACK_MS,
DEFAULT_BUFFER_FOR_PLAYBACK_AFTER_REBUFFER_MS)
.build())
.build();
ExoPlayer outputPlayer = new ExoPlayer.Builder(/* context= */ this).build();
outputPlayerView.setPlayer(outputPlayer);
outputPlayerView.setControllerAutoShow(false);
outputPlayer.setMediaItem(outputMediaItem);
@ -740,17 +724,7 @@ public final class TransformerActivity extends AppCompatActivity {
inputImageView.setVisibility(View.GONE);
inputTextView.setText(getString(R.string.input_video_no_sound));
ExoPlayer inputPlayer =
new ExoPlayer.Builder(/* context= */ this)
.setLoadControl(
new DefaultLoadControl.Builder()
.setBufferDurationsMs(
LOAD_CONTROL_MIN_BUFFER_MS,
LOAD_CONTROL_MAX_BUFFER_MS,
DEFAULT_BUFFER_FOR_PLAYBACK_MS,
DEFAULT_BUFFER_FOR_PLAYBACK_AFTER_REBUFFER_MS)
.build())
.build();
ExoPlayer inputPlayer = new ExoPlayer.Builder(/* context= */ this).build();
inputPlayerView.setPlayer(inputPlayer);
inputPlayerView.setControllerAutoShow(false);
inputPlayerView.setOnClickListener(this::handlePlayerViewClick);
@ -825,11 +799,11 @@ public final class TransformerActivity extends AppCompatActivity {
}
}
private void pauseExport() {
private void cancelExport() {
transformer.cancel();
transformer = null;
exportStopwatch.stop();
pauseButton.setVisibility(View.GONE);
cancelButton.setVisibility(View.GONE);
resumeButton.setVisibility(View.VISIBLE);
if (oldOutputFile != null) {
oldOutputFile.delete();
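The hunk above flattens the nested muxer-factory construction into a sequential selection driven by the two demo checkboxes, where requesting fragmented MP4 implies the media3 (in-app) muxer. A minimal sketch of that decision logic, with strings standing in for the real `Muxer.Factory` instances (the class and key names mirror the diff; everything else here is illustrative, not Media3 API):

```java
public class MuxerSelectionDemo {
  // Mirrors the selection order in the hunk above: the fragmented-MP4 branch
  // runs last, so it wins and implicitly selects the in-app (media3) muxer,
  // matching the checkbox coupling set up in ConfigurationActivity.
  static String selectMuxerFactory(boolean useMedia3Muxer, boolean produceFragmentedMp4) {
    String factory = "DefaultMuxer.Factory";
    if (useMedia3Muxer) {
      factory = "InAppMuxer.Factory(fragmented=false)";
    }
    if (produceFragmentedMp4) {
      factory = "InAppMuxer.Factory(fragmented=true)";
    }
    return factory;
  }

  public static void main(String[] args) {
    System.out.println(selectMuxerFactory(false, false)); // DefaultMuxer.Factory
    System.out.println(selectMuxerFactory(true, false));  // InAppMuxer.Factory(fragmented=false)
    System.out.println(selectMuxerFactory(false, true));  // InAppMuxer.Factory(fragmented=true)
  }
}
```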

View file

@ -49,6 +49,7 @@
android:text="@string/hide_input_video"
android:layout_margin="8dp" />
</LinearLayout>
</com.google.android.material.card.MaterialCardView>
@ -75,23 +76,28 @@
android:padding="8dp"
android:text="@string/input_video_no_sound" />
<FrameLayout
<FrameLayout
android:layout_width="match_parent"
android:layout_height="wrap_content" >
<ImageView
android:id="@+id/input_image_view"
android:layout_width="match_parent"
android:layout_height="wrap_content" >
android:layout_height="wrap_content" />
<ImageView
android:id="@+id/input_image_view"
<androidx.media3.ui.PlayerView
android:id="@+id/input_player_view"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
<androidx.media3.ui.AspectRatioFrameLayout
android:id="@+id/input_debug_aspect_ratio_frame_layout"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
<androidx.media3.ui.PlayerView
android:id="@+id/input_player_view"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
android:layout_height="match_parent" />
</FrameLayout>
</LinearLayout>
</com.google.android.material.card.MaterialCardView>
<com.google.android.material.card.MaterialCardView
@ -154,10 +160,10 @@
android:text="@string/debug_preview" />
<Button
android:id="@+id/pause_button"
android:id="@+id/cancel_button"
android:layout_height="wrap_content"
android:layout_width="match_parent"
android:text="@string/pause"/>
android:text="@string/cancel"/>
<Button
android:id="@+id/resume_button"
@ -181,6 +187,7 @@
</FrameLayout>
</LinearLayout>
</com.google.android.material.card.MaterialCardView>

View file

@ -51,7 +51,7 @@
<item>Tokyo JPG image (portrait, plays for 5 secs at 30 fps)</item>
<item>SEF slow motion with 240 fps</item>
<item>480p DASH (non-square pixels)</item>
<item>HDR (HDR10+) H265 limited range video (encoding may fail)</item>
<item>HDR (HDR10) H265 limited range video (encoding may fail)</item>
<item>HDR (HLG) H265 limited range video (encoding may fail)</item>
<item>720p H264 video with no audio (B-frames)</item>
</string-array>

View file

@ -42,7 +42,7 @@
<string name="no_media_pipe_error" translatable="false">Failed to load MediaPipeShaderProgram. Check the README for instructions.</string>
<string name="export" translatable="false">Export</string>
<string name="debug_preview" translatable="false">Debug preview:</string>
<string name="pause" translatable="false">Pause</string>
<string name="cancel" translatable="false">Cancel</string>
<string name="resume" translatable="false">Resume</string>
<string name="debug_preview_not_available" translatable="false">No debug preview available.</string>
<string name="export_started" translatable="false">Export started</string>

View file

@ -17,17 +17,9 @@ package androidx.media3.cast;
import static androidx.annotation.VisibleForTesting.PROTECTED;
import static androidx.media3.common.util.Assertions.checkArgument;
import static androidx.media3.common.util.Util.SDK_INT;
import static androidx.media3.common.util.Util.castNonNull;
import static java.lang.Math.min;
import android.content.Context;
import android.media.MediaRouter2;
import android.media.MediaRouter2.RouteCallback;
import android.media.MediaRouter2.RoutingController;
import android.media.MediaRouter2.TransferCallback;
import android.media.RouteDiscoveryPreference;
import android.os.Handler;
import android.os.Looper;
import android.view.Surface;
import android.view.SurfaceHolder;
@ -35,7 +27,6 @@ import android.view.SurfaceView;
import android.view.TextureView;
import androidx.annotation.IntRange;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import androidx.annotation.VisibleForTesting;
import androidx.media3.common.AudioAttributes;
import androidx.media3.common.BasePlayer;
@ -92,11 +83,8 @@ import org.checkerframework.checker.nullness.qual.RequiresNonNull;
@UnstableApi
public final class CastPlayer extends BasePlayer {
/**
* A {@link DeviceInfo#PLAYBACK_TYPE_REMOTE remote} {@link DeviceInfo} with a null {@link
* DeviceInfo#routingControllerId}.
*/
public static final DeviceInfo DEVICE_INFO_REMOTE_EMPTY =
/** The {@link DeviceInfo} returned by {@link #getDeviceInfo() this player}. */
public static final DeviceInfo DEVICE_INFO =
new DeviceInfo.Builder(DeviceInfo.PLAYBACK_TYPE_REMOTE).build();
static {
@ -140,7 +128,6 @@ public final class CastPlayer extends BasePlayer {
// TODO: Allow custom implementations of CastTimelineTracker.
private final CastTimelineTracker timelineTracker;
private final Timeline.Period period;
@Nullable private final Api30Impl api30Impl;
// Result callbacks.
private final StatusListener statusListener;
@ -166,7 +153,6 @@ public final class CastPlayer extends BasePlayer {
private long pendingSeekPositionMs;
@Nullable private PositionInfo pendingMediaItemRemovalPosition;
private MediaMetadata mediaMetadata;
private DeviceInfo deviceInfo;
/**
* Creates a new cast player.
@ -216,7 +202,6 @@ public final class CastPlayer extends BasePlayer {
@IntRange(from = 1) long seekBackIncrementMs,
@IntRange(from = 1) long seekForwardIncrementMs) {
this(
/* context= */ null,
castContext,
mediaItemConverter,
seekBackIncrementMs,
@ -227,8 +212,6 @@ public final class CastPlayer extends BasePlayer {
/**
* Creates a new cast player.
*
* @param context A {@link Context} used to populate {@link #getDeviceInfo()}. If null, {@link
* #getDeviceInfo()} will always return {@link #DEVICE_INFO_REMOTE_EMPTY}.
* @param castContext The context from which the cast session is obtained.
* @param mediaItemConverter The {@link MediaItemConverter} to use.
* @param seekBackIncrementMs The {@link #seekBack()} increment, in milliseconds.
@ -240,7 +223,6 @@ public final class CastPlayer extends BasePlayer {
* negative.
*/
public CastPlayer(
@Nullable Context context,
CastContext castContext,
MediaItemConverter mediaItemConverter,
@IntRange(from = 1) long seekBackIncrementMs,
@ -278,14 +260,6 @@ public final class CastPlayer extends BasePlayer {
CastSession session = sessionManager.getCurrentCastSession();
setRemoteMediaClient(session != null ? session.getRemoteMediaClient() : null);
updateInternalStateAndNotifyIfChanged();
if (SDK_INT >= 30 && context != null) {
api30Impl = new Api30Impl(context);
api30Impl.initialize();
deviceInfo = api30Impl.fetchDeviceInfo();
} else {
api30Impl = null;
deviceInfo = DEVICE_INFO_REMOTE_EMPTY;
}
}
/**
@ -556,10 +530,6 @@ public final class CastPlayer extends BasePlayer {
@Override
public void release() {
// The SDK_INT check is not necessary, but it prevents a lint error for the release call.
if (SDK_INT >= 30 && api30Impl != null) {
api30Impl.release();
}
SessionManager sessionManager = castContext.getSessionManager();
sessionManager.removeSessionManagerListener(statusListener, CastSession.class);
sessionManager.endCurrentSession(false);
@ -812,14 +782,10 @@ public final class CastPlayer extends BasePlayer {
return CueGroup.EMPTY_TIME_ZERO;
}
/**
* Returns a {@link DeviceInfo} describing the receiver device. Returns {@link
* #DEVICE_INFO_REMOTE_EMPTY} if no {@link Context} was provided at construction, or if the Cast
* {@link RoutingController} could not be identified.
*/
/** This method always returns {@link CastPlayer#DEVICE_INFO}. */
@Override
public DeviceInfo getDeviceInfo() {
return deviceInfo;
return DEVICE_INFO;
}
/** This method is not supported and always returns {@code 0}. */
@ -1317,8 +1283,11 @@ public final class CastPlayer extends BasePlayer {
remoteMediaClient.registerCallback(statusListener);
remoteMediaClient.addProgressListener(statusListener, PROGRESS_REPORT_PERIOD_MS);
updateInternalStateAndNotifyIfChanged();
} else if (sessionAvailabilityListener != null) {
sessionAvailabilityListener.onCastSessionUnavailable();
} else {
updateTimelineAndNotifyIfChanged();
if (sessionAvailabilityListener != null) {
sessionAvailabilityListener.onCastSessionUnavailable();
}
}
}
@ -1568,105 +1537,4 @@ public final class CastPlayer extends BasePlayer {
return pendingResultCallback == resultCallback;
}
}
@RequiresApi(30)
private final class Api30Impl {
private final MediaRouter2 mediaRouter2;
private final TransferCallback transferCallback;
private final RouteCallback emptyRouteCallback;
private final Handler handler;
public Api30Impl(Context context) {
mediaRouter2 = MediaRouter2.getInstance(context);
transferCallback = new MediaRouter2TransferCallbackImpl();
emptyRouteCallback = new MediaRouter2RouteCallbackImpl();
handler = new Handler(Looper.getMainLooper());
}
/** Acquires necessary resources and registers callbacks. */
public void initialize() {
mediaRouter2.registerTransferCallback(handler::post, transferCallback);
// We need at least one route callback registered in order to get transfer callback updates.
mediaRouter2.registerRouteCallback(
handler::post,
emptyRouteCallback,
new RouteDiscoveryPreference.Builder(ImmutableList.of(), /* activeScan= */ false)
.build());
}
/**
* Releases any resources acquired in {@link #initialize()} and unregisters any registered
* callbacks.
*/
public void release() {
mediaRouter2.unregisterTransferCallback(transferCallback);
mediaRouter2.unregisterRouteCallback(emptyRouteCallback);
handler.removeCallbacksAndMessages(/* token= */ null);
}
/** Updates the device info with an up-to-date value and notifies the listeners. */
private void updateDeviceInfo() {
DeviceInfo oldDeviceInfo = deviceInfo;
DeviceInfo newDeviceInfo = fetchDeviceInfo();
deviceInfo = newDeviceInfo;
if (!deviceInfo.equals(oldDeviceInfo)) {
listeners.sendEvent(
EVENT_DEVICE_INFO_CHANGED, listener -> listener.onDeviceInfoChanged(newDeviceInfo));
}
}
/**
* Returns a {@link DeviceInfo} with the {@link RoutingController#getId() id} that corresponds
* to the Cast session, or {@link #DEVICE_INFO_REMOTE_EMPTY} if not available.
*/
public DeviceInfo fetchDeviceInfo() {
// TODO: b/364833997 - Fetch this information from the AndroidX MediaRouter selected route
// once the selected route id matches the controller id.
List<RoutingController> controllers = mediaRouter2.getControllers();
// The controller at position zero is always the system controller (local playback). All other
// controllers are for remote playback, and could be the Cast one.
if (controllers.size() != 2) {
// There's either no remote routing controller, or there's more than one. In either case we
// don't populate the device info because either there's no Cast routing controller, or we
// cannot safely identify the Cast routing controller.
return DEVICE_INFO_REMOTE_EMPTY;
} else {
// There's only one remote routing controller. It's safe to assume it's the Cast routing
// controller.
RoutingController remoteController = controllers.get(1);
// TODO b/364580007 - Populate volume information, and implement Player volume-related
// methods.
return new DeviceInfo.Builder(DeviceInfo.PLAYBACK_TYPE_REMOTE)
.setRoutingControllerId(remoteController.getId())
.build();
}
}
/**
* Empty {@link RouteCallback} implementation necessary for registering the {@link MediaRouter2}
* instance with the system_server.
*
* <p>This callback must be registered so that the media router service notifies the {@link
* MediaRouter2TransferCallbackImpl} of transfer events.
*/
private final class MediaRouter2RouteCallbackImpl extends RouteCallback {}
/**
* {@link TransferCallback} implementation to listen for {@link RoutingController} creation and
* releases.
*/
private final class MediaRouter2TransferCallbackImpl extends TransferCallback {
@Override
public void onTransfer(RoutingController oldController, RoutingController newController) {
updateDeviceInfo();
}
@Override
public void onStop(RoutingController controller) {
updateDeviceInfo();
}
}
}
}
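The `fetchDeviceInfo` logic above rests on one invariant: `MediaRouter2` reports the system (local playback) controller at index zero, so the Cast controller can only be identified safely when exactly one remote controller exists. A standalone sketch of that selection, using a `List<String>` of controller ids as a hypothetical stand-in for `MediaRouter2.getControllers()`:

```java
import java.util.List;

public class RoutingControllerDemo {
  /**
   * Returns the id of the single remote routing controller, or null when it
   * cannot be identified safely. Index 0 is always the system controller,
   * mirroring the comment in CastPlayer.Api30Impl.fetchDeviceInfo above.
   */
  static String remoteControllerId(List<String> controllerIds) {
    if (controllerIds.size() != 2) {
      // Either no remote controller, or more than one: we cannot tell
      // which of them (if any) is the Cast routing controller.
      return null;
    }
    return controllerIds.get(1);
  }

  public static void main(String[] args) {
    System.out.println(remoteControllerId(List.of("system")));            // null
    System.out.println(remoteControllerId(List.of("system", "cast")));    // cast
    System.out.println(remoteControllerId(List.of("system", "a", "b"))); // null
  }
}
```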

View file

@ -1902,7 +1902,7 @@ public class CastPlayerTest {
public void getDeviceInfo_returnsCorrectDeviceInfoWithPlaybackTypeRemote() {
DeviceInfo deviceInfo = castPlayer.getDeviceInfo();
assertThat(deviceInfo).isEqualTo(CastPlayer.DEVICE_INFO_REMOTE_EMPTY);
assertThat(deviceInfo).isEqualTo(CastPlayer.DEVICE_INFO);
assertThat(deviceInfo.playbackType).isEqualTo(DeviceInfo.PLAYBACK_TYPE_REMOTE);
}

View file

@ -65,11 +65,6 @@ dependencies {
}
api 'androidx.annotation:annotation-experimental:' + androidxAnnotationExperimentalVersion
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
// Workaround for 'duplicate class' error caused by incomplete version
// metadata in Kotlin std lib (https://issuetracker.google.com/278545487).
// This can be removed when one of the other deps here (probably
// androidx.annotation) depends on kotlin-stdlib:1.9.20.
implementation platform('org.jetbrains.kotlin:kotlin-bom:1.8.0')
compileOnly 'com.google.code.findbugs:jsr305:' + jsr305Version
compileOnly 'com.google.errorprone:error_prone_annotations:' + errorProneVersion
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion

View file

@ -16,6 +16,7 @@
package androidx.media3.common;
import android.os.Bundle;
import androidx.annotation.DoNotInline;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import androidx.media3.common.util.UnstableApi;
@ -36,6 +37,7 @@ import com.google.errorprone.annotations.CanIgnoreReturnValue;
public final class AudioAttributes {
/** A direct wrapper around {@link android.media.AudioAttributes}. */
@RequiresApi(21)
public static final class AudioAttributesV21 {
public final android.media.AudioAttributes audioAttributes;
@ -163,6 +165,7 @@ public final class AudioAttributes {
* <p>Some fields are ignored if the corresponding {@link android.media.AudioAttributes.Builder}
* setter is not available on the current API level.
*/
@RequiresApi(21)
public AudioAttributesV21 getAudioAttributesV21() {
if (audioAttributesV21 == null) {
audioAttributesV21 = new AudioAttributesV21(this);
@ -239,6 +242,7 @@ public final class AudioAttributes {
@RequiresApi(29)
private static final class Api29 {
@DoNotInline
public static void setAllowedCapturePolicy(
android.media.AudioAttributes.Builder builder,
@C.AudioAllowedCapturePolicy int allowedCapturePolicy) {
@ -248,6 +252,7 @@ public final class AudioAttributes {
@RequiresApi(32)
private static final class Api32 {
@DoNotInline
public static void setSpatializationBehavior(
android.media.AudioAttributes.Builder builder,
@C.SpatializationBehavior int spatializationBehavior) {

View file

@ -147,11 +147,38 @@ public abstract class BasePlayer implements Player {
seekToOffset(getSeekForwardIncrement(), Player.COMMAND_SEEK_FORWARD);
}
/**
* @deprecated Use {@link #hasPreviousMediaItem()} instead.
*/
@Deprecated
@Override
public final boolean hasPrevious() {
return hasPreviousMediaItem();
}
/**
* @deprecated Use {@link #hasPreviousMediaItem()} instead.
*/
@Deprecated
@Override
public final boolean hasPreviousWindow() {
return hasPreviousMediaItem();
}
@Override
public final boolean hasPreviousMediaItem() {
return getPreviousMediaItemIndex() != C.INDEX_UNSET;
}
/**
* @deprecated Use {@link #seekToPreviousMediaItem()} instead.
*/
@Deprecated
@Override
public final void previous() {
seekToPreviousMediaItem();
}
/**
* @deprecated Use {@link #seekToPreviousMediaItem()} instead.
*/
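The methods restored above follow a deprecated-delegate pattern: the old navigation methods stay as `final` wrappers around their replacements, so subclasses only ever override the one canonical method. A compilable sketch of the same pattern (the class and constant here are stand-ins; `C.INDEX_UNSET` is the sentinel the real `hasPreviousMediaItem` compares against):

```java
public class DeprecatedDelegateDemo {
  static final int INDEX_UNSET = -1; // mirrors C.INDEX_UNSET

  public abstract static class BasePlayerSketch {
    public abstract int getPreviousMediaItemIndex();

    /** @deprecated Use {@link #hasPreviousMediaItem()} instead. */
    @Deprecated
    public final boolean hasPrevious() {
      // Final delegate: old entry point, single source of truth below.
      return hasPreviousMediaItem();
    }

    public final boolean hasPreviousMediaItem() {
      return getPreviousMediaItemIndex() != INDEX_UNSET;
    }
  }

  public static void main(String[] args) {
    BasePlayerSketch atStart =
        new BasePlayerSketch() {
          @Override
          public int getPreviousMediaItemIndex() { return INDEX_UNSET; }
        };
    BasePlayerSketch midPlaylist =
        new BasePlayerSketch() {
          @Override
          public int getPreviousMediaItemIndex() { return 3; }
        };
    System.out.println(atStart.hasPreviousMediaItem());     // false
    System.out.println(midPlaylist.hasPreviousMediaItem()); // true
  }
}
```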

View file

@ -32,6 +32,7 @@ import android.media.MediaFormat;
import android.net.Uri;
import android.view.Surface;
import androidx.annotation.IntDef;
import androidx.annotation.RequiresApi;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import com.google.errorprone.annotations.InlineMe;
@ -619,7 +620,6 @@ public final class C {
* <ul>
* <li>{@link #BUFFER_FLAG_KEY_FRAME}
* <li>{@link #BUFFER_FLAG_END_OF_STREAM}
* <li>{@link #BUFFER_FLAG_NOT_DEPENDED_ON}
* <li>{@link #BUFFER_FLAG_FIRST_SAMPLE}
* <li>{@link #BUFFER_FLAG_LAST_SAMPLE}
* <li>{@link #BUFFER_FLAG_ENCRYPTED}
@ -634,7 +634,6 @@ public final class C {
value = {
BUFFER_FLAG_KEY_FRAME,
BUFFER_FLAG_END_OF_STREAM,
BUFFER_FLAG_NOT_DEPENDED_ON,
BUFFER_FLAG_FIRST_SAMPLE,
BUFFER_FLAG_HAS_SUPPLEMENTAL_DATA,
BUFFER_FLAG_LAST_SAMPLE,
@ -649,9 +648,6 @@ public final class C {
@UnstableApi
public static final int BUFFER_FLAG_END_OF_STREAM = MediaCodec.BUFFER_FLAG_END_OF_STREAM;
/** Indicates that no other buffers depend on the data in this buffer. */
@UnstableApi public static final int BUFFER_FLAG_NOT_DEPENDED_ON = 1 << 26; // 0x04000000
/** Indicates that a buffer is known to contain the first media sample of the stream. */
@UnstableApi public static final int BUFFER_FLAG_FIRST_SAMPLE = 1 << 27; // 0x08000000
@ -1096,8 +1092,7 @@ public final class C {
/**
* The stereo mode for 360/3D/VR videos. One of {@link Format#NO_VALUE}, {@link
* #STEREO_MODE_MONO}, {@link #STEREO_MODE_TOP_BOTTOM}, {@link #STEREO_MODE_LEFT_RIGHT} or {@link
* #STEREO_MODE_STEREO_MESH}, {@link #STEREO_MODE_INTERLEAVED_LEFT_PRIMARY}, {@link
* #STEREO_MODE_INTERLEAVED_RIGHT_PRIMARY}.
* #STEREO_MODE_STEREO_MESH}.
*/
@UnstableApi
@Documented
@ -1108,9 +1103,7 @@ public final class C {
STEREO_MODE_MONO,
STEREO_MODE_TOP_BOTTOM,
STEREO_MODE_LEFT_RIGHT,
STEREO_MODE_STEREO_MESH,
STEREO_MODE_INTERLEAVED_LEFT_PRIMARY,
STEREO_MODE_INTERLEAVED_RIGHT_PRIMARY
STEREO_MODE_STEREO_MESH
})
public @interface StereoMode {}
@ -1129,18 +1122,6 @@ public final class C {
*/
@UnstableApi public static final int STEREO_MODE_STEREO_MESH = 3;
/**
* Indicates interleaved stereo layout with the left view being the primary view, used with
* 360/3D/VR videos.
*/
@UnstableApi public static final int STEREO_MODE_INTERLEAVED_LEFT_PRIMARY = 4;
/**
* Indicates interleaved stereo layout with the right view being the primary view, used with
* 360/3D/VR videos.
*/
@UnstableApi public static final int STEREO_MODE_INTERLEAVED_RIGHT_PRIMARY = 5;
// LINT.IfChange(color_space)
/**
* Video color spaces, also referred to as color standards. One of {@link Format#NO_VALUE}, {@link
@ -1447,8 +1428,7 @@ public final class C {
ROLE_FLAG_ENHANCED_DIALOG_INTELLIGIBILITY,
ROLE_FLAG_TRANSCRIBES_DIALOG,
ROLE_FLAG_EASY_TO_READ,
ROLE_FLAG_TRICK_PLAY,
ROLE_FLAG_AUXILIARY
ROLE_FLAG_TRICK_PLAY
})
public @interface RoleFlags {}
@ -1513,58 +1493,6 @@ public final class C {
/** Indicates the track is intended for trick play. */
public static final int ROLE_FLAG_TRICK_PLAY = 1 << 14;
/**
* Indicates an auxiliary track. An auxiliary track provides additional information about other
* tracks and is generally not meant for stand-alone playback, but rather for further processing
* in conjunction with other tracks (for example, a track with depth information).
*/
public static final int ROLE_FLAG_AUXILIARY = 1 << 15;
/**
* {@linkplain #ROLE_FLAG_AUXILIARY Auxiliary track types}. One of {@link
* #AUXILIARY_TRACK_TYPE_UNDEFINED}, {@link #AUXILIARY_TRACK_TYPE_ORIGINAL}, {@link
* #AUXILIARY_TRACK_TYPE_DEPTH_LINEAR}, {@link #AUXILIARY_TRACK_TYPE_DEPTH_INVERSE}, {@link
* #AUXILIARY_TRACK_TYPE_DEPTH_METADATA}.
*/
@UnstableApi
@Documented
@Retention(RetentionPolicy.SOURCE)
@Target({FIELD, METHOD, PARAMETER, LOCAL_VARIABLE, TYPE_USE})
@IntDef({
AUXILIARY_TRACK_TYPE_UNDEFINED,
AUXILIARY_TRACK_TYPE_ORIGINAL,
AUXILIARY_TRACK_TYPE_DEPTH_LINEAR,
AUXILIARY_TRACK_TYPE_DEPTH_INVERSE,
AUXILIARY_TRACK_TYPE_DEPTH_METADATA
})
public @interface AuxiliaryTrackType {}
// LINT.IfChange(auxiliary_track_type)
/** Not an auxiliary track or an auxiliary track with an undefined type. */
@UnstableApi public static final int AUXILIARY_TRACK_TYPE_UNDEFINED = 0;
/** The original video track without any depth based effects applied. */
@UnstableApi public static final int AUXILIARY_TRACK_TYPE_ORIGINAL = 1;
/**
* A linear encoded depth video track.
*
* <p>See https://developer.android.com/static/media/camera/camera2/Dynamic-depth-v1.0.pdf for
* linear depth encoding.
*/
@UnstableApi public static final int AUXILIARY_TRACK_TYPE_DEPTH_LINEAR = 2;
/**
* An inverse encoded depth video track.
*
* <p>See https://developer.android.com/static/media/camera/camera2/Dynamic-depth-v1.0.pdf for
* inverse depth encoding.
*/
@UnstableApi public static final int AUXILIARY_TRACK_TYPE_DEPTH_INVERSE = 3;
/** A timed-metadata track for a depth video track. */
@UnstableApi public static final int AUXILIARY_TRACK_TYPE_DEPTH_METADATA = 4;
/**
* Level of support for a format. One of {@link #FORMAT_HANDLED}, {@link
* #FORMAT_EXCEEDS_CAPABILITIES}, {@link #FORMAT_UNSUPPORTED_DRM}, {@link
@ -1698,6 +1626,7 @@ public final class C {
replacement = "Util.generateAudioSessionIdV21(context)",
imports = {"androidx.media3.common.util.Util"})
@Deprecated
@RequiresApi(21)
public static int generateAudioSessionIdV21(Context context) {
return Util.generateAudioSessionIdV21(context);
}
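The buffer flags in the hunks above are single-bit masks (the removed `BUFFER_FLAG_NOT_DEPENDED_ON` is `1 << 26`, i.e. `0x04000000`), so they combine with bitwise OR and are tested with AND. A self-contained sketch using only the two values visible in the diff (the flag names and values are copied from the hunk; the rest is illustrative):

```java
public class BufferFlagsDemo {
  // Values as declared in C.java in the hunk above.
  static final int BUFFER_FLAG_NOT_DEPENDED_ON = 1 << 26; // 0x04000000
  static final int BUFFER_FLAG_FIRST_SAMPLE = 1 << 27;    // 0x08000000

  public static void main(String[] args) {
    int flags = 0;
    flags |= BUFFER_FLAG_FIRST_SAMPLE;                          // set a flag
    boolean isFirst = (flags & BUFFER_FLAG_FIRST_SAMPLE) != 0;  // test a flag
    boolean notDependedOn = (flags & BUFFER_FLAG_NOT_DEPENDED_ON) != 0;
    System.out.println(Integer.toHexString(flags)); // 8000000
    System.out.println(isFirst);                    // true
    System.out.println(notDependedOn);              // false
    flags &= ~BUFFER_FLAG_FIRST_SAMPLE;             // clear it again
    System.out.println(flags);                      // 0
  }
}
```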

View file

@ -203,7 +203,7 @@ public final class ColorInfo {
/**
* Returns the {@link C.ColorSpace} corresponding to the given ISO color primary code, as per
* table A.7.21.1 in Rec. ITU-T T.832 (06/2019), or {@link Format#NO_VALUE} if no mapping can be
* table A.7.21.1 in Rec. ITU-T T.832 (03/2009), or {@link Format#NO_VALUE} if no mapping can be
* made.
*/
@Pure
@ -219,52 +219,13 @@ public final class ColorInfo {
case 9:
return C.COLOR_SPACE_BT2020;
default:
// Remaining color primaries are either reserved or unspecified.
return Format.NO_VALUE;
}
}
/**
* Returns the ISO color primary code corresponding to the given {@link C.ColorSpace}, as per
* table A.7.21.1 in Rec. ITU-T T.832 (06/2019).
*/
public static int colorSpaceToIsoColorPrimaries(@C.ColorSpace int colorSpace) {
switch (colorSpace) {
// Default to BT.709 SDR as per the <a
// href="https://www.webmproject.org/vp9/mp4/#optional-fields">recommendation</a>.
case Format.NO_VALUE:
case C.COLOR_SPACE_BT709:
return 1;
case C.COLOR_SPACE_BT601:
return 5;
case C.COLOR_SPACE_BT2020:
return 9;
}
return 1;
}
/**
* Returns the ISO matrix coefficients code corresponding to the given {@link C.ColorSpace}, as
* per table A.7.21.3 in Rec. ITU-T T.832 (06/2019).
*/
public static int colorSpaceToIsoMatrixCoefficients(@C.ColorSpace int colorSpace) {
switch (colorSpace) {
// Default to BT.709 SDR as per the <a
// href="https://www.webmproject.org/vp9/mp4/#optional-fields">recommendation</a>.
case Format.NO_VALUE:
case C.COLOR_SPACE_BT709:
return 1;
case C.COLOR_SPACE_BT601:
return 6;
case C.COLOR_SPACE_BT2020:
return 9;
}
return 1;
}
/**
* Returns the {@link C.ColorTransfer} corresponding to the given ISO transfer characteristics
* code, as per table A.7.21.2 in Rec. ITU-T T.832 (06/2019), or {@link Format#NO_VALUE} if no
* code, as per table A.7.21.2 in Rec. ITU-T T.832 (03/2009), or {@link Format#NO_VALUE} if no
* mapping can be made.
*/
@Pure
@ -288,31 +249,6 @@ public final class ColorInfo {
}
}
/**
* Returns the ISO transfer characteristics code corresponding to the given {@link
* C.ColorTransfer}, as per table A.7.21.2 in Rec. ITU-T T.832 (06/2019).
*/
public static int colorTransferToIsoTransferCharacteristics(@C.ColorTransfer int colorTransfer) {
switch (colorTransfer) {
// Default to BT.709 SDR as per the <a
// href="https://www.webmproject.org/vp9/mp4/#optional-fields">recommendation</a>.
case C.COLOR_TRANSFER_LINEAR:
return 8;
case C.COLOR_TRANSFER_SRGB:
return 13;
case Format.NO_VALUE:
case C.COLOR_TRANSFER_SDR:
return 1;
case C.COLOR_TRANSFER_ST2084:
return 16;
case C.COLOR_TRANSFER_HLG:
return 18;
case C.COLOR_TRANSFER_GAMMA_2_2:
return 4;
}
return 1;
}
/**
* Returns whether the {@code ColorInfo} uses an HDR {@link C.ColorTransfer}.
*
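The removed `colorSpaceToIsoColorPrimaries` maps `C.ColorSpace` constants onto the code points of table A.7.21.1 (1 = BT.709, 5 = BT.601, 9 = BT.2020), defaulting to BT.709 SDR. A standalone sketch of that table and its inverse, covering only the code points visible in the diff (the local constants stand in for `C.COLOR_SPACE_*` and `Format.NO_VALUE`; their numeric values here are illustrative, not the real Media3 values):

```java
public class IsoPrimariesDemo {
  // Stand-ins for Format.NO_VALUE and C.COLOR_SPACE_*; illustrative values only.
  static final int NO_VALUE = -1;
  static final int COLOR_SPACE_BT709 = 1;
  static final int COLOR_SPACE_BT601 = 2;
  static final int COLOR_SPACE_BT2020 = 6;

  /** Table A.7.21.1 mapping, as in the removed colorSpaceToIsoColorPrimaries. */
  static int colorSpaceToIsoColorPrimaries(int colorSpace) {
    switch (colorSpace) {
      case COLOR_SPACE_BT601:
        return 5;
      case COLOR_SPACE_BT2020:
        return 9;
      case NO_VALUE: // Default to BT.709 SDR, per the WebM project recommendation.
      case COLOR_SPACE_BT709:
      default:
        return 1;
    }
  }

  /** Inverse direction, restricted to the code points shown in the hunk. */
  static int isoColorPrimariesToColorSpace(int isoColorPrimaries) {
    switch (isoColorPrimaries) {
      case 1:
        return COLOR_SPACE_BT709;
      case 5:
        return COLOR_SPACE_BT601;
      case 9:
        return COLOR_SPACE_BT2020;
      default:
        return NO_VALUE; // Remaining code points are reserved or unspecified.
    }
  }

  public static void main(String[] args) {
    // Round trip: BT.601 -> ISO code 5 -> BT.601.
    System.out.println(colorSpaceToIsoColorPrimaries(COLOR_SPACE_BT601));        // 5
    System.out.println(isoColorPrimariesToColorSpace(5) == COLOR_SPACE_BT601);   // true
  }
}
```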

View file

@ -16,7 +16,6 @@
package androidx.media3.common;
import static androidx.media3.common.util.Assertions.checkState;
import static com.google.common.math.DoubleMath.fuzzyEquals;
import static java.lang.annotation.ElementType.TYPE_USE;
import android.os.Bundle;
@ -28,7 +27,6 @@ import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import com.google.common.base.Joiner;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.Lists;
import com.google.errorprone.annotations.CanIgnoreReturnValue;
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
@ -148,7 +146,6 @@ public final class Format {
@Nullable private String language;
private @C.SelectionFlags int selectionFlags;
private @C.RoleFlags int roleFlags;
private @C.AuxiliaryTrackType int auxiliaryTrackType;
private int averageBitrate;
private int peakBitrate;
@Nullable private String codecs;
@ -167,7 +164,6 @@ public final class Format {
@Nullable private List<byte[]> initializationData;
@Nullable private DrmInitData drmInitData;
private long subsampleOffsetUs;
private boolean hasPrerollSamples;
// Video specific.
@ -229,7 +225,6 @@ public final class Format {
tileCountVertical = NO_VALUE;
// Provided by the source.
cryptoType = C.CRYPTO_TYPE_NONE;
auxiliaryTrackType = C.AUXILIARY_TRACK_TYPE_UNDEFINED;
}
/**
@ -258,7 +253,6 @@ public final class Format {
this.initializationData = format.initializationData;
this.drmInitData = format.drmInitData;
this.subsampleOffsetUs = format.subsampleOffsetUs;
this.hasPrerollSamples = format.hasPrerollSamples;
// Video specific.
this.width = format.width;
this.height = format.height;
@ -366,9 +360,6 @@ public final class Format {
/**
* Sets {@link Format#roleFlags}. The default value is 0.
*
* <p>When {@code roleFlags} includes {@link C#ROLE_FLAG_AUXILIARY}, then the specific {@link
* C.AuxiliaryTrackType} can also be {@linkplain #setAuxiliaryTrackType(int) set}.
*
* @param roleFlags The {@link Format#roleFlags}.
* @return The builder.
*/
@ -378,22 +369,6 @@ public final class Format {
return this;
}
/**
* Sets {@link Format#auxiliaryTrackType}. The default value is {@link
* C#AUXILIARY_TRACK_TYPE_UNDEFINED}.
*
* <p>This must be set to a value other than {@link C#AUXILIARY_TRACK_TYPE_UNDEFINED} only when
* {@linkplain #setRoleFlags(int) role flags} contains {@link C#ROLE_FLAG_AUXILIARY}.
*
* @param auxiliaryTrackType The {@link Format#auxiliaryTrackType}.
* @return The builder.
*/
@CanIgnoreReturnValue
public Builder setAuxiliaryTrackType(@C.AuxiliaryTrackType int auxiliaryTrackType) {
this.auxiliaryTrackType = auxiliaryTrackType;
return this;
}
/**
* Sets {@link Format#averageBitrate}. The default value is {@link #NO_VALUE}.
*
@@ -546,18 +521,6 @@ public final class Format {
return this;
}
/**
* Sets {@link Format#hasPrerollSamples}. The default value is {@code false}.
*
* @param hasPrerollSamples The {@link Format#hasPrerollSamples}.
* @return The builder.
*/
@CanIgnoreReturnValue
public Builder setHasPrerollSamples(boolean hasPrerollSamples) {
this.hasPrerollSamples = hasPrerollSamples;
return this;
}
// Video specific.
/**
@@ -750,7 +713,7 @@ public final class Format {
/**
* Sets {@link Format#tileCountHorizontal}. The default value is {@link #NO_VALUE}.
*
* @param tileCountHorizontal The {@link Format#tileCountHorizontal}.
* @param tileCountHorizontal The {@link Format#accessibilityChannel}.
* @return The builder.
*/
@CanIgnoreReturnValue
@@ -762,7 +725,7 @@ public final class Format {
/**
* Sets {@link Format#tileCountVertical}. The default value is {@link #NO_VALUE}.
*
* @param tileCountVertical The {@link Format#tileCountVertical}.
* @param tileCountVertical The {@link Format#accessibilityChannel}.
* @return The builder.
*/
@CanIgnoreReturnValue
@@ -861,9 +824,6 @@ public final class Format {
/** Track role flags. */
public final @C.RoleFlags int roleFlags;
/** The auxiliary track type. */
@UnstableApi public final @C.AuxiliaryTrackType int auxiliaryTrackType;
/**
* The average bitrate in bits per second, or {@link #NO_VALUE} if unknown or not applicable. The
* way in which this field is populated depends on the type of media to which the format
@@ -967,15 +927,6 @@ public final class Format {
*/
@UnstableApi public final long subsampleOffsetUs;
/**
* Indicates whether the stream contains preroll samples.
*
* <p>When this field is set to {@code true}, it means that the stream includes decode-only
* samples that occur before the intended playback start position. These samples are necessary for
* decoding but are not meant to be rendered and should be skipped after decoding.
*/
@UnstableApi public final boolean hasPrerollSamples;
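The preroll contract documented above — decode every sample, but render only from the intended start position onwards — can be sketched independently of the renderer. This is a simplified model, not Media3 API; `PrerollFilter` and the timestamp values are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal sketch (not Media3 API): decides which decoded samples should be
 * rendered when a stream contains preroll samples before the start position.
 */
public class PrerollFilter {

  /** Returns the subset of decoded sample timestamps that should be rendered. */
  public static List<Long> renderableSamples(
      List<Long> decodedSampleTimesUs, long startPositionUs) {
    List<Long> renderable = new ArrayList<>();
    for (long timeUs : decodedSampleTimesUs) {
      // Preroll samples (before the start position) were needed for decoding
      // but are skipped at render time.
      if (timeUs >= startPositionUs) {
        renderable.add(timeUs);
      }
    }
    return renderable;
  }

  public static void main(String[] args) {
    List<Long> decoded = List.of(0L, 20_000L, 40_000L, 60_000L);
    // With a start position of 40ms, the first two samples are preroll.
    System.out.println(renderableSamples(decoded, 40_000L)); // [40000, 60000]
  }
}
```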
// Video specific.
/** The width of the video in pixels, or {@link #NO_VALUE} if unknown or not applicable. */
@@ -1092,14 +1043,7 @@ public final class Format {
label = builder.label;
}
selectionFlags = builder.selectionFlags;
checkState(
builder.auxiliaryTrackType == C.AUXILIARY_TRACK_TYPE_UNDEFINED
|| (builder.roleFlags & C.ROLE_FLAG_AUXILIARY) != 0,
"Auxiliary track type must only be set to a value other than AUXILIARY_TRACK_TYPE_UNDEFINED"
+ " when ROLE_FLAG_AUXILIARY is set");
roleFlags = builder.roleFlags;
auxiliaryTrackType = builder.auxiliaryTrackType;
averageBitrate = builder.averageBitrate;
peakBitrate = builder.peakBitrate;
bitrate = peakBitrate != NO_VALUE ? peakBitrate : averageBitrate;
@@ -1116,7 +1060,6 @@ public final class Format {
builder.initializationData == null ? Collections.emptyList() : builder.initializationData;
drmInitData = builder.drmInitData;
subsampleOffsetUs = builder.subsampleOffsetUs;
hasPrerollSamples = builder.hasPrerollSamples;
// Video specific.
width = builder.width;
height = builder.height;
@@ -1286,7 +1229,6 @@ public final class Format {
result = 31 * result + (language == null ? 0 : language.hashCode());
result = 31 * result + selectionFlags;
result = 31 * result + roleFlags;
result = 31 * result + auxiliaryTrackType;
result = 31 * result + averageBitrate;
result = 31 * result + peakBitrate;
result = 31 * result + (codecs == null ? 0 : codecs.hashCode());
@@ -1342,7 +1284,6 @@ public final class Format {
// Field equality checks ordered by type, with the cheapest checks first.
return selectionFlags == other.selectionFlags
&& roleFlags == other.roleFlags
&& auxiliaryTrackType == other.auxiliaryTrackType
&& averageBitrate == other.averageBitrate
&& peakBitrate == other.peakBitrate
&& maxInputSize == other.maxInputSize
@@ -1406,7 +1347,6 @@ public final class Format {
if (format == null) {
return "null";
}
Joiner commaJoiner = Joiner.on(',');
StringBuilder builder = new StringBuilder();
builder.append("id=").append(format.id).append(", mimeType=").append(format.sampleMimeType);
if (format.containerMimeType != null) {
@@ -1437,15 +1377,12 @@ public final class Format {
}
}
builder.append(", drm=[");
commaJoiner.appendTo(builder, schemes);
Joiner.on(',').appendTo(builder, schemes);
builder.append(']');
}
if (format.width != NO_VALUE && format.height != NO_VALUE) {
builder.append(", res=").append(format.width).append("x").append(format.height);
}
if (!fuzzyEquals(format.pixelWidthHeightRatio, 1, 0.001)) {
builder.append(", par=").append(Util.formatInvariant("%.3f", format.pixelWidthHeightRatio));
}
if (format.colorInfo != null && format.colorInfo.isValid()) {
builder.append(", color=").append(format.colorInfo.toLogString());
}
@@ -1463,28 +1400,22 @@ public final class Format {
}
if (!format.labels.isEmpty()) {
builder.append(", labels=[");
commaJoiner.appendTo(
builder, Lists.transform(format.labels, l -> l.language + ": " + l.value));
Joiner.on(',').appendTo(builder, format.labels);
builder.append("]");
}
if (format.selectionFlags != 0) {
builder.append(", selectionFlags=[");
commaJoiner.appendTo(builder, Util.getSelectionFlagStrings(format.selectionFlags));
Joiner.on(',').appendTo(builder, Util.getSelectionFlagStrings(format.selectionFlags));
builder.append("]");
}
if (format.roleFlags != 0) {
builder.append(", roleFlags=[");
commaJoiner.appendTo(builder, Util.getRoleFlagStrings(format.roleFlags));
Joiner.on(',').appendTo(builder, Util.getRoleFlagStrings(format.roleFlags));
builder.append("]");
}
if (format.customData != null) {
builder.append(", customData=").append(format.customData);
}
if ((format.roleFlags & C.ROLE_FLAG_AUXILIARY) != 0) {
builder
.append(", auxiliaryTrackType=")
.append(Util.getAuxiliaryTrackTypeString(format.auxiliaryTrackType));
}
return builder.toString();
}
@@ -1521,7 +1452,6 @@ public final class Format {
private static final String FIELD_TILE_COUNT_HORIZONTAL = Util.intToStringMaxRadix(30);
private static final String FIELD_TILE_COUNT_VERTICAL = Util.intToStringMaxRadix(31);
private static final String FIELD_LABELS = Util.intToStringMaxRadix(32);
private static final String FIELD_AUXILIARY_TRACK_TYPE = Util.intToStringMaxRadix(33);
/**
* @deprecated Use {@link #toBundle(boolean)} instead.
@@ -1546,9 +1476,6 @@ public final class Format {
bundle.putString(FIELD_LANGUAGE, language);
bundle.putInt(FIELD_SELECTION_FLAGS, selectionFlags);
bundle.putInt(FIELD_ROLE_FLAGS, roleFlags);
if (auxiliaryTrackType != DEFAULT.auxiliaryTrackType) {
bundle.putInt(FIELD_AUXILIARY_TRACK_TYPE, auxiliaryTrackType);
}
bundle.putInt(FIELD_AVERAGE_BITRATE, averageBitrate);
bundle.putInt(FIELD_PEAK_BITRATE, peakBitrate);
bundle.putString(FIELD_CODECS, codecs);
@@ -1613,8 +1540,6 @@ public final class Format {
.setLanguage(defaultIfNull(bundle.getString(FIELD_LANGUAGE), DEFAULT.language))
.setSelectionFlags(bundle.getInt(FIELD_SELECTION_FLAGS, DEFAULT.selectionFlags))
.setRoleFlags(bundle.getInt(FIELD_ROLE_FLAGS, DEFAULT.roleFlags))
.setAuxiliaryTrackType(
bundle.getInt(FIELD_AUXILIARY_TRACK_TYPE, DEFAULT.auxiliaryTrackType))
.setAverageBitrate(bundle.getInt(FIELD_AVERAGE_BITRATE, DEFAULT.averageBitrate))
.setPeakBitrate(bundle.getInt(FIELD_PEAK_BITRATE, DEFAULT.peakBitrate))
.setCodecs(defaultIfNull(bundle.getString(FIELD_CODECS), DEFAULT.codecs))


@@ -30,25 +30,6 @@ import java.util.List;
/**
* A {@link Player} that forwards method calls to another {@link Player}. Applications can use this
* class to suppress or modify specific operations, by overriding the respective methods.
*
* <p>Subclasses must ensure they maintain consistency with the {@link Player} interface, including
* interactions with {@link Player.Listener}, which can be quite fiddly. For example, if removing an
* available {@link Player.Command} and disabling the corresponding method, subclasses need to:
*
* <ul>
* <li>Override {@link #isCommandAvailable(int)} and {@link #getAvailableCommands()}
* <li>Override and no-op the method itself
* <li>Override {@link #addListener(Listener)} and wrap the provided {@link Player.Listener} with
* an implementation that drops calls to {@link
* Player.Listener#onAvailableCommandsChanged(Commands)} and {@link
* Player.Listener#onEvents(Player, Events)} if they were only triggered by a change in
* command availability that is 'invisible' after the command removal.
* </ul>
*
* <p>Many customization use-cases are instead better served by {@link ForwardingSimpleBasePlayer},
* which allows subclasses to more concisely modify the behavior of an operation, or disallow a
* {@link Player.Command}. In many cases {@link ForwardingSimpleBasePlayer} should be used in
* preference to {@code ForwardingPlayer}.
*/
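The steps listed in the javadoc above (advertise a reduced command set, no-op the method, keep listeners consistent) can be modeled with a small self-contained wrapper. This is a hypothetical simplification using made-up `MiniPlayer` types, not the Media3 `Player` interface:

```java
import java.util.EnumSet;
import java.util.Set;

/**
 * Simplified model (hypothetical types, not Media3 API) of a forwarding
 * wrapper that removes one command: the advertised command set and the
 * no-op'd method must stay consistent with each other.
 */
public class CommandFilteringDemo {

  enum Command { PLAY, SEEK_FORWARD }

  interface MiniPlayer {
    Set<Command> getAvailableCommands();
    void seekForward();
    String lastAction();
  }

  static class BasePlayer implements MiniPlayer {
    private String lastAction = "none";
    @Override public Set<Command> getAvailableCommands() {
      return EnumSet.of(Command.PLAY, Command.SEEK_FORWARD);
    }
    @Override public void seekForward() { lastAction = "seekForward"; }
    @Override public String lastAction() { return lastAction; }
  }

  /** Forwards to a delegate but removes SEEK_FORWARD. */
  static class NoSeekPlayer implements MiniPlayer {
    private final MiniPlayer delegate;
    NoSeekPlayer(MiniPlayer delegate) { this.delegate = delegate; }
    @Override public Set<Command> getAvailableCommands() {
      // Step 1: advertise the reduced command set.
      Set<Command> commands = EnumSet.copyOf(delegate.getAvailableCommands());
      commands.remove(Command.SEEK_FORWARD);
      return commands;
    }
    @Override public void seekForward() {
      // Step 2: no-op the method itself. (A real Player subclass would also
      // wrap registered listeners so availability callbacks stay consistent.)
    }
    @Override public String lastAction() { return delegate.lastAction(); }
  }

  public static void main(String[] args) {
    MiniPlayer player = new NoSeekPlayer(new BasePlayer());
    player.seekForward(); // Dropped: the delegate never sees the call.
    System.out.println(player.getAvailableCommands().contains(Command.SEEK_FORWARD)); // false
    System.out.println(player.lastAction()); // none
  }
}
```

As the javadoc notes, the listener-wrapping step is the fiddly part in a real `Player`, which is why `ForwardingSimpleBasePlayer` is usually the better starting point.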
@UnstableApi
public class ForwardingPlayer implements Player {
@@ -346,12 +327,48 @@ public class ForwardingPlayer implements Player {
player.seekForward();
}
/**
* Calls {@link Player#hasPrevious()} on the delegate and returns the result.
*
* @deprecated Use {@link #hasPreviousMediaItem()} instead.
*/
@SuppressWarnings("deprecation") // Forwarding to deprecated method
@Deprecated
@Override
public boolean hasPrevious() {
return player.hasPrevious();
}
/**
* Calls {@link Player#hasPreviousWindow()} on the delegate and returns the result.
*
* @deprecated Use {@link #hasPreviousMediaItem()} instead.
*/
@SuppressWarnings("deprecation") // Forwarding to deprecated method
@Deprecated
@Override
public boolean hasPreviousWindow() {
return player.hasPreviousWindow();
}
/** Calls {@link Player#hasPreviousMediaItem()} on the delegate and returns the result. */
@Override
public boolean hasPreviousMediaItem() {
return player.hasPreviousMediaItem();
}
/**
* Calls {@link Player#previous()} on the delegate.
*
* @deprecated Use {@link #seekToPreviousMediaItem()} instead.
*/
@SuppressWarnings("deprecation") // Forwarding to deprecated method
@Deprecated
@Override
public void previous() {
player.previous();
}
/**
* Calls {@link Player#seekToPreviousWindow()} on the delegate.
*


@@ -1,499 +0,0 @@
/*
* Copyright 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.common;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.TextureView;
import androidx.annotation.Nullable;
import androidx.media3.common.util.UnstableApi;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;
import java.util.List;
// LINT.IfChange(javadoc)
/**
* A {@link SimpleBasePlayer} that forwards all calls to another {@link Player} instance.
*
* <p>The class can be used to selectively override {@link #getState()} or {@code handle{Action}}
* methods:
*
* <pre>{@code
* new ForwardingSimpleBasePlayer(player) {
* @Override
* protected State getState() {
* State state = super.getState();
* // Modify current state as required:
* return state.buildUpon().setAvailableCommands(filteredCommands).build();
* }
*
* @Override
* protected ListenableFuture<?> handleSetRepeatMode(int repeatMode) {
* // Modify actions by directly calling the underlying player as needed:
* getPlayer().setShuffleModeEnabled(true);
* // ..or forward to the default handling with modified parameters:
* return super.handleSetRepeatMode(Player.REPEAT_MODE_ALL);
* }
* }
* }</pre>
*
This base class handles many aspects of the player implementation to simplify the subclass, for
* example listener handling. See the documentation of {@link SimpleBasePlayer} for a more detailed
* description.
*/
@UnstableApi
public class ForwardingSimpleBasePlayer extends SimpleBasePlayer {
private final Player player;
private ForwardingPositionSupplier currentPositionSupplier;
private Metadata lastTimedMetadata;
private @Player.PlayWhenReadyChangeReason int playWhenReadyChangeReason;
private @Player.DiscontinuityReason int pendingDiscontinuityReason;
private long pendingPositionDiscontinuityNewPositionMs;
private boolean pendingFirstFrameRendered;
/**
* Creates the forwarding player.
*
* @param player The {@link Player} to forward to.
*/
public ForwardingSimpleBasePlayer(Player player) {
super(player.getApplicationLooper());
this.player = player;
this.lastTimedMetadata = new Metadata(/* presentationTimeUs= */ C.TIME_UNSET);
this.playWhenReadyChangeReason = Player.PLAY_WHEN_READY_CHANGE_REASON_USER_REQUEST;
this.pendingDiscontinuityReason = Player.DISCONTINUITY_REASON_INTERNAL;
this.currentPositionSupplier = new ForwardingPositionSupplier(player);
player.addListener(
new Listener() {
@Override
public void onMetadata(Metadata metadata) {
lastTimedMetadata = metadata;
}
@Override
public void onPlayWhenReadyChanged(
boolean playWhenReady, @Player.PlayWhenReadyChangeReason int reason) {
playWhenReadyChangeReason = reason;
}
@Override
public void onPositionDiscontinuity(
PositionInfo oldPosition,
PositionInfo newPosition,
@Player.DiscontinuityReason int reason) {
pendingDiscontinuityReason = reason;
pendingPositionDiscontinuityNewPositionMs = newPosition.positionMs;
// Any previously created State will directly call through to player.getCurrentPosition
// via the existing position supplier. From this point onwards, this is wrong as the
// player had a discontinuity and will now return a new position unrelated to the old
// State. We can disconnect these old State objects from the underlying Player by fixing
// the position to the one before the discontinuity and using a new (live) position
// supplier for future State objects.
currentPositionSupplier.setConstant(
oldPosition.positionMs, oldPosition.contentPositionMs);
currentPositionSupplier = new ForwardingPositionSupplier(player);
}
@Override
public void onRenderedFirstFrame() {
pendingFirstFrameRendered = true;
}
@SuppressWarnings("method.invocation.invalid") // Calling method from constructor.
@Override
public void onEvents(Player player, Events events) {
invalidateState();
}
});
}
/** Returns the wrapped player. */
protected final Player getPlayer() {
return player;
}
@Override
protected State getState() {
// Ordered alphabetically by State.Builder setters.
State.Builder state = new State.Builder();
ForwardingPositionSupplier positionSupplier = currentPositionSupplier;
if (player.isCommandAvailable(Player.COMMAND_GET_CURRENT_MEDIA_ITEM)) {
state.setAdBufferedPositionMs(positionSupplier::getBufferedPositionMs);
state.setAdPositionMs(positionSupplier::getCurrentPositionMs);
}
if (player.isCommandAvailable(Player.COMMAND_GET_AUDIO_ATTRIBUTES)) {
state.setAudioAttributes(player.getAudioAttributes());
}
state.setAvailableCommands(player.getAvailableCommands());
if (player.isCommandAvailable(Player.COMMAND_GET_CURRENT_MEDIA_ITEM)) {
state.setContentBufferedPositionMs(positionSupplier::getContentBufferedPositionMs);
state.setContentPositionMs(positionSupplier::getContentPositionMs);
if (player.isCommandAvailable(Player.COMMAND_GET_TIMELINE)) {
state.setCurrentAd(player.getCurrentAdGroupIndex(), player.getCurrentAdIndexInAdGroup());
}
}
if (player.isCommandAvailable(Player.COMMAND_GET_TEXT)) {
state.setCurrentCues(player.getCurrentCues());
}
if (player.isCommandAvailable(Player.COMMAND_GET_TIMELINE)) {
state.setCurrentMediaItemIndex(player.getCurrentMediaItemIndex());
}
state.setDeviceInfo(player.getDeviceInfo());
if (player.isCommandAvailable(Player.COMMAND_GET_DEVICE_VOLUME)) {
state.setDeviceVolume(player.getDeviceVolume());
state.setIsDeviceMuted(player.isDeviceMuted());
}
state.setIsLoading(player.isLoading());
state.setMaxSeekToPreviousPositionMs(player.getMaxSeekToPreviousPosition());
if (pendingFirstFrameRendered) {
state.setNewlyRenderedFirstFrame(true);
pendingFirstFrameRendered = false;
}
state.setPlaybackParameters(player.getPlaybackParameters());
state.setPlaybackState(player.getPlaybackState());
state.setPlaybackSuppressionReason(player.getPlaybackSuppressionReason());
state.setPlayerError(player.getPlayerError());
if (player.isCommandAvailable(Player.COMMAND_GET_TIMELINE)) {
Tracks tracks =
player.isCommandAvailable(Player.COMMAND_GET_TRACKS)
? player.getCurrentTracks()
: Tracks.EMPTY;
MediaMetadata mediaMetadata =
player.isCommandAvailable(Player.COMMAND_GET_METADATA) ? player.getMediaMetadata() : null;
state.setPlaylist(player.getCurrentTimeline(), tracks, mediaMetadata);
}
if (player.isCommandAvailable(Player.COMMAND_GET_METADATA)) {
state.setPlaylistMetadata(player.getPlaylistMetadata());
}
state.setPlayWhenReady(player.getPlayWhenReady(), playWhenReadyChangeReason);
if (pendingPositionDiscontinuityNewPositionMs != C.TIME_UNSET) {
state.setPositionDiscontinuity(
pendingDiscontinuityReason, pendingPositionDiscontinuityNewPositionMs);
pendingPositionDiscontinuityNewPositionMs = C.TIME_UNSET;
}
state.setRepeatMode(player.getRepeatMode());
state.setSeekBackIncrementMs(player.getSeekBackIncrement());
state.setSeekForwardIncrementMs(player.getSeekForwardIncrement());
state.setShuffleModeEnabled(player.getShuffleModeEnabled());
state.setSurfaceSize(player.getSurfaceSize());
state.setTimedMetadata(lastTimedMetadata);
if (player.isCommandAvailable(Player.COMMAND_GET_CURRENT_MEDIA_ITEM)) {
state.setTotalBufferedDurationMs(positionSupplier::getTotalBufferedDurationMs);
}
state.setTrackSelectionParameters(player.getTrackSelectionParameters());
state.setVideoSize(player.getVideoSize());
if (player.isCommandAvailable(Player.COMMAND_GET_VOLUME)) {
state.setVolume(player.getVolume());
}
return state.build();
}
@Override
protected ListenableFuture<?> handleSetPlayWhenReady(boolean playWhenReady) {
player.setPlayWhenReady(playWhenReady);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handlePrepare() {
player.prepare();
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleStop() {
player.stop();
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleRelease() {
player.release();
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetRepeatMode(@Player.RepeatMode int repeatMode) {
player.setRepeatMode(repeatMode);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetShuffleModeEnabled(boolean shuffleModeEnabled) {
player.setShuffleModeEnabled(shuffleModeEnabled);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetPlaybackParameters(PlaybackParameters playbackParameters) {
player.setPlaybackParameters(playbackParameters);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetTrackSelectionParameters(
TrackSelectionParameters trackSelectionParameters) {
player.setTrackSelectionParameters(trackSelectionParameters);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetPlaylistMetadata(MediaMetadata playlistMetadata) {
player.setPlaylistMetadata(playlistMetadata);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetVolume(float volume) {
player.setVolume(volume);
return Futures.immediateVoidFuture();
}
@SuppressWarnings("deprecation") // Calling deprecated method if updated command not available.
@Override
protected ListenableFuture<?> handleSetDeviceVolume(int deviceVolume, int flags) {
if (player.isCommandAvailable(Player.COMMAND_SET_DEVICE_VOLUME_WITH_FLAGS)) {
player.setDeviceVolume(deviceVolume, flags);
} else {
player.setDeviceVolume(deviceVolume);
}
return Futures.immediateVoidFuture();
}
@SuppressWarnings("deprecation") // Calling deprecated method if updated command not available.
@Override
protected ListenableFuture<?> handleIncreaseDeviceVolume(@C.VolumeFlags int flags) {
if (player.isCommandAvailable(Player.COMMAND_ADJUST_DEVICE_VOLUME_WITH_FLAGS)) {
player.increaseDeviceVolume(flags);
} else {
player.increaseDeviceVolume();
}
return Futures.immediateVoidFuture();
}
@SuppressWarnings("deprecation") // Calling deprecated method if updated command not available.
@Override
protected ListenableFuture<?> handleDecreaseDeviceVolume(@C.VolumeFlags int flags) {
if (player.isCommandAvailable(Player.COMMAND_ADJUST_DEVICE_VOLUME_WITH_FLAGS)) {
player.decreaseDeviceVolume(flags);
} else {
player.decreaseDeviceVolume();
}
return Futures.immediateVoidFuture();
}
@SuppressWarnings("deprecation") // Calling deprecated method if updated command not available.
@Override
protected ListenableFuture<?> handleSetDeviceMuted(boolean muted, @C.VolumeFlags int flags) {
if (player.isCommandAvailable(Player.COMMAND_ADJUST_DEVICE_VOLUME_WITH_FLAGS)) {
player.setDeviceMuted(muted, flags);
} else {
player.setDeviceMuted(muted);
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetAudioAttributes(
AudioAttributes audioAttributes, boolean handleAudioFocus) {
player.setAudioAttributes(audioAttributes, handleAudioFocus);
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetVideoOutput(Object videoOutput) {
if (videoOutput instanceof SurfaceView) {
player.setVideoSurfaceView((SurfaceView) videoOutput);
} else if (videoOutput instanceof TextureView) {
player.setVideoTextureView((TextureView) videoOutput);
} else if (videoOutput instanceof SurfaceHolder) {
player.setVideoSurfaceHolder((SurfaceHolder) videoOutput);
} else if (videoOutput instanceof Surface) {
player.setVideoSurface((Surface) videoOutput);
} else {
throw new IllegalStateException();
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleClearVideoOutput(@Nullable Object videoOutput) {
if (videoOutput instanceof SurfaceView) {
player.clearVideoSurfaceView((SurfaceView) videoOutput);
} else if (videoOutput instanceof TextureView) {
player.clearVideoTextureView((TextureView) videoOutput);
} else if (videoOutput instanceof SurfaceHolder) {
player.clearVideoSurfaceHolder((SurfaceHolder) videoOutput);
} else if (videoOutput instanceof Surface) {
player.clearVideoSurface((Surface) videoOutput);
} else if (videoOutput == null) {
player.clearVideoSurface();
} else {
throw new IllegalStateException();
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSetMediaItems(
List<MediaItem> mediaItems, int startIndex, long startPositionMs) {
boolean useSingleItemCall =
mediaItems.size() == 1 && player.isCommandAvailable(Player.COMMAND_SET_MEDIA_ITEM);
if (startIndex == C.INDEX_UNSET) {
if (useSingleItemCall) {
player.setMediaItem(mediaItems.get(0));
} else {
player.setMediaItems(mediaItems);
}
} else {
if (useSingleItemCall) {
player.setMediaItem(mediaItems.get(0), startPositionMs);
} else {
player.setMediaItems(mediaItems, startIndex, startPositionMs);
}
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleAddMediaItems(int index, List<MediaItem> mediaItems) {
if (mediaItems.size() == 1) {
player.addMediaItem(index, mediaItems.get(0));
} else {
player.addMediaItems(index, mediaItems);
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleMoveMediaItems(int fromIndex, int toIndex, int newIndex) {
if (toIndex == fromIndex + 1) {
player.moveMediaItem(fromIndex, newIndex);
} else {
player.moveMediaItems(fromIndex, toIndex, newIndex);
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleReplaceMediaItems(
int fromIndex, int toIndex, List<MediaItem> mediaItems) {
if (toIndex == fromIndex + 1 && mediaItems.size() == 1) {
player.replaceMediaItem(fromIndex, mediaItems.get(0));
} else {
player.replaceMediaItems(fromIndex, toIndex, mediaItems);
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleRemoveMediaItems(int fromIndex, int toIndex) {
if (toIndex == fromIndex + 1) {
player.removeMediaItem(fromIndex);
} else {
player.removeMediaItems(fromIndex, toIndex);
}
return Futures.immediateVoidFuture();
}
@Override
protected ListenableFuture<?> handleSeek(
int mediaItemIndex, long positionMs, @Command int seekCommand) {
switch (seekCommand) {
case Player.COMMAND_SEEK_BACK:
player.seekBack();
break;
case Player.COMMAND_SEEK_FORWARD:
player.seekForward();
break;
case Player.COMMAND_SEEK_IN_CURRENT_MEDIA_ITEM:
player.seekTo(positionMs);
break;
case Player.COMMAND_SEEK_TO_DEFAULT_POSITION:
player.seekToDefaultPosition();
break;
case Player.COMMAND_SEEK_TO_MEDIA_ITEM:
if (mediaItemIndex != C.INDEX_UNSET) {
player.seekTo(mediaItemIndex, positionMs);
}
break;
case Player.COMMAND_SEEK_TO_NEXT:
player.seekToNext();
break;
case Player.COMMAND_SEEK_TO_NEXT_MEDIA_ITEM:
player.seekToNextMediaItem();
break;
case Player.COMMAND_SEEK_TO_PREVIOUS:
player.seekToPrevious();
break;
case Player.COMMAND_SEEK_TO_PREVIOUS_MEDIA_ITEM:
player.seekToPreviousMediaItem();
break;
default:
throw new IllegalStateException();
}
return Futures.immediateVoidFuture();
}
/**
* Forwards to the changing position values of the wrapped player until the forwarding is
* deactivated with constant values.
*/
private static final class ForwardingPositionSupplier {
private final Player player;
private long positionsMs;
private long contentPositionMs;
public ForwardingPositionSupplier(Player player) {
this.player = player;
this.positionsMs = C.TIME_UNSET;
this.contentPositionMs = C.TIME_UNSET;
}
public void setConstant(long positionMs, long contentPositionMs) {
this.positionsMs = positionMs;
this.contentPositionMs = contentPositionMs;
}
public long getCurrentPositionMs() {
return positionsMs == C.TIME_UNSET ? player.getCurrentPosition() : positionsMs;
}
public long getBufferedPositionMs() {
return positionsMs == C.TIME_UNSET ? player.getBufferedPosition() : positionsMs;
}
public long getContentPositionMs() {
return contentPositionMs == C.TIME_UNSET ? player.getContentPosition() : contentPositionMs;
}
public long getContentBufferedPositionMs() {
return contentPositionMs == C.TIME_UNSET
? player.getContentBufferedPosition()
: contentPositionMs;
}
public long getTotalBufferedDurationMs() {
return positionsMs == C.TIME_UNSET ? player.getTotalBufferedDuration() : 0;
}
}
}


@@ -81,11 +81,4 @@ public interface GlObjectsProvider {
* @throws GlException If an error occurs during creation.
*/
GlTextureInfo createBuffersForTexture(int texId, int width, int height) throws GlException;
/**
* Releases the created objects.
*
* @param eglDisplay The {@link EGLDisplay} to release the objects for.
*/
void release(EGLDisplay eglDisplay) throws GlException;
}


@@ -29,11 +29,11 @@ public final class MediaLibraryInfo {
/** The version of the library expressed as a string, for example "1.2.3" or "1.2.0-beta01". */
// Intentionally hardcoded. Do not derive from other constants (e.g. VERSION_INT) or vice versa.
public static final String VERSION = "1.5.1";
public static final String VERSION = "1.4.0";
/** The version of the library expressed as {@code TAG + "/" + VERSION}. */
// Intentionally hardcoded. Do not derive from other constants (e.g. VERSION) or vice versa.
public static final String VERSION_SLASHY = "AndroidXMedia3/1.5.1";
public static final String VERSION_SLASHY = "AndroidXMedia3/1.4.0";
/**
* The version of the library expressed as an integer, for example 1002003300.
@@ -47,7 +47,7 @@ public final class MediaLibraryInfo {
* (123-045-006-3-00).
*/
// Intentionally hardcoded. Do not derive from other constants (e.g. VERSION) or vice versa.
public static final int VERSION_INT = 1_005_001_3_00;
public static final int VERSION_INT = 1_004_000_3_00;
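The digit layout the comment describes (three decimal digits each for major/minor/bugfix, one for the release phase, two for the dev version, e.g. `1_004_000_3_00` for 1.4.0) can be sketched as follows; `encodeVersion` is our own illustrative helper, not a Media3 method:

```java
/** Sketch of the VERSION_INT digit layout; encodeVersion is a hypothetical helper. */
public class VersionIntDemo {

  /** Packs version segments as major-minor(3)-bugfix(3)-phase(1)-dev(2) decimal digits. */
  static int encodeVersion(int major, int minor, int bugfix, int phase, int dev) {
    return major * 1_000_000_000 + minor * 1_000_000 + bugfix * 1_000 + phase * 100 + dev;
  }

  public static void main(String[] args) {
    System.out.println(encodeVersion(1, 4, 0, 3, 0)); // 1004000300
    System.out.println(encodeVersion(1, 5, 1, 3, 0)); // 1005001300
  }
}
```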
/** Whether the library was compiled with {@link Assertions} checks enabled. */
public static final boolean ASSERTIONS_ENABLED = true;


@@ -30,13 +30,11 @@ import androidx.annotation.Nullable;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import com.google.common.base.Objects;
import com.google.common.collect.ImmutableList;
import com.google.errorprone.annotations.CanIgnoreReturnValue;
import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
@@ -87,11 +85,8 @@ public final class MediaMetadata {
@Nullable private CharSequence station;
@Nullable private @MediaType Integer mediaType;
@Nullable private Bundle extras;
private ImmutableList<String> supportedCommands;
public Builder() {
supportedCommands = ImmutableList.of();
}
public Builder() {}
@SuppressWarnings("deprecation") // Assigning from deprecated fields.
private Builder(MediaMetadata mediaMetadata) {
@@ -128,7 +123,6 @@ public final class MediaMetadata {
this.compilation = mediaMetadata.compilation;
this.station = mediaMetadata.station;
this.mediaType = mediaMetadata.mediaType;
this.supportedCommands = mediaMetadata.supportedCommands;
this.extras = mediaMetadata.extras;
}
@@ -446,17 +440,6 @@ public final class MediaMetadata {
return this;
}
/**
* Sets the IDs of the supported commands (see for instance {@code
* CommandButton.sessionCommand.customAction} of the Media3 session module).
*/
@CanIgnoreReturnValue
@UnstableApi
public Builder setSupportedCommands(List<String> supportedCommands) {
this.supportedCommands = ImmutableList.copyOf(supportedCommands);
return this;
}
/**
* Sets all fields supported by the {@link Metadata.Entry entries} within the {@link Metadata}.
*
@@ -613,10 +596,6 @@ public final class MediaMetadata {
setExtras(mediaMetadata.extras);
}
if (!mediaMetadata.supportedCommands.isEmpty()) {
setSupportedCommands(mediaMetadata.supportedCommands);
}
return this;
}
@@ -1144,12 +1123,6 @@ public final class MediaMetadata {
*/
@Nullable public final Bundle extras;
/**
* The IDs of the supported commands of this media item (see for instance {@code
* CommandButton.sessionCommand.customAction} of the Media3 session module).
*/
@UnstableApi public final ImmutableList<String> supportedCommands;
@SuppressWarnings("deprecation") // Assigning deprecated fields.
private MediaMetadata(Builder builder) {
// Handle compatibility for deprecated fields.
@@ -1202,7 +1175,6 @@ public final class MediaMetadata {
this.compilation = builder.compilation;
this.station = builder.station;
this.mediaType = mediaType;
this.supportedCommands = builder.supportedCommands;
this.extras = builder.extras;
}
@@ -1255,7 +1227,6 @@ public final class MediaMetadata {
&& Util.areEqual(compilation, that.compilation)
&& Util.areEqual(station, that.station)
&& Util.areEqual(mediaType, that.mediaType)
&& Util.areEqual(supportedCommands, that.supportedCommands)
&& ((extras == null) == (that.extras == null));
}
@@ -1296,8 +1267,6 @@ public final class MediaMetadata {
compilation,
station,
mediaType,
extras == null,
supportedCommands);
extras == null);
}
private static final String FIELD_TITLE = Util.intToStringMaxRadix(0);
@@ -1334,7 +1304,6 @@ public final class MediaMetadata {
private static final String FIELD_MEDIA_TYPE = Util.intToStringMaxRadix(31);
private static final String FIELD_IS_BROWSABLE = Util.intToStringMaxRadix(32);
private static final String FIELD_DURATION_MS = Util.intToStringMaxRadix(33);
private static final String FIELD_SUPPORTED_COMMANDS = Util.intToStringMaxRadix(34);
private static final String FIELD_EXTRAS = Util.intToStringMaxRadix(1000);
@SuppressWarnings("deprecation") // Bundling deprecated fields.
@@ -1440,9 +1409,6 @@ public final class MediaMetadata {
if (mediaType != null) {
bundle.putInt(FIELD_MEDIA_TYPE, mediaType);
}
if (!supportedCommands.isEmpty()) {
bundle.putStringArrayList(FIELD_SUPPORTED_COMMANDS, new ArrayList<>(supportedCommands));
}
if (extras != null) {
bundle.putBundle(FIELD_EXTRAS, extras);
}
@@ -1533,11 +1499,6 @@ public final class MediaMetadata {
if (bundle.containsKey(FIELD_MEDIA_TYPE)) {
builder.setMediaType(bundle.getInt(FIELD_MEDIA_TYPE));
}
@Nullable
ArrayList<String> supportedCommands = bundle.getStringArrayList(FIELD_SUPPORTED_COMMANDS);
if (supportedCommands != null) {
builder.setSupportedCommands(supportedCommands);
}
return builder.build();
}
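The `FIELD_*` constants above give each metadata field a compact string key, and `toBundle()` skips fields that hold their default or empty value. As far as I can tell, `Util.intToStringMaxRadix(i)` is base-36 `Integer.toString`; the sketch below imitates the pattern with a plain `HashMap` standing in for an Android `Bundle` (names and the helper are assumptions, not the real class):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of MediaMetadata's bundling pattern: each field gets a compact
// base-36 string key, and empty/default fields are simply omitted.
public class MetadataBundling {
  // Assumed to mirror Util.intToStringMaxRadix: base-36 keys stay short.
  static String intToStringMaxRadix(int i) {
    return Integer.toString(i, Character.MAX_RADIX);
  }

  static final String FIELD_MEDIA_TYPE = intToStringMaxRadix(31);
  static final String FIELD_SUPPORTED_COMMANDS = intToStringMaxRadix(34);

  // Bundle only non-default fields, as toBundle() does above.
  static Map<String, Object> toBundle(Integer mediaType, List<String> supportedCommands) {
    Map<String, Object> bundle = new HashMap<>();
    if (mediaType != null) {
      bundle.put(FIELD_MEDIA_TYPE, mediaType);
    }
    if (!supportedCommands.isEmpty()) {
      bundle.put(FIELD_SUPPORTED_COMMANDS, new ArrayList<>(supportedCommands));
    }
    return bundle;
  }

  public static void main(String[] args) {
    Map<String, Object> bundle = toBundle(1, new ArrayList<>());
    // The empty supportedCommands list produces no key at all.
    System.out.println(bundle.containsKey(FIELD_SUPPORTED_COMMANDS));
  }
}
```

On the reading side this is why `fromBundle` checks `containsKey`/null before setting each builder field.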

View file

@@ -62,7 +62,6 @@ public final class MimeTypes {
public static final String VIDEO_MJPEG = BASE_TYPE_VIDEO + "/mjpeg";
public static final String VIDEO_MP42 = BASE_TYPE_VIDEO + "/mp42";
public static final String VIDEO_MP43 = BASE_TYPE_VIDEO + "/mp43";
@UnstableApi public static final String VIDEO_MV_HEVC = BASE_TYPE_VIDEO + "/mv-hevc";
@UnstableApi public static final String VIDEO_RAW = BASE_TYPE_VIDEO + "/raw";
@UnstableApi public static final String VIDEO_UNKNOWN = BASE_TYPE_VIDEO + "/x-unknown";
@@ -100,7 +99,6 @@ public final class MimeTypes {
public static final String AUDIO_OGG = BASE_TYPE_AUDIO + "/ogg";
public static final String AUDIO_WAV = BASE_TYPE_AUDIO + "/wav";
public static final String AUDIO_MIDI = BASE_TYPE_AUDIO + "/midi";
@UnstableApi public static final String AUDIO_IAMF = BASE_TYPE_AUDIO + "/iamf";
@UnstableApi
public static final String AUDIO_EXOPLAYER_MIDI = BASE_TYPE_AUDIO + "/x-exoplayer-midi";
@@ -141,15 +139,10 @@ public final class MimeTypes {
public static final String APPLICATION_VOBSUB = BASE_TYPE_APPLICATION + "/vobsub";
public static final String APPLICATION_PGS = BASE_TYPE_APPLICATION + "/pgs";
@UnstableApi public static final String APPLICATION_SCTE35 = BASE_TYPE_APPLICATION + "/x-scte35";
public static final String APPLICATION_SDP = BASE_TYPE_APPLICATION + "/sdp";
@UnstableApi
public static final String APPLICATION_CAMERA_MOTION = BASE_TYPE_APPLICATION + "/x-camera-motion";
@UnstableApi
public static final String APPLICATION_DEPTH_METADATA =
BASE_TYPE_APPLICATION + "/x-depth-metadata";
@UnstableApi public static final String APPLICATION_EMSG = BASE_TYPE_APPLICATION + "/x-emsg";
public static final String APPLICATION_DVBSUBS = BASE_TYPE_APPLICATION + "/dvbsubs";
@UnstableApi public static final String APPLICATION_EXIF = BASE_TYPE_APPLICATION + "/x-exif";
@@ -495,29 +488,6 @@ public final class MimeTypes {
}
}
/**
* Returns the MP4 object type identifier corresponding to a MIME type, as defined in RFC 6381 and
* <a href="https://mp4ra.org/registered-types/object-types">MPEG-4 Object Types</a>.
*
* @param sampleMimeType The MIME type of the track.
* @return The corresponding MP4 object type identifier, or {@code null} if it could not be
* determined.
*/
@UnstableApi
@Nullable
public static Byte getMp4ObjectTypeFromMimeType(String sampleMimeType) {
switch (sampleMimeType) {
case MimeTypes.AUDIO_AAC:
return (byte) 0x40;
case MimeTypes.AUDIO_VORBIS:
return (byte) 0xDD;
case MimeTypes.VIDEO_MP4V:
return (byte) 0x20;
default:
return null;
}
}
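The lookup in the method above is small enough to restate as a standalone sketch. The object type identifiers come straight from the code; the MIME type string literals are the values I recall for `MimeTypes.AUDIO_AAC`, `AUDIO_VORBIS`, and `VIDEO_MP4V`, so treat them as assumptions:

```java
// Sketch of the MIME type -> MP4 object type identifier lookup shown above
// (identifiers per RFC 6381 / mp4ra.org registered types).
public class Mp4ObjectTypes {
  // Returns the object type identifier, or null if it cannot be determined.
  public static Byte fromMimeType(String sampleMimeType) {
    switch (sampleMimeType) {
      case "audio/mp4a-latm": // assumed value of MimeTypes.AUDIO_AAC
        return (byte) 0x40;
      case "audio/vorbis": // assumed value of MimeTypes.AUDIO_VORBIS
        return (byte) 0xDD;
      case "video/mp4v-es": // assumed value of MimeTypes.VIDEO_MP4V
        return (byte) 0x20;
      default:
        return null;
    }
  }

  public static void main(String[] args) {
    System.out.println(Integer.toHexString(fromMimeType("audio/mp4a-latm") & 0xFF)); // 40
  }
}
```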
/**
* Returns the MIME type corresponding to an MP4 object type identifier, as defined in RFC 6381
* and https://mp4ra.org/#/object_types.
@@ -601,9 +571,7 @@ public final class MimeTypes {
return C.TRACK_TYPE_IMAGE;
} else if (APPLICATION_ID3.equals(mimeType)
|| APPLICATION_EMSG.equals(mimeType)
|| APPLICATION_SCTE35.equals(mimeType)
|| APPLICATION_ICY.equals(mimeType)
|| APPLICATION_AIT.equals(mimeType)) {
|| APPLICATION_SCTE35.equals(mimeType)) {
return C.TRACK_TYPE_METADATA;
} else if (APPLICATION_CAMERA_MOTION.equals(mimeType)) {
return C.TRACK_TYPE_CAMERA_MOTION;
@@ -685,17 +653,14 @@ public final class MimeTypes {
}
mimeType = Ascii.toLowerCase(mimeType);
switch (mimeType) {
// Normalize uncommon versions of some video MIME types to their standard equivalent.
case BASE_TYPE_VIDEO + "/x-mvhevc":
return VIDEO_MV_HEVC;
// Normalize uncommon versions of some audio MIME types to their standard equivalent.
// Normalize uncommon versions of some audio MIME types to their standard equivalent.
case BASE_TYPE_AUDIO + "/x-flac":
return AUDIO_FLAC;
case BASE_TYPE_AUDIO + "/mp3":
return AUDIO_MPEG;
case BASE_TYPE_AUDIO + "/x-wav":
return AUDIO_WAV;
// Normalize MIME types that are often written with upper-case letters to their common form.
// Normalize MIME types that are often written with upper-case letters to their common form.
case "application/x-mpegurl":
return APPLICATION_M3U8;
case "audio/mpeg-l1":
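The normalization switch above first lowercases the input and then maps known aliases onto their canonical constants. A self-contained sketch of the same idea, using plain string literals instead of the `MimeTypes` constants (`toLowerCase(Locale.ROOT)` stands in for Media3's `Ascii.toLowerCase`):

```java
import java.util.Locale;

// Sketch of normalizeMimeType: lowercase, then map uncommon aliases
// to their standard equivalent; unknown types pass through unchanged.
public class MimeNormalizer {
  public static String normalize(String mimeType) {
    if (mimeType == null) {
      return null;
    }
    mimeType = mimeType.toLowerCase(Locale.ROOT);
    switch (mimeType) {
      case "audio/x-flac":
        return "audio/flac";
      case "audio/mp3":
        return "audio/mpeg";
      case "audio/x-wav":
        return "audio/wav";
      default:
        return mimeType;
    }
  }
}
```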

View file

@@ -113,7 +113,7 @@ public class ParserException extends IOException {
@Override
public String getMessage() {
return super.getMessage()
+ " {contentIsMalformed="
+ "{contentIsMalformed="
+ contentIsMalformed
+ ", dataType="
+ dataType
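The one-character change in this hunk only affects whether a space separates the base message from the diagnostic suffix. A sketch of the composed message shape (a standalone helper, not the real class; the closing brace is assumed from the newer format):

```java
// Sketch of how ParserException.getMessage() appends its diagnostic fields.
public class ParserMessage {
  public static String compose(String baseMessage, boolean contentIsMalformed, int dataType) {
    return baseMessage
        + " {contentIsMalformed="
        + contentIsMalformed
        + ", dataType="
        + dataType
        + "}";
  }
}
```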

View file

@@ -2635,6 +2635,20 @@ public interface Player {
*/
void seekForward();
/**
* @deprecated Use {@link #hasPreviousMediaItem()} instead.
*/
@UnstableApi
@Deprecated
boolean hasPrevious();
/**
* @deprecated Use {@link #hasPreviousMediaItem()} instead.
*/
@UnstableApi
@Deprecated
boolean hasPreviousWindow();
/**
* Returns whether a previous media item exists, which may depend on the current repeat mode and
* whether shuffle mode is enabled.
@@ -2648,6 +2662,13 @@ public interface Player {
*/
boolean hasPreviousMediaItem();
/**
* @deprecated Use {@link #seekToPreviousMediaItem()} instead.
*/
@UnstableApi
@Deprecated
void previous();
/**
* @deprecated Use {@link #seekToPreviousMediaItem()} instead.
*/

View file

@@ -53,7 +53,6 @@ import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Objects;
import org.checkerframework.checker.nullness.qual.EnsuresNonNull;
import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
import org.checkerframework.checker.nullness.qual.RequiresNonNull;
@@ -127,10 +126,8 @@ public abstract class SimpleBasePlayer extends BasePlayer {
private Size surfaceSize;
private boolean newlyRenderedFirstFrame;
private Metadata timedMetadata;
@Nullable private ImmutableList<MediaItemData> playlist;
private ImmutableList<MediaItemData> playlist;
private Timeline timeline;
@Nullable private Tracks currentTracks;
@Nullable private MediaMetadata currentMetadata;
private MediaMetadata playlistMetadata;
private int currentMediaItemIndex;
private int currentAdGroupIndex;
@@ -174,8 +171,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
timedMetadata = new Metadata(/* presentationTimeUs= */ C.TIME_UNSET);
playlist = ImmutableList.of();
timeline = Timeline.EMPTY;
currentTracks = null;
currentMetadata = null;
playlistMetadata = MediaMetadata.EMPTY;
currentMediaItemIndex = C.INDEX_UNSET;
currentAdGroupIndex = C.INDEX_UNSET;
@@ -217,13 +212,8 @@ public abstract class SimpleBasePlayer extends BasePlayer {
this.surfaceSize = state.surfaceSize;
this.newlyRenderedFirstFrame = state.newlyRenderedFirstFrame;
this.timedMetadata = state.timedMetadata;
this.playlist = state.playlist;
this.timeline = state.timeline;
if (state.timeline instanceof PlaylistTimeline) {
this.playlist = ((PlaylistTimeline) state.timeline).playlist;
} else {
this.currentTracks = state.currentTracks;
this.currentMetadata = state.currentMetadata;
}
this.playlistMetadata = state.playlistMetadata;
this.currentMediaItemIndex = state.currentMediaItemIndex;
this.currentAdGroupIndex = state.currentAdGroupIndex;
@@ -548,13 +538,10 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
/**
* Sets the playlist as a list of {@link MediaItemData media items}.
* Sets the list of {@link MediaItemData media items} in the playlist.
*
* <p>All items must have unique {@linkplain MediaItemData.Builder#setUid UIDs}.
*
* <p>This call replaces any previous playlist set via {@link #setPlaylist(Timeline, Tracks,
* MediaMetadata)}.
*
* @param playlist The list of {@link MediaItemData media items} in the playlist.
* @return This builder.
*/
@@ -566,33 +553,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
this.playlist = ImmutableList.copyOf(playlist);
this.timeline = new PlaylistTimeline(this.playlist);
this.currentTracks = null;
this.currentMetadata = null;
return this;
}
/**
* Sets the playlist as a {@link Timeline} with information about the current {@link Tracks}
* and {@link MediaMetadata}.
*
* <p>This call replaces any previous playlist set via {@link #setPlaylist(List)}.
*
* @param timeline The {@link Timeline} containing the playlist data.
* @param currentTracks The {@link Tracks} of the {@linkplain #setCurrentMediaItemIndex
* current media item}.
* @param currentMetadata The combined {@link MediaMetadata} of the {@linkplain
* #setCurrentMediaItemIndex current media item}. If null, the current metadata is assumed
* to be the combination of the {@link MediaItem#mediaMetadata MediaItem} metadata and the
* metadata of the selected {@link Format#metadata Formats}.
* @return This builder.
*/
@CanIgnoreReturnValue
public Builder setPlaylist(
Timeline timeline, Tracks currentTracks, @Nullable MediaMetadata currentMetadata) {
this.playlist = null;
this.timeline = timeline;
this.currentTracks = currentTracks;
this.currentMetadata = currentMetadata;
return this;
}
@@ -890,15 +850,12 @@ public abstract class SimpleBasePlayer extends BasePlayer {
/** The most recent timed metadata. */
public final Metadata timedMetadata;
/** The {@link Timeline}. */
/** The media items in the playlist. */
public final ImmutableList<MediaItemData> playlist;
/** The {@link Timeline} derived from the {@link #playlist}. */
public final Timeline timeline;
/** The current {@link Tracks}. */
public final Tracks currentTracks;
/** The current combined {@link MediaMetadata}. */
public final MediaMetadata currentMetadata;
/** The playlist {@link MediaMetadata}. */
public final MediaMetadata playlistMetadata;
@@ -959,8 +916,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
public final long discontinuityPositionMs;
private State(Builder builder) {
Tracks currentTracks = builder.currentTracks;
MediaMetadata currentMetadata = builder.currentMetadata;
if (builder.timeline.isEmpty()) {
checkArgument(
builder.playbackState == Player.STATE_IDLE
@@ -970,12 +925,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
builder.currentAdGroupIndex == C.INDEX_UNSET
&& builder.currentAdIndexInAdGroup == C.INDEX_UNSET,
"Ads not allowed if playlist is empty");
if (currentTracks == null) {
currentTracks = Tracks.EMPTY;
}
if (currentMetadata == null) {
currentMetadata = MediaMetadata.EMPTY;
}
} else {
int mediaItemIndex = builder.currentMediaItemIndex;
if (mediaItemIndex == C.INDEX_UNSET) {
@@ -1006,17 +955,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
"Ad group has less ads than adIndexInGroupIndex");
}
}
if (builder.playlist != null) {
MediaItemData mediaItemData = builder.playlist.get(mediaItemIndex);
currentTracks = mediaItemData.tracks;
currentMetadata = mediaItemData.mediaMetadata;
}
if (currentMetadata == null) {
currentMetadata =
getCombinedMediaMetadata(
builder.timeline.getWindow(mediaItemIndex, new Timeline.Window()).mediaItem,
checkNotNull(currentTracks));
}
}
if (builder.playerError != null) {
checkArgument(
@@ -1077,9 +1015,8 @@ public abstract class SimpleBasePlayer extends BasePlayer {
this.surfaceSize = builder.surfaceSize;
this.newlyRenderedFirstFrame = builder.newlyRenderedFirstFrame;
this.timedMetadata = builder.timedMetadata;
this.playlist = builder.playlist;
this.timeline = builder.timeline;
this.currentTracks = checkNotNull(currentTracks);
this.currentMetadata = currentMetadata;
this.playlistMetadata = builder.playlistMetadata;
this.currentMediaItemIndex = builder.currentMediaItemIndex;
this.currentAdGroupIndex = builder.currentAdGroupIndex;
@@ -1099,27 +1036,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
return new Builder(this);
}
/**
* Returns the list of {@link MediaItemData} for the current playlist.
*
* @see Builder#setPlaylist(List)
*/
public ImmutableList<MediaItemData> getPlaylist() {
if (timeline instanceof PlaylistTimeline) {
return ((PlaylistTimeline) timeline).playlist;
}
Timeline.Window window = new Timeline.Window();
Timeline.Period period = new Timeline.Period();
ImmutableList.Builder<MediaItemData> items =
ImmutableList.builderWithExpectedSize(timeline.getWindowCount());
for (int i = 0; i < timeline.getWindowCount(); i++) {
items.add(
MediaItemData.buildFromState(
/* state= */ this, /* mediaItemIndex= */ i, period, window));
}
return items.build();
}
@Override
public boolean equals(@Nullable Object o) {
if (this == o) {
@@ -1134,7 +1050,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
&& availableCommands.equals(state.availableCommands)
&& playbackState == state.playbackState
&& playbackSuppressionReason == state.playbackSuppressionReason
&& Objects.equals(playerError, state.playerError)
&& Util.areEqual(playerError, state.playerError)
&& repeatMode == state.repeatMode
&& shuffleModeEnabled == state.shuffleModeEnabled
&& isLoading == state.isLoading
@@ -1153,9 +1069,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
&& surfaceSize.equals(state.surfaceSize)
&& newlyRenderedFirstFrame == state.newlyRenderedFirstFrame
&& timedMetadata.equals(state.timedMetadata)
&& timeline.equals(state.timeline)
&& currentTracks.equals(state.currentTracks)
&& currentMetadata.equals(state.currentMetadata)
&& playlist.equals(state.playlist)
&& playlistMetadata.equals(state.playlistMetadata)
&& currentMediaItemIndex == state.currentMediaItemIndex
&& currentAdGroupIndex == state.currentAdGroupIndex
@@ -1198,9 +1112,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
result = 31 * result + surfaceSize.hashCode();
result = 31 * result + (newlyRenderedFirstFrame ? 1 : 0);
result = 31 * result + timedMetadata.hashCode();
result = 31 * result + timeline.hashCode();
result = 31 * result + currentTracks.hashCode();
result = 31 * result + currentMetadata.hashCode();
result = 31 * result + playlist.hashCode();
result = 31 * result + playlistMetadata.hashCode();
result = 31 * result + currentMediaItemIndex;
result = 31 * result + currentAdGroupIndex;
@@ -1224,9 +1136,9 @@ public abstract class SimpleBasePlayer extends BasePlayer {
private final int[] windowIndexByPeriodIndex;
private final HashMap<Object, Integer> periodIndexByUid;
public PlaylistTimeline(List<MediaItemData> playlist) {
public PlaylistTimeline(ImmutableList<MediaItemData> playlist) {
int mediaItemCount = playlist.size();
this.playlist = ImmutableList.copyOf(playlist);
this.playlist = playlist;
this.firstPeriodIndexByWindowIndex = new int[mediaItemCount];
int periodCount = 0;
for (int i = 0; i < mediaItemCount; i++) {
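The `PlaylistTimeline` constructor above flattens a list of media items, each contributing one or more periods, into timeline-wide index arrays. A standalone sketch of that bookkeeping, driven by per-item period counts instead of `MediaItemData` objects (the implicit-period rule is an assumption based on how items without declared periods behave):

```java
// Sketch of PlaylistTimeline's index bookkeeping: map each window (media item)
// to its first flat period index, and each period back to its window.
public class TimelineIndices {
  final int[] firstPeriodIndexByWindowIndex;
  final int[] windowIndexByPeriodIndex;

  TimelineIndices(int[] periodCountsPerItem) {
    int itemCount = periodCountsPerItem.length;
    firstPeriodIndexByWindowIndex = new int[itemCount];
    int periodCount = 0;
    for (int i = 0; i < itemCount; i++) {
      firstPeriodIndexByWindowIndex[i] = periodCount;
      // An item without declared periods still contributes one implicit period.
      periodCount += Math.max(1, periodCountsPerItem[i]);
    }
    windowIndexByPeriodIndex = new int[periodCount];
    for (int i = 0; i < itemCount; i++) {
      int count = Math.max(1, periodCountsPerItem[i]);
      for (int j = 0; j < count; j++) {
        windowIndexByPeriodIndex[firstPeriodIndexByWindowIndex[i] + j] = i;
      }
    }
  }
}
```

With these two arrays, window-to-period and period-to-window lookups are O(1), which is what the timeline's navigation methods need.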
@@ -1724,6 +1636,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
public final ImmutableList<PeriodData> periods;
private final long[] periodPositionInWindowUs;
private final MediaMetadata combinedMediaMetadata;
private MediaItemData(Builder builder) {
if (builder.liveConfiguration == null) {
@@ -1771,6 +1684,8 @@ public abstract class SimpleBasePlayer extends BasePlayer {
periodPositionInWindowUs[i + 1] = periodPositionInWindowUs[i] + periods.get(i).durationUs;
}
}
combinedMediaMetadata =
mediaMetadata != null ? mediaMetadata : getCombinedMediaMetadata(mediaItem, tracks);
}
/** Returns a {@link Builder} pre-populated with the current values. */
@@ -1829,39 +1744,6 @@ public abstract class SimpleBasePlayer extends BasePlayer {
return result;
}
private static MediaItemData buildFromState(
State state, int mediaItemIndex, Timeline.Period period, Timeline.Window window) {
boolean isCurrentItem = getCurrentMediaItemIndexInternal(state) == mediaItemIndex;
state.timeline.getWindow(mediaItemIndex, window);
ImmutableList.Builder<PeriodData> periods = ImmutableList.builder();
for (int i = window.firstPeriodIndex; i <= window.lastPeriodIndex; i++) {
state.timeline.getPeriod(/* periodIndex= */ i, period, /* setIds= */ true);
periods.add(
new PeriodData.Builder(checkNotNull(period.uid))
.setAdPlaybackState(period.adPlaybackState)
.setDurationUs(period.durationUs)
.setIsPlaceholder(period.isPlaceholder)
.build());
}
return new MediaItemData.Builder(window.uid)
.setDefaultPositionUs(window.defaultPositionUs)
.setDurationUs(window.durationUs)
.setElapsedRealtimeEpochOffsetMs(window.elapsedRealtimeEpochOffsetMs)
.setIsDynamic(window.isDynamic)
.setIsPlaceholder(window.isPlaceholder)
.setIsSeekable(window.isSeekable)
.setLiveConfiguration(window.liveConfiguration)
.setManifest(window.manifest)
.setMediaItem(window.mediaItem)
.setMediaMetadata(isCurrentItem ? state.currentMetadata : null)
.setPeriods(periods.build())
.setPositionInFirstPeriodUs(window.positionInFirstPeriodUs)
.setPresentationStartTimeMs(window.presentationStartTimeMs)
.setTracks(isCurrentItem ? state.currentTracks : Tracks.EMPTY)
.setWindowStartTimeMs(window.windowStartTimeMs)
.build();
}
private Timeline.Window getWindow(int firstPeriodIndex, Timeline.Window window) {
int periodCount = periods.isEmpty() ? 1 : periods.size();
window.set(
@@ -1917,6 +1799,25 @@ public abstract class SimpleBasePlayer extends BasePlayer {
Object periodId = periods.get(periodIndexInMediaItem).uid;
return Pair.create(uid, periodId);
}
private static MediaMetadata getCombinedMediaMetadata(MediaItem mediaItem, Tracks tracks) {
MediaMetadata.Builder metadataBuilder = new MediaMetadata.Builder();
int trackGroupCount = tracks.getGroups().size();
for (int i = 0; i < trackGroupCount; i++) {
Tracks.Group group = tracks.getGroups().get(i);
for (int j = 0; j < group.length; j++) {
if (group.isTrackSelected(j)) {
Format format = group.getTrackFormat(j);
if (format.metadata != null) {
for (int k = 0; k < format.metadata.length(); k++) {
format.metadata.get(k).populateMediaMetadata(metadataBuilder);
}
}
}
}
}
return metadataBuilder.populate(mediaItem.mediaMetadata).build();
}
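`getCombinedMediaMetadata` above folds the `Metadata` entries of every selected track into a builder and then calls `populate` with the `MediaItem`'s own metadata, which (as I understand `MediaMetadata.Builder.populate`) overrides any field the item metadata sets. A simplified sketch of that precedence rule, with plain maps standing in for `MediaMetadata` (types and names here are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the precedence in getCombinedMediaMetadata: start from the selected
// tracks' format metadata, then overlay the MediaItem's own metadata, which wins
// for any field it sets (the effect of Builder.populate).
public class CombinedMetadata {
  public static Map<String, String> combine(
      List<Map<String, String>> selectedFormatMetadata, Map<String, String> itemMetadata) {
    Map<String, String> combined = new LinkedHashMap<>();
    for (Map<String, String> formatMetadata : selectedFormatMetadata) {
      combined.putAll(formatMetadata);
    }
    combined.putAll(itemMetadata); // MediaItem metadata overrides format metadata.
    return combined;
  }
}
```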
}
/** Data describing the properties of a period inside a {@link MediaItemData}. */
@@ -2232,7 +2133,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
placeholderPlaylist.add(getPlaceholderMediaItemData(mediaItems.get(i)));
}
return getStateWithNewPlaylistAndPosition(
state, placeholderPlaylist, startIndex, startPositionMs, window);
state, placeholderPlaylist, startIndex, startPositionMs);
});
}
@@ -2242,7 +2143,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
checkArgument(index >= 0);
// Use a local copy to ensure the lambda below uses the current state value.
State state = this.state;
int playlistSize = state.timeline.getWindowCount();
int playlistSize = state.playlist.size();
if (!shouldHandleCommand(Player.COMMAND_CHANGE_MEDIA_ITEMS) || mediaItems.isEmpty()) {
return;
}
@@ -2250,22 +2151,20 @@ public abstract class SimpleBasePlayer extends BasePlayer {
updateStateForPendingOperation(
/* pendingOperation= */ handleAddMediaItems(correctedIndex, mediaItems),
/* placeholderStateSupplier= */ () -> {
List<MediaItemData> placeholderPlaylist =
buildMutablePlaylistFromState(state, period, window);
ArrayList<MediaItemData> placeholderPlaylist = new ArrayList<>(state.playlist);
for (int i = 0; i < mediaItems.size(); i++) {
placeholderPlaylist.add(
i + correctedIndex, getPlaceholderMediaItemData(mediaItems.get(i)));
}
if (!state.timeline.isEmpty()) {
return getStateWithNewPlaylist(state, placeholderPlaylist, period, window);
if (!state.playlist.isEmpty()) {
return getStateWithNewPlaylist(state, placeholderPlaylist, period);
} else {
// Handle initial position update when these are the first items added to the playlist.
return getStateWithNewPlaylistAndPosition(
state,
placeholderPlaylist,
state.currentMediaItemIndex,
state.contentPositionMsSupplier.get(),
window);
state.contentPositionMsSupplier.get());
}
});
}
@@ -2276,14 +2175,14 @@ public abstract class SimpleBasePlayer extends BasePlayer {
checkArgument(fromIndex >= 0 && toIndex >= fromIndex && newIndex >= 0);
// Use a local copy to ensure the lambda below uses the current state value.
State state = this.state;
int playlistSize = state.timeline.getWindowCount();
int playlistSize = state.playlist.size();
if (!shouldHandleCommand(Player.COMMAND_CHANGE_MEDIA_ITEMS)
|| playlistSize == 0
|| fromIndex >= playlistSize) {
return;
}
int correctedToIndex = min(toIndex, playlistSize);
int correctedNewIndex = min(newIndex, playlistSize - (correctedToIndex - fromIndex));
int correctedNewIndex = min(newIndex, state.playlist.size() - (correctedToIndex - fromIndex));
if (fromIndex == correctedToIndex || correctedNewIndex == fromIndex) {
return;
}
@@ -2291,10 +2190,9 @@ public abstract class SimpleBasePlayer extends BasePlayer {
/* pendingOperation= */ handleMoveMediaItems(
fromIndex, correctedToIndex, correctedNewIndex),
/* placeholderStateSupplier= */ () -> {
List<MediaItemData> placeholderPlaylist =
buildMutablePlaylistFromState(state, period, window);
ArrayList<MediaItemData> placeholderPlaylist = new ArrayList<>(state.playlist);
Util.moveItems(placeholderPlaylist, fromIndex, correctedToIndex, correctedNewIndex);
return getStateWithNewPlaylist(state, placeholderPlaylist, period, window);
return getStateWithNewPlaylist(state, placeholderPlaylist, period);
});
}
@@ -2303,7 +2201,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
verifyApplicationThreadAndInitState();
checkArgument(fromIndex >= 0 && fromIndex <= toIndex);
State state = this.state;
int playlistSize = state.timeline.getWindowCount();
int playlistSize = state.playlist.size();
if (!shouldHandleCommand(Player.COMMAND_CHANGE_MEDIA_ITEMS) || fromIndex > playlistSize) {
return;
}
@@ -2311,15 +2209,14 @@ public abstract class SimpleBasePlayer extends BasePlayer {
updateStateForPendingOperation(
/* pendingOperation= */ handleReplaceMediaItems(fromIndex, correctedToIndex, mediaItems),
/* placeholderStateSupplier= */ () -> {
List<MediaItemData> placeholderPlaylist =
buildMutablePlaylistFromState(state, period, window);
ArrayList<MediaItemData> placeholderPlaylist = new ArrayList<>(state.playlist);
for (int i = 0; i < mediaItems.size(); i++) {
placeholderPlaylist.add(
i + correctedToIndex, getPlaceholderMediaItemData(mediaItems.get(i)));
}
State updatedState;
if (!state.timeline.isEmpty()) {
updatedState = getStateWithNewPlaylist(state, placeholderPlaylist, period, window);
if (!state.playlist.isEmpty()) {
updatedState = getStateWithNewPlaylist(state, placeholderPlaylist, period);
} else {
// Handle initial position update when these are the first items added to the playlist.
updatedState =
@@ -2327,12 +2224,11 @@ public abstract class SimpleBasePlayer extends BasePlayer {
state,
placeholderPlaylist,
state.currentMediaItemIndex,
state.contentPositionMsSupplier.get(),
window);
state.contentPositionMsSupplier.get());
}
if (fromIndex < correctedToIndex) {
Util.removeRange(placeholderPlaylist, fromIndex, correctedToIndex);
return getStateWithNewPlaylist(updatedState, placeholderPlaylist, period, window);
return getStateWithNewPlaylist(updatedState, placeholderPlaylist, period);
} else {
return updatedState;
}
@@ -2345,7 +2241,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
checkArgument(fromIndex >= 0 && toIndex >= fromIndex);
// Use a local copy to ensure the lambda below uses the current state value.
State state = this.state;
int playlistSize = state.timeline.getWindowCount();
int playlistSize = state.playlist.size();
if (!shouldHandleCommand(Player.COMMAND_CHANGE_MEDIA_ITEMS)
|| playlistSize == 0
|| fromIndex >= playlistSize) {
@@ -2358,10 +2254,9 @@ public abstract class SimpleBasePlayer extends BasePlayer {
updateStateForPendingOperation(
/* pendingOperation= */ handleRemoveMediaItems(fromIndex, correctedToIndex),
/* placeholderStateSupplier= */ () -> {
List<MediaItemData> placeholderPlaylist =
buildMutablePlaylistFromState(state, period, window);
ArrayList<MediaItemData> placeholderPlaylist = new ArrayList<>(state.playlist);
Util.removeRange(placeholderPlaylist, fromIndex, correctedToIndex);
return getStateWithNewPlaylist(state, placeholderPlaylist, period, window);
return getStateWithNewPlaylist(state, placeholderPlaylist, period);
});
}
@@ -2466,14 +2361,14 @@ public abstract class SimpleBasePlayer extends BasePlayer {
boolean ignoreSeekForPlaceholderState =
mediaItemIndex == C.INDEX_UNSET
|| isPlayingAd()
|| (!state.timeline.isEmpty() && mediaItemIndex >= state.timeline.getWindowCount());
|| (!state.playlist.isEmpty() && mediaItemIndex >= state.playlist.size());
updateStateForPendingOperation(
/* pendingOperation= */ handleSeek(mediaItemIndex, positionMs, seekCommand),
/* placeholderStateSupplier= */ () ->
ignoreSeekForPlaceholderState
? state
: getStateWithNewPlaylistAndPosition(
state, /* newPlaylist= */ null, mediaItemIndex, positionMs, window),
state, state.playlist, mediaItemIndex, positionMs),
/* forceSeekDiscontinuity= */ !ignoreSeekForPlaceholderState,
isRepeatingCurrentItem);
}
@@ -2532,7 +2427,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
.setPlaybackState(Player.STATE_IDLE)
.setTotalBufferedDurationMs(PositionSupplier.ZERO)
.setContentBufferedPositionMs(
PositionSupplier.getConstant(getContentPositionMsInternal(state, window)))
PositionSupplier.getConstant(getContentPositionMsInternal(state)))
.setAdBufferedPositionMs(state.adPositionMsSupplier)
.setIsLoading(false)
.build());
@@ -2557,7 +2452,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
.setPlaybackState(Player.STATE_IDLE)
.setTotalBufferedDurationMs(PositionSupplier.ZERO)
.setContentBufferedPositionMs(
PositionSupplier.getConstant(getContentPositionMsInternal(state, window)))
PositionSupplier.getConstant(getContentPositionMsInternal(state)))
.setAdBufferedPositionMs(state.adPositionMsSupplier)
.setIsLoading(false)
.build();
@@ -2566,7 +2461,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
@Override
public final Tracks getCurrentTracks() {
verifyApplicationThreadAndInitState();
return state.currentTracks;
return getCurrentTracksInternal(state);
}
@Override
@@ -2592,7 +2487,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
@Override
public final MediaMetadata getMediaMetadata() {
verifyApplicationThreadAndInitState();
return state.currentMetadata;
return getMediaMetadataInternal(state);
}
@Override
@@ -2686,15 +2581,13 @@ public abstract class SimpleBasePlayer extends BasePlayer {
@Override
public final long getContentPosition() {
verifyApplicationThreadAndInitState();
return getContentPositionMsInternal(state, window);
return getContentPositionMsInternal(state);
}
@Override
public final long getContentBufferedPosition() {
verifyApplicationThreadAndInitState();
return max(
getContentBufferedPositionMsInternal(state, window),
getContentPositionMsInternal(state, window));
return max(getContentBufferedPositionMsInternal(state), getContentPositionMsInternal(state));
}
@Override
@@ -3291,7 +3184,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
* Player#setDeviceMuted(boolean, int)}.
*
* <p>Will only be called if {@link Player#COMMAND_ADJUST_DEVICE_VOLUME} or {@link
* Player#COMMAND_ADJUST_DEVICE_VOLUME_WITH_FLAGS} is available.
* Player#COMMAND_ADJUST_DEVICE_VOLUME} is available.
*
* @param muted Whether the device was requested to be muted.
* @param flags Either 0 or a bitwise combination of one or more {@link C.VolumeFlags}.
@@ -3517,6 +3410,10 @@ public abstract class SimpleBasePlayer extends BasePlayer {
boolean playWhenReadyChanged = previousState.playWhenReady != newState.playWhenReady;
boolean playbackStateChanged = previousState.playbackState != newState.playbackState;
Tracks previousTracks = getCurrentTracksInternal(previousState);
Tracks newTracks = getCurrentTracksInternal(newState);
MediaMetadata previousMediaMetadata = getMediaMetadataInternal(previousState);
MediaMetadata newMediaMetadata = getMediaMetadataInternal(newState);
int positionDiscontinuityReason =
getPositionDiscontinuityReason(
previousState, newState, forceSeekDiscontinuity, window, period);
@@ -3527,8 +3424,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
if (timelineChanged) {
@Player.TimelineChangeReason
int timelineChangeReason =
getTimelineChangeReason(previousState.timeline, newState.timeline, window);
int timelineChangeReason = getTimelineChangeReason(previousState.playlist, newState.playlist);
listeners.queueEvent(
Player.EVENT_TIMELINE_CHANGED,
listener -> listener.onTimelineChanged(newState.timeline, timelineChangeReason));
@@ -3555,8 +3451,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
MediaItem mediaItem =
newState.timeline.isEmpty()
? null
: newState.timeline.getWindow(getCurrentMediaItemIndexInternal(newState), window)
.mediaItem;
: newState.playlist.get(getCurrentMediaItemIndexInternal(newState)).mediaItem;
listeners.queueEvent(
Player.EVENT_MEDIA_ITEM_TRANSITION,
listener -> listener.onMediaItemTransition(mediaItem, mediaItemTransitionReason));
@@ -3577,15 +3472,14 @@ public abstract class SimpleBasePlayer extends BasePlayer {
listener ->
listener.onTrackSelectionParametersChanged(newState.trackSelectionParameters));
}
if (!previousState.currentTracks.equals(newState.currentTracks)) {
if (!previousTracks.equals(newTracks)) {
listeners.queueEvent(
Player.EVENT_TRACKS_CHANGED,
listener -> listener.onTracksChanged(newState.currentTracks));
Player.EVENT_TRACKS_CHANGED, listener -> listener.onTracksChanged(newTracks));
}
if (!previousState.currentMetadata.equals(newState.currentMetadata)) {
if (!previousMediaMetadata.equals(newMediaMetadata)) {
listeners.queueEvent(
EVENT_MEDIA_METADATA_CHANGED,
listener -> listener.onMediaMetadataChanged(newState.currentMetadata));
listener -> listener.onMediaMetadataChanged(newMediaMetadata));
}
if (previousState.isLoading != newState.isLoading) {
listeners.queueEvent(
@@ -3781,6 +3675,18 @@ public abstract class SimpleBasePlayer extends BasePlayer {
&& state.playbackSuppressionReason == PLAYBACK_SUPPRESSION_REASON_NONE;
}
private static Tracks getCurrentTracksInternal(State state) {
return state.playlist.isEmpty()
? Tracks.EMPTY
: state.playlist.get(getCurrentMediaItemIndexInternal(state)).tracks;
}
private static MediaMetadata getMediaMetadataInternal(State state) {
return state.playlist.isEmpty()
? MediaMetadata.EMPTY
: state.playlist.get(getCurrentMediaItemIndexInternal(state)).combinedMediaMetadata;
}
private static int getCurrentMediaItemIndexInternal(State state) {
if (state.currentMediaItemIndex != C.INDEX_UNSET) {
return state.currentMediaItemIndex;
@@ -3788,27 +3694,22 @@ public abstract class SimpleBasePlayer extends BasePlayer {
return 0; // TODO: Use shuffle order to get first item if playlist is not empty.
}
private static long getContentPositionMsInternal(State state, Timeline.Window window) {
return getPositionOrDefaultInMediaItem(state.contentPositionMsSupplier.get(), state, window);
private static long getContentPositionMsInternal(State state) {
return getPositionOrDefaultInMediaItem(state.contentPositionMsSupplier.get(), state);
}
private static long getContentBufferedPositionMsInternal(State state, Timeline.Window window) {
return getPositionOrDefaultInMediaItem(
state.contentBufferedPositionMsSupplier.get(), state, window);
private static long getContentBufferedPositionMsInternal(State state) {
return getPositionOrDefaultInMediaItem(state.contentBufferedPositionMsSupplier.get(), state);
}
private static long getPositionOrDefaultInMediaItem(
long positionMs, State state, Timeline.Window window) {
private static long getPositionOrDefaultInMediaItem(long positionMs, State state) {
if (positionMs != C.TIME_UNSET) {
return positionMs;
}
if (state.timeline.isEmpty()) {
if (state.playlist.isEmpty()) {
return 0;
}
return state
.timeline
.getWindow(getCurrentMediaItemIndexInternal(state), window)
.getDefaultPositionMs();
return usToMs(state.playlist.get(getCurrentMediaItemIndexInternal(state)).defaultPositionUs);
}
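`getPositionOrDefaultInMediaItem` above resolves an unset position to the current item's default position, or 0 when the playlist is empty. A self-contained sketch of that fallback chain (the `TIME_UNSET` sentinel value mirrors what I believe `C.TIME_UNSET` is):

```java
// Sketch of getPositionOrDefaultInMediaItem: an unset position falls back to
// the current item's default position, or 0 for an empty playlist.
public class PositionDefaults {
  static final long TIME_UNSET = Long.MIN_VALUE + 1; // Assumed value of C.TIME_UNSET.

  public static long positionOrDefault(
      long positionMs, long[] defaultPositionsMs, int currentIndex) {
    if (positionMs != TIME_UNSET) {
      return positionMs;
    }
    if (defaultPositionsMs.length == 0) {
      return 0;
    }
    return defaultPositionsMs[currentIndex];
  }
}
```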
private static int getCurrentPeriodIndexInternal(
@@ -3818,11 +3719,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
return currentMediaItemIndex;
}
return getPeriodIndexFromWindowPosition(
state.timeline,
currentMediaItemIndex,
getContentPositionMsInternal(state, window),
window,
period);
state.timeline, currentMediaItemIndex, getContentPositionMsInternal(state), window, period);
}
private static int getPeriodIndexFromWindowPosition(
@@ -3837,13 +3734,13 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
private static @Player.TimelineChangeReason int getTimelineChangeReason(
Timeline previousTimeline, Timeline newTimeline, Timeline.Window window) {
if (previousTimeline.getWindowCount() != newTimeline.getWindowCount()) {
List<MediaItemData> previousPlaylist, List<MediaItemData> newPlaylist) {
if (previousPlaylist.size() != newPlaylist.size()) {
return Player.TIMELINE_CHANGE_REASON_PLAYLIST_CHANGED;
}
for (int i = 0; i < previousTimeline.getWindowCount(); i++) {
Object previousUid = previousTimeline.getWindow(/* windowIndex= */ i, window).uid;
Object newUid = newTimeline.getWindow(/* windowIndex= */ i, window).uid;
for (int i = 0; i < previousPlaylist.size(); i++) {
Object previousUid = previousPlaylist.get(i).uid;
Object newUid = newPlaylist.get(i).uid;
boolean resolvedAutoGeneratedPlaceholder =
previousUid instanceof PlaceholderUid && !(newUid instanceof PlaceholderUid);
if (!previousUid.equals(newUid) && !resolvedAutoGeneratedPlaceholder) {
@@ -3866,11 +3763,11 @@ public abstract class SimpleBasePlayer extends BasePlayer {
if (forceSeekDiscontinuity) {
return Player.DISCONTINUITY_REASON_SEEK;
}
if (previousState.timeline.isEmpty()) {
if (previousState.playlist.isEmpty()) {
// First change from an empty playlist is not reported as a discontinuity.
return C.INDEX_UNSET;
}
if (newState.timeline.isEmpty()) {
if (newState.playlist.isEmpty()) {
// The playlist became empty.
return Player.DISCONTINUITY_REASON_REMOVE;
}
@@ -3893,7 +3790,7 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
// Check if reached the previous period's or ad's duration to assume an auto-transition.
long previousPositionMs =
getCurrentPeriodOrAdPositionMs(previousState, previousPeriodUid, period, window);
getCurrentPeriodOrAdPositionMs(previousState, previousPeriodUid, period);
long previousDurationMs = getPeriodOrAdDurationMs(previousState, previousPeriodUid, period);
return previousDurationMs != C.TIME_UNSET && previousPositionMs >= previousDurationMs
? Player.DISCONTINUITY_REASON_AUTO_TRANSITION
@@ -3902,8 +3799,8 @@ public abstract class SimpleBasePlayer extends BasePlayer {
// We are in the same content period or ad. Check if the position deviates more than a
// reasonable threshold from the previous one.
long previousPositionMs =
getCurrentPeriodOrAdPositionMs(previousState, previousPeriodUid, period, window);
long newPositionMs = getCurrentPeriodOrAdPositionMs(newState, newPeriodUid, period, window);
getCurrentPeriodOrAdPositionMs(previousState, previousPeriodUid, period);
long newPositionMs = getCurrentPeriodOrAdPositionMs(newState, newPeriodUid, period);
if (Math.abs(previousPositionMs - newPositionMs) < POSITION_DISCONTINUITY_THRESHOLD_MS) {
return C.INDEX_UNSET;
}
@@ -3915,10 +3812,10 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
private static long getCurrentPeriodOrAdPositionMs(
State state, Object currentPeriodUid, Timeline.Period period, Timeline.Window window) {
State state, Object currentPeriodUid, Timeline.Period period) {
return state.currentAdGroupIndex != C.INDEX_UNSET
? state.adPositionMsSupplier.get()
: getContentPositionMsInternal(state, window)
: getContentPositionMsInternal(state)
- state.timeline.getPeriodByUid(currentPeriodUid, period).getPositionInWindowMs();
}
@@ -3955,9 +3852,9 @@ public abstract class SimpleBasePlayer extends BasePlayer {
contentPositionMs =
state.currentAdGroupIndex == C.INDEX_UNSET
? positionMs
: getContentPositionMsInternal(state, window);
: getContentPositionMsInternal(state);
} else {
contentPositionMs = getContentPositionMsInternal(state, window);
contentPositionMs = getContentPositionMsInternal(state);
positionMs =
state.currentAdGroupIndex != C.INDEX_UNSET
? state.adPositionMsSupplier.get()
@@ -4008,14 +3905,9 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
// Only mark changes within the current item as a transition if we are repeating automatically
// or via a seek to next/previous.
if (positionDiscontinuityReason == DISCONTINUITY_REASON_AUTO_TRANSITION) {
if ((getContentPositionMsInternal(previousState, window)
> getContentPositionMsInternal(newState, window))
|| (newState.hasPositionDiscontinuity
&& newState.discontinuityPositionMs == C.TIME_UNSET
&& isRepeatingCurrentItem)) {
return MEDIA_ITEM_TRANSITION_REASON_REPEAT;
}
if (positionDiscontinuityReason == DISCONTINUITY_REASON_AUTO_TRANSITION
&& getContentPositionMsInternal(previousState) > getContentPositionMsInternal(newState)) {
return MEDIA_ITEM_TRANSITION_REASON_REPEAT;
}
if (positionDiscontinuityReason == DISCONTINUITY_REASON_SEEK && isRepeatingCurrentItem) {
return MEDIA_ITEM_TRANSITION_REASON_SEEK;
@@ -4032,42 +3924,38 @@ public abstract class SimpleBasePlayer extends BasePlayer {
}
private static int getMediaItemIndexInNewPlaylist(
Timeline oldTimeline,
Timeline newTimeline,
List<MediaItemData> oldPlaylist,
Timeline newPlaylistTimeline,
int oldMediaItemIndex,
Timeline.Period period,
Timeline.Window window) {
if (oldTimeline.isEmpty()) {
return oldMediaItemIndex < newTimeline.getWindowCount() ? oldMediaItemIndex : C.INDEX_UNSET;
Timeline.Period period) {
if (oldPlaylist.isEmpty()) {
return oldMediaItemIndex < newPlaylistTimeline.getWindowCount()
? oldMediaItemIndex
: C.INDEX_UNSET;
}
int oldFirstPeriodIndex = oldTimeline.getWindow(oldMediaItemIndex, window).firstPeriodIndex;
Object oldFirstPeriodUid =
checkNotNull(oldTimeline.getPeriod(oldFirstPeriodIndex, period, /* setIds= */ true).uid);
if (newTimeline.getIndexOfPeriod(oldFirstPeriodUid) == C.INDEX_UNSET) {
oldPlaylist.get(oldMediaItemIndex).getPeriodUid(/* periodIndexInMediaItem= */ 0);
if (newPlaylistTimeline.getIndexOfPeriod(oldFirstPeriodUid) == C.INDEX_UNSET) {
return C.INDEX_UNSET;
}
return newTimeline.getPeriodByUid(oldFirstPeriodUid, period).windowIndex;
return newPlaylistTimeline.getPeriodByUid(oldFirstPeriodUid, period).windowIndex;
}
private static State getStateWithNewPlaylist(
State oldState,
List<MediaItemData> newPlaylist,
Timeline.Period period,
Timeline.Window window) {
State oldState, List<MediaItemData> newPlaylist, Timeline.Period period) {
State.Builder stateBuilder = oldState.buildUpon();
Timeline newTimeline = new PlaylistTimeline(newPlaylist);
Timeline oldTimeline = oldState.timeline;
stateBuilder.setPlaylist(newPlaylist);
Timeline newTimeline = stateBuilder.timeline;
long oldPositionMs = oldState.contentPositionMsSupplier.get();
int oldIndex = getCurrentMediaItemIndexInternal(oldState);
int newIndex =
getMediaItemIndexInNewPlaylist(oldTimeline, newTimeline, oldIndex, period, window);
int newIndex = getMediaItemIndexInNewPlaylist(oldState.playlist, newTimeline, oldIndex, period);
long newPositionMs = newIndex == C.INDEX_UNSET ? C.TIME_UNSET : oldPositionMs;
// If the current item no longer exists, try to find a matching subsequent item.
for (int i = oldIndex + 1; newIndex == C.INDEX_UNSET && i < oldTimeline.getWindowCount(); i++) {
for (int i = oldIndex + 1; newIndex == C.INDEX_UNSET && i < oldState.playlist.size(); i++) {
// TODO: Use shuffle order to iterate.
newIndex =
getMediaItemIndexInNewPlaylist(
oldTimeline, newTimeline, /* oldMediaItemIndex= */ i, period, window);
oldState.playlist, newTimeline, /* oldMediaItemIndex= */ i, period);
}
// If this fails, transition to ENDED state.
if (oldState.playbackState != Player.STATE_IDLE && newIndex == C.INDEX_UNSET) {
@@ -4077,25 +3965,18 @@ public abstract class SimpleBasePlayer extends BasePlayer {
stateBuilder,
oldState,
oldPositionMs,
newTimeline,
newPlaylist,
newIndex,
newPositionMs,
/* keepAds= */ true,
window);
/* keepAds= */ true);
}
private static State getStateWithNewPlaylistAndPosition(
State oldState,
@Nullable List<MediaItemData> newPlaylist,
int newIndex,
long newPositionMs,
Timeline.Window window) {
State oldState, List<MediaItemData> newPlaylist, int newIndex, long newPositionMs) {
State.Builder stateBuilder = oldState.buildUpon();
Timeline newTimeline =
newPlaylist == null ? oldState.timeline : new PlaylistTimeline(newPlaylist);
stateBuilder.setPlaylist(newPlaylist);
if (oldState.playbackState != Player.STATE_IDLE) {
if (newTimeline.isEmpty()
|| (newIndex != C.INDEX_UNSET && newIndex >= newTimeline.getWindowCount())) {
if (newPlaylist.isEmpty() || (newIndex != C.INDEX_UNSET && newIndex >= newPlaylist.size())) {
stateBuilder.setPlaybackState(Player.STATE_ENDED).setIsLoading(false);
} else {
stateBuilder.setPlaybackState(Player.STATE_BUFFERING);
@@ -4106,53 +3987,37 @@ public abstract class SimpleBasePlayer extends BasePlayer {
stateBuilder,
oldState,
oldPositionMs,
newTimeline,
newPlaylist,
newIndex,
newPositionMs,
/* keepAds= */ false,
window);
/* keepAds= */ false);
}
private static State buildStateForNewPosition(
State.Builder stateBuilder,
State oldState,
long oldPositionMs,
Timeline newTimeline,
List<MediaItemData> newPlaylist,
int newIndex,
long newPositionMs,
boolean keepAds,
Timeline.Window window) {
boolean keepAds) {
// Resolve unset or invalid index and position.
oldPositionMs = getPositionOrDefaultInMediaItem(oldPositionMs, oldState, window);
if (!newTimeline.isEmpty()
&& (newIndex == C.INDEX_UNSET || newIndex >= newTimeline.getWindowCount())) {
oldPositionMs = getPositionOrDefaultInMediaItem(oldPositionMs, oldState);
if (!newPlaylist.isEmpty() && (newIndex == C.INDEX_UNSET || newIndex >= newPlaylist.size())) {
newIndex = 0; // TODO: Use shuffle order to get first index.
newPositionMs = C.TIME_UNSET;
}
if (!newTimeline.isEmpty() && newPositionMs == C.TIME_UNSET) {
newPositionMs = newTimeline.getWindow(newIndex, window).getDefaultPositionMs();
if (!newPlaylist.isEmpty() && newPositionMs == C.TIME_UNSET) {
newPositionMs = usToMs(newPlaylist.get(newIndex).defaultPositionUs);
}
boolean oldOrNewPlaylistEmpty = oldState.timeline.isEmpty() || newTimeline.isEmpty();
boolean oldOrNewPlaylistEmpty = oldState.playlist.isEmpty() || newPlaylist.isEmpty();
boolean mediaItemChanged =
!oldOrNewPlaylistEmpty
&& !oldState
.timeline
.getWindow(getCurrentMediaItemIndexInternal(oldState), window)
.playlist
.get(getCurrentMediaItemIndexInternal(oldState))
.uid
.equals(newTimeline.getWindow(newIndex, window).uid);
// Set timeline, resolving tracks and metadata to the new index.
if (newTimeline.isEmpty()) {
stateBuilder.setPlaylist(newTimeline, Tracks.EMPTY, /* currentMetadata= */ null);
} else if (newTimeline instanceof PlaylistTimeline) {
MediaItemData mediaItemData = ((PlaylistTimeline) newTimeline).playlist.get(newIndex);
stateBuilder.setPlaylist(newTimeline, mediaItemData.tracks, mediaItemData.mediaMetadata);
} else {
boolean keepTracksAndMetadata = !oldOrNewPlaylistEmpty && !mediaItemChanged;
stateBuilder.setPlaylist(
newTimeline,
keepTracksAndMetadata ? oldState.currentTracks : Tracks.EMPTY,
keepTracksAndMetadata ? oldState.currentMetadata : null);
}
.equals(newPlaylist.get(newIndex).uid);
if (oldOrNewPlaylistEmpty || mediaItemChanged || newPositionMs < oldPositionMs) {
// New item or seeking back. Assume no buffer and no ad playback persists.
stateBuilder
@@ -4173,12 +4038,12 @@ public abstract class SimpleBasePlayer extends BasePlayer {
.setCurrentAd(C.INDEX_UNSET, C.INDEX_UNSET)
.setTotalBufferedDurationMs(
PositionSupplier.getConstant(
getContentBufferedPositionMsInternal(oldState, window) - oldPositionMs));
getContentBufferedPositionMsInternal(oldState) - oldPositionMs));
}
} else {
// Seeking forward. Assume remaining buffer in current item persist, but no ad playback.
long contentBufferedDurationMs =
max(getContentBufferedPositionMsInternal(oldState, window), newPositionMs);
max(getContentBufferedPositionMsInternal(oldState), newPositionMs);
long totalBufferedDurationMs =
max(0, oldState.totalBufferedDurationMsSupplier.get() - (newPositionMs - oldPositionMs));
stateBuilder
@@ -4191,36 +4056,5 @@ public abstract class SimpleBasePlayer extends BasePlayer {
return stateBuilder.build();
}
private static MediaMetadata getCombinedMediaMetadata(MediaItem mediaItem, Tracks tracks) {
MediaMetadata.Builder metadataBuilder = new MediaMetadata.Builder();
int trackGroupCount = tracks.getGroups().size();
for (int i = 0; i < trackGroupCount; i++) {
Tracks.Group group = tracks.getGroups().get(i);
for (int j = 0; j < group.length; j++) {
if (group.isTrackSelected(j)) {
Format format = group.getTrackFormat(j);
if (format.metadata != null) {
for (int k = 0; k < format.metadata.length(); k++) {
format.metadata.get(k).populateMediaMetadata(metadataBuilder);
}
}
}
}
}
return metadataBuilder.populate(mediaItem.mediaMetadata).build();
}
private static List<MediaItemData> buildMutablePlaylistFromState(
State state, Timeline.Period period, Timeline.Window window) {
if (state.timeline instanceof PlaylistTimeline) {
return new ArrayList<>(((PlaylistTimeline) state.timeline).playlist);
}
ArrayList<MediaItemData> items = new ArrayList<>(state.timeline.getWindowCount());
for (int i = 0; i < state.timeline.getWindowCount(); i++) {
items.add(MediaItemData.buildFromState(state, /* mediaItemIndex= */ i, period, window));
}
return items;
}
private static final class PlaceholderUid {}
}

View file

@@ -41,9 +41,6 @@ public final class SurfaceInfo {
*/
public final int orientationDegrees;
/** Whether the {@link #surface} is an encoder input surface. */
public final boolean isEncoderInputSurface;
/** Creates a new instance. */
public SurfaceInfo(Surface surface, int width, int height) {
this(surface, width, height, /* orientationDegrees= */ 0);
@@ -51,16 +48,6 @@ public final class SurfaceInfo {
/** Creates a new instance. */
public SurfaceInfo(Surface surface, int width, int height, int orientationDegrees) {
this(surface, width, height, orientationDegrees, /* isEncoderInputSurface= */ false);
}
/** Creates a new instance. */
public SurfaceInfo(
Surface surface,
int width,
int height,
int orientationDegrees,
boolean isEncoderInputSurface) {
checkArgument(
orientationDegrees == 0
|| orientationDegrees == 90
@@ -71,7 +58,6 @@ public final class SurfaceInfo {
this.width = width;
this.height = height;
this.orientationDegrees = orientationDegrees;
this.isEncoderInputSurface = isEncoderInputSurface;
}
@Override
@@ -86,7 +72,6 @@ public final class SurfaceInfo {
return width == that.width
&& height == that.height
&& orientationDegrees == that.orientationDegrees
&& isEncoderInputSurface == that.isEncoderInputSurface
&& surface.equals(that.surface);
}
@@ -96,7 +81,6 @@ public final class SurfaceInfo {
result = 31 * result + width;
result = 31 * result + height;
result = 31 * result + orientationDegrees;
result = 31 * result + (isEncoderInputSurface ? 1 : 0);
return result;
}
}

View file

@@ -575,8 +575,7 @@ public abstract class Timeline {
*/
public boolean isPlaceholder;
/** The {@link AdPlaybackState} for all ads in this period. */
@UnstableApi public AdPlaybackState adPlaybackState;
private AdPlaybackState adPlaybackState;
/** Creates a new instance with no ad playback state. */
public Period() {

View file

@@ -390,4 +390,5 @@ public final class Tracks {
: BundleCollectionUtil.fromBundleList(Group::fromBundle, groupBundles);
return new Tracks(groups);
}
;
}

View file

@@ -19,7 +19,6 @@ import static java.lang.annotation.ElementType.TYPE_USE;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.opengl.EGLExt;
import android.view.Surface;
import androidx.annotation.IntDef;
@@ -49,18 +48,12 @@ import java.util.concurrent.Executor;
public interface VideoFrameProcessor {
/**
* Specifies how the input frames are made available to the {@link VideoFrameProcessor}. One of
* {@link #INPUT_TYPE_SURFACE}, {@link #INPUT_TYPE_BITMAP}, {@link #INPUT_TYPE_TEXTURE_ID} or
* {@link #INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION}.
* {@link #INPUT_TYPE_SURFACE}, {@link #INPUT_TYPE_BITMAP} or {@link #INPUT_TYPE_TEXTURE_ID}.
*/
@Documented
@Retention(RetentionPolicy.SOURCE)
@Target(TYPE_USE)
@IntDef({
INPUT_TYPE_SURFACE,
INPUT_TYPE_BITMAP,
INPUT_TYPE_TEXTURE_ID,
INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION,
})
@IntDef({INPUT_TYPE_SURFACE, INPUT_TYPE_BITMAP, INPUT_TYPE_TEXTURE_ID})
@interface InputType {}
/**
@@ -80,16 +73,6 @@ public interface VideoFrameProcessor {
*/
int INPUT_TYPE_TEXTURE_ID = 3;
/**
* Input frames come from the {@linkplain #getInputSurface input surface} and don't need to be
* {@linkplain #registerInputFrame registered} (unlike with {@link #INPUT_TYPE_SURFACE}).
*
* <p>Every frame must use the {@linkplain #registerInputStream(int, List, FrameInfo) input
* stream's registered} frame info. Also sets the surface's {@linkplain
* android.graphics.SurfaceTexture#setDefaultBufferSize(int, int) default buffer size}.
*/
int INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION = 4;
/** A factory for {@link VideoFrameProcessor} instances. */
interface Factory {
@@ -143,8 +126,8 @@ public interface VideoFrameProcessor {
* @param effects The list of {@link Effect effects} to apply to the new input stream.
* @param frameInfo The {@link FrameInfo} of the new input stream.
*/
default void onInputStreamRegistered(
@InputType int inputType, List<Effect> effects, FrameInfo frameInfo) {}
void onInputStreamRegistered(
@InputType int inputType, List<Effect> effects, FrameInfo frameInfo);
/**
* Called when the output size changes.
@@ -155,7 +138,7 @@ public interface VideoFrameProcessor {
* <p>The output size may differ from the size specified using {@link
* #setOutputSurfaceInfo(SurfaceInfo)}.
*/
default void onOutputSizeChanged(int width, int height) {}
void onOutputSizeChanged(int width, int height);
/**
* Called when an output frame with the given {@code presentationTimeUs} becomes available for
@@ -163,7 +146,7 @@ public interface VideoFrameProcessor {
*
* @param presentationTimeUs The presentation time of the frame, in microseconds.
*/
default void onOutputFrameAvailableForRendering(long presentationTimeUs) {}
void onOutputFrameAvailableForRendering(long presentationTimeUs);
/**
* Called when an exception occurs during asynchronous video frame processing.
@@ -171,10 +154,10 @@ public interface VideoFrameProcessor {
* <p>If this is called, the calling {@link VideoFrameProcessor} must immediately be {@linkplain
* VideoFrameProcessor#release() released}.
*/
default void onError(VideoFrameProcessingException exception) {}
void onError(VideoFrameProcessingException exception);
/** Called after the {@link VideoFrameProcessor} has rendered its final output frame. */
default void onEnded() {}
void onEnded();
}
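One side of the hunks above declares the listener callbacks as `default` no-ops while the other makes them abstract. The default-method pattern lets an implementer override only the callbacks it cares about; a minimal standalone sketch (hypothetical two-method listener, not the real Media3 interface):

```java
import java.util.concurrent.atomic.AtomicInteger;

/** Sketch of the default-method listener pattern in the diff above. */
public class ListenerDemo {
  interface Listener {
    // No-op defaults: implementers only override what they need.
    default void onOutputSizeChanged(int width, int height) {}

    default void onEnded() {}
  }

  public static void main(String[] args) {
    AtomicInteger endedCalls = new AtomicInteger();
    // Only onEnded is overridden; onOutputSizeChanged falls back to the no-op default.
    Listener listener =
        new Listener() {
          @Override
          public void onEnded() {
            endedCalls.incrementAndGet();
          }
        };
    listener.onOutputSizeChanged(1920, 1080); // no-op, does not throw
    listener.onEnded();
    System.out.println(endedCalls.get()); // 1
  }
}
```

With abstract methods instead, every implementer would be forced to provide all five callback bodies, which is why converting between the two forms is a source-compatibility change for subclasses.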
/**
@ -186,13 +169,6 @@ public interface VideoFrameProcessor {
/** Indicates the frame should be dropped after {@link #renderOutputFrame(long)} is invoked. */
long DROP_OUTPUT_FRAME = -2;
/**
* Indicates the frame should preserve the input presentation time when {@link
* #renderOutputFrame(long)} is invoked.
*/
@SuppressWarnings("GoodTime-ApiWithNumericTimeUnit") // This is a named constant, not a time unit.
long RENDER_OUTPUT_FRAME_WITH_PRESENTATION_TIME = -3;
/**
* Provides an input {@link Bitmap} to the {@link VideoFrameProcessor}.
*
@@ -230,14 +206,6 @@ public interface VideoFrameProcessor {
*/
void setOnInputFrameProcessedListener(OnInputFrameProcessedListener listener);
/**
* Sets a listener that's called when the {@linkplain #getInputSurface() input surface} is ready
* to use.
*/
void setOnInputSurfaceReadyListener(Runnable listener);
// TODO: b/351776002 - Call setDefaultBufferSize on the INPUT_TYPE_SURFACE path too and remove
// mentions of the method (which leak an implementation detail) throughout this file.
/**
* Returns the input {@link Surface}, where {@link VideoFrameProcessor} consumes input frames
* from.
@@ -246,16 +214,6 @@ public interface VideoFrameProcessor {
* VideoFrameProcessor} until {@link #registerInputStream} is called with {@link
* #INPUT_TYPE_SURFACE}.
*
* <p>For streams with {@link #INPUT_TYPE_SURFACE}, the returned surface is ready to use
* immediately and will not have a {@linkplain SurfaceTexture#setDefaultBufferSize(int, int)
* default buffer size} set on it. This is suitable for configuring a {@link
* android.media.MediaCodec} decoder.
*
* <p>For streams with {@link #INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION}, set a listener
* for the surface becoming ready via {@link #setOnInputSurfaceReadyListener(Runnable)} and wait
* for the event before using the returned surface. This is suitable for use with non-decoder
* producers like media projection.
*
* @throws UnsupportedOperationException If the {@code VideoFrameProcessor} does not accept
* {@linkplain #INPUT_TYPE_SURFACE surface input}.
*/
@@ -340,10 +298,7 @@ public interface VideoFrameProcessor {
*
* @param renderTimeNs The render time to use for the frame, in nanoseconds. The render time can
* be before or after the current system time. Use {@link #DROP_OUTPUT_FRAME} to drop the
* frame, or {@link #RENDER_OUTPUT_FRAME_IMMEDIATELY} to render the frame immediately, or
* {@link #RENDER_OUTPUT_FRAME_WITH_PRESENTATION_TIME} to render the frame to the {@linkplain
* #setOutputSurfaceInfo output surface} with the presentation timestamp seen in {@link
* Listener#onOutputFrameAvailableForRendering(long)}.
* frame, or {@link #RENDER_OUTPUT_FRAME_IMMEDIATELY} to render the frame immediately.
*/
void renderOutputFrame(long renderTimeNs);

View file

@@ -20,7 +20,7 @@ import androidx.annotation.IntRange;
import androidx.annotation.Nullable;
import androidx.media3.common.util.UnstableApi;
/** Represents a graph for processing raw video frames. */
/** Represents a graph for processing decoded video frames. */
@UnstableApi
public interface VideoGraph {
@@ -33,7 +33,7 @@ public interface VideoGraph {
* @param width The new output width in pixels.
* @param height The new output width in pixels.
*/
default void onOutputSizeChanged(int width, int height) {}
void onOutputSizeChanged(int width, int height);
/**
* Called when an output frame with the given {@code framePresentationTimeUs} becomes available
@@ -41,14 +41,14 @@ public interface VideoGraph {
*
* @param framePresentationTimeUs The presentation time of the frame, in microseconds.
*/
default void onOutputFrameAvailableForRendering(long framePresentationTimeUs) {}
void onOutputFrameAvailableForRendering(long framePresentationTimeUs);
/**
* Called after the {@link VideoGraph} has rendered its final output frame.
*
* @param finalFramePresentationTimeUs The timestamp of the last output frame, in microseconds.
*/
default void onEnded(long finalFramePresentationTimeUs) {}
void onEnded(long finalFramePresentationTimeUs);
/**
* Called when an exception occurs during video frame processing.
@@ -56,7 +56,7 @@ public interface VideoGraph {
* <p>If this is called, the calling {@link VideoGraph} must immediately be {@linkplain
* #release() released}.
*/
default void onError(VideoFrameProcessingException exception) {}
void onError(VideoFrameProcessingException exception);
}
/**

View file

@@ -27,6 +27,7 @@ public final class VideoSize {
private static final int DEFAULT_WIDTH = 0;
private static final int DEFAULT_HEIGHT = 0;
private static final int DEFAULT_UNAPPLIED_ROTATION_DEGREES = 0;
private static final float DEFAULT_PIXEL_WIDTH_HEIGHT_RATIO = 1F;
public static final VideoSize UNKNOWN = new VideoSize(DEFAULT_WIDTH, DEFAULT_HEIGHT);
@@ -40,10 +41,19 @@ public final class VideoSize {
public final int height;
/**
* @deprecated Rotation is handled internally by the player, so this is always zero.
* Clockwise rotation in degrees that the application should apply for the video for it to be
* rendered in the correct orientation.
*
* <p>Is 0 if unknown or if no rotation is needed.
*
* <p>Player should apply video rotation internally, in which case unappliedRotationDegrees is 0.
* But when a player can't apply the rotation, for example before API level 21, the unapplied
* rotation is reported by this field for application to handle.
*
* <p>Applications that use {@link android.view.TextureView} can apply the rotation by calling
* {@link android.view.TextureView#setTransform}.
*/
@IntRange(from = 0, to = 359)
@Deprecated
public final int unappliedRotationDegrees;
/**
@@ -63,7 +73,7 @@ public final class VideoSize {
*/
@UnstableApi
public VideoSize(@IntRange(from = 0) int width, @IntRange(from = 0) int height) {
this(width, height, DEFAULT_PIXEL_WIDTH_HEIGHT_RATIO);
this(width, height, DEFAULT_UNAPPLIED_ROTATION_DEGREES, DEFAULT_PIXEL_WIDTH_HEIGHT_RATIO);
}
/**
@@ -71,34 +81,23 @@ public final class VideoSize {
*
* @param width The video width in pixels.
* @param height The video height in pixels.
* @param unappliedRotationDegrees Clockwise rotation in degrees that the application should apply
* for the video for it to be rendered in the correct orientation. See {@link
* #unappliedRotationDegrees}.
* @param pixelWidthHeightRatio The width to height ratio of each pixel. For the normal case of
* square pixels this will be equal to 1.0. Different values are indicative of anamorphic
* content.
*/
@SuppressWarnings("deprecation") // Setting deprecated field
@UnstableApi
public VideoSize(
@IntRange(from = 0) int width,
@IntRange(from = 0) int height,
@FloatRange(from = 0, fromInclusive = false) float pixelWidthHeightRatio) {
this.width = width;
this.height = height;
this.unappliedRotationDegrees = 0;
this.pixelWidthHeightRatio = pixelWidthHeightRatio;
}
/**
* @deprecated Use {@link VideoSize#VideoSize(int, int, float)} instead. {@code
* unappliedRotationDegrees} is not needed on API 21+ and is always zero.
*/
@Deprecated
@UnstableApi
public VideoSize(
@IntRange(from = 0) int width,
@IntRange(from = 0) int height,
@IntRange(from = 0, to = 359) int unappliedRotationDegrees,
@FloatRange(from = 0, fromInclusive = false) float pixelWidthHeightRatio) {
this(width, height, pixelWidthHeightRatio);
this.width = width;
this.height = height;
this.unappliedRotationDegrees = unappliedRotationDegrees;
this.pixelWidthHeightRatio = pixelWidthHeightRatio;
}
@Override
@@ -110,6 +109,7 @@ public final class VideoSize {
VideoSize other = (VideoSize) obj;
return width == other.width
&& height == other.height
&& unappliedRotationDegrees == other.unappliedRotationDegrees
&& pixelWidthHeightRatio == other.pixelWidthHeightRatio;
}
return false;
@@ -120,27 +120,23 @@ public final class VideoSize {
int result = 7;
result = 31 * result + width;
result = 31 * result + height;
result = 31 * result + unappliedRotationDegrees;
result = 31 * result + Float.floatToRawIntBits(pixelWidthHeightRatio);
return result;
}
private static final String FIELD_WIDTH = Util.intToStringMaxRadix(0);
private static final String FIELD_HEIGHT = Util.intToStringMaxRadix(1);
// 2 reserved for deprecated 'unappliedRotationDegrees'.
private static final String FIELD_UNAPPLIED_ROTATION_DEGREES = Util.intToStringMaxRadix(2);
private static final String FIELD_PIXEL_WIDTH_HEIGHT_RATIO = Util.intToStringMaxRadix(3);
@UnstableApi
public Bundle toBundle() {
Bundle bundle = new Bundle();
if (width != 0) {
bundle.putInt(FIELD_WIDTH, width);
}
if (height != 0) {
bundle.putInt(FIELD_HEIGHT, height);
}
if (pixelWidthHeightRatio != 1f) {
bundle.putFloat(FIELD_PIXEL_WIDTH_HEIGHT_RATIO, pixelWidthHeightRatio);
}
bundle.putInt(FIELD_WIDTH, width);
bundle.putInt(FIELD_HEIGHT, height);
bundle.putInt(FIELD_UNAPPLIED_ROTATION_DEGREES, unappliedRotationDegrees);
bundle.putFloat(FIELD_PIXEL_WIDTH_HEIGHT_RATIO, pixelWidthHeightRatio);
return bundle;
}
@@ -149,8 +145,11 @@ public final class VideoSize {
public static VideoSize fromBundle(Bundle bundle) {
int width = bundle.getInt(FIELD_WIDTH, DEFAULT_WIDTH);
int height = bundle.getInt(FIELD_HEIGHT, DEFAULT_HEIGHT);
int unappliedRotationDegrees =
bundle.getInt(FIELD_UNAPPLIED_ROTATION_DEGREES, DEFAULT_UNAPPLIED_ROTATION_DEGREES);
float pixelWidthHeightRatio =
bundle.getFloat(FIELD_PIXEL_WIDTH_HEIGHT_RATIO, DEFAULT_PIXEL_WIDTH_HEIGHT_RATIO);
return new VideoSize(width, height, pixelWidthHeightRatio);
return new VideoSize(width, height, unappliedRotationDegrees, pixelWidthHeightRatio);
}
;
}
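The release-side `toBundle` above writes a field only when it differs from its default (width 0, height 0, ratio 1), so an all-default `VideoSize` serializes to an empty bundle. The same default-skipping pattern can be sketched with a plain `Map` standing in for `android.os.Bundle` (keys and signature here are hypothetical simplifications):

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of default-skipping serialization as in VideoSize.toBundle() above. */
public class DefaultSkippingBundle {
  static Map<String, Object> toBundle(int width, int height, float pixelWidthHeightRatio) {
    Map<String, Object> bundle = new HashMap<>();
    // Each field is written only when it differs from its default value.
    if (width != 0) {
      bundle.put("width", width);
    }
    if (height != 0) {
      bundle.put("height", height);
    }
    if (pixelWidthHeightRatio != 1f) {
      bundle.put("pixelWidthHeightRatio", pixelWidthHeightRatio);
    }
    return bundle;
  }

  public static void main(String[] args) {
    System.out.println(toBundle(0, 0, 1f).isEmpty()); // true: all defaults omitted
    System.out.println(toBundle(1920, 1080, 1f).size()); // 2: ratio still at its default
  }
}
```

Deserialization stays correct because `fromBundle` reads each field with the same default as its fallback, so omitted keys round-trip to the original values.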

View file

@@ -16,9 +16,9 @@
*/
package androidx.media3.common.audio;
import static androidx.media3.common.util.Assertions.checkState;
import static java.lang.Math.min;
import androidx.media3.common.util.Assertions;
import java.nio.ShortBuffer;
import java.util.Arrays;
@@ -52,23 +52,11 @@ import java.util.Arrays;
private int pitchFrameCount;
private int oldRatePosition;
private int newRatePosition;
/**
* Number of frames pending to be copied from {@link #inputBuffer} directly to {@link
* #outputBuffer}.
*
* <p>This field is only relevant to time-stretching or pitch-shifting in {@link
* #changeSpeed(double)}, particularly when more frames need to be copied to the {@link
* #outputBuffer} than are available in {@link #inputBuffer} and Sonic must wait until the next
* buffer (or EOS) is queued.
*/
private int remainingInputToCopyFrameCount;
private int prevPeriod;
private int prevMinDiff;
private int minDiff;
private int maxDiff;
private double accumulatedSpeedAdjustmentError;
/**
* Creates a new Sonic audio stream processor.
@@ -142,26 +130,10 @@ import java.util.Arrays;
*/
public void queueEndOfStream() {
int remainingFrameCount = inputFrameCount;
double s = speed / pitch;
double r = rate * pitch;
// If there are frames to be copied directly onto the output buffer, we should not count those
// as "input frames" because Sonic is not applying any processing on them.
int adjustedRemainingFrames = remainingFrameCount - remainingInputToCopyFrameCount;
// We add directly to the output the number of frames in remainingInputToCopyFrameCount.
// Otherwise, expectedOutputFrames will be off and will make Sonic output an incorrect number of
// frames.
float s = speed / pitch;
float r = rate * pitch;
int expectedOutputFrames =
outputFrameCount
+ (int)
((adjustedRemainingFrames / s
+ remainingInputToCopyFrameCount
+ accumulatedSpeedAdjustmentError
+ pitchFrameCount)
/ r
+ 0.5);
accumulatedSpeedAdjustmentError = 0;
outputFrameCount + (int) ((remainingFrameCount / s + pitchFrameCount) / r + 0.5f);
// Add enough silence to flush both input and pitch buffers.
inputBuffer =
@@ -194,7 +166,6 @@ import java.util.Arrays;
prevMinDiff = 0;
minDiff = 0;
maxDiff = 0;
accumulatedSpeedAdjustmentError = 0;
}
/** Returns the size of output that can be read with {@link #getOutput(ShortBuffer)}, in bytes. */
@@ -384,14 +355,14 @@ import java.util.Arrays;
pitchFrameCount -= frameCount;
}
private short interpolate(short[] in, int inPos, long oldSampleRate, long newSampleRate) {
private short interpolate(short[] in, int inPos, int oldSampleRate, int newSampleRate) {
short left = in[inPos];
short right = in[inPos + channelCount];
long position = newRatePosition * oldSampleRate;
long leftPosition = oldRatePosition * newSampleRate;
long rightPosition = (oldRatePosition + 1) * newSampleRate;
long ratio = rightPosition - position;
long width = rightPosition - leftPosition;
int position = newRatePosition * oldSampleRate;
int leftPosition = oldRatePosition * newSampleRate;
int rightPosition = (oldRatePosition + 1) * newSampleRate;
int ratio = rightPosition - position;
int width = rightPosition - leftPosition;
return (short) ((ratio * left + (width - ratio) * right) / width);
}
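The `interpolate` change above widens the rate positions from `int` to `long` because products like `newRatePosition * oldSampleRate` can exceed `Integer.MAX_VALUE` once sample rates and positions grow. A standalone demonstration of the failure mode (the sample-rate and position values are illustrative, not taken from Sonic):

```java
/** Shows why interpolate() above computes its products in long rather than int. */
public class OverflowDemo {
  public static void main(String[] args) {
    int oldSampleRate = 192_000; // illustrative high sample rate
    int newRatePosition = 20_000; // illustrative rate position

    int intProduct = newRatePosition * oldSampleRate; // 3,840,000,000 wraps past Integer.MAX_VALUE
    long longProduct = (long) newRatePosition * oldSampleRate; // widened before multiplying

    System.out.println(intProduct); // -454967296 (wrapped)
    System.out.println(longProduct); // 3840000000 (correct)
  }
}
```

With the wrapped negative product, the interpolation ratio and width go negative and the computed sample is garbage, which is the class of bug the `long` arithmetic avoids.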
@@ -399,23 +370,16 @@ import java.util.Arrays;
if (outputFrameCount == originalOutputFrameCount) {
return;
}
// Use long to avoid overflows int-int multiplications. The actual value of newSampleRate and
// oldSampleRate should always be comfortably within the int range.
long newSampleRate = (long) (inputSampleRateHz / rate);
long oldSampleRate = inputSampleRateHz;
int newSampleRate = (int) (inputSampleRateHz / rate);
int oldSampleRate = inputSampleRateHz;
// Set these values to help with the integer math.
while (newSampleRate != 0
&& oldSampleRate != 0
&& newSampleRate % 2 == 0
&& oldSampleRate % 2 == 0) {
while (newSampleRate > (1 << 14) || oldSampleRate > (1 << 14)) {
newSampleRate /= 2;
oldSampleRate /= 2;
}
moveNewSamplesToPitchBuffer(originalOutputFrameCount);
// Leave at least one pitch sample in the buffer.
for (int position = 0; position < pitchFrameCount - 1; position++) {
// Cast to long to avoid overflow.
while ((oldRatePosition + 1) * newSampleRate > newRatePosition * oldSampleRate) {
outputBuffer =
ensureSpaceForAdditionalFrames(
@@ -430,26 +394,21 @@ import java.util.Arrays;
oldRatePosition++;
if (oldRatePosition == oldSampleRate) {
oldRatePosition = 0;
checkState(newRatePosition == newSampleRate);
Assertions.checkState(newRatePosition == newSampleRate);
newRatePosition = 0;
}
}
removePitchFrames(pitchFrameCount - 1);
}
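Both sides of the diff shrink the two sample rates before the position math: the release side halves them while both stay even (preserving the exact ratio), while the 1.4.0 side halves until both fit in 14 bits. A sketch of the release-side reduction, with a hypothetical class name:

```java
// Halves both rates while they share a factor of two, keeping the exact
// old:new ratio while making the operands smaller for later multiplications.
public class RateReductionSketch {

  /** Returns {reducedNewSampleRate, reducedOldSampleRate}. */
  public static long[] reduce(long newSampleRate, long oldSampleRate) {
    while (newSampleRate != 0
        && oldSampleRate != 0
        && newSampleRate % 2 == 0
        && oldSampleRate % 2 == 0) {
      newSampleRate /= 2;
      oldSampleRate /= 2;
    }
    return new long[] {newSampleRate, oldSampleRate};
  }

  public static void main(String[] args) {
    // 88200:44100 reduces to 22050:11025 (stops once one rate turns odd).
    long[] reduced = reduce(88200, 44100);
    System.out.println(reduced[0] + ":" + reduced[1]); // 22050:11025
  }
}
```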
private int skipPitchPeriod(short[] samples, int position, double speed, int period) {
private int skipPitchPeriod(short[] samples, int position, float speed, int period) {
// Skip over a pitch period, and copy period/speed samples to the output.
int newFrameCount;
if (speed >= 2.0f) {
double expectedFrameCount = period / (speed - 1.0) + accumulatedSpeedAdjustmentError;
newFrameCount = (int) Math.round(expectedFrameCount);
accumulatedSpeedAdjustmentError = expectedFrameCount - newFrameCount;
newFrameCount = (int) (period / (speed - 1.0f));
} else {
newFrameCount = period;
double expectedInputToCopy =
period * (2.0f - speed) / (speed - 1.0f) + accumulatedSpeedAdjustmentError;
remainingInputToCopyFrameCount = (int) Math.round(expectedInputToCopy);
accumulatedSpeedAdjustmentError = expectedInputToCopy - remainingInputToCopyFrameCount;
remainingInputToCopyFrameCount = (int) (period * (2.0f - speed) / (speed - 1.0f));
}
outputBuffer = ensureSpaceForAdditionalFrames(outputBuffer, outputFrameCount, newFrameCount);
overlapAdd(
@@ -465,19 +424,14 @@ import java.util.Arrays;
return newFrameCount;
}
private int insertPitchPeriod(short[] samples, int position, double speed, int period) {
private int insertPitchPeriod(short[] samples, int position, float speed, int period) {
// Insert a pitch period, and determine how much input to copy directly.
int newFrameCount;
if (speed < 0.5f) {
double expectedFrameCount = period * speed / (1.0f - speed) + accumulatedSpeedAdjustmentError;
newFrameCount = (int) Math.round(expectedFrameCount);
accumulatedSpeedAdjustmentError = expectedFrameCount - newFrameCount;
newFrameCount = (int) (period * speed / (1.0f - speed));
} else {
newFrameCount = period;
double expectedInputToCopy =
period * (2.0f * speed - 1.0f) / (1.0f - speed) + accumulatedSpeedAdjustmentError;
remainingInputToCopyFrameCount = (int) Math.round(expectedInputToCopy);
accumulatedSpeedAdjustmentError = expectedInputToCopy - remainingInputToCopyFrameCount;
remainingInputToCopyFrameCount = (int) (period * (2.0f * speed - 1.0f) / (1.0f - speed));
}
outputBuffer =
ensureSpaceForAdditionalFrames(outputBuffer, outputFrameCount, period + newFrameCount);
@@ -500,7 +454,7 @@ import java.util.Arrays;
return newFrameCount;
}
private void changeSpeed(double speed) {
private void changeSpeed(float speed) {
if (inputFrameCount < maxRequiredFrameCount) {
return;
}
@@ -524,7 +478,7 @@ import java.util.Arrays;
private void processStreamInput() {
// Resample as many pitch periods as we have buffered on the input.
int originalOutputFrameCount = outputFrameCount;
double s = speed / pitch;
float s = speed / pitch;
float r = rate * pitch;
if (s > 1.00001 || s < 0.99999) {
changeSpeed(s);


@@ -28,6 +28,7 @@ import androidx.media3.common.util.SpeedProviderUtil;
import androidx.media3.common.util.TimestampConsumer;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.util.ArrayDeque;
import java.util.Queue;
@@ -114,39 +115,27 @@ public final class SpeedChangingAudioProcessor extends BaseAudioProcessor {
@Override
public void queueInput(ByteBuffer inputBuffer) {
long currentTimeUs =
long timeUs =
Util.scaleLargeTimestamp(
/* timestamp= */ bytesRead,
/* multiplier= */ C.MICROS_PER_SECOND,
/* divisor= */ (long) inputAudioFormat.sampleRate * inputAudioFormat.bytesPerFrame);
float newSpeed = speedProvider.getSpeed(currentTimeUs);
long nextSpeedChangeTimeUs = speedProvider.getNextSpeedChangeTimeUs(currentTimeUs);
long sampleRateAlignedNextSpeedChangeTimeUs =
getSampleRateAlignedTimestamp(nextSpeedChangeTimeUs, inputAudioFormat.sampleRate);
float newSpeed = speedProvider.getSpeed(timeUs);
// If next speed change falls between the current sample position and the next sample, then get
// the next speed and next speed change from the following sample. If needed, this will ignore
// one or more mid-sample speed changes.
if (sampleRateAlignedNextSpeedChangeTimeUs == currentTimeUs) {
long sampleDuration =
Util.sampleCountToDurationUs(/* sampleCount= */ 1, inputAudioFormat.sampleRate);
newSpeed = speedProvider.getSpeed(currentTimeUs + sampleDuration);
nextSpeedChangeTimeUs =
speedProvider.getNextSpeedChangeTimeUs(currentTimeUs + sampleDuration);
}
updateSpeed(newSpeed, currentTimeUs);
updateSpeed(newSpeed, timeUs);
int inputBufferLimit = inputBuffer.limit();
long nextSpeedChangeTimeUs = speedProvider.getNextSpeedChangeTimeUs(timeUs);
int bytesToNextSpeedChange;
if (nextSpeedChangeTimeUs != C.TIME_UNSET) {
bytesToNextSpeedChange =
(int)
Util.scaleLargeTimestamp(
/* timestamp= */ nextSpeedChangeTimeUs - currentTimeUs,
Util.scaleLargeValue(
/* timestamp= */ nextSpeedChangeTimeUs - timeUs,
/* multiplier= */ (long) inputAudioFormat.sampleRate
* inputAudioFormat.bytesPerFrame,
/* divisor= */ C.MICROS_PER_SECOND);
/* divisor= */ C.MICROS_PER_SECOND,
RoundingMode.CEILING);
int bytesToNextFrame =
inputAudioFormat.bytesPerFrame - bytesToNextSpeedChange % inputAudioFormat.bytesPerFrame;
if (bytesToNextFrame != inputAudioFormat.bytesPerFrame) {
@@ -421,15 +410,4 @@ public final class SpeedChangingAudioProcessor extends BaseAudioProcessor {
// because some clients register callbacks with getSpeedAdjustedTimeAsync before this audio
// processor is flushed.
}
/**
* Returns the timestamp in microseconds of the sample defined by {@code sampleRate} that is
* closest to {@code timestampUs}, using the rounding mode specified in {@link
* Util#scaleLargeTimestamp}.
*/
private static long getSampleRateAlignedTimestamp(long timestampUs, int sampleRate) {
long exactSamplePosition =
Util.scaleLargeTimestamp(timestampUs, sampleRate, C.MICROS_PER_SECOND);
return Util.scaleLargeTimestamp(exactSamplePosition, C.MICROS_PER_SECOND, sampleRate);
}
}
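The removed `getSampleRateAlignedTimestamp` helper snaps a microsecond timestamp to a sample boundary by converting to a sample index and back. A standalone sketch of that round trip; floor rounding is an assumption here (the real rounding is whatever `Util.scaleLargeTimestamp` does), and the class name is hypothetical:

```java
// Snaps a microsecond timestamp to the start of the sample it falls in,
// by scaling to a sample index and back to microseconds.
public class SampleAlignSketch {

  private static final long MICROS_PER_SECOND = 1_000_000L;

  public static long alignToSample(long timestampUs, int sampleRate) {
    // Assumed floor rounding; Media3's Util.scaleLargeTimestamp defines the
    // actual rounding behavior.
    long samplePosition = Math.floorDiv(timestampUs * sampleRate, MICROS_PER_SECOND);
    return Math.floorDiv(samplePosition * MICROS_PER_SECOND, sampleRate);
  }

  public static void main(String[] args) {
    // 1010us at 48kHz falls inside sample 48, which starts at 1000us.
    System.out.println(alignToSample(1010, 48000)); // 1000
  }
}
```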


@@ -45,11 +45,20 @@ import java.util.ArrayList;
*/
/* package */ final class CustomSpanBundler {
/** Media3 custom span implementations. */
/**
* Media3 custom span implementations. One of the following:
*
* <ul>
* <li>{@link #UNKNOWN}
* <li>{@link #RUBY}
* <li>{@link #TEXT_EMPHASIS}
* <li>{@link #HORIZONTAL_TEXT_IN_VERTICAL_CONTEXT}
* </ul>
*/
@Documented
@Retention(RetentionPolicy.SOURCE)
@Target({TYPE_USE})
@IntDef({UNKNOWN, RUBY, TEXT_EMPHASIS, HORIZONTAL_TEXT_IN_VERTICAL_CONTEXT, VOICE})
@IntDef({UNKNOWN, RUBY, TEXT_EMPHASIS, HORIZONTAL_TEXT_IN_VERTICAL_CONTEXT})
private @interface CustomSpanType {}
private static final int UNKNOWN = -1;
@@ -60,8 +69,6 @@ import java.util.ArrayList;
private static final int HORIZONTAL_TEXT_IN_VERTICAL_CONTEXT = 3;
private static final int VOICE = 4;
private static final String FIELD_START_INDEX = Util.intToStringMaxRadix(0);
private static final String FIELD_END_INDEX = Util.intToStringMaxRadix(1);
private static final String FIELD_FLAGS = Util.intToStringMaxRadix(2);
@@ -87,11 +94,6 @@ import java.util.ArrayList;
text, span, /* spanType= */ HORIZONTAL_TEXT_IN_VERTICAL_CONTEXT, /* params= */ null);
bundledCustomSpans.add(bundle);
}
for (VoiceSpan span : text.getSpans(0, text.length(), VoiceSpan.class)) {
Bundle bundle =
spanToBundle(text, span, /* spanType= */ VOICE, /* params= */ span.toBundle());
bundledCustomSpans.add(bundle);
}
return bundledCustomSpans;
}
@@ -111,9 +113,6 @@ import java.util.ArrayList;
case HORIZONTAL_TEXT_IN_VERTICAL_CONTEXT:
text.setSpan(new HorizontalTextInVerticalContextSpan(), start, end, flags);
break;
case VOICE:
text.setSpan(VoiceSpan.fromBundle(checkNotNull(span)), start, end, flags);
break;
default:
break;
}


@@ -17,7 +17,6 @@ package androidx.media3.common.text;
import android.text.Spannable;
import android.text.style.ForegroundColorSpan;
import android.text.style.RelativeSizeSpan;
import androidx.media3.common.util.UnstableApi;
/**
@@ -45,52 +44,14 @@ public final class SpanUtil {
Spannable spannable, Object span, int start, int end, int spanFlags) {
Object[] existingSpans = spannable.getSpans(start, end, span.getClass());
for (Object existingSpan : existingSpans) {
removeIfStartEndAndFlagsMatch(spannable, existingSpan, start, end, spanFlags);
if (spannable.getSpanStart(existingSpan) == start
&& spannable.getSpanEnd(existingSpan) == end
&& spannable.getSpanFlags(existingSpan) == spanFlags) {
spannable.removeSpan(existingSpan);
}
}
spannable.setSpan(span, start, end, spanFlags);
}
/**
* Modifies the size of the text between {@code start} and {@code end} relative to any existing
* {@link RelativeSizeSpan} instances which cover <b>at least the same range</b>.
*
* <p>{@link RelativeSizeSpan} instances which only cover a part of the text between {@code start}
* and {@code end} are ignored.
*
* <p>A new {@link RelativeSizeSpan} instance is added between {@code start} and {@code end} with
* its {@code sizeChange} value computed by modifying the {@code size} parameter by the {@code
* sizeChange} of {@link RelativeSizeSpan} instances covering between {@code start} and {@code
* end}.
*
* <p>{@link RelativeSizeSpan} instances with the same {@code start}, {@code end}, and {@code
* spanFlags} are removed.
*
* @param spannable The {@link Spannable} to add the {@link RelativeSizeSpan} to.
* @param size The fraction to modify the text size by.
* @param start The start index to add the new span at.
* @param end The end index to add the new span at.
* @param spanFlags The flags to pass to {@link Spannable#setSpan(Object, int, int, int)}.
*/
public static void addInheritedRelativeSizeSpan(
Spannable spannable, float size, int start, int end, int spanFlags) {
for (RelativeSizeSpan existingSpan : spannable.getSpans(start, end, RelativeSizeSpan.class)) {
if (spannable.getSpanStart(existingSpan) <= start
&& spannable.getSpanEnd(existingSpan) >= end) {
size *= existingSpan.getSizeChange();
}
removeIfStartEndAndFlagsMatch(spannable, existingSpan, start, end, spanFlags);
}
spannable.setSpan(new RelativeSizeSpan(size), start, end, spanFlags);
}
private static void removeIfStartEndAndFlagsMatch(
Spannable spannable, Object span, int start, int end, int spanFlags) {
if (spannable.getSpanStart(span) == start
&& spannable.getSpanEnd(span) == end
&& spannable.getSpanFlags(span) == spanFlags) {
spannable.removeSpan(span);
}
}
private SpanUtil() {}
}


@@ -1,52 +0,0 @@
/*
* Copyright (C) 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
*/
package androidx.media3.common.text;
import static androidx.media3.common.util.Assertions.checkNotNull;
import android.os.Bundle;
import androidx.media3.common.util.UnstableApi;
import androidx.media3.common.util.Util;
/**
* A span representing the speaker of the spanned text.
*
* <p>For example a <a href="https://www.w3.org/TR/webvtt1/#webvtt-cue-voice-span">WebVTT voice
* span</a>.
*/
@UnstableApi
public final class VoiceSpan {
/** The voice name. */
public final String name;
private static final String FIELD_NAME = Util.intToStringMaxRadix(0);
public VoiceSpan(String name) {
this.name = name;
}
public Bundle toBundle() {
Bundle bundle = new Bundle();
bundle.putString(FIELD_NAME, name);
return bundle;
}
public static VoiceSpan fromBundle(Bundle bundle) {
return new VoiceSpan(checkNotNull(bundle.getString(FIELD_NAME)));
}
}


@@ -17,23 +17,15 @@ package androidx.media3.common.util;
import static androidx.media3.common.util.Assertions.checkArgument;
import android.annotation.SuppressLint;
import android.media.MediaCodecInfo;
import android.util.Pair;
import androidx.annotation.Nullable;
import androidx.media3.common.C;
import androidx.media3.common.ColorInfo;
import androidx.media3.common.Format;
import androidx.media3.common.MimeTypes;
import com.google.common.collect.ImmutableList;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
/** Provides utilities for handling various types of codec-specific data. */
@SuppressLint("InlinedApi")
@UnstableApi
public final class CodecSpecificDataUtil {
@@ -47,26 +39,6 @@ public final class CodecSpecificDataUtil {
private static final int EXTENDED_PAR = 0x0F;
private static final int RECTANGULAR = 0x00;
// Codecs to constant mappings.
// H263
private static final String CODEC_ID_H263 = "s263";
// AVC.
private static final String CODEC_ID_AVC1 = "avc1";
private static final String CODEC_ID_AVC2 = "avc2";
// VP9
private static final String CODEC_ID_VP09 = "vp09";
// HEVC.
private static final String CODEC_ID_HEV1 = "hev1";
private static final String CODEC_ID_HVC1 = "hvc1";
// AV1.
private static final String CODEC_ID_AV01 = "av01";
// MP4A AAC.
private static final String CODEC_ID_MP4A = "mp4a";
private static final Pattern PROFILE_PATTERN = Pattern.compile("^\\D?(\\d+)$");
private static final String TAG = "CodecSpecificDataUtil";
/**
* Parses an ALAC AudioSpecificConfig (i.e. an <a
* href="https://github.com/macosforge/alac/blob/master/ALACMagicCookieDescription.txt">ALACSpecificConfig</a>).
@@ -108,35 +80,6 @@ public final class CodecSpecificDataUtil {
&& initializationData.get(0)[0] == 1;
}
/**
* Returns initialization data in CodecPrivate format of VP9.
*
* <p>Each feature of VP9 CodecPrivate is defined by the binary format of ID (1 byte), length (1
* byte), and data (1 byte). See <a
* href="https://www.webmproject.org/docs/container/#vp9-codec-feature-metadata-codecprivate">CodecPrivate
* format of VP9</a> for more details.
*
* @param profile The VP9 codec profile.
* @param level The VP9 codec level.
* @param bitDepth The bit depth of the luma and color components.
* @param chromaSubsampling The chroma subsampling.
*/
public static ImmutableList<byte[]> buildVp9CodecPrivateInitializationData(
byte profile, byte level, byte bitDepth, byte chromaSubsampling) {
byte profileId = 0x01;
byte levelId = 0x02;
byte bitDepthId = 0x03;
byte chromaSubsamplingId = 0x04;
byte length = 0x01;
return ImmutableList.of(
new byte[] {
profileId, length, profile,
levelId, length, level,
bitDepthId, length, bitDepth,
chromaSubsamplingId, length, chromaSubsampling
});
}
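The (ID, length, value) layout documented above flattens to a 12-byte blob. A standalone sketch without the Guava `ImmutableList` wrapper; the class name is hypothetical:

```java
// Builds the VP9 CodecPrivate blob from four (id, length, value) triples,
// matching the layout of buildVp9CodecPrivateInitializationData but
// returning a plain byte[] instead of an ImmutableList.
public class Vp9CodecPrivateSketch {

  public static byte[] build(byte profile, byte level, byte bitDepth, byte chromaSubsampling) {
    byte length = 0x01; // Every feature value is a single byte.
    return new byte[] {
      0x01, length, profile, // Feature 1: profile.
      0x02, length, level, // Feature 2: level.
      0x03, length, bitDepth, // Feature 3: bit depth.
      0x04, length, chromaSubsampling, // Feature 4: chroma subsampling.
    };
  }

  public static void main(String[] args) {
    byte[] codecPrivate = build((byte) 0, (byte) 10, (byte) 8, (byte) 1);
    System.out.println(codecPrivate.length); // 4 features x 3 bytes = 12
  }
}
```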
/**
* Parses an MPEG-4 Visual configuration information, as defined in ISO/IEC14496-2.
*
@@ -261,103 +204,6 @@ public final class CodecSpecificDataUtil {
return builder.toString();
}
/** Builds an RFC 6381 H263 codec string using profile and level. */
public static String buildH263CodecString(int profile, int level) {
return Util.formatInvariant("s263.%d.%d", profile, level);
}
/**
* Returns profile and level (as defined by {@link MediaCodecInfo.CodecProfileLevel})
* corresponding to the codec description string (as defined by RFC 6381) of the given format.
*
* @param format Media format with a codec description string, as defined by RFC 6381.
* @return A pair (profile constant, level constant) if the codec of the {@code format} is
* well-formed and recognized, or null otherwise.
*/
@Nullable
public static Pair<Integer, Integer> getCodecProfileAndLevel(Format format) {
if (format.codecs == null) {
return null;
}
String[] parts = format.codecs.split("\\.");
// Dolby Vision can use DV, AVC or HEVC codec IDs, so check the MIME type first.
if (MimeTypes.VIDEO_DOLBY_VISION.equals(format.sampleMimeType)) {
return getDolbyVisionProfileAndLevel(format.codecs, parts);
}
switch (parts[0]) {
case CODEC_ID_H263:
return getH263ProfileAndLevel(format.codecs, parts);
case CODEC_ID_AVC1:
case CODEC_ID_AVC2:
return getAvcProfileAndLevel(format.codecs, parts);
case CODEC_ID_VP09:
return getVp9ProfileAndLevel(format.codecs, parts);
case CODEC_ID_HEV1:
case CODEC_ID_HVC1:
return getHevcProfileAndLevel(format.codecs, parts, format.colorInfo);
case CODEC_ID_AV01:
return getAv1ProfileAndLevel(format.codecs, parts, format.colorInfo);
case CODEC_ID_MP4A:
return getAacCodecProfileAndLevel(format.codecs, parts);
default:
return null;
}
}
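`getCodecProfileAndLevel` dispatches on the first dot-separated part of the RFC 6381 string; for the six-character AVC form, the profile and level are hexadecimal bytes. A minimal sketch of parsing that form (hypothetical class name, mirrors `getAvcProfileAndLevel`'s hex branch):

```java
// Parses the six-character RFC 6381 AVC form "avc1.xxccyy", where xx is the
// profile and yy the level, both hexadecimal.
public class AvcCodecStringSketch {

  /** Returns {profileNumber, levelNumber}, or null if the string is not this form. */
  public static int[] parse(String codecs) {
    String[] parts = codecs.split("\\.");
    if (parts.length < 2 || parts[1].length() != 6) {
      return null;
    }
    try {
      int profile = Integer.parseInt(parts[1].substring(0, 2), 16);
      int level = Integer.parseInt(parts[1].substring(4), 16);
      return new int[] {profile, level};
    } catch (NumberFormatException e) {
      return null;
    }
  }

  public static void main(String[] args) {
    int[] profileLevel = parse("avc1.640028");
    // 0x64 = 100 (High profile), 0x28 = 40 (Level 4).
    System.out.println(profileLevel[0] + ", " + profileLevel[1]); // 100, 40
  }
}
```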
/**
* Returns HEVC profile and level corresponding to the codec description string (as defined by RFC
* 6381) and its {@link ColorInfo}.
*
* @param codec The codec description string (as defined by RFC 6381).
* @param parts The codec string split by ".".
* @param colorInfo The {@link ColorInfo}.
* @return A pair (profile constant, level constant) if profile and level are recognized, or
* {@code null} otherwise.
*/
@Nullable
public static Pair<Integer, Integer> getHevcProfileAndLevel(
String codec, String[] parts, @Nullable ColorInfo colorInfo) {
if (parts.length < 4) {
// The codec has fewer parts than required by the HEVC codec string format.
Log.w(TAG, "Ignoring malformed HEVC codec string: " + codec);
return null;
}
// The profile_space gets ignored.
Matcher matcher = PROFILE_PATTERN.matcher(parts[1]);
if (!matcher.matches()) {
Log.w(TAG, "Ignoring malformed HEVC codec string: " + codec);
return null;
}
@Nullable String profileString = matcher.group(1);
int profile;
if ("1".equals(profileString)) {
profile = MediaCodecInfo.CodecProfileLevel.HEVCProfileMain;
} else if ("2".equals(profileString)) {
if (colorInfo != null && colorInfo.colorTransfer == C.COLOR_TRANSFER_ST2084) {
profile = MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10HDR10;
} else {
// For all other cases, we map to the Main10 profile. Note that this includes HLG
// HDR. On Android 13+, the platform guarantees that a decoder that advertises
// HEVCProfileMain10 will be able to decode HLG. This is not guaranteed for older
// Android versions, but we still map to Main10 for backwards compatibility.
profile = MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10;
}
} else if ("6".equals(profileString)) {
// Framework does not have profileLevel.HEVCProfileMultiviewMain defined.
profile = 6;
} else {
Log.w(TAG, "Unknown HEVC profile string: " + profileString);
return null;
}
@Nullable String levelString = parts[3];
@Nullable Integer level = hevcCodecStringToProfileLevel(levelString);
if (level == null) {
Log.w(TAG, "Unknown HEVC level string: " + levelString);
return null;
}
return new Pair<>(profile, level);
}
/**
* Constructs a NAL unit consisting of the NAL start code followed by the specified data.
*
@@ -443,528 +289,5 @@ public final class CodecSpecificDataUtil {
return true;
}
@Nullable
private static Pair<Integer, Integer> getDolbyVisionProfileAndLevel(
String codec, String[] parts) {
if (parts.length < 3) {
// The codec has fewer parts than required by the Dolby Vision codec string format.
Log.w(TAG, "Ignoring malformed Dolby Vision codec string: " + codec);
return null;
}
// The profile_space gets ignored.
Matcher matcher = PROFILE_PATTERN.matcher(parts[1]);
if (!matcher.matches()) {
Log.w(TAG, "Ignoring malformed Dolby Vision codec string: " + codec);
return null;
}
@Nullable String profileString = matcher.group(1);
@Nullable Integer profile = dolbyVisionStringToProfile(profileString);
if (profile == null) {
Log.w(TAG, "Unknown Dolby Vision profile string: " + profileString);
return null;
}
String levelString = parts[2];
@Nullable Integer level = dolbyVisionStringToLevel(levelString);
if (level == null) {
Log.w(TAG, "Unknown Dolby Vision level string: " + levelString);
return null;
}
return new Pair<>(profile, level);
}
/** Returns H263 profile and level from codec string. */
private static Pair<Integer, Integer> getH263ProfileAndLevel(String codec, String[] parts) {
Pair<Integer, Integer> defaultProfileAndLevel =
new Pair<>(
MediaCodecInfo.CodecProfileLevel.H263ProfileBaseline,
MediaCodecInfo.CodecProfileLevel.H263Level10);
if (parts.length < 3) {
Log.w(TAG, "Ignoring malformed H263 codec string: " + codec);
return defaultProfileAndLevel;
}
try {
int profile = Integer.parseInt(parts[1]);
int level = Integer.parseInt(parts[2]);
return new Pair<>(profile, level);
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed H263 codec string: " + codec);
return defaultProfileAndLevel;
}
}
@Nullable
private static Pair<Integer, Integer> getAvcProfileAndLevel(String codec, String[] parts) {
if (parts.length < 2) {
// The codec has fewer parts than required by the AVC codec string format.
Log.w(TAG, "Ignoring malformed AVC codec string: " + codec);
return null;
}
int profileInteger;
int levelInteger;
try {
if (parts[1].length() == 6) {
// Format: avc1.xxccyy, where xx is profile and yy level, both hexadecimal.
profileInteger = Integer.parseInt(parts[1].substring(0, 2), 16);
levelInteger = Integer.parseInt(parts[1].substring(4), 16);
} else if (parts.length >= 3) {
// Format: avc1.xx.[y]yy where xx is profile and [y]yy level, both decimal.
profileInteger = Integer.parseInt(parts[1]);
levelInteger = Integer.parseInt(parts[2]);
} else {
// We don't recognize the format.
Log.w(TAG, "Ignoring malformed AVC codec string: " + codec);
return null;
}
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed AVC codec string: " + codec);
return null;
}
int profile = avcProfileNumberToConst(profileInteger);
if (profile == -1) {
Log.w(TAG, "Unknown AVC profile: " + profileInteger);
return null;
}
int level = avcLevelNumberToConst(levelInteger);
if (level == -1) {
Log.w(TAG, "Unknown AVC level: " + levelInteger);
return null;
}
return new Pair<>(profile, level);
}
@Nullable
private static Pair<Integer, Integer> getVp9ProfileAndLevel(String codec, String[] parts) {
if (parts.length < 3) {
Log.w(TAG, "Ignoring malformed VP9 codec string: " + codec);
return null;
}
int profileInteger;
int levelInteger;
try {
profileInteger = Integer.parseInt(parts[1]);
levelInteger = Integer.parseInt(parts[2]);
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed VP9 codec string: " + codec);
return null;
}
int profile = vp9ProfileNumberToConst(profileInteger);
if (profile == -1) {
Log.w(TAG, "Unknown VP9 profile: " + profileInteger);
return null;
}
int level = vp9LevelNumberToConst(levelInteger);
if (level == -1) {
Log.w(TAG, "Unknown VP9 level: " + levelInteger);
return null;
}
return new Pair<>(profile, level);
}
@Nullable
private static Pair<Integer, Integer> getAv1ProfileAndLevel(
String codec, String[] parts, @Nullable ColorInfo colorInfo) {
if (parts.length < 4) {
Log.w(TAG, "Ignoring malformed AV1 codec string: " + codec);
return null;
}
int profileInteger;
int levelInteger;
int bitDepthInteger;
try {
profileInteger = Integer.parseInt(parts[1]);
levelInteger = Integer.parseInt(parts[2].substring(0, 2));
bitDepthInteger = Integer.parseInt(parts[3]);
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed AV1 codec string: " + codec);
return null;
}
if (profileInteger != 0) {
Log.w(TAG, "Unknown AV1 profile: " + profileInteger);
return null;
}
if (bitDepthInteger != 8 && bitDepthInteger != 10) {
Log.w(TAG, "Unknown AV1 bit depth: " + bitDepthInteger);
return null;
}
int profile;
if (bitDepthInteger == 8) {
profile = MediaCodecInfo.CodecProfileLevel.AV1ProfileMain8;
} else if (colorInfo != null
&& (colorInfo.hdrStaticInfo != null
|| colorInfo.colorTransfer == C.COLOR_TRANSFER_HLG
|| colorInfo.colorTransfer == C.COLOR_TRANSFER_ST2084)) {
profile = MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10HDR10;
} else {
profile = MediaCodecInfo.CodecProfileLevel.AV1ProfileMain10;
}
int level = av1LevelNumberToConst(levelInteger);
if (level == -1) {
Log.w(TAG, "Unknown AV1 level: " + levelInteger);
return null;
}
return new Pair<>(profile, level);
}
@Nullable
private static Pair<Integer, Integer> getAacCodecProfileAndLevel(String codec, String[] parts) {
if (parts.length != 3) {
Log.w(TAG, "Ignoring malformed MP4A codec string: " + codec);
return null;
}
try {
// Get the object type indication, which is a hexadecimal value (see RFC 6381/ISO 14496-1).
int objectTypeIndication = Integer.parseInt(parts[1], 16);
@Nullable String mimeType = MimeTypes.getMimeTypeFromMp4ObjectType(objectTypeIndication);
if (MimeTypes.AUDIO_AAC.equals(mimeType)) {
// For MPEG-4 audio this is followed by an audio object type indication as a decimal number.
int audioObjectTypeIndication = Integer.parseInt(parts[2]);
int profile = mp4aAudioObjectTypeToProfile(audioObjectTypeIndication);
if (profile != -1) {
// Level is set to zero in AAC decoder CodecProfileLevels.
return new Pair<>(profile, 0);
}
}
} catch (NumberFormatException e) {
Log.w(TAG, "Ignoring malformed MP4A codec string: " + codec);
}
return null;
}
private static int avcProfileNumberToConst(int profileNumber) {
switch (profileNumber) {
case 66:
return MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline;
case 77:
return MediaCodecInfo.CodecProfileLevel.AVCProfileMain;
case 88:
return MediaCodecInfo.CodecProfileLevel.AVCProfileExtended;
case 100:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh;
case 110:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh10;
case 122:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh422;
case 244:
return MediaCodecInfo.CodecProfileLevel.AVCProfileHigh444;
default:
return -1;
}
}
private static int avcLevelNumberToConst(int levelNumber) {
// TODO: Find int for CodecProfileLevel.AVCLevel1b.
switch (levelNumber) {
case 10:
return MediaCodecInfo.CodecProfileLevel.AVCLevel1;
case 11:
return MediaCodecInfo.CodecProfileLevel.AVCLevel11;
case 12:
return MediaCodecInfo.CodecProfileLevel.AVCLevel12;
case 13:
return MediaCodecInfo.CodecProfileLevel.AVCLevel13;
case 20:
return MediaCodecInfo.CodecProfileLevel.AVCLevel2;
case 21:
return MediaCodecInfo.CodecProfileLevel.AVCLevel21;
case 22:
return MediaCodecInfo.CodecProfileLevel.AVCLevel22;
case 30:
return MediaCodecInfo.CodecProfileLevel.AVCLevel3;
case 31:
return MediaCodecInfo.CodecProfileLevel.AVCLevel31;
case 32:
return MediaCodecInfo.CodecProfileLevel.AVCLevel32;
case 40:
return MediaCodecInfo.CodecProfileLevel.AVCLevel4;
case 41:
return MediaCodecInfo.CodecProfileLevel.AVCLevel41;
case 42:
return MediaCodecInfo.CodecProfileLevel.AVCLevel42;
case 50:
return MediaCodecInfo.CodecProfileLevel.AVCLevel5;
case 51:
return MediaCodecInfo.CodecProfileLevel.AVCLevel51;
case 52:
return MediaCodecInfo.CodecProfileLevel.AVCLevel52;
default:
return -1;
}
}
private static int vp9ProfileNumberToConst(int profileNumber) {
switch (profileNumber) {
case 0:
return MediaCodecInfo.CodecProfileLevel.VP9Profile0;
case 1:
return MediaCodecInfo.CodecProfileLevel.VP9Profile1;
case 2:
return MediaCodecInfo.CodecProfileLevel.VP9Profile2;
case 3:
return MediaCodecInfo.CodecProfileLevel.VP9Profile3;
default:
return -1;
}
}
private static int vp9LevelNumberToConst(int levelNumber) {
switch (levelNumber) {
case 10:
return MediaCodecInfo.CodecProfileLevel.VP9Level1;
case 11:
return MediaCodecInfo.CodecProfileLevel.VP9Level11;
case 20:
return MediaCodecInfo.CodecProfileLevel.VP9Level2;
case 21:
return MediaCodecInfo.CodecProfileLevel.VP9Level21;
case 30:
return MediaCodecInfo.CodecProfileLevel.VP9Level3;
case 31:
return MediaCodecInfo.CodecProfileLevel.VP9Level31;
case 40:
return MediaCodecInfo.CodecProfileLevel.VP9Level4;
case 41:
return MediaCodecInfo.CodecProfileLevel.VP9Level41;
case 50:
return MediaCodecInfo.CodecProfileLevel.VP9Level5;
case 51:
return MediaCodecInfo.CodecProfileLevel.VP9Level51;
case 60:
return MediaCodecInfo.CodecProfileLevel.VP9Level6;
case 61:
return MediaCodecInfo.CodecProfileLevel.VP9Level61;
case 62:
return MediaCodecInfo.CodecProfileLevel.VP9Level62;
default:
return -1;
}
}
@Nullable
private static Integer hevcCodecStringToProfileLevel(@Nullable String codecString) {
if (codecString == null) {
return null;
}
switch (codecString) {
case "L30":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel1;
case "L60":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel2;
case "L63":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel21;
case "L90":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel3;
case "L93":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel31;
case "L120":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel4;
case "L123":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel41;
case "L150":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel5;
case "L153":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel51;
case "L156":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel52;
case "L180":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel6;
case "L183":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel61;
case "L186":
return MediaCodecInfo.CodecProfileLevel.HEVCMainTierLevel62;
case "H30":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel1;
case "H60":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel2;
case "H63":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel21;
case "H90":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel3;
case "H93":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel31;
case "H120":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel4;
case "H123":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel41;
case "H150":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel5;
case "H153":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel51;
case "H156":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel52;
case "H180":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel6;
case "H183":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel61;
case "H186":
return MediaCodecInfo.CodecProfileLevel.HEVCHighTierLevel62;
default:
return null;
}
}
@Nullable
private static Integer dolbyVisionStringToProfile(@Nullable String profileString) {
if (profileString == null) {
return null;
}
switch (profileString) {
case "00":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvavPer;
case "01":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvavPen;
case "02":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDer;
case "03":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDen;
case "04":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDtr;
case "05":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheStn;
case "06":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDth;
case "07":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheDtb;
case "08":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvheSt;
case "09":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvavSe;
case "10":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionProfileDvav110;
default:
return null;
}
}
@Nullable
private static Integer dolbyVisionStringToLevel(@Nullable String levelString) {
if (levelString == null) {
return null;
}
// TODO (Internal: b/179261323): use framework constant for level 13.
switch (levelString) {
case "01":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelHd24;
case "02":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelHd30;
case "03":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd24;
case "04":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd30;
case "05":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelFhd60;
case "06":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd24;
case "07":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd30;
case "08":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd48;
case "09":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd60;
case "10":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevelUhd120;
case "11":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevel8k30;
case "12":
return MediaCodecInfo.CodecProfileLevel.DolbyVisionLevel8k60;
case "13":
return 0x1000;
default:
return null;
}
}
private static int av1LevelNumberToConst(int levelNumber) {
// See https://aomediacodec.github.io/av1-spec/av1-spec.pdf Annex A: Profiles and levels for
// more information on mapping AV1 codec strings to levels.
switch (levelNumber) {
case 0:
return MediaCodecInfo.CodecProfileLevel.AV1Level2;
case 1:
return MediaCodecInfo.CodecProfileLevel.AV1Level21;
case 2:
return MediaCodecInfo.CodecProfileLevel.AV1Level22;
case 3:
return MediaCodecInfo.CodecProfileLevel.AV1Level23;
case 4:
return MediaCodecInfo.CodecProfileLevel.AV1Level3;
case 5:
return MediaCodecInfo.CodecProfileLevel.AV1Level31;
case 6:
return MediaCodecInfo.CodecProfileLevel.AV1Level32;
case 7:
return MediaCodecInfo.CodecProfileLevel.AV1Level33;
case 8:
return MediaCodecInfo.CodecProfileLevel.AV1Level4;
case 9:
return MediaCodecInfo.CodecProfileLevel.AV1Level41;
case 10:
return MediaCodecInfo.CodecProfileLevel.AV1Level42;
case 11:
return MediaCodecInfo.CodecProfileLevel.AV1Level43;
case 12:
return MediaCodecInfo.CodecProfileLevel.AV1Level5;
case 13:
return MediaCodecInfo.CodecProfileLevel.AV1Level51;
case 14:
return MediaCodecInfo.CodecProfileLevel.AV1Level52;
case 15:
return MediaCodecInfo.CodecProfileLevel.AV1Level53;
case 16:
return MediaCodecInfo.CodecProfileLevel.AV1Level6;
case 17:
return MediaCodecInfo.CodecProfileLevel.AV1Level61;
case 18:
return MediaCodecInfo.CodecProfileLevel.AV1Level62;
case 19:
return MediaCodecInfo.CodecProfileLevel.AV1Level63;
case 20:
return MediaCodecInfo.CodecProfileLevel.AV1Level7;
case 21:
return MediaCodecInfo.CodecProfileLevel.AV1Level71;
case 22:
return MediaCodecInfo.CodecProfileLevel.AV1Level72;
case 23:
return MediaCodecInfo.CodecProfileLevel.AV1Level73;
default:
return -1;
}
}
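The switch above follows the AV1 specification's encoding of `seq_level_idx`: the major level is `2 + (seq_level_idx >> 2)` and the minor level is `seq_level_idx & 3`, so index 9 is level 4.1 (`AV1Level41`). A minimal sketch of that arithmetic (the class and method names are illustrative, not part of the diff):

```java
// Sketch: AV1 seq_level_idx encodes the level as
// major = 2 + (idx >> 2), minor = idx & 3, matching the switch above.
public final class Av1LevelDemo {
  static String levelName(int seqLevelIdx) {
    int major = 2 + (seqLevelIdx >> 2);
    int minor = seqLevelIdx & 3;
    return major + "." + minor;
  }

  public static void main(String[] args) {
    System.out.println(levelName(0));  // 2.0 -> AV1Level2
    System.out.println(levelName(9));  // 4.1 -> AV1Level41
    System.out.println(levelName(23)); // 7.3 -> AV1Level73
  }
}
```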
private static int mp4aAudioObjectTypeToProfile(int profileNumber) {
switch (profileNumber) {
case 1:
return MediaCodecInfo.CodecProfileLevel.AACObjectMain;
case 2:
return MediaCodecInfo.CodecProfileLevel.AACObjectLC;
case 3:
return MediaCodecInfo.CodecProfileLevel.AACObjectSSR;
case 4:
return MediaCodecInfo.CodecProfileLevel.AACObjectLTP;
case 5:
return MediaCodecInfo.CodecProfileLevel.AACObjectHE;
case 6:
return MediaCodecInfo.CodecProfileLevel.AACObjectScalable;
case 17:
return MediaCodecInfo.CodecProfileLevel.AACObjectERLC;
case 20:
return MediaCodecInfo.CodecProfileLevel.AACObjectERScalable;
case 23:
return MediaCodecInfo.CodecProfileLevel.AACObjectLD;
case 29:
return MediaCodecInfo.CodecProfileLevel.AACObjectHE_PS;
case 39:
return MediaCodecInfo.CodecProfileLevel.AACObjectELD;
case 42:
return MediaCodecInfo.CodecProfileLevel.AACObjectXHE;
default:
return -1;
}
}
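The audio object type consumed by `mp4aAudioObjectTypeToProfile` above is the final dotted field of an RFC 6381 `mp4a` codec string, e.g. `"mp4a.40.2"` carries object type 2 (AAC-LC, mapped to `AACObjectLC`). A minimal parsing sketch, with an illustrative class name and sample string:

```java
// Sketch: extracting the AAC audio object type from an RFC 6381
// codec string such as "mp4a.40.2" (object type 2 -> AACObjectLC).
public final class AacCodecStringDemo {
  static int audioObjectType(String codec) {
    String[] parts = codec.split("\\."); // ["mp4a", "40", "2"]
    return Integer.parseInt(parts[2]);
  }

  public static void main(String[] args) {
    System.out.println(audioObjectType("mp4a.40.2")); // 2
  }
}
```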
private CodecSpecificDataUtil() {}
}

View file

@ -473,7 +473,7 @@ public final class GlProgram {
? GLES20.GL_TEXTURE_2D
: GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
texIdValue,
type == GLES20.GL_SAMPLER_2D || !externalTexturesRequireNearestSampling
type == GLES20.GL_SAMPLER_2D && !externalTexturesRequireNearestSampling
? GLES20.GL_LINEAR
: GLES20.GL_NEAREST);
GLES20.glUniform1i(location, texUnitIndex);

View file

@ -1,48 +0,0 @@
/*
* Copyright 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.common.util;
import static androidx.media3.common.util.Assertions.checkArgument;
/**
* Represents a rectangle by the coordinates of its 4 edges (left, bottom, right, top).
*
* <p>Note that the right and top coordinates are exclusive.
*
* <p>This class represents coordinates in the OpenGL coordinate convention: {@code left <= right}
* and {@code bottom <= top}.
*/
@UnstableApi
public final class GlRect {
public int left;
public int bottom;
public int right;
public int top;
/** Creates an instance from (0, 0) to the specified width and height. */
public GlRect(int width, int height) {
this(/* left= */ 0, /* bottom= */ 0, width, height);
}
/** Creates an instance. */
public GlRect(int left, int bottom, int right, int top) {
checkArgument(left <= right && bottom <= top);
this.left = left;
this.bottom = bottom;
this.right = right;
this.top = top;
}
}

View file

@ -16,6 +16,7 @@
package androidx.media3.common.util;
import static android.opengl.EGL14.EGL_CONTEXT_CLIENT_VERSION;
import static android.opengl.EGL14.EGL_NO_SURFACE;
import static android.opengl.GLU.gluErrorString;
import static androidx.media3.common.util.Assertions.checkArgument;
import static androidx.media3.common.util.Assertions.checkState;
@ -35,7 +36,6 @@ import android.opengl.GLUtils;
import android.opengl.Matrix;
import androidx.annotation.IntRange;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import androidx.media3.common.C;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
@ -294,7 +294,7 @@ public final class GlUtil {
sharedContext,
contextAttributes,
/* offset= */ 0);
if (eglContext == null || eglContext.equals(EGL14.EGL_NO_CONTEXT)) {
if (eglContext == null) {
EGL14.eglTerminate(eglDisplay);
throw new GlException(
"eglCreateContext() failed to create a valid context. The device may not support EGL"
@ -779,13 +779,13 @@ public final class GlUtil {
*/
public static void destroyEglContext(
@Nullable EGLDisplay eglDisplay, @Nullable EGLContext eglContext) throws GlException {
if (eglDisplay == null || eglDisplay.equals(EGL14.EGL_NO_DISPLAY)) {
if (eglDisplay == null) {
return;
}
EGL14.eglMakeCurrent(
eglDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
checkEglException("Error releasing context");
if (eglContext != null && !eglContext.equals(EGL14.EGL_NO_CONTEXT)) {
if (eglContext != null) {
EGL14.eglDestroyContext(eglDisplay, eglContext);
checkEglException("Error destroying context");
}
@ -801,10 +801,10 @@ public final class GlUtil {
*/
public static void destroyEglSurface(
@Nullable EGLDisplay eglDisplay, @Nullable EGLSurface eglSurface) throws GlException {
if (eglDisplay == null || eglDisplay.equals(EGL14.EGL_NO_DISPLAY)) {
if (eglDisplay == null || eglSurface == null) {
return;
}
if (eglSurface == null || eglSurface.equals(EGL14.EGL_NO_SURFACE)) {
if (EGL14.eglGetCurrentSurface(EGL14.EGL_DRAW) == EGL_NO_SURFACE) {
return;
}
@ -825,181 +825,6 @@ public final class GlUtil {
checkGlError();
}
/**
* Copies the pixels from {@code readFboId} into {@code drawFboId}. Requires OpenGL ES 3.0.
*
* <p>When the input pixel region (given by {@code readRect}) doesn't have the same size as the
* output region (given by {@code drawRect}), this method uses {@link GLES20#GL_LINEAR} filtering
* to scale the image contents.
*
* @param readFboId The framebuffer object to read from.
* @param readRect The rectangular region of {@code readFboId} to read from.
* @param drawFboId The framebuffer object to draw into.
* @param drawRect The rectangular region of {@code drawFboId} to draw into.
*/
public static void blitFrameBuffer(int readFboId, GlRect readRect, int drawFboId, GlRect drawRect)
throws GlException {
int[] boundFramebuffer = new int[1];
GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, boundFramebuffer, /* offset= */ 0);
checkGlError();
GLES30.glBindFramebuffer(GLES30.GL_READ_FRAMEBUFFER, readFboId);
checkGlError();
GLES30.glBindFramebuffer(GLES30.GL_DRAW_FRAMEBUFFER, drawFboId);
checkGlError();
GLES30.glBlitFramebuffer(
readRect.left,
readRect.bottom,
readRect.right,
readRect.top,
drawRect.left,
drawRect.bottom,
drawRect.right,
drawRect.top,
GLES30.GL_COLOR_BUFFER_BIT,
GLES30.GL_LINEAR);
checkGlError();
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, /* framebuffer= */ boundFramebuffer[0]);
checkGlError();
}
/**
* Creates a pixel buffer object with a data store of the given size and usage {@link
* GLES30#GL_DYNAMIC_READ}.
*
* <p>The buffer is suitable for repeated modification by OpenGL and reads by the application.
*
* @param size The size of the buffer object's data store.
* @return The pixel buffer object.
*/
public static int createPixelBufferObject(int size) throws GlException {
int[] ids = new int[1];
GLES30.glGenBuffers(/* n= */ 1, ids, /* offset= */ 0);
GlUtil.checkGlError();
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, ids[0]);
GlUtil.checkGlError();
GLES30.glBufferData(
GLES30.GL_PIXEL_PACK_BUFFER, /* size= */ size, /* data= */ null, GLES30.GL_DYNAMIC_READ);
GlUtil.checkGlError();
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, /* buffer= */ 0);
GlUtil.checkGlError();
return ids[0];
}
/**
* Reads pixel data from the {@link GLES30#GL_COLOR_ATTACHMENT0} attachment of a framebuffer into
* the data store of a pixel buffer object.
*
* <p>The texture backing the color attachment of {@code readFboId} and the buffer store of {@code
* bufferId} must hold an image of the given {@code width} and {@code height} with format {@link
* GLES30#GL_RGBA} and type {@link GLES30#GL_UNSIGNED_BYTE}.
*
   * <p>This is a non-blocking call which reads the data asynchronously.
*
* <p>Requires API 24: This method must call the version of {@link GLES30#glReadPixels(int, int,
* int, int, int, int, int)} which accepts an integer offset as the last parameter. This version
* of glReadPixels is not available in the Java {@link GLES30} wrapper until API 24.
*
* <p>HDR support is not yet implemented.
*
* @param readFboId The framebuffer that holds pixel data.
* @param width The image width.
* @param height The image height.
* @param bufferId The pixel buffer object to read into.
*/
@RequiresApi(24)
public static void schedulePixelBufferRead(int readFboId, int width, int height, int bufferId)
throws GlException {
focusFramebufferUsingCurrentContext(readFboId, width, height);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, bufferId);
GlUtil.checkGlError();
GLES30.glReadBuffer(GLES30.GL_COLOR_ATTACHMENT0);
GLES30.glReadPixels(
/* x= */ 0,
/* y= */ 0,
width,
height,
GLES30.GL_RGBA,
GLES30.GL_UNSIGNED_BYTE,
/* offset= */ 0);
GlUtil.checkGlError();
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, /* buffer= */ 0);
GlUtil.checkGlError();
}
/**
* Maps the pixel buffer object's data store of a given size and returns a {@link ByteBuffer} of
* OpenGL managed memory.
*
* <p>The application must not write into the returned {@link ByteBuffer}.
*
* <p>The pixel buffer object should have a {@linkplain #schedulePixelBufferRead previously
* scheduled pixel buffer read}.
*
* <p>When the application no longer needs to access the returned buffer, call {@link
* #unmapPixelBufferObject}.
*
* <p>This call blocks until the pixel buffer data from the last {@link #schedulePixelBufferRead}
* call is available.
*
* <p>Requires API 24: see {@link #schedulePixelBufferRead}.
*
* @param bufferId The pixel buffer object.
* @param size The size of the pixel buffer object's data store to be mapped.
* @return The {@link ByteBuffer} that holds pixel data.
*/
@RequiresApi(24)
public static ByteBuffer mapPixelBufferObject(int bufferId, int size) throws GlException {
GLES20.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, bufferId);
checkGlError();
ByteBuffer mappedPixelBuffer =
(ByteBuffer)
GLES30.glMapBufferRange(
GLES30.GL_PIXEL_PACK_BUFFER,
/* offset= */ 0,
/* length= */ size,
GLES30.GL_MAP_READ_BIT);
GlUtil.checkGlError();
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, /* buffer= */ 0);
GlUtil.checkGlError();
return mappedPixelBuffer;
}
/**
* Unmaps the pixel buffer object {@code bufferId}'s data store.
*
* <p>The pixel buffer object should be previously {@linkplain #mapPixelBufferObject mapped}.
*
* <p>After this method returns, accessing data inside a previously {@linkplain
* #mapPixelBufferObject mapped} {@link ByteBuffer} results in undefined behaviour.
*
* <p>When this method returns, the pixel buffer object {@code bufferId} can be reused by {@link
* #schedulePixelBufferRead}.
*
* <p>Requires API 24: see {@link #schedulePixelBufferRead}.
*
* @param bufferId The pixel buffer object.
*/
@RequiresApi(24)
public static void unmapPixelBufferObject(int bufferId) throws GlException {
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, bufferId);
GlUtil.checkGlError();
GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
GlUtil.checkGlError();
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, /* buffer= */ 0);
GlUtil.checkGlError();
}
/** Deletes a buffer object, or silently ignores the method call if {@code bufferId} is unused. */
public static void deleteBuffer(int bufferId) throws GlException {
GLES20.glDeleteBuffers(/* n= */ 1, new int[] {bufferId}, /* offset= */ 0);
checkGlError();
}
/**
* Throws a {@link GlException} with the given message if {@code expression} evaluates to {@code
* false}.

View file

@ -198,9 +198,6 @@ public final class ListenerSet<T extends @NonNull Object> {
/** Removes all listeners from the set. */
public void clear() {
verifyCurrentThread();
for (ListenerHolder<T> listenerHolder : listeners) {
listenerHolder.release(iterationFinishedEvent);
}
listeners.clear();
}

View file

@ -15,8 +15,6 @@
*/
package androidx.media3.common.util;
import static java.lang.Math.max;
import java.util.Arrays;
/** An append-only, auto-growing {@code long[]}. */
@ -51,20 +49,6 @@ public final class LongArray {
values[size++] = value;
}
/**
* Appends all elements of the specified array.
*
* @param values The array whose elements are to be added.
*/
public void addAll(long[] values) {
int newSize = size + values.length;
if (newSize > this.values.length) {
this.values = Arrays.copyOf(this.values, max(this.values.length * 2, newSize));
}
System.arraycopy(values, 0, this.values, size, values.length);
size = newSize;
}
/**
* Returns the value at a specified index.
*

View file

@ -28,7 +28,6 @@ import androidx.media3.common.MimeTypes;
import com.google.common.collect.ImmutableList;
import java.nio.ByteBuffer;
import java.util.List;
import java.util.Objects;
/** Helper class containing utility methods for managing {@link MediaFormat} instances. */
@UnstableApi
@ -80,7 +79,7 @@ public final class MediaFormatUtil {
.setAverageBitrate(
getInteger(
mediaFormat, MediaFormat.KEY_BIT_RATE, /* defaultValue= */ Format.NO_VALUE))
.setCodecs(getCodecString(mediaFormat))
.setCodecs(mediaFormat.getString(MediaFormat.KEY_CODECS_STRING))
.setFrameRate(getFrameRate(mediaFormat, /* defaultValue= */ Format.NO_VALUE))
.setWidth(
getInteger(mediaFormat, MediaFormat.KEY_WIDTH, /* defaultValue= */ Format.NO_VALUE))
@ -96,7 +95,8 @@ public final class MediaFormatUtil {
/* defaultValue= */ Format.NO_VALUE))
.setRotationDegrees(
getInteger(mediaFormat, MediaFormat.KEY_ROTATION, /* defaultValue= */ 0))
.setColorInfo(getColorInfo(mediaFormat))
// TODO(b/278101856): Disallow invalid values after confirming.
.setColorInfo(getColorInfo(mediaFormat, /* allowInvalidValues= */ true))
.setSampleRate(
getInteger(
mediaFormat, MediaFormat.KEY_SAMPLE_RATE, /* defaultValue= */ Format.NO_VALUE))
@ -269,6 +269,13 @@ public final class MediaFormatUtil {
*/
@Nullable
public static ColorInfo getColorInfo(MediaFormat mediaFormat) {
return getColorInfo(mediaFormat, /* allowInvalidValues= */ false);
}
// Internal methods.
@Nullable
private static ColorInfo getColorInfo(MediaFormat mediaFormat, boolean allowInvalidValues) {
if (SDK_INT < 24) {
// MediaFormat KEY_COLOR_TRANSFER and other KEY_COLOR values available from API 24.
return null;
@ -286,17 +293,21 @@ public final class MediaFormatUtil {
@Nullable
byte[] hdrStaticInfo =
hdrStaticInfoByteBuffer != null ? getArray(hdrStaticInfoByteBuffer) : null;
// Some devices may produce invalid values from MediaFormat#getInteger.
// See b/239435670 for more information.
if (!isValidColorSpace(colorSpace)) {
colorSpace = Format.NO_VALUE;
}
if (!isValidColorRange(colorRange)) {
colorRange = Format.NO_VALUE;
}
if (!isValidColorTransfer(colorTransfer)) {
colorTransfer = Format.NO_VALUE;
if (!allowInvalidValues) {
// Some devices may produce invalid values from MediaFormat#getInteger.
// See b/239435670 for more information.
if (!isValidColorSpace(colorSpace)) {
colorSpace = Format.NO_VALUE;
}
if (!isValidColorRange(colorRange)) {
colorRange = Format.NO_VALUE;
}
if (!isValidColorTransfer(colorTransfer)) {
colorTransfer = Format.NO_VALUE;
}
}
if (colorSpace != Format.NO_VALUE
|| colorRange != Format.NO_VALUE
|| colorTransfer != Format.NO_VALUE
@ -321,32 +332,6 @@ public final class MediaFormatUtil {
return mediaFormat.containsKey(name) ? mediaFormat.getFloat(name) : defaultValue;
}
/** Supports {@link MediaFormat#getString(String, String)} for {@code API < 29}. */
@Nullable
public static String getString(
MediaFormat mediaFormat, String name, @Nullable String defaultValue) {
return mediaFormat.containsKey(name) ? mediaFormat.getString(name) : defaultValue;
}
/**
* Returns a {@code Codecs string} of {@link MediaFormat}. In case of an H263 codec string, builds
* and returns an RFC 6381 H263 codec string using profile and level.
*/
@Nullable
@SuppressLint("InlinedApi") // Inlined MediaFormat keys.
private static String getCodecString(MediaFormat mediaFormat) {
// Add H263 profile and level to codec string as per RFC 6381.
if (Objects.equals(mediaFormat.getString(MediaFormat.KEY_MIME), MimeTypes.VIDEO_H263)
&& mediaFormat.containsKey(MediaFormat.KEY_PROFILE)
&& mediaFormat.containsKey(MediaFormat.KEY_LEVEL)) {
return CodecSpecificDataUtil.buildH263CodecString(
mediaFormat.getInteger(MediaFormat.KEY_PROFILE),
mediaFormat.getInteger(MediaFormat.KEY_LEVEL));
} else {
return getString(mediaFormat, MediaFormat.KEY_CODECS_STRING, /* defaultValue= */ null);
}
}
/**
* Returns the frame rate from a {@link MediaFormat}.
*

View file

@ -17,9 +17,9 @@ package androidx.media3.common.util;
import static java.lang.Math.min;
import com.google.common.base.Charsets;
import com.google.errorprone.annotations.CheckReturnValue;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
/** Wraps a byte array, providing methods that allow it to be read as a bitstream. */
@UnstableApi
@ -285,7 +285,7 @@ public final class ParsableBitArray {
* @return The string encoded by the bytes in UTF-8.
*/
public String readBytesAsString(int length) {
return readBytesAsString(length, StandardCharsets.UTF_8);
return readBytesAsString(length, Charsets.UTF_8);
}
/**

View file

@ -16,14 +16,13 @@
package androidx.media3.common.util;
import androidx.annotation.Nullable;
import com.google.common.base.Charsets;
import com.google.common.collect.ImmutableSet;
import com.google.common.primitives.Chars;
import com.google.common.primitives.Ints;
import com.google.common.primitives.UnsignedBytes;
import com.google.errorprone.annotations.CheckReturnValue;
import java.nio.ByteBuffer;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
/**
@ -38,11 +37,7 @@ public final class ParsableByteArray {
private static final char[] LF = {'\n'};
private static final ImmutableSet<Charset> SUPPORTED_CHARSETS_FOR_READLINE =
ImmutableSet.of(
StandardCharsets.US_ASCII,
StandardCharsets.UTF_8,
StandardCharsets.UTF_16,
StandardCharsets.UTF_16BE,
StandardCharsets.UTF_16LE);
Charsets.US_ASCII, Charsets.UTF_8, Charsets.UTF_16, Charsets.UTF_16BE, Charsets.UTF_16LE);
private byte[] data;
private int position;
@ -243,8 +238,8 @@ public final class ParsableByteArray {
/**
* Peeks at the next char.
*
* <p>Equivalent to passing {@link StandardCharsets#UTF_16} or {@link StandardCharsets#UTF_16BE}
* to {@link #peekChar(Charset)}.
* <p>Equivalent to passing {@link Charsets#UTF_16} or {@link Charsets#UTF_16BE} to {@link
* #peekChar(Charset)}.
*/
public char peekChar() {
return (char) ((data[position] & 0xFF) << 8 | (data[position + 1] & 0xFF));
@ -451,7 +446,7 @@ public final class ParsableByteArray {
* @return The string encoded by the bytes.
*/
public String readString(int length) {
return readString(length, StandardCharsets.UTF_8);
return readString(length, Charsets.UTF_8);
}
/**
@ -525,11 +520,11 @@ public final class ParsableByteArray {
/**
* Reads a line of text in UTF-8.
*
* <p>Equivalent to passing {@link StandardCharsets#UTF_8} to {@link #readLine(Charset)}.
* <p>Equivalent to passing {@link Charsets#UTF_8} to {@link #readLine(Charset)}.
*/
@Nullable
public String readLine() {
return readLine(StandardCharsets.UTF_8);
return readLine(Charsets.UTF_8);
}
/**
@ -555,7 +550,7 @@ public final class ParsableByteArray {
if (bytesLeft() == 0) {
return null;
}
if (!charset.equals(StandardCharsets.US_ASCII)) {
if (!charset.equals(Charsets.US_ASCII)) {
Charset unused = readUtfCharsetFromBom(); // Skip BOM if present
}
int lineLimit = findNextLineTerminator(charset);
@ -602,41 +597,6 @@ public final class ParsableByteArray {
return value;
}
/**
* Reads a little endian long of variable length.
*
* @throws IllegalStateException if the byte to be read is over the limit of the parsable byte
* array
* @return long value
*/
public long readUnsignedLeb128ToLong() {
long value = 0;
// At most, 63 bits of unsigned data can be stored in a long, which corresponds to 63/7=9 bytes
// in LEB128.
for (int i = 0; i < 9; i++) {
if (this.position == limit) {
throw new IllegalStateException("Attempting to read a byte over the limit.");
}
long currentByte = this.readUnsignedByte();
value |= (currentByte & 0x7F) << (i * 7);
if ((currentByte & 0x80) == 0) {
break;
}
}
return value;
}
/**
* Reads a little endian integer of variable length.
*
* @throws IllegalArgumentException if the read value is greater than {@link Integer#MAX_VALUE} or
* less than {@link Integer#MIN_VALUE}
* @return integer value
*/
public int readUnsignedLeb128ToInt() {
return Ints.checkedCast(readUnsignedLeb128ToLong());
}
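The LEB128 readers above consume 7 payload bits per byte, little endian, with the high bit of each byte flagging continuation. A self-contained sketch of the same decoding over a plain byte array (class name hypothetical; the sample bytes are the standard LEB128 example):

```java
// Sketch of unsigned LEB128 decoding as performed by
// readUnsignedLeb128ToLong above: 7 bits per byte, little endian,
// high bit set means another byte follows.
public final class Leb128Demo {
  static long decode(byte[] bytes) {
    long value = 0;
    for (int i = 0; i < bytes.length; i++) {
      value |= (bytes[i] & 0x7FL) << (i * 7);
      if ((bytes[i] & 0x80) == 0) {
        break; // Continuation bit clear: this was the last byte.
      }
    }
    return value;
  }

  public static void main(String[] args) {
    // 0xE5 0x8E 0x26 is the classic LEB128 encoding of 624485.
    System.out.println(decode(new byte[] {(byte) 0xE5, (byte) 0x8E, 0x26}));
  }
}
```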
/**
* Reads a UTF byte order mark (BOM) and returns the UTF {@link Charset} it represents. Returns
* {@code null} without advancing {@link #getPosition() position} if no BOM is found.
@ -648,14 +608,14 @@ public final class ParsableByteArray {
&& data[position + 1] == (byte) 0xBB
&& data[position + 2] == (byte) 0xBF) {
position += 3;
return StandardCharsets.UTF_8;
return Charsets.UTF_8;
} else if (bytesLeft() >= 2) {
if (data[position] == (byte) 0xFE && data[position + 1] == (byte) 0xFF) {
position += 2;
return StandardCharsets.UTF_16BE;
return Charsets.UTF_16BE;
} else if (data[position] == (byte) 0xFF && data[position + 1] == (byte) 0xFE) {
position += 2;
return StandardCharsets.UTF_16LE;
return Charsets.UTF_16LE;
}
}
return null;
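The BOM probe above recognizes three byte order marks: `EF BB BF` (UTF-8), `FE FF` (UTF-16BE) and `FF FE` (UTF-16LE). A standalone sketch of the same detection over a byte array (class name hypothetical):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

// Sketch of the byte-order-mark detection in readUtfCharsetFromBom:
// EF BB BF -> UTF-8, FE FF -> UTF-16BE, FF FE -> UTF-16LE, else null.
public final class BomDemo {
  static Charset detectBom(byte[] data) {
    if (data.length >= 3
        && data[0] == (byte) 0xEF
        && data[1] == (byte) 0xBB
        && data[2] == (byte) 0xBF) {
      return StandardCharsets.UTF_8;
    }
    if (data.length >= 2) {
      if (data[0] == (byte) 0xFE && data[1] == (byte) 0xFF) {
        return StandardCharsets.UTF_16BE;
      }
      if (data[0] == (byte) 0xFF && data[1] == (byte) 0xFE) {
        return StandardCharsets.UTF_16LE;
      }
    }
    return null; // No recognized BOM.
  }

  public static void main(String[] args) {
    System.out.println(detectBom(new byte[] {(byte) 0xEF, (byte) 0xBB, (byte) 0xBF, 'a'}));
  }
}
```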
@ -666,25 +626,24 @@ public final class ParsableByteArray {
*/
private int findNextLineTerminator(Charset charset) {
int stride;
if (charset.equals(StandardCharsets.UTF_8) || charset.equals(StandardCharsets.US_ASCII)) {
if (charset.equals(Charsets.UTF_8) || charset.equals(Charsets.US_ASCII)) {
stride = 1;
} else if (charset.equals(StandardCharsets.UTF_16)
|| charset.equals(StandardCharsets.UTF_16LE)
|| charset.equals(StandardCharsets.UTF_16BE)) {
} else if (charset.equals(Charsets.UTF_16)
|| charset.equals(Charsets.UTF_16LE)
|| charset.equals(Charsets.UTF_16BE)) {
stride = 2;
} else {
throw new IllegalArgumentException("Unsupported charset: " + charset);
}
for (int i = position; i < limit - (stride - 1); i += stride) {
if ((charset.equals(StandardCharsets.UTF_8) || charset.equals(StandardCharsets.US_ASCII))
if ((charset.equals(Charsets.UTF_8) || charset.equals(Charsets.US_ASCII))
&& Util.isLinebreak(data[i])) {
return i;
} else if ((charset.equals(StandardCharsets.UTF_16)
|| charset.equals(StandardCharsets.UTF_16BE))
} else if ((charset.equals(Charsets.UTF_16) || charset.equals(Charsets.UTF_16BE))
&& data[i] == 0x00
&& Util.isLinebreak(data[i + 1])) {
return i;
} else if (charset.equals(StandardCharsets.UTF_16LE)
} else if (charset.equals(Charsets.UTF_16LE)
&& data[i + 1] == 0x00
&& Util.isLinebreak(data[i])) {
return i;
@ -732,16 +691,14 @@ public final class ParsableByteArray {
private int peekCharacterAndSize(Charset charset) {
byte character;
short characterSize;
if ((charset.equals(StandardCharsets.UTF_8) || charset.equals(StandardCharsets.US_ASCII))
&& bytesLeft() >= 1) {
if ((charset.equals(Charsets.UTF_8) || charset.equals(Charsets.US_ASCII)) && bytesLeft() >= 1) {
character = (byte) Chars.checkedCast(UnsignedBytes.toInt(data[position]));
characterSize = 1;
} else if ((charset.equals(StandardCharsets.UTF_16)
|| charset.equals(StandardCharsets.UTF_16BE))
} else if ((charset.equals(Charsets.UTF_16) || charset.equals(Charsets.UTF_16BE))
&& bytesLeft() >= 2) {
character = (byte) Chars.fromBytes(data[position], data[position + 1]);
characterSize = 2;
} else if (charset.equals(StandardCharsets.UTF_16LE) && bytesLeft() >= 2) {
} else if (charset.equals(Charsets.UTF_16LE) && bytesLeft() >= 2) {
character = (byte) Chars.fromBytes(data[position + 1], data[position]);
characterSize = 2;
} else {

View file

@ -63,12 +63,12 @@ public final class RepeatModeUtil {
/**
* Gets the next repeat mode out of {@code enabledModes} starting from {@code currentMode}.
*
* @param currentMode The current {@link Player.RepeatMode}.
* @param enabledModes The bitmask of enabled {@link RepeatToggleModes}.
* @param currentMode The current repeat mode.
* @param enabledModes Bitmask of enabled modes.
* @return The next repeat mode.
*/
public static @Player.RepeatMode int getNextRepeatMode(
@Player.RepeatMode int currentMode, @RepeatToggleModes int enabledModes) {
@Player.RepeatMode int currentMode, int enabledModes) {
for (int offset = 1; offset <= 2; offset++) {
@Player.RepeatMode int proposedMode = (currentMode + offset) % 3;
if (isRepeatModeEnabled(proposedMode, enabledModes)) {
@ -79,15 +79,13 @@ public final class RepeatModeUtil {
}
/**
* Verifies whether a given {@link Player.RepeatMode} is enabled in the bitmask of {@link
* RepeatToggleModes}.
* Verifies whether a given {@code repeatMode} is enabled in the bitmask {@code enabledModes}.
*
* @param repeatMode The {@link Player.RepeatMode} to check.
* @param enabledModes The bitmask of enabled {@link RepeatToggleModes}.
* @param repeatMode The mode to check.
* @param enabledModes The bitmask representing the enabled modes.
* @return {@code true} if enabled.
*/
public static boolean isRepeatModeEnabled(
@Player.RepeatMode int repeatMode, @RepeatToggleModes int enabledModes) {
public static boolean isRepeatModeEnabled(@Player.RepeatMode int repeatMode, int enabledModes) {
switch (repeatMode) {
case Player.REPEAT_MODE_OFF:
return true;

View file

@ -271,7 +271,7 @@ public final class TimestampAdjuster {
* @return The corresponding value in microseconds.
*/
public static long ptsToUs(long pts) {
return Util.scaleLargeTimestamp(pts, C.MICROS_PER_SECOND, 90000);
return (pts * C.MICROS_PER_SECOND) / 90000;
}
/**
@ -295,6 +295,6 @@ public final class TimestampAdjuster {
* @return The corresponding value as a 90 kHz clock timestamp.
*/
public static long usToNonWrappedPts(long us) {
return Util.scaleLargeTimestamp(us, 90000, C.MICROS_PER_SECOND);
return (us * 90000) / C.MICROS_PER_SECOND;
}
}
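The two conversions above move between the MPEG 90 kHz clock and microseconds. The direct `(pts * 1_000_000) / 90_000` form overflows a `long` once `pts` exceeds about 9.2e12 ticks; reducing the fraction first (1,000,000 / 90,000 = 100 / 9) pushes that limit out by a factor of 10,000, which is the kind of overflow-safe scaling `Util.scaleLargeTimestamp` provides. A minimal sketch (class name hypothetical):

```java
// Sketch of the 90 kHz -> microseconds conversion with the fraction
// reduced up front (100/9) to delay long overflow.
public final class PtsDemo {
  static long ptsToUs(long pts) {
    return (pts * 100) / 9; // Same ratio as 1_000_000 / 90_000.
  }

  public static void main(String[] args) {
    System.out.println(ptsToUs(90_000)); // One second of 90 kHz ticks: 1000000
    System.out.println(ptsToUs(45_000)); // Half a second: 500000
  }
}
```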

View file

@ -16,11 +16,6 @@
package androidx.media3.common.util;
import static android.content.Context.UI_MODE_SERVICE;
import static androidx.media3.common.C.AUXILIARY_TRACK_TYPE_DEPTH_INVERSE;
import static androidx.media3.common.C.AUXILIARY_TRACK_TYPE_DEPTH_LINEAR;
import static androidx.media3.common.C.AUXILIARY_TRACK_TYPE_DEPTH_METADATA;
import static androidx.media3.common.C.AUXILIARY_TRACK_TYPE_ORIGINAL;
import static androidx.media3.common.C.AUXILIARY_TRACK_TYPE_UNDEFINED;
import static androidx.media3.common.Player.COMMAND_PLAY_PAUSE;
import static androidx.media3.common.Player.COMMAND_PREPARE;
import static androidx.media3.common.Player.COMMAND_SEEK_BACK;
@ -80,6 +75,7 @@ import android.view.Display;
import android.view.SurfaceView;
import android.view.WindowManager;
import androidx.annotation.ChecksSdkIntAtLeast;
import androidx.annotation.DoNotInline;
import androidx.annotation.DrawableRes;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
@ -95,6 +91,7 @@ import androidx.media3.common.Player;
import androidx.media3.common.Player.Commands;
import androidx.media3.common.audio.AudioProcessor;
import com.google.common.base.Ascii;
import com.google.common.base.Charsets;
import com.google.common.io.ByteStreams;
import com.google.common.math.DoubleMath;
import com.google.common.math.LongMath;
@ -105,7 +102,6 @@ import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.MoreExecutors;
import com.google.common.util.concurrent.SettableFuture;
import com.google.errorprone.annotations.InlineMe;
import java.io.ByteArrayOutputStream;
import java.io.Closeable;
import java.io.File;
@ -116,7 +112,6 @@ import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Arrays;
@ -480,15 +475,16 @@ public final class Util {
}
/**
* @deprecated Use {@link Objects#equals(Object, Object)} instead.
* Tests two objects for {@link Object#equals(Object)} equality, handling the case where one or
* both may be {@code null}.
*
* @param o1 The first object.
* @param o2 The second object.
* @return {@code o1 == null ? o2 == null : o1.equals(o2)}.
*/
@UnstableApi
@Deprecated
@InlineMe(
replacement = "Objects.equals(o1, o2)",
imports = {"java.util.Objects"})
public static boolean areEqual(@Nullable Object o1, @Nullable Object o2) {
return Objects.equals(o1, o2);
return o1 == null ? o2 == null : o1.equals(o2);
}
/**
@ -966,14 +962,16 @@ public final class Util {
/**
* Returns the language tag for a {@link Locale}.
*
* <p>This tag is IETF BCP 47 compliant.
* <p>For API levels &ge; 21, this tag is IETF BCP 47 compliant. Use {@link
* #normalizeLanguageCode(String)} to retrieve a normalized IETF BCP 47 language tag for all API
* levels if needed.
*
* @param locale A {@link Locale}.
* @return The language tag.
*/
@UnstableApi
public static String getLocaleLanguageTag(Locale locale) {
return locale.toLanguageTag();
return SDK_INT >= 21 ? getLocaleLanguageTagV21(locale) : locale.toString();
}
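The hunk above concerns the IETF BCP 47 tag produced by `Locale#toLanguageTag`, which uses `-` subtag separators, unlike `Locale#toString`, which uses `_`. A quick illustration of that difference (class name hypothetical):

```java
import java.util.Locale;

// Sketch: BCP 47 language tags use '-' separators (toLanguageTag),
// whereas Locale#toString uses '_'.
public final class LanguageTagDemo {
  public static void main(String[] args) {
    Locale locale = new Locale("en", "US");
    System.out.println(locale.toLanguageTag()); // en-US
    System.out.println(locale.toString());      // en_US
  }
}
```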
/**
@ -1045,7 +1043,7 @@ public final class Util {
*/
@UnstableApi
public static String fromUtf8Bytes(byte[] bytes) {
return new String(bytes, StandardCharsets.UTF_8);
return new String(bytes, Charsets.UTF_8);
}
/**
@ -1058,7 +1056,7 @@ public final class Util {
*/
@UnstableApi
public static String fromUtf8Bytes(byte[] bytes, int offset, int length) {
return new String(bytes, offset, length, StandardCharsets.UTF_8);
return new String(bytes, offset, length, Charsets.UTF_8);
}
/**
@ -1069,7 +1067,7 @@ public final class Util {
*/
@UnstableApi
public static byte[] getUtf8Bytes(String value) {
return value.getBytes(StandardCharsets.UTF_8);
return value.getBytes(Charsets.UTF_8);
}
/**
@ -1600,7 +1598,7 @@ public final class Util {
*/
@UnstableApi
public static long sampleCountToDurationUs(long sampleCount, int sampleRate) {
return scaleLargeValue(sampleCount, C.MICROS_PER_SECOND, sampleRate, RoundingMode.DOWN);
return scaleLargeValue(sampleCount, C.MICROS_PER_SECOND, sampleRate, RoundingMode.FLOOR);
}
/**
@ -1617,7 +1615,7 @@ public final class Util {
*/
@UnstableApi
public static long durationUsToSampleCount(long durationUs, int sampleRate) {
return scaleLargeValue(durationUs, sampleRate, C.MICROS_PER_SECOND, RoundingMode.UP);
return scaleLargeValue(durationUs, sampleRate, C.MICROS_PER_SECOND, RoundingMode.CEILING);
}
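The two methods above deliberately round in opposite directions: sample counts round down when converted to microseconds, while durations round up when converted back to samples, so a round trip never reports fewer samples than were present. A self-contained sketch of that behavior using `BigDecimal` in place of `scaleLargeValue` (class and method names hypothetical):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Sketch of the asymmetric rounding above: DOWN for samples -> us,
// UP for us -> samples, so the round trip never loses samples.
public final class SampleRoundingDemo {
  static long sampleCountToDurationUs(long samples, int sampleRate) {
    return BigDecimal.valueOf(samples)
        .multiply(BigDecimal.valueOf(1_000_000))
        .divide(BigDecimal.valueOf(sampleRate), RoundingMode.DOWN)
        .longValueExact();
  }

  static long durationUsToSampleCount(long durationUs, int sampleRate) {
    return BigDecimal.valueOf(durationUs)
        .multiply(BigDecimal.valueOf(sampleRate))
        .divide(BigDecimal.valueOf(1_000_000), RoundingMode.UP)
        .longValueExact();
  }

  public static void main(String[] args) {
    long us = sampleCountToDurationUs(1_001, 48_000); // 20854.166... -> 20854
    System.out.println(us);
    System.out.println(durationUsToSampleCount(us, 48_000)); // Rounds back up to 1001.
  }
}
```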
/**
@ -1902,18 +1900,16 @@ public final class Util {
* Scales a large timestamp.
*
* <p>Equivalent to {@link #scaleLargeValue(long, long, long, RoundingMode)} with {@link
* RoundingMode#DOWN}.
* RoundingMode#FLOOR}.
*
* @param timestamp The timestamp to scale.
* @param multiplier The multiplier.
* @param divisor The divisor.
* @return The scaled timestamp.
*/
// TODO: b/372204124 - Consider switching this (and impls below) to HALF_UP rounding to reduce
// round-trip errors when switching between time bases with different resolutions.
@UnstableApi
public static long scaleLargeTimestamp(long timestamp, long multiplier, long divisor) {
return scaleLargeValue(timestamp, multiplier, divisor, RoundingMode.DOWN);
return scaleLargeValue(timestamp, multiplier, divisor, RoundingMode.FLOOR);
}
/**
@@ -1926,7 +1922,7 @@ public final class Util {
*/
@UnstableApi
public static long[] scaleLargeTimestamps(List<Long> timestamps, long multiplier, long divisor) {
return scaleLargeValues(timestamps, multiplier, divisor, RoundingMode.DOWN);
return scaleLargeValues(timestamps, multiplier, divisor, RoundingMode.FLOOR);
}
/**
@@ -1938,7 +1934,7 @@ public final class Util {
*/
@UnstableApi
public static void scaleLargeTimestampsInPlace(long[] timestamps, long multiplier, long divisor) {
scaleLargeValuesInPlace(timestamps, multiplier, divisor, RoundingMode.DOWN);
scaleLargeValuesInPlace(timestamps, multiplier, divisor, RoundingMode.FLOOR);
}
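
The `DOWN` ↔ `FLOOR` swap in the hunks above only changes behavior for negative timestamps: `DOWN` truncates toward zero while `FLOOR` rounds toward negative infinity. A small illustration in plain Java (not the Media3 `scaleLargeValue` implementation):

```java
public class RoundingDirection {
    public static void main(String[] args) {
        long timestamp = -7, multiplier = 1, divisor = 2; // scaled value is -3.5
        long down = timestamp * multiplier / divisor;      // Java '/' truncates toward zero
        long floor = Math.floorDiv(timestamp * multiplier, divisor); // toward -infinity
        System.out.println(down);  // -3
        System.out.println(floor); // -4
    }
}
```

For non-negative inputs the two modes agree, so only negative-timestamp call sites observe the difference.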
/**
@@ -2250,24 +2246,6 @@ public final class Util {
}
case 12:
return AudioFormat.CHANNEL_OUT_7POINT1POINT4;
case 24:
if (Util.SDK_INT >= 32) {
return AudioFormat.CHANNEL_OUT_7POINT1POINT4
| AudioFormat.CHANNEL_OUT_FRONT_LEFT_OF_CENTER
| AudioFormat.CHANNEL_OUT_FRONT_RIGHT_OF_CENTER
| AudioFormat.CHANNEL_OUT_BACK_CENTER
| AudioFormat.CHANNEL_OUT_TOP_CENTER
| AudioFormat.CHANNEL_OUT_TOP_FRONT_CENTER
| AudioFormat.CHANNEL_OUT_TOP_BACK_CENTER
| AudioFormat.CHANNEL_OUT_TOP_SIDE_LEFT
| AudioFormat.CHANNEL_OUT_TOP_SIDE_RIGHT
| AudioFormat.CHANNEL_OUT_BOTTOM_FRONT_LEFT
| AudioFormat.CHANNEL_OUT_BOTTOM_FRONT_RIGHT
| AudioFormat.CHANNEL_OUT_BOTTOM_FRONT_CENTER
| AudioFormat.CHANNEL_OUT_LOW_FREQUENCY_2;
} else {
return AudioFormat.CHANNEL_INVALID;
}
default:
return AudioFormat.CHANNEL_INVALID;
}
@@ -2275,6 +2253,7 @@ public final class Util {
/** Creates {@link AudioFormat} with given sampleRate, channelConfig, and encoding. */
@UnstableApi
@RequiresApi(21)
public static AudioFormat getAudioFormat(int sampleRate, int channelConfig, int encoding) {
return new AudioFormat.Builder()
.setSampleRate(sampleRate)
@@ -2334,30 +2313,19 @@ public final class Util {
*/
@UnstableApi
public static int getPcmFrameSize(@C.PcmEncoding int pcmEncoding, int channelCount) {
return getByteDepth(pcmEncoding) * channelCount;
}
/**
* Returns the byte depth for audio with the specified encoding.
*
* @param pcmEncoding The encoding of the audio data.
* @return The byte depth of the audio.
*/
@UnstableApi
public static int getByteDepth(@C.PcmEncoding int pcmEncoding) {
switch (pcmEncoding) {
case C.ENCODING_PCM_8BIT:
return 1;
return channelCount;
case C.ENCODING_PCM_16BIT:
case C.ENCODING_PCM_16BIT_BIG_ENDIAN:
return 2;
return channelCount * 2;
case C.ENCODING_PCM_24BIT:
case C.ENCODING_PCM_24BIT_BIG_ENDIAN:
return 3;
return channelCount * 3;
case C.ENCODING_PCM_32BIT:
case C.ENCODING_PCM_32BIT_BIG_ENDIAN:
case C.ENCODING_PCM_FLOAT:
return 4;
return channelCount * 4;
case C.ENCODING_INVALID:
case Format.NO_VALUE:
default:
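
The release branch factors the per-sample size out of `getPcmFrameSize` into a new `getByteDepth`, so a frame size is always byte depth × channel count instead of a per-encoding multiplication. A simplified sketch of that relationship (bit depths stand in for the `C.ENCODING_PCM_*` constants; not the Media3 implementation):

```java
public class PcmFrameSize {
    // Bytes per sample for a given bit depth (stands in for getByteDepth).
    static int byteDepth(int bitsPerSample) {
        return bitsPerSample / 8;
    }

    // A PCM frame holds one sample per channel.
    static int pcmFrameSize(int bitsPerSample, int channelCount) {
        return byteDepth(bitsPerSample) * channelCount;
    }

    public static void main(String[] args) {
        System.out.println(pcmFrameSize(16, 2)); // 4: stereo 16-bit PCM
        System.out.println(pcmFrameSize(24, 6)); // 18: 5.1-channel 24-bit PCM
    }
}
```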
@@ -2449,6 +2417,7 @@ public final class Util {
* @see AudioManager#generateAudioSessionId()
*/
@UnstableApi
@RequiresApi(21)
public static int generateAudioSessionIdV21(Context context) {
@Nullable
AudioManager audioManager = ((AudioManager) context.getSystemService(Context.AUDIO_SERVICE));
@@ -3096,7 +3065,8 @@ public final class Util {
*/
@UnstableApi
public static boolean isWear(Context context) {
return context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_WATCH);
return SDK_INT >= 20
&& context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_WATCH);
}
/**
@@ -3324,32 +3294,9 @@ public final class Util {
if ((roleFlags & C.ROLE_FLAG_TRICK_PLAY) != 0) {
result.add("trick-play");
}
if ((roleFlags & C.ROLE_FLAG_AUXILIARY) != 0) {
result.add("auxiliary");
}
return result;
}
/** Returns a string representation of the {@link C.AuxiliaryTrackType}. */
@UnstableApi
public static String getAuxiliaryTrackTypeString(@C.AuxiliaryTrackType int auxiliaryTrackType) {
// LINT.IfChange(auxiliary_track_type)
switch (auxiliaryTrackType) {
case AUXILIARY_TRACK_TYPE_UNDEFINED:
return "undefined";
case AUXILIARY_TRACK_TYPE_ORIGINAL:
return "original";
case AUXILIARY_TRACK_TYPE_DEPTH_LINEAR:
return "depth-linear";
case AUXILIARY_TRACK_TYPE_DEPTH_INVERSE:
return "depth-inverse";
case AUXILIARY_TRACK_TYPE_DEPTH_METADATA:
return "depth metadata";
default:
throw new IllegalStateException("Unsupported auxiliary track type");
}
}
/**
* Returns the current time in milliseconds since the epoch.
*
@@ -3426,14 +3373,13 @@ public final class Util {
// bounds. From API 29, if the app targets API 29 or later, the {@link
// MediaFormat#KEY_ALLOW_FRAME_DROP} key prevents frame dropping even when the surface is
// full.
// Some devices might drop frames despite setting {@link
// MediaFormat#KEY_ALLOW_FRAME_DROP} to 0. See b/307518793, b/289983935 and b/353487886.
// Some API 30 devices might drop frames despite setting {@link
// MediaFormat#KEY_ALLOW_FRAME_DROP} to 0. See b/307518793 and b/289983935.
return SDK_INT < 29
|| context.getApplicationInfo().targetSdkVersion < 29
|| ((SDK_INT == 30
&& (Ascii.equalsIgnoreCase(MODEL, "moto g(20)")
|| Ascii.equalsIgnoreCase(MODEL, "rmx3231")))
|| (SDK_INT == 34 && Ascii.equalsIgnoreCase(MODEL, "sm-x200")));
|| (SDK_INT == 30
&& (Ascii.equalsIgnoreCase(MODEL, "moto g(20)")
|| Ascii.equalsIgnoreCase(MODEL, "rmx3231")));
}
/**
@@ -3553,7 +3499,9 @@ public final class Util {
@UnstableApi
public static Drawable getDrawable(
Context context, Resources resources, @DrawableRes int drawableRes) {
return resources.getDrawable(drawableRes, context.getTheme());
return SDK_INT >= 21
? Api21.getDrawable(context, resources, drawableRes)
: resources.getDrawable(drawableRes);
}
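
On the 1.4.0 side, the API-21 call is routed through a nested `Api21` class (defined near the end of this file's diff) rather than calling `resources.getDrawable(res, theme)` inline: keeping newer-API calls in a separate `@RequiresApi` class means older runtimes never verify, and therefore never reject, the containing method. A pure-Java sketch of that gating pattern, with the Android types removed and `SDK_INT` simulated as an assumption:

```java
public class ApiGateDemo {
    static final int SDK_INT = 30; // simulated; on Android this is Build.VERSION.SDK_INT

    static String getTag() {
        // Only touch the nested class when the runtime is new enough.
        return SDK_INT >= 21 ? Api21.tag() : "legacy";
    }

    // On Android this class would carry @RequiresApi(21), with @DoNotInline members.
    private static final class Api21 {
        static String tag() {
            return "api21+";
        }
    }

    public static void main(String[] args) {
        System.out.println(getTag());
    }
}
```

The release branch can drop the indirection because its minimum SDK is at or above 21.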
/**
@@ -3715,6 +3663,11 @@ public final class Util {
return split(config.getLocales().toLanguageTags(), ",");
}
@RequiresApi(21)
private static String getLocaleLanguageTagV21(Locale locale) {
return locale.toLanguageTag();
}
private static HashMap<String, String> createIsoLanguageReplacementMap() {
String[] iso2Languages = Locale.getISOLanguages();
HashMap<String, String> replacedLanguages =
@@ -3934,9 +3887,18 @@ public final class Util {
0xF3
};
@RequiresApi(21)
private static final class Api21 {
@DoNotInline
public static Drawable getDrawable(Context context, Resources resources, @DrawableRes int res) {
return resources.getDrawable(res, context.getTheme());
}
}
@RequiresApi(29)
private static class Api29 {
@DoNotInline
public static void startForeground(
Service mediaSessionService,
int notificationId,
@@ -20,7 +20,6 @@ import static com.google.common.truth.Truth.assertThat;
import android.net.Uri;
import android.os.Bundle;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.common.collect.ImmutableList;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -69,7 +68,6 @@ public class MediaMetadataTest {
assertThat(mediaMetadata.compilation).isNull();
assertThat(mediaMetadata.station).isNull();
assertThat(mediaMetadata.mediaType).isNull();
assertThat(mediaMetadata.supportedCommands).isEmpty();
assertThat(mediaMetadata.extras).isNull();
}
@@ -280,7 +278,6 @@ public class MediaMetadataTest {
.setCompilation("Amazing songs.")
.setStation("radio station")
.setMediaType(MediaMetadata.MEDIA_TYPE_MIXED)
.setSupportedCommands(ImmutableList.of("command1", "command2"))
.setExtras(extras)
.build();
}
@@ -148,7 +148,6 @@ public final class MimeTypesTest {
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_CEA608)).isEqualTo(C.TRACK_TYPE_TEXT);
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_EMSG)).isEqualTo(C.TRACK_TYPE_METADATA);
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_AIT)).isEqualTo(C.TRACK_TYPE_METADATA);
assertThat(MimeTypes.getTrackType(MimeTypes.APPLICATION_CAMERA_MOTION))
.isEqualTo(C.TRACK_TYPE_CAMERA_MOTION);
assertThat(MimeTypes.getTrackType("application/custom")).isEqualTo(C.TRACK_TYPE_UNKNOWN);
@@ -40,9 +40,7 @@ import androidx.media3.common.SimpleBasePlayer.State;
import androidx.media3.common.text.Cue;
import androidx.media3.common.text.CueGroup;
import androidx.media3.common.util.Size;
import androidx.media3.extractor.metadata.icy.IcyInfo;
import androidx.media3.test.utils.FakeMetadataEntry;
import androidx.media3.test.utils.FakeTimeline;
import androidx.media3.test.utils.TestUtil;
import androidx.test.core.app.ApplicationProvider;
import androidx.test.ext.junit.runners.AndroidJUnit4;
@@ -227,15 +225,6 @@ public class SimpleBasePlayerTest {
Size surfaceSize = new Size(480, 360);
DeviceInfo deviceInfo =
new DeviceInfo.Builder(DeviceInfo.PLAYBACK_TYPE_LOCAL).setMaxVolume(7).build();
MediaMetadata mediaMetadata = new MediaMetadata.Builder().setTitle("title").build();
Tracks tracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(new Format.Builder().build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
ImmutableList<SimpleBasePlayer.MediaItemData> playlist =
ImmutableList.of(
new SimpleBasePlayer.MediaItemData.Builder(/* uid= */ new Object()).build(),
@@ -247,8 +236,6 @@ public class SimpleBasePlayerTest {
new AdPlaybackState(
/* adsId= */ new Object(), /* adGroupTimesUs...= */ 555, 666))
.build()))
.setMediaMetadata(mediaMetadata)
.setTracks(tracks)
.build());
MediaMetadata playlistMetadata = new MediaMetadata.Builder().setArtist("artist").build();
SimpleBasePlayer.PositionSupplier contentPositionSupplier = () -> 456;
@@ -325,10 +312,7 @@ public class SimpleBasePlayerTest {
assertThat(state.surfaceSize).isEqualTo(surfaceSize);
assertThat(state.newlyRenderedFirstFrame).isTrue();
assertThat(state.timedMetadata).isEqualTo(timedMetadata);
assertThat(state.getPlaylist()).isEqualTo(playlist);
assertThat(state.timeline.getWindowCount()).isEqualTo(2);
assertThat(state.currentTracks).isEqualTo(tracks);
assertThat(state.currentMetadata).isEqualTo(mediaMetadata);
assertThat(state.playlist).isEqualTo(playlist);
assertThat(state.playlistMetadata).isEqualTo(playlistMetadata);
assertThat(state.currentMediaItemIndex).isEqualTo(1);
assertThat(state.currentAdGroupIndex).isEqualTo(1);
@@ -343,69 +327,6 @@ public class SimpleBasePlayerTest {
assertThat(state.discontinuityPositionMs).isEqualTo(400);
}
@Test
public void stateBuilderBuild_withExplicitTimeline_setsCorrectValues() {
MediaMetadata mediaMetadata = new MediaMetadata.Builder().setTitle("title").build();
Tracks tracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(new Format.Builder().build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
Timeline timeline = new FakeTimeline(/* windowCount= */ 2);
State state = new State.Builder().setPlaylist(timeline, tracks, mediaMetadata).build();
assertThat(state.timeline).isEqualTo(timeline);
assertThat(state.currentTracks).isEqualTo(tracks);
assertThat(state.currentMetadata).isEqualTo(mediaMetadata);
}
@Test
public void
stateBuilderBuild_withUndefinedMediaMetadataAndExplicitTimeline_derivesMediaMetadataFromTracksAndMediaItem()
throws Exception {
Timeline timeline =
new FakeTimeline(
new FakeTimeline.TimelineWindowDefinition(
/* periodCount= */ 1,
/* id= */ 0,
/* isSeekable= */ true,
/* isDynamic= */ true,
/* isLive= */ true,
/* isPlaceholder= */ false,
/* durationUs= */ 1000,
/* defaultPositionUs= */ 0,
/* windowOffsetInFirstPeriodUs= */ 0,
ImmutableList.of(AdPlaybackState.NONE),
new MediaItem.Builder()
.setMediaId("1")
.setMediaMetadata(new MediaMetadata.Builder().setArtist("artist").build())
.build()));
Tracks tracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(
new Format.Builder()
.setMetadata(
new Metadata(
new IcyInfo(
/* rawMetadata= */ new byte[0], "title", /* url= */ null)))
.build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
State state =
new State.Builder().setPlaylist(timeline, tracks, /* currentMetadata= */ null).build();
assertThat(state.currentMetadata)
.isEqualTo(new MediaMetadata.Builder().setArtist("artist").setTitle("title").build());
}
@Test
public void stateBuilderBuild_emptyTimelineWithReadyState_throwsException() {
assertThrows(
@@ -8148,211 +8069,6 @@ public class SimpleBasePlayerTest {
verifyNoMoreInteractions(listener);
}
@SuppressWarnings("deprecation") // Verifying deprecated listener calls.
@Test
public void seekTo_asyncHandlingToNewItem_usesPlaceholderStateWithUpdatedTracksAndMetadata() {
MediaItem newMediaItem = new MediaItem.Builder().setMediaId("2").build();
Tracks newTracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(new Format.Builder().build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
MediaMetadata newMediaMetadata = new MediaMetadata.Builder().setTitle("title").build();
State state =
new State.Builder()
.setAvailableCommands(new Commands.Builder().addAllCommands().build())
.setPlaylist(
ImmutableList.of(
new SimpleBasePlayer.MediaItemData.Builder(/* uid= */ 1).build(),
new SimpleBasePlayer.MediaItemData.Builder(/* uid= */ 2)
.setMediaItem(newMediaItem)
.setTracks(newTracks)
.setMediaMetadata(newMediaMetadata)
.build()))
.build();
SettableFuture<?> future = SettableFuture.create();
SimpleBasePlayer player =
new SimpleBasePlayer(Looper.myLooper()) {
@Override
protected State getState() {
return state;
}
@Override
protected ListenableFuture<?> handleSeek(
int mediaItemIndex, long positionMs, @Player.Command int seekCommand) {
return future;
}
};
Listener listener = mock(Listener.class);
player.addListener(listener);
player.seekTo(/* mediaItemIndex= */ 1, /* positionMs= */ 3000);
// Verify placeholder state and listener calls.
assertThat(player.getCurrentMediaItemIndex()).isEqualTo(1);
assertThat(player.getCurrentTracks()).isEqualTo(newTracks);
assertThat(player.getMediaMetadata()).isEqualTo(newMediaMetadata);
verify(listener).onMediaItemTransition(newMediaItem, Player.MEDIA_ITEM_TRANSITION_REASON_SEEK);
verify(listener).onTracksChanged(newTracks);
verify(listener).onMediaMetadataChanged(newMediaMetadata);
verify(listener).onPositionDiscontinuity(Player.DISCONTINUITY_REASON_SEEK);
verify(listener).onPositionDiscontinuity(any(), any(), eq(Player.DISCONTINUITY_REASON_SEEK));
verifyNoMoreInteractions(listener);
}
@SuppressWarnings("deprecation") // Verifying deprecated listener calls.
@Test
public void
seekTo_asyncHandlingToNewItemWithExplicitTimeline_usesPlaceholderStateWithEmptyTracksAndMetadata() {
Tracks tracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(new Format.Builder().build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
MediaMetadata mediaMetadata = new MediaMetadata.Builder().setTitle("title").build();
Timeline timeline = new FakeTimeline(/* windowCount= */ 2);
State state =
new State.Builder()
.setAvailableCommands(new Commands.Builder().addAllCommands().build())
.setPlaylist(timeline, tracks, mediaMetadata)
.build();
SettableFuture<?> future = SettableFuture.create();
SimpleBasePlayer player =
new SimpleBasePlayer(Looper.myLooper()) {
@Override
protected State getState() {
return state;
}
@Override
protected ListenableFuture<?> handleSeek(
int mediaItemIndex, long positionMs, @Player.Command int seekCommand) {
return future;
}
};
Listener listener = mock(Listener.class);
player.addListener(listener);
player.seekTo(/* mediaItemIndex= */ 1, /* positionMs= */ 3000);
// Verify placeholder state and listener calls.
assertThat(player.getCurrentMediaItemIndex()).isEqualTo(1);
assertThat(player.getCurrentTracks()).isEqualTo(Tracks.EMPTY);
assertThat(player.getMediaMetadata()).isEqualTo(MediaMetadata.EMPTY);
verify(listener)
.onMediaItemTransition(
timeline.getWindow(/* windowIndex= */ 1, new Timeline.Window()).mediaItem,
Player.MEDIA_ITEM_TRANSITION_REASON_SEEK);
verify(listener).onTracksChanged(Tracks.EMPTY);
verify(listener).onMediaMetadataChanged(MediaMetadata.EMPTY);
verify(listener).onPositionDiscontinuity(Player.DISCONTINUITY_REASON_SEEK);
verify(listener).onPositionDiscontinuity(any(), any(), eq(Player.DISCONTINUITY_REASON_SEEK));
verifyNoMoreInteractions(listener);
}
@SuppressWarnings("deprecation") // Verifying deprecated listener calls.
@Test
public void
seekTo_asyncHandlingToSameItem_usesPlaceholderStateWithoutChangingTracksAndMetadata() {
Tracks tracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(new Format.Builder().build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
MediaMetadata mediaMetadata = new MediaMetadata.Builder().setTitle("title").build();
State state =
new State.Builder()
.setAvailableCommands(new Commands.Builder().addAllCommands().build())
.setPlaylist(
ImmutableList.of(
new SimpleBasePlayer.MediaItemData.Builder(/* uid= */ 1)
.setTracks(tracks)
.setMediaMetadata(mediaMetadata)
.build()))
.build();
SettableFuture<?> future = SettableFuture.create();
SimpleBasePlayer player =
new SimpleBasePlayer(Looper.myLooper()) {
@Override
protected State getState() {
return state;
}
@Override
protected ListenableFuture<?> handleSeek(
int mediaItemIndex, long positionMs, @Player.Command int seekCommand) {
return future;
}
};
Listener listener = mock(Listener.class);
player.addListener(listener);
player.seekTo(/* positionMs= */ 3000);
// Verify placeholder state and listener calls.
assertThat(player.getCurrentTracks()).isEqualTo(tracks);
assertThat(player.getMediaMetadata()).isEqualTo(mediaMetadata);
verify(listener).onPositionDiscontinuity(Player.DISCONTINUITY_REASON_SEEK);
verify(listener).onPositionDiscontinuity(any(), any(), eq(Player.DISCONTINUITY_REASON_SEEK));
verifyNoMoreInteractions(listener);
}
@SuppressWarnings("deprecation") // Verifying deprecated listener calls.
@Test
public void
seekTo_asyncHandlingToSameItemWithExplicitTimeline_usesPlaceholderStateWithoutChangingTracksAndMetadata() {
Tracks tracks =
new Tracks(
ImmutableList.of(
new Tracks.Group(
new TrackGroup(new Format.Builder().build()),
/* adaptiveSupported= */ true,
/* trackSupport= */ new int[] {C.FORMAT_HANDLED},
/* trackSelected= */ new boolean[] {true})));
MediaMetadata mediaMetadata = new MediaMetadata.Builder().setTitle("title").build();
Timeline timeline = new FakeTimeline(/* windowCount= */ 2);
State state =
new State.Builder()
.setAvailableCommands(new Commands.Builder().addAllCommands().build())
.setPlaylist(timeline, tracks, mediaMetadata)
.build();
SettableFuture<?> future = SettableFuture.create();
SimpleBasePlayer player =
new SimpleBasePlayer(Looper.myLooper()) {
@Override
protected State getState() {
return state;
}
@Override
protected ListenableFuture<?> handleSeek(
int mediaItemIndex, long positionMs, @Player.Command int seekCommand) {
return future;
}
};
Listener listener = mock(Listener.class);
player.addListener(listener);
player.seekTo(/* positionMs= */ 3000);
// Verify placeholder state and listener calls.
assertThat(player.getCurrentTracks()).isEqualTo(tracks);
assertThat(player.getMediaMetadata()).isEqualTo(mediaMetadata);
verify(listener).onPositionDiscontinuity(Player.DISCONTINUITY_REASON_SEEK);
verify(listener).onPositionDiscontinuity(any(), any(), eq(Player.DISCONTINUITY_REASON_SEEK));
verifyNoMoreInteractions(listener);
}
@Test
public void seekTo_withoutAvailableCommandForSeekToMediaItem_isNotForwarded() {
State state =
@@ -34,7 +34,11 @@ public final class VideoSizeTest {
@Test
public void roundTripViaBundle_ofArbitraryVideoSize_yieldsEqualInstance() {
VideoSize videoSize =
new VideoSize(/* width= */ 9, /* height= */ 8, /* pixelWidthHeightRatio= */ 6);
new VideoSize(
/* width= */ 9,
/* height= */ 8,
/* unappliedRotationDegrees= */ 7,
/* pixelWidthHeightRatio= */ 6);
assertThat(roundTripViaBundle(videoSize)).isEqualTo(videoSize);
}
@@ -1,227 +0,0 @@
/*
* Copyright (C) 2024 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package androidx.media3.common.audio;
import static androidx.media3.common.audio.SonicTestingUtils.calculateAccumulatedTruncationErrorForResampling;
import static androidx.media3.test.utils.TestUtil.generateFloatInRange;
import static com.google.common.truth.Truth.assertThat;
import static java.lang.Math.max;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Range;
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;
import java.util.Random;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.ParameterizedRobolectricTestRunner;
import org.robolectric.ParameterizedRobolectricTestRunner.Parameter;
import org.robolectric.ParameterizedRobolectricTestRunner.Parameters;
/** Parameterized robolectric test for {@link Sonic}. */
@RunWith(ParameterizedRobolectricTestRunner.class)
public final class RandomParameterizedSonicTest {
private static final int BLOCK_SIZE = 4096;
private static final int BYTES_PER_SAMPLE = 2;
private static final int SAMPLE_RATE = 48000;
// Max 10 min streams.
private static final long MAX_LENGTH_SAMPLES = 10 * 60 * SAMPLE_RATE;
/** Defines how many random instances of each parameter the test runner should generate. */
private static final int PARAM_COUNT = 5;
private static final int SPEED_DECIMAL_PRECISION = 2;
/**
* Allowed error tolerance ratio for number of output samples for Sonic's time stretching
* algorithm.
*
* <p>The actual tolerance is calculated as {@code expectedOutputSampleCount /
* TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE}, rounded to the nearest integer value. However, we
* always allow a minimum tolerance of ±1 samples.
*
* <p>This tolerance is roughly equal to an error of 900us/~44 samples/0.000017% for a 90 min mono
* stream @48KHz. To obtain the value, we ran 100 iterations of {@link
* #timeStretching_returnsExpectedNumberOfSamples()} (by setting {@link #PARAM_COUNT} to 10) and
* we calculated the average delta percentage between expected number of samples and actual number
* of samples (b/366169590).
*/
private static final BigDecimal TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE =
new BigDecimal("0.00000017");
private static final ImmutableList<Range<Float>> SPEED_RANGES =
ImmutableList.of(
Range.closedOpen(0f, 0.5f),
Range.closedOpen(0.5f, 1f),
Range.closedOpen(1f, 2f),
Range.closedOpen(2f, 20f));
private static final Random random = new Random(/* seed */ 0);
private static final ImmutableList<Object[]> sParams = initParams();
@Parameters(name = "speed={0}, streamLength={1}")
public static ImmutableList<Object[]> params() {
// params() is called multiple times, so return cached parameters to avoid regenerating
// different random parameter values.
return sParams;
}
/**
* Returns a list of random parameter combinations with which to run the tests in this class.
*
* <p>Each list item contains a value for {{@link #speed}, {@link #streamLength}} stored within an
* Object array.
*
* <p>The method generates {@link #PARAM_COUNT} random {@link #speed} values and {@link
* #PARAM_COUNT} random {@link #streamLength} values. These generated values are then grouped into
* all possible combinations, and every group passed as parameters for each test.
*/
private static ImmutableList<Object[]> initParams() {
ImmutableSet.Builder<Object[]> paramsBuilder = new ImmutableSet.Builder<>();
ImmutableSet.Builder<BigDecimal> speedsBuilder = new ImmutableSet.Builder<>();
for (int i = 0; i < PARAM_COUNT; i++) {
Range<Float> range = SPEED_RANGES.get(i % SPEED_RANGES.size());
BigDecimal speed =
BigDecimal.valueOf(generateFloatInRange(random, range))
.setScale(SPEED_DECIMAL_PRECISION, RoundingMode.HALF_EVEN);
speedsBuilder.add(speed);
}
ImmutableSet<BigDecimal> speeds = speedsBuilder.build();
ImmutableSet<Long> lengths =
new ImmutableSet.Builder<Long>()
.addAll(
random
.longs(/* min */ 0, MAX_LENGTH_SAMPLES)
.distinct()
.limit(PARAM_COUNT)
.iterator())
.build();
for (long length : lengths) {
for (BigDecimal speed : speeds) {
paramsBuilder.add(new Object[] {speed, length});
}
}
return paramsBuilder.build().asList();
}
@Parameter(0)
public BigDecimal speed;
@Parameter(1)
public long streamLength;
@Test
public void resampling_returnsExpectedNumberOfSamples() {
byte[] inputBuffer = new byte[BLOCK_SIZE * BYTES_PER_SAMPLE];
ShortBuffer outBuffer = ShortBuffer.allocate(BLOCK_SIZE);
// Use same speed and pitch values for Sonic to resample stream.
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ SAMPLE_RATE,
/* channelCount= */ 1,
/* speed= */ speed.floatValue(),
/* pitch= */ speed.floatValue(),
/* outputSampleRateHz= */ SAMPLE_RATE);
long readSampleCount = 0;
for (long samplesLeft = streamLength; samplesLeft > 0; samplesLeft -= BLOCK_SIZE) {
random.nextBytes(inputBuffer);
if (samplesLeft >= BLOCK_SIZE) {
sonic.queueInput(ByteBuffer.wrap(inputBuffer).asShortBuffer());
} else {
// The last buffer to queue might have fewer samples than BLOCK_SIZE, so we should only
// queue the remaining number of samples (samplesLeft).
sonic.queueInput(
ByteBuffer.wrap(inputBuffer, 0, (int) (samplesLeft * BYTES_PER_SAMPLE))
.asShortBuffer());
sonic.queueEndOfStream();
}
while (sonic.getOutputSize() > 0) {
sonic.getOutput(outBuffer);
readSampleCount += outBuffer.position();
outBuffer.clear();
}
}
sonic.flush();
BigDecimal bigLength = new BigDecimal(String.valueOf(streamLength));
// The scale of expectedSize will be bigLength.scale() - speed.scale(). Thus, the result should
// always yield an integer.
BigDecimal expectedSize = bigLength.divide(speed, RoundingMode.HALF_EVEN);
long accumulatedTruncationError =
calculateAccumulatedTruncationErrorForResampling(
bigLength, new BigDecimal(SAMPLE_RATE), speed);
assertThat(readSampleCount)
.isWithin(1)
.of(expectedSize.longValueExact() - accumulatedTruncationError);
}
@Test
public void timeStretching_returnsExpectedNumberOfSamples() {
byte[] buf = new byte[BLOCK_SIZE * BYTES_PER_SAMPLE];
ShortBuffer outBuffer = ShortBuffer.allocate(BLOCK_SIZE);
Sonic sonic =
new Sonic(
/* inputSampleRateHz= */ SAMPLE_RATE,
/* channelCount= */ 1,
speed.floatValue(),
/* pitch= */ 1,
/* outputSampleRateHz= */ SAMPLE_RATE);
long readSampleCount = 0;
for (long samplesLeft = streamLength; samplesLeft > 0; samplesLeft -= BLOCK_SIZE) {
random.nextBytes(buf);
if (samplesLeft >= BLOCK_SIZE) {
sonic.queueInput(ByteBuffer.wrap(buf).asShortBuffer());
} else {
sonic.queueInput(
ByteBuffer.wrap(buf, 0, (int) (samplesLeft * BYTES_PER_SAMPLE)).asShortBuffer());
sonic.queueEndOfStream();
}
while (sonic.getOutputSize() > 0) {
sonic.getOutput(outBuffer);
readSampleCount += outBuffer.position();
outBuffer.clear();
}
}
sonic.flush();
BigDecimal bigLength = new BigDecimal(String.valueOf(streamLength));
// The scale of expectedSampleCount will be bigLength.scale() - speed.scale(). Thus, the result
// should always yield an integer.
BigDecimal expectedSampleCount = bigLength.divide(speed, RoundingMode.HALF_EVEN);
// Calculate allowed tolerance and round to nearest integer.
BigDecimal allowedTolerance =
TIME_STRETCHING_SAMPLE_DRIFT_TOLERANCE
.multiply(expectedSampleCount)
.setScale(/* newScale= */ 0, RoundingMode.HALF_EVEN);
// Always allow at least 1 sample of tolerance.
long tolerance = max(allowedTolerance.longValue(), 1);
assertThat(readSampleCount).isWithin(tolerance).of(expectedSampleCount.longValueExact());
}
}