Merge branch 'google:dev-v2' into dev-v2

This commit is contained in:
Dustin 2022-01-30 21:26:24 -07:00 committed by GitHub
commit 4f365cef90
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
247 changed files with 9044 additions and 1757 deletions

View file

@ -4,7 +4,7 @@
* Core library:
* Support preferred video role flags in track selection
([#9402](https://github.com/google/ExoPlayer/issues/9402)).
* Prefer audio content preferences (for example, "default" audio track or
track matching system Locale language) over technical track selection
constraints (for example, preferred MIME type, or maximum channel
@ -13,24 +13,34 @@
can always be made distinguishable by setting an `id` in the
`TrackGroup` constructor. This fixes a crash when resuming playback
after backgrounding the app with an active track override
([#9718](https://github.com/google/ExoPlayer/issues/9718)).
* Sleep and retry when creating a `MediaCodec` instance fails. This works
around an issue that occurs on some devices when switching a surface
from a secure codec to another codec
([#8696](https://github.com/google/ExoPlayer/issues/8696)).
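The sleep-and-retry workaround described above can be sketched in plain Java. The helper below is illustrative only (the class name, attempt count, and sleep interval are assumptions for this sketch, not ExoPlayer's internals):

```java
import java.util.function.Supplier;

/**
 * Illustrative sketch of a "sleep and retry" creation strategy: run a
 * factory a few times, pausing between failed attempts, and rethrow the
 * last failure if every attempt fails.
 */
public final class SleepAndRetry {

  /** Runs {@code factory}, retrying up to {@code maxAttempts} times with {@code sleepMs} pauses. */
  public static <T> T createWithRetry(Supplier<T> factory, int maxAttempts, long sleepMs) {
    if (maxAttempts <= 0) {
      throw new IllegalArgumentException("maxAttempts must be positive");
    }
    RuntimeException lastFailure = null;
    for (int attempt = 0; attempt < maxAttempts; attempt++) {
      try {
        return factory.get();
      } catch (RuntimeException e) {
        lastFailure = e; // Remember the failure, sleep briefly, then retry.
        try {
          Thread.sleep(sleepMs);
        } catch (InterruptedException ie) {
          Thread.currentThread().interrupt();
          break;
        }
      }
    }
    throw lastFailure;
  }

  private SleepAndRetry() {}
}
```

In the real player, the factory call would be the `MediaCodec` instantiation that intermittently fails on some devices after switching a surface away from a secure codec.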
* Add `MediaCodecAdapter.getMetrics()` to allow users to obtain metrics
data from `MediaCodec`
([#9766](https://github.com/google/ExoPlayer/issues/9766)).
* Amend logic in `AdaptiveTrackSelection` to allow a quality increase
under sufficient network bandwidth even if playback is very close to the
live edge ([#9784](https://github.com/google/ExoPlayer/issues/9784)).
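A minimal model of the amended rule: whether a quality increase is allowed depends only on whether measured bandwidth comfortably covers the candidate bitrate, not on proximity to the live edge. The class name, method shape, and bandwidth fraction below are assumptions for illustration; the actual logic lives in `AdaptiveTrackSelection`:

```java
/**
 * Simplified model of an adaptive up-switch decision: allow a quality
 * increase whenever the measured bandwidth, scaled by a safety fraction,
 * covers the candidate track's bitrate. Distance to the live edge
 * deliberately plays no role in this check.
 */
public final class AdaptiveSwitchRule {

  /** Fraction of measured bandwidth assumed to be reliably available. */
  private static final float BANDWIDTH_FRACTION = 0.7f;

  /** Returns whether switching up to {@code candidateBitrate} (bits/sec) is allowed. */
  public static boolean canSwitchUp(long measuredBandwidthBps, int candidateBitrate) {
    return measuredBandwidthBps * BANDWIDTH_FRACTION >= candidateBitrate;
  }

  private AdaptiveSwitchRule() {}
}
```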
* Fix Maven dependency resolution
([#8353](https://github.com/google/ExoPlayer/issues/8353)).
* Fix decoder fallback logic for Dolby Atmos (E-AC3-JOC) and Dolby Vision
to use a compatible base decoder (E-AC3 or H264/H265) if needed.
* Disable automatic speed adjustment for live streams that neither have
low-latency features nor a user request setting the speed
([#9329](https://github.com/google/ExoPlayer/issues/9329)).
* Update video track selection logic to take preferred MIME types and role
flags into account when selecting multiple video tracks for adaptation
([#9519](https://github.com/google/ExoPlayer/issues/9519)).
* Update video and audio track selection logic to only choose formats for
adaptive selections that have the same level of decoder and hardware
support ([#9565](https://github.com/google/ExoPlayer/issues/9565)).
* Update video track selection logic to prefer more efficient codecs if
multiple codecs are supported by primary, hardware-accelerated decoders
([#4835](https://github.com/google/ExoPlayer/issues/4835)).
* Rename `DecoderCounters#inputBufferCount` to `queuedInputBufferCount`.
* Android 12 compatibility:
* Upgrade the Cast extension to depend on
`com.google.android.gms:play-services-cast-framework:20.1.0`. Earlier
@ -43,28 +53,40 @@
constructors.
* Change `AudioCapabilities` APIs to require passing explicitly
`AudioCapabilities.DEFAULT_AUDIO_CAPABILITIES` instead of `null`.
* Allow customization of the `AudioTrack` buffer size calculation by
injecting an `AudioTrackBufferSizeProvider` to `DefaultAudioSink`.
([#8891](https://github.com/google/ExoPlayer/issues/8891)).
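A hypothetical buffer-size policy in the spirit of `AudioTrackBufferSizeProvider`: scale the platform minimum up to a comfortable playback buffer, but never return less than one PCM frame so writes always make progress. The method shape and multiplier here are assumptions for illustration, not the real interface:

```java
/**
 * Sketch of a custom AudioTrack buffer-size calculation. The real
 * customization point, per the changelog, is injecting an
 * AudioTrackBufferSizeProvider into DefaultAudioSink; this standalone
 * class only illustrates the kind of policy such a provider might apply.
 */
public final class BufferSizePolicy {

  /** Target buffer size, expressed as a multiple of the platform minimum. */
  private static final int MIN_BUFFER_MULTIPLIER = 4;

  /**
   * Returns a buffer size in bytes: the platform minimum scaled up, clamped
   * so it is never smaller than a single PCM frame.
   */
  public static int getBufferSizeInBytes(int minBufferSizeInBytes, int pcmFrameSize) {
    int scaled = minBufferSizeInBytes * MIN_BUFFER_MULTIPLIER;
    return Math.max(scaled, pcmFrameSize);
  }

  private BufferSizePolicy() {}
}
```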
* Extractors:
* Fix inconsistency with spec in H.265 SPS nal units parsing
([#9719](https://github.com/google/ExoPlayer/issues/9719)).
* Parse Vorbis Comments (including `METADATA_BLOCK_PICTURE`) in Ogg Opus
and Vorbis files.
* Text:
* Add a `MediaItem.SubtitleConfiguration#id` field which is propagated to
the `Format#id` field of the subtitle track created from the
configuration
([#9673](https://github.com/google/ExoPlayer/issues/9673)).
* Add basic support for WebVTT subtitles in Matroska containers
([#9886](https://github.com/google/ExoPlayer/issues/9886)).
* DRM:
* Remove `playbackLooper` from `DrmSessionManager.(pre)acquireSession`.
When a `DrmSessionManager` is used by an app in a custom `MediaSource`,
the `playbackLooper` needs to be passed to `DrmSessionManager.setPlayer`
instead.
* Ad playback / IMA:
* Add a method to `AdPlaybackState` to allow resetting an ad group so that
it can be played again
([#9615](https://github.com/google/ExoPlayer/issues/9615)).
* Enforce playback speed of 1.0 during ad playback
([#9018](https://github.com/google/ExoPlayer/issues/9018)).
* DASH:
* Support the `forced-subtitle` track role
([#9727](https://github.com/google/ExoPlayer/issues/9727)).
* Stop interpreting the `main` track role as `C.SELECTION_FLAG_DEFAULT`.
* Fix bug with base URLs that are assigned the same service location and
priority in manifests that do not declare the dvb namespace. Previously,
the exclusion logic could exclude base URLs that should actually have
been used as a fallback.
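The fallback behaviour can be modelled with a simplified selection routine: pick the highest-priority base URL whose service location has not been excluded after a load error. All class and field names below are illustrative, not ExoPlayer's internal DASH types:

```java
import java.util.List;

/**
 * Simplified model of DASH base-URL fallback selection. Base URLs that
 * share a service location are excluded together, so distinct service
 * locations are what make fallback possible.
 */
public final class BaseUrlFallback {

  /** Minimal stand-in for a manifest BaseURL entry. */
  public static final class BaseUrl {
    public final String url;
    public final String serviceLocation;
    public final int priority; // Lower value = higher priority.

    public BaseUrl(String url, String serviceLocation, int priority) {
      this.url = url;
      this.serviceLocation = serviceLocation;
      this.priority = priority;
    }
  }

  /** Returns the best non-excluded base URL, or null if all are excluded. */
  public static BaseUrl select(List<BaseUrl> baseUrls, List<String> excludedLocations) {
    BaseUrl best = null;
    for (BaseUrl candidate : baseUrls) {
      if (excludedLocations.contains(candidate.serviceLocation)) {
        continue; // Excluded after a load error; keep looking for a fallback.
      }
      if (best == null || candidate.priority < best.priority) {
        best = candidate;
      }
    }
    return best;
  }

  private BaseUrlFallback() {}
}
```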
* HLS:
* Use chunkless preparation by default to improve start up time. If your
renditions contain muxed closed-caption tracks that are *not* declared
@ -81,23 +103,36 @@
* Fix the color of the numbers in `StyledPlayerView` rewind and
fast forward buttons when using certain themes
([#9765](https://github.com/google/ExoPlayer/issues/9765)).
* Correctly translate playback speed strings
([#9811](https://github.com/google/ExoPlayer/issues/9811)).
* Transformer:
* Increase required min API version to 21.
* `TransformationException` is now used to describe errors that occur
during a transformation.
* Add `TransformationRequest` for specifying the transformation options.
* Allow multiple listeners to be registered.
* Fix Transformer being stuck when the codec output is partially read.
* Fix potential NPE in `Transformer.getProgress` when releasing the muxer
throws.
* Add a demo app for applying transformations.
* MediaSession extension:
* Remove deprecated call to `onStop(/* reset= */ true)` and provide an
opt-out flag for apps that don't want to clear the playlist on stop.
* RTSP:
* Provide a client API to override the `SocketFactory` used for any server
connection ([#9606](https://github.com/google/ExoPlayer/pull/9606)).
* Prefer DIGEST authentication method over BASIC if both are present
([#9800](https://github.com/google/ExoPlayer/issues/9800)).
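The preference rule can be sketched as a trivial chooser; the class and method names are made up, and ExoPlayer's actual RTSP authentication handling is more involved:

```java
import java.util.List;

/**
 * Sketch of choosing an RTSP authentication scheme: when a server offers
 * both DIGEST and BASIC, choose DIGEST, which avoids sending credentials
 * as plain base64.
 */
public final class RtspAuthChooser {

  /** Returns the preferred scheme from the server's offered list. */
  public static String choose(List<String> offeredSchemes) {
    if (offeredSchemes.contains("DIGEST")) {
      return "DIGEST"; // Preferred: challenge-response, no cleartext credentials.
    }
    if (offeredSchemes.contains("BASIC")) {
      return "BASIC";
    }
    throw new IllegalArgumentException("No supported authentication scheme");
  }

  private RtspAuthChooser() {}
}
```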
* Handle when RTSP track timing is not available
([#9775](https://github.com/google/ExoPlayer/issues/9775)).
* Ignore invalid RTP-Info header values
([#9619](https://github.com/google/ExoPlayer/issues/9619)).
* Cast extension:
* Fix bug that prevented `CastPlayer` from calling `onIsPlayingChanged`
correctly ([#9792](https://github.com/google/ExoPlayer/issues/9792)).
* Support audio metadata including artwork with
`DefaultMediaItemConverter`
([#9663](https://github.com/google/ExoPlayer/issues/9663)).
* Remove deprecated symbols:
* Remove `MediaSourceFactory#setDrmSessionManager`,
`MediaSourceFactory#setDrmHttpDataSourceFactory`, and
@ -114,6 +149,8 @@
`MediaItem.LiveConfiguration.Builder#setTargetOffsetMs` to override the
manifest, or `DashMediaSource#setFallbackTargetLiveOffsetMs` to provide
a fallback value.
* Remove `(Simple)ExoPlayer.setThrowsWhenUsingWrongThread`. Opting out of
the thread enforcement is no longer possible.
### 2.16.1 (2021-11-18)

View file

@ -37,6 +37,7 @@ project.ext {
androidxAnnotationVersion = '1.3.0'
androidxAppCompatVersion = '1.3.1'
androidxCollectionVersion = '1.1.0'
androidxConstraintLayoutVersion = '2.0.4'
androidxCoreVersion = '1.7.0'
androidxFuturesVersion = '1.1.0'
androidxMediaVersion = '1.4.3'

View file

@ -223,10 +223,12 @@ import java.util.ArrayList;
if (currentPlayer != localPlayer || tracksInfo == lastSeenTrackGroupInfo) {
return;
}
if (!tracksInfo.isTypeSupportedOrEmpty(
C.TRACK_TYPE_VIDEO, /* allowExceedsCapabilities= */ true)) {
listener.onUnsupportedTrack(C.TRACK_TYPE_VIDEO);
}
if (!tracksInfo.isTypeSupportedOrEmpty(
C.TRACK_TYPE_AUDIO, /* allowExceedsCapabilities= */ true)) {
listener.onUnsupportedTrack(C.TRACK_TYPE_AUDIO);
}
lastSeenTrackGroupInfo = tracksInfo;

View file

@ -15,19 +15,19 @@
#extension GL_OES_EGL_image_external : require
precision mediump float;
// External texture containing video decoder output.
uniform samplerExternalOES uTexSampler0;
// Texture containing the overlay bitmap.
uniform sampler2D uTexSampler1;
// Horizontal scaling factor for the overlay bitmap.
uniform float uScaleX;
// Vertical scaling factor for the overlay bitmap.
uniform float uScaleY;
varying vec2 vTexCoords;
void main() {
vec4 videoColor = texture2D(uTexSampler0, vTexCoords);
vec4 overlayColor = texture2D(uTexSampler1,
vec2(vTexCoords.x * uScaleX,
vTexCoords.y * uScaleY));
// Blend the video decoder output and the overlay bitmap.
gl_FragColor = videoColor * (1.0 - overlayColor.a)
+ overlayColor * overlayColor.a;

View file

@ -11,11 +11,11 @@
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
attribute vec4 aFramePosition;
attribute vec4 aTexCoords;
uniform mat4 uTexTransform;
varying vec2 vTexCoords;
void main() {
gl_Position = aFramePosition;
vTexCoords = (uTexTransform * aTexCoords).xy;
}

View file

@ -86,9 +86,9 @@ import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
throw new IllegalStateException(e);
}
program.setBufferAttribute(
"aFramePosition", GlUtil.getNormalizedCoordinateBounds(), GlUtil.RECTANGLE_VERTICES_COUNT);
program.setBufferAttribute(
"aTexCoords", GlUtil.getTextureCoordinateBounds(), GlUtil.RECTANGLE_VERTICES_COUNT);
GLES20.glGenTextures(1, textures, 0);
GLES20.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
GLES20.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
@ -118,11 +118,11 @@ import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
// Run the shader program.
GlUtil.Program program = checkNotNull(this.program);
program.setSamplerTexIdUniform("uTexSampler0", frameTexture, /* unit= */ 0);
program.setSamplerTexIdUniform("uTexSampler1", textures[0], /* unit= */ 1);
program.setFloatUniform("uScaleX", bitmapScaleX);
program.setFloatUniform("uScaleY", bitmapScaleY);
program.setFloatsUniform("uTexTransform", transformMatrix);
program.bindAttributesAndUniforms();
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, /* first= */ 0, /* count= */ 4);

View file

@ -441,10 +441,12 @@ public class PlayerActivity extends AppCompatActivity
if (tracksInfo == lastSeenTracksInfo) {
return;
}
if (!tracksInfo.isTypeSupportedOrEmpty(
C.TRACK_TYPE_VIDEO, /* allowExceedsCapabilities= */ true)) {
showToast(R.string.error_unsupported_video);
}
if (!tracksInfo.isTypeSupportedOrEmpty(
C.TRACK_TYPE_AUDIO, /* allowExceedsCapabilities= */ true)) {
showToast(R.string.error_unsupported_audio);
}
lastSeenTracksInfo = tracksInfo;

View file

@ -22,12 +22,14 @@
<uses-sdk/>
<application
android:allowBackup="false"
android:icon="@mipmap/ic_launcher"
android:label="@string/application_name"
android:exported="true">
<activity
android:name=".MainActivity"
android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>

View file

@ -0,0 +1,9 @@
# Transformer demo
This app demonstrates how to use the [Transformer][] API to modify videos, for
example by removing audio or video.
See the [demos README](../README.md) for instructions on how to build and run
this demo.
[Transformer]: https://exoplayer.dev/transforming-media.html

View file

@ -0,0 +1,60 @@
/*
* Copyright 2021 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
apply from: '../../constants.gradle'
apply plugin: 'com.android.application'
android {
compileSdkVersion project.ext.compileSdkVersion
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
defaultConfig {
versionName project.ext.releaseVersion
versionCode project.ext.releaseVersionCode
minSdkVersion 21
targetSdkVersion project.ext.appTargetSdkVersion
multiDexEnabled true
}
buildTypes {
release {
shrinkResources true
minifyEnabled true
proguardFiles getDefaultProguardFile('proguard-android.txt')
signingConfig signingConfigs.debug
}
}
lintOptions {
// This demo app isn't indexed and doesn't have translations.
disable 'GoogleAppIndexingWarning', 'MissingTranslation'
}
}
dependencies {
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
implementation 'androidx.appcompat:appcompat:' + androidxAppCompatVersion
implementation 'androidx.constraintlayout:constraintlayout:' + androidxConstraintLayoutVersion
implementation 'androidx.multidex:multidex:' + androidxMultidexVersion
implementation 'com.google.android.material:material:' + androidxMaterialVersion
implementation project(modulePrefix + 'library-core')
implementation project(modulePrefix + 'library-transformer')
implementation project(modulePrefix + 'library-ui')
}

View file

@ -0,0 +1,61 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
~ Copyright 2021 The Android Open Source Project
~
~ Licensed under the Apache License, Version 2.0 (the "License");
~ you may not use this file except in compliance with the License.
~ You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="com.google.android.exoplayer2.transformerdemo">
<uses-sdk />
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<application
android:allowBackup="false"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:theme="@style/Theme.AppCompat"
android:taskAffinity=""
tools:targetApi="29">
<activity android:name=".ConfigurationActivity"
android:configChanges="keyboard|keyboardHidden|orientation|screenSize|screenLayout|smallestScreenSize|uiMode"
android:launchMode="singleTop"
android:label="@string/app_name"
android:exported="true"
android:theme="@style/Theme.MaterialComponents.DayNight.NoActionBar">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>
</intent-filter>
<intent-filter>
<action android:name="com.google.android.exoplayer2.transformerdemo.action.VIEW"/>
<category android:name="android.intent.category.DEFAULT"/>
<data android:scheme="http"/>
<data android:scheme="https"/>
<data android:scheme="content"/>
<data android:scheme="asset"/>
<data android:scheme="file"/>
</intent-filter>
</activity>
<activity android:name=".TransformerActivity"
android:configChanges="keyboard|keyboardHidden|orientation|screenSize|screenLayout|smallestScreenSize|uiMode"
android:launchMode="singleTop"
android:label="@string/app_name"
android:exported="true"
android:theme="@style/Theme.MaterialComponents.DayNight.NoActionBar"/>
</application>
</manifest>

View file

@ -0,0 +1,302 @@
/*
* Copyright 2021 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.transformerdemo;
import static com.google.android.exoplayer2.util.Assertions.checkNotNull;
import static com.google.android.exoplayer2.util.Assertions.checkState;
import android.app.Activity;
import android.content.DialogInterface;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.CheckBox;
import android.widget.Spinner;
import android.widget.TextView;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AlertDialog;
import androidx.appcompat.app.AppCompatActivity;
import com.google.android.exoplayer2.util.MimeTypes;
import java.util.Arrays;
import java.util.List;
import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
import org.checkerframework.checker.nullness.qual.RequiresNonNull;
/**
* An {@link Activity} that sets the configuration to use for transforming and playing media, using
* {@link TransformerActivity}.
*/
public final class ConfigurationActivity extends AppCompatActivity {
public static final String SHOULD_REMOVE_AUDIO = "should_remove_audio";
public static final String SHOULD_REMOVE_VIDEO = "should_remove_video";
public static final String SHOULD_FLATTEN_FOR_SLOW_MOTION = "should_flatten_for_slow_motion";
public static final String AUDIO_MIME_TYPE = "audio_mime_type";
public static final String VIDEO_MIME_TYPE = "video_mime_type";
public static final String RESOLUTION_HEIGHT = "resolution_height";
public static final String TRANSLATE_X = "translate_x";
public static final String TRANSLATE_Y = "translate_y";
public static final String SCALE_X = "scale_x";
public static final String SCALE_Y = "scale_y";
public static final String ROTATE_DEGREES = "rotate_degrees";
private static final String[] INPUT_URIS = {
"https://html5demos.com/assets/dizzy.mp4",
"https://storage.googleapis.com/exoplayer-test-media-0/android-block-1080-hevc.mp4",
"https://storage.googleapis.com/exoplayer-test-media-0/BigBuckBunny_320x180.mp4",
"https://html5demos.com/assets/dizzy.webm",
};
private static final String[] URI_DESCRIPTIONS = { // same order as INPUT_URIS
"MP4 with H264 video and AAC audio",
"MP4 with H265 video and AAC audio",
"Long MP4 with H264 video and AAC audio",
"WebM with VP8 video and Vorbis audio",
};
private static final String SAME_AS_INPUT_OPTION = "same as input";
private @MonotonicNonNull Button chooseFileButton;
private @MonotonicNonNull CheckBox removeAudioCheckbox;
private @MonotonicNonNull CheckBox removeVideoCheckbox;
private @MonotonicNonNull CheckBox flattenForSlowMotionCheckbox;
private @MonotonicNonNull Spinner audioMimeSpinner;
private @MonotonicNonNull Spinner videoMimeSpinner;
private @MonotonicNonNull Spinner resolutionHeightSpinner;
private @MonotonicNonNull Spinner translateSpinner;
private @MonotonicNonNull Spinner scaleSpinner;
private @MonotonicNonNull Spinner rotateSpinner;
private @MonotonicNonNull TextView chosenFileTextView;
private int inputUriPosition;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.configuration_activity);
findViewById(R.id.transform_button).setOnClickListener(this::startTransformation);
chooseFileButton = findViewById(R.id.choose_file_button);
chooseFileButton.setOnClickListener(this::chooseFile);
chosenFileTextView = findViewById(R.id.chosen_file_text_view);
chosenFileTextView.setText(URI_DESCRIPTIONS[inputUriPosition]);
removeAudioCheckbox = findViewById(R.id.remove_audio_checkbox);
removeAudioCheckbox.setOnClickListener(this::onRemoveAudio);
removeVideoCheckbox = findViewById(R.id.remove_video_checkbox);
removeVideoCheckbox.setOnClickListener(this::onRemoveVideo);
flattenForSlowMotionCheckbox = findViewById(R.id.flatten_for_slow_motion_checkbox);
ArrayAdapter<String> audioMimeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
audioMimeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
audioMimeSpinner = findViewById(R.id.audio_mime_spinner);
audioMimeSpinner.setAdapter(audioMimeAdapter);
audioMimeAdapter.addAll(
SAME_AS_INPUT_OPTION, MimeTypes.AUDIO_AAC, MimeTypes.AUDIO_AMR_NB, MimeTypes.AUDIO_AMR_WB);
ArrayAdapter<String> videoMimeAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
videoMimeAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
videoMimeSpinner = findViewById(R.id.video_mime_spinner);
videoMimeSpinner.setAdapter(videoMimeAdapter);
videoMimeAdapter.addAll(
SAME_AS_INPUT_OPTION,
MimeTypes.VIDEO_H263,
MimeTypes.VIDEO_H264,
MimeTypes.VIDEO_H265,
MimeTypes.VIDEO_MP4V);
ArrayAdapter<String> resolutionHeightAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
resolutionHeightAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
resolutionHeightSpinner = findViewById(R.id.resolution_height_spinner);
resolutionHeightSpinner.setAdapter(resolutionHeightAdapter);
resolutionHeightAdapter.addAll(
SAME_AS_INPUT_OPTION, "144", "240", "360", "480", "720", "1080", "1440", "2160");
ArrayAdapter<String> translateAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
translateAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
translateSpinner = findViewById(R.id.translate_spinner);
translateSpinner.setAdapter(translateAdapter);
translateAdapter.addAll(
SAME_AS_INPUT_OPTION, "-.1, -.1", "0, 0", ".5, 0", "0, .5", "1, 1", "1.9, 0", "0, 1.9");
ArrayAdapter<String> scaleAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
scaleAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
scaleSpinner = findViewById(R.id.scale_spinner);
scaleSpinner.setAdapter(scaleAdapter);
scaleAdapter.addAll(SAME_AS_INPUT_OPTION, "-1, -1", "-1, 1", "1, 1", ".5, 1", ".5, .5", "2, 2");
ArrayAdapter<String> rotateAdapter =
new ArrayAdapter<>(/* context= */ this, R.layout.spinner_item);
rotateAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
rotateSpinner = findViewById(R.id.rotate_spinner);
rotateSpinner.setAdapter(rotateAdapter);
rotateAdapter.addAll(SAME_AS_INPUT_OPTION, "0", "10", "45", "90", "180");
}
@Override
protected void onResume() {
super.onResume();
@Nullable Uri intentUri = getIntent().getData();
if (intentUri != null) {
checkNotNull(chooseFileButton).setEnabled(false);
checkNotNull(chosenFileTextView).setText(intentUri.toString());
}
}
@Override
protected void onNewIntent(Intent intent) {
super.onNewIntent(intent);
setIntent(intent);
}
@RequiresNonNull({
"removeAudioCheckbox",
"removeVideoCheckbox",
"flattenForSlowMotionCheckbox",
"audioMimeSpinner",
"videoMimeSpinner",
"resolutionHeightSpinner",
"translateSpinner",
"scaleSpinner",
"rotateSpinner"
})
private void startTransformation(View view) {
Intent transformerIntent = new Intent(this, TransformerActivity.class);
Bundle bundle = new Bundle();
bundle.putBoolean(SHOULD_REMOVE_AUDIO, removeAudioCheckbox.isChecked());
bundle.putBoolean(SHOULD_REMOVE_VIDEO, removeVideoCheckbox.isChecked());
bundle.putBoolean(SHOULD_FLATTEN_FOR_SLOW_MOTION, flattenForSlowMotionCheckbox.isChecked());
String selectedAudioMimeType = String.valueOf(audioMimeSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedAudioMimeType)) {
bundle.putString(AUDIO_MIME_TYPE, selectedAudioMimeType);
}
String selectedVideoMimeType = String.valueOf(videoMimeSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedVideoMimeType)) {
bundle.putString(VIDEO_MIME_TYPE, selectedVideoMimeType);
}
String selectedResolutionHeight = String.valueOf(resolutionHeightSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedResolutionHeight)) {
bundle.putInt(RESOLUTION_HEIGHT, Integer.valueOf(selectedResolutionHeight));
}
String selectedTranslate = String.valueOf(translateSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedTranslate)) {
List<String> translateXY = Arrays.asList(selectedTranslate.split(", "));
checkState(translateXY.size() == 2);
bundle.putFloat(TRANSLATE_X, Float.valueOf(translateXY.get(0)));
bundle.putFloat(TRANSLATE_Y, Float.valueOf(translateXY.get(1)));
}
String selectedScale = String.valueOf(scaleSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedScale)) {
List<String> scaleXY = Arrays.asList(selectedScale.split(", "));
checkState(scaleXY.size() == 2);
bundle.putFloat(SCALE_X, Float.valueOf(scaleXY.get(0)));
bundle.putFloat(SCALE_Y, Float.valueOf(scaleXY.get(1)));
}
String selectedRotate = String.valueOf(rotateSpinner.getSelectedItem());
if (!SAME_AS_INPUT_OPTION.equals(selectedRotate)) {
bundle.putFloat(ROTATE_DEGREES, Float.valueOf(selectedRotate));
}
transformerIntent.putExtras(bundle);
@Nullable Uri intentUri = getIntent().getData();
transformerIntent.setData(
intentUri != null ? intentUri : Uri.parse(INPUT_URIS[inputUriPosition]));
startActivity(transformerIntent);
}
private void chooseFile(View view) {
new AlertDialog.Builder(/* context= */ this)
.setTitle(R.string.choose_file_title)
.setSingleChoiceItems(URI_DESCRIPTIONS, inputUriPosition, this::selectFileInDialog)
.setPositiveButton(android.R.string.ok, /* listener= */ null)
.create()
.show();
}
@RequiresNonNull("chosenFileTextView")
private void selectFileInDialog(DialogInterface dialog, int which) {
inputUriPosition = which;
chosenFileTextView.setText(URI_DESCRIPTIONS[inputUriPosition]);
}
@RequiresNonNull({
"removeVideoCheckbox",
"audioMimeSpinner",
"videoMimeSpinner",
"resolutionHeightSpinner",
"translateSpinner",
"scaleSpinner",
"rotateSpinner"
})
private void onRemoveAudio(View view) {
if (((CheckBox) view).isChecked()) {
removeVideoCheckbox.setChecked(false);
enableTrackSpecificOptions(/* isAudioEnabled= */ false, /* isVideoEnabled= */ true);
} else {
enableTrackSpecificOptions(/* isAudioEnabled= */ true, /* isVideoEnabled= */ true);
}
}
@RequiresNonNull({
"removeAudioCheckbox",
"audioMimeSpinner",
"videoMimeSpinner",
"resolutionHeightSpinner",
"translateSpinner",
"scaleSpinner",
"rotateSpinner"
})
private void onRemoveVideo(View view) {
if (((CheckBox) view).isChecked()) {
removeAudioCheckbox.setChecked(false);
enableTrackSpecificOptions(/* isAudioEnabled= */ true, /* isVideoEnabled= */ false);
} else {
enableTrackSpecificOptions(/* isAudioEnabled= */ true, /* isVideoEnabled= */ true);
}
}
@RequiresNonNull({
"audioMimeSpinner",
"videoMimeSpinner",
"resolutionHeightSpinner",
"translateSpinner",
"scaleSpinner",
"rotateSpinner"
})
private void enableTrackSpecificOptions(boolean isAudioEnabled, boolean isVideoEnabled) {
audioMimeSpinner.setEnabled(isAudioEnabled);
videoMimeSpinner.setEnabled(isVideoEnabled);
resolutionHeightSpinner.setEnabled(isVideoEnabled);
translateSpinner.setEnabled(isVideoEnabled);
scaleSpinner.setEnabled(isVideoEnabled);
rotateSpinner.setEnabled(isVideoEnabled);
findViewById(R.id.audio_mime_text_view).setEnabled(isAudioEnabled);
findViewById(R.id.video_mime_text_view).setEnabled(isVideoEnabled);
findViewById(R.id.resolution_height_text_view).setEnabled(isVideoEnabled);
findViewById(R.id.translate).setEnabled(isVideoEnabled);
findViewById(R.id.scale).setEnabled(isVideoEnabled);
findViewById(R.id.rotate).setEnabled(isVideoEnabled);
}
}

View file

@ -0,0 +1,372 @@
/*
* Copyright 2021 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.transformerdemo;
import static android.Manifest.permission.READ_EXTERNAL_STORAGE;
import static android.Manifest.permission.WRITE_EXTERNAL_STORAGE;
import static com.google.android.exoplayer2.util.Assertions.checkNotNull;
import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Matrix;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.TextView;
import android.widget.Toast;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.transformer.ProgressHolder;
import com.google.android.exoplayer2.transformer.TransformationException;
import com.google.android.exoplayer2.transformer.TransformationRequest;
import com.google.android.exoplayer2.transformer.Transformer;
import com.google.android.exoplayer2.ui.AspectRatioFrameLayout;
import com.google.android.exoplayer2.ui.StyledPlayerView;
import com.google.android.exoplayer2.util.DebugTextViewHelper;
import com.google.android.exoplayer2.util.Log;
import com.google.android.exoplayer2.util.Util;
import com.google.android.material.progressindicator.LinearProgressIndicator;
import com.google.common.base.Stopwatch;
import com.google.common.base.Ticker;
import java.io.File;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
import org.checkerframework.checker.nullness.qual.RequiresNonNull;
/** An {@link Activity} that transforms and plays media using {@link Transformer}. */
public final class TransformerActivity extends AppCompatActivity {
private static final String TAG = "TransformerActivity";
private @MonotonicNonNull StyledPlayerView playerView;
private @MonotonicNonNull TextView debugTextView;
private @MonotonicNonNull TextView informationTextView;
private @MonotonicNonNull ViewGroup progressViewGroup;
private @MonotonicNonNull LinearProgressIndicator progressIndicator;
private @MonotonicNonNull Stopwatch transformationStopwatch;
private @MonotonicNonNull AspectRatioFrameLayout debugFrame;
@Nullable private DebugTextViewHelper debugTextViewHelper;
@Nullable private ExoPlayer player;
@Nullable private Transformer transformer;
@Nullable private File externalCacheFile;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.transformer_activity);
playerView = findViewById(R.id.player_view);
debugTextView = findViewById(R.id.debug_text_view);
informationTextView = findViewById(R.id.information_text_view);
progressViewGroup = findViewById(R.id.progress_view_group);
progressIndicator = findViewById(R.id.progress_indicator);
debugFrame = findViewById(R.id.debug_aspect_ratio_frame_layout);
transformationStopwatch =
Stopwatch.createUnstarted(
new Ticker() {
@Override
public long read() {
return android.os.SystemClock.elapsedRealtimeNanos();
}
});
}
@Override
protected void onStart() {
super.onStart();
checkNotNull(progressIndicator);
checkNotNull(informationTextView);
checkNotNull(transformationStopwatch);
checkNotNull(playerView);
checkNotNull(debugTextView);
checkNotNull(progressViewGroup);
startTransformation();
playerView.onResume();
}
@Override
protected void onStop() {
super.onStop();
checkNotNull(transformationStopwatch).reset();
checkNotNull(transformer).cancel();
transformer = null;
checkNotNull(playerView).onPause();
releasePlayer();
checkNotNull(externalCacheFile).delete();
externalCacheFile = null;
}
@RequiresNonNull({
"playerView",
"debugTextView",
"informationTextView",
"progressIndicator",
"transformationStopwatch",
"progressViewGroup",
})
private void startTransformation() {
requestTransformerPermissions();
Intent intent = getIntent();
Uri uri = checkNotNull(intent.getData());
try {
externalCacheFile = createExternalCacheFile("transformer-output.mp4");
String filePath = externalCacheFile.getAbsolutePath();
@Nullable Bundle bundle = intent.getExtras();
Transformer transformer = createTransformer(bundle, filePath);
transformationStopwatch.start();
transformer.startTransformation(MediaItem.fromUri(uri), filePath);
this.transformer = transformer;
} catch (IOException e) {
throw new IllegalStateException(e);
}
informationTextView.setText(R.string.transformation_started);
Handler mainHandler = new Handler(getMainLooper());
ProgressHolder progressHolder = new ProgressHolder();
mainHandler.post(
new Runnable() {
@Override
public void run() {
if (transformer != null
&& transformer.getProgress(progressHolder)
!= Transformer.PROGRESS_STATE_NO_TRANSFORMATION) {
progressIndicator.setProgress(progressHolder.progress);
informationTextView.setText(
getString(
R.string.transformation_timer,
transformationStopwatch.elapsed(TimeUnit.SECONDS)));
mainHandler.postDelayed(/* r= */ this, /* delayMillis= */ 500);
}
}
});
}
// Create a cache file, resetting it if it already exists.
private File createExternalCacheFile(String fileName) throws IOException {
File file = new File(getExternalCacheDir(), fileName);
if (file.exists() && !file.delete()) {
throw new IllegalStateException("Could not delete the previous transformer output file");
}
if (!file.createNewFile()) {
throw new IllegalStateException("Could not create the transformer output file");
}
return file;
}
@RequiresNonNull({
"playerView",
"debugTextView",
"informationTextView",
"transformationStopwatch",
"progressViewGroup",
})
private Transformer createTransformer(@Nullable Bundle bundle, String filePath) {
Transformer.Builder transformerBuilder = new Transformer.Builder(/* context= */ this);
if (bundle != null) {
TransformationRequest.Builder requestBuilder = new TransformationRequest.Builder();
requestBuilder.setFlattenForSlowMotion(
bundle.getBoolean(ConfigurationActivity.SHOULD_FLATTEN_FOR_SLOW_MOTION));
@Nullable String audioMimeType = bundle.getString(ConfigurationActivity.AUDIO_MIME_TYPE);
if (audioMimeType != null) {
requestBuilder.setAudioMimeType(audioMimeType);
}
@Nullable String videoMimeType = bundle.getString(ConfigurationActivity.VIDEO_MIME_TYPE);
if (videoMimeType != null) {
requestBuilder.setVideoMimeType(videoMimeType);
}
int resolutionHeight =
bundle.getInt(
ConfigurationActivity.RESOLUTION_HEIGHT, /* defaultValue= */ C.LENGTH_UNSET);
if (resolutionHeight != C.LENGTH_UNSET) {
requestBuilder.setResolution(resolutionHeight);
}
Matrix transformationMatrix = getTransformationMatrix(bundle);
if (!transformationMatrix.isIdentity()) {
requestBuilder.setTransformationMatrix(transformationMatrix);
}
transformerBuilder
.setTransformationRequest(requestBuilder.build())
.setRemoveAudio(bundle.getBoolean(ConfigurationActivity.SHOULD_REMOVE_AUDIO))
.setRemoveVideo(bundle.getBoolean(ConfigurationActivity.SHOULD_REMOVE_VIDEO));
}
return transformerBuilder
.addListener(
new Transformer.Listener() {
@Override
public void onTransformationCompleted(MediaItem mediaItem) {
TransformerActivity.this.onTransformationCompleted(filePath);
}
@Override
public void onTransformationError(
MediaItem mediaItem, TransformationException exception) {
TransformerActivity.this.onTransformationError(exception);
}
})
.setDebugViewProvider(new DemoDebugViewProvider())
.build();
}
private static Matrix getTransformationMatrix(Bundle bundle) {
Matrix transformationMatrix = new Matrix();
float translateX = bundle.getFloat(ConfigurationActivity.TRANSLATE_X, /* defaultValue= */ 0);
float translateY = bundle.getFloat(ConfigurationActivity.TRANSLATE_Y, /* defaultValue= */ 0);
// TODO(b/213198690): Get resolution for aspect ratio and scale all translations' translateX
// by this aspect ratio.
transformationMatrix.postTranslate(translateX, translateY);
float scaleX = bundle.getFloat(ConfigurationActivity.SCALE_X, /* defaultValue= */ 1);
float scaleY = bundle.getFloat(ConfigurationActivity.SCALE_Y, /* defaultValue= */ 1);
transformationMatrix.postScale(scaleX, scaleY);
float rotateDegrees =
bundle.getFloat(ConfigurationActivity.ROTATE_DEGREES, /* defaultValue= */ 0);
transformationMatrix.postRotate(rotateDegrees);
return transformationMatrix;
}
@RequiresNonNull({
"informationTextView",
"progressViewGroup",
"transformationStopwatch",
})
private void onTransformationError(TransformationException exception) {
transformationStopwatch.stop();
informationTextView.setText(R.string.transformation_error);
progressViewGroup.setVisibility(View.GONE);
Toast.makeText(
TransformerActivity.this, "Transformation error: " + exception, Toast.LENGTH_LONG)
.show();
Log.e(TAG, "Transformation error", exception);
}
@RequiresNonNull({
"playerView",
"debugTextView",
"informationTextView",
"progressViewGroup",
"transformationStopwatch",
})
private void onTransformationCompleted(String filePath) {
transformationStopwatch.stop();
informationTextView.setText(
getString(
R.string.transformation_completed, transformationStopwatch.elapsed(TimeUnit.SECONDS)));
progressViewGroup.setVisibility(View.GONE);
playMediaItem(MediaItem.fromUri("file://" + filePath));
Log.d(TAG, "Output file path: file://" + filePath);
}
@RequiresNonNull({"playerView", "debugTextView"})
private void playMediaItem(MediaItem mediaItem) {
playerView.setPlayer(null);
releasePlayer();
ExoPlayer player = new ExoPlayer.Builder(/* context= */ this).build();
playerView.setPlayer(player);
player.setMediaItem(mediaItem);
player.play();
player.prepare();
this.player = player;
debugTextViewHelper = new DebugTextViewHelper(player, debugTextView);
debugTextViewHelper.start();
}
private void releasePlayer() {
if (debugTextViewHelper != null) {
debugTextViewHelper.stop();
debugTextViewHelper = null;
}
if (player != null) {
player.release();
player = null;
}
}
private void requestTransformerPermissions() {
if (Util.SDK_INT < 23) {
return;
}
if (checkSelfPermission(READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
|| checkSelfPermission(WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
requestPermissions(
new String[] {READ_EXTERNAL_STORAGE, WRITE_EXTERNAL_STORAGE}, /* requestCode= */ 0);
}
}
private final class DemoDebugViewProvider implements Transformer.DebugViewProvider {
@Nullable
@Override
public SurfaceView getDebugPreviewSurfaceView(int width, int height) {
// Update the UI on the main thread and wait for the output surface to be available.
CountDownLatch surfaceCreatedCountDownLatch = new CountDownLatch(1);
SurfaceView surfaceView = new SurfaceView(/* context= */ TransformerActivity.this);
runOnUiThread(
() -> {
AspectRatioFrameLayout debugFrame = checkNotNull(TransformerActivity.this.debugFrame);
debugFrame.addView(surfaceView);
debugFrame.setAspectRatio((float) width / height);
surfaceView
.getHolder()
.addCallback(
new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
surfaceCreatedCountDownLatch.countDown();
}
@Override
public void surfaceChanged(
SurfaceHolder surfaceHolder, int format, int width, int height) {
// Do nothing.
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
// Do nothing.
}
});
});
try {
surfaceCreatedCountDownLatch.await();
} catch (InterruptedException e) {
Log.w(TAG, "Interrupted waiting for debug surface.");
Thread.currentThread().interrupt();
return null;
}
return surfaceView;
}
}
}
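
For context, a minimal sketch of how another component might launch this activity. The extras keys mirror the `ConfigurationActivity` constants referenced in `createTransformer()` above; the `inputMediaUri` variable and the use of `MimeTypes.VIDEO_H264` here are illustrative assumptions, since in the demo the intent is normally built by `ConfigurationActivity` itself.

```java
// Hypothetical launch site; ConfigurationActivity normally builds this intent.
Intent intent = new Intent(context, TransformerActivity.class);
intent.setData(inputMediaUri); // URI of the media to transform.
Bundle bundle = new Bundle();
bundle.putBoolean(ConfigurationActivity.SHOULD_REMOVE_AUDIO, true);
bundle.putString(ConfigurationActivity.VIDEO_MIME_TYPE, MimeTypes.VIDEO_H264);
intent.putExtras(bundle);
context.startActivity(intent);
```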


@@ -0,0 +1,19 @@
/*
* Copyright 2021 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
@NonNullApi
package com.google.android.exoplayer2.transformerdemo;
import com.google.android.exoplayer2.util.NonNullApi;


@@ -0,0 +1,179 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
~ Copyright 2021 The Android Open Source Project
~
~ Licensed under the Apache License, Version 2.0 (the "License");
~ you may not use this file except in compliance with the License.
~ You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".ConfigurationActivity">
<TextView
android:id="@+id/configuration_text_view"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="24dp"
android:layout_marginStart="32dp"
android:layout_marginEnd="32dp"
android:text="@string/configuration"
android:textSize="24sp"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<Button
android:id="@+id/choose_file_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="32dp"
android:layout_marginStart="32dp"
android:layout_marginEnd="32dp"
android:text="@string/choose_file_title"
app:layout_constraintTop_toBottomOf="@+id/configuration_text_view"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
<TextView
android:id="@+id/chosen_file_text_view"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginTop="12dp"
android:layout_marginStart="32dp"
android:layout_marginEnd="32dp"
android:paddingLeft="24dp"
android:paddingRight="24dp"
android:textSize="12sp"
android:gravity="center"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/choose_file_button" />
<TableLayout
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:stretchColumns="1"
android:layout_marginTop="32dp"
android:layout_marginStart="32dp"
android:layout_marginEnd="32dp"
android:measureWithLargestChild="true"
android:paddingLeft="24dp"
android:paddingRight="12dp"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/chosen_file_text_view" >
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:text="@string/remove_audio" />
<CheckBox
android:id="@+id/remove_audio_checkbox"
android:layout_gravity="right"/>
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:text="@string/remove_video"/>
<CheckBox
android:id="@+id/remove_video_checkbox"
android:layout_gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:text="@string/flatten_for_slow_motion"/>
<CheckBox
android:id="@+id/flatten_for_slow_motion_checkbox"
android:layout_gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:id="@+id/audio_mime_text_view"
android:text="@string/audio_mime"/>
<Spinner
android:id="@+id/audio_mime_spinner"
android:layout_gravity="right|center_vertical"
android:gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:id="@+id/video_mime_text_view"
android:text="@string/video_mime"/>
<Spinner
android:id="@+id/video_mime_spinner"
android:layout_gravity="right|center_vertical"
android:gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:id="@+id/resolution_height_text_view"
android:text="@string/resolution_height"/>
<Spinner
android:id="@+id/resolution_height_spinner"
android:layout_gravity="right|center_vertical"
android:gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:id="@+id/translate"
android:text="@string/translate"/>
<Spinner
android:id="@+id/translate_spinner"
android:layout_gravity="right|center_vertical"
android:gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:id="@+id/scale"
android:text="@string/scale"/>
<Spinner
android:id="@+id/scale_spinner"
android:layout_gravity="right|center_vertical"
android:gravity="right" />
</TableRow>
<TableRow
android:layout_weight="1"
android:gravity="center_vertical" >
<TextView
android:id="@+id/rotate"
android:text="@string/rotate"/>
<Spinner
android:id="@+id/rotate_spinner"
android:layout_gravity="right|center_vertical"
android:gravity="right" />
</TableRow>
</TableLayout>
<Button
android:id="@+id/transform_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginBottom="28dp"
android:layout_marginStart="32dp"
android:layout_marginEnd="32dp"
android:text="@string/transform"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>


@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
~ Copyright 2021 The Android Open Source Project
~
~ Licensed under the Apache License, Version 2.0 (the "License");
~ you may not use this file except in compliance with the License.
~ You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<TextView
xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="wrap_content"
android:layout_height="32dp"
android:gravity="left|center_vertical"
android:paddingLeft="4dp"
android:paddingRight="4dp"
android:layout_marginLeft="4dp"
android:layout_marginRight="4dp"
android:textIsSelectable="false" />


@@ -0,0 +1,107 @@
<?xml version="1.0" encoding="utf-8"?><!--
~ Copyright 2021 The Android Open Source Project
~
~ Licensed under the Apache License, Version 2.0 (the "License");
~ you may not use this file except in compliance with the License.
~ You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:keepScreenOn="true"
android:orientation="vertical">
<com.google.android.material.card.MaterialCardView
android:layout_margin="8dp"
android:layout_height="wrap_content"
android:layout_width="match_parent"
app:cardCornerRadius="4dp"
app:cardElevation="2dp"
android:gravity="center_vertical" >
<TextView
android:id="@+id/information_text_view"
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:orientation="vertical"
android:paddingLeft="8dp"
android:paddingRight="8dp"
android:paddingTop="8dp"
android:paddingBottom="8dp" />
</com.google.android.material.card.MaterialCardView>
<com.google.android.material.card.MaterialCardView
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="1"
android:layout_margin="16dp"
app:cardCornerRadius="4dp"
app:cardElevation="2dp">
<FrameLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<com.google.android.exoplayer2.ui.StyledPlayerView
android:id="@+id/player_view"
android:layout_width="match_parent"
android:layout_height="match_parent" />
<TextView
android:id="@+id/debug_text_view"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:paddingLeft="4dp"
android:paddingRight="4dp"
android:textSize="10sp"
tools:ignore="SmallSp"/>
<LinearLayout
android:id="@+id/progress_view_group"
android:layout_height="match_parent"
android:layout_width="match_parent"
android:orientation="vertical">
<com.google.android.material.progressindicator.LinearProgressIndicator
android:id="@+id/progress_indicator"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:paddingLeft="16dp"
android:paddingRight="16dp"
android:paddingTop="16dp"
android:paddingBottom="16dp"
android:layout_gravity="center" />
<TextView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text="@string/debug_preview" />
<com.google.android.exoplayer2.ui.AspectRatioFrameLayout
android:id="@+id/debug_aspect_ratio_frame_layout"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:paddingTop="16dp">
<TextView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text="@string/debug_preview_not_available" />
</com.google.android.exoplayer2.ui.AspectRatioFrameLayout>
</LinearLayout>
</FrameLayout>
</com.google.android.material.card.MaterialCardView>
</LinearLayout>

Binary file not shown.


@@ -0,0 +1,37 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
~ Copyright 2021 The Android Open Source Project
~
~ Licensed under the Apache License, Version 2.0 (the "License");
~ you may not use this file except in compliance with the License.
~ You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing, software
~ distributed under the License is distributed on an "AS IS" BASIS,
~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
~ See the License for the specific language governing permissions and
~ limitations under the License.
-->
<resources xmlns:xliff="urn:oasis:names:tc:xliff:document:1.2">
<string name="app_name" translatable="false">Transformer Demo</string>
<string name="configuration" translatable="false">Configuration</string>
<string name="choose_file_title" translatable="false">Choose File</string>
<string name="remove_audio" translatable="false">Remove audio</string>
<string name="remove_video" translatable="false">Remove video</string>
<string name="flatten_for_slow_motion" translatable="false">Flatten for slow motion</string>
<string name="audio_mime" translatable="false">Output audio MIME type</string>
<string name="video_mime" translatable="false">Output video MIME type</string>
<string name="resolution_height" translatable="false">Output video resolution</string>
<string name="translate" translatable="false">Translate video</string>
<string name="scale" translatable="false">Scale video</string>
<string name="rotate" translatable="false">Rotate video (degrees)</string>
<string name="transform" translatable="false">Transform</string>
<string name="debug_preview" translatable="false">Debug preview:</string>
<string name="debug_preview_not_available" translatable="false">No debug preview available</string>
<string name="transformation_started" translatable="false">Transformation started</string>
<string name="transformation_timer" translatable="false">Transformation started %d seconds ago.</string>
<string name="transformation_completed" translatable="false">Transformation completed in %d seconds.</string>
<string name="transformation_error" translatable="false">Transformation error</string>
</resources>


@@ -128,7 +128,7 @@ containing the same content at different bitrates.
An Android API for playing audio.
For more information, see the
[Javadoc](https://developer.android.com/reference/android/media/AudioTrack).
[Javadoc]({{ site.android_sdk }}/android/media/AudioTrack).
###### CDM
@@ -137,7 +137,7 @@ decrypting [DRM](#drm) protected content. CDMs are accessed via Android's
[`MediaDrm`](#mediadrm) API.
For more information, see the
[Javadoc](https://developer.android.com/reference/android/media/MediaDrm).
[Javadoc]({{ site.android_sdk }}/android/media/MediaDrm).
###### IMA
@@ -153,14 +153,14 @@ An Android API for accessing media [codecs](#codec) (i.e. encoder and decoder
components) in the platform.
For more information, see the
[Javadoc](https://developer.android.com/reference/android/media/MediaCodec).
[Javadoc]({{ site.android_sdk }}/android/media/MediaCodec).
###### MediaDrm
An Android API for accessing [CDMs](#cdm) in the platform.
For more information, see the
[Javadoc](https://developer.android.com/reference/android/media/MediaDrm).
[Javadoc]({{ site.android_sdk }}/android/media/MediaDrm).
###### Audio offload
@@ -181,7 +181,7 @@ For more information, see the
###### Surface
See the [Javadoc](https://developer.android.com/reference/android/view/Surface)
See the [Javadoc]({{ site.android_sdk }}/android/view/Surface)
and the
[Android graphics documentation](https://source.android.com/devices/graphics/arch-sh).
@@ -212,14 +212,14 @@ transfers. In [adaptive streaming](#adaptive-streaming), bandwidth estimates can
be used to select between different bitrate [tracks](#track) during playback.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/upstream/BandwidthMeter.html).
[Javadoc]({{ site.exo_sdk }}/upstream/BandwidthMeter.html).
###### DataSource
Component for requesting data (e.g. over HTTP, from a local file, etc).
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/upstream/DataSource.html).
[Javadoc]({{ site.exo_sdk }}/upstream/DataSource.html).
###### Extractor
@@ -228,7 +228,7 @@ Component that parses a media [container](#container) format, outputting
belonging to each track suitable for consumption by a decoder.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/extractor/Extractor.html).
[Javadoc]({{ site.exo_sdk }}/extractor/Extractor.html).
###### LoadControl
@@ -236,7 +236,7 @@ Component that decides when to start and stop loading, and when to start
playback.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/LoadControl.html).
[Javadoc]({{ site.exo_sdk }}/LoadControl.html).
###### MediaSource
@@ -245,7 +245,7 @@ Provides high-level information about the structure of media (as a
(corresponding to periods of the `Timeline`) for playback.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/source/MediaSource.html).
[Javadoc]({{ site.exo_sdk }}/source/MediaSource.html).
###### MediaPeriod
@@ -257,7 +257,7 @@ media are loaded and when loading starts and stops are made by the
respectively.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/source/MediaPeriod.html).
[Javadoc]({{ site.exo_sdk }}/source/MediaPeriod.html).
###### Renderer
@@ -266,7 +266,7 @@ and [`AudioTrack`](#audiotrack) are the standard Android platform components to
which video and audio data are rendered.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/Renderer.html).
[Javadoc]({{ site.exo_sdk }}/Renderer.html).
###### Timeline
@@ -275,7 +275,7 @@ through to complex compositions of media such as playlists and streams with
inserted ads.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/Timeline.html).
[Javadoc]({{ site.exo_sdk }}/Timeline.html).
###### TrackGroup
@@ -284,7 +284,7 @@ content, normally at different bitrates for
[adaptive streaming](#adaptive-streaming).
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/source/TrackGroup.html).
[Javadoc]({{ site.exo_sdk }}/source/TrackGroup.html).
###### TrackSelection
@@ -295,7 +295,7 @@ responsible for selecting the appropriate track whenever a new media chunk
starts being loaded.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/trackselection/TrackSelection.html).
[Javadoc]({{ site.exo_sdk }}/trackselection/TrackSelection.html).
###### TrackSelector
@@ -305,4 +305,4 @@ player's [`Renderers`](#renderer), a `TrackSelector` will generate a
[`TrackSelection`](#trackselection) for each `Renderer`.
For more information, see the component
[Javadoc](https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/trackselection/TrackSelector.html).
[Javadoc]({{ site.exo_sdk }}/trackselection/TrackSelector.html).


@@ -119,11 +119,7 @@ which the player must be accessed can be queried using
If you see `IllegalStateException` being thrown with the message "Player is
accessed on the wrong thread", then some code in your app is accessing an
`ExoPlayer` instance on the wrong thread (the exception's stack trace shows you
where). You can temporarily opt out from these exceptions being thrown by
calling `ExoPlayer.setThrowsWhenUsingWrongThread(false)`, in which case the
issue will be logged as a warning instead. Using this opt out is not safe and
may result in unexpected or obscure errors. It will be removed in ExoPlayer
2.16.
where).
{:.info}
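
A safe alternative to accessing the player from the wrong thread is to marshal the call onto the player's application thread. The sketch below assumes a `player` variable created on another thread; `Player.getApplicationLooper()` returns the looper on which the player must be accessed.

```
// Post Player calls to the looper the player must be accessed on,
// rather than opting out of the thread check.
Handler playerHandler = new Handler(player.getApplicationLooper());
playerHandler.post(
    () -> {
      // Safe: this runs on the player's application thread.
      player.setPlayWhenReady(true);
    });
```
{: .language-java}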
For more information about ExoPlayer's threading model, see the


@@ -12,7 +12,7 @@ events is easy:
// Add a listener to receive events from the player.
player.addListener(listener);
~~~
{: .language-java}
{: .language-java }
`Player.Listener` has empty default methods, so you only need to implement
the methods you're interested in. See the [Javadoc][] for a full description of
@@ -195,7 +195,7 @@ additional logging with a single line.
```
player.addAnalyticsListener(new EventLogger(trackSelector));
```
{: .language-java}
{: .language-java }
Passing the `trackSelector` enables additional logging, but is optional and so
`null` can be passed instead. See the [debug logging page][] for more details.
@@ -220,7 +220,7 @@ player
// Do something at the specified playback position.
})
.setLooper(Looper.getMainLooper())
.setPosition(/* windowIndex= */ 0, /* positionMs= */ 120_000)
.setPosition(/* mediaItemIndex= */ 0, /* positionMs= */ 120_000)
.setPayload(customPayloadData)
.setDeleteAfterDelivery(false)
.send();


@@ -51,9 +51,9 @@ methods, as listed below and shown in the following figure.
`Timeline`. The current `Timeline.Window` can be retrieved from the `Timeline`
using `Player.getCurrentWindowIndex` and `Timeline.getWindow`. Within the
`Window`:
* `Window.liveConfiguration` contains the target live offset and and live
offset adjustment parameters. These values are based on information in the
media and any app-provided overrides set in `MediaItem.liveConfiguration`.
* `Window.liveConfiguration` contains the target live offset and live offset
adjustment parameters. These values are based on information in the media
and any app-provided overrides set in `MediaItem.liveConfiguration`.
* `Window.windowStartTimeMs` is the time since the Unix Epoch at which the
live window starts.
* `Window.getCurrentUnixTimeMs` is the time since the Unix Epoch of the


@@ -153,4 +153,4 @@ the player also needs to have its `DefaultMediaSourceFactory`
[configured accordingly]({{ site.baseurl }}/ad-insertion.html#declarative-ad-support).
[playlist API]: {{ site.baseurl }}/playlists.html
[`MediaItem.Builder` Javadoc]: {{ site.baseurl }}/doc/reference/com/google/android/exoplayer2/MediaItem.Builder.html
[`MediaItem.Builder` Javadoc]: {{ site.exo_sdk }}/MediaItem.Builder.html


@@ -44,7 +44,7 @@ ExoPlayer player = new ExoPlayer.Builder(context)
{: .language-java}
The
[`DefaultMediaSourceFactory` JavaDoc]({{ site.baseurl }}/doc/reference/com/google/android/exoplayer2/source/DefaultMediaSourceFactory.html)
[`DefaultMediaSourceFactory` JavaDoc]({{ site.exo_sdk }}/source/DefaultMediaSourceFactory.html)
describes the available options in more detail.
It's also possible to inject a custom `MediaSource.Factory` implementation, for
@@ -79,4 +79,4 @@ exoPlayer.play();
[HLS]: {{ site.baseurl }}/hls.html
[RTSP]: {{ site.baseurl }}/rtsp.html
[regular media files]: {{ site.baseurl }}/progressive.html
[`ExoPlayer`]: {{ site.baseurl }}/doc/reference/com/google/android/exoplayer2/ExoPlayer.html
[`ExoPlayer`]: {{ site.exo_sdk }}/ExoPlayer.html

View file

@@ -85,6 +85,6 @@ for (int i = 0; i < trackGroups.length; i++) {
{: .language-java}
[`MediaMetadata`]: {{ site.exo_sdk }}/MediaMetadata.html
[`Metadata.Entry`]: {{ site.exo_sdk}}/metadata/Metadata.Entry.html
[`Metadata.Entry`]: {{ site.exo_sdk }}/metadata/Metadata.Entry.html
[`MetadataRetriever`]: {{ site.exo_sdk }}/MetadataRetriever.html
[`MotionPhotoMetadata`]: {{ site.exo_sdk }}/metadata/mp4/MotionPhotoMetadata.html

View file

@@ -164,7 +164,7 @@ from HTTPS to HTTP and so is a cross-protocol redirect. ExoPlayer will not
follow this redirect in its default configuration, meaning playback will fail.
If you need to, you can configure ExoPlayer to follow cross-protocol redirects
when instantiating `DefaultHttpDataSourceFactory` instances used in your
when instantiating [`DefaultHttpDataSource.Factory`][] instances used in your
application. Learn about selecting and configuring the network stack
[here]({{ site.base_url }}/customization.html#configuring-the-network-stack).
@@ -326,7 +326,7 @@ is the official way to play YouTube videos on Android.
[`setFragmentedMp4ExtractorFlags`]: {{ site.exo_sdk }}/extractor/DefaultExtractorsFactory#setFragmentedMp4ExtractorFlags(int)
[Wikipedia]: https://en.wikipedia.org/wiki/List_of_HTTP_status_codes
[wget]: https://www.gnu.org/software/wget/manual/wget.html
[`DefaultHttpDataSourceFactory`]: {{ site.exo_sdk }}/upstream/DefaultHttpDataSourceFactory.html
[`DefaultHttpDataSource.Factory`]: {{ site.exo_sdk }}/upstream/DefaultHttpDataSource.Factory.html
[ExoPlayer module]: {{ site.base_url }}/hello-world.html#add-exoplayer-modules
[issue tracker]: https://github.com/google/ExoPlayer/issues
[`isCurrentWindowLive`]: {{ site.exo_sdk }}/Player.html#isCurrentWindowLive()

View file

@@ -112,20 +112,20 @@ gets from the libgav1 decoder:
* GL rendering using GL shader for color space conversion
* If you are using `ExoPlayer` with `PlayerView` or
`StyledPlayerView`, enable this option by setting `surface_type` of view
to be `video_decoder_gl_surface_view`.
* If you are using `ExoPlayer` with `PlayerView` or `StyledPlayerView`,
enable this option by setting `surface_type` of view to be
`video_decoder_gl_surface_view`.
* Otherwise, enable this option by sending `Libgav1VideoRenderer` a
message of type `Renderer.MSG_SET_VIDEO_OUTPUT` with an instance of
`VideoDecoderOutputBufferRenderer` as its object.
`VideoDecoderGLSurfaceView` is the concrete
`VideoDecoderOutputBufferRenderer` implementation used by
`PlayerView` and `StyledPlayerView`.
`VideoDecoderOutputBufferRenderer` implementation used by `PlayerView`
and `StyledPlayerView`.
* Native rendering using `ANativeWindow`
* If you are using `ExoPlayer` with `PlayerView` or
`StyledPlayerView`, this option is enabled by default.
* If you are using `ExoPlayer` with `PlayerView` or `StyledPlayerView`,
this option is enabled by default.
* Otherwise, enable this option by sending `Libgav1VideoRenderer` a
message of type `Renderer.MSG_SET_VIDEO_OUTPUT` with an instance of
`SurfaceView` as its object.

View file

@@ -14,7 +14,7 @@
apply from: "$gradle.ext.exoplayerSettingsDir/common_library_config.gradle"
dependencies {
api 'com.google.android.gms:play-services-cast-framework:20.1.0'
api 'com.google.android.gms:play-services-cast-framework:21.0.1'
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
implementation project(modulePrefix + 'library-common')
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion

View file

@@ -20,9 +20,11 @@ import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.util.Assertions;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;
import com.google.android.gms.cast.MediaQueueItem;
import com.google.android.gms.common.images.WebImage;
import java.util.HashMap;
import java.util.Iterator;
import java.util.UUID;
@@ -45,10 +47,43 @@ public final class DefaultMediaItemConverter implements MediaItemConverter {
@Override
public MediaItem toMediaItem(MediaQueueItem mediaQueueItem) {
// `item` came from `toMediaQueueItem()` so the custom JSON data must be set.
MediaInfo mediaInfo = mediaQueueItem.getMedia();
@Nullable MediaInfo mediaInfo = mediaQueueItem.getMedia();
Assertions.checkNotNull(mediaInfo);
return getMediaItem(Assertions.checkNotNull(mediaInfo.getCustomData()));
com.google.android.exoplayer2.MediaMetadata.Builder metadataBuilder =
new com.google.android.exoplayer2.MediaMetadata.Builder();
@Nullable MediaMetadata metadata = mediaInfo.getMetadata();
if (metadata != null) {
if (metadata.containsKey(MediaMetadata.KEY_TITLE)) {
metadataBuilder.setTitle(metadata.getString(MediaMetadata.KEY_TITLE));
}
if (metadata.containsKey(MediaMetadata.KEY_SUBTITLE)) {
metadataBuilder.setSubtitle(metadata.getString(MediaMetadata.KEY_SUBTITLE));
}
if (metadata.containsKey(MediaMetadata.KEY_ARTIST)) {
metadataBuilder.setArtist(metadata.getString(MediaMetadata.KEY_ARTIST));
}
if (metadata.containsKey(MediaMetadata.KEY_ALBUM_ARTIST)) {
metadataBuilder.setAlbumArtist(metadata.getString(MediaMetadata.KEY_ALBUM_ARTIST));
}
if (metadata.containsKey(MediaMetadata.KEY_ALBUM_TITLE)) {
metadataBuilder.setAlbumTitle(metadata.getString(MediaMetadata.KEY_ALBUM_TITLE));
}
if (!metadata.getImages().isEmpty()) {
metadataBuilder.setArtworkUri(metadata.getImages().get(0).getUrl());
}
if (metadata.containsKey(MediaMetadata.KEY_COMPOSER)) {
metadataBuilder.setComposer(metadata.getString(MediaMetadata.KEY_COMPOSER));
}
if (metadata.containsKey(MediaMetadata.KEY_DISC_NUMBER)) {
metadataBuilder.setDiscNumber(metadata.getInt(MediaMetadata.KEY_DISC_NUMBER));
}
if (metadata.containsKey(MediaMetadata.KEY_TRACK_NUMBER)) {
metadataBuilder.setTrackNumber(metadata.getInt(MediaMetadata.KEY_TRACK_NUMBER));
}
}
// `mediaQueueItem` came from `toMediaQueueItem()` so the custom JSON data must be set.
return getMediaItem(
Assertions.checkNotNull(mediaInfo.getCustomData()), metadataBuilder.build());
}
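Each field above is transferred only when the Cast-side metadata actually carries it, guarded by `containsKey`. A standalone sketch of that copy pattern, using a plain `Map` in place of the Cast SDK's `MediaMetadata` (the helper name is illustrative, not Cast API):

```java
import java.util.HashMap;
import java.util.Map;

public class MetadataCopySketch {

  // Copies only the keys present in the source, mirroring the containsKey()
  // guards used when converting Cast metadata to an ExoPlayer MediaMetadata.
  public static Map<String, String> copyPresentKeys(
      Map<String, String> source, String... keys) {
    Map<String, String> target = new HashMap<>();
    for (String key : keys) {
      if (source.containsKey(key)) {
        target.put(key, source.get(key));
      }
    }
    return target;
  }

  public static void main(String[] args) {
    Map<String, String> castMetadata = new HashMap<>();
    castMetadata.put("title", "My video");
    // "artist" is absent in the source, so it must not appear in the copy.
    Map<String, String> copied = copyPresentKeys(castMetadata, "title", "artist");
    System.out.println(copied);
  }
}
```

Guarding on presence matters because reading an absent key would otherwise overwrite a real value with a default (empty string or 0) in the rebuilt metadata.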
@Override
@@ -57,10 +92,41 @@ public final class DefaultMediaItemConverter implements MediaItemConverter {
if (mediaItem.localConfiguration.mimeType == null) {
throw new IllegalArgumentException("The item must specify its mimeType");
}
MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
MediaMetadata metadata =
new MediaMetadata(
MimeTypes.isAudio(mediaItem.localConfiguration.mimeType)
? MediaMetadata.MEDIA_TYPE_MUSIC_TRACK
: MediaMetadata.MEDIA_TYPE_MOVIE);
if (mediaItem.mediaMetadata.title != null) {
metadata.putString(MediaMetadata.KEY_TITLE, mediaItem.mediaMetadata.title.toString());
}
if (mediaItem.mediaMetadata.subtitle != null) {
metadata.putString(MediaMetadata.KEY_SUBTITLE, mediaItem.mediaMetadata.subtitle.toString());
}
if (mediaItem.mediaMetadata.artist != null) {
metadata.putString(MediaMetadata.KEY_ARTIST, mediaItem.mediaMetadata.artist.toString());
}
if (mediaItem.mediaMetadata.albumArtist != null) {
metadata.putString(
MediaMetadata.KEY_ALBUM_ARTIST, mediaItem.mediaMetadata.albumArtist.toString());
}
if (mediaItem.mediaMetadata.albumTitle != null) {
metadata.putString(
MediaMetadata.KEY_ALBUM_TITLE, mediaItem.mediaMetadata.albumTitle.toString());
}
if (mediaItem.mediaMetadata.artworkUri != null) {
metadata.addImage(new WebImage(mediaItem.mediaMetadata.artworkUri));
}
if (mediaItem.mediaMetadata.composer != null) {
metadata.putString(MediaMetadata.KEY_COMPOSER, mediaItem.mediaMetadata.composer.toString());
}
if (mediaItem.mediaMetadata.discNumber != null) {
metadata.putInt(MediaMetadata.KEY_DISC_NUMBER, mediaItem.mediaMetadata.discNumber);
}
if (mediaItem.mediaMetadata.trackNumber != null) {
metadata.putInt(MediaMetadata.KEY_TRACK_NUMBER, mediaItem.mediaMetadata.trackNumber);
}
MediaInfo mediaInfo =
new MediaInfo.Builder(mediaItem.localConfiguration.uri.toString())
.setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
@@ -73,19 +139,15 @@ public final class DefaultMediaItemConverter implements MediaItemConverter {
// Deserialization.
private static MediaItem getMediaItem(JSONObject customData) {
private static MediaItem getMediaItem(
JSONObject customData, com.google.android.exoplayer2.MediaMetadata mediaMetadata) {
try {
JSONObject mediaItemJson = customData.getJSONObject(KEY_MEDIA_ITEM);
MediaItem.Builder builder = new MediaItem.Builder();
builder.setUri(Uri.parse(mediaItemJson.getString(KEY_URI)));
builder.setMediaId(mediaItemJson.getString(KEY_MEDIA_ID));
if (mediaItemJson.has(KEY_TITLE)) {
com.google.android.exoplayer2.MediaMetadata mediaMetadata =
new com.google.android.exoplayer2.MediaMetadata.Builder()
.setTitle(mediaItemJson.getString(KEY_TITLE))
.build();
builder.setMediaMetadata(mediaMetadata);
}
MediaItem.Builder builder =
new MediaItem.Builder()
.setUri(Uri.parse(mediaItemJson.getString(KEY_URI)))
.setMediaId(mediaItemJson.getString(KEY_MEDIA_ID))
.setMediaMetadata(mediaMetadata);
if (mediaItemJson.has(KEY_MIME_TYPE)) {
builder.setMimeType(mediaItemJson.getString(KEY_MIME_TYPE));
}

View file

@@ -20,7 +20,7 @@ android {
}
dependencies {
api "com.google.android.gms:play-services-cronet:17.0.1"
api "com.google.android.gms:play-services-cronet:18.0.1"
implementation project(modulePrefix + 'library-common')
implementation project(modulePrefix + 'library-datasource')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion

View file

@@ -25,7 +25,7 @@ android {
}
dependencies {
api 'com.google.ads.interactivemedia.v3:interactivemedia:3.25.1'
api 'com.google.ads.interactivemedia.v3:interactivemedia:3.26.0'
implementation project(modulePrefix + 'library-core')
implementation 'androidx.annotation:annotation:' + androidxAnnotationVersion
compileOnly 'org.checkerframework:checker-qual:' + checkerframeworkVersion

View file

@@ -71,14 +71,15 @@ import java.util.Set;
* #setPlayer(Player)}. If the ads loader is no longer required, it must be released by calling
* {@link #release()}.
*
* <p>See https://developers.google.com/interactive-media-ads/docs/sdks/android/compatibility for
* information on compatible ad tag formats. Pass the ad tag URI when setting media item playback
* properties (if using the media item API) or as a {@link DataSpec} when constructing the {@link
* AdsMediaSource} (if using media sources directly). For the latter case, please note that this
* implementation delegates loading of the data spec to the IMA SDK, so range and headers
* specifications will be ignored in ad tag URIs. Literal ads responses can be encoded as data
* scheme data specs, for example, by constructing the data spec using a URI generated via {@link
* Util#getDataUriForString(String, String)}.
* <p>See <a
* href="https://developers.google.com/interactive-media-ads/docs/sdks/android/compatibility">IMA's
* Support and compatibility page</a> for information on compatible ad tag formats. Pass the ad tag
* URI when setting media item playback properties (if using the media item API) or as a {@link
* DataSpec} when constructing the {@link AdsMediaSource} (if using media sources directly). For the
* latter case, please note that this implementation delegates loading of the data spec to the IMA
* SDK, so range and headers specifications will be ignored in ad tag URIs. Literal ads responses
* can be encoded as data scheme data specs, for example, by constructing the data spec using a URI
* generated via {@link Util#getDataUriForString(String, String)}.
*
* <p>The IMA SDK can report obstructions to the ad view for accurate viewability measurement. This
* means that any overlay views that obstruct the ad overlay but are essential for playback need to

View file

@@ -924,8 +924,7 @@ public class SessionPlayerConnectorTest {
assertThat(onPlaylistChangedLatch.getCount()).isEqualTo(1);
}
// TODO(b/168860979): De-flake and re-enable.
@Ignore
@Ignore("Internal ref: b/168860979")
@Test
@LargeTest
public void replacePlaylistItem_calledOnlyOnce_notifiesPlaylistChangeOnlyOnce() throws Exception {

View file

@@ -125,20 +125,20 @@ gets from the libvpx decoder:
* GL rendering using GL shader for color space conversion
* If you are using `ExoPlayer` with `PlayerView` or
`StyledPlayerView`, enable this option by setting `surface_type` of view
to be `video_decoder_gl_surface_view`.
* If you are using `ExoPlayer` with `PlayerView` or `StyledPlayerView`,
enable this option by setting `surface_type` of view to be
`video_decoder_gl_surface_view`.
* Otherwise, enable this option by sending `LibvpxVideoRenderer` a message
of type `Renderer.MSG_SET_VIDEO_OUTPUT` with an instance of
`VideoDecoderOutputBufferRenderer` as its object.
`VideoDecoderGLSurfaceView` is the concrete
`VideoDecoderOutputBufferRenderer` implementation used by
`PlayerView` and `StyledPlayerView`.
`VideoDecoderOutputBufferRenderer` implementation used by `PlayerView`
and `StyledPlayerView`.
* Native rendering using `ANativeWindow`
* If you are using `ExoPlayer` with `PlayerView` or
`StyledPlayerView`, this option is enabled by default.
* If you are using `ExoPlayer` with `PlayerView` or `StyledPlayerView`,
this option is enabled by default.
* Otherwise, enable this option by sending `LibvpxVideoRenderer` a message
of type `Renderer.MSG_SET_VIDEO_OUTPUT` with an instance of
`SurfaceView` as its object.

View file

@@ -28,7 +28,7 @@ public final class VpxLibrary {
}
private static final LibraryLoader LOADER = new LibraryLoader("vpx", "vpxV2JNI");
@C.CryptoType private static int cryptoType = C.CRYPTO_TYPE_UNSUPPORTED;
private static @C.CryptoType int cryptoType = C.CRYPTO_TYPE_UNSUPPORTED;
private VpxLibrary() {}

View file

@@ -100,7 +100,7 @@ public final class HeartRating extends Rating {
private static HeartRating fromBundle(Bundle bundle) {
checkArgument(
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_DEFAULT)
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_UNSET)
== TYPE);
boolean isRated = bundle.getBoolean(keyForField(FIELD_RATED), /* defaultValue= */ false);
return isRated

View file

@@ -96,7 +96,7 @@ public final class PercentageRating extends Rating {
private static PercentageRating fromBundle(Bundle bundle) {
checkArgument(
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_DEFAULT)
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_UNSET)
== TYPE);
float percent = bundle.getFloat(keyForField(FIELD_PERCENT), /* defaultValue= */ RATING_UNSET);
return percent == RATING_UNSET ? new PercentageRating() : new PercentageRating(percent);

View file

@@ -41,7 +41,7 @@ public abstract class Rating implements Bundleable {
@Documented
@Retention(RetentionPolicy.SOURCE)
@IntDef({
RATING_TYPE_DEFAULT,
RATING_TYPE_UNSET,
RATING_TYPE_HEART,
RATING_TYPE_PERCENTAGE,
RATING_TYPE_STAR,
@@ -49,7 +49,7 @@ public abstract class Rating implements Bundleable {
})
/* package */ @interface RatingType {}
/* package */ static final int RATING_TYPE_DEFAULT = -1;
/* package */ static final int RATING_TYPE_UNSET = -1;
/* package */ static final int RATING_TYPE_HEART = 0;
/* package */ static final int RATING_TYPE_PERCENTAGE = 1;
/* package */ static final int RATING_TYPE_STAR = 2;
@@ -68,7 +68,7 @@ public abstract class Rating implements Bundleable {
private static Rating fromBundle(Bundle bundle) {
@RatingType
int ratingType =
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_DEFAULT);
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_UNSET);
switch (ratingType) {
case RATING_TYPE_HEART:
return HeartRating.CREATOR.fromBundle(bundle);
@@ -78,8 +78,9 @@ public abstract class Rating implements Bundleable {
return StarRating.CREATOR.fromBundle(bundle);
case RATING_TYPE_THUMB:
return ThumbRating.CREATOR.fromBundle(bundle);
case RATING_TYPE_UNSET:
default:
throw new IllegalArgumentException("Encountered unknown rating type: " + ratingType);
throw new IllegalArgumentException("Unknown RatingType: " + ratingType);
}
}
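The `fromBundle` dispatch above is a tagged-union decode: read the stored type tag (defaulting to `RATING_TYPE_UNSET`), route to the matching subtype, and reject unset or unknown tags. A self-contained sketch of the same dispatch over plain ints; the UNSET/HEART/PERCENTAGE/STAR values match the constants in the diff, while `RATING_TYPE_THUMB = 3` is assumed for illustration:

```java
public class RatingTypeDispatch {
  // UNSET/HEART/PERCENTAGE/STAR values are taken from the Rating class above;
  // THUMB = 3 is an assumption for this sketch.
  static final int RATING_TYPE_UNSET = -1;
  static final int RATING_TYPE_HEART = 0;
  static final int RATING_TYPE_PERCENTAGE = 1;
  static final int RATING_TYPE_STAR = 2;
  static final int RATING_TYPE_THUMB = 3;

  // Mirrors Rating.fromBundle's switch: map a type tag to a subtype name,
  // treating RATING_TYPE_UNSET the same as any other unknown tag.
  public static String decode(int ratingType) {
    switch (ratingType) {
      case RATING_TYPE_HEART:
        return "HeartRating";
      case RATING_TYPE_PERCENTAGE:
        return "PercentageRating";
      case RATING_TYPE_STAR:
        return "StarRating";
      case RATING_TYPE_THUMB:
        return "ThumbRating";
      case RATING_TYPE_UNSET:
      default:
        throw new IllegalArgumentException("Unknown RatingType: " + ratingType);
    }
  }

  public static void main(String[] args) {
    System.out.println(decode(RATING_TYPE_HEART)); // HeartRating
  }
}
```

Listing `RATING_TYPE_UNSET` explicitly before `default` documents that an unset tag is an error rather than a silently ignored case.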

View file

@@ -125,7 +125,7 @@ public final class StarRating extends Rating {
private static StarRating fromBundle(Bundle bundle) {
checkArgument(
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_DEFAULT)
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_UNSET)
== TYPE);
int maxStars =
bundle.getInt(keyForField(FIELD_MAX_STARS), /* defaultValue= */ MAX_STARS_DEFAULT);

View file

@@ -97,7 +97,7 @@ public final class ThumbRating extends Rating {
private static ThumbRating fromBundle(Bundle bundle) {
checkArgument(
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_DEFAULT)
bundle.getInt(keyForField(FIELD_RATING_TYPE), /* defaultValue= */ RATING_TYPE_UNSET)
== TYPE);
boolean rated = bundle.getBoolean(keyForField(FIELD_RATED), /* defaultValue= */ false);
return rated

View file

@@ -25,7 +25,6 @@ import android.os.Bundle;
import androidx.annotation.IntDef;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.source.TrackGroup;
import com.google.android.exoplayer2.trackselection.TrackSelectionParameters;
import com.google.common.base.MoreObjects;
import com.google.common.collect.ImmutableList;
import com.google.common.primitives.Booleans;
@@ -35,11 +34,12 @@ import java.lang.annotation.RetentionPolicy;
import java.util.Arrays;
import java.util.List;
/** Immutable information ({@link TrackGroupInfo}) about tracks. */
/** Information about groups of tracks. */
public final class TracksInfo implements Bundleable {
/**
* Information about tracks in a {@link TrackGroup}: their {@link C.TrackType}, if their format is
* supported by the player and if they are selected for playback.
* Information about a single group of tracks, including the underlying {@link TrackGroup}, the
* {@link C.TrackType type} of tracks it contains, and the level to which each track is supported
* by the player.
*/
public static final class TrackGroupInfo implements Bundleable {
private final TrackGroup trackGroup;
@@ -74,7 +74,7 @@ public final class TracksInfo implements Bundleable {
}
/**
* Returns the level of support for a track in a {@link TrackGroup}.
* Returns the level of support for a specified track.
*
* @param trackIndex The index of the track in the {@link TrackGroup}.
* @return The {@link C.FormatSupport} of the track.
@@ -85,24 +85,58 @@ }
}
/**
* Returns if a track in a {@link TrackGroup} is supported for playback.
* Returns whether a specified track is supported for playback, without exceeding the advertised
* capabilities of the device. Equivalent to {@code isTrackSupported(trackIndex, false)}.
*
* @param trackIndex The index of the track in the {@link TrackGroup}.
* @return True if the track's format can be played, false otherwise.
*/
public boolean isTrackSupported(int trackIndex) {
return trackSupport[trackIndex] == C.FORMAT_HANDLED;
return isTrackSupported(trackIndex, /* allowExceedsCapabilities= */ false);
}
/** Returns if at least one track in a {@link TrackGroup} is selected for playback. */
/**
* Returns whether a specified track is supported for playback.
*
* @param trackIndex The index of the track in the {@link TrackGroup}.
* @param allowExceedsCapabilities Whether to consider the track as supported if it has a
* supported {@link Format#sampleMimeType MIME type}, but otherwise exceeds the advertised
* capabilities of the device. For example, a video track for which there's a corresponding
* decoder whose maximum advertised resolution is exceeded by the resolution of the track.
* Such tracks may be playable in some cases.
* @return True if the track's format can be played, false otherwise.
*/
public boolean isTrackSupported(int trackIndex, boolean allowExceedsCapabilities) {
return trackSupport[trackIndex] == C.FORMAT_HANDLED
|| (allowExceedsCapabilities
&& trackSupport[trackIndex] == C.FORMAT_EXCEEDS_CAPABILITIES);
}
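Outside of ExoPlayer, the two-argument check above reduces to a small predicate over per-track support flags. A minimal standalone sketch; the two constants are local stand-ins for the `C.FormatSupport` values, with illustrative values only:

```java
public class TrackSupportCheck {
  // Local stand-ins for C.FORMAT_HANDLED and C.FORMAT_EXCEEDS_CAPABILITIES
  // (illustrative values, not ExoPlayer's actual constants).
  static final int FORMAT_HANDLED = 4;
  static final int FORMAT_EXCEEDS_CAPABILITIES = 3;

  // Mirrors TrackGroupInfo.isTrackSupported(int, boolean): a track counts as
  // supported either outright, or, when allowed, if it merely exceeds the
  // device's advertised capabilities (e.g. resolution above the decoder max).
  public static boolean isTrackSupported(
      int[] trackSupport, int trackIndex, boolean allowExceedsCapabilities) {
    return trackSupport[trackIndex] == FORMAT_HANDLED
        || (allowExceedsCapabilities
            && trackSupport[trackIndex] == FORMAT_EXCEEDS_CAPABILITIES);
  }

  public static void main(String[] args) {
    int[] support = {FORMAT_HANDLED, FORMAT_EXCEEDS_CAPABILITIES};
    System.out.println(isTrackSupported(support, 1, false)); // strict: false
    System.out.println(isTrackSupported(support, 1, true));  // lenient: true
  }
}
```

The strict default keeps existing callers conservative, while the lenient overload lets UIs offer tracks that might still play despite exceeding advertised decoder limits.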
/** Returns whether at least one track in the group is selected for playback. */
public boolean isSelected() {
return Booleans.contains(trackSelected, true);
}
/** Returns if at least one track in a {@link TrackGroup} is supported. */
/**
* Returns whether at least one track in the group is supported for playback, without exceeding
* the advertised capabilities of the device. Equivalent to {@code isSupported(false)}.
*/
public boolean isSupported() {
return isSupported(/* allowExceedsCapabilities= */ false);
}
/**
* Returns whether at least one track in the group is supported for playback.
*
* @param allowExceedsCapabilities Whether to consider a track as supported if it has a
* supported {@link Format#sampleMimeType MIME type}, but otherwise exceeds the advertised
* capabilities of the device. For example, a video track for which there's a corresponding
* decoder whose maximum advertised resolution is exceeded by the resolution of the track.
* Such tracks may be playable in some cases.
*/
public boolean isSupported(boolean allowExceedsCapabilities) {
for (int i = 0; i < trackSupport.length; i++) {
if (isTrackSupported(i)) {
if (isTrackSupported(i, allowExceedsCapabilities)) {
return true;
}
}
@@ -110,27 +144,24 @@ public final class TracksInfo implements Bundleable {
}
/**
* Returns if a track in a {@link TrackGroup} is selected for playback.
* Returns whether a specified track is selected for playback.
*
* <p>Multiple tracks of a track group may be selected. This is common in adaptive streaming,
* where multiple tracks of different quality are selected and the player switches between them
* depending on the network and the {@link TrackSelectionParameters}.
* <p>Note that multiple tracks in the group may be selected. This is common in adaptive
* streaming, where tracks of different qualities are selected and the player switches between
* them during playback (e.g., based on the available network bandwidth).
*
* <p>While this class doesn't provide which selected track is currently playing, some player
* implementations have ways of getting such information. For example ExoPlayer provides this
* information in {@code ExoTrackSelection.getSelectedFormat}.
* <p>This class doesn't provide a way to determine which of the selected tracks is currently
* playing, however some player implementations have ways of getting such information. For
* example, ExoPlayer provides this information via {@code ExoTrackSelection.getSelectedFormat}.
*
* @param trackIndex The index of the track in the {@link TrackGroup}.
* @return true if the track is selected, false otherwise.
* @return True if the track is selected, false otherwise.
*/
public boolean isTrackSelected(int trackIndex) {
return trackSelected[trackIndex];
}
/**
* Returns the {@link C.TrackType} of the tracks in the {@link TrackGroup}. Tracks in a group
* are all of the same type.
*/
/** Returns the {@link C.TrackType} of the group. */
public @C.TrackType int getTrackType() {
return trackType;
}
@@ -212,28 +243,49 @@ public final class TracksInfo implements Bundleable {
private final ImmutableList<TrackGroupInfo> trackGroupInfos;
/** An empty {@code TrackInfo} containing no {@link TrackGroupInfo}. */
/** A {@code TracksInfo} that contains no tracks. */
public static final TracksInfo EMPTY = new TracksInfo(ImmutableList.of());
/** Constructs {@code TracksInfo} from the provided {@link TrackGroupInfo}. */
/**
* Constructs an instance.
*
* @param trackGroupInfos The {@link TrackGroupInfo TrackGroupInfos} describing the groups of
* tracks.
*/
public TracksInfo(List<TrackGroupInfo> trackGroupInfos) {
this.trackGroupInfos = ImmutableList.copyOf(trackGroupInfos);
}
/** Returns the {@link TrackGroupInfo TrackGroupInfos}, describing each {@link TrackGroup}. */
/** Returns the {@link TrackGroupInfo TrackGroupInfos} describing the groups of tracks. */
public ImmutableList<TrackGroupInfo> getTrackGroupInfos() {
return trackGroupInfos;
}
/**
* Returns true if at least one track of type {@code trackType} is {@link
* TrackGroupInfo#isTrackSupported(int) supported}, or there are no tracks of this type.
* TrackGroupInfo#isTrackSupported(int) supported} or if there are no tracks of this type.
*/
public boolean isTypeSupportedOrEmpty(@C.TrackType int trackType) {
return isTypeSupportedOrEmpty(trackType, /* allowExceedsCapabilities= */ false);
}
/**
* Returns true if at least one track of type {@code trackType} is {@link
* TrackGroupInfo#isTrackSupported(int, boolean) supported} or if there are no tracks of this
* type.
*
* @param allowExceedsCapabilities Whether to consider the track as supported if it has a
* supported {@link Format#sampleMimeType MIME type}, but otherwise exceeds the advertised
* capabilities of the device. For example, a video track for which there's a corresponding
* decoder whose maximum advertised resolution is exceeded by the resolution of the track.
* Such tracks may be playable in some cases.
*/
public boolean isTypeSupportedOrEmpty(
@C.TrackType int trackType, boolean allowExceedsCapabilities) {
boolean supported = true;
for (int i = 0; i < trackGroupInfos.size(); i++) {
if (trackGroupInfos.get(i).trackType == trackType) {
if (trackGroupInfos.get(i).isSupported()) {
if (trackGroupInfos.get(i).isSupported(allowExceedsCapabilities)) {
return true;
} else {
supported = false;

View file

@@ -220,6 +220,17 @@ public final class MediaFormatUtil {
case C.ENCODING_PCM_FLOAT:
mediaFormatPcmEncoding = AudioFormat.ENCODING_PCM_FLOAT;
break;
case C.ENCODING_PCM_24BIT:
mediaFormatPcmEncoding = AudioFormat.ENCODING_PCM_24BIT_PACKED;
break;
case C.ENCODING_PCM_32BIT:
mediaFormatPcmEncoding = AudioFormat.ENCODING_PCM_32BIT;
break;
case C.ENCODING_INVALID:
mediaFormatPcmEncoding = AudioFormat.ENCODING_INVALID;
break;
case Format.NO_VALUE:
case C.ENCODING_PCM_16BIT_BIG_ENDIAN:
default:
// No matching value. Do nothing.
return;

View file

@@ -2405,6 +2405,8 @@ public final class Util {
return "camera motion";
case C.TRACK_TYPE_NONE:
return "none";
case C.TRACK_TYPE_UNKNOWN:
return "unknown";
default:
return trackType >= C.TRACK_TYPE_CUSTOM_BASE ? "custom (" + trackType + ")" : "?";
}
@@ -2537,6 +2539,20 @@ public final class Util {
.build();
}
/**
* Returns the sum of all summands of the given array.
*
* @param summands The summands to calculate the sum from.
* @return The sum of all summands.
*/
public static long sum(long... summands) {
long sum = 0;
for (long summand : summands) {
sum += summand;
}
return sum;
}
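The new `Util.sum` helper is a plain varargs fold; a standalone copy for illustration:

```java
public class SumSketch {
  // Same shape as the Util.sum(long...) added above: fold varargs into a long.
  public static long sum(long... summands) {
    long sum = 0;
    for (long summand : summands) {
      sum += summand;
    }
    return sum;
  }

  public static void main(String[] args) {
    System.out.println(sum(1, 2, 3)); // 6
    System.out.println(sum());        // 0 (empty varargs)
  }
}
```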
@Nullable
private static String getSystemProperty(String name) {
try {

View file

@@ -146,10 +146,10 @@ public class MediaFormatUtilTest {
@Test
public void createMediaFormatFromFormat_withPcmEncoding_setsCustomPcmEncodingEntry() {
Format format = new Format.Builder().setPcmEncoding(C.ENCODING_PCM_32BIT).build();
Format format = new Format.Builder().setPcmEncoding(C.ENCODING_PCM_16BIT_BIG_ENDIAN).build();
MediaFormat mediaFormat = MediaFormatUtil.createMediaFormatFromFormat(format);
assertThat(mediaFormat.getInteger(MediaFormatUtil.KEY_PCM_ENCODING_EXTENDED))
.isEqualTo(C.ENCODING_PCM_32BIT);
.isEqualTo(C.ENCODING_PCM_16BIT_BIG_ENDIAN);
assertThat(mediaFormat.containsKey(MediaFormat.KEY_PCM_ENCODING)).isFalse();
}
}

View file

@@ -99,10 +99,9 @@ public abstract class BaseRenderer implements Renderer, RendererCapabilities {
Assertions.checkState(state == STATE_DISABLED);
this.configuration = configuration;
state = STATE_ENABLED;
lastResetPositionUs = positionUs;
onEnabled(joining, mayRenderStartOfStream);
replaceStream(formats, stream, startPositionUs, offsetUs);
onPositionReset(positionUs, joining);
resetPosition(positionUs, joining);
}
@Override
@@ -159,10 +158,14 @@ public abstract class BaseRenderer implements Renderer, RendererCapabilities {
@Override
public final void resetPosition(long positionUs) throws ExoPlaybackException {
resetPosition(positionUs, /* joining= */ false);
}
private void resetPosition(long positionUs, boolean joining) throws ExoPlaybackException {
streamIsFinal = false;
lastResetPositionUs = positionUs;
readingPositionUs = positionUs;
onPositionReset(positionUs, false);
onPositionReset(positionUs, joining);
}
@Override

View file

@@ -1456,19 +1456,6 @@ public interface ExoPlayer extends Player {
*/
void setPriorityTaskManager(@Nullable PriorityTaskManager priorityTaskManager);
/**
* Sets whether the player should throw an {@link IllegalStateException} when methods are called
* from a thread other than the one associated with {@link #getApplicationLooper()}.
*
* <p>The default is {@code true} and this method will be removed in the future.
*
* @param throwsWhenUsingWrongThread Whether to throw when methods are called from a wrong thread.
* @deprecated Disabling the enforcement can result in hard-to-detect bugs. Do not use this method
* except to ease the transition while wrong thread access problems are fixed.
*/
@Deprecated
void setThrowsWhenUsingWrongThread(boolean throwsWhenUsingWrongThread);
/**
* Sets whether audio offload scheduling is enabled. If enabled, ExoPlayer's main loop will run as
* rarely as possible when playing an audio stream using audio offload.

View file

@@ -230,7 +230,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
BandwidthMeter bandwidthMeter,
@Player.RepeatMode int repeatMode,
boolean shuffleModeEnabled,
@Nullable AnalyticsCollector analyticsCollector,
AnalyticsCollector analyticsCollector,
SeekParameters seekParameters,
LivePlaybackSpeedControl livePlaybackSpeedControl,
long releaseTimeoutMs,
@@ -1226,7 +1226,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
/* forceBufferingState= */ playbackInfo.playbackState == Player.STATE_ENDED);
seekPositionAdjusted |= periodPositionUs != newPeriodPositionUs;
periodPositionUs = newPeriodPositionUs;
updateLivePlaybackSpeedControl(
updatePlaybackSpeedSettingsForNewPeriod(
/* newTimeline= */ playbackInfo.timeline,
/* newPeriodId= */ periodId,
/* oldTimeline= */ playbackInfo.timeline,
@@ -1866,7 +1866,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
newPositionUs = seekToPeriodPosition(newPeriodId, newPositionUs, forceBufferingState);
}
} finally {
updateLivePlaybackSpeedControl(
updatePlaybackSpeedSettingsForNewPeriod(
/* newTimeline= */ timeline,
newPeriodId,
/* oldTimeline= */ playbackInfo.timeline,
@@ -1906,16 +1906,19 @@ import java.util.concurrent.atomic.AtomicBoolean;
}
}
private void updateLivePlaybackSpeedControl(
private void updatePlaybackSpeedSettingsForNewPeriod(
Timeline newTimeline,
MediaPeriodId newPeriodId,
Timeline oldTimeline,
MediaPeriodId oldPeriodId,
long positionForTargetOffsetOverrideUs) {
if (newTimeline.isEmpty() || !shouldUseLivePlaybackSpeedControl(newTimeline, newPeriodId)) {
// Live playback speed control is unused for the current period, reset speed if adjusted.
if (mediaClock.getPlaybackParameters().speed != playbackInfo.playbackParameters.speed) {
mediaClock.setPlaybackParameters(playbackInfo.playbackParameters);
if (!shouldUseLivePlaybackSpeedControl(newTimeline, newPeriodId)) {
// Live playback speed control is unused for the current period, reset speed to user-defined
// playback parameters or 1.0 for ad playback.
PlaybackParameters targetPlaybackParameters =
newPeriodId.isAd() ? PlaybackParameters.DEFAULT : playbackInfo.playbackParameters;
if (!mediaClock.getPlaybackParameters().equals(targetPlaybackParameters)) {
mediaClock.setPlaybackParameters(targetPlaybackParameters);
}
return;
}
@@ -2046,10 +2049,18 @@ import java.util.concurrent.atomic.AtomicBoolean;
return;
}
MediaPeriodHolder oldReadingPeriodHolder = readingPeriodHolder;
TrackSelectorResult oldTrackSelectorResult = readingPeriodHolder.getTrackSelectorResult();
readingPeriodHolder = queue.advanceReadingPeriod();
TrackSelectorResult newTrackSelectorResult = readingPeriodHolder.getTrackSelectorResult();
updatePlaybackSpeedSettingsForNewPeriod(
/* newTimeline= */ playbackInfo.timeline,
/* newPeriodId= */ readingPeriodHolder.info.id,
/* oldTimeline= */ playbackInfo.timeline,
/* oldPeriodId= */ oldReadingPeriodHolder.info.id,
/* positionForTargetOffsetOverrideUs= */ C.TIME_UNSET);
if (readingPeriodHolder.prepared
&& readingPeriodHolder.mediaPeriod.readDiscontinuity() != C.TIME_UNSET) {
// The new period starts with a discontinuity, so the renderers will play out all data, then
@@ -2134,7 +2145,6 @@ import java.util.concurrent.atomic.AtomicBoolean;
// If we advance more than one period at a time, notify listeners after each update.
maybeNotifyPlaybackInfoChanged();
}
MediaPeriodHolder oldPlayingPeriodHolder = queue.getPlayingPeriod();
MediaPeriodHolder newPlayingPeriodHolder = queue.advancePlayingPeriod();
playbackInfo =
handlePositionDiscontinuity(
@@ -2144,12 +2154,6 @@ import java.util.concurrent.atomic.AtomicBoolean;
/* discontinuityStartPositionUs= */ newPlayingPeriodHolder.info.startPositionUs,
/* reportDiscontinuity= */ true,
Player.DISCONTINUITY_REASON_AUTO_TRANSITION);
updateLivePlaybackSpeedControl(
/* newTimeline= */ playbackInfo.timeline,
/* newPeriodId= */ newPlayingPeriodHolder.info.id,
/* oldTimeline= */ playbackInfo.timeline,
/* oldPeriodId= */ oldPlayingPeriodHolder.info.id,
/* positionForTargetOffsetOverrideUs= */ C.TIME_UNSET);
resetPendingPauseAtEndOfPeriod();
updatePlaybackPositions();
advancedPlayingPeriod = true;

View file

@@ -66,7 +66,7 @@ import com.google.common.collect.ImmutableList;
private final Timeline.Period period;
private final Timeline.Window window;
@Nullable private final AnalyticsCollector analyticsCollector;
private final AnalyticsCollector analyticsCollector;
private final Handler analyticsCollectorHandler;
private long nextWindowSequenceNumber;
@@ -82,13 +82,12 @@ import com.google.common.collect.ImmutableList;
/**
* Creates a new media period queue.
*
* @param analyticsCollector An optional {@link AnalyticsCollector} to be informed of queue
* changes.
* @param analyticsCollector An {@link AnalyticsCollector} to be informed of queue changes.
* @param analyticsCollectorHandler The {@link Handler} to call {@link AnalyticsCollector} methods
* on.
*/
public MediaPeriodQueue(
@Nullable AnalyticsCollector analyticsCollector, Handler analyticsCollectorHandler) {
AnalyticsCollector analyticsCollector, Handler analyticsCollectorHandler) {
this.analyticsCollector = analyticsCollector;
this.analyticsCollectorHandler = analyticsCollectorHandler;
period = new Timeline.Period();
@@ -451,17 +450,15 @@ import com.google.common.collect.ImmutableList;
// Internal methods.
private void notifyQueueUpdate() {
if (analyticsCollector != null) {
ImmutableList.Builder<MediaPeriodId> builder = ImmutableList.builder();
@Nullable MediaPeriodHolder period = playing;
while (period != null) {
builder.add(period.info.id);
period = period.getNext();
}
@Nullable MediaPeriodId readingPeriodId = reading == null ? null : reading.info.id;
analyticsCollectorHandler.post(
() -> analyticsCollector.updateMediaPeriodQueueInfo(builder.build(), readingPeriodId));
ImmutableList.Builder<MediaPeriodId> builder = ImmutableList.builder();
@Nullable MediaPeriodHolder period = playing;
while (period != null) {
builder.add(period.info.id);
period = period.getNext();
}
@Nullable MediaPeriodId readingPeriodId = reading == null ? null : reading.info.id;
analyticsCollectorHandler.post(
() -> analyticsCollector.updateMediaPeriodQueueInfo(builder.build(), readingPeriodId));
}
/**

View file

@@ -91,15 +91,15 @@ import java.util.Set;
*
* @param listener The {@link MediaSourceListInfoRefreshListener} to be informed of timeline
* changes.
* @param analyticsCollector An optional {@link AnalyticsCollector} to be registered for media
* source events.
* @param analyticsCollector An {@link AnalyticsCollector} to be registered for media source
* events.
* @param analyticsCollectorHandler The {@link Handler} to call {@link AnalyticsCollector} methods
* on.
* @param playerId The {@link PlayerId} of the player using this list.
*/
public MediaSourceList(
MediaSourceListInfoRefreshListener listener,
@Nullable AnalyticsCollector analyticsCollector,
AnalyticsCollector analyticsCollector,
Handler analyticsCollectorHandler,
PlayerId playerId) {
this.playerId = playerId;
@@ -112,10 +112,8 @@ import java.util.Set;
drmEventDispatcher = new DrmSessionEventListener.EventDispatcher();
childSources = new HashMap<>();
enabledMediaSourceHolders = new HashSet<>();
if (analyticsCollector != null) {
mediaSourceEventDispatcher.addEventListener(analyticsCollectorHandler, analyticsCollector);
drmEventDispatcher.addEventListener(analyticsCollectorHandler, analyticsCollector);
}
mediaSourceEventDispatcher.addEventListener(analyticsCollectorHandler, analyticsCollector);
drmEventDispatcher.addEventListener(analyticsCollectorHandler, analyticsCollector);
}
/**

View file

@@ -1541,9 +1541,7 @@ public class SimpleExoPlayer extends BasePlayer
streamVolumeManager.setMuted(muted);
}
@Deprecated
@Override
public void setThrowsWhenUsingWrongThread(boolean throwsWhenUsingWrongThread) {
/* package */ void setThrowsWhenUsingWrongThread(boolean throwsWhenUsingWrongThread) {
this.throwsWhenUsingWrongThread = throwsWhenUsingWrongThread;
}

View file

@@ -177,7 +177,8 @@ public final class DefaultPlaybackSessionManager implements PlaybackSessionManag
Iterator<SessionDescriptor> iterator = sessions.values().iterator();
while (iterator.hasNext()) {
SessionDescriptor session = iterator.next();
if (!session.tryResolvingToNewTimeline(previousTimeline, currentTimeline)) {
if (!session.tryResolvingToNewTimeline(previousTimeline, currentTimeline)
|| session.isFinishedAtEventTime(eventTime)) {
iterator.remove();
if (session.isCreated) {
if (session.sessionId.equals(currentSessionId)) {

View file

@@ -40,6 +40,7 @@ import android.util.Pair;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.C.ContentType;
import com.google.android.exoplayer2.ExoPlaybackException;
import com.google.android.exoplayer2.ExoPlayerLibraryInfo;
import com.google.android.exoplayer2.Format;
@@ -65,7 +66,6 @@ import com.google.android.exoplayer2.source.TrackGroup;
import com.google.android.exoplayer2.upstream.FileDataSource;
import com.google.android.exoplayer2.upstream.HttpDataSource;
import com.google.android.exoplayer2.upstream.UdpDataSource;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.exoplayer2.util.NetworkTypeObserver;
import com.google.android.exoplayer2.util.Util;
import com.google.android.exoplayer2.video.VideoSize;
@@ -74,6 +74,7 @@ import java.io.FileNotFoundException;
import java.io.IOException;
import java.net.SocketTimeoutException;
import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.UUID;
import org.checkerframework.checker.nullness.compatqual.NullableType;
import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
@@ -112,7 +113,10 @@ public final class MediaMetricsListener
private final long startTimeMs;
private final Timeline.Window window;
private final Timeline.Period period;
private final HashMap<String, Long> bandwidthTimeMs;
private final HashMap<String, Long> bandwidthBytes;
@Nullable private String activeSessionId;
@Nullable private PlaybackMetrics.Builder metricsBuilder;
@Player.DiscontinuityReason private int discontinuityReason;
private int currentPlaybackState;
@@ -129,8 +133,6 @@ public final class MediaMetricsListener
private boolean hasFatalError;
private int droppedFrames;
private int playedFrames;
private long bandwidthTimeMs;
private long bandwidthBytes;
private int audioUnderruns;
/**
@@ -144,6 +146,8 @@ public final class MediaMetricsListener
this.playbackSession = playbackSession;
window = new Timeline.Window();
period = new Timeline.Period();
bandwidthBytes = new HashMap<>();
bandwidthTimeMs = new HashMap<>();
startTimeMs = SystemClock.elapsedRealtime();
currentPlaybackState = PlaybackStateEvent.STATE_NOT_STARTED;
currentNetworkType = NetworkEvent.NETWORK_TYPE_UNKNOWN;
@@ -168,6 +172,7 @@ public final class MediaMetricsListener
return;
}
finishCurrentSession();
activeSessionId = sessionId;
metricsBuilder =
new PlaybackMetrics.Builder()
.setPlayerName(ExoPlayerLibraryInfo.TAG)
@@ -182,11 +187,14 @@ public final class MediaMetricsListener
@Override
public void onSessionFinished(
EventTime eventTime, String sessionId, boolean automaticTransitionToNextPlayback) {
if (eventTime.mediaPeriodId != null && eventTime.mediaPeriodId.isAd()) {
// Ignore ad sessions.
return;
if ((eventTime.mediaPeriodId != null && eventTime.mediaPeriodId.isAd())
|| !sessionId.equals(activeSessionId)) {
// Ignore ad sessions and other sessions that are finished before becoming active.
} else {
finishCurrentSession();
}
finishCurrentSession();
bandwidthTimeMs.remove(sessionId);
bandwidthBytes.remove(sessionId);
}
// AnalyticsListener implementation.
@@ -213,8 +221,17 @@ public final class MediaMetricsListener
@Override
public void onBandwidthEstimate(
EventTime eventTime, int totalLoadTimeMs, long totalBytesLoaded, long bitrateEstimate) {
bandwidthTimeMs += totalLoadTimeMs;
bandwidthBytes += totalBytesLoaded;
if (eventTime.mediaPeriodId != null) {
String sessionId =
sessionManager.getSessionForMediaPeriodId(
eventTime.timeline, checkNotNull(eventTime.mediaPeriodId));
@Nullable Long prevBandwidthBytes = bandwidthBytes.get(sessionId);
@Nullable Long prevBandwidthTimeMs = bandwidthTimeMs.get(sessionId);
bandwidthBytes.put(
sessionId, (prevBandwidthBytes == null ? 0 : prevBandwidthBytes) + totalBytesLoaded);
bandwidthTimeMs.put(
sessionId, (prevBandwidthTimeMs == null ? 0 : prevBandwidthTimeMs) + totalLoadTimeMs);
}
}
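The change above replaces two scalar counters with per-session maps, accumulating each bandwidth estimate under the session id and treating a missing entry as zero. A standalone sketch of that accumulation pattern (class and method names here are hypothetical, not part of the diff):

```java
import java.util.HashMap;
import java.util.Map;

class BandwidthAccumulator {
  private final Map<String, Long> bandwidthBytes = new HashMap<>();
  private final Map<String, Long> bandwidthTimeMs = new HashMap<>();

  // Equivalent to the (prev == null ? 0 : prev) + delta pattern in the diff.
  void onBandwidthEstimate(String sessionId, long totalLoadTimeMs, long totalBytesLoaded) {
    bandwidthBytes.merge(sessionId, totalBytesLoaded, Long::sum);
    bandwidthTimeMs.merge(sessionId, totalLoadTimeMs, Long::sum);
  }

  // Mirrors finishCurrentSession(), which defaults a missing entry to 0.
  long bytesFor(String sessionId) {
    Long bytes = bandwidthBytes.get(sessionId);
    return bytes == null ? 0 : bytes;
  }
}
```

`Map.merge` collapses the explicit null check into one call while keeping the same semantics.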
@Override
@@ -578,16 +595,25 @@ public final class MediaMetricsListener
metricsBuilder.setAudioUnderrunCount(audioUnderruns);
metricsBuilder.setVideoFramesDropped(droppedFrames);
metricsBuilder.setVideoFramesPlayed(playedFrames);
metricsBuilder.setNetworkTransferDurationMillis(bandwidthTimeMs);
@Nullable Long networkTimeMs = bandwidthTimeMs.get(activeSessionId);
metricsBuilder.setNetworkTransferDurationMillis(networkTimeMs == null ? 0 : networkTimeMs);
// TODO(b/181121847): Report localBytesRead. This requires additional callbacks or plumbing.
metricsBuilder.setNetworkBytesRead(bandwidthBytes);
@Nullable Long networkBytes = bandwidthBytes.get(activeSessionId);
metricsBuilder.setNetworkBytesRead(networkBytes == null ? 0 : networkBytes);
// TODO(b/181121847): Detect stream sources mixed and local depending on localBytesRead.
metricsBuilder.setStreamSource(
bandwidthBytes > 0
networkBytes != null && networkBytes > 0
? PlaybackMetrics.STREAM_SOURCE_NETWORK
: PlaybackMetrics.STREAM_SOURCE_UNKNOWN);
playbackSession.reportPlaybackMetrics(metricsBuilder.build());
metricsBuilder = null;
activeSessionId = null;
audioUnderruns = 0;
droppedFrames = 0;
playedFrames = 0;
currentVideoFormat = null;
currentAudioFormat = null;
currentTextFormat = null;
}
private static int getTrackChangeReason(@C.SelectionReason int trackSelectionReason) {
@@ -636,19 +662,23 @@ public final class MediaMetricsListener
}
private static int getStreamType(MediaItem mediaItem) {
if (mediaItem.localConfiguration == null || mediaItem.localConfiguration.mimeType == null) {
if (mediaItem.localConfiguration == null) {
return PlaybackMetrics.STREAM_TYPE_UNKNOWN;
}
String mimeType = mediaItem.localConfiguration.mimeType;
switch (mimeType) {
case MimeTypes.APPLICATION_M3U8:
@ContentType
int contentType =
Util.inferContentTypeForUriAndMimeType(
mediaItem.localConfiguration.uri, mediaItem.localConfiguration.mimeType);
switch (contentType) {
case C.TYPE_HLS:
return PlaybackMetrics.STREAM_TYPE_HLS;
case MimeTypes.APPLICATION_MPD:
case C.TYPE_DASH:
return PlaybackMetrics.STREAM_TYPE_DASH;
case MimeTypes.APPLICATION_SS:
case C.TYPE_SS:
return PlaybackMetrics.STREAM_TYPE_SS;
case C.TYPE_RTSP:
default:
return PlaybackMetrics.STREAM_TYPE_PROGRESSIVE;
return PlaybackMetrics.STREAM_TYPE_OTHER;
}
}

View file

@@ -404,6 +404,13 @@ public interface AudioSink {
*/
void setAudioAttributes(AudioAttributes audioAttributes);
/**
* Returns the audio attributes used for audio playback, or {@code null} if the sink does not use
* audio attributes.
*/
@Nullable
AudioAttributes getAudioAttributes();
/** Sets the audio session id. */
void setAudioSessionId(int audioSessionId);

View file

@@ -17,6 +17,7 @@ package com.google.android.exoplayer2.audio;
import static com.google.android.exoplayer2.audio.AudioCapabilities.DEFAULT_AUDIO_CAPABILITIES;
import static com.google.android.exoplayer2.util.Assertions.checkNotNull;
import static com.google.android.exoplayer2.util.Util.constrainValue;
import static com.google.common.base.MoreObjects.firstNonNull;
import static java.lang.Math.max;
import static java.lang.Math.min;
@@ -209,6 +210,39 @@ public final class DefaultAudioSink implements AudioSink {
}
}
/** Provides the buffer size to use when creating an {@link AudioTrack}. */
interface AudioTrackBufferSizeProvider {
/** Default instance. */
AudioTrackBufferSizeProvider DEFAULT =
new DefaultAudioTrackBufferSizeProvider.Builder().build();
/**
* Returns the buffer size to use when creating an {@link AudioTrack} for a specific format and
* output mode.
*
* @param minBufferSizeInBytes The minimum buffer size in bytes required to play this format.
* See {@link AudioTrack#getMinBufferSize}.
* @param encoding The {@link C.Encoding} of the format.
* @param outputMode How the audio will be played. One of the {@link OutputMode output modes}.
* @param pcmFrameSize The size of the PCM frames if the {@code encoding} is PCM, 1 otherwise,
* in bytes.
* @param sampleRate The sample rate of the format, in Hz.
* @param maxAudioTrackPlaybackSpeed The maximum speed the content will be played using {@link
* AudioTrack#setPlaybackParams}. 0.5 is 2x slow motion, 1 is real time, 2 is 2x fast
* forward, etc. This will be {@code 1} unless {@link
* Builder#setEnableAudioTrackPlaybackParams} is enabled.
* @return The computed buffer size in bytes. It should always be {@code >=
* minBufferSizeInBytes}. The computed buffer size must contain an integer number of frames:
* {@code bufferSizeInBytes % pcmFrameSize == 0}.
*/
int getBufferSizeInBytes(
int minBufferSizeInBytes,
@C.Encoding int encoding,
@OutputMode int outputMode,
int pcmFrameSize,
int sampleRate,
double maxAudioTrackPlaybackSpeed);
}
/** A builder to create {@link DefaultAudioSink} instances. */
public static final class Builder {
@@ -217,11 +251,13 @@ public final class DefaultAudioSink implements AudioSink {
private boolean enableFloatOutput;
private boolean enableAudioTrackPlaybackParams;
private int offloadMode;
AudioTrackBufferSizeProvider audioTrackBufferSizeProvider;
/** Creates a new builder. */
public Builder() {
audioCapabilities = DEFAULT_AUDIO_CAPABILITIES;
offloadMode = OFFLOAD_MODE_DISABLED;
audioTrackBufferSizeProvider = AudioTrackBufferSizeProvider.DEFAULT;
}
/**
@@ -302,6 +338,18 @@ public final class DefaultAudioSink implements AudioSink {
return this;
}
/**
* Sets an {@link AudioTrackBufferSizeProvider} to compute the buffer size when {@link
* #configure} is called with {@code specifiedBufferSize == 0}.
*
* <p>The default value is {@link AudioTrackBufferSizeProvider#DEFAULT}.
*/
public Builder setAudioTrackBufferSizeProvider(
AudioTrackBufferSizeProvider audioTrackBufferSizeProvider) {
this.audioTrackBufferSizeProvider = audioTrackBufferSizeProvider;
return this;
}
/** Builds the {@link DefaultAudioSink}. Must only be called once per Builder instance. */
public DefaultAudioSink build() {
if (audioProcessorChain == null) {
@@ -362,31 +410,18 @@ public final class DefaultAudioSink implements AudioSink {
*/
public static final int OFFLOAD_MODE_ENABLED_GAPLESS_DISABLED = 3;
/** Output mode of the audio sink. */
@Documented
@Retention(RetentionPolicy.SOURCE)
@IntDef({OUTPUT_MODE_PCM, OUTPUT_MODE_OFFLOAD, OUTPUT_MODE_PASSTHROUGH})
private @interface OutputMode {}
public @interface OutputMode {}
private static final int OUTPUT_MODE_PCM = 0;
private static final int OUTPUT_MODE_OFFLOAD = 1;
private static final int OUTPUT_MODE_PASSTHROUGH = 2;
/** A minimum length for the {@link AudioTrack} buffer, in microseconds. */
private static final long MIN_BUFFER_DURATION_US = 250_000;
/** A maximum length for the {@link AudioTrack} buffer, in microseconds. */
private static final long MAX_BUFFER_DURATION_US = 750_000;
/** The length for passthrough {@link AudioTrack} buffers, in microseconds. */
private static final long PASSTHROUGH_BUFFER_DURATION_US = 250_000;
/** The length for offload {@link AudioTrack} buffers, in microseconds. */
private static final long OFFLOAD_BUFFER_DURATION_US = 50_000_000;
/**
* A multiplication factor to apply to the minimum buffer size requested by the underlying {@link
* AudioTrack}.
*/
private static final int BUFFER_MULTIPLICATION_FACTOR = 4;
/** To avoid underruns on some devices (e.g., Broadcom 7271), scale up the AC3 buffer duration. */
private static final int AC3_BUFFER_MULTIPLICATION_FACTOR = 2;
/** The audio sink plays PCM audio. */
public static final int OUTPUT_MODE_PCM = 0;
/** The audio sink plays encoded audio in offload. */
public static final int OUTPUT_MODE_OFFLOAD = 1;
/** The audio sink plays encoded audio in passthrough. */
public static final int OUTPUT_MODE_PASSTHROUGH = 2;
/**
* Native error code equivalent of {@link AudioTrack#ERROR_DEAD_OBJECT} to workaround missing
@@ -433,6 +468,7 @@ public final class DefaultAudioSink implements AudioSink {
private final PendingExceptionHolder<InitializationException>
initializationExceptionPendingExceptionHolder;
private final PendingExceptionHolder<WriteException> writeExceptionPendingExceptionHolder;
private final AudioTrackBufferSizeProvider audioTrackBufferSizeProvider;
@Nullable private PlayerId playerId;
@Nullable private Listener listener;
@@ -553,6 +589,7 @@ public final class DefaultAudioSink implements AudioSink {
enableFloatOutput = Util.SDK_INT >= 21 && builder.enableFloatOutput;
enableAudioTrackPlaybackParams = Util.SDK_INT >= 23 && builder.enableAudioTrackPlaybackParams;
offloadMode = Util.SDK_INT >= 29 ? builder.offloadMode : OFFLOAD_MODE_DISABLED;
audioTrackBufferSizeProvider = builder.audioTrackBufferSizeProvider;
releasingConditionVariable = new ConditionVariable(true);
audioTrackPositionTracker = new AudioTrackPositionTracker(new PositionTrackerListener());
channelMappingAudioProcessor = new ChannelMappingAudioProcessor();
@@ -715,6 +752,16 @@ public final class DefaultAudioSink implements AudioSink {
outputChannelConfig = encodingAndChannelConfig.second;
}
}
int bufferSize =
specifiedBufferSize != 0
? specifiedBufferSize
: audioTrackBufferSizeProvider.getBufferSizeInBytes(
getAudioTrackMinBufferSize(outputSampleRate, outputChannelConfig, outputEncoding),
outputEncoding,
outputMode,
outputPcmFrameSize,
outputSampleRate,
enableAudioTrackPlaybackParams ? MAX_PLAYBACK_SPEED : DEFAULT_PLAYBACK_SPEED);
if (outputEncoding == C.ENCODING_INVALID) {
throw new ConfigurationException(
@@ -736,8 +783,7 @@ public final class DefaultAudioSink implements AudioSink {
outputSampleRate,
outputChannelConfig,
outputEncoding,
specifiedBufferSize,
enableAudioTrackPlaybackParams,
bufferSize,
availableAudioProcessors);
if (isAudioTrackInitialized()) {
this.pendingConfiguration = pendingConfiguration;
@@ -1198,8 +1244,8 @@ public final class DefaultAudioSink implements AudioSink {
public void setPlaybackParameters(PlaybackParameters playbackParameters) {
playbackParameters =
new PlaybackParameters(
Util.constrainValue(playbackParameters.speed, MIN_PLAYBACK_SPEED, MAX_PLAYBACK_SPEED),
Util.constrainValue(playbackParameters.pitch, MIN_PITCH, MAX_PITCH));
constrainValue(playbackParameters.speed, MIN_PLAYBACK_SPEED, MAX_PLAYBACK_SPEED),
constrainValue(playbackParameters.pitch, MIN_PITCH, MAX_PITCH));
if (enableAudioTrackPlaybackParams && Util.SDK_INT >= 23) {
setAudioTrackPlaybackParametersV23(playbackParameters);
} else {
@@ -1239,6 +1285,11 @@ public final class DefaultAudioSink implements AudioSink {
flush();
}
@Override
public AudioAttributes getAudioAttributes() {
return audioAttributes;
}
@Override
public void setAudioSessionId(int audioSessionId) {
if (this.audioSessionId != audioSessionId) {
@@ -1775,47 +1826,6 @@ public final class DefaultAudioSink implements AudioSink {
return Util.SDK_INT >= 29 && audioTrack.isOffloadedPlayback();
}
private static int getMaximumEncodedRateBytesPerSecond(@C.Encoding int encoding) {
switch (encoding) {
case C.ENCODING_MP3:
return MpegAudioUtil.MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_LC:
return AacUtil.AAC_LC_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_HE_V1:
return AacUtil.AAC_HE_V1_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_HE_V2:
return AacUtil.AAC_HE_V2_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_XHE:
return AacUtil.AAC_XHE_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_ELD:
return AacUtil.AAC_ELD_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AC3:
return Ac3Util.AC3_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_E_AC3:
case C.ENCODING_E_AC3_JOC:
return Ac3Util.E_AC3_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AC4:
return Ac4Util.MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_DTS:
return DtsUtil.DTS_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_DTS_HD:
return DtsUtil.DTS_HD_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_DOLBY_TRUEHD:
return Ac3Util.TRUEHD_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_PCM_16BIT:
case C.ENCODING_PCM_16BIT_BIG_ENDIAN:
case C.ENCODING_PCM_24BIT:
case C.ENCODING_PCM_32BIT:
case C.ENCODING_PCM_8BIT:
case C.ENCODING_PCM_FLOAT:
case C.ENCODING_AAC_ER_BSAC:
case C.ENCODING_INVALID:
case Format.NO_VALUE:
default:
throw new IllegalArgumentException();
}
}
private static int getFramesPerEncodedSample(@C.Encoding int encoding, ByteBuffer buffer) {
switch (encoding) {
case C.ENCODING_MP3:
@@ -2005,6 +2015,13 @@ public final class DefaultAudioSink implements AudioSink {
.build();
}
private static int getAudioTrackMinBufferSize(
int sampleRateInHz, int channelConfig, int encoding) {
int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, encoding);
Assertions.checkState(minBufferSize != AudioTrack.ERROR_BAD_VALUE);
return minBufferSize;
}
private final class PositionTrackerListener implements AudioTrackPositionTracker.Listener {
@Override
@@ -2099,8 +2116,7 @@ public final class DefaultAudioSink implements AudioSink {
int outputSampleRate,
int outputChannelConfig,
int outputEncoding,
int specifiedBufferSize,
boolean enableAudioTrackPlaybackParams,
int bufferSize,
AudioProcessor[] availableAudioProcessors) {
this.inputFormat = inputFormat;
this.inputPcmFrameSize = inputPcmFrameSize;
@@ -2109,10 +2125,8 @@ public final class DefaultAudioSink implements AudioSink {
this.outputSampleRate = outputSampleRate;
this.outputChannelConfig = outputChannelConfig;
this.outputEncoding = outputEncoding;
this.bufferSize = bufferSize;
this.availableAudioProcessors = availableAudioProcessors;
// Call computeBufferSize() last as it depends on the other configuration values.
this.bufferSize = computeBufferSize(specifiedBufferSize, enableAudioTrackPlaybackParams);
}
/** Returns if the configurations are sufficiently compatible to reuse the audio track. */
@@ -2132,10 +2146,6 @@ public final class DefaultAudioSink implements AudioSink {
return (frameCount * C.MICROS_PER_SECOND) / outputSampleRate;
}
public long durationUsToFrames(long durationUs) {
return (durationUs * outputSampleRate) / C.MICROS_PER_SECOND;
}
public AudioTrack buildAudioTrack(
boolean tunneling, AudioAttributes audioAttributes, int audioSessionId)
throws InitializationException {
@@ -2236,49 +2246,6 @@ public final class DefaultAudioSink implements AudioSink {
}
}
private int computeBufferSize(
int specifiedBufferSize, boolean enableAudioTrackPlaybackParameters) {
if (specifiedBufferSize != 0) {
return specifiedBufferSize;
}
switch (outputMode) {
case OUTPUT_MODE_PCM:
return getPcmDefaultBufferSize(
enableAudioTrackPlaybackParameters ? MAX_PLAYBACK_SPEED : DEFAULT_PLAYBACK_SPEED);
case OUTPUT_MODE_OFFLOAD:
return getEncodedDefaultBufferSize(OFFLOAD_BUFFER_DURATION_US);
case OUTPUT_MODE_PASSTHROUGH:
return getEncodedDefaultBufferSize(PASSTHROUGH_BUFFER_DURATION_US);
default:
throw new IllegalStateException();
}
}
private int getEncodedDefaultBufferSize(long bufferDurationUs) {
int rate = getMaximumEncodedRateBytesPerSecond(outputEncoding);
if (outputEncoding == C.ENCODING_AC3) {
rate *= AC3_BUFFER_MULTIPLICATION_FACTOR;
}
return (int) (bufferDurationUs * rate / C.MICROS_PER_SECOND);
}
private int getPcmDefaultBufferSize(float maxAudioTrackPlaybackSpeed) {
int minBufferSize =
AudioTrack.getMinBufferSize(outputSampleRate, outputChannelConfig, outputEncoding);
Assertions.checkState(minBufferSize != AudioTrack.ERROR_BAD_VALUE);
int multipliedBufferSize = minBufferSize * BUFFER_MULTIPLICATION_FACTOR;
int minAppBufferSize = (int) durationUsToFrames(MIN_BUFFER_DURATION_US) * outputPcmFrameSize;
int maxAppBufferSize =
max(minBufferSize, (int) durationUsToFrames(MAX_BUFFER_DURATION_US) * outputPcmFrameSize);
int bufferSize =
Util.constrainValue(multipliedBufferSize, minAppBufferSize, maxAppBufferSize);
if (maxAudioTrackPlaybackSpeed != 1f) {
// Maintain the buffer duration by scaling the size accordingly.
bufferSize = Math.round(bufferSize * maxAudioTrackPlaybackSpeed);
}
return bufferSize;
}
@RequiresApi(21)
private static android.media.AudioAttributes getAudioTrackAttributesV21(
AudioAttributes audioAttributes, boolean tunneling) {

View file

@@ -0,0 +1,257 @@
/*
* Copyright (C) 2021 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.audio;
import static com.google.android.exoplayer2.audio.DefaultAudioSink.OUTPUT_MODE_OFFLOAD;
import static com.google.android.exoplayer2.audio.DefaultAudioSink.OUTPUT_MODE_PASSTHROUGH;
import static com.google.android.exoplayer2.audio.DefaultAudioSink.OUTPUT_MODE_PCM;
import static com.google.android.exoplayer2.util.Util.constrainValue;
import static com.google.common.primitives.Ints.checkedCast;
import static java.lang.Math.max;
import android.media.AudioTrack;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.audio.DefaultAudioSink.OutputMode;
/** Provides the buffer size to use when creating an {@link AudioTrack}. */
public class DefaultAudioTrackBufferSizeProvider
implements DefaultAudioSink.AudioTrackBufferSizeProvider {
/** Default minimum length for the {@link AudioTrack} buffer, in microseconds. */
private static final int MIN_PCM_BUFFER_DURATION_US = 250_000;
/** Default maximum length for the {@link AudioTrack} buffer, in microseconds. */
private static final int MAX_PCM_BUFFER_DURATION_US = 750_000;
/** Default multiplication factor to apply to the minimum buffer size requested. */
private static final int PCM_BUFFER_MULTIPLICATION_FACTOR = 4;
/** Default length for passthrough {@link AudioTrack} buffers, in microseconds. */
private static final int PASSTHROUGH_BUFFER_DURATION_US = 250_000;
/** Default length for offload {@link AudioTrack} buffers, in microseconds. */
private static final int OFFLOAD_BUFFER_DURATION_US = 50_000_000;
/**
* Default multiplication factor to apply to the AC3 passthrough buffer to avoid underruns on some
* devices (e.g., Broadcom 7271).
*/
private static final int AC3_BUFFER_MULTIPLICATION_FACTOR = 2;
/** A builder to create {@link DefaultAudioTrackBufferSizeProvider} instances. */
public static class Builder {
private int minPcmBufferDurationUs;
private int maxPcmBufferDurationUs;
private int pcmBufferMultiplicationFactor;
private int passthroughBufferDurationUs;
private int offloadBufferDurationUs;
private int ac3BufferMultiplicationFactor;
/** Creates a new builder. */
public Builder() {
minPcmBufferDurationUs = MIN_PCM_BUFFER_DURATION_US;
maxPcmBufferDurationUs = MAX_PCM_BUFFER_DURATION_US;
pcmBufferMultiplicationFactor = PCM_BUFFER_MULTIPLICATION_FACTOR;
passthroughBufferDurationUs = PASSTHROUGH_BUFFER_DURATION_US;
offloadBufferDurationUs = OFFLOAD_BUFFER_DURATION_US;
ac3BufferMultiplicationFactor = AC3_BUFFER_MULTIPLICATION_FACTOR;
}
/**
* Sets the minimum length for PCM {@link AudioTrack} buffers, in microseconds. Default is
* {@value #MIN_PCM_BUFFER_DURATION_US}.
*/
public Builder setMinPcmBufferDurationUs(int minPcmBufferDurationUs) {
this.minPcmBufferDurationUs = minPcmBufferDurationUs;
return this;
}
/**
* Sets the maximum length for PCM {@link AudioTrack} buffers, in microseconds. Default is
* {@value #MAX_PCM_BUFFER_DURATION_US}.
*/
public Builder setMaxPcmBufferDurationUs(int maxPcmBufferDurationUs) {
this.maxPcmBufferDurationUs = maxPcmBufferDurationUs;
return this;
}
/**
* Sets the multiplication factor to apply to the minimum buffer size requested. Default is
* {@value #PCM_BUFFER_MULTIPLICATION_FACTOR}.
*/
public Builder setPcmBufferMultiplicationFactor(int pcmBufferMultiplicationFactor) {
this.pcmBufferMultiplicationFactor = pcmBufferMultiplicationFactor;
return this;
}
/**
* Sets the length for passthrough {@link AudioTrack} buffers, in microseconds. Default is
* {@value #PASSTHROUGH_BUFFER_DURATION_US}.
*/
public Builder setPassthroughBufferDurationUs(int passthroughBufferDurationUs) {
this.passthroughBufferDurationUs = passthroughBufferDurationUs;
return this;
}
/**
* Sets the length for offload {@link AudioTrack} buffers, in microseconds. Default is {@value
* #OFFLOAD_BUFFER_DURATION_US}.
*/
public Builder setOffloadBufferDurationUs(int offloadBufferDurationUs) {
this.offloadBufferDurationUs = offloadBufferDurationUs;
return this;
}
/**
* Sets the multiplication factor to apply to the passthrough buffer for AC3 to avoid underruns
* on some devices (e.g., Broadcom 7271). Default is {@value #AC3_BUFFER_MULTIPLICATION_FACTOR}.
*/
public Builder setAc3BufferMultiplicationFactor(int ac3BufferMultiplicationFactor) {
this.ac3BufferMultiplicationFactor = ac3BufferMultiplicationFactor;
return this;
}
/** Build the {@link DefaultAudioTrackBufferSizeProvider}. */
public DefaultAudioTrackBufferSizeProvider build() {
return new DefaultAudioTrackBufferSizeProvider(this);
}
}
/** The minimum length for PCM {@link AudioTrack} buffers, in microseconds. */
protected final int minPcmBufferDurationUs;
/** The maximum length for PCM {@link AudioTrack} buffers, in microseconds. */
protected final int maxPcmBufferDurationUs;
/** The multiplication factor to apply to the minimum buffer size requested. */
protected final int pcmBufferMultiplicationFactor;
/** The length for passthrough {@link AudioTrack} buffers, in microseconds. */
protected final int passthroughBufferDurationUs;
/** The length for offload {@link AudioTrack} buffers, in microseconds. */
protected final int offloadBufferDurationUs;
/**
* The multiplication factor to apply to the AC3 passthrough buffer to avoid underruns on some
* (e.g., Broadcom 7271).
*/
public final int ac3BufferMultiplicationFactor;
protected DefaultAudioTrackBufferSizeProvider(Builder builder) {
minPcmBufferDurationUs = builder.minPcmBufferDurationUs;
maxPcmBufferDurationUs = builder.maxPcmBufferDurationUs;
pcmBufferMultiplicationFactor = builder.pcmBufferMultiplicationFactor;
passthroughBufferDurationUs = builder.passthroughBufferDurationUs;
offloadBufferDurationUs = builder.offloadBufferDurationUs;
ac3BufferMultiplicationFactor = builder.ac3BufferMultiplicationFactor;
}
@Override
public int getBufferSizeInBytes(
int minBufferSizeInBytes,
@C.Encoding int encoding,
@OutputMode int outputMode,
int pcmFrameSize,
int sampleRate,
double maxAudioTrackPlaybackSpeed) {
int bufferSize =
get1xBufferSizeInBytes(
minBufferSizeInBytes, encoding, outputMode, pcmFrameSize, sampleRate);
// Maintain the buffer duration by scaling the size accordingly.
bufferSize = (int) (bufferSize * maxAudioTrackPlaybackSpeed);
// Buffer size must not be lower than the AudioTrack min buffer size for this format.
bufferSize = max(minBufferSizeInBytes, bufferSize);
// Increase if needed to make sure the buffer contains an integer number of frames.
return (bufferSize + pcmFrameSize - 1) / pcmFrameSize * pcmFrameSize;
}
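
The last line rounds the scaled size up to a whole number of PCM frames using an integer ceiling-division idiom. A minimal standalone sketch of that idiom (the class and method names are illustrative, not part of ExoPlayer):

```java
public class FrameAlignment {

    // Rounds bufferSize up to the nearest multiple of pcmFrameSize, mirroring
    // the final expression in getBufferSizeInBytes above.
    static int alignToFrameSize(int bufferSize, int pcmFrameSize) {
        return (bufferSize + pcmFrameSize - 1) / pcmFrameSize * pcmFrameSize;
    }

    public static void main(String[] args) {
        // 1001 bytes with 4-byte frames rounds up to 1004 bytes (251 frames).
        System.out.println(alignToFrameSize(1001, 4));
        // An already aligned size is returned unchanged.
        System.out.println(alignToFrameSize(1024, 4));
    }
}
```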
/** Returns the buffer size for playback at 1x speed. */
protected int get1xBufferSizeInBytes(
int minBufferSizeInBytes, int encoding, int outputMode, int pcmFrameSize, int sampleRate) {
switch (outputMode) {
case OUTPUT_MODE_PCM:
return getPcmBufferSizeInBytes(minBufferSizeInBytes, sampleRate, pcmFrameSize);
case OUTPUT_MODE_PASSTHROUGH:
return getPassthroughBufferSizeInBytes(encoding);
case OUTPUT_MODE_OFFLOAD:
return getOffloadBufferSizeInBytes(encoding);
default:
throw new IllegalArgumentException();
}
}
/** Returns the buffer size for PCM playback. */
protected int getPcmBufferSizeInBytes(int minBufferSizeInBytes, int samplingRate, int frameSize) {
int targetBufferSize = minBufferSizeInBytes * pcmBufferMultiplicationFactor;
int minAppBufferSize = durationUsToBytes(minPcmBufferDurationUs, samplingRate, frameSize);
int maxAppBufferSize = durationUsToBytes(maxPcmBufferDurationUs, samplingRate, frameSize);
return constrainValue(targetBufferSize, minAppBufferSize, maxAppBufferSize);
}
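
The method above scales the platform minimum buffer size by the multiplication factor, then clamps the result between the configured minimum and maximum buffer durations converted to bytes. A self-contained sketch of that calculation; the factor and duration bounds below are illustrative stand-ins for the provider's configured fields, not necessarily ExoPlayer's defaults:

```java
public class PcmBufferSizeSketch {
    private static final long MICROS_PER_SECOND = 1_000_000;

    // Illustrative stand-ins for the provider's configured fields.
    private static final int PCM_BUFFER_MULTIPLICATION_FACTOR = 4;
    private static final int MIN_PCM_BUFFER_DURATION_US = 250_000;
    private static final int MAX_PCM_BUFFER_DURATION_US = 750_000;

    // Mirrors durationUsToBytes: widen to long so the product cannot overflow int.
    static int durationUsToBytes(int durationUs, int samplingRate, int frameSize) {
        return (int) ((long) durationUs * samplingRate * frameSize / MICROS_PER_SECOND);
    }

    // Mirrors getPcmBufferSizeInBytes: scale the platform minimum, then clamp
    // to the duration bounds expressed in bytes.
    static int pcmBufferSizeInBytes(int minBufferSizeInBytes, int samplingRate, int frameSize) {
        int target = minBufferSizeInBytes * PCM_BUFFER_MULTIPLICATION_FACTOR;
        int minBytes = durationUsToBytes(MIN_PCM_BUFFER_DURATION_US, samplingRate, frameSize);
        int maxBytes = durationUsToBytes(MAX_PCM_BUFFER_DURATION_US, samplingRate, frameSize);
        return Math.max(minBytes, Math.min(target, maxBytes));
    }

    public static void main(String[] args) {
        // 48 kHz stereo 16-bit PCM has 4-byte frames, so the 250 ms floor is
        // 48_000 bytes; a tiny 1_000-byte platform minimum is raised to it.
        System.out.println(pcmBufferSizeInBytes(1_000, 48_000, 4));
    }
}
```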
/** Returns the buffer size for passthrough playback. */
protected int getPassthroughBufferSizeInBytes(@C.Encoding int encoding) {
int bufferSizeUs = passthroughBufferDurationUs;
if (encoding == C.ENCODING_AC3) {
bufferSizeUs *= ac3BufferMultiplicationFactor;
}
int maxByteRate = getMaximumEncodedRateBytesPerSecond(encoding);
return checkedCast((long) bufferSizeUs * maxByteRate / C.MICROS_PER_SECOND);
}
/** Returns the buffer size for offload playback. */
protected int getOffloadBufferSizeInBytes(@C.Encoding int encoding) {
int maxByteRate = getMaximumEncodedRateBytesPerSecond(encoding);
return checkedCast((long) offloadBufferDurationUs * maxByteRate / C.MICROS_PER_SECOND);
}
protected static int durationUsToBytes(int durationUs, int samplingRate, int frameSize) {
return checkedCast((long) durationUs * samplingRate * frameSize / C.MICROS_PER_SECOND);
}
protected static int getMaximumEncodedRateBytesPerSecond(@C.Encoding int encoding) {
switch (encoding) {
case C.ENCODING_MP3:
return MpegAudioUtil.MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_LC:
return AacUtil.AAC_LC_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_HE_V1:
return AacUtil.AAC_HE_V1_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_HE_V2:
return AacUtil.AAC_HE_V2_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_XHE:
return AacUtil.AAC_XHE_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AAC_ELD:
return AacUtil.AAC_ELD_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AC3:
return Ac3Util.AC3_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_E_AC3:
case C.ENCODING_E_AC3_JOC:
return Ac3Util.E_AC3_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_AC4:
return Ac4Util.MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_DTS:
return DtsUtil.DTS_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_DTS_HD:
return DtsUtil.DTS_HD_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_DOLBY_TRUEHD:
return Ac3Util.TRUEHD_MAX_RATE_BYTES_PER_SECOND;
case C.ENCODING_PCM_16BIT:
case C.ENCODING_PCM_16BIT_BIG_ENDIAN:
case C.ENCODING_PCM_24BIT:
case C.ENCODING_PCM_32BIT:
case C.ENCODING_PCM_8BIT:
case C.ENCODING_PCM_FLOAT:
case C.ENCODING_AAC_ER_BSAC:
case C.ENCODING_INVALID:
case Format.NO_VALUE:
default:
throw new IllegalArgumentException();
}
}
}


@ -119,6 +119,12 @@ public class ForwardingAudioSink implements AudioSink {
sink.setAudioAttributes(audioAttributes);
}
@Override
@Nullable
public AudioAttributes getAudioAttributes() {
return sink.getAudioAttributes();
}
@Override
public void setAudioSessionId(int audioSessionId) {
sink.setAudioSessionId(audioSessionId);


@ -18,6 +18,7 @@ package com.google.android.exoplayer2.audio;
import static com.google.android.exoplayer2.decoder.DecoderReuseEvaluation.DISCARD_REASON_MAX_INPUT_SIZE_EXCEEDED;
import static com.google.android.exoplayer2.decoder.DecoderReuseEvaluation.REUSE_RESULT_NO;
import static com.google.android.exoplayer2.util.Assertions.checkNotNull;
import static com.google.android.exoplayer2.util.Assertions.checkStateNotNull;
import static com.google.common.base.MoreObjects.firstNonNull;
import static java.lang.Math.max;
@ -29,7 +30,9 @@ import android.media.MediaCrypto;
import android.media.MediaFormat;
import android.os.Handler;
import androidx.annotation.CallSuper;
import androidx.annotation.DoNotInline;
import androidx.annotation.Nullable;
import androidx.annotation.RequiresApi;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.ExoPlaybackException;
import com.google.android.exoplayer2.ExoPlayer;
@ -57,8 +60,10 @@ import com.google.android.exoplayer2.util.MediaFormatUtil;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.exoplayer2.util.Util;
import com.google.common.collect.ImmutableList;
import java.lang.reflect.InvocationTargetException;
import java.nio.ByteBuffer;
import java.util.List;
import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
/**
* Decodes and renders audio using {@link MediaCodec} and an {@link AudioSink}.
@ -94,6 +99,7 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
private final Context context;
private final EventDispatcher eventDispatcher;
private final AudioSink audioSink;
private final SpatializationHelper spatializationHelper;
private int codecMaxInputSize;
private boolean codecNeedsDiscardChannelsWorkaround;
@ -249,9 +255,11 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
mediaCodecSelector,
enableDecoderFallback,
/* assumedMinimumCodecOperatingRate= */ 44100);
this.context = context.getApplicationContext();
context = context.getApplicationContext();
this.context = context;
this.audioSink = audioSink;
eventDispatcher = new EventDispatcher(eventHandler, eventListener);
spatializationHelper = new SpatializationHelper(context, audioSink.getAudioAttributes());
audioSink.setListener(new AudioSinkListener());
}
@ -410,6 +418,11 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
return audioSink.supportsFormat(format);
}
@Override
protected boolean shouldReinitCodec() {
return spatializationHelper.shouldReinitCodec();
}
@Override
protected MediaCodecAdapter.Configuration getMediaCodecConfiguration(
MediaCodecInfo codecInfo,
@ -470,7 +483,11 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
@Override
protected void onCodecInitialized(
String name, long initializedTimestampMs, long initializationDurationMs) {
String name,
MediaCodecAdapter.Configuration configuration,
long initializedTimestampMs,
long initializationDurationMs) {
spatializationHelper.onCodecInitialized(configuration);
eventDispatcher.decoderInitialized(name, initializedTimestampMs, initializationDurationMs);
}
@ -561,6 +578,7 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
audioSink.disableTunneling();
}
audioSink.setPlayerId(getPlayerId());
spatializationHelper.enable();
}
@Override
@ -613,6 +631,7 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
audioSinkNeedsReset = false;
audioSink.reset();
}
spatializationHelper.reset();
}
}
@ -737,6 +756,7 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
case MSG_SET_AUDIO_ATTRIBUTES:
AudioAttributes audioAttributes = (AudioAttributes) message;
audioSink.setAudioAttributes(audioAttributes);
spatializationHelper.setAudioAttributes(audioSink.getAudioAttributes());
break;
case MSG_SET_AUX_EFFECT_INFO:
AuxEffectInfo auxEffectInfo = (AuxEffectInfo) message;
@ -848,14 +868,8 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
== AudioSink.SINK_FORMAT_SUPPORTED_DIRECTLY) {
mediaFormat.setInteger(MediaFormat.KEY_PCM_ENCODING, AudioFormat.ENCODING_PCM_FLOAT);
}
spatializationHelper.configureForSpatialization(mediaFormat, format);
if (Util.SDK_INT >= 32) {
// Disable down-mixing in the decoder (for decoders that read the max-output-channel-count
// key).
// TODO[b/190759307]: Update key to use MediaFormat.KEY_MAX_OUTPUT_CHANNEL_COUNT once the
// compile SDK target is set to 32.
mediaFormat.setInteger("max-output-channel-count", 99);
}
return mediaFormat;
}
@ -939,4 +953,163 @@ public class MediaCodecAudioRenderer extends MediaCodecRenderer implements Media
eventDispatcher.audioSinkError(audioSinkError);
}
}
/**
* A helper class that signals whether the codec needs to be re-initialized because spatialization
* properties changed.
*/
private static final class SpatializationHelper implements SpatializerDelegate.Listener {
// TODO[b/190759307] Remove and use MediaFormat.KEY_MAX_OUTPUT_CHANNEL_COUNT once the
// compile SDK target is set to 32.
private static final String KEY_MAX_OUTPUT_CHANNEL_COUNT = "max-output-channel-count";
private static final int SPATIALIZATION_CHANNEL_COUNT = 99;
@Nullable private final SpatializerDelegate spatializerDelegate;
private @MonotonicNonNull Handler handler;
@Nullable private AudioAttributes audioAttributes;
@Nullable private Format inputFormat;
private boolean codecConfiguredForSpatialization;
private boolean codecNeedsReinit;
private boolean listenerAdded;
/** Creates a new instance. */
public SpatializationHelper(Context context, @Nullable AudioAttributes audioAttributes) {
this.spatializerDelegate = maybeCreateSpatializer(context);
this.audioAttributes = audioAttributes;
}
/** Enables this helper. Call this method when the renderer is enabled. */
public void enable() {
maybeAddSpatalizationListener();
}
/** Resets the helper and releases any resources. Call this method when the renderer is reset. */
public void reset() {
maybeRemoveSpatalizationListener();
}
/** Sets the audio attributes set by the player. */
public void setAudioAttributes(@Nullable AudioAttributes audioAttributes) {
if (Util.areEqual(this.audioAttributes, audioAttributes)) {
return;
}
this.audioAttributes = audioAttributes;
updateCodecNeedsReinit();
}
/**
* Sets keys for audio spatialization on the {@code mediaFormat} if the platform can apply
* spatialization to this {@code format}.
*/
public void configureForSpatialization(MediaFormat mediaFormat, Format format) {
if (canBeSpatialized(format)) {
mediaFormat.setInteger(KEY_MAX_OUTPUT_CHANNEL_COUNT, SPATIALIZATION_CHANNEL_COUNT);
}
}
/** Informs the helper that a codec was initialized. */
public void onCodecInitialized(MediaCodecAdapter.Configuration configuration) {
codecNeedsReinit = false;
inputFormat = configuration.format;
codecConfiguredForSpatialization =
configuration.mediaFormat.containsKey(KEY_MAX_OUTPUT_CHANNEL_COUNT)
&& configuration.mediaFormat.getInteger(KEY_MAX_OUTPUT_CHANNEL_COUNT)
== SPATIALIZATION_CHANNEL_COUNT;
}
/**
* Returns whether the codec should be re-initialized, caused by a change in the spatialization
* properties.
*/
public boolean shouldReinitCodec() {
return codecNeedsReinit;
}
// SpatializerDelegate.Listener
@Override
public void onSpatializerEnabledChanged(SpatializerDelegate spatializer, boolean enabled) {
updateCodecNeedsReinit();
}
@Override
public void onSpatializerAvailableChanged(SpatializerDelegate spatializer, boolean available) {
updateCodecNeedsReinit();
}
// Other internal methods
/** Returns whether this format can be spatialized by the platform. */
private boolean canBeSpatialized(@Nullable Format format) {
if (Util.SDK_INT < 32
|| format == null
|| audioAttributes == null
|| spatializerDelegate == null
|| spatializerDelegate.getImmersiveAudioLevel()
!= SpatializerDelegate.SPATIALIZER_IMMERSIVE_LEVEL_MULTICHANNEL
|| !spatializerDelegate.isAvailable()
|| !spatializerDelegate.isEnabled()) {
return false;
}
AudioFormat.Builder audioFormatBuilder =
new AudioFormat.Builder()
.setEncoding(AudioFormat.ENCODING_PCM_16BIT)
.setChannelMask(Util.getAudioTrackChannelConfig(format.channelCount));
if (format.sampleRate != Format.NO_VALUE) {
audioFormatBuilder.setSampleRate(format.sampleRate);
}
return spatializerDelegate.canBeSpatialized(
audioAttributes.getAudioAttributesV21(), audioFormatBuilder.build());
}
private void maybeAddSpatalizationListener() {
if (!listenerAdded && spatializerDelegate != null && Util.SDK_INT >= 32) {
if (handler == null) {
// Route callbacks to the playback thread.
handler = Util.createHandlerForCurrentLooper();
}
spatializerDelegate.addOnSpatializerStateChangedListener(handler::post, this);
listenerAdded = true;
}
}
private void maybeRemoveSpatalizationListener() {
if (listenerAdded && spatializerDelegate != null && Util.SDK_INT >= 32) {
spatializerDelegate.removeOnSpatializerStateChangedListener(this);
checkStateNotNull(handler).removeCallbacksAndMessages(null);
listenerAdded = false;
}
}
private void updateCodecNeedsReinit() {
codecNeedsReinit = codecConfiguredForSpatialization != canBeSpatialized(inputFormat);
}
@Nullable
private static SpatializerDelegate maybeCreateSpatializer(Context context) {
if (Util.SDK_INT >= 32) {
return Api32.createSpatializer(context);
}
return null;
}
}
@RequiresApi(32)
private static final class Api32 {
private Api32() {}
@DoNotInline
@Nullable
public static SpatializerDelegate createSpatializer(Context context) {
try {
return new SpatializerDelegate(context);
} catch (ClassNotFoundException | NoSuchMethodException | IllegalAccessException e) {
// Do nothing for these cases.
} catch (InvocationTargetException e) {
Log.w(TAG, "Failed to load Spatializer with reflection", e);
}
return null;
}
}
}


@ -564,6 +564,14 @@ public abstract class MediaCodecRenderer extends BaseRenderer {
return true;
}
/**
* Returns whether the renderer needs to re-initialize the codec, possibly as a result of a change
* in device capabilities.
*/
protected boolean shouldReinitCodec() {
return false;
}
/**
* Returns whether the codec needs the renderer to propagate the end-of-stream signal directly,
* rather than by using an end-of-stream buffer queued to the codec.
@ -1118,7 +1126,7 @@ public abstract class MediaCodecRenderer extends BaseRenderer {
decoderCounters.decoderInitCount++;
long elapsed = codecInitializedTimestamp - codecInitializingTimestamp;
onCodecInitialized(codecName, codecInitializedTimestamp, elapsed);
onCodecInitialized(codecName, configuration, codecInitializedTimestamp, elapsed);
}
private boolean shouldContinueRendering(long renderStartTimeMs) {
@ -1158,6 +1166,9 @@ public abstract class MediaCodecRenderer extends BaseRenderer {
if (codec == null || codecDrainState == DRAIN_STATE_WAIT_END_OF_STREAM || inputStreamEnded) {
return false;
}
if (codecDrainState == DRAIN_STATE_NONE && shouldReinitCodec()) {
drainAndReinitializeCodec();
}
if (inputIndex < 0) {
inputIndex = codec.dequeueInputBufferIndex();
@ -1352,12 +1363,16 @@ public abstract class MediaCodecRenderer extends BaseRenderer {
* <p>The default implementation is a no-op.
*
* @param name The name of the codec that was initialized.
* @param configuration The {@link MediaCodecAdapter.Configuration} used to configure the codec.
* @param initializedTimestampMs {@link SystemClock#elapsedRealtime()} when initialization
* finished.
* @param initializationDurationMs The time taken to initialize the codec in milliseconds.
*/
protected void onCodecInitialized(
String name, long initializedTimestampMs, long initializationDurationMs) {
String name,
MediaCodecAdapter.Configuration configuration,
long initializedTimestampMs,
long initializationDurationMs) {
// Do nothing.
}


@ -142,7 +142,7 @@ public final class MediaCodecUtil {
return decoderInfos.isEmpty() ? null : decoderInfos.get(0);
}
/*
/**
* Returns all {@link MediaCodecInfo}s for the given mime type, in the order given by {@link
* MediaCodecList}.
*


@ -365,15 +365,14 @@ public final class DefaultMediaSourceFactory implements MediaSourceFactory {
};
mediaSources[i + 1] =
new ProgressiveMediaSource.Factory(dataSourceFactory, extractorsFactory)
.setLoadErrorHandlingPolicy(loadErrorHandlingPolicy)
.createMediaSource(
MediaItem.fromUri(subtitleConfigurations.get(i).uri.toString()));
} else {
SingleSampleMediaSource.Factory singleSampleSourceFactory =
new SingleSampleMediaSource.Factory(dataSourceFactory)
.setLoadErrorHandlingPolicy(loadErrorHandlingPolicy);
mediaSources[i + 1] =
singleSampleSourceFactory.createMediaSource(
subtitleConfigurations.get(i), /* durationUs= */ C.TIME_UNSET);
new SingleSampleMediaSource.Factory(dataSourceFactory)
.setLoadErrorHandlingPolicy(loadErrorHandlingPolicy)
.createMediaSource(subtitleConfigurations.get(i), /* durationUs= */ C.TIME_UNSET);
}
}


@ -66,7 +66,7 @@ import java.util.Arrays;
/** Clears all sample data. */
public void reset() {
clearAllocationNodes(firstAllocationNode);
firstAllocationNode = new AllocationNode(0, allocationLength);
firstAllocationNode.reset(/* startPosition= */ 0, allocationLength);
readAllocationNode = firstAllocationNode;
writeAllocationNode = firstAllocationNode;
totalBytesWritten = 0;
@ -462,9 +462,9 @@ import java.util.Arrays;
private static final class AllocationNode implements Allocator.AllocationNode {
/** The absolute position of the start of the data (inclusive). */
public final long startPosition;
public long startPosition;
/** The absolute position of the end of the data (exclusive). */
public final long endPosition;
public long endPosition;
/**
* The {@link Allocation}, or {@code null} if the node is not {@link #initialize initialized}.
*/
@ -481,6 +481,17 @@ import java.util.Arrays;
* initialized.
*/
public AllocationNode(long startPosition, int allocationLength) {
reset(startPosition, allocationLength);
}
/**
* Sets the {@link #startPosition} and the {@link Allocation} length.
*
* <p>Must only be called for uninitialized instances, where {@link #allocation} is {@code
* null}.
*/
public void reset(long startPosition, int allocationLength) {
Assertions.checkState(allocation == null);
this.startPosition = startPosition;
this.endPosition = startPosition + allocationLength;
}


@ -991,7 +991,6 @@ public final class ServerSideAdInsertionMediaSource extends BaseMediaSource
public ServerSideAdInsertionTimeline(
Timeline contentTimeline, ImmutableMap<Object, AdPlaybackState> adPlaybackStates) {
super(contentTimeline);
checkState(contentTimeline.getPeriodCount() == 1);
checkState(contentTimeline.getWindowCount() == 1);
Period period = new Period();
for (int i = 0; i < contentTimeline.getPeriodCount(); i++) {
@ -1005,25 +1004,23 @@ public final class ServerSideAdInsertionMediaSource extends BaseMediaSource
public Window getWindow(int windowIndex, Window window, long defaultPositionProjectionUs) {
super.getWindow(windowIndex, window, defaultPositionProjectionUs);
Object firstPeriodUid =
checkNotNull(getPeriod(/* periodIndex= */ 0, new Period(), /* setIds= */ true).uid);
AdPlaybackState adPlaybackState = checkNotNull(adPlaybackStates.get(firstPeriodUid));
checkNotNull(getPeriod(window.firstPeriodIndex, new Period(), /* setIds= */ true).uid);
AdPlaybackState firstAdPlaybackState = checkNotNull(adPlaybackStates.get(firstPeriodUid));
long positionInPeriodUs =
getMediaPeriodPositionUsForContent(
window.positionInFirstPeriodUs,
/* nextAdGroupIndex= */ C.INDEX_UNSET,
adPlaybackState);
firstAdPlaybackState);
if (window.durationUs == C.TIME_UNSET) {
if (adPlaybackState.contentDurationUs != C.TIME_UNSET) {
window.durationUs = adPlaybackState.contentDurationUs - positionInPeriodUs;
if (firstAdPlaybackState.contentDurationUs != C.TIME_UNSET) {
window.durationUs = firstAdPlaybackState.contentDurationUs - positionInPeriodUs;
}
} else {
long actualWindowEndPositionInPeriodUs = window.positionInFirstPeriodUs + window.durationUs;
long windowEndPositionInPeriodUs =
getMediaPeriodPositionUsForContent(
actualWindowEndPositionInPeriodUs,
/* nextAdGroupIndex= */ C.INDEX_UNSET,
adPlaybackState);
window.durationUs = windowEndPositionInPeriodUs - positionInPeriodUs;
Period lastPeriod = getPeriod(/* periodIndex= */ window.lastPeriodIndex, new Period());
window.durationUs =
lastPeriod.durationUs == C.TIME_UNSET
? C.TIME_UNSET
: lastPeriod.positionInWindowUs + lastPeriod.durationUs;
}
window.positionInFirstPeriodUs = positionInPeriodUs;
return window;
@ -1041,11 +1038,26 @@ public final class ServerSideAdInsertionMediaSource extends BaseMediaSource
getMediaPeriodPositionUsForContent(
durationUs, /* nextAdGroupIndex= */ C.INDEX_UNSET, adPlaybackState);
}
long positionInWindowUs =
-getMediaPeriodPositionUsForContent(
-period.getPositionInWindowUs(),
/* nextAdGroupIndex= */ C.INDEX_UNSET,
adPlaybackState);
long positionInWindowUs = 0;
Period innerPeriod = new Period();
for (int i = 0; i < periodIndex + 1; i++) {
timeline.getPeriod(/* periodIndex= */ i, innerPeriod, /* setIds= */ true);
AdPlaybackState innerAdPlaybackState = checkNotNull(adPlaybackStates.get(innerPeriod.uid));
if (i == 0) {
positionInWindowUs =
-getMediaPeriodPositionUsForContent(
-innerPeriod.getPositionInWindowUs(),
/* nextAdGroupIndex= */ C.INDEX_UNSET,
innerAdPlaybackState);
}
if (i != periodIndex) {
positionInWindowUs +=
getMediaPeriodPositionUsForContent(
innerPeriod.durationUs,
/* nextAdGroupIndex= */ C.INDEX_UNSET,
innerAdPlaybackState);
}
}
period.set(
period.id,
period.uid,


@ -15,6 +15,7 @@
*/
package com.google.android.exoplayer2.source.ads;
import static com.google.android.exoplayer2.util.Util.sum;
import static java.lang.Math.max;
import androidx.annotation.CheckResult;
@ -33,23 +34,25 @@ public final class ServerSideAdInsertionUtil {
/**
* Adds a new server-side inserted ad group to an {@link AdPlaybackState}.
*
* <p>If the first ad with a non-zero duration is not the first ad in the group, all ads before
* that ad are marked as skipped.
*
* @param adPlaybackState The existing {@link AdPlaybackState}.
* @param fromPositionUs The position in the underlying server-side inserted ads stream at which
* the ad group starts, in microseconds.
* @param toPositionUs The position in the underlying server-side inserted ads stream at which the
* ad group ends, in microseconds.
* @param contentResumeOffsetUs The timestamp offset which should be added to the content stream
* when resuming playback after the ad group. An offset of 0 collapses the ad group to a
* single insertion point, an offset of {@code toPositionUs-fromPositionUs} keeps the original
* stream timestamps after the ad group.
* @param adDurationsUs The durations of the ads to be added to the group, in microseconds.
* @return The updated {@link AdPlaybackState}.
*/
@CheckResult
public static AdPlaybackState addAdGroupToAdPlaybackState(
AdPlaybackState adPlaybackState,
long fromPositionUs,
long toPositionUs,
long contentResumeOffsetUs) {
long contentResumeOffsetUs,
long... adDurationsUs) {
long adGroupInsertionPositionUs =
getMediaPeriodPositionUsForContent(
fromPositionUs, /* nextAdGroupIndex= */ C.INDEX_UNSET, adPlaybackState);
@ -59,39 +62,21 @@ public final class ServerSideAdInsertionUtil {
&& adPlaybackState.getAdGroup(insertionIndex).timeUs <= adGroupInsertionPositionUs) {
insertionIndex++;
}
long adDurationUs = toPositionUs - fromPositionUs;
adPlaybackState =
adPlaybackState
.withNewAdGroup(insertionIndex, adGroupInsertionPositionUs)
.withIsServerSideInserted(insertionIndex, /* isServerSideInserted= */ true)
.withAdCount(insertionIndex, /* adCount= */ 1)
.withAdDurationsUs(insertionIndex, adDurationUs)
.withAdCount(insertionIndex, /* adCount= */ adDurationsUs.length)
.withAdDurationsUs(insertionIndex, adDurationsUs)
.withContentResumeOffsetUs(insertionIndex, contentResumeOffsetUs);
// Mark all ads as skipped that are before the first ad with a non-zero duration.
int adIndex = 0;
while (adIndex < adDurationsUs.length && adDurationsUs[adIndex] == 0) {
adPlaybackState =
adPlaybackState.withSkippedAd(insertionIndex, /* adIndexInAdGroup= */ adIndex++);
}
return correctFollowingAdGroupTimes(
adPlaybackState, insertionIndex, adDurationUs, contentResumeOffsetUs);
}
/**
* Returns the duration of the underlying server-side inserted ads stream for the current {@link
* Timeline.Period} in the {@link Player}.
*
* @param player The {@link Player}.
* @param adPlaybackState The {@link AdPlaybackState} defining the ad groups.
* @return The duration of the underlying server-side inserted ads stream, in microseconds, or
* {@link C#TIME_UNSET} if it can't be determined.
*/
public static long getStreamDurationUs(Player player, AdPlaybackState adPlaybackState) {
Timeline timeline = player.getCurrentTimeline();
if (timeline.isEmpty()) {
return C.TIME_UNSET;
}
Timeline.Period period =
timeline.getPeriod(player.getCurrentPeriodIndex(), new Timeline.Period());
if (period.durationUs == C.TIME_UNSET) {
return C.TIME_UNSET;
}
return getStreamPositionUsForContent(
period.durationUs, /* nextAdGroupIndex= */ C.INDEX_UNSET, adPlaybackState);
adPlaybackState, insertionIndex, sum(adDurationsUs), contentResumeOffsetUs);
}
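
The loop above marks as skipped every ad that precedes the first ad with a non-zero duration, as described in the updated Javadoc. The index arithmetic can be illustrated standalone (names below are illustrative, not ExoPlayer API):

```java
import java.util.ArrayList;
import java.util.List;

public class SkippedAdIndices {

    // Returns the ad indices that addAdGroupToAdPlaybackState would mark as
    // skipped: every ad before the first one with a non-zero duration.
    static List<Integer> skippedAdIndices(long... adDurationsUs) {
        List<Integer> skipped = new ArrayList<>();
        int adIndex = 0;
        while (adIndex < adDurationsUs.length && adDurationsUs[adIndex] == 0) {
            skipped.add(adIndex++);
        }
        return skipped;
    }

    public static void main(String[] args) {
        // The two leading zero-duration ads are skipped; a later zero-duration
        // ad after the first non-zero one is not.
        System.out.println(skippedAdIndices(0, 0, 10_000_000, 0));
    }
}
```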
/**


@ -490,7 +490,6 @@ public class MediaCodecVideoRenderer extends MediaCodecRenderer {
releaseCodec();
}
eventDispatcher.enabled(decoderCounters);
frameReleaseHelper.onEnabled();
mayRenderFirstFrameAfterEnableIfNotStarted = mayRenderStartOfStream;
renderedFirstFrameAfterEnable = false;
}
@ -558,7 +557,6 @@ public class MediaCodecVideoRenderer extends MediaCodecRenderer {
clearReportedVideoSize();
clearRenderedFirstFrame();
haveReportedFirstFrameRenderedForCurrentSurface = false;
frameReleaseHelper.onDisabled();
tunnelingOnFrameRenderedListener = null;
try {
super.onDisabled();
@ -770,7 +768,10 @@ public class MediaCodecVideoRenderer extends MediaCodecRenderer {
@Override
protected void onCodecInitialized(
String name, long initializedTimestampMs, long initializationDurationMs) {
String name,
MediaCodecAdapter.Configuration configuration,
long initializedTimestampMs,
long initializationDurationMs) {
eventDispatcher.decoderInitialized(name, initializedTimestampMs, initializationDurationMs);
codecNeedsSetOutputSurfaceWorkaround = codecNeedsSetOutputSurfaceWorkaround(name);
codecHandlesHdr10PlusOutOfBandMetadata =


@ -149,18 +149,14 @@ public final class VideoFrameReleaseHelper {
updateSurfacePlaybackFrameRate(/* forceUpdate= */ true);
}
/** Called when the renderer is enabled. */
public void onEnabled() {
if (displayHelper != null) {
checkNotNull(vsyncSampler).addObserver();
displayHelper.register(this::updateDefaultDisplayRefreshRateParams);
}
}
/** Called when the renderer is started. */
public void onStarted() {
started = true;
resetAdjustment();
if (displayHelper != null) {
checkNotNull(vsyncSampler).addObserver();
displayHelper.register(this::updateDefaultDisplayRefreshRateParams);
}
updateSurfacePlaybackFrameRate(/* forceUpdate= */ false);
}
@ -227,15 +223,11 @@ public final class VideoFrameReleaseHelper {
/** Called when the renderer is stopped. */
public void onStopped() {
started = false;
clearSurfaceFrameRate();
}
/** Called when the renderer is disabled. */
public void onDisabled() {
if (displayHelper != null) {
displayHelper.unregister();
checkNotNull(vsyncSampler).removeObserver();
}
clearSurfaceFrameRate();
}
// Frame release time adjustment.


@ -9324,6 +9324,85 @@ public final class ExoPlayerTest {
.onPlaybackParametersChanged(new PlaybackParameters(/* speed= */ 2, /* pitch= */ 2));
}
@Test
public void setPlaybackSpeed_withAdPlayback_onlyAppliesToContent() throws Exception {
// Create renderer with media clock to listen to playback parameter changes.
ArrayList<PlaybackParameters> playbackParameters = new ArrayList<>();
FakeMediaClockRenderer audioRenderer =
new FakeMediaClockRenderer(C.TRACK_TYPE_AUDIO) {
private long positionUs;
@Override
protected void onStreamChanged(Format[] formats, long startPositionUs, long offsetUs) {
this.positionUs = offsetUs;
}
@Override
public long getPositionUs() {
// Continuously increase position to let playback progress.
positionUs += 10_000;
return positionUs;
}
@Override
public void setPlaybackParameters(PlaybackParameters parameters) {
playbackParameters.add(parameters);
}
@Override
public PlaybackParameters getPlaybackParameters() {
return playbackParameters.isEmpty()
? PlaybackParameters.DEFAULT
: Iterables.getLast(playbackParameters);
}
};
ExoPlayer player = new TestExoPlayerBuilder(context).setRenderers(audioRenderer).build();
AdPlaybackState adPlaybackState =
FakeTimeline.createAdPlaybackState(
/* adsPerAdGroup= */ 1,
/* adGroupTimesUs...= */ 0,
7 * C.MICROS_PER_SECOND,
C.TIME_END_OF_SOURCE);
TimelineWindowDefinition adTimelineDefinition =
new TimelineWindowDefinition(
/* periodCount= */ 1,
/* id= */ 0,
/* isSeekable= */ true,
/* isDynamic= */ false,
/* isLive= */ false,
/* isPlaceholder= */ false,
/* durationUs= */ 10 * C.MICROS_PER_SECOND,
/* defaultPositionUs= */ 0,
/* windowOffsetInFirstPeriodUs= */ 0,
adPlaybackState);
player.setMediaSource(
new FakeMediaSource(
new FakeTimeline(adTimelineDefinition), ExoPlayerTestRunner.AUDIO_FORMAT));
Player.Listener mockListener = mock(Player.Listener.class);
player.addListener(mockListener);
player.setPlaybackSpeed(5f);
player.prepare();
player.play();
runUntilPlaybackState(player, Player.STATE_ENDED);
player.release();
// Assert that the renderer received the playback speed updates at each ad/content boundary.
assertThat(playbackParameters)
.containsExactly(
/* preroll ad */ new PlaybackParameters(1f),
/* content after preroll */ new PlaybackParameters(5f),
/* midroll ad */ new PlaybackParameters(1f),
/* content after midroll */ new PlaybackParameters(5f),
/* postroll ad */ new PlaybackParameters(1f),
/* content after postroll */ new PlaybackParameters(5f))
.inOrder();
// Assert that user-set speed was reported, but none of the ad overrides.
verify(mockListener).onPlaybackParametersChanged(any());
verify(mockListener).onPlaybackParametersChanged(new PlaybackParameters(5.0f));
}
@Test
public void targetLiveOffsetInMedia_withSetPlaybackParameters_usesPlaybackParameterSpeed()
throws Exception {


@ -22,7 +22,9 @@ import static org.robolectric.Shadows.shadowOf;
import android.net.Uri;
import android.os.Handler;
import android.os.Looper;
import androidx.test.core.app.ApplicationProvider;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.android.exoplayer2.analytics.AnalyticsCollector;
import com.google.android.exoplayer2.analytics.PlayerId;
import com.google.android.exoplayer2.source.MediaSource.MediaPeriodId;
import com.google.android.exoplayer2.source.MediaSource.MediaSourceCaller;
@ -37,6 +39,7 @@ import com.google.android.exoplayer2.trackselection.ExoTrackSelection;
import com.google.android.exoplayer2.trackselection.TrackSelector;
import com.google.android.exoplayer2.trackselection.TrackSelectorResult;
import com.google.android.exoplayer2.upstream.Allocator;
import com.google.android.exoplayer2.util.Clock;
import com.google.common.collect.ImmutableList;
import org.junit.Before;
import org.junit.Test;
@ -74,12 +77,16 @@ public final class MediaPeriodQueueTest {
@Before
public void setUp() {
AnalyticsCollector analyticsCollector = new AnalyticsCollector(Clock.DEFAULT);
analyticsCollector.setPlayer(
new ExoPlayer.Builder(ApplicationProvider.getApplicationContext()).build(),
Looper.getMainLooper());
mediaPeriodQueue =
new MediaPeriodQueue(/* analyticsCollector= */ null, new Handler(Looper.getMainLooper()));
new MediaPeriodQueue(analyticsCollector, new Handler(Looper.getMainLooper()));
mediaSourceList =
new MediaSourceList(
mock(MediaSourceList.MediaSourceListInfoRefreshListener.class),
/* analyticsCollector= */ null,
analyticsCollector,
new Handler(Looper.getMainLooper()),
PlayerId.UNSET);
rendererCapabilities = new RendererCapabilities[0];


@ -25,12 +25,16 @@ import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
import android.os.Looper;
import androidx.test.core.app.ApplicationProvider;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.android.exoplayer2.analytics.AnalyticsCollector;
import com.google.android.exoplayer2.analytics.PlayerId;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.source.ShuffleOrder;
import com.google.android.exoplayer2.testutil.FakeMediaSource;
import com.google.android.exoplayer2.testutil.FakeShuffleOrder;
import com.google.android.exoplayer2.util.Clock;
import com.google.android.exoplayer2.util.Util;
import java.util.ArrayList;
import java.util.Collections;
@@ -51,10 +55,14 @@ public class MediaSourceListTest {
@Before
public void setUp() {
AnalyticsCollector analyticsCollector = new AnalyticsCollector(Clock.DEFAULT);
analyticsCollector.setPlayer(
new ExoPlayer.Builder(ApplicationProvider.getApplicationContext()).build(),
Looper.getMainLooper());
mediaSourceList =
new MediaSourceList(
mock(MediaSourceList.MediaSourceListInfoRefreshListener.class),
/* analyticsCollector= */ null,
analyticsCollector,
Util.createHandlerForCurrentOrMainLooper(),
PlayerId.UNSET);
}


@@ -743,6 +743,126 @@ public final class DefaultPlaybackSessionManagerTest {
verify(mockListener, never()).onSessionFinished(any(), anyString(), anyBoolean());
}
@Test
public void timelineUpdate_toNewMediaWithWindowIndexOnly_finishesOtherSessions() {
Timeline firstTimeline =
new FakeTimeline(
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 1000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 2000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 3000));
EventTime eventTimeFirstTimelineWithPeriodId =
createEventTime(
firstTimeline,
/* windowIndex= */ 0,
new MediaPeriodId(
firstTimeline.getUidOfPeriod(/* periodIndex= */ 0), /* windowSequenceNumber= */ 0));
EventTime eventTimeFirstTimelineWindowOnly1 =
createEventTime(firstTimeline, /* windowIndex= */ 1, /* mediaPeriodId= */ null);
EventTime eventTimeFirstTimelineWindowOnly2 =
createEventTime(firstTimeline, /* windowIndex= */ 2, /* mediaPeriodId= */ null);
Timeline secondTimeline =
new FakeTimeline(
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 2000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 4000));
EventTime eventTimeSecondTimeline =
createEventTime(secondTimeline, /* windowIndex= */ 0, /* mediaPeriodId= */ null);
sessionManager.updateSessionsWithTimelineChange(eventTimeFirstTimelineWithPeriodId);
sessionManager.updateSessions(eventTimeFirstTimelineWindowOnly1);
sessionManager.updateSessions(eventTimeFirstTimelineWindowOnly2);
sessionManager.updateSessionsWithTimelineChange(eventTimeSecondTimeline);
InOrder inOrder = inOrder(mockListener);
ArgumentCaptor<String> firstId = ArgumentCaptor.forClass(String.class);
inOrder
.verify(mockListener)
.onSessionCreated(eq(eventTimeFirstTimelineWithPeriodId), firstId.capture());
inOrder
.verify(mockListener)
.onSessionActive(eventTimeFirstTimelineWithPeriodId, firstId.getValue());
ArgumentCaptor<String> secondId = ArgumentCaptor.forClass(String.class);
inOrder
.verify(mockListener)
.onSessionCreated(eq(eventTimeFirstTimelineWindowOnly1), secondId.capture());
ArgumentCaptor<String> thirdId = ArgumentCaptor.forClass(String.class);
inOrder
.verify(mockListener)
.onSessionCreated(eq(eventTimeFirstTimelineWindowOnly2), thirdId.capture());
// The sessions may finish at the same time, so the order of these two callbacks is undefined.
ArgumentCaptor<String> finishedSessions = ArgumentCaptor.forClass(String.class);
inOrder
.verify(mockListener, times(2))
.onSessionFinished(
eq(eventTimeSecondTimeline),
finishedSessions.capture(),
/* automaticTransitionToNextPlayback= */ eq(false));
assertThat(finishedSessions.getAllValues())
.containsExactly(firstId.getValue(), thirdId.getValue());
inOrder.verify(mockListener).onSessionActive(eventTimeSecondTimeline, secondId.getValue());
inOrder.verifyNoMoreInteractions();
}
@Test
public void timelineUpdate_toNewMediaWithMediaPeriodId_finishesOtherSessions() {
Timeline firstTimeline =
new FakeTimeline(
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 1000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 2000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 3000));
EventTime eventTimeFirstTimeline1 =
createEventTime(
firstTimeline,
/* windowIndex= */ 0,
new MediaPeriodId(
firstTimeline.getUidOfPeriod(/* periodIndex= */ 0), /* windowSequenceNumber= */ 0));
EventTime eventTimeFirstTimeline2 =
createEventTime(
firstTimeline,
/* windowIndex= */ 1,
new MediaPeriodId(
firstTimeline.getUidOfPeriod(/* periodIndex= */ 1), /* windowSequenceNumber= */ 1));
EventTime eventTimeFirstTimeline3 =
createEventTime(
firstTimeline,
/* windowIndex= */ 2,
new MediaPeriodId(
firstTimeline.getUidOfPeriod(/* periodIndex= */ 2), /* windowSequenceNumber= */ 2));
Timeline secondTimeline =
new FakeTimeline(
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 2000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 1000),
new TimelineWindowDefinition(/* periodCount= */ 1, /* id= */ 3000));
EventTime eventTimeSecondTimeline =
createEventTime(
secondTimeline,
/* windowIndex= */ 0,
new MediaPeriodId(
secondTimeline.getUidOfPeriod(/* periodIndex= */ 0),
/* windowSequenceNumber= */ 1));
sessionManager.updateSessionsWithTimelineChange(eventTimeFirstTimeline1);
sessionManager.updateSessions(eventTimeFirstTimeline2);
sessionManager.updateSessions(eventTimeFirstTimeline3);
sessionManager.updateSessionsWithTimelineChange(eventTimeSecondTimeline);
InOrder inOrder = inOrder(mockListener);
ArgumentCaptor<String> firstId = ArgumentCaptor.forClass(String.class);
inOrder.verify(mockListener).onSessionCreated(eq(eventTimeFirstTimeline1), firstId.capture());
inOrder.verify(mockListener).onSessionActive(eventTimeFirstTimeline1, firstId.getValue());
ArgumentCaptor<String> secondId = ArgumentCaptor.forClass(String.class);
inOrder.verify(mockListener).onSessionCreated(eq(eventTimeFirstTimeline2), secondId.capture());
ArgumentCaptor<String> thirdId = ArgumentCaptor.forClass(String.class);
inOrder.verify(mockListener).onSessionCreated(eq(eventTimeFirstTimeline3), thirdId.capture());
inOrder
.verify(mockListener)
.onSessionFinished(
eventTimeSecondTimeline,
firstId.getValue(),
/* automaticTransitionToNextPlayback= */ false);
inOrder.verify(mockListener).onSessionActive(eventTimeSecondTimeline, secondId.getValue());
inOrder.verifyNoMoreInteractions();
}
@Test
public void positionDiscontinuity_withinWindow_doesNotFinishSession() {
Timeline timeline =


@@ -0,0 +1,270 @@
/*
* Copyright (C) 2022 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.audio;
import static com.google.android.exoplayer2.audio.DefaultAudioSink.OUTPUT_MODE_PASSTHROUGH;
import static com.google.android.exoplayer2.audio.DefaultAudioSink.OUTPUT_MODE_PCM;
import static com.google.android.exoplayer2.audio.DefaultAudioTrackBufferSizeProvider.getMaximumEncodedRateBytesPerSecond;
import static com.google.common.truth.Truth.assertThat;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.util.Util;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
import java.util.List;
import java.util.stream.Collectors;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.junit.runners.Parameterized;
/** Tests for {@link DefaultAudioTrackBufferSizeProvider}. */
@RunWith(JUnit4.class)
public class DefaultAudioTrackBufferSizeProviderTest {
private static final DefaultAudioTrackBufferSizeProvider DEFAULT =
new DefaultAudioTrackBufferSizeProvider.Builder().build();
/** Tests for {@link DefaultAudioTrackBufferSizeProvider} for PCM audio. */
@RunWith(Parameterized.class)
public static class PcmTest {
@Parameterized.Parameter(0)
@C.PcmEncoding
public int encoding;
@Parameterized.Parameter(1)
public int channelCount;
@Parameterized.Parameter(2)
public int sampleRate;
@Parameterized.Parameters(name = "{index}: encoding={0}, channelCount={1}, sampleRate={2}")
public static List<Integer[]> data() {
return Sets.cartesianProduct(
ImmutableList.of(
/* encoding */ ImmutableSet.of(
C.ENCODING_PCM_8BIT,
C.ENCODING_PCM_16BIT,
C.ENCODING_PCM_16BIT_BIG_ENDIAN,
C.ENCODING_PCM_24BIT,
C.ENCODING_PCM_32BIT,
C.ENCODING_PCM_FLOAT),
/* channelCount */ ImmutableSet.of(1, 2, 3, 4, 6, 8),
/* sampleRate */ ImmutableSet.of(
8000, 11025, 16000, 22050, 44100, 48000, 88200, 96000)))
.stream()
.map(s -> s.toArray(new Integer[0]))
.collect(Collectors.toList());
}
private int getPcmFrameSize() {
return Util.getPcmFrameSize(encoding, channelCount);
}
private int durationUsToBytes(int durationUs) {
return (int) (((long) durationUs * getPcmFrameSize() * sampleRate) / C.MICROS_PER_SECOND);
}
@Test
public void getBufferSizeInBytes_veryBigMinBufferSize_isMinBufferSize() {
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ 123456789,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize).isEqualTo(123456789);
}
@Test
public void getBufferSizeInBytes_noMinBufferSize_isMinBufferDuration() {
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ 0,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize).isEqualTo(durationUsToBytes(DEFAULT.minPcmBufferDurationUs));
assertThat(bufferSize % getPcmFrameSize()).isEqualTo(0);
}
@Test
public void getBufferSizeInBytes_tooSmallMinBufferSize_isMinBufferDuration() {
int minBufferSizeInBytes =
durationUsToBytes(DEFAULT.minPcmBufferDurationUs / DEFAULT.pcmBufferMultiplicationFactor)
- 1;
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ minBufferSizeInBytes,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize).isEqualTo(durationUsToBytes(DEFAULT.minPcmBufferDurationUs));
}
@Test
public void getBufferSizeInBytes_lowMinBufferSize_multipliesAudioTrackMinBuffer() {
int minBufferSizeInBytes =
durationUsToBytes(DEFAULT.minPcmBufferDurationUs / DEFAULT.pcmBufferMultiplicationFactor)
+ 1;
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ minBufferSizeInBytes,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize)
.isEqualTo(minBufferSizeInBytes * DEFAULT.pcmBufferMultiplicationFactor);
}
@Test
public void getBufferSizeInBytes_highMinBufferSize_multipliesAudioTrackMinBuffer() {
int minBufferSizeInBytes =
durationUsToBytes(DEFAULT.maxPcmBufferDurationUs / DEFAULT.pcmBufferMultiplicationFactor)
- 1;
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ minBufferSizeInBytes,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize)
.isEqualTo(minBufferSizeInBytes * DEFAULT.pcmBufferMultiplicationFactor);
}
@Test
public void getBufferSizeInBytes_tooHighMinBufferSize_isMaxBufferDuration() {
int minBufferSizeInBytes =
durationUsToBytes(DEFAULT.maxPcmBufferDurationUs / DEFAULT.pcmBufferMultiplicationFactor)
+ 1;
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ minBufferSizeInBytes,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize).isEqualTo(durationUsToBytes(DEFAULT.maxPcmBufferDurationUs));
assertThat(bufferSize % getPcmFrameSize()).isEqualTo(0);
}
@Test
public void getBufferSizeInBytes_lowPlaybackSpeed_isScaledByPlaybackSpeed() {
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ 0,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 1 / 5F);
assertThat(bufferSize).isEqualTo(durationUsToBytes(DEFAULT.minPcmBufferDurationUs / 5));
}
@Test
public void getBufferSizeInBytes_highPlaybackSpeed_isScaledByPlaybackSpeed() {
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ 0,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PCM,
/* pcmFrameSize= */ getPcmFrameSize(),
/* sampleRate= */ sampleRate,
/* maxAudioTrackPlaybackSpeed= */ 8F);
assertThat(bufferSize).isEqualTo(durationUsToBytes(DEFAULT.minPcmBufferDurationUs * 8));
}
}
/**
* Tests for {@link DefaultAudioTrackBufferSizeProvider} for encoded audio except {@link
* C#ENCODING_AC3}.
*/
@RunWith(Parameterized.class)
public static class EncodedTest {
@Parameterized.Parameter(0)
@C.Encoding
public int encoding;
@Parameterized.Parameters(name = "{index}: encoding={0}")
public static ImmutableList<Integer> data() {
return ImmutableList.of(
C.ENCODING_MP3,
C.ENCODING_AAC_LC,
C.ENCODING_AAC_HE_V1,
C.ENCODING_AC4,
C.ENCODING_DTS,
C.ENCODING_DOLBY_TRUEHD);
}
@Test
public void getBufferSizeInBytes_veryBigMinBufferSize_isMinBufferSize() {
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ 123456789,
/* encoding= */ encoding,
/* outputMode= */ OUTPUT_MODE_PASSTHROUGH,
/* pcmFrameSize= */ 1,
/* sampleRate= */ 0,
/* maxAudioTrackPlaybackSpeed= */ 0);
assertThat(bufferSize).isEqualTo(123456789);
}
}
@Test
public void
getBufferSizeInBytes_passthroughAC3_isPassthroughBufferSizeTimesMultiplicationFactor() {
int bufferSize =
DEFAULT.getBufferSizeInBytes(
/* minBufferSizeInBytes= */ 0,
/* encoding= */ C.ENCODING_AC3,
/* outputMode= */ OUTPUT_MODE_PASSTHROUGH,
/* pcmFrameSize= */ 1,
/* sampleRate= */ 0,
/* maxAudioTrackPlaybackSpeed= */ 1);
assertThat(bufferSize)
.isEqualTo(
durationUsToAc3MaxBytes(DEFAULT.passthroughBufferDurationUs)
* DEFAULT.ac3BufferMultiplicationFactor);
}
private static int durationUsToAc3MaxBytes(long durationUs) {
return (int)
(durationUs * getMaximumEncodedRateBytesPerSecond(C.ENCODING_AC3) / C.MICROS_PER_SECOND);
}
}
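The PCM buffer sizing behaviour these tests pin down can be sketched as follows. This is an illustrative model only; the constant values and function name are assumptions for the sketch, not ExoPlayer's actual `DefaultAudioTrackBufferSizeProvider` code:

```python
def pcm_buffer_size(min_buffer_bytes, frame_size, sample_rate, speed=1.0,
                    min_dur_us=250_000, max_dur_us=750_000, factor=4):
    """Sketch of the tested behaviour; the duration constants and the
    multiplication factor are hypothetical."""
    def dur_to_bytes(dur_us):
        # Same conversion as durationUsToBytes in the test above.
        return dur_us * frame_size * sample_rate // 1_000_000

    # Multiply the platform minimum, clamped to a [min, max] buffer duration.
    target = min(max(min_buffer_bytes * factor, dur_to_bytes(min_dur_us)),
                 dur_to_bytes(max_dur_us))
    # Scale with playback speed and round down to whole PCM frames.
    size = int(target * speed)
    size -= size % frame_size
    # Never return less than the platform minimum.
    return max(min_buffer_bytes, size)
```

With these assumed constants, a zero platform minimum yields the minimum buffer duration in bytes, a very large minimum is returned unchanged, and the result scales linearly with playback speed, matching the shape of the assertions above.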


@@ -48,7 +48,9 @@ public final class MkvPlaybackTest {
"sample_with_ssa_subtitles.mkv",
"sample_with_null_terminated_ssa_subtitles.mkv",
"sample_with_srt.mkv",
"sample_with_null_terminated_srt.mkv");
"sample_with_null_terminated_srt.mkv",
"sample_with_vtt_subtitles.mkv",
"sample_with_null_terminated_vtt_subtitles.mkv");
}
@ParameterizedRobolectricTestRunner.Parameter public String inputFile;


@@ -182,20 +182,20 @@ public final class ServerSideAdInsertionMediaSourceTest {
addAdGroupToAdPlaybackState(
adPlaybackState,
/* fromPositionUs= */ 0,
/* toPositionUs= */ 200_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 200_000);
adPlaybackState =
addAdGroupToAdPlaybackState(
adPlaybackState,
/* fromPositionUs= */ 400_000,
/* toPositionUs= */ 700_000,
/* contentResumeOffsetUs= */ 1_000_000);
/* contentResumeOffsetUs= */ 1_000_000,
/* adDurationsUs...= */ 300_000);
AdPlaybackState firstAdPlaybackState =
addAdGroupToAdPlaybackState(
adPlaybackState,
/* fromPositionUs= */ 900_000,
/* toPositionUs= */ 1_000_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 100_000);
AtomicReference<ServerSideAdInsertionMediaSource> mediaSourceRef = new AtomicReference<>();
mediaSourceRef.set(
@@ -252,8 +252,8 @@ public final class ServerSideAdInsertionMediaSourceTest {
addAdGroupToAdPlaybackState(
new AdPlaybackState(/* adsId= */ new Object()),
/* fromPositionUs= */ 900_000,
/* toPositionUs= */ 1_000_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 100_000);
AtomicReference<ServerSideAdInsertionMediaSource> mediaSourceRef = new AtomicReference<>();
mediaSourceRef.set(
new ServerSideAdInsertionMediaSource(
@@ -280,8 +280,8 @@ public final class ServerSideAdInsertionMediaSourceTest {
addAdGroupToAdPlaybackState(
firstAdPlaybackState,
/* fromPositionUs= */ 0,
/* toPositionUs= */ 500_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 500_000);
mediaSourceRef
.get()
.setAdPlaybackStates(ImmutableMap.of(periodUid.get(), secondAdPlaybackState));
@@ -323,8 +323,8 @@ public final class ServerSideAdInsertionMediaSourceTest {
addAdGroupToAdPlaybackState(
new AdPlaybackState(/* adsId= */ new Object()),
/* fromPositionUs= */ 0,
/* toPositionUs= */ 500_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 500_000);
AtomicReference<ServerSideAdInsertionMediaSource> mediaSourceRef = new AtomicReference<>();
mediaSourceRef.set(
new ServerSideAdInsertionMediaSource(
@@ -391,20 +391,20 @@ public final class ServerSideAdInsertionMediaSourceTest {
addAdGroupToAdPlaybackState(
adPlaybackState,
/* fromPositionUs= */ 0,
/* toPositionUs= */ 100_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 100_000);
adPlaybackState =
addAdGroupToAdPlaybackState(
adPlaybackState,
/* fromPositionUs= */ 600_000,
/* toPositionUs= */ 700_000,
/* contentResumeOffsetUs= */ 1_000_000);
/* contentResumeOffsetUs= */ 1_000_000,
/* adDurationsUs...= */ 100_000);
AdPlaybackState firstAdPlaybackState =
addAdGroupToAdPlaybackState(
adPlaybackState,
/* fromPositionUs= */ 900_000,
/* toPositionUs= */ 1_000_000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 100_000);
AtomicReference<ServerSideAdInsertionMediaSource> mediaSourceRef = new AtomicReference<>();
mediaSourceRef.set(
@@ -427,7 +427,7 @@ public final class ServerSideAdInsertionMediaSourceTest {
player.setMediaSource(mediaSourceRef.get());
player.prepare();
// Play to the first content part, then seek past the midroll.
playUntilPosition(player, /* windowIndex= */ 0, /* positionMs= */ 100);
playUntilPosition(player, /* mediaItemIndex= */ 0, /* positionMs= */ 100);
player.seekTo(/* positionMs= */ 1_600);
runUntilPendingCommandsAreFullyHandled(player);
long positionAfterSeekMs = player.getCurrentPosition();


@@ -22,6 +22,7 @@ import static com.google.android.exoplayer2.source.ads.ServerSideAdInsertionUtil
import static com.google.android.exoplayer2.source.ads.ServerSideAdInsertionUtil.getStreamPositionUsForAd;
import static com.google.android.exoplayer2.source.ads.ServerSideAdInsertionUtil.getStreamPositionUsForContent;
import static com.google.common.truth.Truth.assertThat;
import static java.util.Arrays.stream;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.google.android.exoplayer2.C;
@@ -46,8 +47,8 @@ public final class ServerSideAdInsertionUtilTest {
addAdGroupToAdPlaybackState(
state,
/* fromPositionUs= */ 4300,
/* toPositionUs= */ 4500,
/* contentResumeOffsetUs= */ 400);
/* contentResumeOffsetUs= */ 400,
/* adDurationsUs...= */ 200);
assertThat(state)
.isEqualTo(
@@ -64,8 +65,8 @@ public final class ServerSideAdInsertionUtilTest {
addAdGroupToAdPlaybackState(
state,
/* fromPositionUs= */ 2100,
/* toPositionUs= */ 2400,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 300);
assertThat(state)
.isEqualTo(
@@ -86,8 +87,8 @@ public final class ServerSideAdInsertionUtilTest {
addAdGroupToAdPlaybackState(
state,
/* fromPositionUs= */ 0,
/* toPositionUs= */ 100,
/* contentResumeOffsetUs= */ 50);
/* contentResumeOffsetUs= */ 50,
/* adDurationsUs...= */ 100);
assertThat(state)
.isEqualTo(
@@ -112,8 +113,8 @@ public final class ServerSideAdInsertionUtilTest {
addAdGroupToAdPlaybackState(
state,
/* fromPositionUs= */ 5000,
/* toPositionUs= */ 6000,
/* contentResumeOffsetUs= */ 0);
/* contentResumeOffsetUs= */ 0,
/* adDurationsUs...= */ 1000);
assertThat(state)
.isEqualTo(
@@ -143,6 +144,33 @@ public final class ServerSideAdInsertionUtilTest {
.withAdDurationsUs(/* adGroupIndex= */ 5, /* adDurationsUs...= */ 1000));
}
@Test
public void addAdGroupToAdPlaybackState_emptyLeadingAds_markedAsSkipped() {
AdPlaybackState state = new AdPlaybackState(ADS_ID);
state =
addAdGroupToAdPlaybackState(
state,
/* fromPositionUs= */ 0,
/* contentResumeOffsetUs= */ 50_000,
/* adDurationsUs...= */ 0,
0,
10_000,
40_000,
0);
AdPlaybackState.AdGroup adGroup = state.getAdGroup(/* adGroupIndex= */ 0);
assertThat(adGroup.durationsUs[0]).isEqualTo(0);
assertThat(adGroup.states[0]).isEqualTo(AdPlaybackState.AD_STATE_SKIPPED);
assertThat(adGroup.durationsUs[1]).isEqualTo(0);
assertThat(adGroup.states[1]).isEqualTo(AdPlaybackState.AD_STATE_SKIPPED);
assertThat(adGroup.durationsUs[2]).isEqualTo(10_000);
assertThat(adGroup.states[2]).isEqualTo(AdPlaybackState.AD_STATE_UNAVAILABLE);
assertThat(adGroup.durationsUs[4]).isEqualTo(0);
assertThat(adGroup.states[4]).isEqualTo(AdPlaybackState.AD_STATE_UNAVAILABLE);
assertThat(stream(adGroup.durationsUs).sum()).isEqualTo(50_000);
}
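The "empty leading ads are skipped" behaviour asserted above can be sketched with a small helper. The state constants and function name here are illustrative stand-ins, not the `AdPlaybackState` API:

```python
AD_STATE_UNAVAILABLE = 0  # illustrative values, not ExoPlayer's constants
AD_STATE_SKIPPED = 2

def mark_empty_leading_ads(durations_us):
    """Zero-duration ads before the first real ad count as already skipped;
    everything from the first non-zero ad onwards stays unavailable."""
    states = []
    seen_real_ad = False
    for duration_us in durations_us:
        if duration_us > 0:
            seen_real_ad = True
        states.append(AD_STATE_UNAVAILABLE if seen_real_ad else AD_STATE_SKIPPED)
    return states
```

For the durations in the test, `[0, 0, 10_000, 40_000, 0]`, the first two ads come out skipped and the rest unavailable, including the trailing zero-duration ad.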
@Test
public void getStreamPositionUsForAd_returnsCorrectPositions() {
// stream: 0-- ad1 --200-- content --2100-- ad2 --2300-- content --4300-- ad3 --4500-- content


@@ -20,6 +20,10 @@ import static com.google.android.exoplayer2.C.FORMAT_HANDLED;
import static com.google.android.exoplayer2.C.FORMAT_UNSUPPORTED_SUBTYPE;
import static com.google.android.exoplayer2.C.FORMAT_UNSUPPORTED_TYPE;
import static com.google.android.exoplayer2.RendererCapabilities.ADAPTIVE_NOT_SEAMLESS;
import static com.google.android.exoplayer2.RendererCapabilities.DECODER_SUPPORT_FALLBACK;
import static com.google.android.exoplayer2.RendererCapabilities.DECODER_SUPPORT_PRIMARY;
import static com.google.android.exoplayer2.RendererCapabilities.HARDWARE_ACCELERATION_NOT_SUPPORTED;
import static com.google.android.exoplayer2.RendererCapabilities.HARDWARE_ACCELERATION_SUPPORTED;
import static com.google.android.exoplayer2.RendererCapabilities.TUNNELING_NOT_SUPPORTED;
import static com.google.android.exoplayer2.RendererConfiguration.DEFAULT;
import static com.google.common.truth.Truth.assertThat;
@@ -37,6 +41,7 @@ import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.ExoPlaybackException;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.RendererCapabilities;
import com.google.android.exoplayer2.RendererCapabilities.Capabilities;
import com.google.android.exoplayer2.RendererConfiguration;
import com.google.android.exoplayer2.Timeline;
import com.google.android.exoplayer2.TracksInfo;
@@ -1623,6 +1628,122 @@ public final class DefaultTrackSelectorTest {
assertNoSelection(result.selections[0]);
}
@Test
public void selectTracksWithMultipleAudioTracksWithMixedDecoderSupportLevels() throws Exception {
Format.Builder formatBuilder = AUDIO_FORMAT.buildUpon();
Format format0 = formatBuilder.setId("0").setAverageBitrate(200).build();
Format format1 = formatBuilder.setId("1").setAverageBitrate(400).build();
Format format2 = formatBuilder.setId("2").setAverageBitrate(600).build();
Format format3 = formatBuilder.setId("3").setAverageBitrate(800).build();
TrackGroupArray trackGroups = singleTrackGroup(format0, format1, format2, format3);
@Capabilities int unsupported = RendererCapabilities.create(FORMAT_UNSUPPORTED_TYPE);
@Capabilities
int primaryHardware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_SUPPORTED,
DECODER_SUPPORT_PRIMARY);
@Capabilities
int primarySoftware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_NOT_SUPPORTED,
DECODER_SUPPORT_PRIMARY);
@Capabilities
int fallbackHardware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_SUPPORTED,
DECODER_SUPPORT_FALLBACK);
@Capabilities
int fallbackSoftware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_NOT_SUPPORTED,
DECODER_SUPPORT_FALLBACK);
// Select all tracks supported by primary, hardware decoder by default.
ImmutableMap<String, Integer> rendererCapabilitiesMap =
ImmutableMap.of(
"0",
primaryHardware,
"1",
primaryHardware,
"2",
primarySoftware,
"3",
fallbackHardware);
RendererCapabilities rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_AUDIO, rendererCapabilitiesMap);
TrackSelectorResult result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 1, 0);
// Select all tracks supported by primary, software decoder by default if no primary, hardware
// decoder is available.
rendererCapabilitiesMap =
ImmutableMap.of(
"0",
fallbackHardware,
"1",
fallbackHardware,
"2",
primarySoftware,
"3",
fallbackSoftware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_AUDIO, rendererCapabilitiesMap);
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertFixedSelection(result.selections[0], trackGroups.get(0), 2);
// Select all tracks supported by fallback, hardware decoder if no primary decoder is
// available.
rendererCapabilitiesMap =
ImmutableMap.of(
"0", fallbackHardware, "1", unsupported, "2", fallbackSoftware, "3", fallbackHardware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_AUDIO, rendererCapabilitiesMap);
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 3, 0);
// Select all tracks supported by fallback, software decoder if no other decoder is available.
rendererCapabilitiesMap =
ImmutableMap.of(
"0", fallbackSoftware, "1", fallbackSoftware, "2", unsupported, "3", fallbackSoftware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_AUDIO, rendererCapabilitiesMap);
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 3, 1, 0);
// Select all tracks if mixed decoder support is allowed.
rendererCapabilitiesMap =
ImmutableMap.of(
"0", primaryHardware, "1", unsupported, "2", primarySoftware, "3", fallbackHardware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_AUDIO, rendererCapabilitiesMap);
trackSelector.setParameters(
defaultParameters.buildUpon().setAllowAudioMixedDecoderSupportAdaptiveness(true));
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 3, 2, 0);
}
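The preference order this test walks through (primary decoder support over fallback, then hardware over software, then quality) can be sketched as a ranking over boolean flags. The tuple encoding is an illustration only, not ExoPlayer's capabilities representation:

```python
def decoder_rank(track):
    """Higher tuples win lexicographically: primary beats fallback support,
    then hardware beats software, then higher bitrate."""
    is_primary, is_hardware, bitrate = track
    return (is_primary, is_hardware, bitrate)

# (is_primary, is_hardware, average_bitrate) for four hypothetical tracks.
tracks = [(True, True, 200), (True, True, 400),
          (True, False, 600), (False, True, 800)]
best = max(tracks, key=decoder_rank)  # a primary, hardware track wins
```

As in the test, tracks with primary hardware support are preferred even over higher-bitrate fallback or software alternatives; only when no such track exists does selection fall through to the next tier.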
@Test
public void selectTracksWithMultipleAudioTracksOverrideReturnsAdaptiveTrackSelection()
throws Exception {
@@ -1773,6 +1894,122 @@ public final class DefaultTrackSelectorTest {
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 0, 1);
}
@Test
public void selectTracksWithMultipleVideoTracksWithMixedDecoderSupportLevels() throws Exception {
Format.Builder formatBuilder = VIDEO_FORMAT.buildUpon();
Format format0 = formatBuilder.setId("0").setAverageBitrate(200).build();
Format format1 = formatBuilder.setId("1").setAverageBitrate(400).build();
Format format2 = formatBuilder.setId("2").setAverageBitrate(600).build();
Format format3 = formatBuilder.setId("3").setAverageBitrate(800).build();
TrackGroupArray trackGroups = singleTrackGroup(format0, format1, format2, format3);
@Capabilities int unsupported = RendererCapabilities.create(FORMAT_UNSUPPORTED_TYPE);
@Capabilities
int primaryHardware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_SUPPORTED,
DECODER_SUPPORT_PRIMARY);
@Capabilities
int primarySoftware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_NOT_SUPPORTED,
DECODER_SUPPORT_PRIMARY);
@Capabilities
int fallbackHardware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_SUPPORTED,
DECODER_SUPPORT_FALLBACK);
@Capabilities
int fallbackSoftware =
RendererCapabilities.create(
FORMAT_HANDLED,
ADAPTIVE_NOT_SEAMLESS,
TUNNELING_NOT_SUPPORTED,
HARDWARE_ACCELERATION_NOT_SUPPORTED,
DECODER_SUPPORT_FALLBACK);
// Select all tracks supported by primary, hardware decoder by default.
ImmutableMap<String, Integer> rendererCapabilitiesMap =
ImmutableMap.of(
"0",
primaryHardware,
"1",
primaryHardware,
"2",
primarySoftware,
"3",
fallbackHardware);
RendererCapabilities rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_VIDEO, rendererCapabilitiesMap);
TrackSelectorResult result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 1, 0);
// Select all tracks supported by primary, software decoder by default if no primary, hardware
// decoder is available.
rendererCapabilitiesMap =
ImmutableMap.of(
"0",
fallbackHardware,
"1",
fallbackHardware,
"2",
primarySoftware,
"3",
fallbackSoftware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_VIDEO, rendererCapabilitiesMap);
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertFixedSelection(result.selections[0], trackGroups.get(0), 2);
// Select all tracks supported by fallback, hardware decoder if no primary decoder is
// available.
rendererCapabilitiesMap =
ImmutableMap.of(
"0", fallbackHardware, "1", unsupported, "2", fallbackSoftware, "3", fallbackHardware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_VIDEO, rendererCapabilitiesMap);
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 3, 0);
// Select all tracks supported by fallback, software decoder if no other decoder is available.
rendererCapabilitiesMap =
ImmutableMap.of(
"0", fallbackSoftware, "1", fallbackSoftware, "2", unsupported, "3", fallbackSoftware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_VIDEO, rendererCapabilitiesMap);
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 3, 1, 0);
// Select all tracks if mixed decoder support is allowed.
rendererCapabilitiesMap =
ImmutableMap.of(
"0", primaryHardware, "1", unsupported, "2", primarySoftware, "3", fallbackHardware);
rendererCapabilities =
new FakeMappedRendererCapabilities(C.TRACK_TYPE_VIDEO, rendererCapabilitiesMap);
trackSelector.setParameters(
defaultParameters.buildUpon().setAllowVideoMixedDecoderSupportAdaptiveness(true));
result =
trackSelector.selectTracks(
new RendererCapabilities[] {rendererCapabilities}, trackGroups, periodId, TIMELINE);
assertAdaptiveSelection(result.selections[0], trackGroups.get(0), 3, 2, 0);
}
@Test
public void selectTracksWithMultipleVideoTracksOverrideReturnsAdaptiveTrackSelection()
throws Exception {
@@ -1859,11 +2096,17 @@ public final class DefaultTrackSelectorTest {
throws Exception {
Format formatAv1 = new Format.Builder().setSampleMimeType(MimeTypes.VIDEO_AV1).build();
Format formatVp9 = new Format.Builder().setSampleMimeType(MimeTypes.VIDEO_VP9).build();
Format formatH264 = new Format.Builder().setSampleMimeType(MimeTypes.VIDEO_H264).build();
TrackGroupArray trackGroups = wrapFormats(formatAv1, formatVp9, formatH264);
Format formatH264Low =
new Format.Builder().setSampleMimeType(MimeTypes.VIDEO_H264).setAverageBitrate(400).build();
Format formatH264High =
new Format.Builder().setSampleMimeType(MimeTypes.VIDEO_H264).setAverageBitrate(800).build();
// Use an adaptive group to check that MIME type has a higher priority than number of tracks.
TrackGroup adaptiveGroup = new TrackGroup(formatH264Low, formatH264High);
TrackGroupArray trackGroups =
new TrackGroupArray(new TrackGroup(formatAv1), new TrackGroup(formatVp9), adaptiveGroup);
trackSelector.setParameters(
trackSelector.buildUponParameters().setPreferredVideoMimeType(MimeTypes.VIDEO_VP9));
defaultParameters.buildUpon().setPreferredVideoMimeType(MimeTypes.VIDEO_VP9));
TrackSelectorResult result =
trackSelector.selectTracks(
new RendererCapabilities[] {VIDEO_CAPABILITIES}, trackGroups, periodId, TIMELINE);
@ -1871,8 +2114,8 @@ public final class DefaultTrackSelectorTest {
assertFixedSelection(result.selections[0], trackGroups, formatVp9);
trackSelector.setParameters(
trackSelector
.buildUponParameters()
defaultParameters
.buildUpon()
.setPreferredVideoMimeTypes(MimeTypes.VIDEO_VP9, MimeTypes.VIDEO_AV1));
result =
trackSelector.selectTracks(
@ -1881,23 +2124,22 @@ public final class DefaultTrackSelectorTest {
assertFixedSelection(result.selections[0], trackGroups, formatVp9);
trackSelector.setParameters(
trackSelector
.buildUponParameters()
defaultParameters
.buildUpon()
.setPreferredVideoMimeTypes(MimeTypes.VIDEO_DIVX, MimeTypes.VIDEO_H264));
result =
trackSelector.selectTracks(
new RendererCapabilities[] {VIDEO_CAPABILITIES}, trackGroups, periodId, TIMELINE);
assertThat(result.length).isEqualTo(1);
assertFixedSelection(result.selections[0], trackGroups, formatH264);
assertAdaptiveSelection(result.selections[0], adaptiveGroup, /* expectedTracks...= */ 1, 0);
// Select first in the list if no preference is specified.
trackSelector.setParameters(
trackSelector.buildUponParameters().setPreferredVideoMimeType(null));
// Select default (=most tracks) if no preference is specified.
trackSelector.setParameters(defaultParameters.buildUpon().setPreferredVideoMimeType(null));
result =
trackSelector.selectTracks(
new RendererCapabilities[] {VIDEO_CAPABILITIES}, trackGroups, periodId, TIMELINE);
assertThat(result.length).isEqualTo(1);
assertFixedSelection(result.selections[0], trackGroups, formatAv1);
assertAdaptiveSelection(result.selections[0], adaptiveGroup, /* expectedTracks...= */ 1, 0);
}
/**
@ -1907,13 +2149,18 @@ public final class DefaultTrackSelectorTest {
@Test
public void selectTracks_withPreferredVideoRoleFlags_selectPreferredTrack() throws Exception {
Format.Builder formatBuilder = VIDEO_FORMAT.buildUpon();
Format noRoleFlags = formatBuilder.build();
Format noRoleFlagsLow = formatBuilder.setAverageBitrate(4000).build();
Format noRoleFlagsHigh = formatBuilder.setAverageBitrate(8000).build();
Format lessRoleFlags = formatBuilder.setRoleFlags(C.ROLE_FLAG_CAPTION).build();
Format moreRoleFlags =
formatBuilder
.setRoleFlags(C.ROLE_FLAG_CAPTION | C.ROLE_FLAG_COMMENTARY | C.ROLE_FLAG_DUB)
.build();
TrackGroupArray trackGroups = wrapFormats(noRoleFlags, moreRoleFlags, lessRoleFlags);
// Use an adaptive group to check that role flags have higher priority than number of tracks.
TrackGroup adaptiveNoRoleFlagsGroup = new TrackGroup(noRoleFlagsLow, noRoleFlagsHigh);
TrackGroupArray trackGroups =
new TrackGroupArray(
adaptiveNoRoleFlagsGroup, new TrackGroup(moreRoleFlags), new TrackGroup(lessRoleFlags));
trackSelector.setParameters(
defaultParameters
@ -2109,6 +2356,7 @@ public final class DefaultTrackSelectorTest {
.setExceedVideoConstraintsIfNecessary(false)
.setAllowVideoMixedMimeTypeAdaptiveness(true)
.setAllowVideoNonSeamlessAdaptiveness(false)
.setAllowVideoMixedDecoderSupportAdaptiveness(true)
.setViewportSize(
/* viewportWidth= */ 8,
/* viewportHeight= */ 9,
@ -2123,6 +2371,7 @@ public final class DefaultTrackSelectorTest {
.setAllowAudioMixedMimeTypeAdaptiveness(true)
.setAllowAudioMixedSampleRateAdaptiveness(false)
.setAllowAudioMixedChannelCountAdaptiveness(true)
.setAllowAudioMixedDecoderSupportAdaptiveness(false)
.setPreferredAudioMimeTypes(MimeTypes.AUDIO_AC3, MimeTypes.AUDIO_E_AC3)
// Text
.setPreferredTextLanguages("de", "en")

View file

@ -66,7 +66,9 @@ public final class BaseUrlExclusionList {
public void exclude(BaseUrl baseUrlToExclude, long exclusionDurationMs) {
long excludeUntilMs = SystemClock.elapsedRealtime() + exclusionDurationMs;
addExclusion(baseUrlToExclude.serviceLocation, excludeUntilMs, excludedServiceLocations);
addExclusion(baseUrlToExclude.priority, excludeUntilMs, excludedPriorities);
if (baseUrlToExclude.priority != BaseUrl.PRIORITY_UNSET) {
addExclusion(baseUrlToExclude.priority, excludeUntilMs, excludedPriorities);
}
}
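The guard above means a base URL whose priority is `PRIORITY_UNSET` only excludes its own service location, never a whole priority bucket. A minimal standalone sketch of that behavior (the `ExclusionSketch` class and its field names are hypothetical stand-ins, not the real `BaseUrlExclusionList`):

```java
import java.util.HashMap;
import java.util.Map;

public class ExclusionSketch {
  static final int PRIORITY_UNSET = Integer.MIN_VALUE;

  final Map<String, Long> excludedServiceLocations = new HashMap<>();
  final Map<Integer, Long> excludedPriorities = new HashMap<>();

  // Mirrors the guarded exclude() above: an unset priority is never recorded
  // as an excluded priority, so other unset-priority URLs stay selectable.
  void exclude(String serviceLocation, int priority, long excludeUntilMs) {
    excludedServiceLocations.put(serviceLocation, excludeUntilMs);
    if (priority != PRIORITY_UNSET) {
      excludedPriorities.put(priority, excludeUntilMs);
    }
  }

  public static void main(String[] args) {
    ExclusionSketch sketch = new ExclusionSketch();
    sketch.exclude("a", PRIORITY_UNSET, 10_000L);
    sketch.exclude("b", /* priority= */ 1, 10_000L);
    System.out.println(sketch.excludedServiceLocations.size()); // 2
    System.out.println(sketch.excludedPriorities.size()); // 1
  }
}
```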
/**

View file

@ -217,7 +217,7 @@ public class DefaultDashChunkSource implements DashChunkSource {
periodDurationUs,
representation,
selectedBaseUrl != null ? selectedBaseUrl : representation.baseUrls.get(0),
BundledChunkExtractor.FACTORY.createProgressiveMediaExtractor(
chunkExtractorFactory.createProgressiveMediaExtractor(
trackType,
representation.format,
enableEventMessageTrack,

View file

@ -21,10 +21,12 @@ import com.google.common.base.Objects;
/** A base URL, as defined by ISO 23009-1, 2nd edition, 5.6. and ETSI TS 103 285 V1.2.1, 10.8.2.1 */
public final class BaseUrl {
/** The default priority. */
public static final int DEFAULT_PRIORITY = 1;
/** The default weight. */
public static final int DEFAULT_WEIGHT = 1;
/** The default priority. */
public static final int DEFAULT_DVB_PRIORITY = 1;
/** Constant representing an unset priority in a manifest that does not declare a DVB profile. */
public static final int PRIORITY_UNSET = Integer.MIN_VALUE;
/** The URL. */
public final String url;
@ -36,11 +38,11 @@ public final class BaseUrl {
public final int weight;
/**
* Creates an instance with {@link #DEFAULT_PRIORITY default priority}, {@link #DEFAULT_WEIGHT
* Creates an instance with {@link #PRIORITY_UNSET an unset priority}, {@link #DEFAULT_WEIGHT
* default weight} and using the URL as the service location.
*/
public BaseUrl(String url) {
this(url, /* serviceLocation= */ url, DEFAULT_PRIORITY, DEFAULT_WEIGHT);
this(url, /* serviceLocation= */ url, PRIORITY_UNSET, DEFAULT_WEIGHT);
}
/** Creates an instance. */

View file

@ -15,6 +15,10 @@
*/
package com.google.android.exoplayer2.source.dash.manifest;
import static com.google.android.exoplayer2.source.dash.manifest.BaseUrl.DEFAULT_DVB_PRIORITY;
import static com.google.android.exoplayer2.source.dash.manifest.BaseUrl.DEFAULT_WEIGHT;
import static com.google.android.exoplayer2.source.dash.manifest.BaseUrl.PRIORITY_UNSET;
import android.net.Uri;
import android.text.TextUtils;
import android.util.Base64;
@ -103,14 +107,16 @@ public class DashManifestParser extends DefaultHandler
"inputStream does not contain a valid media presentation description",
/* cause= */ null);
}
return parseMediaPresentationDescription(xpp, new BaseUrl(uri.toString()));
return parseMediaPresentationDescription(xpp, uri);
} catch (XmlPullParserException e) {
throw ParserException.createForMalformedManifest(/* message= */ null, /* cause= */ e);
}
}
protected DashManifest parseMediaPresentationDescription(
XmlPullParser xpp, BaseUrl documentBaseUrl) throws XmlPullParserException, IOException {
protected DashManifest parseMediaPresentationDescription(XmlPullParser xpp, Uri documentBaseUri)
throws XmlPullParserException, IOException {
boolean dvbProfileDeclared =
isDvbProfileDeclared(parseProfiles(xpp, "profiles", new String[0]));
long availabilityStartTime = parseDateTime(xpp, "availabilityStartTime", C.TIME_UNSET);
long durationMs = parseDuration(xpp, "mediaPresentationDuration", C.TIME_UNSET);
long minBufferTimeMs = parseDuration(xpp, "minBufferTime", C.TIME_UNSET);
@ -128,6 +134,12 @@ public class DashManifestParser extends DefaultHandler
Uri location = null;
ServiceDescriptionElement serviceDescription = null;
long baseUrlAvailabilityTimeOffsetUs = dynamic ? 0 : C.TIME_UNSET;
BaseUrl documentBaseUrl =
new BaseUrl(
documentBaseUri.toString(),
/* serviceLocation= */ documentBaseUri.toString(),
dvbProfileDeclared ? DEFAULT_DVB_PRIORITY : PRIORITY_UNSET,
DEFAULT_WEIGHT);
ArrayList<BaseUrl> parentBaseUrls = Lists.newArrayList(documentBaseUrl);
List<Period> periods = new ArrayList<>();
@ -143,7 +155,7 @@ public class DashManifestParser extends DefaultHandler
parseAvailabilityTimeOffsetUs(xpp, baseUrlAvailabilityTimeOffsetUs);
seenFirstBaseUrl = true;
}
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls));
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls, dvbProfileDeclared));
} else if (XmlPullParserUtil.isStartTag(xpp, "ProgramInformation")) {
programInformation = parseProgramInformation(xpp);
} else if (XmlPullParserUtil.isStartTag(xpp, "UTCTiming")) {
@ -160,7 +172,8 @@ public class DashManifestParser extends DefaultHandler
nextPeriodStartMs,
baseUrlAvailabilityTimeOffsetUs,
availabilityStartTime,
timeShiftBufferDepthMs);
timeShiftBufferDepthMs,
dvbProfileDeclared);
Period period = periodWithDurationMs.first;
if (period.startMs == C.TIME_UNSET) {
if (dynamic) {
@ -280,7 +293,8 @@ public class DashManifestParser extends DefaultHandler
long defaultStartMs,
long baseUrlAvailabilityTimeOffsetUs,
long availabilityStartTimeMs,
long timeShiftBufferDepthMs)
long timeShiftBufferDepthMs,
boolean dvbProfileDeclared)
throws XmlPullParserException, IOException {
@Nullable String id = xpp.getAttributeValue(null, "id");
long startMs = parseDuration(xpp, "start", defaultStartMs);
@ -302,7 +316,7 @@ public class DashManifestParser extends DefaultHandler
parseAvailabilityTimeOffsetUs(xpp, baseUrlAvailabilityTimeOffsetUs);
seenFirstBaseUrl = true;
}
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls));
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls, dvbProfileDeclared));
} else if (XmlPullParserUtil.isStartTag(xpp, "AdaptationSet")) {
adaptationSets.add(
parseAdaptationSet(
@ -313,7 +327,8 @@ public class DashManifestParser extends DefaultHandler
baseUrlAvailabilityTimeOffsetUs,
segmentBaseAvailabilityTimeOffsetUs,
periodStartUnixTimeMs,
timeShiftBufferDepthMs));
timeShiftBufferDepthMs,
dvbProfileDeclared));
} else if (XmlPullParserUtil.isStartTag(xpp, "EventStream")) {
eventStreams.add(parseEventStream(xpp));
} else if (XmlPullParserUtil.isStartTag(xpp, "SegmentBase")) {
@ -373,7 +388,8 @@ public class DashManifestParser extends DefaultHandler
long baseUrlAvailabilityTimeOffsetUs,
long segmentBaseAvailabilityTimeOffsetUs,
long periodStartUnixTimeMs,
long timeShiftBufferDepthMs)
long timeShiftBufferDepthMs,
boolean dvbProfileDeclared)
throws XmlPullParserException, IOException {
int id = parseInt(xpp, "id", AdaptationSet.ID_UNSET);
@C.TrackType int contentType = parseContentType(xpp);
@ -406,7 +422,7 @@ public class DashManifestParser extends DefaultHandler
parseAvailabilityTimeOffsetUs(xpp, baseUrlAvailabilityTimeOffsetUs);
seenFirstBaseUrl = true;
}
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls));
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls, dvbProfileDeclared));
} else if (XmlPullParserUtil.isStartTag(xpp, "ContentProtection")) {
Pair<String, SchemeData> contentProtection = parseContentProtection(xpp);
if (contentProtection.first != null) {
@ -450,7 +466,8 @@ public class DashManifestParser extends DefaultHandler
periodDurationMs,
baseUrlAvailabilityTimeOffsetUs,
segmentBaseAvailabilityTimeOffsetUs,
timeShiftBufferDepthMs);
timeShiftBufferDepthMs,
dvbProfileDeclared);
contentType =
checkContentTypeConsistency(
contentType, MimeTypes.getTrackType(representationInfo.format.sampleMimeType));
@ -650,7 +667,8 @@ public class DashManifestParser extends DefaultHandler
long periodDurationMs,
long baseUrlAvailabilityTimeOffsetUs,
long segmentBaseAvailabilityTimeOffsetUs,
long timeShiftBufferDepthMs)
long timeShiftBufferDepthMs,
boolean dvbProfileDeclared)
throws XmlPullParserException, IOException {
String id = xpp.getAttributeValue(null, "id");
int bandwidth = parseInt(xpp, "bandwidth", Format.NO_VALUE);
@ -679,7 +697,7 @@ public class DashManifestParser extends DefaultHandler
parseAvailabilityTimeOffsetUs(xpp, baseUrlAvailabilityTimeOffsetUs);
seenFirstBaseUrl = true;
}
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls));
baseUrls.addAll(parseBaseUrl(xpp, parentBaseUrls, dvbProfileDeclared));
} else if (XmlPullParserUtil.isStartTag(xpp, "AudioChannelConfiguration")) {
audioChannels = parseAudioChannelConfiguration(xpp);
} else if (XmlPullParserUtil.isStartTag(xpp, "SegmentBase")) {
@ -1371,35 +1389,42 @@ public class DashManifestParser extends DefaultHandler
*
* @param xpp The parser from which to read.
* @param parentBaseUrls The parent base URLs for resolving the parsed URLs.
* @param dvbProfileDeclared Whether a DVB profile is declared in the manifest.
* @throws XmlPullParserException If an error occurs parsing the element.
* @throws IOException If an error occurs reading the element.
* @return The list of parsed and resolved URLs.
*/
protected List<BaseUrl> parseBaseUrl(XmlPullParser xpp, List<BaseUrl> parentBaseUrls)
protected List<BaseUrl> parseBaseUrl(
XmlPullParser xpp, List<BaseUrl> parentBaseUrls, boolean dvbProfileDeclared)
throws XmlPullParserException, IOException {
@Nullable String priorityValue = xpp.getAttributeValue(null, "dvb:priority");
int priority =
priorityValue != null ? Integer.parseInt(priorityValue) : BaseUrl.DEFAULT_PRIORITY;
priorityValue != null
? Integer.parseInt(priorityValue)
: (dvbProfileDeclared ? DEFAULT_DVB_PRIORITY : PRIORITY_UNSET);
@Nullable String weightValue = xpp.getAttributeValue(null, "dvb:weight");
int weight = weightValue != null ? Integer.parseInt(weightValue) : BaseUrl.DEFAULT_WEIGHT;
int weight = weightValue != null ? Integer.parseInt(weightValue) : DEFAULT_WEIGHT;
@Nullable String serviceLocation = xpp.getAttributeValue(null, "serviceLocation");
String baseUrl = parseText(xpp, "BaseURL");
if (serviceLocation == null) {
serviceLocation = baseUrl;
}
if (UriUtil.isAbsolute(baseUrl)) {
if (serviceLocation == null) {
serviceLocation = baseUrl;
}
return Lists.newArrayList(new BaseUrl(baseUrl, serviceLocation, priority, weight));
}
List<BaseUrl> baseUrls = new ArrayList<>();
for (int i = 0; i < parentBaseUrls.size(); i++) {
BaseUrl parentBaseUrl = parentBaseUrls.get(i);
priority = parentBaseUrl.priority;
weight = parentBaseUrl.weight;
serviceLocation = parentBaseUrl.serviceLocation;
baseUrls.add(
new BaseUrl(
UriUtil.resolve(parentBaseUrl.url, baseUrl), serviceLocation, priority, weight));
String resolvedBaseUri = UriUtil.resolve(parentBaseUrl.url, baseUrl);
String resolvedServiceLocation = serviceLocation == null ? resolvedBaseUri : serviceLocation;
if (dvbProfileDeclared) {
// Inherit parent properties only if dvb profile is declared.
priority = parentBaseUrl.priority;
weight = parentBaseUrl.weight;
resolvedServiceLocation = parentBaseUrl.serviceLocation;
}
baseUrls.add(new BaseUrl(resolvedBaseUri, resolvedServiceLocation, priority, weight));
}
return baseUrls;
}
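The priority rule above condenses to: an explicit `dvb:priority` attribute always wins; otherwise the default is 1 only when a DVB profile is declared, and `Integer.MIN_VALUE` (unset) otherwise. A standalone sketch of just that resolution step (`BaseUrlPrioritySketch` is a hypothetical helper mirroring, not reusing, the parser):

```java
public class BaseUrlPrioritySketch {
  static final int DEFAULT_DVB_PRIORITY = 1;
  static final int PRIORITY_UNSET = Integer.MIN_VALUE;

  // priorityValue is the raw "dvb:priority" attribute value, or null if absent.
  static int resolvePriority(String priorityValue, boolean dvbProfileDeclared) {
    return priorityValue != null
        ? Integer.parseInt(priorityValue)
        : (dvbProfileDeclared ? DEFAULT_DVB_PRIORITY : PRIORITY_UNSET);
  }

  public static void main(String[] args) {
    System.out.println(resolvePriority("3", true)); // 3: explicit attribute wins
    System.out.println(resolvePriority(null, true)); // 1: DVB default
    System.out.println(resolvePriority(null, false)); // unset -> Integer.MIN_VALUE
  }
}
```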
@ -1581,6 +1606,14 @@ public class DashManifestParser extends DefaultHandler
}
}
protected String[] parseProfiles(XmlPullParser xpp, String attributeName, String[] defaultValue) {
@Nullable String attributeValue = xpp.getAttributeValue(/* namespace= */ null, attributeName);
if (attributeValue == null) {
return defaultValue;
}
return attributeValue.split(",");
}
// Utility methods.
/**
@ -1907,6 +1940,15 @@ public class DashManifestParser extends DefaultHandler
return availabilityTimeOffsetUs;
}
private boolean isDvbProfileDeclared(String[] profiles) {
for (String profile : profiles) {
if (profile.startsWith("urn:dvb:dash:profile:dvb-dash:")) {
return true;
}
}
return false;
}
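Taken together, `parseProfiles` and `isDvbProfileDeclared` split the MPD's `@profiles` attribute on commas and look for the DVB-DASH profile URN prefix. A self-contained sketch combining the two steps (hypothetical class name, same logic):

```java
public class DvbProfileCheckSketch {
  // Split the @profiles attribute on commas and check each entry for the
  // DVB-DASH profile URN prefix, as the parser methods above do.
  static boolean isDvbProfileDeclared(String profilesAttribute) {
    if (profilesAttribute == null) {
      return false;
    }
    for (String profile : profilesAttribute.split(",")) {
      if (profile.startsWith("urn:dvb:dash:profile:dvb-dash:")) {
        return true;
      }
    }
    return false;
  }

  public static void main(String[] args) {
    System.out.println(
        isDvbProfileDeclared(
            "urn:mpeg:dash:profile:isoff-live:2011,urn:dvb:dash:profile:dvb-dash:2014")); // true
    System.out.println(isDvbProfileDeclared("urn:mpeg:dash:profile:isoff-on-demand:2011")); // false
  }
}
```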
/** A parsed Representation element. */
protected static final class RepresentationInfo {

View file

@ -213,7 +213,7 @@ public abstract class Representation {
new RangedUri(null, initializationStart, initializationEnd - initializationStart + 1);
SingleSegmentBase segmentBase =
new SingleSegmentBase(rangedUri, 1, 0, indexStart, indexEnd - indexStart + 1);
List<BaseUrl> baseUrls = ImmutableList.of(new BaseUrl(uri));
ImmutableList<BaseUrl> baseUrls = ImmutableList.of(new BaseUrl(uri));
return new SingleSegmentRepresentation(
revisionId,
format,

View file

@ -15,6 +15,8 @@
*/
package com.google.android.exoplayer2.source.dash;
import static com.google.android.exoplayer2.source.dash.manifest.BaseUrl.DEFAULT_DVB_PRIORITY;
import static com.google.android.exoplayer2.source.dash.manifest.BaseUrl.DEFAULT_WEIGHT;
import static com.google.android.exoplayer2.upstream.DefaultLoadErrorHandlingPolicy.DEFAULT_LOCATION_EXCLUSION_MS;
import static com.google.common.truth.Truth.assertThat;
import static org.mockito.ArgumentMatchers.anyInt;
@ -173,6 +175,32 @@ public class BaseUrlExclusionListTest {
assertThat(baseUrlExclusionList.getPriorityCountAfterExclusion(baseUrls)).isEqualTo(2);
}
@Test
public void selectBaseUrl_priorityUnset_isNotExcluded() {
BaseUrlExclusionList baseUrlExclusionList = new BaseUrlExclusionList();
ImmutableList<BaseUrl> baseUrls =
ImmutableList.of(
new BaseUrl(
/* url= */ "a-1",
/* serviceLocation= */ "a",
BaseUrl.PRIORITY_UNSET,
/* weight= */ 1),
new BaseUrl(
/* url= */ "a-2",
/* serviceLocation= */ "a",
BaseUrl.PRIORITY_UNSET,
/* weight= */ 1),
new BaseUrl(
/* url= */ "b",
/* serviceLocation= */ "b",
BaseUrl.PRIORITY_UNSET,
/* weight= */ 1));
baseUrlExclusionList.exclude(baseUrls.get(0), 10_000);
assertThat(baseUrlExclusionList.selectBaseUrl(baseUrls).serviceLocation).isEqualTo("b");
}
@Test
public void selectBaseUrl_emptyBaseUrlList_selectionIsNull() {
BaseUrlExclusionList baseUrlExclusionList = new BaseUrlExclusionList();
@ -183,7 +211,8 @@ public class BaseUrlExclusionListTest {
@Test
public void reset_dropsAllExclusions() {
BaseUrlExclusionList baseUrlExclusionList = new BaseUrlExclusionList();
List<BaseUrl> baseUrls = ImmutableList.of(new BaseUrl("a"));
ImmutableList<BaseUrl> baseUrls =
ImmutableList.of(new BaseUrl("a", "a", DEFAULT_DVB_PRIORITY, DEFAULT_WEIGHT));
baseUrlExclusionList.exclude(baseUrls.get(0), 5000);
baseUrlExclusionList.reset();

View file

@ -61,6 +61,10 @@ public class DashManifestParserTest {
"media/mpd/sample_mpd_availabilityTimeOffset_baseUrl";
private static final String SAMPLE_MPD_MULTIPLE_BASE_URLS =
"media/mpd/sample_mpd_multiple_baseUrls";
private static final String SAMPLE_MPD_RELATIVE_BASE_URLS_DVB_PROFILE_NOT_DECLARED =
"media/mpd/sample_mpd_relative_baseUrls_dvb_profile_not_declared";
private static final String SAMPLE_MPD_RELATIVE_BASE_URLS_DVB_PROFILE_DECLARED =
"media/mpd/sample_mpd_relative_baseUrls_dvb_profile_declared";
private static final String SAMPLE_MPD_AVAILABILITY_TIME_OFFSET_SEGMENT_TEMPLATE =
"media/mpd/sample_mpd_availabilityTimeOffset_segmentTemplate";
private static final String SAMPLE_MPD_AVAILABILITY_TIME_OFFSET_SEGMENT_LIST =
@ -748,6 +752,41 @@ public class DashManifestParserTest {
assertThat(textBaseUrls.get(0).serviceLocation).isEqualTo("e");
}
@Test
public void baseUrl_relativeBaseUrlsNoDvbNamespace_hasDifferentPrioritiesAndServiceLocation()
throws IOException {
DashManifestParser parser = new DashManifestParser();
DashManifest manifest =
parser.parse(
Uri.parse("https://example.com/test.mpd"),
TestUtil.getInputStream(
ApplicationProvider.getApplicationContext(),
SAMPLE_MPD_RELATIVE_BASE_URLS_DVB_PROFILE_NOT_DECLARED));
ImmutableList<BaseUrl> baseUrls =
manifest.getPeriod(0).adaptationSets.get(0).representations.get(0).baseUrls;
assertThat(baseUrls.get(0).priority).isEqualTo(BaseUrl.PRIORITY_UNSET);
assertThat(baseUrls.get(1).priority).isEqualTo(BaseUrl.PRIORITY_UNSET);
assertThat(baseUrls.get(0).serviceLocation).isNotEqualTo(baseUrls.get(1).serviceLocation);
}
@Test
public void baseUrl_relativeBaseUrlsWithDvbNamespace_inheritsPrioritiesAndServiceLocation()
throws IOException {
DashManifestParser parser = new DashManifestParser();
DashManifest manifest =
parser.parse(
Uri.parse("https://example.com/test.mpd"),
TestUtil.getInputStream(
ApplicationProvider.getApplicationContext(),
SAMPLE_MPD_RELATIVE_BASE_URLS_DVB_PROFILE_DECLARED));
ImmutableList<BaseUrl> baseUrls =
manifest.getPeriod(0).adaptationSets.get(0).representations.get(0).baseUrls;
assertThat(baseUrls.get(0).priority).isEqualTo(baseUrls.get(1).priority);
assertThat(baseUrls.get(0).serviceLocation).isEqualTo(baseUrls.get(1).serviceLocation);
}
@Test
public void serviceDescriptionElement_allValuesSet() throws IOException {
DashManifestParser parser = new DashManifestParser();

View file

@ -109,8 +109,7 @@ public class DownloadManagerDashTest {
testThread.release();
}
// Disabled due to flakiness.
@Ignore
@Ignore("Disabled due to flakiness")
@Test
public void saveAndLoadActionFile() throws Throwable {
// Configure fakeDataSet to block until interrupted when TEST_MPD is read.

View file

@ -157,7 +157,7 @@ public class DownloadServiceDashTest {
testThread.release();
}
@Ignore // b/78877092
@Ignore("Internal ref: b/78877092")
@Test
public void multipleDownloadRequest() throws Throwable {
downloadKeys(fakeStreamKey1);
@ -168,7 +168,7 @@ public class DownloadServiceDashTest {
assertCachedData(cache, fakeDataSet);
}
@Ignore // b/78877092
@Ignore("Internal ref: b/78877092")
@Test
public void removeAction() throws Throwable {
downloadKeys(fakeStreamKey1, fakeStreamKey2);
@ -182,7 +182,7 @@ public class DownloadServiceDashTest {
assertCacheEmpty(cache);
}
@Ignore // b/78877092
@Ignore("Internal ref: b/78877092")
@Test
public void removeBeforeDownloadComplete() throws Throwable {
pauseDownloadCondition = new ConditionVariable();

View file

@ -24,10 +24,9 @@ import com.google.android.exoplayer2.metadata.flac.PictureFrame;
import com.google.android.exoplayer2.metadata.id3.Id3Decoder;
import com.google.android.exoplayer2.util.ParsableBitArray;
import com.google.android.exoplayer2.util.ParsableByteArray;
import com.google.common.base.Charsets;
import com.google.common.collect.ImmutableList;
import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
/**
@ -168,9 +167,12 @@ public final class FlacMetadataReader {
metadataHolder.flacStreamMetadata =
flacStreamMetadata.copyWithVorbisComments(vorbisComments);
} else if (type == FlacConstants.METADATA_TYPE_PICTURE) {
PictureFrame pictureFrame = readPictureMetadataBlock(input, length);
ParsableByteArray pictureBlock = new ParsableByteArray(length);
input.readFully(pictureBlock.getData(), 0, length);
pictureBlock.skipBytes(FlacConstants.METADATA_BLOCK_HEADER_SIZE);
PictureFrame pictureFrame = PictureFrame.fromPictureBlock(pictureBlock);
metadataHolder.flacStreamMetadata =
flacStreamMetadata.copyWithPictureFrames(Collections.singletonList(pictureFrame));
flacStreamMetadata.copyWithPictureFrames(ImmutableList.of(pictureFrame));
} else {
input.skipFully(length);
}
@ -268,28 +270,5 @@ public final class FlacMetadataReader {
return Arrays.asList(commentHeader.comments);
}
private static PictureFrame readPictureMetadataBlock(ExtractorInput input, int length)
throws IOException {
ParsableByteArray scratch = new ParsableByteArray(length);
input.readFully(scratch.getData(), 0, length);
scratch.skipBytes(FlacConstants.METADATA_BLOCK_HEADER_SIZE);
int pictureType = scratch.readInt();
int mimeTypeLength = scratch.readInt();
String mimeType = scratch.readString(mimeTypeLength, Charsets.US_ASCII);
int descriptionLength = scratch.readInt();
String description = scratch.readString(descriptionLength);
int width = scratch.readInt();
int height = scratch.readInt();
int depth = scratch.readInt();
int colors = scratch.readInt();
int pictureDataLength = scratch.readInt();
byte[] pictureData = new byte[pictureDataLength];
scratch.readBytes(pictureData, 0, pictureDataLength);
return new PictureFrame(
pictureType, mimeType, description, width, height, depth, colors, pictureData);
}
private FlacMetadataReader() {}
}

View file

@ -15,13 +15,13 @@
*/
package com.google.android.exoplayer2.extractor;
import static com.google.android.exoplayer2.extractor.VorbisUtil.parseVorbisComments;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.C;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.metadata.flac.PictureFrame;
import com.google.android.exoplayer2.metadata.flac.VorbisComment;
import com.google.android.exoplayer2.util.Log;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.exoplayer2.util.ParsableBitArray;
import com.google.android.exoplayer2.util.Util;
@ -60,8 +60,6 @@ public final class FlacStreamMetadata {
/** Indicates that a value is not in the corresponding lookup table. */
public static final int NOT_IN_LOOKUP_TABLE = -1;
/** Separator between the field name of a Vorbis comment and the corresponding value. */
private static final String SEPARATOR = "=";
/** Minimum number of samples per block. */
public final int minBlockSizeSamples;
@ -149,7 +147,7 @@ public final class FlacStreamMetadata {
bitsPerSample,
totalSamples,
/* seekTable= */ null,
buildMetadata(vorbisComments, pictureFrames));
concatenateVorbisMetadata(vorbisComments, pictureFrames));
}
private FlacStreamMetadata(
@ -274,8 +272,7 @@ public final class FlacStreamMetadata {
public FlacStreamMetadata copyWithVorbisComments(List<String> vorbisComments) {
@Nullable
Metadata appendedMetadata =
getMetadataCopyWithAppendedEntriesFrom(
buildMetadata(vorbisComments, Collections.emptyList()));
getMetadataCopyWithAppendedEntriesFrom(parseVorbisComments(vorbisComments));
return new FlacStreamMetadata(
minBlockSizeSamples,
maxBlockSizeSamples,
@ -292,9 +289,7 @@ public final class FlacStreamMetadata {
/** Returns a copy of {@code this} with the given picture frames added to the metadata. */
public FlacStreamMetadata copyWithPictureFrames(List<PictureFrame> pictureFrames) {
@Nullable
Metadata appendedMetadata =
getMetadataCopyWithAppendedEntriesFrom(
buildMetadata(Collections.emptyList(), pictureFrames));
Metadata appendedMetadata = getMetadataCopyWithAppendedEntriesFrom(new Metadata(pictureFrames));
return new FlacStreamMetadata(
minBlockSizeSamples,
maxBlockSizeSamples,
@ -308,6 +303,20 @@ public final class FlacStreamMetadata {
appendedMetadata);
}
/**
* Returns a new {@link Metadata} instance created from {@code vorbisComments} and {@code
* pictureFrames}.
*/
@Nullable
private static Metadata concatenateVorbisMetadata(
List<String> vorbisComments, List<PictureFrame> pictureFrames) {
@Nullable Metadata parsedVorbisComments = parseVorbisComments(vorbisComments);
if (parsedVorbisComments == null && pictureFrames.isEmpty()) {
return null;
}
return new Metadata(pictureFrames).copyWithAppendedEntriesFrom(parsedVorbisComments);
}
private static int getSampleRateLookupKey(int sampleRate) {
switch (sampleRate) {
case 88200:
@ -353,27 +362,4 @@ public final class FlacStreamMetadata {
return NOT_IN_LOOKUP_TABLE;
}
}
@Nullable
private static Metadata buildMetadata(
List<String> vorbisComments, List<PictureFrame> pictureFrames) {
if (vorbisComments.isEmpty() && pictureFrames.isEmpty()) {
return null;
}
ArrayList<Metadata.Entry> metadataEntries = new ArrayList<>();
for (int i = 0; i < vorbisComments.size(); i++) {
String vorbisComment = vorbisComments.get(i);
String[] keyAndValue = Util.splitAtFirst(vorbisComment, SEPARATOR);
if (keyAndValue.length != 2) {
Log.w(TAG, "Failed to parse Vorbis comment: " + vorbisComment);
} else {
VorbisComment entry = new VorbisComment(keyAndValue[0], keyAndValue[1]);
metadataEntries.add(entry);
}
}
metadataEntries.addAll(pictureFrames);
return metadataEntries.isEmpty() ? null : new Metadata(metadataEntries);
}
}

View file

@ -15,11 +15,20 @@
*/
package com.google.android.exoplayer2.extractor;
import android.util.Base64;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.ParserException;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.metadata.Metadata.Entry;
import com.google.android.exoplayer2.metadata.flac.PictureFrame;
import com.google.android.exoplayer2.metadata.vorbis.VorbisComment;
import com.google.android.exoplayer2.util.Log;
import com.google.android.exoplayer2.util.ParsableByteArray;
import com.google.android.exoplayer2.util.Util;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
/** Utility methods for parsing Vorbis streams. */
public final class VorbisUtil {
@ -248,6 +257,45 @@ public final class VorbisUtil {
return new CommentHeader(vendor, comments, length);
}
/**
* Builds a {@link Metadata} instance from a list of Vorbis Comments.
*
* <p>METADATA_BLOCK_PICTURE comments will be transformed into {@link PictureFrame} entries. All
* others will be transformed into {@link VorbisComment} entries.
*
* @param vorbisComments The raw comments, each a key-value pair of the form KEY=VALUE.
* @return The parsed {@link Metadata} instance, or null if no Vorbis comments could be parsed.
*/
@Nullable
public static Metadata parseVorbisComments(List<String> vorbisComments) {
List<Entry> metadataEntries = new ArrayList<>();
for (int i = 0; i < vorbisComments.size(); i++) {
String vorbisComment = vorbisComments.get(i);
String[] keyAndValue = Util.splitAtFirst(vorbisComment, "=");
if (keyAndValue.length != 2) {
Log.w(TAG, "Failed to parse Vorbis comment: " + vorbisComment);
continue;
}
if (keyAndValue[0].equals("METADATA_BLOCK_PICTURE")) {
// This tag is a special cover art tag, outlined by
// https://wiki.xiph.org/index.php/VorbisComment#Cover_art.
// Decode it from Base64 and transform it into a PictureFrame.
try {
byte[] decoded = Base64.decode(keyAndValue[1], Base64.DEFAULT);
metadataEntries.add(PictureFrame.fromPictureBlock(new ParsableByteArray(decoded)));
} catch (RuntimeException e) {
Log.w(TAG, "Failed to parse vorbis picture", e);
}
} else {
VorbisComment entry = new VorbisComment(keyAndValue[0], keyAndValue[1]);
metadataEntries.add(entry);
}
}
return metadataEntries.isEmpty() ? null : new Metadata(metadataEntries);
}
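The KEY=VALUE handling above relies on `Util.splitAtFirst` splitting on the first `=` only, so values that themselves contain `=` survive intact, while a comment with no `=` yields a single-element array and is skipped with a warning. A simplified local stand-in for that split (hypothetical class; the real `Util.splitAtFirst` is a general-purpose utility):

```java
public class VorbisCommentSplitSketch {
  // Split on the FIRST '=' only, like Util.splitAtFirst(comment, "="):
  // "TITLE=a=b" -> {"TITLE", "a=b"}; a comment without '=' stays whole.
  static String[] splitAtFirst(String comment) {
    int index = comment.indexOf('=');
    return index == -1
        ? new String[] {comment}
        : new String[] {comment.substring(0, index), comment.substring(index + 1)};
  }

  public static void main(String[] args) {
    String[] keyAndValue = splitAtFirst("TITLE=a=b");
    System.out.println(keyAndValue[0] + " / " + keyAndValue[1]); // TITLE / a=b
    System.out.println(splitAtFirst("NOEQUALS").length); // 1 -> parser logs a warning and skips
  }
}
```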
/**
* Verifies whether the next bytes in {@code header} are a Vorbis header of the given {@code
* headerType}.

View file

@ -138,6 +138,7 @@ public class MatroskaExtractor implements Extractor {
private static final String CODEC_ID_PCM_FLOAT = "A_PCM/FLOAT/IEEE";
private static final String CODEC_ID_SUBRIP = "S_TEXT/UTF8";
private static final String CODEC_ID_ASS = "S_TEXT/ASS";
private static final String CODEC_ID_VTT = "S_TEXT/WEBVTT";
private static final String CODEC_ID_VOBSUB = "S_VOBSUB";
private static final String CODEC_ID_PGS = "S_HDMV/PGS";
private static final String CODEC_ID_DVBSUB = "S_DVBSUB";
@ -323,6 +324,32 @@ public class MatroskaExtractor implements Extractor {
/** The format of an SSA timecode. */
private static final String SSA_TIMECODE_FORMAT = "%01d:%02d:%02d:%02d";
/**
* A template for the prefix that must be added to each VTT sample.
*
* <p>The display time of each subtitle is passed as {@code timeUs} to {@link
* TrackOutput#sampleMetadata}. The start and end timecodes in this template are relative to
* {@code timeUs}. Hence the start timecode is always zero. The 12 byte end timecode starting at
* {@link #VTT_PREFIX_END_TIMECODE_OFFSET} is set to a placeholder value, and must be replaced
* with the duration of the subtitle.
*
* <p>Equivalent to the UTF-8 string: "WEBVTT\n\n00:00:00.000 --> 00:00:00.000\n".
*/
private static final byte[] VTT_PREFIX =
new byte[] {
87, 69, 66, 86, 84, 84, 10, 10, 48, 48, 58, 48, 48, 58, 48, 48, 46, 48, 48, 48, 32, 45, 45,
62, 32, 48, 48, 58, 48, 48, 58, 48, 48, 46, 48, 48, 48, 10
};
/** The byte offset of the end timecode in {@link #VTT_PREFIX}. */
private static final int VTT_PREFIX_END_TIMECODE_OFFSET = 25;
/**
* The value by which to divide a time in microseconds to convert it to the unit of the last value
* in a VTT timecode (milliseconds).
*/
private static final long VTT_TIMECODE_LAST_VALUE_SCALING_FACTOR = 1000;
/** The format of a VTT timecode. */
private static final String VTT_TIMECODE_FORMAT = "%02d:%02d:%02d.%03d";
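The byte template and offset constants above can be cross-checked against the documented UTF-8 string. This standalone sketch (class and method names are hypothetical, not part of the change) derives the same 38-byte prefix and reads the placeholder end timecode at offset 25:

```java
import java.nio.charset.StandardCharsets;

/** Sketch: cross-checks the VTT_PREFIX byte template against its documented UTF-8 string. */
public class VttPrefixCheck {
  static final byte[] VTT_PREFIX =
      "WEBVTT\n\n00:00:00.000 --> 00:00:00.000\n".getBytes(StandardCharsets.UTF_8);
  static final int VTT_PREFIX_END_TIMECODE_OFFSET = 25;

  /** Returns the 12-byte placeholder end timecode found at the documented offset. */
  static String endTimecode(byte[] prefix) {
    return new String(prefix, VTT_PREFIX_END_TIMECODE_OFFSET, 12, StandardCharsets.UTF_8);
  }

  public static void main(String[] args) {
    // "WEBVTT\n\n" is 8 bytes and the start timecode is 12, so the end
    // timecode begins after the 5-byte " --> " separator, at offset 25.
    System.out.println(VTT_PREFIX.length);       // 38
    System.out.println(endTimecode(VTT_PREFIX)); // 00:00:00.000
  }
}
```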
/** The length in bytes of a WAVEFORMATEX structure. */
private static final int WAVE_FORMAT_SIZE = 18;
/** Format tag indicating a WAVEFORMATEXTENSIBLE structure. */
@@ -1342,7 +1369,9 @@ public class MatroskaExtractor implements Extractor {
track.trueHdSampleRechunker.sampleMetadata(
track.output, timeUs, flags, size, offset, track.cryptoData);
} else {
if (CODEC_ID_SUBRIP.equals(track.codecId) || CODEC_ID_ASS.equals(track.codecId)) {
if (CODEC_ID_SUBRIP.equals(track.codecId)
|| CODEC_ID_ASS.equals(track.codecId)
|| CODEC_ID_VTT.equals(track.codecId)) {
if (blockSampleCount > 1) {
Log.w(TAG, "Skipping subtitle sample in laced block.");
} else if (blockDurationUs == C.TIME_UNSET) {
@@ -1415,6 +1444,9 @@ public class MatroskaExtractor implements Extractor {
} else if (CODEC_ID_ASS.equals(track.codecId)) {
writeSubtitleSampleData(input, SSA_PREFIX, size);
return finishWriteSampleData();
} else if (CODEC_ID_VTT.equals(track.codecId)) {
writeSubtitleSampleData(input, VTT_PREFIX, size);
return finishWriteSampleData();
}
TrackOutput output = track.output;
@@ -1641,7 +1673,8 @@ public class MatroskaExtractor implements Extractor {
* <p>See documentation on {@link #SSA_DIALOGUE_FORMAT} and {@link #SUBRIP_PREFIX} for why we use
* the duration as the end timecode.
*
* @param codecId The subtitle codec; must be {@link #CODEC_ID_SUBRIP} or {@link #CODEC_ID_ASS}.
* @param codecId The subtitle codec; must be {@link #CODEC_ID_SUBRIP}, {@link #CODEC_ID_ASS} or
* {@link #CODEC_ID_VTT}.
* @param durationUs The duration of the sample, in microseconds.
* @param subtitleData The subtitle sample in which to overwrite the end timecode (output
* parameter).
@@ -1662,6 +1695,12 @@ public class MatroskaExtractor implements Extractor {
durationUs, SSA_TIMECODE_FORMAT, SSA_TIMECODE_LAST_VALUE_SCALING_FACTOR);
endTimecodeOffset = SSA_PREFIX_END_TIMECODE_OFFSET;
break;
case CODEC_ID_VTT:
endTimecode =
formatSubtitleTimecode(
durationUs, VTT_TIMECODE_FORMAT, VTT_TIMECODE_LAST_VALUE_SCALING_FACTOR);
endTimecodeOffset = VTT_PREFIX_END_TIMECODE_OFFSET;
break;
default:
throw new IllegalArgumentException();
}
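The body of `formatSubtitleTimecode` is not shown in this hunk; the following is a hypothetical re-implementation of the VTT case only, assuming it splits a microsecond duration into hours, minutes, seconds, and a last value scaled by the codec-specific factor (1000 here, converting microseconds to milliseconds):

```java
/**
 * Sketch of the timecode formatting the switch above relies on; formatTimecode
 * is a hypothetical stand-in for MatroskaExtractor's private formatSubtitleTimecode.
 */
public class VttTimecodeSketch {
  static final String VTT_TIMECODE_FORMAT = "%02d:%02d:%02d.%03d";
  static final long LAST_VALUE_SCALING_FACTOR = 1000; // Microseconds per millisecond.

  static String formatTimecode(long durationUs) {
    long hours = durationUs / 3_600_000_000L;
    durationUs -= hours * 3_600_000_000L;
    long minutes = durationUs / 60_000_000L;
    durationUs -= minutes * 60_000_000L;
    long seconds = durationUs / 1_000_000L;
    durationUs -= seconds * 1_000_000L;
    long lastValue = durationUs / LAST_VALUE_SCALING_FACTOR; // Milliseconds.
    return String.format(VTT_TIMECODE_FORMAT, hours, minutes, seconds, lastValue);
  }

  public static void main(String[] args) {
    System.out.println(formatTimecode(61_044_000L)); // 00:01:01.044
  }
}
```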
@@ -1830,6 +1869,7 @@ public class MatroskaExtractor implements Extractor {
case CODEC_ID_PCM_FLOAT:
case CODEC_ID_SUBRIP:
case CODEC_ID_ASS:
case CODEC_ID_VTT:
case CODEC_ID_VOBSUB:
case CODEC_ID_PGS:
case CODEC_ID_DVBSUB:
@@ -2157,6 +2197,9 @@ public class MatroskaExtractor implements Extractor {
mimeType = MimeTypes.TEXT_SSA;
initializationData = ImmutableList.of(SSA_DIALOGUE_FORMAT, getCodecPrivate(codecId));
break;
case CODEC_ID_VTT:
mimeType = MimeTypes.TEXT_VTT;
break;
case CODEC_ID_VOBSUB:
mimeType = MimeTypes.APPLICATION_VOBSUB;
initializationData = ImmutableList.of(getCodecPrivate(codecId));
@@ -2245,6 +2288,7 @@ public class MatroskaExtractor implements Extractor {
.setColorInfo(colorInfo);
} else if (MimeTypes.APPLICATION_SUBRIP.equals(mimeType)
|| MimeTypes.TEXT_SSA.equals(mimeType)
|| MimeTypes.TEXT_VTT.equals(mimeType)
|| MimeTypes.APPLICATION_VOBSUB.equals(mimeType)
|| MimeTypes.APPLICATION_PGS.equals(mimeType)
|| MimeTypes.APPLICATION_DVBSUBS.equals(mimeType)) {


@@ -15,12 +15,18 @@
*/
package com.google.android.exoplayer2.extractor.ogg;
import static com.google.android.exoplayer2.util.Assertions.checkNotNull;
import static com.google.android.exoplayer2.util.Assertions.checkState;
import static com.google.android.exoplayer2.util.Assertions.checkStateNotNull;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.ParserException;
import com.google.android.exoplayer2.audio.OpusUtil;
import com.google.android.exoplayer2.extractor.VorbisUtil;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.exoplayer2.util.ParsableByteArray;
import com.google.common.collect.ImmutableList;
import java.util.Arrays;
import java.util.List;
import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
@@ -28,26 +34,13 @@ import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
/** {@link StreamReader} to extract Opus data out of Ogg byte stream. */
/* package */ final class OpusReader extends StreamReader {
private static final int OPUS_CODE = 0x4f707573;
private static final byte[] OPUS_SIGNATURE = {'O', 'p', 'u', 's', 'H', 'e', 'a', 'd'};
private boolean headerRead;
private static final byte[] OPUS_ID_HEADER_SIGNATURE = {'O', 'p', 'u', 's', 'H', 'e', 'a', 'd'};
private static final byte[] OPUS_COMMENT_HEADER_SIGNATURE = {
'O', 'p', 'u', 's', 'T', 'a', 'g', 's'
};
public static boolean verifyBitstreamType(ParsableByteArray data) {
if (data.bytesLeft() < OPUS_SIGNATURE.length) {
return false;
}
byte[] header = new byte[OPUS_SIGNATURE.length];
data.readBytes(header, 0, OPUS_SIGNATURE.length);
return Arrays.equals(header, OPUS_SIGNATURE);
}
@Override
protected void reset(boolean headerData) {
super.reset(headerData);
if (headerData) {
headerRead = false;
}
return peekPacketStartsWith(data, OPUS_ID_HEADER_SIGNATURE);
}
@Override
@@ -57,11 +50,16 @@ import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
@Override
@EnsuresNonNullIf(expression = "#3.format", result = false)
protected boolean readHeaders(ParsableByteArray packet, long position, SetupData setupData) {
if (!headerRead) {
protected boolean readHeaders(ParsableByteArray packet, long position, SetupData setupData)
throws ParserException {
if (peekPacketStartsWith(packet, OPUS_ID_HEADER_SIGNATURE)) {
byte[] headerBytes = Arrays.copyOf(packet.getData(), packet.limit());
int channelCount = OpusUtil.getChannelCount(headerBytes);
List<byte[]> initializationData = OpusUtil.buildInitializationData(headerBytes);
// The ID header must come at the start of the file:
// https://datatracker.ietf.org/doc/html/rfc7845#section-3
checkState(setupData.format == null);
setupData.format =
new Format.Builder()
.setSampleMimeType(MimeTypes.AUDIO_OPUS)
@@ -69,13 +67,33 @@ import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
.setSampleRate(OpusUtil.SAMPLE_RATE)
.setInitializationData(initializationData)
.build();
headerRead = true;
return true;
} else if (peekPacketStartsWith(packet, OPUS_COMMENT_HEADER_SIGNATURE)) {
// The comment header must come immediately after the ID header, so the format will already
// be populated: https://datatracker.ietf.org/doc/html/rfc7845#section-3
checkStateNotNull(setupData.format);
packet.skipBytes(OPUS_COMMENT_HEADER_SIGNATURE.length);
VorbisUtil.CommentHeader commentHeader =
VorbisUtil.readVorbisCommentHeader(
packet, /* hasMetadataHeader= */ false, /* hasFramingBit= */ false);
@Nullable
Metadata vorbisMetadata =
VorbisUtil.parseVorbisComments(ImmutableList.copyOf(commentHeader.comments));
if (vorbisMetadata == null) {
return true;
}
setupData.format =
setupData
.format
.buildUpon()
.setMetadata(vorbisMetadata.copyWithAppendedEntriesFrom(setupData.format.metadata))
.build();
return true;
} else {
checkNotNull(setupData.format); // Has been set when the header was read.
boolean headerPacket = packet.readInt() == OPUS_CODE;
packet.setPosition(0);
return headerPacket;
// The ID header must come at the start of the file, so the format must already be populated:
// https://datatracker.ietf.org/doc/html/rfc7845#section-3
checkStateNotNull(setupData.format);
return false;
}
}
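The branch structure above enforces the packet order required by RFC 7845 section 3: the ID header ("OpusHead") must open the stream, the comment header ("OpusTags") must follow it, and everything after is audio. A minimal sketch of that three-way dispatch, with a hypothetical classify helper standing in for peekPacketStartsWith:

```java
/**
 * Sketch of the RFC 7845 packet ordering that readHeaders enforces:
 * OpusHead first, OpusTags second, then audio packets.
 */
public class OpusHeaderOrder {
  enum Packet { ID_HEADER, COMMENT_HEADER, AUDIO }

  static Packet classify(byte[] packet) {
    if (startsWith(packet, "OpusHead")) return Packet.ID_HEADER;
    if (startsWith(packet, "OpusTags")) return Packet.COMMENT_HEADER;
    return Packet.AUDIO;
  }

  static boolean startsWith(byte[] data, String prefix) {
    byte[] p = prefix.getBytes(java.nio.charset.StandardCharsets.US_ASCII);
    if (data.length < p.length) return false;
    for (int i = 0; i < p.length; i++) {
      if (data[i] != p[i]) return false;
    }
    return true;
  }

  public static void main(String[] args) {
    System.out.println(classify("OpusHead".getBytes()));   // ID_HEADER
    System.out.println(classify(new byte[] {1, 2, 3}));    // AUDIO
  }
}
```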
@@ -114,4 +132,22 @@ import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
}
return (long) frames * length;
}
/**
* Returns true if the given {@link ParsableByteArray} starts with {@code expectedPrefix}. Does
* not change the {@link ParsableByteArray#getPosition() position} of {@code packet}.
*
* @param packet The packet data.
* @param expectedPrefix The expected starting bytes of the packet.
* @return True if the packet starts with {@code expectedPrefix}, false otherwise.
*/
private static boolean peekPacketStartsWith(ParsableByteArray packet, byte[] expectedPrefix) {
if (packet.bytesLeft() < expectedPrefix.length) {
return false;
}
int startPosition = packet.getPosition();
byte[] header = new byte[expectedPrefix.length];
packet.readBytes(header, 0, expectedPrefix.length);
packet.setPosition(startPosition);
return Arrays.equals(header, expectedPrefix);
}
}
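Outside of ParsableByteArray, the same peek-without-consuming check can be sketched over a plain byte array and an explicit position (class and method names hypothetical):

```java
import java.util.Arrays;

/**
 * Standalone sketch of the prefix peek above, using a plain byte array and an
 * offset in place of ParsableByteArray's internal position.
 */
public class PrefixPeek {
  static boolean startsWith(byte[] data, int position, byte[] expectedPrefix) {
    if (data.length - position < expectedPrefix.length) {
      return false; // Not enough bytes left to hold the prefix.
    }
    // Compare in place; the caller's read position is never advanced.
    return Arrays.equals(
        Arrays.copyOfRange(data, position, position + expectedPrefix.length), expectedPrefix);
  }

  public static void main(String[] args) {
    byte[] idHeader = {'O', 'p', 'u', 's', 'H', 'e', 'a', 'd', 1, 2};
    System.out.println(startsWith(idHeader, 0, new byte[] {'O', 'p', 'u', 's'})); // true
    System.out.println(startsWith(idHeader, 0, "OpusTags".getBytes()));           // false
  }
}
```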


@@ -24,8 +24,10 @@ import com.google.android.exoplayer2.Format;
import com.google.android.exoplayer2.ParserException;
import com.google.android.exoplayer2.extractor.VorbisUtil;
import com.google.android.exoplayer2.extractor.VorbisUtil.Mode;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.util.MimeTypes;
import com.google.android.exoplayer2.util.ParsableByteArray;
import com.google.common.collect.ImmutableList;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
@@ -111,6 +113,10 @@ import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
codecInitializationData.add(idHeader.data);
codecInitializationData.add(vorbisSetup.setupHeaderData);
@Nullable
Metadata metadata =
VorbisUtil.parseVorbisComments(ImmutableList.copyOf(vorbisSetup.commentHeader.comments));
setupData.format =
new Format.Builder()
.setSampleMimeType(MimeTypes.AUDIO_VORBIS)
@@ -119,6 +125,7 @@ import org.checkerframework.checker.nullness.qual.EnsuresNonNullIf;
.setChannelCount(idHeader.channels)
.setSampleRate(idHeader.sampleRate)
.setInitializationData(codecInitializationData)
.setMetadata(metadata)
.build();
return true;
}


@@ -22,9 +22,11 @@ import android.os.Parcelable;
import androidx.annotation.Nullable;
import com.google.android.exoplayer2.MediaMetadata;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.util.ParsableByteArray;
import com.google.common.base.Charsets;
import java.util.Arrays;
/** A picture parsed from a FLAC file. */
/** A picture parsed from a Vorbis Comment or a FLAC picture block. */
public final class PictureFrame implements Metadata.Entry {
/** The type of the picture. */
@@ -134,6 +136,35 @@ public final class PictureFrame implements Metadata.Entry {
return 0;
}
/**
* Parses a {@code METADATA_BLOCK_PICTURE} into a {@code PictureFrame} instance.
*
* <p>{@code pictureBlock} may be read directly from a <a
* href="https://xiph.org/flac/format.html#metadata_block_picture">FLAC file</a>, or decoded from
* the base64 content of a <a
* href="https://wiki.xiph.org/VorbisComment#METADATA_BLOCK_PICTURE">Vorbis Comment</a>.
*
* @param pictureBlock The data of the {@code METADATA_BLOCK_PICTURE}, not including any headers.
* @return A {@code PictureFrame} parsed from {@code pictureBlock}.
*/
public static PictureFrame fromPictureBlock(ParsableByteArray pictureBlock) {
int pictureType = pictureBlock.readInt();
int mimeTypeLength = pictureBlock.readInt();
String mimeType = pictureBlock.readString(mimeTypeLength, Charsets.US_ASCII);
int descriptionLength = pictureBlock.readInt();
String description = pictureBlock.readString(descriptionLength);
int width = pictureBlock.readInt();
int height = pictureBlock.readInt();
int depth = pictureBlock.readInt();
int colors = pictureBlock.readInt();
int pictureDataLength = pictureBlock.readInt();
byte[] pictureData = new byte[pictureDataLength];
pictureBlock.readBytes(pictureData, 0, pictureDataLength);
return new PictureFrame(
pictureType, mimeType, description, width, height, depth, colors, pictureData);
}
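The field layout read above can be sketched independently with DataInputStream, whose readInt is also big-endian; parseTypeAndMime is a hypothetical helper covering only the first two fields of the block, not ExoPlayer API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

/**
 * Sketch of the METADATA_BLOCK_PICTURE layout parsed above, using
 * DataInputStream's big-endian reads in place of ParsableByteArray.
 */
public class PictureBlockSketch {
  static String[] parseTypeAndMime(byte[] block) {
    try {
      DataInputStream in = new DataInputStream(new ByteArrayInputStream(block));
      int pictureType = in.readInt();       // e.g. 3 = front cover
      byte[] mime = new byte[in.readInt()]; // MIME type length, then ASCII bytes
      in.readFully(mime);
      return new String[] {
        String.valueOf(pictureType), new String(mime, StandardCharsets.US_ASCII)
      };
    } catch (IOException e) {
      throw new IllegalArgumentException("Malformed picture block", e);
    }
  }

  public static void main(String[] args) throws IOException {
    // Build a minimal synthetic block: type 3, MIME "image/png".
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(bytes);
    out.writeInt(3);
    out.writeInt(9);
    out.write("image/png".getBytes(StandardCharsets.US_ASCII));
    String[] parsed = parseTypeAndMime(bytes.toByteArray());
    System.out.println(parsed[0] + " " + parsed[1]); // 3 image/png
  }
}
```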
public static final Parcelable.Creator<PictureFrame> CREATOR =
new Parcelable.Creator<PictureFrame>() {


@@ -23,8 +23,9 @@ import androidx.annotation.Nullable;
import com.google.android.exoplayer2.MediaMetadata;
import com.google.android.exoplayer2.metadata.Metadata;
/** A vorbis comment. */
public final class VorbisComment implements Metadata.Entry {
/** @deprecated Use {@link com.google.android.exoplayer2.metadata.vorbis.VorbisComment} instead. */
@Deprecated
public class VorbisComment implements Metadata.Entry {
/** The key. */
public final String key;
@@ -41,7 +42,7 @@ public final class VorbisComment implements Metadata.Entry {
this.value = value;
}
/* package */ VorbisComment(Parcel in) {
protected VorbisComment(Parcel in) {
this.key = castNonNull(in.readString());
this.value = castNonNull(in.readString());
}


@@ -0,0 +1,49 @@
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.android.exoplayer2.metadata.vorbis;
import android.os.Parcel;
/** A vorbis comment, extracted from a FLAC or Ogg file. */
@SuppressWarnings("deprecation") // Extending deprecated type for backwards compatibility.
public final class VorbisComment extends com.google.android.exoplayer2.metadata.flac.VorbisComment {
/**
* @param key The key.
* @param value The value.
*/
public VorbisComment(String key, String value) {
super(key, value);
}
/* package */ VorbisComment(Parcel in) {
super(in);
}
public static final Creator<VorbisComment> CREATOR =
new Creator<VorbisComment>() {
@Override
public VorbisComment createFromParcel(Parcel in) {
return new VorbisComment(in);
}
@Override
public VorbisComment[] newArray(int size) {
return new VorbisComment[size];
}
};
}

View file

@@ -0,0 +1,19 @@
/*
* Copyright (C) 2019 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
@NonNullApi
package com.google.android.exoplayer2.metadata.vorbis;
import com.google.android.exoplayer2.util.NonNullApi;


@@ -25,7 +25,7 @@ import com.google.android.exoplayer2.extractor.FlacMetadataReader.FlacStreamMeta
import com.google.android.exoplayer2.extractor.flac.FlacConstants;
import com.google.android.exoplayer2.metadata.Metadata;
import com.google.android.exoplayer2.metadata.flac.PictureFrame;
import com.google.android.exoplayer2.metadata.flac.VorbisComment;
import com.google.android.exoplayer2.metadata.vorbis.VorbisComment;
import com.google.android.exoplayer2.testutil.FakeExtractorInput;
import com.google.android.exoplayer2.testutil.TestUtil;
import com.google.android.exoplayer2.util.ParsableByteArray;

Some files were not shown because too many files have changed in this diff.