Related posts
ExoPlayer Analysis (1): Entering the World of ExoPlayer
ExoPlayer Analysis (2): Writing an ExoPlayer Demo
ExoPlayer Analysis (3): Flow Analysis — the Creation Flow from build to prepare
ExoPlayer Analysis (4): From renderer.render down to MediaCodec
ExoPlayer Analysis (5): ExoPlayer's Handling of AudioTrack
ExoPlayer Analysis (6): ExoPlayer's Synchronization Mechanism
ExoPlayer Analysis (7): ExoPlayer's Handling of Audio Timestamps
ExoPlayer Extensions (1): An Introduction to DASH and HLS Streams
I. Introduction:
In the previous post we analyzed ExoPlayer's handling of AudioTrack: how the track is created, how media data is read into the codec, and how the decoded PCM data is written from the codec to AudioTrack. This post analyzes the video frame-release mechanism, which is in fact the synchronization mechanism of ExoPlayer as a whole.
II. Synchronization Mechanism Analysis:
1. Obtaining an accurate audio/video timestamp gap:
ExoPlayer's synchronization principle is that video chases audio. The audio PTS is obtained through AudioTrack's API and then passes through a fairly involved calibration step before it is used as the final audio timestamp that video is synchronized against. This post does not discuss how the audio timestamp is calibrated; instead we trace the entry point of the synchronization logic.
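The "video chases audio" principle can be sketched with a minimal, self-contained decision function. This is illustrative only — the class name, the action constants, and the 30 ms/50 ms thresholds are assumptions for this sketch, not ExoPlayer's API:

```java
// Sketch of the core A/V sync decision: compare the decoded frame's PTS with
// the audio-derived playback position and decide what to do with the frame.
public final class AvSyncSketch {

    public static final int ACTION_DROP = 0;   // frame is far too late: drop it
    public static final int ACTION_RENDER = 1; // frame is on time (or only slightly late): render now
    public static final int ACTION_WAIT = 2;   // frame is early: keep it and try again later

    /**
     * @param framePtsUs      presentation timestamp of the decoded video frame, in microseconds
     * @param audioPositionUs current playback position derived from AudioTrack, in microseconds
     */
    public static int chooseAction(long framePtsUs, long audioPositionUs) {
        long earlyUs = framePtsUs - audioPositionUs; // > 0: frame is early; < 0: frame is late
        if (earlyUs < -30_000) {        // more than 30 ms late (hypothetical drop threshold)
            return ACTION_DROP;
        } else if (earlyUs > 50_000) {  // more than 50 ms early (hypothetical wait threshold)
            return ACTION_WAIT;
        }
        return ACTION_RENDER;
    }

    public static void main(String[] args) {
        System.out.println(chooseAction(1_000_000, 1_040_000)); // 40 ms late  -> ACTION_DROP (0)
        System.out.println(chooseAction(1_000_000, 1_010_000)); // 10 ms late  -> ACTION_RENDER (1)
        System.out.println(chooseAction(1_100_000, 1_000_000)); // 100 ms early -> ACTION_WAIT (2)
    }
}
```

ExoPlayer's real logic, analyzed below in MediaCodecVideoRenderer.processOutputBuffer(), distinguishes more states (skip, drop, force-render), but the core comparison is the same earlyUs subtraction.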
Let's return to a familiar place, ExoPlayer's main work loop doSomeWork:
doSomeWork@ExoPlayerImplInternal:
private void doSomeWork() throws ExoPlaybackException, IOException {
/* 1. Update the audio timestamp */
updatePlaybackPositions();
...
if (playingPeriodHolder.prepared) {
/* 2. Get the current system time */
long rendererPositionElapsedRealtimeUs = SystemClock.elapsedRealtime() * 1000;
...
/* 3. The core processing method */
renderer.render(rendererPositionUs, rendererPositionElapsedRealtimeUs);
...
}
...
}
updatePlaybackPositions() updates and calibrates the current audio timestamp; the result is rendererPositionUs, the first argument passed to renderer.render(). SystemClock.elapsedRealtime() is an Android framework method that returns the number of milliseconds since the device booted, so multiplying it by 1000 yields the current system time in microseconds. These two times are handed to renderer.render() for synchronization. Inside MediaCodecRenderer, render() ends up calling drainOutputBuffer(), so that is where we go next:
drainOutputBuffer@MediaCodecRenderer:
private boolean drainOutputBuffer(long positionUs, long elapsedRealtimeUs)
throws ExoPlaybackException {
...
try {
processedOutputBuffer =
/* Process the decoded output buffer */
processOutputBuffer(
positionUs,
elapsedRealtimeUs,
codec,
outputBuffer,
outputIndex,
outputBufferInfo.flags,
/* sampleCount= */ 1,
outputBufferInfo.presentationTimeUs,
isDecodeOnlyOutputBuffer,
isLastOutputBuffer,
outputFormat);
} catch (IllegalStateException e) {
processEndOfStream();
if (outputStreamEnded) {
// Release the codec, as it's in an error state.
releaseCodec();
}
return false;
}
...
}
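Both timestamps from doSomeWork flow through drainOutputBuffer() into processOutputBuffer(): positionUs is a media position, and elapsedRealtimeUs is the system-clock reading taken when that position was computed. The relationship between the two clocks — a media position extrapolated from a system-clock reading — can be sketched with a standalone class (illustrative only, not ExoPlayer code; all names here are assumptions):

```java
// Sketch of a media clock: remember a (media position, system time) pair at a
// sync point, then extrapolate the media position from the system clock,
// scaled by the playback speed.
public final class MediaClockSketch {
    private final long baseMediaPositionUs;   // media position at the last sync point (us)
    private final long baseElapsedRealtimeUs; // system time at the last sync point (us)
    private final float speed;                // playback speed factor

    public MediaClockSketch(long mediaPositionUs, long elapsedRealtimeUs, float speed) {
        this.baseMediaPositionUs = mediaPositionUs;
        this.baseElapsedRealtimeUs = elapsedRealtimeUs;
        this.speed = speed;
    }

    /** Extrapolated media position at the given system time, in microseconds. */
    public long positionUs(long nowElapsedRealtimeUs) {
        long elapsedUs = nowElapsedRealtimeUs - baseElapsedRealtimeUs;
        return baseMediaPositionUs + (long) (elapsedUs * speed);
    }

    public static void main(String[] args) {
        MediaClockSketch clock = new MediaClockSketch(1_000_000L, 5_000_000L, 1.0f);
        System.out.println(clock.positionUs(5_100_000L)); // 100 ms later at 1x -> 1_100_000
    }
}
```

ExoPlayer's StandaloneMediaClock follows the same idea between audio-timestamp updates, which is why doSomeWork captures both a media position and an elapsedRealtime reading and passes the pair all the way down.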
This time we step into the processOutputBuffer() override in MediaCodecVideoRenderer:
@Override
protected boolean processOutputBuffer(
long positionUs,
long elapsedRealtimeUs,
@Nullable MediaCodec codec,
@Nullable ByteBuffer buffer,
int bufferIndex,
int bufferFlags,
int sampleCount,
long bufferPresentationTimeUs,
boolean isDecodeOnlyBuffer,
boolean isLastBuffer,
Format format)
throws ExoPlaybackException {
Assertions.checkNotNull(codec); // Can not render video without codec
if (initialPositionUs == C.TIME_UNSET) {
initialPositionUs = positionUs;
}
/* 1. Used as the base timestamp offset; usually 0 */
long outputStreamOffsetUs = getOutputStreamOffsetUs();
/* 2. Adjust the current frame's timestamp from the video buffer queue */
long presentationTimeUs = bufferPresentationTimeUs - outputStreamOffsetUs;
if (isDecodeOnlyBuffer && !isLastBuffer) {
skipOutputBuffer(codec, bufferIndex, presentationTimeUs);
return true;
}
/* 3. The gap between the video and audio timestamps */
long earlyUs = bufferPresentationTimeUs - positionUs;
if (surface == dummySurface) {
// Skip frames in sync with playback, so we'll be at the right frame if the mode changes.
if (isBufferLate(earlyUs)) {
skipOutputBuffer(codec, bufferIndex, presentationTimeUs);
updateVideoFrameProcessingOffsetCounters(earlyUs);
return true;
}
return false;
}
long elapsedRealtimeNowUs = SystemClock.elapsedRealtime() * 1000;
/* 4. Time since the last render = current system time - render time of the previous frame */
long elapsedSinceLastRenderUs = elapsedRealtimeNowUs - lastRenderTimeUs;
boolean isStarted = getState() == STATE_STARTED;
boolean shouldRenderFirstFrame =
!renderedFirstFrameAfterEnable
? (isStarted || mayRenderFirstFrameAfterEnableIfNotStarted)
: !renderedFirstFrameAfterReset;
// Don't force output until we joined and the position reached the current stream.
boolean forceRenderOutputBuffer =
joiningDeadlineMs == C.TIME_UNSET
&& positionUs >= outputStreamOffsetUs
&& (shouldRenderFirstFrame
|| (isStarted && shouldForceRenderOutputBuffer(earlyUs, elapsedSinceLastRenderUs)));
if (forceRenderOutputBuffer) {
long releaseTimeNs = System.nanoTime();
notifyFrameMetadataListener(presentationTimeUs, releaseTimeNs, format);
if (Util.SDK_INT >= 21) {
ren