Video Playback Flow in the Stagefright Framework
2013-12-21 14:56
1. Create the player engine
// Set the data source and the audio sink
MediaPlayer::setDataSource(PATH_TO_FILE)->
MediaPlayerService::create->
MediaPlayerService::Client::setDataSource->
GetPlayerType->
MediaPlayerService::Client::createPlayer->
StagefrightPlayer::setAudioSink->
StagefrightPlayer::setDataSource->
Create MediaPlayerImpl (AwesomePlayer)->
MediaPlayerImpl::MediaPlayerImpl
PlayerType:
PV_PLAYER ------------------ PVPlayer from OpenCore, the default multimedia framework before Android 2.2. Removed from the Android source as of 2.3, when Stagefright took over, though chip vendors may still add it back. Located under external/opencore.
SONIVOX_PLAYER ------------- MidiFile (MIDI formats)
STAGEFRIGHT_PLAYER --------- StagefrightPlayer
NU_PLAYER ------------------ NuPlayer (streaming player)
TEST_PLAYER ---------------- TestPlayerStub (only for 'test' and 'eng' builds)
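The player-type dispatch above can be sketched as follows. This is a minimal illustration only: the enum names mirror the list above, but the matching rules (URL scheme / file extension) are simplified assumptions, not the real GetPlayerType implementation.

```cpp
#include <cassert>
#include <cstring>
#include <string>

enum PlayerType {
    PV_PLAYER,           // legacy OpenCore (gone as of 2.3)
    SONIVOX_PLAYER,      // MIDI
    STAGEFRIGHT_PLAYER,  // default local playback
    NU_PLAYER,           // streaming
    TEST_PLAYER,         // test/eng builds only
};

// Hypothetical stand-in for GetPlayerType: pick an engine from the URL.
PlayerType getPlayerType(const std::string& url) {
    auto endsWith = [&](const char* suf) {
        size_t n = strlen(suf);
        return url.size() >= n && url.compare(url.size() - n, n, suf) == 0;
    };
    if (url.rfind("rtsp://", 0) == 0 || endsWith(".m3u8"))
        return NU_PLAYER;             // streaming goes to NuPlayer
    if (endsWith(".mid") || endsWith(".rtttl"))
        return SONIVOX_PLAYER;        // MIDI goes to the Sonivox engine
    return STAGEFRIGHT_PLAYER;        // everything else: StagefrightPlayer
}
```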
// The following establishes the initial connection to the OpenMAX plugin.
AwesomePlayer: mClient.connect()->
OMXClient::connect->
MediaPlayerService::getOMX()->
OMXMaster::OMXMaster: addVendorPlugin()->
addPlugin(createOMXPlugin())->
createOMXPlugin() {
new TIOMXPlugin;
}
2. Parse the content referenced by mUri; the header determines the matching Extractor
AwesomePlayer::prepare()
AwesomePlayer::prepareAsync_l()->
This function starts mQueue, which serves as the event handler (Stagefright is event-driven).
AwesomePlayer::finishSetDataSource_l()->
MediaExtractor::Create(datasource)->
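Conceptually, MediaExtractor::Create lets each extractor "sniff" the first bytes of the stream and picks the best match. A hedged sketch: the magic numbers below are real container signatures, but the dispatch is a simplification of the actual sniffer-scoring mechanism.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Map a file's leading bytes to an extractor name (illustrative only).
std::string sniffContainer(const std::vector<uint8_t>& header) {
    if (header.size() >= 12 && memcmp(&header[4], "ftyp", 4) == 0)
        return "MPEG4Extractor";     // ISO BMFF / MP4: 'ftyp' at offset 4
    if (header.size() >= 4 && memcmp(&header[0], "\x1A\x45\xDF\xA3", 4) == 0)
        return "MatroskaExtractor";  // EBML magic: MKV / WebM
    if (header.size() >= 4 && memcmp(&header[0], "RIFF", 4) == 0)
        return "AVIExtractor";       // RIFF container (AVI/WAV)
    return "unknown";
}
```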
3. Use the extractor to demux the file into A/V tracks (mVideoTrack/mAudioTrack)
AwesomePlayer::setDataSource_l(extractor)->
AwesomePlayer::setVideoSource()->
AwesomePlayer::setAudioSource()->
mVideoTrack = source
mAudioTrack = source
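The A/V split amounts to walking the extractor's tracks and keeping the first video and first audio track. A minimal sketch; Track is a hypothetical stand-in for the real code, which queries each track's MetaData for its MIME type.

```cpp
#include <cassert>
#include <string>
#include <vector>

struct Track { std::string mime; int index; };

// Pick the first video and first audio track, as setDataSource_l does
// via setVideoSource()/setAudioSource(). -1 means no such track.
void selectTracks(const std::vector<Track>& tracks,
                  int* videoTrack, int* audioTrack) {
    *videoTrack = *audioTrack = -1;
    for (const Track& t : tracks) {
        if (*videoTrack < 0 && t.mime.rfind("video/", 0) == 0)
            *videoTrack = t.index;   // mVideoTrack = source
        else if (*audioTrack < 0 && t.mime.rfind("audio/", 0) == 0)
            *audioTrack = t.index;   // mAudioTrack = source
    }
}
```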
4. Select the video decoder (mVideoSource) based on the codec type in mVideoTrack
AwesomePlayer::initVideoDecoder()->
mVideoSource->start(); (initializes the decoder)
OMXCodec::Create()->
Match codecs by codec type; software codecs are placed at the front of the matching list so they are tried first, i.e. a software codec is created preferentially.
sp<MediaSource> softwareCodec = InstantiateSoftwareCodec(componentName, source)->
If no software codec matches, fall back to the OMX codec implemented in hardware:
omx->allocateNode(componentName, ...)->
sp<OMXCodec> codec = new OMXCodec(...)->
observer->setCodec(codec)->
err = codec->configureCodec(meta, flags)->
return codec.
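The software-first ordering in step 4 can be sketched like this. The naming convention is an assumption for illustration: in this era of Stagefright, software decoders had plain names ("AVCDecoder", "MP3Decoder") handled by InstantiateSoftwareCodec, while OMX components start with "OMX." and go through omx->allocateNode().

```cpp
#include <cassert>
#include <string>
#include <vector>

// Heuristic: anything not named "OMX.*" is a built-in software decoder.
bool isSoftwareCodec(const std::string& name) {
    return name.rfind("OMX.", 0) != 0;
}

// Reorder the matched components so software codecs are tried first,
// mirroring the priority described above.
std::vector<std::string> orderCandidates(const std::vector<std::string>& m) {
    std::vector<std::string> ordered;
    for (const auto& n : m) if (isSoftwareCodec(n)) ordered.push_back(n);
    for (const auto& n : m) if (!isSoftwareCodec(n)) ordered.push_back(n);
    return ordered;
}
```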
5. Select the audio decoder (mAudioSource) based on the codec type in mAudioTrack
AwesomePlayer::initAudioDecoder()->
mAudioSource->start(); (initializes the decoder)
OMXCodec::Create()->
Match codecs by codec type; software codecs are placed at the front of the matching list so they are tried first, i.e. a software codec is created preferentially.
sp<MediaSource> softwareCodec = InstantiateSoftwareCodec(componentName, source)->
If no software codec matches, fall back to the OMX codec implemented in hardware:
omx->allocateNode(componentName, ...)->
sp<OMXCodec> codec = new OMXCodec(...)->
observer->setCodec(codec)->
err = codec->configureCodec(meta, flags)->
return codec.
6. Create the AudioPlayer, decode, and open the audio output to play the audio data
AwesomePlayer::play_l->
mAudioPlayer = new AudioPlayer(mAudioSink, this);
mAudioPlayer->setSource(mAudioSource);
mAudioPlayer->start->
mSource->read(&mFirstBuffer); (during AudioPlayer startup, an initial chunk of decoded data is read first.)
mAudioSink->open(..., &AudioPlayer::AudioSinkCallback, ...);
AudioSinkCallback {
me->fillBuffer(buffer, size)
}
This opens the audio output and registers the AudioPlayer's callback with it; from then on, every time the callback is invoked, the AudioPlayer reads the data decoded by the audio decoder.
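The pull model described above can be sketched as follows. DecodedSource is a hypothetical stand-in for the decoder's output; the real flow goes through mSource->read() inside AudioPlayer::fillBuffer.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <cstring>
#include <vector>

// Stand-in for the decoded-audio source the AudioPlayer reads from.
struct DecodedSource {
    std::vector<char> pcm;
    size_t pos = 0;
    // Copy up to `size` decoded bytes into `out`; return bytes copied.
    size_t read(void* out, size_t size) {
        size_t n = std::min(size, pcm.size() - pos);
        memcpy(out, pcm.data() + pos, n);
        pos += n;
        return n;
    }
};

// Mirrors AudioSinkCallback -> me->fillBuffer(buffer, size): the sink
// pulls; the player fills from the decoder each time it is called.
size_t audioSinkCallback(DecodedSource* me, void* buffer, size_t size) {
    return me->read(buffer, size);
}
```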
7. Select the renderer based on the codec type
AwesomePlayer::start->
postVideoEvent_l();
AwesomePlayer::onVideoEvent()->
mVideoSource->read(&mVideoBuffer, &options)->
AwesomePlayer::initRenderer_l()->
Determine the codec type:
Hardware codec:
mVideoRenderer = new AwesomeNativeWindowRenderer(mSurface, rotationDegrees);
AwesomeNativeWindowRenderer::render() (hook called by EGL)->
A hardware codec needs no color conversion; buffers are pushed directly to the native window.
Software codec:
mVideoRenderer = new AwesomeLocalRenderer(mSurface, meta)->
mVideoRenderer = new SoftwareRenderer()->
SoftwareRenderer::render()->
AwesomePlayer::onVideoEvent()->
[Check timestamp]
mVideoRenderer->render(mVideoBuffer);
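The renderer choice can be sketched by component name. This is an assumption standing in for the real check in initRenderer_l: a hardware OMX codec's buffers can be queued straight to the native window (AwesomeNativeWindowRenderer, no color conversion), while a software decoder's YUV output goes through SoftwareRenderer, which color-converts before display.

```cpp
#include <cassert>
#include <string>

// Pick a renderer from the decoder component name (illustrative).
std::string pickRenderer(const std::string& component) {
    bool softwareCodec =
        component.rfind("OMX.", 0) != 0 ||          // plain software decoder
        component.rfind("OMX.google.", 0) == 0;     // Google software OMX codec
    return softwareCodec ? "SoftwareRenderer"
                         : "AwesomeNativeWindowRenderer";
}
```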
8. Audio/Video synchronization
In Stagefright the audio data flow is driven by the callback, while the video side fetches the audio timestamp in onVideoEvent and synchronizes against it.
AudioPlayer::fillBuffer()->
mPositionTimeMediaUs is the timestamp carried in the data,
mPositionTimeRealUs is the actual playback time.
AwesomePlayer::onVideoEvent()->
mTimeSourceDeltaUs = realTimeUs - mediaTimeUs
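The sync arithmetic in step 8 can be worked through numerically. A sketch, assuming the usual convention: onVideoEvent converts the time-source reading back into media time using the delta derived from the audio clock, then measures how late the next video frame is (positive: the frame is late and may be dropped; negative: wait before rendering).

```cpp
#include <cassert>
#include <cstdint>

// All values in microseconds.
// timeSourceDeltaUs = realTimeUs - mediaTimeUs, taken from the audio clock.
int64_t frameLatenessUs(int64_t mediaTimeUs,       // video frame timestamp
                        int64_t nowUs,             // current time-source reading
                        int64_t timeSourceDeltaUs) {
    int64_t nowMediaUs = nowUs - timeSourceDeltaUs;  // "now" in media time
    return nowMediaUs - mediaTimeUs;                 // how late the frame is
}
```

For example, with a frame stamped at 1.0 s, the time source at 1.5 s, and a delta of 0.4 s, the frame is 0.1 s late.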