Android CameraHal: CameraAdapter (Part 1)
2013-09-22 10:31
FrameNotifier inherits from the interface class MessageNotifier, so CameraAdapter must implement all interfaces of both classes. The foundation of the call chain is enableMsgType, which is used to enable message types and register the corresponding callbacks.
1. enableMsgType
As the diagram above shows, the interface function enableMsgType declared in MessageNotifier is ultimately implemented by BaseCameraAdapter, and the MessageNotifier interface class is referenced by EventProvider. EventProvider in turn is used by two classes — AppCallbackNotifier and CameraHal. EventProvider works much like FrameProvider.
First, look at the EventProvider usage initiated from CameraHal:
1.
When the HAL's setParameters is called, setEventProvider() is invoked if the following condition holds:
```cpp
ExCameraParameters::KEY_TEMP_BRACKETING != NULL && strcmp(valstr, ExCameraParameters::BRACKET_ENABLE) == 0
```
In other words, it is only used when the BRACKETING feature is enabled.
BRACKETING: capturing the same scene with different exposure parameters, used for image fusion. For example, take a wide grassland with mountains in the distance: a single set of exposure parameters will inevitably render the shot poorly, but if the near grassland and the distant mountains are each captured with their own exposure parameters and the shots are then fused, both can be rendered well.
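The idea can be sketched as a small helper that produces the exposure-compensation (EV) offsets for a bracketed burst. This is purely illustrative — the function name, step size, and shot count are not from the HAL:

```cpp
#include <vector>

// Generate a symmetric set of EV offsets around the metered exposure,
// e.g. 3 shots with a 1.0 EV step -> {-1.0, 0.0, +1.0}. Each shot of the
// same scene is taken at one of these offsets and the results are fused.
inline std::vector<float> bracketEVs(int shots, float stepEv) {
    std::vector<float> evs;
    int half = shots / 2;
    for (int i = -half; i <= half; ++i)
        evs.push_back(i * stepEv);
    return evs;
}
```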
setEventProvider is called with ALL_EVENTS and mCameraAdapter as parameters.
ALL_EVENTS is defined as follows:

```cpp
enum CameraHalEventType {
    NO_EVENTS = 0x0,
    EVENT_FOCUS_LOCKED = 0x1,
    EVENT_FOCUS_ERROR = 0x2,
    EVENT_ZOOM_INDEX_REACHED = 0x4,
    EVENT_SHUTTER = 0x8,
    EVENT_FACE = 0x10,
    ///@remarks Future enum related to display, like frame displayed event, could be added here
    ALL_EVENTS = 0xFFFF ///Maximum of 16 event types supported
};
```
2~4.
In setEventProvider(), an EventProvider is constructed and registered with ALL_EVENTS.
```cpp
void CameraHal::setEventProvider(int32_t eventMask, MessageNotifier *eventNotifier)
{
    if (NULL != mEventProvider) {
        mEventProvider->disableEventNotification(CameraHalEvent::ALL_EVENTS);
        delete mEventProvider;
        mEventProvider = NULL;
    }

    mEventProvider = new EventProvider(eventNotifier, this, eventCallbackRelay);
    if (NULL == mEventProvider) {
        CAMHAL_LOGEA("Error in creating EventProvider");
    } else {
        mEventProvider->enableEventNotification(eventMask);
    }
}
```
mEventProvider = new EventProvider(eventNotifier, this, eventCallbackRelay);
The first argument is mCameraAdapter (after a type cast), and the third is the HAL's eventCallbackRelay. Nothing special happens here — just initialization and assignment.
mEventProvider->enableEventNotification(eventMask);
```cpp
int EventProvider::enableEventNotification(int32_t frameTypes)
{
    status_t ret = NO_ERROR;
    ///Enable the frame notification to CameraAdapter (which implements FrameNotifier interface)
    mEventNotifier->enableMsgType(frameTypes << MessageNotifier::EVENT_BIT_FIELD_POSITION,
                                  NULL,
                                  mEventCallback,
                                  mCookie);
    return ret;
}
```
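A minimal sketch of the bit-field trick used here: frame types occupy the low bits of the 32-bit message mask, and event types are shifted into the high bits, so both can share one enableMsgType() mask without collisions. The shift amount of 16 is an assumption based on the "Maximum of 16 event types supported" comment in the enum:

```cpp
#include <cstdint>

// Assumed value of MessageNotifier::EVENT_BIT_FIELD_POSITION.
constexpr int kEventBitFieldPosition = 16;

// Shift event-type bits (low 16 bits of CameraHalEventType) into the
// upper half of the shared 32-bit message mask.
inline uint32_t eventMaskToMsgMask(uint32_t eventTypes) {
    return eventTypes << kEventBitFieldPosition;
}

// A message is an event message if any high-half bit is set.
inline bool isEventMsg(uint32_t msgs) {
    return (msgs >> kEventBitFieldPosition) != 0;
}
```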
mEventNotifier is mCameraAdapter after the cast, so this enableMsgType call ends up in the BaseCameraAdapter implementation.
5. enableMsgType in BaseCameraAdapter
```cpp
void BaseCameraAdapter::enableMsgType(int32_t msgs, frame_callback callback, event_callback eventCb, void *cookie)
{
    Mutex::Autolock lock(mSubscriberLock);

    LOG_FUNCTION_NAME;

    if (CameraFrame::PREVIEW_FRAME_SYNC == msgs) {
        mFrameSubscribers.add((int)cookie, callback);
    } else if (CameraFrame::FRAME_DATA_SYNC == msgs) {
        mFrameDataSubscribers.add((int)cookie, callback);
    } else if (CameraFrame::IMAGE_FRAME == msgs) {
        mImageSubscribers.add((int)cookie, callback);
    } else if (CameraFrame::RAW_FRAME == msgs) {
        mRawSubscribers.add((int)cookie, callback);
    } else if (CameraFrame::VIDEO_FRAME_SYNC == msgs) {
        mVideoSubscribers.add((int)cookie, callback);
    } else if (CameraHalEvent::ALL_EVENTS == msgs) {
        mFocusSubscribers.add((int)cookie, eventCb);
        mShutterSubscribers.add((int)cookie, eventCb);
        mZoomSubscribers.add((int)cookie, eventCb);
        mFaceSubscribers.add((int)cookie, eventCb);
    } else {
        CAMHAL_LOGEA("Message type subscription no supported yet!");
    }

    LOG_FUNCTION_NAME_EXIT;
}
```
Note that for ALL_EVENTS, the individual CameraHalEvent types are not distinguished the way the CameraFrame types are.
Now let's go back and look at the callback implementation:
```cpp
void CameraHal::eventCallback(CameraHalEvent *event)
{
    if (NULL != event) {
        switch (event->mEventType) {
        case CameraHalEvent::EVENT_FOCUS_LOCKED:
        case CameraHalEvent::EVENT_FOCUS_ERROR:
            if (mBracketingEnabled) {
                startImageBracketing();
            }
            break;
        default:
            break;
        }
    }
}
```
The callback handles only two events, EVENT_FOCUS_LOCKED and EVENT_FOCUS_ERROR, by calling startImageBracketing(). Since we rarely use the bracketing feature, we won't dig into it here.
That concludes the event handling initiated from the HAL. To recap: in setParameters(), if the bracketing feature is enabled, a new EventProvider is created and a callback is registered with BaseCameraAdapter for ALL_EVENTS. When an event occurs, CameraAdapter invokes the callback, and if the event type is EVENT_FOCUS_LOCKED or EVENT_FOCUS_ERROR, the handler startImageBracketing() is called.
Event handling initiated from AppCallbackNotifier.
In terms of flow there is no difference from the events initiated by CameraHal — even the CameraAdapter is the same instance. The only difference is the callback: AppCallbackNotifier's handler chain is eventCallbackRelay —> eventCallback:
```cpp
void AppCallbackNotifier::eventCallback(CameraHalEvent *chEvt)
{
    ///Post the event to the event queue of AppCallbackNotifier
    MSGUTILS::Message msg;
    CameraHalEvent *event;

    LOG_FUNCTION_NAME;

    if (NULL != chEvt) {
        event = new CameraHalEvent(*chEvt);
        if (NULL != event) {
            msg.command = AppCallbackNotifier::NOTIFIER_CMD_PROCESS_EVENT;
            msg.arg1 = event;
            {
                Mutex::Autolock lock(mLock);
                mEventQ.put(&msg);
            }
        } else {
            CAMHAL_LOGEA("Not enough resources to allocate CameraHalEvent");
        }
    }

    LOG_FUNCTION_NAME_EXIT;
}
```
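The pattern here — deep-copy the adapter's event, then post it to a mutex-protected queue for the notification thread — can be sketched as follows. The types are simplified stand-ins, not the MSGUTILS API:

```cpp
#include <deque>
#include <memory>
#include <mutex>

// Simplified stand-in for CameraHalEvent.
struct CameraHalEvent { int mEventType; };

// A locked FIFO, mirroring mEventQ guarded by Mutex::Autolock.
class EventQueue {
public:
    void put(std::unique_ptr<CameraHalEvent> evt) {
        std::lock_guard<std::mutex> lock(mLock);
        mQueue.push_back(std::move(evt));
    }
    std::unique_ptr<CameraHalEvent> get() {
        std::lock_guard<std::mutex> lock(mLock);
        if (mQueue.empty()) return nullptr;
        auto evt = std::move(mQueue.front());
        mQueue.pop_front();
        return evt;
    }
private:
    std::mutex mLock;
    std::deque<std::unique_ptr<CameraHalEvent>> mQueue;
};

// The callback heap-copies the caller's event before enqueueing it,
// since the original lives in the adapter's context.
inline void postEvent(EventQueue &q, const CameraHalEvent *chEvt) {
    if (chEvt) q.put(std::make_unique<CameraHalEvent>(*chEvt));
}
```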
The event is sent through the message queue mEventQ to the processing thread, AppCallbackNotifier::notificationThread():
```cpp
bool AppCallbackNotifier::notificationThread()
{
    bool shouldLive = true;
    status_t ret;

    LOG_FUNCTION_NAME;

    //CAMHAL_LOGDA("Notification Thread waiting for message");
    ret = MSGUTILS::MessageQueue::waitForMsg(&mNotificationThread->msgQ(),
                                             &mEventQ,
                                             &mFrameQ,
                                             AppCallbackNotifier::NOTIFIER_TIMEOUT);
    //CAMHAL_LOGDA("Notification Thread received message");

    if (mNotificationThread->msgQ().hasMsg()) {
        ///Received a message from CameraHal, process it
        CAMHAL_LOGDA("Notification Thread received message from Camera HAL");
        shouldLive = processMessage();
        if (!shouldLive) {
            CAMHAL_LOGDA("Notification Thread exiting.");
        }
    }

    if (mEventQ.hasMsg()) {
        ///Received an event from one of the event providers
        CAMHAL_LOGDA("Notification Thread received an event from event provider (CameraAdapter)");
        notifyEvent();
    }

    if (mFrameQ.hasMsg()) {
        ///Received a frame from one of the frame providers
        //CAMHAL_LOGDA("Notification Thread received a frame from frame provider (CameraAdapter)");
        notifyFrame();
    }

    LOG_FUNCTION_NAME_EXIT;
    return shouldLive;
}
```
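The dispatch after waitForMsg() returns — command messages, then events, then frames, each handled if its queue has something pending — can be sketched like this, with plain std::queue stand-ins for MSGUTILS::MessageQueue (names are illustrative):

```cpp
#include <queue>
#include <string>
#include <vector>

// One queue per message source, as in notificationThread().
struct Queues {
    std::queue<int> cmdQ, eventQ, frameQ;
};

// One pass of the dispatch loop: drain one item from each non-empty
// queue and record which handler would have run, in order.
inline std::vector<std::string> dispatchOnce(Queues &q) {
    std::vector<std::string> ran;
    if (!q.cmdQ.empty())   { q.cmdQ.pop();   ran.push_back("processMessage"); }
    if (!q.eventQ.empty()) { q.eventQ.pop(); ran.push_back("notifyEvent"); }
    if (!q.frameQ.empty()) { q.frameQ.pop(); ran.push_back("notifyFrame"); }
    return ran;
}
```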
Messages on mEventQ are handled by notifyEvent():
```cpp
void AppCallbackNotifier::notifyEvent()
{
    ///Receive and send the event notifications to app
    MSGUTILS::Message msg;

    LOG_FUNCTION_NAME;

    {
        Mutex::Autolock lock(mLock);
        mEventQ.get(&msg);
    }

    bool ret = true;
    CameraHalEvent *evt = NULL;
    CameraHalEvent::FocusEventData *focusEvtData;
    CameraHalEvent::ZoomEventData *zoomEvtData;
    CameraHalEvent::FaceEventData faceEvtData;

    if (mNotifierState != AppCallbackNotifier::NOTIFIER_STARTED) {
        return;
    }

    switch (msg.command) {
    case AppCallbackNotifier::NOTIFIER_CMD_PROCESS_EVENT:
        evt = (CameraHalEvent *)msg.arg1;
        if (NULL == evt) {
            CAMHAL_LOGEA("Invalid CameraHalEvent");
            return;
        }

        switch (evt->mEventType) {
        case CameraHalEvent::EVENT_SHUTTER:
            if ((NULL != mCameraHal) &&
                (NULL != mNotifyCb) &&
                (mCameraHal->msgTypeEnabled(CAMERA_MSG_SHUTTER))) {
                mNotifyCb(CAMERA_MSG_SHUTTER, 0, 0, mCallbackCookie);
            }
            mRawAvailable = false;
            break;

        case CameraHalEvent::EVENT_FOCUS_LOCKED:
        case CameraHalEvent::EVENT_FOCUS_ERROR:
            focusEvtData = &evt->mEventData->focusEvent;
            if ((focusEvtData->focusLocked) &&
                (NULL != mCameraHal) &&
                (NULL != mNotifyCb) &&
                (mCameraHal->msgTypeEnabled(CAMERA_MSG_FOCUS))) {
                mNotifyCb(CAMERA_MSG_FOCUS, true, 0, mCallbackCookie);
                mCameraHal->disableMsgType(CAMERA_MSG_FOCUS);
            } else if (focusEvtData->focusError &&
                       (NULL != mCameraHal) &&
                       (NULL != mNotifyCb) &&
                       (mCameraHal->msgTypeEnabled(CAMERA_MSG_FOCUS))) {
                mNotifyCb(CAMERA_MSG_FOCUS, false, 0, mCallbackCookie);
                mCameraHal->disableMsgType(CAMERA_MSG_FOCUS);
            }
            break;

        case CameraHalEvent::EVENT_ZOOM_INDEX_REACHED:
            zoomEvtData = &evt->mEventData->zoomEvent;
            if ((NULL != mCameraHal) &&
                (NULL != mNotifyCb) &&
                (mCameraHal->msgTypeEnabled(CAMERA_MSG_ZOOM))) {
                mNotifyCb(CAMERA_MSG_ZOOM, zoomEvtData->currentZoomIndex,
                          zoomEvtData->targetZoomIndexReached, mCallbackCookie);
            }
            break;

        case CameraHalEvent::EVENT_FACE:
            faceEvtData = evt->mEventData->faceEvent;
            if ((NULL != mCameraHal) &&
                (NULL != mDataCb) &&
                (mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_METADATA))) {
                // WA for an issue inside CameraService
                camera_memory_t *tmpBuffer = mRequestMemory(-1, 1, 1, NULL);

                mDataCb(CAMERA_MSG_PREVIEW_METADATA, tmpBuffer, 0,
                        faceEvtData->getFaceResult(), mCallbackCookie);

                faceEvtData.clear();

                if (NULL != tmpBuffer) {
                    tmpBuffer->release(tmpBuffer);
                }
            }
            break;

        case CameraHalEvent::ALL_EVENTS:
            break;
        default:
            break;
        }
        break;
    }

    if (NULL != evt) {
        delete evt;
    }
}
```
EVENT_SHUTTER, EVENT_FOCUS_LOCKED, EVENT_FOCUS_ERROR, and EVENT_ZOOM_INDEX_REACHED are all reported through mNotifyCb, while EVENT_FACE uses mDataCb, because face data has to be passed up to the application.
Those are all the notifyEvent cases. Now, where are the frame events defined? There are the following types:
```cpp
enum FrameType {
    PREVIEW_FRAME_SYNC = 0x1, ///SYNC implies that the frame needs to be explicitly returned after consuming in order to be filled by camera again
    PREVIEW_FRAME = 0x2,      ///Preview frame includes viewfinder and snapshot frames
    IMAGE_FRAME_SYNC = 0x4,   ///Image Frame is the image capture output frame
    IMAGE_FRAME = 0x8,
    VIDEO_FRAME_SYNC = 0x10,  ///Timestamp will be updated for these frames
    VIDEO_FRAME = 0x20,
    FRAME_DATA_SYNC = 0x40,   ///Any extra data assosicated with the frame. Always synced with the frame
    FRAME_DATA = 0x80,
    RAW_FRAME = 0x100,
    SNAPSHOT_FRAME = 0x200,
    ALL_FRAMES = 0xFFFF       ///Maximum of 16 frame types supported
};
```
The registration points below cover FRAME_DATA_SYNC, IMAGE_FRAME, RAW_FRAME, PREVIEW_FRAME_SYNC, and VIDEO_FRAME_SYNC.
First place: FRAME_DATA_SYNC = 0x40, ///Any extra data assosicated with the frame. Always synced with the frame
When Measurements are enabled, the full data is passed into mPreviewBufs[] without the 2D-to-1D conversion:
```cpp
void AppCallbackNotifier::setMeasurements(bool enable)
{
    Mutex::Autolock lock(mLock);

    LOG_FUNCTION_NAME;

    mMeasurementEnabled = enable;

    if (enable) {
        mFrameProvider->enableFrameNotification(CameraFrame::FRAME_DATA_SYNC);
    }

    LOG_FUNCTION_NAME_EXIT;
}
```
```cpp
else if ((CameraFrame::FRAME_DATA_SYNC == frame->mFrameType) &&
         (NULL != mCameraHal) &&
         (NULL != mDataCb) &&
         (mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME))) {
    copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
```
```cpp
void AppCallbackNotifier::copyAndSendPreviewFrame(CameraFrame *frame, int32_t msgType)
{
    dest = (void *)mPreviewBufs[mPreviewBufCount];

    CAMHAL_LOGVB("%d: copy2Dto1D(%p, %p, %d, %d, %d, %d, %d, %d, %s)",
                 __LINE__,
                 NULL, //buf,
                 frame->mBuffer,
                 frame->mWidth,
                 frame->mHeight,
                 frame->mPixelFmt,
                 frame->mAlignment,
                 2,
                 frame->mLength,
                 mPreviewPixelFormat);

    if (NULL != dest) {
        // data sync frames don't need conversion
        if (CameraFrame::FRAME_DATA_SYNC == frame->mFrameType) {
            if ((mPreviewMemory->size / MAX_BUFFERS) >= frame->mLength) {
                memcpy(dest, (void *)src, frame->mLength);
            } else {
                memset(dest, 0, (mPreviewMemory->size / MAX_BUFFERS));
            }
        } else {
            if ((NULL == (void *)frame->mYuv[0]) || (NULL == (void *)frame->mYuv[1])) {
                CAMHAL_LOGEA("Error! One of the YUV Pointer is NULL");
                goto exit;
            } else {
                copy2Dto1D(dest,
                           frame->mYuv,
                           frame->mWidth,
                           frame->mHeight,
                           frame->mPixelFmt,
                           frame->mAlignment,
                           frame->mOffset,
                           2,
                           frame->mLength,
                           mPreviewPixelFormat);
            }
        }
    }
```
Second place: only the two types IMAGE_FRAME and RAW_FRAME are registered.
```cpp
void AppCallbackNotifier::setFrameProvider(FrameNotifier *frameNotifier)
{
    LOG_FUNCTION_NAME;
    ///@remarks There is no NULL check here. We will check
    ///for NULL when we get the start command from CameraAdapter
    mFrameProvider = new FrameProvider(frameNotifier, this, frameCallbackRelay);
    if (NULL == mFrameProvider) {
        CAMHAL_LOGEA("Error in creating FrameProvider");
    } else {
        //Register only for captured images and RAW for now
        //TODO: Register for and handle all types of frames
        mFrameProvider->enableFrameNotification(CameraFrame::IMAGE_FRAME);
        mFrameProvider->enableFrameNotification(CameraFrame::RAW_FRAME);
    }

    LOG_FUNCTION_NAME_EXIT;
}
```
IMAGE_FRAME means JPEG-encoded data is needed, and there are two paths. If the following condition holds:
```cpp
else if ((CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
         (NULL != mCameraHal) &&
         (NULL != mDataCb) &&
         ((CameraFrame::ENCODE_RAW_YUV422I_TO_JPEG & frame->mQuirks) ||
          (CameraFrame::ENCODE_RAW_RGB24_TO_JPEG & frame->mQuirks) ||
          (CameraFrame::ENCODE_RAW_YUV420SP_TO_JPEG & frame->mQuirks)))
```
then a new encoder thread is created; the YUV or RGB data is handed to it, with the AppCallbackNotifierEncoderCallback callback passed in as a parameter. When encoding completes, the callback fires and in turn calls AppCallbackNotifier::EncoderDoneCb(), which internally calls mDataCb(CAMERA_MSG_COMPRESSED_IMAGE, picture, 0, NULL, mCallbackCookie); to return the data.
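The hand-off pattern — raw data plus a completion callback given to an encoder thread, with the callback delivering the compressed result — can be sketched as below. The placeholder "encoder" just copies bytes, and all names are illustrative, not the HAL's encoder API:

```cpp
#include <cstdint>
#include <functional>
#include <thread>
#include <utility>
#include <vector>

// Completion callback, loosely modeling EncoderDoneCb -> mDataCb.
using EncoderDoneCb = std::function<void(std::vector<uint8_t> jpeg)>;

// Hand the input buffer to a worker thread; invoke `done` with the
// "encoded" result when finished. A real implementation would run a
// JPEG encoder over the YUV/RGB input instead of copying it.
inline void encodeAsync(std::vector<uint8_t> yuv, EncoderDoneCb done) {
    std::thread worker([yuv = std::move(yuv), done = std::move(done)]() {
        std::vector<uint8_t> jpeg(yuv.begin(), yuv.end()); // placeholder encode
        done(std::move(jpeg));
    });
    worker.join(); // joined here for determinism; the HAL manages its thread's lifetime
}
```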
If the condition above is not met and only the following holds:
```cpp
else if ((CameraFrame::IMAGE_FRAME == frame->mFrameType) &&
         (NULL != mCameraHal) &&
         (NULL != mDataCb))
```
then the frame does not need encoding, and copyAndSendPictureFrame(frame, CAMERA_MSG_COMPRESSED_IMAGE); is called directly to return the data.
```cpp
void AppCallbackNotifier::copyAndSendPictureFrame(CameraFrame *frame, int32_t msgType)
{
    camera_memory_t *picture = NULL;
    void *dest = NULL, *src = NULL;

    // scope for lock
    {
        picture = mRequestMemory(-1, frame->mLength, 1, NULL);

        if (NULL != picture) {
            dest = picture->data;
            if (NULL != dest) {
                src = (void *)((unsigned int)frame->mBuffer + frame->mOffset);
                memcpy(dest, src, frame->mLength);
            }
        }
    }

exit:
    mFrameProvider->returnFrame(frame->mBuffer, (CameraFrame::FrameType)frame->mFrameType);

    if (picture) {
        if ((mNotifierState == AppCallbackNotifier::NOTIFIER_STARTED) &&
            mCameraHal->msgTypeEnabled(msgType)) {
            mDataCb(msgType, picture, 0, NULL, mCallbackCookie);
        }
        picture->release(picture);
    }
}
```
Note that both paths call mFrameProvider->returnFrame(), which recycles the buffer.
For the RAW_FRAME type, the notifyFrame handling is simpler: the data is passed up, then the buffer is recycled through returnFrame.
```cpp
if ((CameraFrame::RAW_FRAME == frame->mFrameType) &&
    (NULL != mCameraHal) &&
    (NULL != mDataCb) &&
    (NULL != mNotifyCb)) {
    if (mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE)) {
#ifdef COPY_IMAGE_BUFFER
        copyAndSendPictureFrame(frame, CAMERA_MSG_RAW_IMAGE);
#else
        //TODO: Find a way to map a Tiler buffer to a MemoryHeapBase
#endif
    } else {
        if (mCameraHal->msgTypeEnabled(CAMERA_MSG_RAW_IMAGE_NOTIFY)) {
            mNotifyCb(CAMERA_MSG_RAW_IMAGE_NOTIFY, 0, 0, mCallbackCookie);
        }
        mFrameProvider->returnFrame(frame->mBuffer,
                                    (CameraFrame::FrameType)frame->mFrameType);
    }

    mRawAvailable = true;
}
```
Third place: CameraFrame::PREVIEW_FRAME_SYNC, used for the preview callback.
```cpp
status_t AppCallbackNotifier::startPreviewCallbacks(CameraParameters &params, void *buffers, uint32_t *offsets, int fd, size_t length, size_t count)
{
    if (mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME)) {
        mFrameProvider->enableFrameNotification(CameraFrame::PREVIEW_FRAME_SYNC);
    }
}
```
In notifyFrame, if Measurement is not enabled, the preview data is copied and sent up; if it is enabled, the buffer is simply returned (measurement data is delivered via FRAME_DATA_SYNC instead):
```cpp
else if ((CameraFrame::PREVIEW_FRAME_SYNC == frame->mFrameType) &&
         (NULL != mCameraHal) &&
         (NULL != mDataCb) &&
         (mCameraHal->msgTypeEnabled(CAMERA_MSG_PREVIEW_FRAME))) {
    // When enabled, measurement data is sent instead of video data
    if (!mMeasurementEnabled) {
        copyAndSendPreviewFrame(frame, CAMERA_MSG_PREVIEW_FRAME);
    } else {
        mFrameProvider->returnFrame(frame->mBuffer,
                                    (CameraFrame::FrameType)frame->mFrameType);
    }
}
```
Fourth place: CameraFrame::VIDEO_FRAME_SYNC in startRecording, used for video recording; the data needs to be copied.
```cpp
status_t AppCallbackNotifier::startRecording()
{
    status_t ret = NO_ERROR;

    LOG_FUNCTION_NAME;

    Mutex::Autolock lock(mRecordingLock);

    if (NULL == mFrameProvider) {
        CAMHAL_LOGEA("Trying to start video recording without FrameProvider");
        ret = -1;
    }

    if (mRecording) {
        return NO_INIT;
    }

    if (NO_ERROR == ret) {
        mFrameProvider->enableFrameNotification(CameraFrame::VIDEO_FRAME_SYNC);
    }

    mRecording = true;

    LOG_FUNCTION_NAME_EXIT;

    return ret;
}
```
To recap the four registration points:
First: FRAME_DATA_SYNC — when Measurements are enabled, the full data is passed into mPreviewBufs[] without the 2D-to-1D conversion.
Second: setFrameProvider registers RAW_FRAME and IMAGE_FRAME, used to deliver pictures upward and recycle their buffers.
Third: CameraFrame::PREVIEW_FRAME_SYNC, used for the preview callback and buffer recycling.
Fourth: CameraFrame::VIDEO_FRAME_SYNC in startRecording, used for video recording; the data needs to be copied.