
iPhone Video Chat: A Code Implementation

2013-06-25 18:15
Video chat works like this: grab frame buffers from the camera, convert each one to NSData, and send it over the network; the receiving end turns the NSData back into an image and displays it. Both sides exchange and render frames continuously, and the result is video chat. Enough preamble; on to the code.

First, create the video input and output:

// Requires AVFoundation: #import <AVFoundation/AVFoundation.h>
NSError *error = nil;

// Set up the video input
AVCaptureDevice *videoDevice = [self getFrontCamera]; // or [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
// Create a device input with the device and add it to the session
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

// Set up the video output: uncompressed 32-bit BGRA frames
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
_videoOutput.alwaysDiscardsLateVideoFrames = NO;
_videoOutput.videoSettings =
    [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Create the session
_capSession = [[AVCaptureSession alloc] init];
[_capSession addInput:videoInput];
//[_capSession addInput:audioInput];
[_capSession addOutput:_videoOutput];
//[_capSession addOutput:_audioOutput];

_capSession.sessionPreset = AVCaptureSessionPresetLow;

// Set up the queue the sample buffer delegate will be called on
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[_videoOutput setSampleBufferDelegate:self queue:queue];
//[_audioOutput setSampleBufferDelegate:self queue:queue]; // audio path is disabled above
dispatch_release(queue);
[_capSession startRunning];
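
The getFrontCamera helper used above is not shown in the post. A minimal sketch, assuming it simply walks the device list looking for a front-facing camera, might look like this:

// Hypothetical helper: returns the front-facing camera, or nil if none exists
- (AVCaptureDevice *)getFrontCamera {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == AVCaptureDevicePositionFront) {
            return device;
        }
    }
    return nil;
}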

Frames from the camera are delivered through the AVCaptureVideoDataOutputSampleBufferDelegate protocol:

#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Copy the actual pixel bytes out of the buffer. Serializing the
    // CMSampleBufferRef pointer itself (as the original code did) only
    // appears to work within one process and cannot survive a network hop.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t length = CVPixelBufferGetBytesPerRow(imageBuffer) * CVPixelBufferGetHeight(imageBuffer);
    NSData *data = [NSData dataWithBytes:CVPixelBufferGetBaseAddress(imageBuffer) length:length];
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Here the frame would be written to the socket; for a local loopback
    // test we feed it straight into the receive path.
    [self recieveVideoFromData:data];
}
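
The post never shows the network hop itself. A minimal sketch, assuming an already-connected NSOutputStream stored in a hypothetical _outputStream ivar and simple length-prefixed framing, could look like this:

// Hypothetical send path: 4-byte length prefix, then the frame bytes.
// Requires #include <arpa/inet.h> for htonl().
- (void)sendFrameData:(NSData *)data {
    uint32_t length = htonl((uint32_t)[data length]);  // network byte order
    [_outputStream write:(const uint8_t *)&length maxLength:sizeof(length)];

    const uint8_t *bytes = [data bytes];
    NSUInteger written = 0;
    while (written < [data length]) {
        NSInteger n = [_outputStream write:bytes + written
                                 maxLength:[data length] - written];
        if (n <= 0) break;  // stream error: drop the rest of this frame
        written += (NSUInteger)n;
    }
}

The receiver reads the 4-byte length, accumulates exactly that many bytes, and only then calls recieveVideoFromData:.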

When data arrives, convert it back into an image:

- (void)recieveVideoFromData:(NSData *)data {
    // The capture callback runs on its own dispatch queue, so under MRC
    // we manage our own autorelease pool here.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Frame geometry is assumed: AVCaptureSessionPresetLow delivers
    // 192x144 BGRA on iPhone. A real protocol would transmit width,
    // height, and bytesPerRow alongside every frame.
    size_t width = 192;
    size_t height = 144;
    size_t bytesPerRow = width * 4;
    void *baseAddress = (void *)[data bytes];

    // Wrap the raw BGRA bytes in a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress,
                                                    width, height, 8,
                                                    bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0
                                   orientation:UIImageOrientationRight];
    CGImageRelease(newImage);

    // Hand the frame to the UI on the main thread (imageView is a
    // hypothetical UIImageView showing the remote video).
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });

    [pool drain];
}
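
Raw 32BGRA frames are heavy even at this preset (192 × 144 × 4 ≈ 110 KB each, roughly 2.6 MB/s at 24 fps). A common variation, not part of the original post, is to JPEG-compress each frame before it goes on the wire and decode it on arrival:

// Hypothetical compression step: trade CPU for bandwidth.
// Sender: build a UIImage from the frame (as in recieveVideoFromData:)
// and serialize it as JPEG instead of raw pixels.
NSData *wireData = UIImageJPEGRepresentation(image, 0.5); // 0.5 = quality/size trade-off

// Receiver: decoding is then a one-liner, with no geometry assumptions.
UIImage *frame = [UIImage imageWithData:wireData];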

   