
Recording Video with a Custom Camera UI Using AVFoundation/AVCaptureSession (Part 3)

2017-03-10
Classes used:

AVCaptureSession
AVCaptureVideoPreviewLayer
AVCaptureDeviceInput
AVCaptureConnection
AVCaptureVideoDataOutput
AVCaptureAudioDataOutput
AVAssetWriter
AVAssetWriterInput

// AVCaptureMovieFileOutput is a subclass of AVCaptureFileOutput. The subclasses of AVCaptureOutput are:
// AVCaptureFileOutput, AVCaptureAudioDataOutput, AVCaptureVideoDataOutput, and AVCaptureStillImageOutput
// (replaced by AVCapturePhotoOutput since iOS 10.0) -- the data outputs and the file outputs handle data differently.

/* Data outputs deliver captured media to the app as it is captured:
   AVCaptureAudioDataOutput  // audio output, delivers sample buffers (CMSampleBuffer)
   AVCaptureVideoDataOutput  // video output, delivers sample buffers (CMSampleBuffer)
   AVCaptureStillImageOutput // still image output, delivers image data (NSData) */

/* File outputs write the captured data to a file:
   AVCaptureFileOutput
   {   // subclasses
       AVCaptureAudioFileOutput // outputs an audio file
       AVCaptureMovieFileOutput // outputs a movie file
   } */

// Adding video stabilization

@property (nonatomic, strong) AVCaptureConnection *videoConnection;

self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
if ([self.videoConnection isVideoStabilizationSupported]) {
    self.videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto; // stabilization mode
}

=========================

1. AVCaptureSession is the central AVFoundation capture class. For video capture, a client instantiates an AVCaptureSession and adds appropriate inputs (such as an AVCaptureDeviceInput) and outputs (such as an AVCaptureMovieFileOutput). [AVCaptureSession startRunning] starts the flow of data from the inputs to the outputs, and [AVCaptureSession stopRunning] stops it. A client can customize the recording quality level or output bit rate by setting the sessionPreset property.
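As a minimal sketch of that lifecycle (someInput and someOutput here are stand-ins for whatever you configure, as in the engine later in this post):

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh; // recording quality / bit rate
if ([session canAddInput:someInput]) {
    [session addInput:someInput];   // e.g. an AVCaptureDeviceInput
}
if ([session canAddOutput:someOutput]) {
    [session addOutput:someOutput]; // e.g. an AVCaptureVideoDataOutput
}
[session startRunning]; // data starts flowing from inputs to outputs
// ... capture ...
[session stopRunning];  // data stops flowing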

2. Each instance of AVCaptureDevice corresponds to one device, such as a camera or a microphone. AVCaptureDevice instances cannot be created directly; the available devices are obtained with the class methods devicesWithMediaType: or defaultDeviceWithMediaType:, and a device can provide one or more streams of a given media type. An AVCaptureDevice instance is used to create an AVCaptureDeviceInput, which is then handed to an AVCaptureSession as an input source.

3. AVCaptureDeviceInput is an input source for an AVCaptureSession; it provides media data from a device connected to the system. It is created from an AVCaptureDevice instance -- here those devices are the front and rear cameras, obtained via [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo].
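Putting points 2 and 3 together, going from a device to a session input looks roughly like this (error handling shortened; session is the capture session from the sketch above):

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}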

4. AVCaptureVideoPreviewLayer is a Core Animation layer subclass used to preview the video output of an AVCaptureSession; in short, it is the layer on which the captured video is rendered.
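For example (assuming session is the capture session and this runs inside a view controller):

AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.frame = self.view.bounds;
preview.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill the screen
[self.view.layer insertSublayer:preview atIndex:0];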

5. AVCaptureMovieFileOutput is a subclass of AVCaptureFileOutput used to write media files in the QuickTime movie format. Because this class cannot pause recording on the iPhone and cannot specify the output file type, it is not used here; instead the more flexible AVCaptureVideoDataOutput and AVCaptureAudioDataOutput are used to implement recording.

6. AVCaptureVideoDataOutput is a subclass of AVCaptureOutput that outputs the captured video frames, uncompressed or compressed. The frames it produces can be processed with other suitable media APIs; an application receives the frame data through the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.

7. AVCaptureAudioDataOutput is a subclass of AVCaptureOutput that outputs the captured audio samples, uncompressed or compressed. Those samples can likewise be processed with other suitable media APIs; an application receives the audio data through the same captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
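A sketch of wiring up both data outputs (captureQueue is assumed to be a serial dispatch queue, and self a class adopting both sample buffer delegate protocols, as in the engine below):

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
videoOut.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey :
                           @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
[videoOut setSampleBufferDelegate:self queue:captureQueue];

AVCaptureAudioDataOutput *audioOut = [[AVCaptureAudioDataOutput alloc] init];
[audioOut setSampleBufferDelegate:self queue:captureQueue];
// Both outputs deliver CMSampleBufferRefs through the shared delegate method
// captureOutput:didOutputSampleBuffer:fromConnection:, where comparing the
// captureOutput argument against videoOut/audioOut tells the streams apart.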

8. AVCaptureConnection represents a connection between one or more AVCaptureInputPorts and an AVCaptureOutput or AVCaptureVideoPreviewLayer within an AVCaptureSession.

9. AVAssetWriter provides services for writing media data to a new file. An AVAssetWriter instance specifies the format of the output file, such as the QuickTime movie format or the MPEG-4 format. AVAssetWriter writes multiple parallel tracks of media data; the basic ones are a video track and an audio track, introduced below. A single AVAssetWriter instance can write exactly one file; clients who want to write multiple files must use a new AVAssetWriter instance for each one.
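Creating a writer is a one-liner; the key point is that the instance is single-use (outputPath here is just an assumed file path):

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:[NSURL fileURLWithPath:outputPath]
                                                 fileType:AVFileTypeMPEG4
                                                    error:&error];
// one file per writer: create a fresh AVAssetWriter for every recording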

10. AVAssetWriterInput is used to append media samples of type CMSampleBuffer to one track of an AVAssetWriter's output file. When there are multiple inputs, AVAssetWriter tries to interleave the media data in the ideal pattern for storage and playback efficiency. Whether each input can accept more media data is indicated by its readyForMoreMediaData property: if readyForMoreMediaData is YES, the input can accept media data, and only then may you append media data to that input.
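In sketch form (settings stands for the output settings dictionaries shown in the encoder below, writer for the AVAssetWriter above):

AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
videoInput.expectsMediaDataInRealTime = YES; // we feed it live capture data
[writer addInput:videoInput];
// later, for every captured sample buffer:
if (videoInput.readyForMoreMediaData) {
    [videoInput appendSampleBuffer:sampleBuffer]; // append only when the input is ready
}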

#import "WCLRecordVideoVC.h"

#import "WCLRecordEngine.h"

#import "WCLRecordProgressView.h"

#import <MobileCoreServices/MobileCoreServices.h>

#import <MediaPlayer/MediaPlayer.h>

typedef NS_ENUM(NSUInteger, UploadVieoStyle) {
    VideoRecord = 0,
    VideoLocation,
};

@interface WCLRecordVideoVC () <WCLRecordEngineDelegate, UIImagePickerControllerDelegate, UINavigationControllerDelegate>

@property (weak, nonatomic) IBOutlet UIButton *flashLightBT;
@property (weak, nonatomic) IBOutlet UIButton *changeCameraBT;
@property (weak, nonatomic) IBOutlet UIButton *recordNextBT;
@property (weak, nonatomic) IBOutlet UIButton *recordBt;
@property (weak, nonatomic) IBOutlet UIButton *locationVideoBT;
@property (weak, nonatomic) IBOutlet NSLayoutConstraint *topViewTop;
@property (weak, nonatomic) IBOutlet WCLRecordProgressView *progressView;
@property (strong, nonatomic) WCLRecordEngine *recordEngine;
@property (assign, nonatomic) BOOL allowRecord; // whether recording is allowed
@property (assign, nonatomic) UploadVieoStyle videoStyle; // type of the video
@property (strong, nonatomic) UIImagePickerController *moviePicker; // video picker
@property (strong, nonatomic) MPMoviePlayerViewController *playerVC;

@end

@implementation WCLRecordVideoVC

- (void)dealloc {
    _recordEngine = nil;
    [[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerPlaybackDidFinishNotification object:[_playerVC moviePlayer]];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [self.navigationController setNavigationBarHidden:YES animated:YES]; // hide the navigation bar
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];
    [self.recordEngine shutdown]; // shut recording down when the view disappears
}

// _recordEngine, the previewLayer, and the session are all initialized here

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (_recordEngine == nil) {
        // the self.recordEngine getter lazily creates the WCLRecordEngine
        [self.recordEngine previewLayer].frame = self.view.bounds;
        [self.view.layer insertSublayer:[self.recordEngine previewLayer] atIndex:0]; // set up the preview layer and put it at the very bottom
    }
    [self.recordEngine startUp]; // start capture; recording is controlled by starting/stopping the session
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.allowRecord = YES;
}

// adjust the view layout according to the current state

- (void)adjustViewFrame {
    [self.view layoutIfNeeded];
    [UIView animateWithDuration:0.4 delay:0.0 options:UIViewAnimationOptionCurveEaseInOut animations:^{
        if (self.recordBt.selected) {
            self.topViewTop.constant = -64;
            [[UIApplication sharedApplication] setStatusBarHidden:YES withAnimation:UIStatusBarAnimationSlide];
        } else {
            [[UIApplication sharedApplication] setStatusBarHidden:NO withAnimation:UIStatusBarAnimationSlide];
            self.topViewTop.constant = 0;
        }
        if (self.videoStyle == VideoRecord) {
            self.locationVideoBT.alpha = 0;
        }
        [self.view layoutIfNeeded]; // refresh the layout
    } completion:nil];
}

#pragma mark - setters and getters

// create the WCLRecordEngine

- (WCLRecordEngine *)recordEngine {
    if (_recordEngine == nil) {
        _recordEngine = [[WCLRecordEngine alloc] init];
        _recordEngine.delegate = self;
    }
    return _recordEngine;
}

- (UIImagePickerController *)moviePicker {
    if (_moviePicker == nil) {
        _moviePicker = [[UIImagePickerController alloc] init];
        _moviePicker.delegate = self;
        _moviePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
        _moviePicker.mediaTypes = @[(NSString *)kUTTypeMovie];
    }
    return _moviePicker;
}

#pragma mark - photo library picker delegate

// delegate callback invoked when a video has been picked

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    if ([[info objectForKey:UIImagePickerControllerMediaType] isEqualToString:(NSString *)kUTTypeMovie]) {
        // extract the video's file name
        NSString *videoPath = [NSString stringWithFormat:@"%@", [info objectForKey:UIImagePickerControllerMediaURL]];
        NSRange range = [videoPath rangeOfString:@"trim."]; // index of the match
        NSString *content = [videoPath substringFromIndex:range.location + 5];
        // the video's extension
        NSRange rangeSuffix = [content rangeOfString:@"."];
        NSString *suffixName = [content substringFromIndex:rangeSuffix.location + 1];
        // if the video is in MOV format, convert it to MP4
        if ([suffixName isEqualToString:@"MOV"]) {
            NSURL *videoUrl = [info objectForKey:UIImagePickerControllerMediaURL];
            __weak typeof(self) weakSelf = self;
            [self.recordEngine changeMovToMp4:videoUrl dataBlock:^(UIImage *movieImage) {
                [weakSelf.moviePicker dismissViewControllerAnimated:YES completion:^{
                    weakSelf.playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:weakSelf.recordEngine.videoPath]];
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playVideoFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:[weakSelf.playerVC moviePlayer]];
                    [[weakSelf.playerVC moviePlayer] prepareToPlay];
                    [weakSelf presentMoviePlayerViewControllerAnimated:weakSelf.playerVC];
                    [[weakSelf.playerVC moviePlayer] play];
                }];
            }];
        }
    }
}

#pragma mark - WCLRecordEngineDelegate

- (void)recordProgress:(CGFloat)progress {
    if (progress >= 1) {
        [self recordAction:self.recordBt];
        self.allowRecord = NO;
    }
    self.progressView.progress = progress;
}

#pragma mark - button actions

// back button

- (IBAction)dismissAction:(id)sender {
    [self.navigationController popViewControllerAnimated:YES];
}

// toggle the torch

- (IBAction)flashLightAction:(id)sender {
    if (self.changeCameraBT.selected == NO) {
        self.flashLightBT.selected = !self.flashLightBT.selected;
        if (self.flashLightBT.selected == YES) {
            [self.recordEngine openFlashLight];
        } else {
            [self.recordEngine closeFlashLight];
        }
    }
}

// switch between the front and rear cameras

- (IBAction)changeCameraAction:(id)sender {
    self.changeCameraBT.selected = !self.changeCameraBT.selected;
    if (self.changeCameraBT.selected == YES) {
        // front camera (it has no torch, so turn the torch off first)
        [self.recordEngine closeFlashLight];
        self.flashLightBT.selected = NO;
        [self.recordEngine changeCameraInputDeviceisFront:YES];
    } else {
        [self.recordEngine changeCameraInputDeviceisFront:NO];
    }
}

// "next" button tapped after recording

- (IBAction)recordNextAction:(id)sender {
    if (_recordEngine.videoPath.length > 0) {
        __weak typeof(self) weakSelf = self;
        [self.recordEngine stopCaptureHandler:^(UIImage *movieImage) {
            weakSelf.playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:weakSelf.recordEngine.videoPath]];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playVideoFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:[weakSelf.playerVC moviePlayer]];
            [[weakSelf.playerVC moviePlayer] prepareToPlay];
            [weakSelf presentMoviePlayerViewControllerAnimated:weakSelf.playerVC];
            [[weakSelf.playerVC moviePlayer] play];
        }];
    } else {
        NSLog(@"Please record a video first~");
    }
}

// called when the Done button is tapped or playback finishes

- (void)playVideoFinished:(NSNotification *)theNotification {
    MPMoviePlayerController *player = [theNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:MPMoviePlayerPlaybackDidFinishNotification object:player];
    [player stop];
    [self.playerVC dismissMoviePlayerViewControllerAnimated];
    self.playerVC = nil;
}

// pick a local video

- (IBAction)locationVideoAction:(id)sender {
    self.videoStyle = VideoLocation;
    [self.recordEngine shutdown];
    [self presentViewController:self.moviePicker animated:YES completion:nil];
}

// start/pause recording (the red button)

- (IBAction)recordAction:(UIButton *)sender {
    if (self.allowRecord) { // recording is allowed
        self.videoStyle = VideoRecord;
        self.recordBt.selected = !self.recordBt.selected; // toggle the button's selected state
        if (self.recordBt.selected) {
            if (self.recordEngine.isCapturing) { // already capturing, so resume
                [self.recordEngine resumeCapture];
            } else {
                [self.recordEngine startCapture];
            }
        } else {
            [self.recordEngine pauseCapture];
        }
        [self adjustViewFrame];
    }
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

==========================

#import "WCLRecordEngine.h"

#import "WCLRecordEncoder.h"

#import <AVFoundation/AVFoundation.h>

#import <Photos/Photos.h>

@interface
WCLRecordEngine ()<AVCaptureVideoDataOutputSampleBufferDelegate,AVCaptureAudioDataOutputSampleBufferDelegate,CAAnimationDelegate>

{

CMTime _timeOffset;//录制的偏移CMTime

CMTime _lastVideo;//记录上一次视频数据文件的CMTime

CMTime _lastAudio;//记录上一次音频数据文件的CMTime

NSInteger _cx;//视频分辨的宽

NSInteger _cy;//视频分辨的高

int _channels;//音频通道

Float64 _samplerate;//音频采样率

}

@property (strong, nonatomic) WCLRecordEncoder *recordEncoder; // recording encoder
@property (strong, nonatomic) AVCaptureSession *recordSession; // capture session
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer; // layer that renders the captured video
@property (strong, nonatomic) AVCaptureDeviceInput *backCameraInput; // rear camera input
@property (strong, nonatomic) AVCaptureDeviceInput *frontCameraInput; // front camera input
@property (strong, nonatomic) AVCaptureDeviceInput *audioMicInput; // microphone input
@property (copy, nonatomic) dispatch_queue_t captureQueue; // capture queue
@property (strong, nonatomic) AVCaptureConnection *audioConnection; // audio connection
@property (strong, nonatomic) AVCaptureConnection *videoConnection; // video connection
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoOutput; // video output
@property (strong, nonatomic) AVCaptureAudioDataOutput *audioOutput; // audio output
@property (atomic, assign) BOOL isCapturing; // currently recording
@property (atomic, assign) BOOL isPaused; // paused
@property (atomic, assign) BOOL discont; // was interrupted
@property (atomic, assign) CMTime startTime; // time recording started
@property (atomic, assign) CGFloat currentRecordTime; // current recording time

@end

@implementation WCLRecordEngine

- (void)dealloc {
    [_recordSession stopRunning];
    _captureQueue = nil;
    _recordSession = nil;
    _previewLayer = nil;
    _backCameraInput = nil;
    _frontCameraInput = nil;
    _audioOutput = nil;
    _videoOutput = nil;
    _audioConnection = nil;
    _videoConnection = nil;
    _recordEncoder = nil;
}

- (instancetype)init {
    self = [super init];
    if (self) {
        self.maxRecordTime = 60.0f;
    }
    return self;
}

#pragma mark - public methods

// start up the recording engine

- (void)startUp {
    NSLog(@"starting the recording engine");
    self.startTime = CMTimeMake(0, 0);
    self.isCapturing = NO;
    self.isPaused = NO;
    self.discont = NO;
    [self.recordSession startRunning]; // start the session
}

// shut the recording engine down

- (void)shutdown {
    _startTime = CMTimeMake(0, 0);
    if (_recordSession) {
        [_recordSession stopRunning]; // stop the session
    }
    [_recordEncoder finishWithCompletionHandler:^{
        // NSLog(@"recording finished");
    }];
}

// start recording

- (void)startCapture {
    @synchronized(self) {
        if (!self.isCapturing) {
            // NSLog(@"start recording");
            self.recordEncoder = nil;
            self.isPaused = NO;
            self.discont = NO;
            _timeOffset = CMTimeMake(0, 0);
            self.isCapturing = YES;
        }
    }
}

// pause recording

- (void)pauseCapture {
    @synchronized(self) {
        if (self.isCapturing) {
            // NSLog(@"pause recording");
            self.isPaused = YES;
            self.discont = YES;
        }
    }
}

// resume recording

- (void)resumeCapture {
    @synchronized(self) {
        if (self.isPaused) {
            // NSLog(@"resume recording");
            self.isPaused = NO;
        }
    }
}

// stop recording

- (void)stopCaptureHandler:(void (^)(UIImage *movieImage))handler {
    @synchronized(self) {
        if (self.isCapturing) {
            NSString *path = self.recordEncoder.path;
            NSURL *url = [NSURL fileURLWithPath:path];
            self.isCapturing = NO;
            dispatch_async(_captureQueue, ^{
                [self.recordEncoder finishWithCompletionHandler:^{
                    self.isCapturing = NO;
                    self.recordEncoder = nil;
                    self.startTime = CMTimeMake(0, 0);
                    self.currentRecordTime = 0;
                    if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
                        });
                    }
                    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
                        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
                    } completionHandler:^(BOOL success, NSError *_Nullable error) {
                        if (success) { // only report success when the save actually succeeded
                            NSLog(@"saved to the photo library");
                        }
                    }];
                    [self movieToImageHandler:handler];
                }];
            });
        }
    }
}

// grab the first frame of the video as an image

- (void)movieToImageHandler:(void (^)(UIImage *movieImage))handler {
    NSURL *url = [NSURL fileURLWithPath:self.videoPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = TRUE;
    CMTime thumbTime = CMTimeMakeWithSeconds(0, 60);
    generator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;
    AVAssetImageGeneratorCompletionHandler generatorHandler =
    ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *thumbImg = [UIImage imageWithCGImage:im];
            if (handler) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    handler(thumbImg);
                });
            }
        }
    };
    [generator generateCGImagesAsynchronouslyForTimes:
     [NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:generatorHandler];
}

#pragma mark - setters and getters

// the capture session

- (AVCaptureSession *)recordSession {
    if (_recordSession == nil) {
        _recordSession = [[AVCaptureSession alloc] init];
        // add the rear camera input
        if ([_recordSession canAddInput:self.backCameraInput]) {
            [_recordSession addInput:self.backCameraInput];
        }
        // add the microphone input
        if ([_recordSession canAddInput:self.audioMicInput]) {
            [_recordSession addInput:self.audioMicInput];
        }
        // add the video output
        if ([_recordSession canAddOutput:self.videoOutput]) {
            [_recordSession addOutput:self.videoOutput];
            // set the video resolution
            _cx = 720;
            _cy = 1280;
        }
        // add the audio output
        if ([_recordSession canAddOutput:self.audioOutput]) {
            [_recordSession addOutput:self.audioOutput];
        }
        // set the recording orientation
        self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return _recordSession;
}

// rear camera input

- (AVCaptureDeviceInput *)backCameraInput {
    if (_backCameraInput == nil) {
        NSError *error;
        _backCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
        if (error) {
            NSLog(@"failed to get the rear camera~");
        }
    }
    return _backCameraInput;
}

// front camera input

- (AVCaptureDeviceInput *)frontCameraInput {
    if (_frontCameraInput == nil) {
        NSError *error;
        _frontCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
        if (error) {
            NSLog(@"failed to get the front camera~");
        }
    }
    return _frontCameraInput;
}

// microphone input

- (AVCaptureDeviceInput *)audioMicInput {
    if (_audioMicInput == nil) {
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        NSError *error;
        _audioMicInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:&error];
        if (error) {
            NSLog(@"failed to get the microphone~");
        }
    }
    return _audioMicInput;
}

// video output

- (AVCaptureVideoDataOutput *)videoOutput {
    if (_videoOutput == nil) {
        _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [_videoOutput setSampleBufferDelegate:self queue:self.captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        _videoOutput.videoSettings = setcapSettings;
    }
    return _videoOutput;
}

// audio output

- (AVCaptureAudioDataOutput *)audioOutput {
    if (_audioOutput == nil) {
        _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        [_audioOutput setSampleBufferDelegate:self queue:self.captureQueue];
    }
    return _audioOutput;
}

// video connection

- (AVCaptureConnection *)videoConnection {
    _videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    return _videoConnection;
}

// audio connection

- (AVCaptureConnection *)audioConnection {
    if (_audioConnection == nil) {
        _audioConnection = [self.audioOutput connectionWithMediaType:AVMediaTypeAudio];
    }
    return _audioConnection;
}

// layer that renders the captured video

- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (_previewLayer == nil) {
        // initialized from the AVCaptureSession (the self.recordSession getter creates the session)
        AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.recordSession];
        // scale to fill the whole screen
        preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
        _previewLayer = preview;
    }
    return _previewLayer;
}

// the capture queue

- (dispatch_queue_t)captureQueue {
    if (_captureQueue == nil) {
        _captureQueue = dispatch_queue_create("cn.qiuyouqun.im.wclrecordengine.capture", DISPATCH_QUEUE_SERIAL);
    }
    return _captureQueue;
}

#pragma mark - camera switch (transition) animation

- (void)changeCameraAnimation {
    CATransition *changeAnimation = [CATransition animation];
    changeAnimation.delegate = self;
    changeAnimation.duration = 0.45;
    changeAnimation.type = @"oglFlip";
    changeAnimation.subtype = kCATransitionFromRight;
    // note: the timing function must be a CAMediaTimingFunction, not a UIView animation curve
    changeAnimation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    [self.previewLayer addAnimation:changeAnimation forKey:@"changeAnimation"];
}

- (void)animationDidStart:(CAAnimation *)anim {
    self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    [self.recordSession startRunning];
}

#pragma mark - convert a MOV file to MP4

- (void)changeMovToMp4:(NSURL *)mediaURL dataBlock:(void (^)(UIImage *movieImage))handler {
    AVAsset *video = [AVAsset assetWithURL:mediaURL];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPreset1280x720];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeMPEG4;
    NSString *basePath = [self getVideoCachePath];
    self.videoPath = [basePath stringByAppendingPathComponent:[self getUploadFile_type:@"video" fileType:@"mp4"]];
    exportSession.outputURL = [NSURL fileURLWithPath:self.videoPath];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        [self movieToImageHandler:handler];
    }];
}

#pragma mark - video helpers

// the front camera

- (AVCaptureDevice *)frontCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}

// the rear camera

- (AVCaptureDevice *)backCamera {
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

// switch between the front and rear cameras

- (void)changeCameraInputDeviceisFront:(BOOL)isFront {
    if (isFront) {
        [self.recordSession stopRunning];
        [self.recordSession removeInput:self.backCameraInput];
        if ([self.recordSession canAddInput:self.frontCameraInput]) {
            [self changeCameraAnimation];
            [self.recordSession addInput:self.frontCameraInput];
        }
    } else {
        [self.recordSession stopRunning];
        [self.recordSession removeInput:self.frontCameraInput];
        if ([self.recordSession canAddInput:self.backCameraInput]) {
            [self changeCameraAnimation];
            [self.recordSession addInput:self.backCameraInput];
        }
    }
}

// return the camera at the given position (front or back)

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    // all default devices related to video capture
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // find the device matching the requested position
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}

// turn the torch on

- (void)openFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOff) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOn;
        backCamera.flashMode = AVCaptureFlashModeOn;
        [backCamera unlockForConfiguration];
    }
}

// turn the torch off

- (void)closeFlashLight {
    AVCaptureDevice *backCamera = [self backCamera];
    if (backCamera.torchMode == AVCaptureTorchModeOn) {
        [backCamera lockForConfiguration:nil];
        backCamera.torchMode = AVCaptureTorchModeOff;
        backCamera.flashMode = AVCaptureFlashModeOff; // was mistakenly assigned a torch-mode constant
        [backCamera unlockForConfiguration];
    }
}

// directory where recorded videos are cached

- (NSString *)getVideoCachePath {
    NSString *videoCache = [NSTemporaryDirectory() stringByAppendingPathComponent:@"videos"];
    BOOL isDir = NO;
    NSFileManager *fileManager = [NSFileManager defaultManager];
    BOOL existed = [fileManager fileExistsAtPath:videoCache isDirectory:&isDir];
    if (!(isDir == YES && existed == YES)) {
        [fileManager createDirectoryAtPath:videoCache withIntermediateDirectories:YES attributes:nil error:nil];
    }
    return videoCache;
}

// build a time-stamped file name such as video_HHmmss.mp4

- (NSString *)getUploadFile_type:(NSString *)type fileType:(NSString *)fileType {
    NSTimeInterval now = [[NSDate date] timeIntervalSince1970];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setDateFormat:@"HHmmss"];
    NSDate *nowDate = [NSDate dateWithTimeIntervalSince1970:now];
    NSString *timeStr = [formatter stringFromDate:nowDate];
    NSString *fileName = [NSString stringWithFormat:@"%@_%@.%@", type, timeStr, fileType];
    return fileName;
}

#pragma mark - writing data

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"writing data");
    BOOL isVideo = YES;
    @synchronized(self) {
        if (!self.isCapturing || self.isPaused) {
            return;
        }
        if (captureOutput != self.videoOutput) {
            isVideo = NO;
        }
        // create the encoder once the audio parameters are known in addition to the video parameters
        if ((self.recordEncoder == nil) && !isVideo) {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString *videoName = [self getUploadFile_type:@"video" fileType:@"mp4"];
            self.videoPath = [[self getVideoCachePath] stringByAppendingPathComponent:videoName];
            self.recordEncoder = [WCLRecordEncoder encoderForPath:self.videoPath Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        // was recording interrupted (paused)?
        if (self.discont) {
            if (isVideo) {
                return;
            }
            self.discont = NO;
            // compute how long the pause lasted
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = isVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid) {
                if (_timeOffset.flags & kCMTimeFlags_Valid) {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                if (_timeOffset.value == 0) {
                    _timeOffset = offset;
                } else {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }
        // retain the sampleBuffer so it is not released out from under us while we adjust it
        CFRetain(sampleBuffer);
        if (_timeOffset.value > 0) {
            CFRelease(sampleBuffer);
            // shift the timestamps back by the accumulated pause offset
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }
        // remember the end time of this sample, for the next pause calculation
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0) {
            pts = CMTimeAdd(pts, dur);
        }
        if (isVideo) {
            _lastVideo = pts;
        } else {
            _lastAudio = pts;
        }
    }
    CMTime dur = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (self.startTime.value == 0) {
        self.startTime = dur;
    }
    CMTime sub = CMTimeSubtract(dur, self.startTime);
    self.currentRecordTime = CMTimeGetSeconds(sub);
    if (self.currentRecordTime > self.maxRecordTime) {
        if (self.currentRecordTime - self.maxRecordTime < 0.1) {
            if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
                });
            }
        }
        CFRelease(sampleBuffer); // balance the retain above before bailing out
        return;
    }
    if ([self.delegate respondsToSelector:@selector(recordProgress:)]) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.delegate recordProgress:self.currentRecordTime / self.maxRecordTime];
        });
    }
    // hand the sample to the encoder
    [self.recordEncoder encodeFrame:sampleBuffer isVideo:isVideo];
    CFRelease(sampleBuffer);
    NSLog(@"writing data 02");
}

// record the audio format

- (void)setAudioFormat:(CMFormatDescriptionRef)fmt {
    const AudioStreamBasicDescription *asbd = CMAudioFormatDescriptionGetStreamBasicDescription(fmt);
    _samplerate = asbd->mSampleRate;
    _channels = asbd->mChannelsPerFrame;
}

// shift a sample buffer's timing info by the given offset

- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset {
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
    CMSampleTimingInfo *pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
    for (CMItemCount i = 0; i < count; i++) {
        pInfo[i].decodeTimeStamp = CMTimeSubtract(pInfo[i].decodeTimeStamp, offset);
        pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef sout;
    CMSampleBufferCreateCopyWithNewTiming(nil, sample, count, pInfo, &sout);
    free(pInfo);
    return sout;
}

@end

=================================

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVCaptureVideoPreviewLayer.h>

@protocol WCLRecordEngineDelegate <NSObject>

- (void)recordProgress:(CGFloat)progress;

@end

@interface WCLRecordEngine : NSObject

@property (atomic, assign, readonly) BOOL isCapturing; // currently recording
@property (atomic, assign, readonly) BOOL isPaused; // paused
@property (atomic, assign, readonly) CGFloat currentRecordTime; // current recording time
@property (atomic, assign) CGFloat maxRecordTime; // maximum recording time
@property (weak, nonatomic) id<WCLRecordEngineDelegate> delegate;
@property (atomic, strong) NSString *videoPath; // path of the recorded video

// layer that renders the captured video
- (AVCaptureVideoPreviewLayer *)previewLayer;

// start up the recording engine
- (void)startUp;

// shut the recording engine down
- (void)shutdown;

// start recording
- (void)startCapture;

// pause recording
- (void)pauseCapture;

// stop recording
- (void)stopCaptureHandler:(void (^)(UIImage *movieImage))handler;

// resume recording
- (void)resumeCapture;

// turn the torch on
- (void)openFlashLight;

// turn the torch off
- (void)closeFlashLight;

// switch between the front and rear cameras
- (void)changeCameraInputDeviceisFront:(BOOL)isFront;

// convert a MOV video to MP4
- (void)changeMovToMp4:(NSURL *)mediaURL dataBlock:(void (^)(UIImage *movieImage))handler;

@end

=======================

Video writing (the encoder)

#import "WCLRecordEncoder.h"

@interface
WCLRecordEncoder ()

@property (nonatomic,strong)AVAssetWriter
*writer;//媒体写入对象

@property (nonatomic,strong)AVAssetWriterInput
*videoInput;//视频写入

@property (nonatomic,strong)AVAssetWriterInput
*audioInput;//音频写入

@property (nonatomic,strong)NSString
*path;//写入路径

@end

@implementation WCLRecordEncoder

- (void)dealloc {

_writer =nil;

_videoInput =nil;

_audioInput =nil;

_path =nil;

}

// convenience constructor for WCLRecordEncoder

+ (WCLRecordEncoder *)encoderForPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    WCLRecordEncoder *enc = [WCLRecordEncoder alloc];
    return [enc initPath:path Height:cy width:cx channels:ch samples:rate];
}

// initializer

- (instancetype)initPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate {
    self = [super init];
    if (self) {
        self.path = path;
        // delete any file already at this path so the recording is fresh
        [[NSFileManager defaultManager] removeItemAtPath:self.path error:nil];
        NSURL *url = [NSURL fileURLWithPath:self.path];
        // write an MP4 file
        _writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:nil];
        // make the file better suited to network playback
        _writer.shouldOptimizeForNetworkUse = YES;
        // set up the video input
        [self initVideoInputHeight:cy width:cx];
        // make sure rate and ch were actually captured
        if (rate != 0 && ch != 0) {
            // set up the audio input
            [self initAudioInputChannels:ch samples:rate];
        }
    }
    return self;
}

// set up the video writer input

- (void)initVideoInputHeight:(NSInteger)cy width:(NSInteger)cx {
    // video settings: resolution, codec, and so on
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              AVVideoCodecH264, AVVideoCodecKey,
                              [NSNumber numberWithInteger:cx], AVVideoWidthKey,
                              [NSNumber numberWithInteger:cy], AVVideoHeightKey,
                              nil];
    // create the video writer input
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
    // indicate the input should tailor its processing for a real-time data source
    _videoInput.expectsMediaDataInRealTime = YES;
    // add the video input to the writer
    [_writer addInput:_videoInput];
}

// set up the audio writer input

- (void)initAudioInputChannels:(int)ch samples:(Float64)rate {
    // audio settings: AAC format, channel count, sample rate, and bit rate
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                              [NSNumber numberWithInt:ch], AVNumberOfChannelsKey,
                              [NSNumber numberWithFloat:rate], AVSampleRateKey,
                              [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
                              nil];
    // create the audio writer input
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:settings];
    // indicate the input should tailor its processing for a real-time data source
    _audioInput.expectsMediaDataInRealTime = YES;
    // add the audio input to the writer
    [_writer addInput:_audioInput];
}

// called when recording finishes

- (void)finishWithCompletionHandler:(void (^)(void))handler {
    [_writer finishWritingWithCompletionHandler:handler];
}

// all data is written through this method

- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)isVideo {
    // is the data ready to be written?
    if (CMSampleBufferDataIsReady(sampleBuffer)) {
        // while the writer's status is still unknown, make sure a video frame is written first
        if (_writer.status == AVAssetWriterStatusUnknown && isVideo) {
            // the CMTime at which writing starts
            CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [_writer startWriting]; // prepare for writing
            [_writer startSessionAtSourceTime:startTime]; // start the session at the current time
        }
        // writing failed
        if (_writer.status == AVAssetWriterStatusFailed) {
            NSLog(@"writer error %@", _writer.error.localizedDescription);
            return NO;
        }
        // video or audio?
        if (isVideo) {
            // is the video input ready for more media data?
            if (_videoInput.readyForMoreMediaData == YES) {
                // append the video data
                [_videoInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        } else {
            // is the audio input ready for more media data?
            if (_audioInput.readyForMoreMediaData) {
                // append the audio data
                [_audioInput appendSampleBuffer:sampleBuffer];
                return YES;
            }
        }
    }
    return NO;
}

@end
=========================

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

/**
 * Class that encodes and writes the video
 */
@interface WCLRecordEncoder : NSObject

@property (nonatomic, readonly) NSString *path;

/**
 * Convenience constructor for WCLRecordEncoder
 *
 * @param path media output path
 * @param cy   video height
 * @param cx   video width
 * @param ch   audio channels
 * @param rate audio sample rate
 *
 * @return a WCLRecordEncoder instance
 */
+ (WCLRecordEncoder *)encoderForPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate;

/**
 * Initializer
 *
 * @param path media output path
 * @param cy   video height
 * @param cx   video width
 * @param ch   audio channels
 * @param rate audio sample rate
 *
 * @return a WCLRecordEncoder instance
 */
- (instancetype)initPath:(NSString *)path Height:(NSInteger)cy width:(NSInteger)cx channels:(int)ch samples:(Float64)rate;

/**
 * Called when recording finishes
 *
 * @param handler completion block
 */
- (void)finishWithCompletionHandler:(void (^)(void))handler;

/**
 * Write data through this method
 *
 * @param sampleBuffer the data to write
 * @param isVideo      whether the data is video
 *
 * @return whether the write succeeded
 */
- (BOOL)encodeFrame:(CMSampleBufferRef)sampleBuffer isVideo:(BOOL)isVideo;

@end

================================

#import "WCLRecordProgressView.h"

@implementation WCLRecordProgressView

- (void)setProgress:(CGFloat)progress {

_progress = progress;

[selfsetNeedsDisplay];

}

- (void)setProgressBgColor:(UIColor *)progressBgColor {

_progressBgColor = progressBgColor;

[selfsetNeedsDisplay];

}

- (void)setloadProgressColor:(UIColor *)loadProgressColor {

_loadProgressColor = loadProgressColor;

[selfsetNeedsDisplay];

}

- (void)setLoadProgress:(CGFloat)loadProgress {

_loadProgress = loadProgress;

[selfsetNeedsDisplay];

}

- (void)setProgressColor:(UIColor *)progressColor {

_progressColor = progressColor;

[selfsetNeedsDisplay];

}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    // background track
    CGContextAddRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));
    [self.progressBgColor set];
    CGContextSetAlpha(context, 0.5);
    CGContextDrawPath(context, kCGPathFill);
    // loaded portion (drawn with its own color, not the background color)
    CGContextAddRect(context, CGRectMake(0, 0, rect.size.width * self.loadProgress, rect.size.height));
    [self.loadProgressColor set];
    CGContextSetAlpha(context, 1);
    CGContextDrawPath(context, kCGPathFill);
    // current progress
    CGContextAddRect(context, CGRectMake(0, 0, rect.size.width * self.progress, rect.size.height));
    [self.progressColor set];
    CGContextSetAlpha(context, 1);
    CGContextDrawPath(context, kCGPathFill);
}

@end

=====================

#import <UIKit/UIKit.h>

IB_DESIGNABLE
@interface WCLRecordProgressView : UIView

@property (assign, nonatomic) IBInspectable CGFloat progress; // current progress
@property (strong, nonatomic) IBInspectable UIColor *progressBgColor; // track background color
@property (strong, nonatomic) IBInspectable UIColor *progressColor; // progress color
@property (assign, nonatomic) CGFloat loadProgress; // loaded progress
@property (strong, nonatomic) UIColor *loadProgressColor; // color of the loaded progress

@end

Source: https://github.com/631106979/WCLRecordVideo.git