I'm using an AVCaptureMovieFileOutput to record some video. I have the preview layer displayed using AVLayerVideoGravityResizeAspectFill, which zooms in slightly. The problem I have is that the final video is larger, containing extra image that didn't fit on the screen during the preview.

This is the preview and the resulting video:

Is there a way I can specify a CGRect that I want cut from the video using AVAssetExportSession?
EDIT ----

When I apply a CGAffineTransformScale to the AVAssetTrack it zooms into the video, and with the AVMutableVideoComposition renderSize set to view.bounds it crops off the ends. Great, there's just one problem left: the width of the video does not stretch to the correct width, it just gets filled with black.

EDIT 2 ----

The suggested question/answer is incomplete..
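The question calls a `getScaleFromAsset:` helper that isn't shown, so this is only a guess at the math it performs: an aspect-fill scale has to take the *larger* of the two per-axis ratios so the track covers the render rect in both dimensions (taking the smaller one leaves black bars on one axis, which matches the symptom described). A minimal sketch in plain C, with all names mine:

```c
/* Hypothetical stand-in for the question's getScaleFromAsset: helper.
 * Aspect-fill scale: the larger of the two per-axis ratios, so the
 * scaled source covers the destination rect in both dimensions. */
double aspect_fill_scale(double srcW, double srcH, double dstW, double dstH) {
    double sx = dstW / srcW;   /* scale needed to cover the width  */
    double sy = dstH / srcH;   /* scale needed to cover the height */
    return sx > sy ? sx : sy;  /* cover both => take the max       */
}
```

If the helper instead returns the minimum of the two ratios (aspect-fit), the render size is covered in one dimension only and the remainder is filled with black, exactly as described above.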
Some of my code:

In my - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error method I have this to crop and resize the video:
- (void)flipAndSave:(NSURL *)videoURL withCompletionBlock:(void(^)(NSURL *returnURL))completionBlock
{
    AVURLAsset *firstAsset = [AVURLAsset assetWithURL:videoURL];

    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 2 - Video track
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:nil];

    // 2.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(CMTimeMakeWithSeconds(0, 600), firstAsset.duration);

    // 2.2 - Create an AVMutableVideoCompositionLayerInstruction for the first track
    AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVAssetTrack *firstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation firstAssetOrientation_ = UIImageOrientationUp;
    BOOL isFirstAssetPortrait_ = NO;
    CGAffineTransform firstTransform = firstAssetTrack.preferredTransform;
    if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ = UIImageOrientationRight;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ = UIImageOrientationLeft;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
        firstAssetOrientation_ = UIImageOrientationUp;
    }
    if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
        firstAssetOrientation_ = UIImageOrientationDown;
    }

    // [firstlayerInstruction setTransform:firstAssetTrack.preferredTransform atTime:kCMTimeZero];
    // [firstlayerInstruction setCropRectangle:self.view.bounds atTime:kCMTimeZero];
    CGFloat scale = [self getScaleFromAsset:firstAssetTrack];

    firstTransform = CGAffineTransformScale(firstTransform, scale, scale);
    [firstlayerInstruction setTransform:firstTransform atTime:kCMTimeZero];

    // 2.4 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction, nil];
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    // CGSize videoSize = firstAssetTrack.naturalSize;
    CGSize videoSize = self.view.bounds.size;
    BOOL isPortrait_ = [self isVideoPortrait:firstAsset];
    if (isPortrait_) {
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    NSLog(@"%@", NSStringFromCGSize(videoSize));
    mainCompositionInst.renderSize = videoSize;

    // 3 - Audio track
    AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [AudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:nil];

    // 4 - Get path
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"cutoutput.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath]) {
        [manager removeItemAtPath:outputPath error:nil];
    }

    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@ : %@", [[exporter error] localizedDescription], [exporter error]);
                completionBlock(nil);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                completionBlock(nil);
                break;
            default: {
                NSURL *exportedURL = exporter.outputURL;
                dispatch_async(dispatch_get_main_queue(), ^{
                    completionBlock(exportedURL);
                });
                break;
            }
        }
    }];
}
Answer from Matt (score: 20):
Here is my interpretation of your question: you are capturing video on a device with a 4:3 screen ratio, so your AVCaptureVideoPreviewLayer is 4:3, but the video input device captures video in 16:9, so the resulting video is "larger" than what you see in the preview.

If you are simply looking to crop the extra pixels not caught by the preview, then check out http://www.netwalk.be/article/record-square-video-ios. That article shows how to crop the video into a square, but it takes only a small tweak to crop to 4:3 instead. I've gone and tested this; here are the changes I made:
Once you have the AVAssetTrack for the video, you will need to calculate a new height.
// we convert the captured height (e.g. 1080) to a 4:3 screen ratio and get the new height
CGFloat newHeight = clipVideoTrack.naturalSize.height/3*4;
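To make the arithmetic concrete (this sketch is mine, not from the linked article), the 4:3 complement of a 1080-pixel captured height works out to 1440:

```c
/* 4:3 complement of the captured height: height / 3 * 4.
 * Plain-C restatement of the line above, for checking the math. */
double complement_4_3(double height) {
    return height / 3.0 * 4.0;   /* e.g. 1080 -> 1440 */
}
```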
Then modify these two lines, using newHeight:
videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.height, newHeight);
CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -(clipVideoTrack.naturalSize.width - newHeight)/2);
So what we've done here is set the renderSize to a 4:3 ratio, with the exact dimensions based on the input device. We then use a CGAffineTransform to translate the video's position so that what we saw in the AVCaptureVideoPreviewLayer is what is rendered to our file.
Edit: If you want to put it all together and crop a video based on the device's screen ratio (3:2, 4:3, 16:9), while also taking the video orientation into account, we need to add a few things.

First is the modified sample code with a few important alterations:
// output file
NSString* docFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString* outputPath = [docFolder stringByAppendingPathComponent:@"output2.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath])
    [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];

// input file
AVAsset* asset = [AVAsset assetWithURL:outputFileURL];
AVMutableComposition *composition = [AVMutableComposition composition];
[composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

// input clip
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// crop clip to screen ratio
UIInterfaceOrientation orientation = [self orientationForTrack:asset];
BOOL isPortrait = (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationPortraitUpsideDown) ? YES : NO;
CGFloat complimentSize = [self getComplimentSize:videoTrack.naturalSize.height];
CGSize videoSize;
if (isPortrait) {
    videoSize = CGSizeMake(videoTrack.naturalSize.height, complimentSize);
} else {
    videoSize = CGSizeMake(complimentSize, videoTrack.naturalSize.height);
}

AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = videoSize;
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));

// rotate and position video
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
CGFloat tx = (videoTrack.naturalSize.width - complimentSize) / 2;
if (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationLandscapeRight) {
    // invert translation
    tx *= -1;
}

// t1: rotate and position video since it may have been cropped to screen ratio
CGAffineTransform t1 = CGAffineTransformTranslate(videoTrack.preferredTransform, tx, 0);
// t2/t3: mirror video horizontally
CGAffineTransform t2 = CGAffineTransformTranslate(t1, isPortrait ? 0 : videoTrack.naturalSize.width, isPortrait ? videoTrack.naturalSize.height : 0);
CGAffineTransform t3 = CGAffineTransformScale(t2, isPortrait ? 1 : -1, isPortrait ? -1 : 1);
[transformer setTransform:t3 atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

// export
exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = [NSURL fileURLWithPath:outputPath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^(void){
    NSLog(@"Exporting done!");

    // added export to library for testing
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]]) {
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
            NSLog(@"Saved to album");
            if (error) {
            }
        }];
    }
}];
What we've added here is a call that gets the video's new render size based on cropping its dimensions to the screen ratio. Once we crop the size down, we need to translate the position to recenter the video, so we grab its orientation to move it in the proper direction. This fixes the off-center issue we saw with UIInterfaceOrientationLandscapeLeft. Finally, CGAffineTransform t2 and t3 mirror the video horizontally.
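The translate-then-negative-scale pair is a standard way to flip an axis: because CGAffineTransformScale prepends the scale, a point is negated first and then shifted by the axis length, so a coordinate y becomes size - y. A minimal plain-C sketch of that composition (names mine):

```c
/* What the t2/t3 pair does to the mirrored axis: scale by -1,
 * then translate by the axis length, mapping y to size - y so
 * the flipped image stays inside [0, size]. */
double mirror_coord(double y, double size) {
    double scaled = -1.0 * y;      /* t3: scale component */
    return scaled + size;          /* t2: translate component */
}
```

Without the translation, the negative scale alone would flip the video out of the visible render rect (into negative coordinates), which is why both steps are needed.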
And here are the two new methods that make this happen:
- (CGFloat)getComplimentSize:(CGFloat)size {
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGFloat ratio = screenRect.size.height / screenRect.size.width;

    // we have to adjust the ratio for 16:9 screens
    if (ratio == 1.775) ratio = 1.77777777777778;

    return size * ratio;
}

- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
    UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;

        // Portrait
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortrait;
        }
        // PortraitUpsideDown
        if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            orientation = UIInterfaceOrientationLandscapeRight;
        }
        // LandscapeLeft
        if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            orientation = UIInterfaceOrientationLandscapeLeft;
        }
    }
    return orientation;
}
These are pretty straightforward. The only thing to note is that in the getComplimentSize: method we have to manually adjust the ratio for 16:9, because the iPhone 5+ resolution is mathematically shy of true 16:9.
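To see why that special case exists: the iPhone 5-family screen is 568×320 points, and 568/320 = 1.775 exactly, whereas true 16:9 is 1.777…, so scaling by the raw screen ratio would produce a slightly-off crop. A plain-C sketch of the adjustment (the function and the tolerance-based comparison are mine; the original compares with == on a CGFloat):

```c
/* Sketch of the getComplimentSize: ratio adjustment: the iPhone
 * 5-family screen ratio (568/320 = 1.775) is bumped up to true
 * 16:9 before scaling. A small tolerance replaces the original's
 * exact floating-point == comparison. */
double adjusted_ratio(double screenH, double screenW) {
    double ratio = screenH / screenW;
    if (ratio > 1.7749 && ratio < 1.7751)   /* the 1.775 special case */
        ratio = 16.0 / 9.0;                 /* ~1.77777777777778 */
    return ratio;
}
```

Other screen ratios (3:2 and 4:3 devices) pass through unchanged, which is why only this one case is handled.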