iPhone: why is the AVCaptureSession output orientation wrong?

So, following Apple's instructions, I am capturing a video session using AVCaptureSession. One problem I am facing is that even though the orientation of the camera / iPhone device is vertical (and the AVCaptureVideoPreviewLayer shows a vertical camera stream), the output image seems to be in landscape mode. I checked the width and height of the imageBuffer inside imageFromSampleBuffer: of the sample code, and I got 640px and 480px respectively. Does anyone know why this happens?

Thanks

Take a look at the header AVCaptureSession.h. There is an enum called AVCaptureVideoOrientation that defines the various video orientations. On the AVCaptureConnection object there is a property called videoOrientation, which is an AVCaptureVideoOrientation. You should be able to set this to change the orientation of the video. You probably want AVCaptureVideoOrientationLandscapeRight or AVCaptureVideoOrientationLandscapeLeft. You can find the AVCaptureConnections for the session by looking at the session's outputs: each output has a connections property, which is an array of that output's connections.

I made a simple one-line modification to imageFromSampleBuffer to correct the orientation problem (see the comment beginning "I modified ..." in the code below). Hope it helps someone, because I spent way too much time on this:
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context1 = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context1);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Free up the context and color space
    CGContextRelease(context1);
    CGColorSpaceRelease(colorSpace);
    // Create an image object from the Quartz image
    // I modified this line: [UIImage imageWithCGImage:quartzImage]; to the following to correct the orientation:
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0 orientation:UIImageOrientationRight];
    // Release the Quartz image
    CGImageRelease(quartzImage);
    return image;
}
For example:
AVCaptureConnection *captureConnection = <a capture connection>;
if ([captureConnection isVideoOrientationSupported]) {
    captureConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}
Here is the correct order:
self.videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([self.captureSession canAddOutput:self.videoCaptureOutput]) {
    [self.captureSession addOutput:self.videoCaptureOutput];
} else {
    NSLog(@"cantAddOutput");
}
// set portrait orientation
AVCaptureConnection *conn = [self.videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoOrientation:AVCaptureVideoOrientationPortrait];
The orientation problem occurs with the front camera, so check the device type and generate a new image; this will definitely fix the orientation problem:
- (void)capture:(void (^)(UIImage *))handler {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *capturedImage = [UIImage imageWithData:imageData];
            if (self.captureDevice == [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo][1]) {
                capturedImage = [[UIImage alloc] initWithCGImage:capturedImage.CGImage scale:1.0f orientation:UIImageOrientationLeftMirrored];
            }
            handler(capturedImage);
        }
    }];
}
You guys are making this too hard. In DidOutputSampleBuffer, just change the orientation before you grab the image. This is in Mono (Xamarin), but you get the idea:
public class OutputRecorder : AVCaptureVideoDataOutputSampleBufferDelegate {
    public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        try {
            connection.VideoOrientation = AVCaptureVideoOrientation.LandscapeLeft;
            // ... grab the image from sampleBuffer here ...
        } finally {
            sampleBuffer.Dispose ();
        }
    }
}
In Obj-C, it is this method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
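A filled-in sketch of that delegate method (a minimal illustration; the portrait choice here is an example, pick the value that matches your UI):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Set the desired orientation on the connection before using the frame.
    if ([connection isVideoOrientationSupported] &&
        connection.videoOrientation != AVCaptureVideoOrientationPortrait) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    // ... then convert sampleBuffer to a UIImage as usual ...
}
```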
If the AVCaptureVideoPreviewLayer orientation is correct, you can simply set the orientation before you capture the image:
AVCaptureStillImageOutput *stillImageOutput;
AVCaptureVideoPreviewLayer *previewLayer;
NSData *capturedImageData;

AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if ([videoConnection isVideoOrientationSupported]) {
    [videoConnection setVideoOrientation:previewLayer.connection.videoOrientation];
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    CFDictionaryRef exifAttachments =
        CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
        // Do something with the attachments.
    }
    // TODO need to manually add GPS data to the image captured
    capturedImageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [UIImage imageWithData:capturedImageData];
}];
Also worth noting: UIImageOrientation and AVCaptureVideoOrientation are different. UIImageOrientationUp refers to landscape mode with the volume controls down toward the ground (not up, in case you were thinking of using the volume controls as a shutter button). Thus, portrait orientation with the power button pointing to the sky (AVCaptureVideoOrientationPortrait) is actually UIImageOrientationLeft.
For those of you who need to work with CIImage and find the orientation from the buffer wrong, I used this correction. It is as simple as that. By the way, the numbers 3, 1, 6, 8 are EXIF/TIFF orientation values, which is what imageByApplyingOrientation: expects. Don't ask me why 3, 1, 6, 8 is the right combination; I found it by brute force. If you know why, please leave an explanation in the comments:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // common way to get CIImage
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                      options:(__bridge NSDictionary *)attachments];
    if (attachments) {
        CFRelease(attachments);
    }
    // fixing the orientation of the CIImage
    UIInterfaceOrientation curOrientation = [[UIApplication sharedApplication] statusBarOrientation];
    if (curOrientation == UIInterfaceOrientationLandscapeLeft) {
        ciImage = [ciImage imageByApplyingOrientation:3];
    } else if (curOrientation == UIInterfaceOrientationLandscapeRight) {
        ciImage = [ciImage imageByApplyingOrientation:1];
    } else if (curOrientation == UIInterfaceOrientationPortrait) {
        ciImage = [ciImage imageByApplyingOrientation:6];
    } else if (curOrientation == UIInterfaceOrientationPortraitUpsideDown) {
        ciImage = [ciImage imageByApplyingOrientation:8];
    }
    // ....
}
You can try this:
private func startLiveVideo() {
    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = .photo
    let captureDevice = AVCaptureDevice.default(for: .video)
    let input = try! AVCaptureDeviceInput(device: captureDevice!)
    let output = AVCaptureVideoDataOutput()
    captureSession.addInput(input)
    captureSession.addOutput(output)
    output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
    output.connection(with: .video)?.videoOrientation = .portrait
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)
    captureSession.startRunning()
}
First, in the configuration of the video output, put these lines:
guard let connection = videoOutput.connection(withMediaType: AVFoundation.AVMediaTypeVideo) else { return }
guard connection.isVideoOrientationSupported else { return }
guard connection.isVideoMirroringSupported else { return }
connection.videoOrientation = .portrait
connection.isVideoMirrored = position == .front
Then, configure your target to support only Portrait, by unchecking the landscape modes in the General configuration of the target.
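If you prefer doing this in code rather than in the target settings, the standard UIKit overrides can lock a view controller to portrait; a minimal sketch (shown in Objective-C):

```objc
// In your camera view controller: report portrait as the only supported orientation.
- (UIInterfaceOrientationMask)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskPortrait;
}

- (BOOL)shouldAutorotate {
    return NO;
}
```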
Once you are able to use AVCaptureSession correctly, you can set the video orientation.

Below is a detailed description of the numbered comments in the code that follows. Keep in mind that this code must be executed after [captureSession startRunning] has been called:
#1: Choose your preferred orientation.
#2: For iOS >= 13.0 you must retrieve the active connection from the captureSession. Remember: only video connections support videoOrientation.
#3: For iOS < 13.0 you can use the previewLayer's connection instead.
If your view controller does not have a fixed orientation, you can set a new videoOrientation on the connection after the device orientation changes.
I get "Property 'videoOrientation' not found on object of type 'AVCaptureSession *'".
@cheesus: videoOrientation is on AVCaptureConnection, not on AVCaptureSession.
But where in AVCaptureConnection should I set this property? Inside the delegate method? Something like: - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { [connection setVideoOrientation:AVCaptureVideoOrientationPortrait]; }
@rptwsthi: you can set it right after adding the output to the session: captureSession.addOutput(videoOutput); let connection = videoOutput.connectionWithMediaType(AVFoundation.AVMediaTypeVideo); connection.videoOrientation = .Portrait; captureSession.startRunning()
This worked for me; I added it to my AVCaptureConnection handler. It rotated the image as expected, but it mirrored the video. I see that isMirrored is deprecated, so it is unclear how to fix the mirroring, because the new mirroring property has no usage documentation.
@Jim: mirrored is deprecated; you should use the videoMirrored property of AVCaptureConnection instead. It is documented at least in the header file, where mirrored and friends explicitly refer you to videoMirrored and the matching capture-connection methods. It is also in the online documentation; what makes you think it is unavailable on iOS?
Thanks, this helped me. I couldn't make sense of it at first because I wasn't creating an AVCaptureConnection myself, but that's the trick. Thanks!
@ReidBelton: you don't create the AVCaptureConnection yourself; it is created automatically when you have an input and an output. You just retrieve it and change the orientation.
// #1
AVCaptureVideoOrientation newOrientation = AVCaptureVideoOrientationLandscapeRight;
if (@available(iOS 13.0, *)) {
    // #2
    for (AVCaptureConnection *connection in [captureSession connections]) {
        if ([connection isVideoOrientationSupported]) {
            connection.videoOrientation = newOrientation;
            break;
        }
    }
// #3
} else if ([previewLayer.connection isVideoOrientationSupported]) {
    previewLayer.connection.videoOrientation = newOrientation;
}
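The last point above, updating videoOrientation when the device rotates, can be sketched with a notification observer. This is an illustrative sketch: captureSession is assumed to be your running session, and note that the device and capture orientations are swapped in the landscape cases.

```objc
// Illustrative: re-apply the video orientation whenever the device rotates.
// Requires [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications].
[[NSNotificationCenter defaultCenter] addObserverForName:UIDeviceOrientationDidChangeNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    AVCaptureVideoOrientation newOrientation;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:           newOrientation = AVCaptureVideoOrientationPortrait; break;
        case UIDeviceOrientationPortraitUpsideDown: newOrientation = AVCaptureVideoOrientationPortraitUpsideDown; break;
        // Device and capture orientations are mirrored in landscape.
        case UIDeviceOrientationLandscapeLeft:      newOrientation = AVCaptureVideoOrientationLandscapeRight; break;
        case UIDeviceOrientationLandscapeRight:     newOrientation = AVCaptureVideoOrientationLandscapeLeft; break;
        default: return; // face up / face down: keep the current orientation
    }
    for (AVCaptureConnection *connection in captureSession.connections) {
        if ([connection isVideoOrientationSupported]) {
            connection.videoOrientation = newOrientation;
        }
    }
}];
```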