iOS: capture a still image from the camera as an OpenCV Mat
I am developing an iOS application and trying to get a still-image snapshot from the camera using a capture session, but I am unable to convert it successfully to an OpenCV Mat. The still image output is created with the following code:
- (void)createStillImageOutput
{
    // setup still image output with jpeg codec
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:outputSettings];
    [self.captureSession addOutput:self.stillImageOutput];

    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([port.mediaType isEqual:AVMediaTypeVideo]) {
                self.videoCaptureConnection = connection;
                break;
            }
        }
        if (self.videoCaptureConnection) {
            break;
        }
    }
    NSLog(@"[Camera] still image output created");
}
A still-image capture is then triggered with the following code (note that jpegData is only in scope inside the if block, and the trailing }]; closes the completion handler):

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:self.videoCaptureConnection
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (error == nil && imageSampleBuffer != NULL)
        {
            NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *newImage = [UIImage imageWithData:jpegData];
        }
    }];
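For reference: since the output above is configured with the JPEG codec, one way to obtain a cv::Mat inside the completion handler is to decode the JPEG bytes directly with cv::imdecode. This is a minimal sketch, not the asker's code; cvMatFromJpegData is a hypothetical helper name, and the file must be compiled as Objective-C++ (.mm) with opencv2 available:

```objectivec
#import <opencv2/opencv.hpp>

// Hypothetical helper: build a cv::Mat by decoding the JPEG bytes
// returned by jpegStillImageNSDataRepresentation:.
static cv::Mat cvMatFromJpegData(NSData *jpegData)
{
    // Wrap the NSData bytes in a 1-row Mat without copying, then decode.
    cv::Mat encoded(1, (int)jpegData.length, CV_8UC1, (void *)jpegData.bytes);
    // IMREAD_COLOR yields a 3-channel BGR image; imdecode copies the
    // pixels, so the returned Mat stays valid after jpegData is released.
    return cv::imdecode(encoded, cv::IMREAD_COLOR);
}
```

Decoding costs a little CPU time per shot, but it sidesteps the pixel-buffer issue described below entirely.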
I need a way to create an OpenCV Mat based on the pixel data in the buffer.
I have tried to create the Mat using code taken from the OpenCV CvVideoCamera class, but the CVPixelBufferGetWidth and CVPixelBufferGetHeight calls fail to return the image's actual width and height, so the Mat creation fails.
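A likely cause (my assumption, not stated in the question): with AVVideoCodecJPEG output the sample buffer carries compressed JPEG data in a CMBlockBuffer rather than a CVPixelBuffer, so CMSampleBufferGetImageBuffer returns NULL and the width/height calls report 0. Requesting uncompressed BGRA frames instead makes the CvVideoCamera-style wrapping work; a sketch under that assumption:

```objectivec
#import <opencv2/opencv.hpp>

// When configuring the output, replace the JPEG settings with a request
// for uncompressed 32BGRA frames:
NSDictionary *outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                      @(kCVPixelFormatType_32BGRA) };
[self.stillImageOutput setOutputSettings:outputSettings];

// In the completion handler, the pixel buffer is then non-NULL:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
void *baseAddress   = CVPixelBufferGetBaseAddress(pixelBuffer);
size_t width        = CVPixelBufferGetWidth(pixelBuffer);
size_t height       = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(pixelBuffer);
// Wrap the buffer without copying; clone so the Mat outlives the lock.
cv::Mat bgra((int)height, (int)width, CV_8UC4, baseAddress, bytesPerRow);
cv::Mat image = bgra.clone();
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```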
I know that I can create a UIImage based on the pixel data, but I would prefer to construct the cv::Mat directly, the way the OpenCV CvVideoCamera class does, because I am only interested in the image inside OpenCV; I do not want to spend time converting it again, and I do not want to risk quality loss or orientation problems. (Besides, the UIImage-to-cv conversion function that OpenCV provides causes a memory leak for me and never frees the memory.)
Please advise how I can get the image as an OpenCV Mat. Thanks in advance.

I have found the solution to my problem. The solution is to override the "createVideoPreviewLayer" method in the OpenCV camera class so that it looks like this:
- (void)createVideoPreviewLayer
{
    self.parentView.layer.sublayers = nil;
    if (captureVideoPreviewLayer == nil) {
        captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc]
                                    initWithSession:self.captureSession];
    }
    if (self.parentView != nil) {
        captureVideoPreviewLayer.frame = self.parentView.bounds;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.parentView.layer addSublayer:captureVideoPreviewLayer];
    }
    NSLog(@"[Camera] created AVCaptureVideoPreviewLayer");
}
The line you need to add to the "createVideoPreviewLayer" method, which fixes the problem, is:

    self.parentView.layer.sublayers = nil;
You also need to use the pause method instead of the stop method.