iOS Swift - AVCaptureSession - Capturing frames at a specific frame rate
I'm trying to build an app that captures frames from the camera at a specific frame rate and processes them with OpenCV before saving them to the device. What I'm currently stuck on is that AVCaptureVideoDataOutputSampleBufferDelegate does not seem to respect the AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration settings: captureOutput runs far faster than the 2 frames per second those settings should produce. Do you happen to know how this can be done, with or without a delegate?
View controller:
override func viewDidLoad() {
    super.viewDidLoad()
}

override func viewDidAppear(animated: Bool) {
    setupCaptureSession()
}

func setupCaptureSession() {
    let session : AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset1280x720
    let videoDevices : [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]
    for device in videoDevices {
        if device.position == AVCaptureDevicePosition.Back {
            let captureDevice : AVCaptureDevice = device
            do {
                try captureDevice.lockForConfiguration()
                captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
                captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
                captureDevice.unlockForConfiguration()
                let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
                if session.canAddInput(input) {
                    session.addInput(input)
                }
                let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
                let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
                output.setSampleBufferDelegate(self, queue: dispatch_queue)
                session.addOutput(output)
                session.startRunning()
                let previewLayer = AVCaptureVideoPreviewLayer(session: session)
                previewLayer.connection.videoOrientation = .LandscapeRight
                let previewBounds : CGRect = CGRectMake(0, 0, self.view.frame.width/2, self.view.frame.height+20)
                previewLayer.backgroundColor = UIColor.blackColor().CGColor
                previewLayer.frame = previewBounds
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                self.imageView.layer.addSublayer(previewLayer)
                self.previewMat.frame = CGRectMake(previewBounds.width, 0, previewBounds.width, previewBounds.height)
            } catch _ {
            }
            break
        }
    }
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
}
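As an aside, even when the device setting does not stick, frames can be dropped manually in the delegate by comparing timestamps. Below is a minimal, platform-independent sketch of that idea; the `FrameThrottle` type and its names are illustrative, not from the post. On iOS you would feed it `CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds` inside `captureOutput`:

```swift
import Foundation

/// Decides whether to keep a frame so that kept frames are spaced
/// at least 1 / targetFPS seconds apart.
struct FrameThrottle {
    let minInterval: Double
    private var lastKept: Double? = nil

    init(targetFPS: Double) {
        self.minInterval = 1.0 / targetFPS
    }

    /// Returns true if the frame at `timestamp` (in seconds) should be processed.
    mutating func shouldKeep(timestamp: Double) -> Bool {
        if let last = lastKept, timestamp - last < minInterval {
            return false // too soon after the last kept frame: drop it
        }
        lastKept = timestamp
        return true
    }
}

// Simulate a 30 fps stream (2 seconds of frames) throttled down to 2 fps.
var throttle = FrameThrottle(targetFPS: 2)
let kept = (0..<60).map { Double($0) / 30.0 }
                   .filter { throttle.shouldKeep(timestamp: $0) }
print(kept) // [0.0, 0.5, 1.0, 1.5]
```

This wastes the work of delivering the dropped frames, so fixing the device configuration (as in the accepted answer below the question) is preferable; the throttle is only a fallback.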
So I've solved the problem. In the comment block above the activeVideoMinFrameDuration property in AVCaptureDevice.h, it states:

On iOS, the receiver's activeVideoMinFrameDuration resets to its default value under the following conditions:
- The receiver's activeFormat changes
- The receiver's AVCaptureDeviceInput's session's sessionPreset changes
- The receiver's AVCaptureDeviceInput is added to a session
do {
    let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
    if session.canAddInput(input) {
        session.addInput(input)
    }
    // Configure the frame rate only AFTER the input has been added to the
    // session; otherwise the durations reset to their default values.
    try captureDevice.lockForConfiguration()
    captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
    captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
    captureDevice.unlockForConfiguration()
    let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
    let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
    output.setSampleBufferDelegate(self, queue: dispatch_queue)
    session.addOutput(output)
The key part here is the header comment about when the values get reset! That is exactly what bit me: I was setting the frame rate before adding the device input to the session.
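To confirm the fix actually took effect, one can collect the presentation timestamps seen in captureOutput and estimate the achieved rate. A small helper for that (the function name is my own, not from the post; on iOS the timestamps would come from `CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds`):

```swift
import Foundation

/// Estimates the achieved frame rate from a list of frame timestamps
/// (in seconds): (n - 1) inter-frame intervals over the elapsed time.
func measuredFPS(timestamps: [Double]) -> Double {
    guard timestamps.count >= 2,
          let first = timestamps.first,
          let last = timestamps.last,
          last > first else {
        return 0
    }
    return Double(timestamps.count - 1) / (last - first)
}

// Frames arriving every 0.5 s should report 2 fps,
// matching CMTimeMake(1, 2) above.
print(measuredFPS(timestamps: [0.0, 0.5, 1.0, 1.5, 2.0])) // 2.0
```

If this still reports the sensor's native rate (e.g. ~30 fps), the durations were most likely set before the input was added to the session, per the reset conditions quoted above.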