Quality: custom AVFoundation camera app vs. the standard iOS camera app


I have run many tests with different subjects and lighting conditions. Every test shows that the standard iOS camera app produces noticeably better quality than my custom AVFoundation-based app (colors don't fade, better focus, better lighting, less grain). I can't explain these large differences. Below are example screenshots from videos captured both ways (using the front camera).

Standard iOS camera app

Video recorded with the custom AVFoundation code

Code of the custom implementation:

let chosenCameraType = AVCaptureDevicePosition.Front

//get camera
let devices = AVCaptureDevice.devices()
for device in devices
{
    if (!device.hasMediaType(AVMediaTypeVideo))
    {
        continue
    }

    if (device.position != chosenCameraType)
    {
        continue
    }

    camera = (device as? AVCaptureDevice)!
}

do
{
    captureSession = AVCaptureSession()
    captureSession!.sessionPreset = AVCaptureSessionPresetHigh      

    let video = try AVCaptureDeviceInput(device: camera) as AVCaptureDeviceInput
    captureSession!.addInput(video)

    let audio = try AVCaptureDeviceInput(device: AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)) as AVCaptureDeviceInput
    captureSession!.addInput(audio)

    fileOutput = AVCaptureMovieFileOutput()
    captureSession?.addOutput(fileOutput)

    captureSession!.startRunning()

    let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString

    let name = String(UInt64(NSDate().timeIntervalSince1970 * 1000))
    fileOutput?.startRecordingToOutputFileURL(NSURL(fileURLWithPath: "\(documentsPath)/" + name + ".mov"), recordingDelegate: self)
}
catch let error as NSError
{
    print(error)
}

Try it yourself! You will see the difference too…

I have noticed what you are describing, and looking at your code I don't see where you implement:

backCamera.focusPointOfInterest = focusPoint
backCamera.focusMode = AVCaptureFocusMode.autoFocus
backCamera.exposureMode = AVCaptureExposureMode.autoExpose
I implemented this inside `touchesBegan` so that the camera focuses on the area of the screen the user touches. Here is the relevant part of the code:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touchPoint = touches.first else { return }

    let location = touchPoint.location(in: self.view)
    // Normalize to the 0...1 coordinate space expected by focusPointOfInterest.
    let focusPoint = CGPoint(x: location.x / self.view.bounds.width,
                             y: location.y / self.view.bounds.height)

    // Visual feedback: draw a square centered on the tapped point.
    let k = DrawSquare(frame: CGRect(
        origin: CGPoint(x: location.x - 75, y: location.y - 75),
        size: CGSize(width: 150, height: 150)))
    self.previewView.addSubview(k)

    if backCamera != nil {
        do {
            // Configure only while the lock is held; if locking fails,
            // skip the configuration entirely.
            try backCamera.lockForConfiguration()
            if backCamera.isFocusPointOfInterestSupported {
                backCamera.focusPointOfInterest = focusPoint
            }
            if backCamera.isFocusModeSupported(.autoFocus) {
                backCamera.focusMode = .autoFocus
            }
            if backCamera.isExposureModeSupported(.autoExpose) {
                backCamera.exposureMode = .autoExpose
            }
            backCamera.unlockForConfiguration()
        }
        catch {
            print("Can't lock back camera for configuration")
        }
    }

    DispatchQueue.main.asyncAfter(deadline: .now() + .milliseconds(500)) {
        k.removeFromSuperview()
    }
}

When the camera shows a dark image, the user can simply tap the screen and the exposure and focus will adjust, brightening the picture.
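One caveat with the code above: dividing the touch location by the view's bounds only matches the device coordinate space when the preview fills the view in a fixed orientation. A more robust approach (a sketch, assuming the preview is shown through an `AVCaptureVideoPreviewLayer` named `previewLayer`, which is not in the original code) is to let the preview layer do the conversion:

```swift
import AVFoundation
import UIKit

// Convert a tap in view coordinates into the normalized (0...1)
// device coordinate space expected by focusPointOfInterest.
// `previewLayer` is assumed to be the AVCaptureVideoPreviewLayer
// displaying the session's output.
func focus(at viewPoint: CGPoint,
           previewLayer: AVCaptureVideoPreviewLayer,
           device: AVCaptureDevice) {
    // The layer accounts for orientation, mirroring, and videoGravity cropping.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: viewPoint)

    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = devicePoint
        }
        if device.isFocusModeSupported(.continuousAutoFocus) {
            // .continuousAutoFocus keeps refocusing; .autoFocus locks after one pass.
            device.focusMode = .continuousAutoFocus
        }
        // Also move the exposure point, so a tap brightens dark scenes.
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
        }
        if device.isExposureModeSupported(.continuousAutoExposure) {
            device.exposureMode = .continuousAutoExposure
        }
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```

Setting `exposurePointOfInterest` alongside the focus point is what makes a tap on a dark area actually brighten the image, which the standard camera app does as well.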

Were all your tests done in low light or under artificial lighting? You may need to enable low-light boost; there is a property on AVCaptureDevice, automaticallyEnablesLowLightBoostWhenAvailable. That could account for the difference you are seeing.

Yes, that test was done in low light. I tried enabling the property you suggested, but it reports that it isn't supported when testing with the front camera. The main picture I posted shows the lighting problem, but my other tests show more issues (especially when I shot portraits of myself). Compared with my AVFoundation tests, the default camera app looks spectacular (no fading, proper lighting, less grain, sharp, etc.). My AVFoundation tests look like they were taken with a five-cent CMOS sensor.

Have you tried other approaches to low-light boosting? Have you compared your results against Apple's AVCam iOS sample code?

I think your test is somewhat invalid; AVFoundation has plenty of options for recreating the camera app's quality. For example, you don't set an AVCaptureDeviceFormat, and you don't set exposure or focus; you are essentially running on defaults. So your results depend heavily on your implementation — sorry, but it is that simple. For a better comparison, try downloading this. What is your session preset?
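The two suggestions from the comments — opting in to low-light boost and choosing an explicit capture format instead of relying on `AVCaptureSessionPresetHigh` — could be sketched like this (the "highest resolution that supports 30 fps" criterion is an illustrative assumption, not something from the question):

```swift
import AVFoundation
import CoreMedia

// Sketch: pick a device format explicitly and opt in to low-light boost,
// instead of relying on the session preset's defaults.
func configure(device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Low-light boost is only available on some devices/cameras,
    // so it must be gated on isLowLightBoostSupported.
    if device.isLowLightBoostSupported {
        device.automaticallyEnablesLowLightBoostWhenAvailable = true
    }

    // Choose the highest-resolution format that can sustain 30 fps.
    // (The selection criteria here are an assumption for illustration.)
    let best = device.formats
        .filter { format in
            format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 30 }
        }
        .max { a, b in
            let da = CMVideoFormatDescriptionGetDimensions(a.formatDescription)
            let db = CMVideoFormatDescriptionGetDimensions(b.formatDescription)
            return da.width * da.height < db.width * db.height
        }

    if let format = best {
        device.activeFormat = format
    }
}
```

Note that setting `activeFormat` overrides the session preset, so this would replace, not complement, the `sessionPreset = AVCaptureSessionPresetHigh` line from the question's code.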