
iOS Swift 3 - custom camera view - show a still image of the photo after the button is tapped


I'm using Swift 3 with Xcode 8.2.

I have a custom camera view that shows the video feed, plus a button that I want to act as a shutter. When the user taps the button, I want a photo to be taken and displayed on the screen (for example, camera behavior in the style of Snapchat or Facebook Messenger).

Here is my code:

import UIKit
import AVFoundation

class CameraVC: UIViewController, AVCapturePhotoCaptureDelegate {

   // this is where the camera feed from the phone is going to be displayed
    @IBOutlet var cameraView : UIView!

    var shutterButton : UIButton = UIButton.init(type: .custom)

    // manages capture activity and coordinates the flow of data from input devices to capture outputs.
    var capture_session = AVCaptureSession()

    // a capture output for use in workflows related to still photography.
    var session_output = AVCapturePhotoOutput()

    // preview layer that we will have on our view so users can see the photo we took
    var preview_layer = AVCaptureVideoPreviewLayer()

    // still picture image is what we show as the picture taken, frozen on the screen
    var still_picture_image : UIImage!

    ... //more code in viewWillAppear that sets up the camera feed

    // called when the shutter button is pressed
    func shutterButtonPressed() {

        // get the actual video feed and take a photo from that feed
        session_output.capturePhoto(with: AVCapturePhotoSettings.init(format: [AVVideoCodecKey : AVVideoCodecJPEG]), delegate: self as AVCapturePhotoCaptureDelegate)
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        // take the session output, get the buffer, and create an image from that buffer
        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

            print("Here") // doesn't get here

        }

    }
}
When I run this code, it doesn't seem to print "Here", and I can't find any Swift 3 tutorials on how to display this image. I'm guessing I want to assign imageData to my still_picture_image and somehow overlay it on top of the camera feed.

Any help or a nudge in the right direction would be greatly appreciated.

EDIT

After adding the following to my code:

if let error = error {
    print(error.localizedDescription)
}

I still don't get any errors printed out.

Add the following code to your delegate method to print out the error being thrown:

if let error = error {
   print(error.localizedDescription)
}

Once you resolve the error, I think this post should help you extract the image:


OK, I figured out my problem:

First, drag a UIImageView onto the storyboard and make it take up the entire screen. This is where the still picture will be displayed after the shutter button is pressed.

Create that variable in code and link it up:

@IBOutlet weak var stillPicture : UIImageView!
Then, in viewDidLoad, make sure to insert the UIImageView above the camera view:

self.view.insertSubview(stillPicture, aboveSubview: your_camera_view)
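
(This answer assumes the capture session itself is already configured elsewhere, e.g. in viewWillAppear as the question mentions. A minimal sketch of such a setup, using the question's property names but otherwise illustrative, might look like the following; the device lookup and error handling are reduced to a guard, and this is not the poster's actual code:)

```swift
import UIKit
import AVFoundation

// Illustrative Swift 3 / iOS 10 session setup sketch, not the poster's code.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Use the default video camera as input.
    guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
          let input = try? AVCaptureDeviceInput(device: camera) else { return }

    if capture_session.canAddInput(input) {
        capture_session.addInput(input)
    }
    if capture_session.canAddOutput(session_output) {
        capture_session.addOutput(session_output)
    }

    // Show the live feed inside cameraView.
    preview_layer = AVCaptureVideoPreviewLayer(session: capture_session)
    preview_layer.frame = cameraView.bounds
    preview_layer.videoGravity = AVLayerVideoGravityResizeAspectFill
    cameraView.layer.addSublayer(preview_layer)

    // Without this call the feed stays black and no photo is ever delivered.
    capture_session.startRunning()
}
```

Note that forgetting `startRunning()` is one plausible reason the delegate callback never fires, which is also what one of the commenters asks about.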
Here is the function that's called when the shutter button is tapped:

func shutterButtonPressed() {

    let settings = AVCapturePhotoSettings()

    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                         kCVPixelBufferWidthKey as String: 160,
                         kCVPixelBufferHeightKey as String: 160,
                         ]
    settings.previewPhotoFormat = previewFormat
    session_output.capturePhoto(with: settings, delegate: self)
}
Then, in the capture delegate:

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription)
        }

        // take the session output, get the buffer, and create an image from that buffer
        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            // this is the image that the user has taken!
            let takenImage : UIImage = UIImage(data: dataImage)!
            stillPicture?.image = takenImage                
        } else {
            print("Error setting up photo capture")
        }
}
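
As a side note, since shutterButton is created in code rather than in the storyboard, it also needs a target-action wired up somewhere (e.g. in viewDidLoad). A minimal sketch, with the frame and styling values purely illustrative:

```swift
// Illustrative wiring of the code-created shutter button, not part of the answer's code.
shutterButton.frame = CGRect(x: (view.bounds.width - 70) / 2,
                             y: view.bounds.height - 100,
                             width: 70, height: 70)
shutterButton.layer.cornerRadius = 35
shutterButton.backgroundColor = .white
shutterButton.addTarget(self, action: #selector(shutterButtonPressed), for: .touchUpInside)
view.addSubview(shutterButton)
```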


You could just add a UIImageView containing the captured image, right?

In theory, yes, but my code never prints "Here", so it doesn't seem to get past the if statement.

Did you call captureSession?.startRunning()?

Hey, could you post your full code?