
iOS: Cannot update UIImageView in a while loop using AVAssetReaderTrackOutput

Tags: ios, swift, uikit, avfoundation

I am trying to extract every frame from a video, do some image processing on it, and then display the result back in a UIImageView.

If I tap a UIButton manually, the image is displayed, and stepping through the whole video this way shows every frame.

However, if I embed the display update in a while loop, the view does not update (i.e. it does not update on the device).

I think this is because the frames are processed too quickly relative to drawing the updated image on screen, so I added a sleep call to throttle it down to the video's frame rate, but that did not work.

Here is the code:

import UIKit
import Foundation
import Vision
import AVFoundation
import Darwin

class ViewController: UIViewController {

    var uiImage: UIImage?

    var displayLink: CADisplayLink?

    var videoAsset: AVAsset!
    var videoTrack: AVAssetTrack!
    var videoReader: AVAssetReader!
    var videoOutput: AVAssetReaderTrackOutput!


    @IBOutlet weak var topView: UIImageView!

    @IBOutlet weak var bottomView: UIImageView!

    @IBOutlet weak var rightLabel: UILabel!

    @IBOutlet weak var leftLabel: UILabel!

    @IBAction func tapButton(_ sender: Any) {

        while let sampleBuffer = videoOutput.copyNextSampleBuffer() {
            print ("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
            if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {

                let ciImage = CIImage(cvImageBuffer: imageBuffer)
                uiImage = UIImage(ciImage: ciImage)
                self.topView.image = uiImage
                self.topView.setNeedsDisplay()
                usleep(useconds_t(24000))

            }

        }
    }


    override func viewDidLoad() {

        super.viewDidLoad()

    }



    override func viewDidAppear(_ animated: Bool) {

        super.viewDidAppear(animated)

        guard let urlPath = Bundle.main.path(forResource: "video1", ofType: "mp4")  else {
            print ("No File")
            return
        }

        videoAsset = AVAsset(url: URL(fileURLWithPath: urlPath))
        let array = videoAsset.tracks(withMediaType: AVMediaType.video)
        videoTrack = array[0]


        do {
            videoReader = try AVAssetReader(asset: videoAsset)
        } catch {
            print ("No reader created")
        }

        videoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange])
        videoReader.add(videoOutput)
        videoReader.startReading()


    }

}

In your case you should use a timer instead of sleeping. Try something like this:

let frameDuration: TimeInterval = 1.0 / 60.0 // using 60 FPS
Timer.scheduledTimer(withTimeInterval: frameDuration, repeats: true) { [weak self] timer in
    // Stop the timer once the reader runs out of samples (or self is gone).
    guard let self = self,
          let sampleBuffer = self.videoOutput.copyNextSampleBuffer() else {
        timer.invalidate()
        return
    }

    print("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
    if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        self.uiImage = UIImage(ciImage: ciImage)
        self.topView.image = self.uiImage
        self.topView.setNeedsDisplay()
    }
}
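
If you want the timer interval to match the video's actual frame rate instead of assuming 60 FPS, AVAssetTrack exposes nominalFrameRate; a small sketch using the videoTrack property from the question, which would replace the first line above:

// nominalFrameRate is reported in frames per second (it can be 0 if unknown).
let fps = Double(videoTrack.nominalFrameRate)
let frameDuration: TimeInterval = 1.0 / max(fps, 1.0) // fall back to 1 s if the rate is unknown
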
The problem in your case is that you are still on the main thread while doing all of this. So while you sleep, you are not letting the main thread continue. Essentially your main thread is busy building images and sleeping, and you never give it any time to actually update the UI.

You can do something similar using a separate thread. A simple dispatch, for example, will do the trick:

DispatchQueue(label: "Updating images").async { [weak self] in
    guard let self = self else { return }
    // Now on a separate (background) thread
    while let sampleBuffer = self.videoOutput.copyNextSampleBuffer() {
        print("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
        if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            let ciImage = CIImage(cvImageBuffer: imageBuffer)
            let uiImage = UIImage(ciImage: ciImage) // local copy avoids touching the shared property off the main thread

            // The UI part still needs to go back to the main thread
            DispatchQueue.main.async {
                self.topView.image = uiImage
                self.topView.setNeedsDisplay()
            }

            // Throttle roughly to the video frame rate
            usleep(useconds_t(24000))
        }
    }
}

But it is still important to do (at least) the UIKit updates on the main thread. Even the Core Image part may need to be on the main thread. It is best to try it and see.
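
One practical way to keep the Core Image work off the main thread is to render the CIImage into a CGImage with a CIContext on the background queue, so that only the UIImageView assignment has to hop back to main. A minimal sketch under that assumption (the helper name is illustrative, not part of the original code):

import UIKit
import CoreImage
import CoreVideo

// Reuse one context; creating a CIContext per frame is expensive.
let sharedCIContext = CIContext()

// Illustrative helper: rasterize a pixel buffer off the main thread.
func renderedImage(from imageBuffer: CVImageBuffer) -> UIImage? {
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    guard let cgImage = sharedCIContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage) // already rasterized; safe to hand straight to UIKit
}

The background loop above could then call renderedImage(from:) and dispatch only the self.topView.image assignment to the main queue.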

You are blocking the "main" thread, preventing it from updating the UI.

Thanks. I used your suggestion and offloaded the frame-grabber part to a separate thread.