
iOS: How can I capture a video of changing images without using the camera?


I already asked a question, but it got no answers:

So maybe my question needs to be simpler. My Google searches have turned up nothing. How do I capture a video, in real time, of an image that is changing, without using the camera?

Using captureOutput, I get a CMSampleBuffer, which I can turn into a CVPixelBuffer. The AVAssetWriterInput's mediaType is set to video, but I think it expects compressed video. Also, it isn't clear to me whether the AVAssetWriterInput's expectsMediaDataInRealTime property should be set to true.

It seems like it should be simple, but everything I've tried causes my AVAssetWriter's status to fail.

Here is my latest attempt. It still fails:

@objc func importLivePreview(){

    guard var importedImage = importedDryCIImage else { return }

    DispatchQueue.main.async(){

        // apply filter to camera image
        // this is what makes the CIImage appear that it is changing
        importedImage = self.applyFilterAndReturnImage(ciImage: importedImage, orientation: UIImage.Orientation.right, currentCameraRes:currentCameraRes!)


        if self.videoIsRecording &&
           self.assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true {

            guard let writer: AVAssetWriter = self.assetWriter, writer.status == .writing else {
                return
            }                       

            guard let cv: CVPixelBuffer = self.buffer(from: importedImage) else {
                print("CVPixelBuffer could not be created.")
                return
            }

            self.MTLContext?.render(importedImage, to: cv)

            self.currentSampleTime = CMTimeMakeWithSeconds(0.1, preferredTimescale: 1000000000)

            guard let currentSampleTime = self.currentSampleTime else {
                return
            }

            let success = self.assetWriterPixelBufferInput?.append(cv, withPresentationTime: currentSampleTime)

            if success == false {
                print("Pixel Buffer input failed")
            }

        }

        guard let MTLView = self.MTLCaptureView else {
            print("MTLCaptureView is not found or nil.")
            return
        }

        // update the MTKView with the changed CIImage so the user can see the changed image
        MTLView.image = importedImage


    }           

}

I got it working. The problem was that I wasn't offsetting currentSampleTime. This example doesn't compute an exact offset, but it shows that the offset needs to be added to the previous sample time:
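For completeness, the writer configuration this snippet assumes looks roughly like the sketch below. The output URL, the .mov container, and the 1920×1080 H.264 settings are illustrative placeholders, not values from my project. Note that the pixel-buffer adaptor takes *uncompressed* frames; the writer itself performs the compression described by the output settings, which addresses the "I think it needs compressed video" concern in the question.

```swift
import AVFoundation

// Sketch of a writer + pixel-buffer-adaptor setup for appending CVPixelBuffers.
// Names and settings here are assumptions, adjust to your own project.
func makeWriter(outputURL: URL, width: Int, height: Int) throws
    -> (AVAssetWriter, AVAssetWriterInputPixelBufferAdaptor) {

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    // The writer compresses for you; the input receives raw pixel buffers.
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)

    // Frames arrive live as the image changes, so request real-time pacing.
    input.expectsMediaDataInRealTime = true

    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])

    writer.add(input)
    return (writer, adaptor)
}
```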

@objc func importLivePreview(){

    guard var importedImage = importedDryCIImage else { return }

    DispatchQueue.main.async(){

        // apply filter to camera image
        // this is what makes the CIImage appear that it is changing
        importedImage = self.applyFilterAndReturnImage(ciImage: importedImage, orientation: UIImage.Orientation.right, currentCameraRes:currentCameraRes!)


        if self.videoIsRecording &&
           self.assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true {

            guard let writer: AVAssetWriter = self.assetWriter, writer.status == .writing else {
                return
            }                       

            guard let cv: CVPixelBuffer = self.buffer(from: importedImage) else {
                print("CVPixelBuffer could not be created.")
                return
            }

            self.MTLContext?.render(importedImage, to: cv)

            guard let currentSampleTime = self.currentSampleTime else {
                return
            }

            // offset currentSampleTime
            let sampleTimeOffset = CMTimeMakeWithSeconds(0.1, preferredTimescale: 1000000000)

            self.currentSampleTime = CMTimeAdd(currentSampleTime, sampleTimeOffset)

            print("currentSampleTime = \(String(describing: currentSampleTime))")

            let success = self.assetWriterPixelBufferInput?.append(cv, withPresentationTime: currentSampleTime)

            if success == false {
                print("Pixel Buffer input failed")
            }

        }

        guard let MTLView = self.MTLCaptureView else {
            print("MTLCaptureView is not found or nil.")
            return
        }

        // update the MTKView with the changed CIImage so the user can see the changed image
        MTLView.image = importedImage


    }           

}
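The key change, each appended frame getting a strictly increasing presentation time, can be modeled without AVFoundation. `FrameClock` below is a hypothetical helper, not part of the code above; it mirrors CMTime's integer value/timescale representation and the `CMTimeAdd` step:

```swift
// Pure-Swift model of the fix: presentation times must strictly increase,
// so each frame adds a duration to the previous time instead of reusing
// a constant timestamp.
struct FrameClock {
    private(set) var value: Int64 = 0   // like CMTime.value
    let timescale: Int64                // like CMTime.timescale
    let frameDuration: Int64            // one frame, in timescale units

    // Advance the clock and return the presentation time for the next frame.
    mutating func tick() -> Int64 {
        value += frameDuration          // the CMTimeAdd step
        return value
    }

    var seconds: Double { Double(value) / Double(timescale) }
}

// 0.1 s per frame at a nanosecond timescale, matching the snippet above.
var clock = FrameClock(timescale: 1_000_000_000, frameDuration: 100_000_000)
let times = (0..<5).map { _ in clock.tick() }

// Each append gets a later presentation time than the one before it.
print(times)
```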