iOS: How to add a black and white filter in ARKit (Swift 4)


All I am trying to do is take the basic ARKit view and convert it to a black and white view. Right now the basic view works normally, and I have no idea how to add a filter. Ideally, the black and white filter would be applied to the screenshot.

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    @IBAction func changeTextColour(){
        let snapShot = self.augmentedRealityView.snapshot()
        UIImageWriteToSavedPhotosAlbum(snapShot, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
    }
}

The snapshot object should be a UIImage. Apply filters to this UIImage by importing the CoreImage framework and then applying Core Image filters to it; you should tweak the exposure and colour-control values of the image. For more implementation details, check this. In iOS 6 and later you can also use the CIColorMonochrome filter to achieve the same effect.
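For example, a minimal sketch of the CIColorMonochrome route (the neutral grey input colour and the intensity value here are just illustrative choices):

import UIKit
import CoreImage

// Hypothetical helper: returns a monochrome version of the image using CIColorMonochrome.
func monochromeVersion(of image: UIImage) -> UIImage? {

    guard let input = CIImage(image: image) else { return nil }

    // Tint every pixel towards a neutral grey at full intensity.
    let output = input.applyingFilter("CIColorMonochrome",
                                      parameters: [kCIInputColorKey: CIColor(red: 0.7, green: 0.7, blue: 0.7),
                                                   kCIInputIntensityKey: 1.0])

    guard let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}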

Here is Apple's documentation for all of the available filters. Click on each filter to see the visual effect it has on an image once applied.
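That list can also be queried at runtime; as a small sketch, CIFilter.filterNames(inCategory:) returns the name of every registered filter in a given category:

import CoreImage

// Print every Core Image filter in the "color adjustment" category
// (this is where CIColorControls and CIExposureAdjust live).
for filterName in CIFilter.filterNames(inCategory: kCICategoryColorAdjustment) {
    print(filterName)
}

// Passing nil lists every registered filter.
print(CIFilter.filterNames(inCategory: nil).count)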

Here is the Swift 4 code:

func imageBlackAndWhite() -> UIImage? {

    //1. Convert The UIImage To A CIImage (this method is meant to live in an extension on UIImage, hence self)
    guard let beginImage = CIImage(image: self) else { return nil }

    //2. Desaturate The Image Using CIColorControls
    let paramsColor: [String: Double] = [kCIInputBrightnessKey: 0.0,
                                         kCIInputContrastKey:   1.1,
                                         kCIInputSaturationKey: 0.0]
    let blackAndWhite = beginImage.applyingFilter("CIColorControls", parameters: paramsColor)

    //3. Slightly Raise The Exposure Using CIExposureAdjust
    let paramsExposure: [String: Any] = [kCIInputEVKey: NSNumber(value: 0.7)]
    let output = blackAndWhite.applyingFilter("CIExposureAdjust", parameters: paramsExposure)

    //4. Render The Result Back Into A UIImage
    guard let processedCGImage = CIContext().createCGImage(output, from: output.extent) else { return nil }

    return UIImage(cgImage: processedCGImage, scale: self.scale, orientation: self.imageOrientation)
}

Filtering an ARSCNView snapshot: If you want to create a black and white screenshot of your ARSCNView, you can do something like this, which returns a grayscale UIImage, and whereby augmentedRealityView refers to an ARSCNView:

/// Converts A UIImage To A High Contrast GrayScaleImage
///
/// - Returns: UIImage
func highContrastBlackAndWhiteFilter() -> UIImage?
{
    //1. Convert It To A CIImage
    guard let convertedImage = CIImage(image: self) else { return nil }

    //2. Set The Filter Parameters
    let filterParameters = [kCIInputBrightnessKey: 0.0,
                            kCIInputContrastKey:   1.1,
                            kCIInputSaturationKey: 0.0]

    //3. Apply The Basic Filter To The Image
    let imageToFilter = convertedImage.applyingFilter("CIColorControls", parameters: filterParameters)

    //4. Set The Exposure
    let exposure =  [kCIInputEVKey: NSNumber(value: 0.7)]

    //5. Process The Image With The Exposure Setting
    let processedImage = imageToFilter.applyingFilter("CIExposureAdjust", parameters: exposure)

    //6. Create A CG GrayScale Image
    guard let grayScaleImage = CIContext().createCGImage(processedImage, from: processedImage.extent) else { return nil }

    return UIImage(cgImage: grayScaleImage, scale: self.scale, orientation: self.imageOrientation)
}
An example of using this function would therefore be something like this:

 override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {

    //1. Create A UIImageView Dynamically
    let imageViewResult = UIImageView(frame: CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height))
    self.view.addSubview(imageViewResult)

    //2. Create The Snapshot & Get The Black & White Image
    guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }
    imageViewResult.image = snapShotImage

    //3. Remove The ImageView After A Delay Of 5 Seconds
    DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
        imageViewResult.removeFromSuperview()
    }

}
Which can then easily be used like so:

guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }
Remembering that you should place the extension above your class declaration, e.g.:

extension UIImage{

}

class ViewController: UIViewController, ARSCNViewDelegate {

}
As such, using the code provided in your question, you would end up with something like this:

/// Creates A Black & White ScreenShot & Saves It To The Photo Album
@IBAction func changeTextColour(){

    //1. Create A Snapshot
    guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }

    //2. Save It To The Photos Album
    UIImageWriteToSavedPhotosAlbum(snapShotImage, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)

}

/// Callback To Check Whether The Image Has Been Saved
@objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {

    if let error = error {
        print("Error Saving ARKit Scene \(error)")
    } else {
        print("ARKit Scene Successfully Saved")
    }
}
Rendering the live camera feed in black & white: Using the brilliant answer credited in the code below, I was also able to render the entire camera feed in black and white using the following methods:

First, register for the ARSessionDelegate like so:

 augmentedRealitySession.delegate = self
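Here augmentedRealitySession refers to the underlying ARSession; in terms of the code in your question, this would simply mean also assigning the session delegate in viewDidLoad, e.g. (a sketch):

override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self          // SceneKit rendering callbacks (ARSCNViewDelegate)
    sceneView.session.delegate = self  // Per-frame callbacks (ARSessionDelegate)
    sceneView.showsStatistics = true
}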
Second, add the following to this delegate callback:

//-----------------------
//MARK: ARSessionDelegate
//-----------------------

extension ViewController: ARSessionDelegate{

    func session(_ session: ARSession, didUpdate frame: ARFrame) {

        /*
        Full Credit To https://stackoverflow.com/questions/45919745/reliable-access-and-modify-captured-camera-frames-under-scenekit
        */

        //1. Convert The Current Frame To Black & White
        //   The captured image is a bi-planar YCbCr pixel buffer; plane 1 holds the chroma (CbCr)
        //   values, so overwriting it with 128 (neutral chroma) strips all of the colour.
        let pixelBuffer = frame.capturedImage

        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0)) }

        guard let pixelBufferAddressOfPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1) else { return }

        let x: size_t = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
        let y: size_t = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
        memset(pixelBufferAddressOfPlane, 128, Int(x * y) * 2)
    }

}
Which successfully rendered the camera feed in black and white:

Filtering elements of an SCNScene in black & white:

As @Middle correctly said, if you decide that you want the camera feed in colour, but the content of your AR experience in black and white, you can apply a filter directly to an SCNNode using its filters property, which is simply:

An array of Core Image filters to be applied to the rendered contents of the node.

So, as an example, let's dynamically create 3 SCNNodes with a sphere geometry; we can apply a Core Image filter to these directly, like so:

/// Creates 3 Objects And Adds Them To The Scene (Rendering Them In GrayScale)
func createObjects(){

    //1. Create An Array Of UIColors To Set As The Geometry Colours
    let colours = [UIColor.red, UIColor.green, UIColor.yellow]

    //2. Create An Array Of The X Positions Of The Nodes
    let xPositions: [CGFloat] = [-0.3, 0, 0.3]

    //3. Create The Nodes & Add Them To The Scene
    for i in 0 ..< 3{

        let sphereNode = SCNNode()
        let sphereGeometry = SCNSphere(radius: 0.1)
        sphereGeometry.firstMaterial?.diffuse.contents = colours[i]
        sphereNode.geometry = sphereGeometry
        sphereNode.position = SCNVector3( xPositions[i], 0, -1.5)
        augmentedRealityView.scene.rootNode.addChildNode(sphereNode)

        //a. Create A Black & White Filter
        guard let blackAndWhiteFilter = CIFilter(name: "CIColorControls", withInputParameters: [kCIInputSaturationKey:0.0]) else { return }
        blackAndWhiteFilter.name = "bw"
        sphereNode.filters = [blackAndWhiteFilter]
        sphereNode.setValue(CIFilter(), forKeyPath: "bw")
    }

}
Which will yield a result such as this:


For a full list of these filters, you can refer to the following:

Example project: here is a complete example project which you can download and explore for yourself.


Hope it helps...

If you want to apply the filter in real time, the best way to go is to use SCNTechnique. Techniques are used for post-processing and allow us to render the SCNView contents in several passes, which is exactly what we need (first render the scene, then apply an effect to it).

Here is my suggestion:


Plist setup

First, we need to describe a technique in a .plist file.

Here is a screenshot of the plist I came up with (for better visualization):

And here is its source:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>sequence</key>
    <array>
        <string>apply_filter</string>
    </array>
    <key>passes</key>
    <dict>
        <key>apply_filter</key>
        <dict>
            <key>metalVertexShader</key>
            <string>scene_filter_vertex</string>
            <key>metalFragmentShader</key>
            <string>scene_filter_fragment</string>
            <key>draw</key>
            <string>DRAW_QUAD</string>
            <key>inputs</key>
            <dict>
                <key>scene</key>
                <string>COLOR</string>
            </dict>
            <key>outputs</key>
            <dict>
                <key>color</key>
                <string>COLOR</string>
            </dict>
        </dict>
    </dict>
</dict>
</plist>
Note that the fragment and vertex shader function names must match the ones specified for the pass descriptor in the plist file.
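As a rough sketch of what these two Metal functions could look like (the texture-coordinate mapping and the Rec. 709 luminance weights are assumptions of mine, not necessarily the exact shaders used here):

#include <metal_stdlib>
#include <SceneKit/scn_metal>
using namespace metal;

struct VertexInput {
    float4 position [[attribute(SCNVertexSemanticPosition)]];
};

struct VertexOut {
    float4 position [[position]];
    float2 texcoord;
};

// Pass-through vertex function for a DRAW_QUAD pass: the quad vertices are already
// in clip space, so we only derive normalized texture coordinates from them.
vertex VertexOut scene_filter_vertex(VertexInput in [[stage_in]])
{
    VertexOut vert;
    vert.position = in.position;
    vert.texcoord = float2((in.position.x + 1.0) * 0.5,
                           (1.0 - in.position.y) * 0.5);
    return vert;
}

// Samples the colour rendered by the main pass (the technique's "scene" input)
// and desaturates it using Rec. 709 luminance weights.
fragment half4 scene_filter_fragment(VertexOut vert [[stage_in]],
                                     texture2d<half, access::sample> scene [[texture(0)]])
{
    constexpr sampler samp(coord::normalized, address::clamp_to_edge, filter::nearest);
    half4 color = scene.sample(samp, vert.texcoord);
    half gray = dot(color.rgb, half3(0.2126h, 0.7152h, 0.0722h));
    return half4(gray, gray, gray, color.a);
}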

To get a better understanding of what the vertex input/output structures mean, please refer to this.

The given vertex function can be used in pretty much any DRAW_QUAD render pass. It basically gives us normalized screen-space coordinates (accessible via vert.texcoord in the fragment shader).

The fragment function is where all the "magic" happens. There, you can manipulate the texture obtained from the main pass. With this setup you can potentially implement a ton of filters, effects and more.

In our case, I used a basic desaturation (zero saturation) formula to get the black and white look.


Swift setup

Now we can finally use all of this in ARKit / SceneKit.

let plistName = "SceneFilterTechnique" // the name of the plist you've created

guard let url = Bundle.main.url(forResource: plistName, withExtension: "plist") else {
    fatalError("\(plistName).plist does not exist in the main bundle")
}

guard let dictionary = NSDictionary(contentsOf: url) as? [String: Any] else {
    fatalError("Failed to parse \(plistName).plist as a dictionary")
}

guard let technique = SCNTechnique(dictionary: dictionary) else {
    fatalError("Failed to initialize a technique using \(plistName).plist")
}
Just set it as the ARSCNView's technique:

sceneView.technique = technique
That's it. Now the whole scene will be rendered in grayscale, including when you take a snapshot.


This is probably the easiest and fastest way to do it:

Apply a Core Image filter to the scene:

This filter gives a good impression of a black and white photo, with nice transitions through the grays:

You could also use this one; the result transfers over just as easily.
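As a loose sketch of that idea (CIPhotoEffectMono is chosen here purely for illustration, not necessarily the filter referred to above; note that node filters only affect SceneKit-rendered content, not the camera background):

import SceneKit
import CoreImage

// Hypothetical helper: applies a monochrome photo-effect filter to everything
// rendered by the scene's node hierarchy.
func applyMonoFilter(to sceneView: SCNView) {
    guard let monoFilter = CIFilter(name: "CIPhotoEffectMono") else { return }
    monoFilter.name = "mono"
    sceneView.scene?.rootNode.filters = [monoFilter]
}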