
Swift: Detecting face movement direction (up, left, down, right) with ARKit


I am working with ARFaceTrackingConfiguration to detect faces, which I have already done easily with the Vision framework. Now I want to detect whether the user is looking left, right, up, or down, and show something on screen based on the direction. The problem is that I cannot figure out how to detect this.
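
If "looking" here means eye gaze rather than head rotation, one option that already works on iOS 11 is ARFaceAnchor.blendShapes, which reports per-direction eye coefficients between 0 and 1. A minimal sketch, assuming gaze is what you want; the 0.5 threshold and the reliance on the left eye's coefficients are assumptions to tune:

/// Sketch: classify gaze from eye blend-shape coefficients.
/// Directions are from the user's point of view.
func gazeDirection(for faceAnchor: ARFaceAnchor) -> String? {
    let shapes = faceAnchor.blendShapes
    let up    = shapes[.eyeLookUpLeft]?.floatValue ?? 0
    let down  = shapes[.eyeLookDownLeft]?.floatValue ?? 0
    let left  = shapes[.eyeLookOutLeft]?.floatValue ?? 0  // left eye rotating outward = looking left
    let right = shapes[.eyeLookInLeft]?.floatValue ?? 0   // left eye rotating inward = looking right

    let threshold: Float = 0.5  // assumed value; tune empirically
    let strongest = max(up, down, left, right)
    guard strongest > threshold else { return nil }

    switch strongest {
    case up:   return "up"
    case down: return "down"
    case left: return "left"
    default:   return "right"
    }
}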

What I have tried so far:

extension FaceDetectionViewController: ARSCNViewDelegate {
    // Implement ARSCNViewDelegate functions for things like error tracking.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        scanForFaces()
        if let faceAnchor = anchor as? ARFaceAnchor {
            DispatchQueue.main.async {
                // These values change as I move the camera, so I cannot derive the direction from them.
                let position = faceAnchor.transform.position()
                self.lblX.text = "X: \(node.position.x)----Y: \(node.position.y)"
                self.lblY.text = "X: \(position.x)----Y: \(position.y)"
            }
        }
        // The same problem appears here: the euler angles also change as the camera moves.
        DispatchQueue.main.async {
            self.lblX.text = " \(node.eulerAngles.x) "
            self.lblY.text = " \(node.eulerAngles.y) "
        }
    }
}
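
The readings above change with camera movement because node.position and node.eulerAngles are expressed in world space. A minimal sketch of one way around that, assuming head rotation is what should be classified: express the face anchor's transform in camera space first, then threshold pitch and yaw. The FaceDirection type, the 0.25 rad threshold, and the sign conventions are assumptions to verify on device.

enum FaceDirection {
    case up, down, left, right, straight
}

extension FaceDetectionViewController {

    /// Sketch: head-pose classification that does not drift with camera movement,
    /// because the anchor transform is converted into camera space first.
    func faceDirection(for faceAnchor: ARFaceAnchor) -> FaceDirection {
        guard let camera = sceneView.session.currentFrame?.camera else { return .straight }

        // Face transform relative to the camera instead of the world.
        let relative = simd_mul(camera.transform.inverse, faceAnchor.transform)

        // Let SceneKit decompose the matrix into euler angles.
        let helper = SCNNode()
        helper.simdTransform = relative
        let pitch = helper.eulerAngles.x  // nodding up/down
        let yaw   = helper.eulerAngles.y  // turning left/right

        let threshold: Float = 0.25       // ~14 degrees, assumed; tune empirically
        if pitch >  threshold { return .up }     // signs may need flipping
        if pitch < -threshold { return .down }
        if yaw   >  threshold { return .left }
        if yaw   < -threshold { return .right }
        return .straight
    }
}

Called from renderer(_:didUpdate:for:), the result would replace the label text above, updated on the main queue as in the original code.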

extension matrix_float4x4 {
    func position() -> SCNVector3 {
        return SCNVector3(columns.3.x, columns.3.y, columns.3.z)
    }
}
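
The position() helper only exposes the translation column, which says where the face is, not which way it points. A companion sketch (an assumed helper, not part of the original question) pulls out the rotation component instead:

extension matrix_float4x4 {
    /// Sketch: the rotation part of the transform as a quaternion
    /// (simd_quatf reads only the rotational elements of the matrix).
    func orientation() -> simd_quatf {
        return simd_quatf(self)
    }
}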

@objc
private func doFaceScan() {

    // Get the captured image of the ARSession's current frame.
    guard let capturedImage = sceneView.session.currentFrame?.capturedImage else { return }
    let image = CIImage(cvPixelBuffer: capturedImage)

    let detectFaceRequest = VNDetectFaceRectanglesRequest { (request, error) in

        DispatchQueue.main.async {
            //Loop through the resulting faces and add a red UIView on top of them.
            if let faces = request.results as? [VNFaceObservation] {
                for face in faces {
                    self.faceFrame(from: face.boundingBox)
                }
            }
        }
    }

    try? VNImageRequestHandler(ciImage: image, orientation: self.imageOrientation).perform([detectFaceRequest])
}
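
Worth noting for the Vision path: VNDetectFaceRectanglesRequest returns bounding boxes, but the resulting VNFaceObservation also carries roll and yaw angles (NSNumber, in radians, available since iOS 11), so left/right can be read without any ARKit math. A minimal sketch; the 0.3 rad threshold and the sign convention are assumptions:

// Inside the VNDetectFaceRectanglesRequest completion handler:
if let faces = request.results as? [VNFaceObservation] {
    for face in faces {
        guard let yaw = face.yaw?.floatValue else { continue }
        if yaw > 0.3 {
            print("looking left")   // sign assumed; verify on device
        } else if yaw < -0.3 {
            print("looking right")
        }
    }
}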

Can anyone help?

I am looking for the same thing, please help.