
iOS ARKit: creating nodes from a pan on the screen


I'm currently working on an app whose goal is to let the user swipe a finger across the screen and see a line of 3D objects appear in the scene, similar to drawing in 3D. I've set the scene up with a gesture recognizer, but I can't work out how to create nodes in 3D space based on the pan across the screen. This is all I have for the gesture recognizer so far:

@IBAction func panRecognized(_ sender: UIPanGestureRecognizer) {
    print("Panning")
    print(sender.location(in: self.view).x)
    print(sender.location(in: self.view).y)
}

What I'd like to know is how to create nodes in 3D space from the x and y coordinates while also taking the device's orientation and position into account. The goal is to place the objects about half a metre away from the device.

Here are a couple of ways you could achieve this, although note that this is only a starting point and isn't optimized:

The first thing you want to do is create a UIPanGestureRecognizer in viewDidLoad, e.g.:

let panToDrawGesture = UIPanGestureRecognizer(target: self, action: #selector(createNodesFromPan(_:)))
self.view.addGestureRecognizer(panToDrawGesture)
To make it a bit more interesting, you could also add a [UIColor] array, e.g.:

let colours: [UIColor] = [.red, .green, .cyan, .orange, .purple]
Then, to draw using the gesture recognizer, you could do something like this:

/// Creates An SCNNode At The Touch Location Of The Gesture Recognizer
@objc func createNodesFromPan(_ gesture: UIPanGestureRecognizer){

    //1. Get The Current Touch Location
    let currentTouchPoint = gesture.location(in: self.augmentedRealityView)

    //2. Perform An ARHitTest For Detected Feature Points
    guard let featurePointHitTest = self.augmentedRealityView.hitTest(currentTouchPoint, types: .featurePoint).first else { return }

    //3. Get The World Coordinates
    let worldCoordinates = featurePointHitTest.worldTransform

    //4. Create An SCNNode With An SCNSphere Geometry
    let sphereNode = SCNNode()
    let sphereNodeGeometry = SCNSphere(radius: 0.005)

    //5. Generate A Random Colour For The Node's Geometry
    let randomColour = colours[Int(arc4random_uniform(UInt32(colours.count)))]
    sphereNodeGeometry.firstMaterial?.diffuse.contents = randomColour
    sphereNode.geometry = sphereNodeGeometry

    //6. Position & Add It To The Scene Hierarchy
    sphereNode.position = SCNVector3(worldCoordinates.columns.3.x,  worldCoordinates.columns.3.y,  worldCoordinates.columns.3.z)
    self.augmentedRealityView.scene.rootNode.addChildNode(sphereNode)
}
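One thing to bear in mind with the approach above is that a hit test against .featurePoint only succeeds where ARKit has actually detected feature points, so on plain or poorly lit surfaces nothing will be placed. Since the question asks for a fixed distance of roughly half a metre from the device, a possible variant is to unproject the touch point into a world-space ray and place the node a fixed distance along it. This is only a rough sketch, not part of the original answer; worldPosition(for:in:distance:) is a hypothetical helper, and sceneView stands in for the ARSCNView used elsewhere:

import ARKit
import simd

/// A sketch: turns a 2D touch point into a world position a fixed distance
/// (0.5 m by default, as the question asks) in front of the camera, along
/// the ray that passes through the touch point.
func worldPosition(for touchPoint: CGPoint,
                   in sceneView: ARSCNView,
                   distance: Float = 0.5) -> SCNVector3? {

    guard let cameraNode = sceneView.pointOfView else { return nil }

    // Unproject the touch point at the near (z = 0) and far (z = 1) clipping
    // planes to build a world-space ray through the scene.
    let near = sceneView.unprojectPoint(SCNVector3(Float(touchPoint.x), Float(touchPoint.y), 0))
    let far  = sceneView.unprojectPoint(SCNVector3(Float(touchPoint.x), Float(touchPoint.y), 1))

    // Walk `distance` metres along the normalised ray direction, starting
    // from the camera's world position.
    let direction = simd_normalize(simd_float3(far.x - near.x, far.y - near.y, far.z - near.z))
    let position  = cameraNode.simdWorldPosition + direction * distance

    return SCNVector3(position.x, position.y, position.z)
}

You could then call this from createNodesFromPan(_:) in place of the hit test, passing self.augmentedRealityView, and use the returned vector as sphereNode.position.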
Alternatively, you could perform the drawing using touches rather than a gesture recognizer:

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {

    //1. Get The Current Touch Location
    guard let currentTouchPoint = touches.first?.location(in: self.augmentedRealityView),
        //2. Perform An ARHitTest For Detected Feature Points
        let featurePointHitTest = self.augmentedRealityView.hitTest(currentTouchPoint, types: .featurePoint).first else { return }

    //3. Get The World Coordinates
    let worldCoordinates = featurePointHitTest.worldTransform

    //4. Create An SCNNode With An SCNSphere Geometry
    let sphereNode = SCNNode()
    let sphereNodeGeometry = SCNSphere(radius: 0.005)

    //5. Generate A Random Colour For The Node's Geometry
    let randomColour = colours[Int(arc4random_uniform(UInt32(colours.count)))]
    sphereNodeGeometry.firstMaterial?.diffuse.contents = randomColour
    sphereNode.geometry = sphereNodeGeometry

    //6. Position & Add It To The Scene Hierarchy
    sphereNode.position = SCNVector3(worldCoordinates.columns.3.x,  worldCoordinates.columns.3.y,  worldCoordinates.columns.3.z)
    self.augmentedRealityView.scene.rootNode.addChildNode(sphereNode)

}
Hope this helps.

Thanks, this is exactly what I was looking for! Please mark it as the correct answer so we can close this :)
Glad it helped :)
One question... do you know how I could keep it at a fixed distance from the sceneCamera (i.e. the device)? I'm having trouble working out what to do here.
Adding this code ends up not showing any spheres; I tried a few things to make it work, but without success. Could you give me a hint about what I need to add, and where? Thanks.

Update:

If you want to draw at a set distance, using only the position of the ARCamera, you can use the ARSessionDelegate to do something like the following (if you want the drawing to sit 1 m from the camera, change the Z value of sphereNode.position in part 3):
@IBOutlet weak var augmentedRealityView: ARSCNView!
func session(_ session: ARSession, didUpdate frame: ARFrame) {

    //1. Create An SCNNode With An SCNSphere Geometry
    let sphereNode = SCNNode()
    let sphereNodeGeometry = SCNSphere(radius: 0.01)

    //2. Generate A Random Colour For The Node's Geometry
    let randomColour = colours[Int(arc4random_uniform(UInt32(colours.count)))]
    sphereNodeGeometry.firstMaterial?.diffuse.contents = randomColour
    sphereNode.geometry = sphereNodeGeometry

    //3. Position & Add It To The Scene Hierarchy
    sphereNode.position = SCNVector3(0,  0, -0.5)
    updatePositionAndOrientationOf(sphereNode, withPosition: sphereNode.position, relativeTo: self.augmentedRealityView.pointOfView!)
    self.augmentedRealityView.scene.rootNode.addChildNode(sphereNode)
}


/// Updates The Position Of An SCNNode In Relation To The Camera Node
///
/// - Parameters:
///   - node: SCNNode
///   - position: SCNVector3
///   - referenceNode: SCNNode
func updatePositionAndOrientationOf(_ node: SCNNode, withPosition position: SCNVector3, relativeTo referenceNode: SCNNode) {

    /* Full Credit To Pablo
    https://stackoverflow.com/questions/42029347/position-a-scenekit-object-in-front-of-scncameras-current-orientation/42030679
    */

    let referenceNodeTransform = matrix_float4x4(referenceNode.transform)

    // Create A Translation Matrix With The Desired Position
    var translationMatrix = matrix_identity_float4x4
    translationMatrix.columns.3.x = position.x
    translationMatrix.columns.3.y = position.y
    translationMatrix.columns.3.z = position.z

    // Multiply The Configured Translation With The ReferenceNode's Transform
    let updatedTransform = matrix_multiply(referenceNodeTransform, translationMatrix)
    node.transform = SCNMatrix4(updatedTransform)
}
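Regarding the comment about no spheres appearing: session(_:didUpdate:) is only called if something has been assigned as the ARSession's delegate. Below is a minimal setup sketch, not from the original answer; DrawingViewController is a hypothetical name for the view controller that owns the augmentedRealityView outlet and the drawing code above:

import UIKit
import ARKit

// A minimal setup sketch (assumed names): the view controller must conform
// to ARSessionDelegate and be assigned as the session's delegate, otherwise
// session(_:didUpdate:) is never called and no spheres are added.
class DrawingViewController: UIViewController, ARSessionDelegate {

    @IBOutlet weak var augmentedRealityView: ARSCNView!

    let colours: [UIColor] = [.red, .green, .cyan, .orange, .purple]

    override func viewDidLoad() {
        super.viewDidLoad()

        // Without this line, the delegate callback above never fires.
        augmentedRealityView.session.delegate = self

        // Start a standard world-tracking session.
        let configuration = ARWorldTrackingConfiguration()
        augmentedRealityView.session.run(configuration)
    }

    // session(_:didUpdate:) and updatePositionAndOrientationOf(_:withPosition:relativeTo:)
    // from the update above would live here. Because didUpdate fires for every
    // frame (roughly 60 times a second), in practice you would probably only add
    // a node while the user is actively panning, e.g. by checking a flag set by
    // the pan gesture recognizer.
}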