iOS ARKit; Reality Composer - how to anchor a scene using the image's coordinates
I have written code that initializes one of three Reality Composer scenes depending on which day of the month the button is pressed, and that all works. The Reality Composer scenes use image detection to place objects in the environment, but currently the objects disappear as soon as the image leaves the camera view. I want to anchor the scene so that its root node stays where the image was first detected, so the user can look around the scene and the objects persist even when the trigger image is no longer in the camera view. I tried adding the renderer function below, but I get an error saying my view controller class has no member .planeNode:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    let referenceImage = imageAnchor.referenceImage

    // Create a plane to visualize the initial position of the detected image.
    let plane = SCNPlane(width: referenceImage.physicalSize.width,
                         height: referenceImage.physicalSize.height)
    plane.materials.first?.diffuse.contents = UIColor.blue.withAlphaComponent(0.20)
    self.planeNode = SCNNode(geometry: plane)
    self.planeNode?.opacity = 1

    /*
     `SCNPlane` is vertically oriented in its local coordinate space, but
     `ARImageAnchor` assumes the image is horizontal in its local space, so
     rotate the plane to match.
     */
    self.planeNode?.eulerAngles.x = -.pi / 2

    /*
     Image anchors are not tracked after initial detection, so create an
     animation that limits the duration for which the plane visualization appears.
     */

    // Add the plane visualization to the scene.
    if let planeNode = self.planeNode {
        node.addChildNode(planeNode)
    }

    if let imageName = referenceImage.name {
        plane.materials = [SCNMaterial()]
        plane.materials[0].diffuse.contents = UIImage(named: imageName)
    }
}
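Two things likely cause the `.planeNode` error: the view controller never declares a `planeNode` property, and `renderer(_:didAdd:for:)` belongs to SceneKit's `ARSCNViewDelegate`, so it is only called on the delegate of an `ARSCNView`, not on RealityKit's `ARView`. A minimal sketch of the missing pieces, assuming a hypothetical storyboard outlet named `sceneView`:

```swift
import UIKit
import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {

    // Hypothetical outlet: renderer(_:didAdd:for:) only fires for an
    // ARSCNView whose delegate is set, not for a RealityKit ARView.
    @IBOutlet var sceneView: ARSCNView!

    // Declaring this property resolves the "has no member 'planeNode'" error.
    var planeNode: SCNNode?

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }
}
```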
Here is my code:
import UIKit
import RealityKit
import ARKit
import SceneKit

class ViewController: UIViewController {

    @IBOutlet var move: ARView!
    @IBOutlet var arView: ARView!

    var ARBorealAnchor3: ARboreal.ArBoreal3!
    var ARBorealAnchor2: ARboreal.ArBoreal2!
    var ARBorealAnchor: ARboreal.ArBoreal!

    var Date1 = 1

    override func viewDidLoad() {
        super.viewDidLoad()

        func getSingle() {
            let date = Date()
            let calendar = Calendar.current
            let day = calendar.component(.day, from: date)
            Date1 = day
        }
        getSingle()

        ARBorealAnchor = try! ARboreal.loadArBoreal()
        ARBorealAnchor2 = try! ARboreal.loadArBoreal2()
        ARBorealAnchor3 = try! ARboreal.loadArBoreal3()

        if Date1 == 24 {
            arView.scene.anchors.append(ARBorealAnchor)
        }
        if Date1 == 25 {
            arView.scene.anchors.append(ARBorealAnchor2)
        }
        if Date1 == 26 {
            arView.scene.anchors.append(ARBorealAnchor3)
        }
    }
}
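As an aside, the day-based selection above could be collapsed into a single switch, which also avoids loading the two scenes that are never shown. A sketch using the same generated load methods from the question:

```swift
let day = Calendar.current.component(.day, from: Date())

switch day {
case 24: arView.scene.anchors.append(try! ARboreal.loadArBoreal())
case 25: arView.scene.anchors.append(try! ARboreal.loadArBoreal2())
case 26: arView.scene.anchors.append(try! ARboreal.loadArBoreal3())
default: break  // no scene scheduled for this day
}
```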
Any help would be greatly appreciated.

Cheers,
Daniel Savage

What's happening is that when the image anchor leaves the view, the anchor entity becomes untracked, and RealityKit stops rendering it and all of its descendants.

One way to fix this is to decouple the image anchor from the content you want to render: add the image anchor manually in code, then, when the image is first detected, add the content to the scene under a separate world anchor. Whenever the image anchor's transform updates, copy that transform onto the world anchor. That way the image anchor provides the latest transform while it is visible, but rendering of the content no longer depends on it once it disappears. Something like the code below (you must create an AR resource group named "ARTest" and add an image named "test" to it for the anchor to work).

Note: the `ARImageAnchor`'s transform seems to update frequently as you move around, while ARKit tries to compute the exact image plane (the content may look correctly positioned while the z value is still inaccurate). Make sure the image dimensions in the AR resource group are accurate so the image is tracked better.
import ARKit
import SwiftUI
import RealityKit
import Combine

struct ContentView: View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

let arDelegate = SessionDelegate()

struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        arDelegate.set(arView: arView)
        arView.session.delegate = arDelegate

        // Create an image anchor and add it to the scene. We won't add any
        // rendering content to the anchor; it is used only for detection.
        let imageAnchor = AnchorEntity(.image(group: "ARTest", name: "test"))
        arView.scene.anchors.append(imageAnchor)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

final class SessionDelegate: NSObject, ARSessionDelegate {
    var arView: ARView!
    var rootAnchor: AnchorEntity?

    func set(arView: ARView) {
        self.arView = arView
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // If we already added the content to render, ignore.
        if rootAnchor != nil {
            return
        }

        // Make sure we are adding to an image anchor. Assuming only
        // one image anchor in the scene for brevity.
        guard anchors[0] is ARImageAnchor else {
            return
        }

        // Create the entity to render; you could load it from your experience
        // file here. It will render at the center of the matched image.
        rootAnchor = AnchorEntity(world: [0, 0, 0])
        let ball = ModelEntity(
            mesh: MeshResource.generateBox(size: 0.01),
            materials: [SimpleMaterial(color: .red, isMetallic: false)]
        )
        rootAnchor!.addChild(ball)

        // Just add another model to show how it remains in the scene even
        // when the tracking image is out of view.
        let ball2 = ModelEntity(
            mesh: MeshResource.generateBox(size: 0.10),
            materials: [SimpleMaterial(color: .orange, isMetallic: false)]
        )
        ball.addChild(ball2)
        ball2.position = [0, 0, 1]

        arView.scene.addAnchor(rootAnchor!)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let rootAnchor = rootAnchor else {
            return
        }

        // Code is assuming you only have one image anchor for brevity.
        guard let imageAnchor = anchors[0] as? ARImageAnchor else {
            return
        }

        // Only copy the transform while the image is actually tracked.
        if !imageAnchor.isTracked {
            return
        }

        // Update our fixed world anchor to match the image transform.
        rootAnchor.transform = Transform(matrix: imageAnchor.transform)
    }
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif
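To apply this to the Reality Composer scenes from the question, you could re-parent the loaded scene's entities under the world anchor instead of the demo boxes. A sketch, assuming the same generated `ARboreal.loadArBoreal()` from the question (error handling elided):

```swift
// Inside session(_:didAdd:), replacing the ball/ball2 demo content:
rootAnchor = AnchorEntity(world: [0, 0, 0])

if let borealScene = try? ARboreal.loadArBoreal() {
    // addChild(_:) re-parents each entity under the world anchor, so the
    // content keeps rendering after the trigger image leaves the camera view.
    while let child = borealScene.children.first {
        rootAnchor!.addChild(child)
    }
}

arView.scene.addAnchor(rootAnchor!)
```

The `while let` loop is used because moving a child to another parent mutates the source collection, so iterating it directly would skip entries.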