Java: How can I render a LibGDX ModelInstance with the Vuforia AR library?


I know that most questions in this community are expected to come with at least some code. But I'm completely lost here; I don't even know where to start. What I want to do is render a LibGDX 3D ModelInstance using the Vuforia AR library. However, I don't know how to make Vuforia render ModelInstances, or how to use a LibGDX camera as its camera.

I've done some external research but haven't found any useful information. Could someone help me get started?

Note that I'm not particularly well versed in Vuforia AR, but this question has gone unanswered for a while, so I'll give it a shot.


A camera in LibGDX is essentially just a wrapper around two 4x4 matrices: the view matrix, Camera#view, and the projection matrix, Camera#projection. (I believe there is also a third matrix, the model matrix, used for world-space transformations; however [not 100% sure on this], in LibGDX that matrix is already merged into the view matrix, so Camera#view is really a model-view matrix.)

In any case, unless there is a simpler solution I'm not aware of, you should be able to use these underlying matrices to bridge the projections between the Vuforia and LibGDX APIs.

(Suggested further reading: model, view and projection matrices in 3D graphics.)
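As a rough illustration of that bridging idea, here is a minimal sketch (my own, not from any Vuforia sample; the class and method names are hypothetical) of copying Vuforia's matrices into a LibGDX camera:

import com.badlogic.gdx.graphics.PerspectiveCamera;

public final class VuforiaCameraBridge {
    // Both APIs use OpenGL's column-major float[16] layout, so Vuforia's
    // matrices (e.g. Tool.convertPose2GLMatrix(pose).getData()) can be
    // copied straight into LibGDX's Matrix4-backed camera fields.
    public static void syncCamera(PerspectiveCamera cam,
                                  float[] vuforiaModelView,
                                  float[] vuforiaProjection) {
        cam.view.set(vuforiaModelView);
        cam.projection.set(vuforiaProjection);
        // Rebuild combined by hand: Camera#update() would overwrite view and
        // projection from the camera's position/direction fields, so it must
        // not be called after assigning the matrices manually.
        cam.combined.set(cam.projection).mul(cam.view);
    }
}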


Next comes rendering LibGDX 3D ModelInstances with Vuforia. The general solution is to convert the ModelInstance into something Vuforia can recognize, which can be done by taking the mesh/vertex data backing the LibGDX model and feeding it straight into Vuforia.


IMO, the best way to go about this is to use a core representation of the model data that can easily be handed to both Vuforia and LibGDX (e.g. a specific file format both recognize, or raw FloatBuffers, which should be easy to wrap up and feed to either API). For reference, LibGDX stores a model's vertex information in a collection of FloatBuffers, accessible via Model#meshes (or ModelInstance#model#meshes).
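For instance, a minimal sketch (assuming a loaded Model with at least one mesh; the class and helper names are hypothetical) of pulling that raw vertex data out of LibGDX so it could be handed to another renderer:

import com.badlogic.gdx.graphics.Mesh;
import com.badlogic.gdx.graphics.g3d.Model;

public final class MeshExtractor {
    // Copies a model's raw vertex data out of LibGDX's backing FloatBuffer.
    public static float[] extractVertices(Model model) {
        Mesh mesh = model.meshes.first();
        // getVertexSize() is in bytes; divide by 4 to get floats per vertex
        // (position, normal, UV, ... depending on the vertex attributes).
        int floatsPerVertex = mesh.getVertexSize() / 4;
        float[] vertices = new float[mesh.getNumVertices() * floatsPerVertex];
        mesh.getVertices(vertices);
        return vertices;
    }
}
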
So in the end I managed to combine the two libraries. I'm not sure whether the way I did it is the most efficient, but it has worked for me.

First of all, I based my work on Vuforia's sample apps, in particular the FrameMarkers example.

I opened an empty LibGDX project, imported the Vuforia .jar, and copied over SampleApplicationControl, SampleApplicationException, SampleApplicationGLView, SampleApplicationSession, FrameMarkerRenderer and FrameMarker.

Next, I created some attributes on LibGDX's AndroidLauncher class and initialized all the Vuforia stuff:

public class AndroidLauncher extends AndroidApplication implements SampleApplicationControl{
    private static final String LOGTAG = "FrameMarkers";


    // Our OpenGL view:
    public SampleApplicationGLView mGlView;
    public SampleApplicationSession vuforiaAppSession;
    // Our renderer:
    public FrameMarkerRenderer mRenderer;
    MyGDX gdxRender;
    // The textures we will use for rendering:
    public Vector<Texture> mTextures;
    public RelativeLayout mUILayout;

    public Marker[] dataSet;

    public GestureDetector mGestureDetector;


    public LoadingDialogHandler loadingDialogHandler = new LoadingDialogHandler(
        this);

    // Alert Dialog used to display SDK errors
    private AlertDialog mErrorDialog;

    boolean mIsDroidDevice = false;
    @Override
    protected void onCreate (Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        vuforiaAppSession = new SampleApplicationSession(this);
        vuforiaAppSession.setmActivity(this);
        AndroidApplicationConfiguration config = new AndroidApplicationConfiguration();


        // Load any sample specific textures:
        mTextures = new Vector<Texture>();
        loadTextures();
        startLoadingAnimation();
        vuforiaAppSession.initAR(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        gdxRender = new MyGDX (vuforiaAppSession);
        gdxRender.setTextures(mTextures);
        initialize(gdxRender, config);

        mGestureDetector = new GestureDetector(this, new GestureListener());

        mIsDroidDevice = android.os.Build.MODEL.toLowerCase().startsWith(
            "droid");
    }

    // ... helper methods referenced above (loadTextures(), startLoadingAnimation(),
    // GestureListener, SampleApplicationControl callbacks) omitted ...
}
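For context, initialize(gdxRender, config) is the standard LibGDX AndroidApplication entry point: it creates LibGDX's GL surface and makes it the activity's content view, so the MyGDX listener (shown below only through its render() method) ends up owning the GL context that both libraries draw into.
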
The last important thing to keep in mind is the render() method. I based mine on FrameMarkerRenderer's render method. It has a boolean that gets activated when the camera starts up, so I simply flip that variable during the Vuforia AR initialization and check it in the render() method. I had to set the camera to the identity matrix and then multiply the model by the modelViewMatrix.
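The exact wiring of that boolean isn't shown in the original; a plausible sketch (field and method names are illustrative, and the real MyGDX also holds the cam, modelBatch and model instances used below) would be:

import com.badlogic.gdx.ApplicationAdapter;

public class MyGDX extends ApplicationAdapter {
    // Flipped by the Android side once the Vuforia camera is running,
    // mirroring the mIsActive boolean in FrameMarkerRenderer.
    volatile boolean render = false;

    public void setRenderActive(boolean active) {
        render = active;
    }
}

The AndroidLauncher would then call gdxRender.setRenderActive(true) once vuforiaAppSession reports the camera has started (for example from its onInitARDone() callback).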

@Override
public void render() {
    if (render) {
        // Clear color and depth buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        // Get the state from Vuforia and mark the beginning of a rendering
        // section
        State state = Renderer.getInstance().begin();
        // Explicitly render the Video Background
        Renderer.getInstance().drawVideoBackground();

        GLES20.glEnable(GLES20.GL_DEPTH_TEST);

        // We must detect if background reflection is active and adjust the
        // culling direction.
        // If the reflection is active, this means the post matrix has been
        // reflected as well,
        // therefore standard counter clockwise face culling will result in
        // "inside out" models.
        GLES20.glEnable(GLES20.GL_CULL_FACE);
        GLES20.glCullFace(GLES20.GL_BACK);
        cam.update();
        modelBatch.begin(cam);

        if (Renderer.getInstance().getVideoBackgroundConfig().getReflection() == VIDEO_BACKGROUND_REFLECTION.VIDEO_BACKGROUND_REFLECTION_ON)
            GLES20.glFrontFace(GLES20.GL_CW);  // Front camera
        else
            GLES20.glFrontFace(GLES20.GL_CCW); // Back camera

        // Set the viewport
        int[] viewport = vuforiaAppSession.getViewport();
        GLES20.glViewport(viewport[0], viewport[1], viewport[2], viewport[3]);

        // Did we find any trackables this frame?
        for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
        {
            // Get the trackable:
            TrackableResult trackableResult = state.getTrackableResult(tIdx);
            float[] modelViewMatrix = Tool.convertPose2GLMatrix(
                    trackableResult.getPose()).getData();

            // Choose the texture based on the target name:
            int textureIndex = 0;

            // Check the type of the trackable:
            assert (trackableResult.getType() == MarkerTracker.getClassType());
            MarkerResult markerResult = (MarkerResult) (trackableResult);
            Marker marker = (Marker) markerResult.getTrackable();
            textureIndex = marker.getMarkerId();
            float[] modelViewProjection = new float[16];
            Matrix.translateM(modelViewMatrix, 0, -kLetterTranslate, -kLetterTranslate, 0.f);
            Matrix.scaleM(modelViewMatrix, 0, kLetterScale, kLetterScale, kLetterScale);
            Matrix.multiplyMM(modelViewProjection, 0, vuforiaAppSession.getProjectionMatrix().getData(), 0, modelViewMatrix, 0);
            SampleUtils.checkGLError("FrameMarkers render frame");
            cam.view.idt();
            cam.projection.idt();
            cam.combined.idt();
            Matrix4 temp3 = new Matrix4(modelViewProjection);
            modelInstanceHouse.transform.set(temp3);
            modelInstanceHouse.transform.scale(0.05f, 0.05f, 0.05f);
            controller.update(Gdx.graphics.getDeltaTime());
            modelBatch.render(modelInstanceHouse);
        }
        GLES20.glDisable(GLES20.GL_DEPTH_TEST);
        modelBatch.end();

        // Close the rendering section opened by Renderer.getInstance().begin()
        Renderer.getInstance().end();
    }
}
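A note on the design choice at the end of the loop: cam.view, cam.projection and cam.combined are all reset to identity, and the full model-view-projection matrix produced by Vuforia is baked into modelInstanceHouse.transform instead. Since ModelBatch's default shader multiplies the camera's combined matrix by each instance's transform, an identity camera means the instance's transform alone becomes the final MVP, which places the model exactly on the tracked marker.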