Pablo Estrada - 2 years ago 307
Java Question

How can I render a LibGDX modelInstance on Vuforia AR library?

I know most questions on the community should include at least some code, but I'm totally lost here; I don't even know where to start. What I want to do is use the Vuforia AR library to render LibGDX 3D ModelInstances. However, I don't know how to make Vuforia render the ModelInstances, or how to use a LibGDX camera as its camera.

I've done external research, but I have not been able to find useful information. Can anyone help me get started with this?

Answer Source

OK, so I finally managed to combine both libraries. I'm not sure whether what I'm doing is the most efficient way of working, but it has worked for me.

First of all, I based my work on the sample apps from Vuforia, specifically the FrameMarkers example.

I opened an empty LibGDX project, imported the Vuforia jar, and copied over SampleApplicationControl, SampleApplicationException, SampleApplicationGLView, SampleApplicationSession, FrameMarkerRenderer and FrameMarker.

Next, I created some attributes on LibGDX's AndroidLauncher class and initialized all the Vuforia stuff:

public class AndroidLauncher extends AndroidApplication implements SampleApplicationControl {
    private static final String LOGTAG = "FrameMarkers";

    // Our OpenGL view:
    public SampleApplicationGLView mGlView;
    public SampleApplicationSession vuforiaAppSession;
    // Our renderer:
    public FrameMarkerRenderer mRenderer;
    RobotGDX gdxRender;
    // The textures we will use for rendering:
    public Vector<Texture> mTextures;
    public RelativeLayout mUILayout;

    public Marker dataSet[];

    public GestureDetector mGestureDetector;

    public LoadingDialogHandler loadingDialogHandler = new LoadingDialogHandler(this);

    // Alert Dialog used to display SDK errors
    private AlertDialog mErrorDialog;

    boolean mIsDroidDevice = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        vuforiaAppSession = new SampleApplicationSession(this);
        AndroidApplicationConfiguration config = new AndroidApplicationConfiguration();

        /*config.r = 8;
        config.a = 8;
        config.g = 8;
        config.b = 8;*/

        // Load any sample-specific textures:
        mTextures = new Vector<Texture>();
        gdxRender = new RobotGDX(vuforiaAppSession);
        initialize(gdxRender, config);

        mGestureDetector = new GestureDetector(this, new GestureListener());

        mIsDroidDevice = android.os.Build.MODEL.toLowerCase().startsWith("droid");
    }
}


I needed access to the activity, so I created a setmActivity() method on SampleApplicationSession.
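A minimal sketch of what that setter amounts to. SampleApplicationSession normally receives the Activity in its constructor; the setter lets the LibGDX launcher inject it later. The class names below are simplified stand-ins (ActivityRef is a hypothetical placeholder for android.app.Activity so the sketch compiles without Android):

    // Simplified stand-in for SampleApplicationSession. The real class
    // stores the Activity handed to its constructor; setmActivity()
    // mirrors the setter described above.
    class ActivityRef {}

    class SessionSketch {
        private ActivityRef mActivity;

        SessionSketch(ActivityRef activity) {
            mActivity = activity;
        }

        // The added setter: lets callers swap in the hosting activity later.
        public void setmActivity(ActivityRef activity) {
            mActivity = activity;
        }

        public ActivityRef getActivity() {
            return mActivity;
        }
    }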

After that, I extended LibGDX's ApplicationAdapter class and passed the vuforiaAppSession in as an attribute, so I could access everything I had initialized:

public class MyGDX extends ApplicationAdapter  {
    ModelInstance modelInstanceHouse;
    private AnimationController controller;
    Matrix4 lastTransformCube;
    // Constants:
    static private float kLetterScale = 25.0f;
    static private float kLetterTranslate = 25.0f;
    // OpenGL ES 2.0 specific:
    private static final String LOGTAG = "FrameMarkerRenderer";
    private int shaderProgramID = 0;
    private Vector<com.mygdx.robot.Texture> mTextures;
    //SampleApplicationSession vuforiaAppSession;
    PerspectiveCamera cam;
    ModelBuilder modelBuilder;
    Model model;
    ModelInstance instance;
    ModelBatch modelBatch;
    static boolean render;
    public SampleApplicationSession vuforiaAppSession;

    public MyGDX(SampleApplicationSession vuforiaAppSession) {
        this.vuforiaAppSession = vuforiaAppSession;
    }

The last important thing to keep in mind is the render() method. I based mine on the render method of the FrameMarkerRenderer. It has a boolean that gets set when the camera starts, so I simply set that flag both in the Vuforia AR initialization and in the render() method. I also had to leave the camera at an identity matrix and multiply the model by the modelViewMatrix.

public void render() {
    if (render) {
        // Clear color and depth buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

        // Get the state from Vuforia and mark the beginning of a rendering
        // section
        State state = Renderer.getInstance().begin();
        // Explicitly render the Video Background
        Renderer.getInstance().drawVideoBackground();

        // We must detect if background reflection is active and adjust the
        // culling direction. If the reflection is active, the post matrix
        // has been reflected as well, so standard counter-clockwise face
        // culling would render the models "inside out".
        if (Renderer.getInstance().getVideoBackgroundConfig().getReflection() == VIDEO_BACKGROUND_REFLECTION.VIDEO_BACKGROUND_REFLECTION_ON)
            GLES20.glFrontFace(GLES20.GL_CW);  // Front camera
        else
            GLES20.glFrontFace(GLES20.GL_CCW); // Back camera

        // Set the viewport
        int[] viewport = vuforiaAppSession.getViewport();
        GLES20.glViewport(viewport[0], viewport[1], viewport[2], viewport[3]);

        // Did we find any trackables this frame?
        for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++) {
            // Get the trackable:
            TrackableResult trackableResult = state.getTrackableResult(tIdx);
            float[] modelViewMatrix = Tool.convertPose2GLMatrix(
                trackableResult.getPose()).getData();

            // Choose the texture based on the target name:
            int textureIndex = 0;

            // Check the type of the trackable:
            assert (trackableResult.getType() == MarkerTracker.getClassType());
            MarkerResult markerResult = (MarkerResult) (trackableResult);
            Marker marker = (Marker) markerResult.getTrackable();
            textureIndex = marker.getMarkerId();

            float[] modelViewProjection = new float[16];
            Matrix.translateM(modelViewMatrix, 0, -kLetterTranslate, -kLetterTranslate, 0.f);
            Matrix.scaleM(modelViewMatrix, 0, kLetterScale, kLetterScale, kLetterScale);
            Matrix.multiplyMM(modelViewProjection, 0, vuforiaAppSession.getProjectionMatrix().getData(), 0, modelViewMatrix, 0);
            SampleUtils.checkGLError("FrameMarkers render frame");

            // Feed the combined matrix to the LibGDX camera (which stays
            // at identity) and draw the model with it:
            Matrix4 temp3 = new Matrix4(modelViewProjection);
            cam.combined.set(temp3);
            modelInstanceHouse.transform.scale(0.05f, 0.05f, 0.05f);
            modelBatch.begin(cam);
            modelBatch.render(modelInstanceHouse);
            modelBatch.end();
        }
        Renderer.getInstance().end();
    }
}

It's a lot of code, but I hope it helps people who are trying to integrate the two libraries. I don't think this is efficient, but it's the only solution I've come up with.
