Set up the session

Now we're going to track our reference image for the first time! 😎

Create the viewport

We begin by creating the viewport. Remember that the viewport is the area in which we'll display the augmented scene. Add the following to index.html and to ar-demo.js:

index.html
<!doctype html>
<html>
    <head>
        <meta charset="utf-8">
        <meta name="viewport" content="width=device-width,initial-scale=1">
        <title>MARTINS.js WebAR demo</title>
        <script src="martins.js"></script>
        <script src="ar-demo.js"></script>
        <style>body { background-color: #3d5afe; }</style>
    </head>
    <body>
        <div id="ar-viewport"></div>
        <img id="my-reference-image" src="my-reference-image.webp" hidden>
        <video id="my-video" hidden muted loop playsinline autoplay oncanplay="this.muted=true;this.play()">
            <source src="my-video.webm" type="video/webm" />
            <source src="my-video.mp4" type="video/mp4" />
        </video>
    </body>
</html>
ar-demo.js
window.onload = async function()
{
    try {
        if(!Martins.isSupported()) {
            throw new Error(
                'Use a browser/device compatible with WebGL2 and WebAssembly. ' +
                'Your user agent is ' + navigator.userAgent
            );
        }

        const tracker = Martins.Tracker.ImageTracker();
        await tracker.database.add([{
            name: 'my-reference-image',
            image: document.getElementById('my-reference-image')
        }]);

        const viewport = Martins.Viewport({
            container: document.getElementById('ar-viewport')
        });
    }
    catch(error) {
        alert(error.message);
    }
};

Create the source of data

Let's set up our source of data. We get the HTMLVideoElement corresponding to the test video and then we use it to instantiate a video source of data. Write the following to ar-demo.js:

ar-demo.js
window.onload = async function()
{
    try {
        if(!Martins.isSupported()) {
            throw new Error(
                'Use a browser/device compatible with WebGL2 and WebAssembly. ' +
                'Your user agent is ' + navigator.userAgent
            );
        }

        const tracker = Martins.Tracker.ImageTracker();
        await tracker.database.add([{
            name: 'my-reference-image',
            image: document.getElementById('my-reference-image')
        }]);

        const viewport = Martins.Viewport({
            container: document.getElementById('ar-viewport')
        });

        const video = document.getElementById('my-video');
        const source = Martins.Source.Video(video);
    }
    catch(error) {
        alert(error.message);
    }
};

Start the session

The session is a central component of a WebAR experience. The Martins namespace has a very special method called startSession. It receives a settings dictionary that lets us configure the new session in different ways. Add the following code to ar-demo.js:

ar-demo.js
window.onload = async function()
{
    try {
        if(!Martins.isSupported()) {
            throw new Error(
                'Use a browser/device compatible with WebGL2 and WebAssembly. ' +
                'Your user agent is ' + navigator.userAgent
            );
        }

        const tracker = Martins.Tracker.ImageTracker();
        await tracker.database.add([{
            name: 'my-reference-image',
            image: document.getElementById('my-reference-image')
        }]);

        const viewport = Martins.Viewport({
            container: document.getElementById('ar-viewport')
        });

        const video = document.getElementById('my-video');
        const source = Martins.Source.Video(video);

        const session = await Martins.startSession({
            mode: 'immersive',
            viewport: viewport,
            trackers: [ tracker ],
            sources: [ source ],
            stats: true,
            gizmos: true,
        });
    }
    catch(error) {
        alert(error.message);
    }
};

Most of the settings passed to startSession correspond directly to the concepts we saw earlier. We're starting a new session in immersive mode, with the tracker, source of data and viewport that we have just configured. Let me explain what stats and gizmos mean:

  1. When you set stats: true, you're asking the engine to display a stats panel that shows useful data such as the current framerate. This is useful when developing WebAR experiences, but you should disable it in production.

  2. The option gizmos: true enables the gizmos. Gizmos are visual artifacts that help you visualize the current state of the tracker. They too are useful in development. In production, you may disable them or enable them partially (more on that later).
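
Both settings are mainly meant for development. As a minimal sketch, assuming a hypothetical DEBUG flag of your own (the option names are the ones shown above), a production build could simply turn both off:

// hypothetical flag controlled by your build or environment (not part of MARTINS.js)
const DEBUG = false;

const session = await Martins.startSession({
    mode: 'immersive',
    viewport: viewport,
    trackers: [ tracker ],
    sources: [ source ],
    stats: DEBUG,   // show the stats panel only while developing
    gizmos: DEBUG,  // draw the gizmos only while developing
});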

Open http://localhost:8000. You should see the tracking in action. Even though there is no virtual scene yet, the gizmos will show you the image being tracked.

Image tracking in action!

The code I have just presented is, in essence, what you need to start a session. I'm going to move it to a new function called startARSession for convenience:

ar-demo.js
window.onload = async function()
{
    try {
        const session = await startARSession();
    }
    catch(error) {
        alert(error.message);
    }
};

async function startARSession()
{
    if(!Martins.isSupported()) {
        throw new Error(
            'Use a browser/device compatible with WebGL2 and WebAssembly. ' +
            'Your user agent is ' + navigator.userAgent
        );
    }

    const tracker = Martins.Tracker.ImageTracker();
    await tracker.database.add([{
        name: 'my-reference-image',
        image: document.getElementById('my-reference-image')
    }]);

    const viewport = Martins.Viewport({
        container: document.getElementById('ar-viewport')
    });

    const video = document.getElementById('my-video');
    const source = Martins.Source.Video(video);

    const session = await Martins.startSession({
        mode: 'immersive',
        viewport: viewport,
        trackers: [ tracker ],
        sources: [ source ],
        stats: true,
        gizmos: true,
    });

    return session;
}

Now all you have to do to start a new session is call startARSession()!

Write the user callback

The user callback is a function responsible for updating and rendering the virtual scene. We have no virtual scene at the moment, but we can already set up that function. In order to do this, we must call session.requestAnimationFrame() and pass the user callback as an argument.

ar-demo.js
window.onload = async function()
{
    try {
        const session = await startARSession();

        function animate(time, frame)
        {
            session.requestAnimationFrame(animate);
        }

        session.requestAnimationFrame(animate);
    }
    catch(error) {
        alert(error.message);
    }
};

async function startARSession()
{
    // ...
}

requestAnimationFrame

Note that session.requestAnimationFrame() is different from window.requestAnimationFrame(). The former is a call to the WebAR engine, whereas the latter is a standard call to the web browser.