In Chrome 67, we announced the WebXR Device API for both augmented reality (AR) and virtual reality (VR), though only the VR features were enabled. VR is an experience based purely on what's in a computing device. AR, on the other hand, allows you to render virtual objects in the real world. To allow placement and tracking of those objects, we've just added the WebXR Hit Test API to Chrome Canary, a new method that helps immersive web code place objects in the real world.

Where can I get it?

This API is intended to stay in Canary for the immediate future. We want a protracted testing period because this is a very new API proposal and we want to make sure it's both robust and right for developers. Aside from Chrome Canary, you'll also need a compatible smartphone running Android O or later. With these, you can dive into the demos or try out our codelab.
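If you want to detect support before offering AR on your page, a simple feature check will do. Treat the snippet below as a sketch based on the API surface as it currently stands in Canary (navigator.xr plus the experimental requestHitTest() method); expect it to need updating as the proposal evolves.

```js
// A minimal sketch, not official sample code: checks for the Canary-era
// WebXR surface (navigator.xr plus the experimental requestHitTest method).
const arHitTestSupported =
    !!(navigator.xr &&
       window.XRSession &&
       XRSession.prototype.requestHitTest);

if (arHitTestSupported) {
  // Safe to show an "Enter AR" button or similar entry point.
} else {
  // Fall back to a 2D experience, or point the user at a demo video.
}
```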
At Google IO this year, we demonstrated augmented reality with an early build of Chrome. The question people asked over and over was what app or browser extension they needed to install to see it. The answer was none: it's only the web. This may be obvious to you since you're reading this on a website devoted to the web, but if you build demonstrations with this new API, prepare for that question. You'll get it a lot. Speaking of IO, if you want to hear more about the immersive web in general, where it is and where it's going, check out this video.

What's it useful for?

Augmented reality will be a valuable addition to a lot of existing web pages. For example, it can help people learn on education sites, and allow potential buyers to visualize objects in their home while shopping. Our demos illustrate this. They allow users to place a life-size representation of an object as if it were really there. Once placed, the image stays on the selected surface, appears the size it would be if the actual item were on that surface, and allows the user to move around it as well as closer to it or farther from it.
This gives viewers a deeper understanding of the object than is possible with a two-dimensional image. If you're not sure what I mean by all of that, it will become clear when you use the demos. If you don't have a device that can run them, check out the video link at the top of this article.

One thing the demos and video don't show is how AR can convey the size of a real object. The video here shows an educational demo that we built called Chacmool. A companion article describes this demo in detail. The important thing for this discussion is that when you place the Chacmool statue in augmented reality, you're seeing its size as though it were actually in the room with you.

The Chacmool example is educational, but it could just as easily be commercial. Imagine a furniture shopping site that lets you place a couch in your living room. The AR application tells you whether the couch fits your space and how it will look next to your other furniture.
A key problem to solve when implementing augmented reality is how to place objects in a real-world view. The method for doing this is called ray casting. Ray casting means calculating the intersection between the pointer ray and a surface in the real world. That intersection is called a hit, and determining whether a hit has occurred is a hit test.

This is a good time to try out the new code sample in Chrome Canary. Before doing anything, double-check that you have the correct flags enabled. Now load the sample and click "Start AR".

Notice a few things. First, the speed meter, which you may recognize from the other immersive samples, shows 30 frames per second instead of 60. This is the rate at which the web page receives images from the camera. The other thing you should notice is the sunflower image. It moves as you move and snaps to surfaces such as floors and tabletops.
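That snapping behavior is driven by hit tests behind the scenes. To make the idea concrete, here's a rough sketch of what a hit test call looks like in the Canary-era proposal. It's illustrative rather than authoritative: `session` and `frameOfRef` are assumed to be set up elsewhere, the origin and direction values are stand-ins (in the real sample they're derived from the device's pose each frame), and `placeReticle()` is a placeholder of my own.

```js
// A sketch of a single hit test, assuming the Canary-era proposal.
// The ray's origin and direction are expressed in frameOfRef coordinates;
// here they describe a ray from the origin pointing straight ahead.
const origin = new Float32Array([0, 0, 0]);
const direction = new Float32Array([0, 0, -1]);

session.requestHitTest(origin, direction, frameOfRef)
  .then((results) => {
    if (results.length) {
      // A surface was hit. The result's hitMatrix is a pose you can use
      // to position the reticle (or the object itself) on that surface.
      placeReticle(results[0].hitMatrix);  // placeReticle() is hypothetical
    }
    // An empty result list means the ray didn't intersect a detected surface.
  });
```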
If you tap the screen, a sunflower will be placed on the surface and a new sunflower will move with your device. The image that moves with your device, and that attempts to lock to surfaces, is called a reticle. A reticle is a temporary image that aids in placing an object in augmented reality. In this demo, the reticle is a copy of the image to be placed, but it doesn't need to be. In the Chacmool demo, for example, it's a rectangular box roughly the same shape as the base of the object being placed.

The Chacmool demo shows what AR might look like in a production app. Fortunately, there is a much simpler demo in the WebXR samples repo. My sample code comes from the AR Hit Test demo in that repository. FYI, I like to simplify code examples for the sake of helping you understand what's going on. The basics of entering a session and running a render loop are the same for AR as they are for VR.
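To show what that looks like in practice, here's a rough sketch of session entry as the Canary-era samples handled it. Treat it as a sketch rather than the sample's exact code: the canvas setup and the onXRFrame() callback are simplified placeholders of my own, and the AR-specific details are the environmentIntegration and outputContext options, which may change as the proposal evolves.

```js
// A simplified sketch of entering an AR session with the Canary-era API.
navigator.xr.requestDevice().then((xrDevice) => {
  // The camera feed is composited behind the page's output, so the session
  // needs an output context tied to a canvas on the page.
  const outputCanvas = document.createElement('canvas');
  document.body.appendChild(outputCanvas);
  const ctx = outputCanvas.getContext('xrpresent');

  // In a real app, call requestSession() in response to a user gesture.
  return xrDevice.requestSession({
    environmentIntegration: true,  // this is what makes the session AR
    outputContext: ctx
  });
}).then((session) => {
  // The render loop looks just like the VR one: ask the session, not the
  // window, for animation frames.
  const onXRFrame = (time, frame) => {
    session.requestAnimationFrame(onXRFrame);
    // Get the pose and draw the scene here (omitted).
  };
  session.requestAnimationFrame(onXRFrame);
});
```

Because the render loop is just requestAnimationFrame() on the session instead of the window, code written for the VR samples carries over to AR almost unchanged.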