Tips For Using AR & VR At Events & Tradeshows

In this article we share some of our hard-earned experience of deploying Augmented Reality (AR) and Virtual Reality (VR) in real-world situations such as trade shows and events.


Much of the magic of VR lies in the ability to move naturally around virtual environments; in AR, it lies in seeing a virtual object anchored to the real world. Both are made possible by positional tracking. Without it the experience feels odd: in VR we can feel locked into a body we can’t control, and in AR objects can feel floaty and detached from the real world.

To achieve this, much as humans do, the device needs to track its position relative to some ideally fixed reference point (e.g. the floor or a wall). Otherwise things can get confusing!

A Little History

The first generation of VR headsets (HTC Vive and Oculus Rift) used fixed external sensors that communicated with the headset. The exact implementation varied between manufacturers, but the basic principle was that the sensors provided a fixed frame of reference from which the headset could work out its position.

This is still the most accurate method, but it is less convenient to set up: it requires more equipment and a clear line of sight between the sensors and the devices.


Second-generation headsets like the Oculus Rift S and Quest, as well as AR devices including mobile phones, use ‘inside-out’ tracking instead. This doesn’t require external sensors and is therefore much more convenient.

It works by using one or more cameras embedded in the device to look for visual features in the room that it can track. This, combined with information from the gyroscope and accelerometer (the equivalent of our inner ear), produces a conceptual model of the room in which the device can locate itself. This technique is called SLAM, short for Simultaneous Localisation and Mapping.
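To give a feel for why the gyroscope and accelerometer are combined rather than used alone, here is a toy sketch of a complementary filter, one of the simplest sensor-fusion techniques. All the numbers (the 0.98 blend weight, the simulated gyro bias) are illustrative assumptions, not values from any real headset:

```python
# Toy complementary filter: fuse a gyroscope rate with an accelerometer
# tilt reading to estimate one orientation angle over time.
# The 0.98 blend weight and the simulated sensor values are illustrative.

def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifts) with the
    accelerometer angle (noisy but drift-free)."""
    angle = accel_angles[0]  # start from the accelerometer reading
    estimates = []
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        # Trust the gyro for short-term motion, the accelerometer long-term.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
        estimates.append(angle)
    return estimates

# Simulated stationary device: true tilt is 10 degrees, but the gyro
# reports a small constant bias of 0.5 deg/s (a classic drift source).
gyro = [0.5] * 200
accel = [10.0] * 200
est = complementary_filter(gyro, accel)
print(round(est[-1], 2))  # stays near 10 despite the gyro drift
```

Integrating the gyro alone would drift steadily away from the true angle; the small accelerometer correction at each step keeps the estimate anchored, which is the same intuition behind fusing inertial data with camera features in SLAM.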

For simplicity, most computer vision works on grayscale images, so a visual feature is really a distinctive, high-contrast area in the camera view. Once the device has recognised a feature, it will attempt to re-locate this ‘reference point’ over time, and from that it can work backwards to determine how the user has moved relative to it. Typically it tracks many of these features at once, so if one goes out of view or gets obscured, it can still estimate its position. The more features, the greater the accuracy.
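The idea of “a feature is a high-contrast area” can be sketched in a few lines. Real trackers use proper corner detectors such as FAST or Harris; the version below is only a toy that flags patches whose local contrast exceeds a threshold, using a grayscale image represented as a list of lists:

```python
# Toy illustration of 'features as high-contrast areas': scan a grayscale
# image (values 0-255) and flag small patches with high local contrast.
# Real trackers use corner detectors (FAST, Harris); this is only a sketch.

def find_features(image, patch=3, min_contrast=60):
    """Return (row, col) centres of patches with high local contrast."""
    features = []
    h, w = len(image), len(image[0])
    for r in range(h - patch + 1):
        for c in range(w - patch + 1):
            window = [image[r + i][c + j]
                      for i in range(patch) for j in range(patch)]
            if max(window) - min(window) >= min_contrast:
                features.append((r + patch // 2, c + patch // 2))
    return features

# A featureless wall (uniform grey) vs the same wall with a dark logo on it.
blank_wall = [[128] * 8 for _ in range(8)]
logo_wall = [row[:] for row in blank_wall]
for r in range(3, 6):
    for c in range(3, 6):
        logo_wall[r][c] = 20  # dark square = trackable detail

print(len(find_features(blank_wall)))  # 0 - nothing to track
print(len(find_features(logo_wall)))   # > 0 - the logo edges are trackable
```

Note that the blank wall yields no features at all, which is exactly why the advice later in this article is to put a design or logo on an unavoidable blank wall.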

This is no easy task, and it’s a testament to modern technology that a device costing just a few hundred pounds can achieve it so reliably. However, we still need to be careful about the conditions in which we use the device.

Ideal Environments For VR/AR

For SLAM to work the ideal environment should:

  • Have sufficient features – The room should contain objects with discernible detail or shape. A completely smooth, featureless wall or floor is hard to detect, so prefer textured surfaces over plain ones, and add furniture to a completely bare room.
  • Be evenly lit – If the room is too dark, background noise in the cameras increases, which makes detection harder. Go too bright and glare will cause the cameras to white out, losing valuable information.
  • Be visually static – This seems obvious, as most rooms don’t move, but the system doesn’t really know what is the room and what isn’t. If you are in a moving crowd, projecting moving images onto the walls, or even have changing lighting, you have a visually dynamic environment, and that can cause issues.

That said, the tracking systems are pretty robust, so you don’t need all of these. However, the closer you can get to this ideal, the better the system will behave.
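As a quick illustration of the “evenly lit” point, a venue walkthrough could sanity-check camera frames with something as simple as a mean-brightness test. The thresholds below are illustrative guesses, not values from any headset specification:

```python
# Rough lighting sanity check for a venue: flag a camera frame whose mean
# brightness is too low (sensor noise) or too high (glare / white-out).
# The low/high thresholds are illustrative guesses, not spec values.

def lighting_check(pixels, low=40, high=215):
    """pixels: flat list of 0-255 grayscale values for one frame."""
    mean = sum(pixels) / len(pixels)
    if mean < low:
        return "too dark"
    if mean > high:
        return "too bright"
    return "ok"

print(lighting_check([20] * 100))   # too dark - expect noisy tracking
print(lighting_check([230] * 100))  # too bright - expect glare/white-out
print(lighting_check([120] * 100))  # ok
```

A real check would also look at contrast and glare hotspots rather than the mean alone, but even this crude test captures the two failure modes described above.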


If you are planning to use an ‘inside-out’ VR headset at an event, consider the following checklist (most of these apply to AR as well):

  • Headset is not directly facing a blank wall – If you can’t avoid this, then try putting a design or logo on the wall so there is something to track.
  • Users are not standing with their backs against a wall – It will work fine until the user turns their head, at which point the cameras have nothing to look at but a wall.
  • Lighting – Try to avoid a single bright spotlight that casts strong shadows or produces glare. Use several lights that illuminate the space evenly.
  • Avoid flashing, moving lights and projections – These will change the visual appearance of the room.
  • Consider shadows – A spotlight throwing moving shadows onto a wall that the device is tracking can confuse it.
  • Give users space – Crowds are confusing because people tend to move, so leave space between the user and the audience to let the cameras see the floor and walls as fixed tracking points. For safety reasons, it’s also good to have a barrier between users and spectators.
  • Check materials – Mirrors, glass and shiny surfaces appear different depending on the viewing angle and this can confuse the cameras. Matte surfaces are better.
  • Moving surfaces – This might be obvious, but rotating platforms and kinetic sculptures can cause issues.

If you are in doubt, an external tracking system may be a better fit for your use case.

Have any questions? Mbryonic is a digital agency that creates virtual reality, augmented reality and interactive 3D experiences, so drop us a line and we’ll be happy to answer your queries!

Get Started

Want to learn how VR or AR can benefit your organisation, or have a brief you'd like a quote on?

Our friendly experts are here to help. Fill in your details and we'll get right back to you.