

Immersion Without Isolation


From the perils of experiencing VR in public to developing tech that can offer both immersion and integration, Ed Daly, MD at seeper, explores the future of consensual Virtual Reality.

 

VR has the power to create the most incredible immersive experiences, to transport us from the ‘real’ world into fantastical virtual environments. But of course this means we lose contact with our surroundings and our ability to interact naturally with others; this is a problem we need to solve.

At seeper, we use creative technology for live events and installations, and our relationship with VR is, well, mixed. On the one hand, clients are interested in the novelty and immersive experiences we can create with VR. On the other hand, once a visitor is wearing a VR headset, the physical venue matters less, if at all - perhaps undermining the future of destinations that ask customers to get off the sofa and travel to them, be that a theme park, museum or experiential event. And then there are the practical challenges we face when using VR in public spaces.

 

VR In Public - Five Challenges

 

1) Disconnecting Your Senses From The Real World... In Public

We’ve evolved to use our senses for matters of life and death. We may not be in danger of being mauled by a tiger while enjoying a VR experience in a natural history museum, but we might just be in danger of having our bag swiped! There’s something disconcerting about surrendering our awareness of what’s happening around us in a public space. We need to mitigate this…

 

2) Bumping Into Things!

VR systems that allow you to move around present problems at home too, but in public spaces the fear of walking into walls or into total strangers is heightened. I recently had this experience at Bjork Digital, where pairs of guests wandered around an enclosed space and we were all distracted, wondering whether we were about to clatter into one another.

 

3) Interacting With Hosts

Pretty much any live VR experience requires some guidance from support staff, whether that be practical advice on adjusting focus or help navigating an interactive experience. This makes it hard for the user and the host to communicate, and hard for the host to understand what the user is experiencing.

 

4) Interacting With Products

We’ve looked several times at how a brand’s product can be incorporated into a VR experience. The challenge is how to enable a VR user to pick something up, or even to place something in their hand without mishap.

 

5) Interacting With Other Visitors

The destination experiences we create are for groups, and often families. Immersion in the virtual can mean disconnection from the shared experience. On a thrill ride, part of the fun is exchanging excited thoughts as the ride starts to move, or watching the reaction on a friend’s face.

Interacting with hosts, products and visitors in VR is challenging.
 

We’ve considered all the above challenges, along with the existing approaches to solving them; realising those approaches have limitations, we’ve developed our own solution...

 

VR In Public - Five Solutions

 

1) Augmented/Mixed Reality

Well, the answer to achieving immersion without isolation is simple: it’s Augmented or Mixed Reality. Right? Devices like HoloLens or the Meta2 solve all of the above problems, but there is a price - there is no isolation, but there is no real immersion either. Placing digital ‘holograms’ around the real world has its applications for sure, and we’re working with this technology in themed entertainment environments where the immersion in a fantastical world is provided by physical scenery, lighting, projection-mapping and more. However, in most cases an AR/MR experience cannot be described as truly immersive.

 

2) Skeletonisation

So if the problem with AR/MR is that we’re not really ‘immersed’, another solution is to use sensors to bring data about the physical world into immersive VR displays. Skeletonisation is what Kinect and LeapMotion do: sensors and software recognise people (or their hands), and this ‘skeleton’ can then be used to drive digital avatars or hands. So which of the five challenges does this solve? Well, LeapMotion doesn’t get you far; it’s designed specifically to allow you to use your hands to interact with the digital world - but you can’t see the physical things around you and you can’t pick anything up, because these ‘things’ can’t be skeletonised. Full body skeletonisation for ‘avateering’ (avatar puppeteering) can in theory help resolve ‘interacting with hosts and other visitors’. The reality is often glitchy avatars, as the software struggles to track the skeleton, especially in environments with noisy data (sunlight, crossed arms, people standing close to each other). Nonetheless, it’s a partial solution for interacting with people.

Skeletonisation: great for hands, not so much for the body.
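To give a flavour of how glitchy joint data usually gets tamed before it drives an avatar, here is a minimal, purely illustrative sketch - not tied to any particular SDK - that exponentially smooths each joint position per frame. The joint names and values are made up for the example.

```python
# Minimal sketch: smoothing noisy skeleton data before it drives an avatar.
# Assumes some sensor SDK delivers a dict of joint name -> (x, y, z) each
# frame; the names and values below are illustrative only.

from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

class SmoothedSkeleton:
    """Exponentially smooths raw joint positions to reduce avatar jitter."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha              # 0 = frozen, 1 = raw (jittery) data
        self.joints: Dict[str, Vec3] = {}

    def update(self, raw_joints: Dict[str, Vec3]) -> Dict[str, Vec3]:
        for name, (x, y, z) in raw_joints.items():
            if name not in self.joints:
                self.joints[name] = (x, y, z)
            else:
                px, py, pz = self.joints[name]
                a = self.alpha
                self.joints[name] = (
                    px + a * (x - px),
                    py + a * (y - py),
                    pz + a * (z - pz),
                )
        return self.joints

# Usage: feed each sensor frame through the filter, then pose the avatar rig.
skeleton = SmoothedSkeleton(alpha=0.25)
frame = {"head": (0.02, 1.71, 0.0), "hand_left": (-0.31, 1.22, 0.18)}
posed = skeleton.update(frame)
```

The trade-off is the usual one: more smoothing means less jitter but more lag, which is exactly the compromise that makes avateering feel a step behind the real person.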

3) Digital Twin

This is terminology used in the Internet of Things to refer to a digital version of a physical object, fitted with sensors that report its state so that the ‘digital twin’ can reflect the physical object in realtime. A case in point is the VIVE controller. As anyone who has used the VIVE can attest, this works great - and hints at the magic of introducing physical, tactile things into the virtual world. However, this isn’t a general solution. It’s possible to use sensors (whether embedded or external) to track objects - i.e. to go beyond ‘skeletons’ of people and hands - so that other ‘things’ can be twinned and interacted with inside VR. But this only works for specific objects that the software knows about in advance, and ideally objects with embedded sensors - so that rules out the cup of tea I might want to drink while playing a VR game!

Vive Controllers work great, but only if the software knows about objects in advance.
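To make the limitation concrete, here is a toy sketch of the digital-twin idea: only objects registered with the software ahead of time can be mirrored into VR. The class and registry names are hypothetical, not a real engine or SDK API.

```python
# Toy illustration of a 'digital twin': the renderer only mirrors objects
# it already knows about, updated each frame from tracking data.
# Names are illustrative placeholders, not a real SDK.

from dataclasses import dataclass, field

@dataclass
class Pose:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)   # quaternion (x, y, z, w)

@dataclass
class DigitalTwin:
    name: str
    mesh_asset: str                          # virtual model registered in advance
    pose: Pose = field(default_factory=Pose)

    def sync(self, tracked_pose: Pose) -> None:
        """Mirror the physical object's latest tracked pose."""
        self.pose = tracked_pose

# Only objects registered ahead of time can be twinned -- an arbitrary
# cup of tea has no entry here and therefore never appears in VR.
registry = {
    "vive_controller_left": DigitalTwin("vive_controller_left", "controller.obj"),
}

def on_tracking_update(object_id: str, pose: Pose) -> None:
    twin = registry.get(object_id)
    if twin is not None:
        twin.sync(pose)                      # unknown objects are simply ignored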

4) Camera Feed

There are some demos out there using the front-facing camera on a VIVE. You could go further and overlay digital content, in an attempt to create a HoloLens-like experience using a fully immersive VR headset and a camera. However, this camera feed is just 2D and doesn’t enable the digital and physical content to be treated as anything more than layers; in effect, we’re further back than where we started with HoloLens. Picture-in-picture is another way to go; it solves the practical problems, though it makes for a pretty odd experience and can be disorienting. So while this may be a useful option for ‘switching’ between the immersive and the real, and it’s more elegant than taking the headset on and off, again we’re forced to choose between immersion and isolation.

We're pretty sure this guy is real; his top says so.
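The layering problem is easy to show in a few lines: with only a 2D feed, all you can do is blend the camera image over the rendered frame or inset it picture-in-picture; there is no depth, so real and virtual content can never properly occlude each other. A rough sketch using NumPy, with frame sizes and the crude nearest-neighbour resize purely for illustration:

```python
# Sketch: a 2D camera feed can only be composited as a flat layer
# (full-frame blend or picture-in-picture inset). No depth information
# means real and virtual content cannot occlude one another.

import numpy as np

def blend_overlay(vr_frame: np.ndarray, cam_frame: np.ndarray, alpha: float) -> np.ndarray:
    """Alpha-blend the camera image over the rendered VR frame (same size)."""
    return ((1.0 - alpha) * vr_frame + alpha * cam_frame).astype(vr_frame.dtype)

def picture_in_picture(vr_frame: np.ndarray, cam_frame: np.ndarray, scale: float = 0.25) -> np.ndarray:
    """Inset a shrunken camera view in the corner of the VR frame."""
    h, w = vr_frame.shape[:2]
    ph, pw = int(h * scale), int(w * scale)
    # Nearest-neighbour downsample, just to keep the sketch dependency-free.
    ys = np.linspace(0, cam_frame.shape[0] - 1, ph).astype(int)
    xs = np.linspace(0, cam_frame.shape[1] - 1, pw).astype(int)
    out = vr_frame.copy()
    out[:ph, :pw] = cam_frame[ys][:, xs]
    return out
```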

5) Realtime Scanning

So the final solution I’m going to describe is what we’re currently developing under the name seeForm. The story behind seeForm began after we received several enquiries from drinks brands looking for a VR experience which allowed users to sample their drink while immersed in a VR environment. So we needed a solution to ‘interacting with things’. LeapMotion style skeletonisation of hands doesn’t give you the bottle. HoloLens won't immerse you in the virtual environment…

So what we did is attach an Intel RealSense laser depth sensor to a VIVE. (One challenge is that this Intel device can’t see anything closer than one metre away; there is a new model due to be released soon which will solve this.) For any engineers out there wondering: we did look at using LeapMotion to do this, but their solution is hardwired to prevent access to the raw depth data, because they are focused on skeletonised hand tracking.

HTC Vive coupled with Intel's RealSense laser depth sensor.
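For anyone curious what reading such a sensor looks like in practice, here is a minimal sketch using Intel's pyrealsense2 Python wrapper. It is not the seeForm code itself, and the stream settings are illustrative; note how the minimum-range limitation mentioned above shows up as a simple distance reading.

```python
# Minimal sketch: reading depth + colour frames from an Intel RealSense
# sensor using Intel's pyrealsense2 wrapper. Resolution/FPS are illustrative.

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    color = frames.get_color_frame()
    # Distance (in metres) at the centre pixel -- the sensor's minimum
    # range limits how close objects can be scanned.
    centre_distance = depth.get_distance(320, 240)
    print(f"Centre of frame is {centre_distance:.2f} m away")
finally:
    pipeline.stop()
```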

Then what we can do is scan the environment in realtime to create a point cloud representation of the real world; we get both 3D points and colour samples. In our initial prototype, we’re rendering out the data with minimal processing though, in principle, it can be used to create other 3D representations (for example, using the ‘surface reconstruction’ process that HoloLens uses under the hood).

Realtime scanning.
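Conceptually, building the coloured point cloud is the next step on from grabbing frames. A rough sketch, again using pyrealsense2's built-in pointcloud helper rather than anything seeForm-specific:

```python
# Sketch: turning a depth + colour frame pair into a coloured point cloud
# with pyrealsense2's pointcloud helper (continues the snippet above).

import numpy as np
import pyrealsense2 as rs

def frames_to_point_cloud(depth_frame, color_frame):
    """Return (N, 3) vertices in metres, (N, 2) texture coords, and the colour image."""
    pc = rs.pointcloud()
    pc.map_to(color_frame)                 # associate texture coordinates
    points = pc.calculate(depth_frame)     # project depth pixels to 3D
    vertices = np.asanyarray(points.get_vertices()).view(np.float32).reshape(-1, 3)
    tex_coords = np.asanyarray(points.get_texture_coordinates()).view(np.float32).reshape(-1, 2)
    color_image = np.asanyarray(color_frame.get_data())
    return vertices, tex_coords, color_image

# Usage with the depth/color frames captured above:
# vertices, uvs, image = frames_to_point_cloud(depth, color)
```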

The use of LIDAR-type scanning is nothing new, but to solve our problems we need it to work in realtime, with a scanner you can wear. The hardware to achieve this is now (almost) available, alongside software like seeForm.

This gives us the ability to bring the physical world - obstacles, things to pick up - and people into the virtual world, and to combine and manipulate them digitally any way you want. We had fun demoing the seeForm prototype at a Wired event recently, using a scanned real environment (which included people), mixed with the realtime seeForm view. Users were unsure whether the people they were seeing were physically with them at the venue or were part of the virtual world! As one of the people hosting the event, I found it fantastic to be able to step inside their virtual experience, literally pointing out features and communicating naturally.

 

What Next?

We need a more refined hardware and software solution, but for us the direction of travel is clear. We want a ‘virtual-real slider’, or mixing desk, allowing us to integrate the real and the digital flexibly and seamlessly. We can use proximity, gesture and voice to let people in the real world step into immersive experiences when they need to; and if (while wearing a VR headset) I reach out to pick something up, then I want to see it appear. All the components are there, so with a little more work we needn’t have to choose between immersion and isolation.
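As a thought experiment, the ‘virtual-real slider’ could be as simple as a single mix value applied every frame. The sketch below is purely conceptual - the scene and point-cloud structures are placeholders, not a real engine API.

```python
# Purely conceptual sketch of a 'virtual-real slider': one mix value decides
# how strongly the scanned real world shows up inside the virtual scene.
# The scene/point structures are placeholders, not a real engine API.

def compose_frame(virtual_objects, real_points, mix):
    """mix = 0.0 -> fully virtual, mix = 1.0 -> real world fully visible."""
    mix = max(0.0, min(1.0, mix))
    drawables = list(virtual_objects)        # virtual content is always drawn
    for point in real_points:
        faded = dict(point)
        faded["opacity"] = faded.get("opacity", 1.0) * mix
        drawables.append(faded)              # real-world points fade with the slider
    return drawables

# Triggers like proximity, gesture or voice would simply nudge the mix value:
def on_reach_gesture(current_mix):
    return min(1.0, current_mix + 0.5)       # reveal nearby real objects
```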

 

About The Author

Ed Daly has spent 20 years as an entrepreneur in the creative industries, starting and growing video games development studios and an innovation consultancy. At seeper, Ed’s mission is to lead the creative and technical team to new heights and bring their work to a bigger audience. This involves building a scalable operation while retaining a culture of innovation. Ed has a Computer Science degree (and just about remembers how to code) and an MBA. Outside of work, Ed takes the children on long walks to wear them out so he can be left to read long books.
