Feature

GDC 2022: PlayStation Virtual Reality 2

While Sony itself wasn’t present in any significant capacity at Game Developers Conference 2022, that didn’t stop exciting new PS VR 2 secrets trickling out from other sources.

Announcing expanded support for PS VR 2 across its offerings, Unity – the makers of cross-platform game development tools – reminded us of the upcoming hardware’s innate powers, as well as hinting at some of its previously unseen capabilities and the possibilities open to bright-minded developers.

PS VR 2 tech specs

In brief, PS VR 2 features 4K OLED displays, a 110-degree field of view, a resolution of 2000 x 2040 per eye and up to 120Hz refresh rates. Meanwhile, improved motion tracking (over the previous PS VR hardware) promises a more immersive experience without a requirement for external tracking hardware. You can see the full specs in this previous article.

Ahead of the system's launch, Unity’s presentation at GDC focussed on two of PS VR 2’s most exciting new features:

Firstly, Unity will feature extensive support for the on-board eye-tracking. This high-level tech is built into every PS VR 2 and Unity sees it as being a major part of every offering on the platform. Not only is eye-tracking a useful input into the game – such as being able to know where the player is looking in order to highlight in-game choices and eliminate needless ‘gorilla arm’ waving – but the Unity team was keen to spell out the exciting possibilities behind the scenes.

Foveated rendering takes advantage of the inherent way the eye and brain work together to perceive an image. While it’s easy to believe that everything within your full field of vision is perfectly in focus and given equal prominence in your perception, in reality only a tiny part of your view – your ‘gaze’ (aka the part being focused on by your fovea) – actually gets your brain’s full attention.

Foveated rendering prioritises this gaze point above the rest of the frame, reducing detail in areas outside the fovea’s range. Now your true point of focus – revealed through the PS VR 2’s eye-tracking – can be given maximum care and attention with all other parts of the image taking a backseat. The result is a huge saving in render power with imperceptible impact for the user. After all, if you can’t ‘see’ it, then why render it?
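For the curious, here’s a rough sketch of the idea in Python – not Unity’s actual API or the PS VR 2 SDK, and with made-up region sizes and scale factors – showing how a renderer might pick a shading resolution for each part of the frame based on its distance from the gaze point, and how fixed foveation differs from the eye-tracked kind:

```python
import math

# Illustrative per-region shading scales: full detail at the gaze point,
# progressively coarser shading further into the periphery.
FOVEA_DEG = 5.0   # assumed angular radius rendered at full resolution
MID_DEG = 20.0    # assumed mid-peripheral band

def shading_scale(region_center_deg, gaze_deg):
    """Return a render-resolution multiplier for a screen region,
    based on its angular distance from the gaze point."""
    dx = region_center_deg[0] - gaze_deg[0]
    dy = region_center_deg[1] - gaze_deg[1]
    distance = math.hypot(dx, dy)
    if distance <= FOVEA_DEG:
        return 1.0    # foveal region: render at native resolution
    if distance <= MID_DEG:
        return 0.5    # mid periphery: half-resolution shading
    return 0.25       # far periphery: quarter-resolution shading

# Fixed foveated rendering simply assumes the gaze sits at the lens centre:
fixed_gaze = (0.0, 0.0)
# Eye-tracked foveated rendering swaps in the live gaze sample each frame:
tracked_gaze = (12.0, -3.0)  # example: player glancing right and slightly down

print(shading_scale((15.0, 0.0), fixed_gaze))    # 0.5  (mid periphery under fixed foveation)
print(shading_scale((15.0, 0.0), tracked_gaze))  # 1.0  (full detail, because the player is looking there)
```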

And new single-pass rendering further reduces CPU usage by traversing the scene only once when rendering for both eyes.

It all means that, using Unity’s system, the team was able to achieve a 2.5x GPU performance gain when implementing fixed foveated rendering (based on an assumed fixed focal point), and this figure increased to a 3.6x boost when combined with eye-tracking to concentrate the GPU’s power where the player is actually looking.

In addition to knowing where the player is looking, the hardware (and Unity’s software) can also detect pupil diameter – which could be used to deliver the optimum display settings – and ‘blink states’, letting the game know if the player had their eyes closed and potentially missed a vital part of the action.
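As a purely illustrative sketch – the field names and thresholds below are invented, not anything from the PS VR 2 SDK or Unity’s packages – a game might consume those extra signals something like this:

```python
from dataclasses import dataclass

@dataclass
class EyeSample:
    # Hypothetical per-frame eye-tracking reading; field names are placeholders.
    pupil_diameter_mm: float
    is_blinking: bool       # True while the eyelid is closed

def adjust_for_eye_state(sample: EyeSample, brightness: float, event_pending: bool):
    """Return (new_brightness, delay_event) for one eye-tracking sample.

    Pupil diameter acts as a rough comfort proxy (wide pupil -> scene may feel
    dim, narrow pupil -> may feel too bright), and a blink during a scripted
    moment suggests briefly delaying it so the player doesn't miss it."""
    if sample.pupil_diameter_mm > 6.0:
        brightness = min(1.0, brightness + 0.05)
    elif sample.pupil_diameter_mm < 3.0:
        brightness = max(0.2, brightness - 0.05)
    delay_event = sample.is_blinking and event_pending
    return brightness, delay_event

# Example: a dilated pupil mid-blink while a key scripted beat is queued.
print(adjust_for_eye_state(EyeSample(6.5, True), brightness=0.8, event_pending=True))
# Prints the nudged-up brightness (~0.85) and True (hold the beat until the eyes reopen).
```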

More realistic avatars

Similarly, eye-tracking allows for the creation of more realistic avatars for social apps and the like, animating more lifelike faces and expressions and bringing digital likenesses to life. Unity even offers ‘wink detection’ for use in social apps, which could also be implemented in-game to trigger an interaction.

And the benefits don’t end there. Eye-tracking can also produce metrics to show where the player is sending their attention, and could therefore be used to trigger in-game ‘clues’ and nudges for gamers walking in circles, missing the point or generally barking up the wrong tree.

Players can ‘feel’ the 3D world

Alongside eye-tracking, PS VR 2’s enhanced haptics similarly push the envelope and deliver exciting new possibilities.

In the demo, the Unity team suggested multiple uses for the haptic motor inside the headset. This could be triggered to represent the rush of an object passing close to the head, or to let the player know when they’ve reached a boundary in a play area or come into contact with a surface.

Most interestingly, the team offered up the possibilities of ‘3D haptics’ where the vibration units in the left controller, headset and right controller could be triggered at different strengths and timings. By triggering the left controller at full strength, the headset at medium (a few milliseconds later) and the right at low strength milliseconds after that, the player will mentally build a 3D picture and perceive an impact or hit to their left.
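Here’s a minimal sketch of that staggering logic in Python – the timings and strengths are illustrative, and the pulse() function simply stands in for whatever haptics call the platform actually exposes:

```python
import threading

def pulse(device: str, strength: float) -> None:
    # Stand-in for a real haptics call; here we just log what would fire.
    print(f"rumble {device} at {strength:.0%}")

def impact_from_left() -> None:
    """Sketch of the '3D haptics' idea: stagger the three vibration units
    (left controller, headset, right controller) in both strength and time
    so the player perceives a hit arriving from the left."""
    schedule = [
        (0.000, "left controller",  1.0),   # full strength, immediately
        (0.015, "headset",          0.5),   # medium, a few milliseconds later
        (0.030, "right controller", 0.2),   # low, a few milliseconds after that
    ]
    for delay_s, device, strength in schedule:
        threading.Timer(delay_s, pulse, args=(device, strength)).start()

impact_from_left()
```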

And it’s worth noting that the hand controllers not only feature vibration motors but speakers too, allowing developers to further build a sense of 3D space around the player through the combined use of sound and haptics. Also, clever controller features such as finger touch detection let the game know where your finger is resting on a button without actually pressing it, potentially allowing more realistic rendering of in-game hands and finger positions.
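A toy example of how that resting-touch data could drive a hand pose – the control names and curl values below are invented for illustration, not SDK constants:

```python
# Hypothetical touch-sensor readout: which controls the player's fingers are
# resting on (without pressing).
touch_state = {
    "trigger": True,     # index finger resting on the trigger
    "grip": True,        # middle finger on the grip button
    "thumbstick": False  # thumb lifted off the stick
}

def finger_pose(touch_state: dict) -> dict:
    """Turn resting-touch flags into simple curl targets (0 = extended,
    1 = fully curled) for posing an in-game hand model."""
    return {
        "index":  0.6 if touch_state.get("trigger") else 0.1,
        "middle": 0.6 if touch_state.get("grip") else 0.1,
        "thumb":  0.5 if touch_state.get("thumbstick") else 0.0,
    }

print(finger_pose(touch_state))
# e.g. {'index': 0.6, 'middle': 0.6, 'thumb': 0.0}
```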

And with six degrees of freedom motion tracking and the adaptive triggers lifted from the PS5’s DualSense controller (which can be made stiffer or looser depending on what you’re pushing, lifting or moving in-game), it all adds up to a new experience that the player can feel and hear in their hands.
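As a final hypothetical sketch, a game might map the heft of whatever the player is holding to a normalised trigger resistance along these lines (the mapping and values here are purely illustrative; real adaptive-trigger control goes through the platform SDK):

```python
def trigger_resistance(load_kg: float, max_load_kg: float = 50.0) -> float:
    """Map the weight of whatever the player is pushing, lifting or moving
    to a normalised trigger-resistance value (0 = loose, 1 = stiff)."""
    return max(0.0, min(1.0, load_kg / max_load_kg))

# Examples: an empty hand, a bowstring at half draw, a heavy crate.
for label, load in [("empty hand", 0.0), ("bowstring", 18.0), ("heavy crate", 60.0)]:
    print(f"{label}: resistance {trigger_resistance(load):.2f}")
```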

Multiple display modes put you in the game

Finally, the team outlined the hardware’s different output modes, which let an external screen either mirror the player’s view – so passive viewers can see what the headset wearer is enjoying – or provide an alternative, interactive view that lets them play along and be part of the action from a different viewpoint.

It’s certainly an exciting package of features and we can’t wait to see what smart developers will be able to do with it. Sign up at Unity to find out more.

This article was first published on BeyondGames.biz.

Editor - PocketGamer.biz

Daniel Griffiths is a veteran journalist who has worked on some of the biggest entertainment media brands in the world. He's interviewed countless big names, and covered countless new releases in the fields of videogames, music, movies, tech, gadgets, home improvement, self build, interiors and garden design. Yup, he said garden design… He’s the ex-Editor of PSM2, PSM3, GamesMaster and Future Music, ex-Deputy Editor of The Official PlayStation Magazine and ex-Group Editor-in-Chief of Electronic Musician, Guitarist, Guitar World, Rhythm, Computer Music and more. He hates talking about himself.