
Feature

From Hydrographer To Total Art: One Developer’s Journey From Surveying To Synesthesia

We were treated to a guerrilla demo of indie project GNOSIS at VR Connects London earlier this year and it made a lasting impression on us. The way it presents information to the user – and how it allows them to navigate it – is unlike anything we’d seen before.

Here, lone developer Robert Bogucki recounts the journey of creating an award-nominated, data-rich VR artefact born of a passion for graphics and data, and reflects on what it means to do VR technology justice.

  

I’m a marine hydrographer by trade and a VJ/creative coder by passion; those disciplines are not as unrelated as they might seem. Hydrographic surveying involves a mix of earth sciences, IT, multi-sensor data fusion (GPS, inertial, sonar) and data processing. Working with the huge sea-floor mapping datasets for charting or subsea construction would simply not be possible at the scale and accuracy required today without the advances in graphics processors brought about by the games industry.

Growing up as a gamer, I was always fascinated by how hardware capabilities and limitations unleash creativity, with the architectures of graphics and sound chips bringing about new artistic styles. My passion for graphics led me to ocean mapping and an MSc at the University of New Hampshire, where I created an augmented reality system using the open-source Ogre3D game engine and the ARToolKit and OpenCV libraries.

Fusion of data streams also features heavily in the typical set-ups used by interactive media artists for installations and live performances; these often quite complex systems interface software and hardware via the MIDI and OSC protocols, the DMX standard and video texture sharing.

Hydrographers usually work five weeks on, five off – leaving me time to engage in passion projects, which included Machineparanoia, built with the Ogre3D game engine, Resolume and Spout. For an installation in Germany, we progressed this by adding a Kinect sensor, allowing festivalgoers to mix their dance moves into the projected VJ mix. This led to using Kinects to capture dance performances, which became the basis of the project I’ve been working on ever since...

 

Gnosis Launch Trailer from VJ Rybyk on Vimeo.

 

  • Unity And Unreal In Performative Art

    Though GNOSIS might seem like just another example of Unity-based VR, it emerged from a rather unique process of experimentation, during which I arrived at my own tools and techniques. My surveying and information visualisation experience came in useful when working with the Kinect pointcloud datasets, as well as with the accompanying generative audiovisualisations.

    Calibrating the dual Kinect sensor setup.

    VR masterpiece Thumper resembled a VJ set-up in the way it treated the entire world as a music track.

    At the recent A MAZE./Berlin independent game festival – where GNOSIS was nominated for the Most Amazing Game award – I watched DROOL’s Brian Gibson give a demo of the Level Editor tool for the VR masterpiece Thumper, and it bore an uncanny resemblance to a VJ set-up in the way it treated the entire gameworld as a music track, not unlike a sequenced video synthesiser.

    Brian’s naming of the pioneering videographer/animator Norman McLaren as an influence only reinforced that impression. Indeed, Unity and Unreal Engine are increasingly present in the performative art context, and VR is one of the reasons for that – see this excellent generative/interactive art Facebook group for examples.

  • Packing Pixels

    One GPU-centric interactive media technique which I feel is very relevant for VR – especially its ‘cyberpunky’, datavis-oriented variety – is pixel-packing. Most, if not all, games already utilise texture atlasing, where font glyphs or sub-textures are combined into a single texture for more efficient use of video card memory. Pixel-packing can be seen as extending and generalising this idea by storing general-purpose data inside the individual pixels of a texture.

    Once the pointcloud is streamed into Unity, it is trivial to multiply or modify it. The ‘reflection’ effect was generated on the fly using a geometry shader.

    Video RAM opens up a new world for reading, processing and displaying data.

    Thanks to the wonders of General-Purpose computing on Graphics Processing Units (GPGPU), once our data is in video RAM, a new world of possibilities opens up for reading, processing and displaying it; invaluable to game VFX artists, physics simulation experts and creative coders alike. Elburz Sorkhabi introduces the motivations and specifics of pixel-packing very accessibly here. Within my own project, pixel-packing helped me handle pointcloud streams, which are always challenging due to their sheer data volume.

     

    A single frame of fused and cleaned pointcloud data...

    ...pixel-packed into a PNG image in MATLAB.

  • From Dance To Texture

    The dance performance at the heart of GNOSIS was recorded with two calibrated Kinect depth cameras, providing full 360-degree coverage of the dancer. The pointclouds were pixel-packed into a sequence of 32-bit RGBA images (512x156 pixels) using MATLAB. Each image needed to encode the 3D positions and colours of 39,536 points, so it was crucial to consider the required range and precision of the positions, and the trade-off between spatial extent and small-scale precision: for instance, quantising a coordinate to 16 bits over a four-metre capture volume gives steps of about 0.06mm, while 8 bits over the same range only resolves 16mm.
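
    As an illustration, here is a rough Python sketch of that MATLAB packing step. The exact bit layout used in GNOSIS is not documented here, so the scheme below – two RGBA pixels per point, carrying 16-bit quantised X/Y/Z plus an RGB565 colour – is an assumption, chosen only because it fits the published numbers (39,536 points x 2 pixels fit within 512x156):

    ```python
    import numpy as np
    from PIL import Image

    W, H, N = 512, 156, 39536  # texture size and points per frame, as in the article

    def pack_frame(xyz, rgb, extent=4.0):
        """Pack one pointcloud frame into a W x H RGBA image (hypothetical layout).

        xyz: (N, 3) float positions in metres, assumed inside [-extent/2, +extent/2)
        rgb: (N, 3) uint8 colours
        Pixel 2i   -> x_lo, x_hi, y_lo, y_hi   (16-bit quantised X and Y)
        Pixel 2i+1 -> z_lo, z_hi, colour packed as RGB565
        """
        q = np.clip((xyz / extent + 0.5) * 65535.0, 0, 65535).astype(np.uint16)
        lo, hi = (q & 0xFF).astype(np.uint8), (q >> 8).astype(np.uint8)
        c565 = ((rgb[:, 0] >> 3).astype(np.uint16) << 11
                | (rgb[:, 1] >> 2).astype(np.uint16) << 5
                | (rgb[:, 2] >> 3).astype(np.uint16))
        px = np.zeros((H * W, 4), dtype=np.uint8)
        px[0:2 * N:2] = np.stack([lo[:, 0], hi[:, 0], lo[:, 1], hi[:, 1]], axis=1)
        px[1:2 * N:2] = np.stack([lo[:, 2], hi[:, 2],
                                  (c565 >> 8).astype(np.uint8),
                                  (c565 & 0xFF).astype(np.uint8)], axis=1)
        return Image.fromarray(px.reshape(H, W, 4), "RGBA")

    # pack_frame(points, colours).save("frame_0000.png")
    ```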

    Disk storage is abundant, while VR frame rendering requirements are steep.

    The resulting images were combined into a movie file; lossy codecs are completely unusable with this approach, so the choice was between lossless compression (for example MPNG) or uncompressed RGBA (‘raw’ video). I went with the latter to minimise CPU usage: disk storage is relatively abundant – uncompressed 512x156 RGBA works out to roughly 320KB per frame, under 10MB/s at the Kinect’s 30fps – while VR requirements for frame rendering times are steep.

    The movie is played back into a Render Texture in Unity using the SHUU Games AVI Player plugin; all texture filtering needed to be disabled, since any interpolation between texels would corrupt the packed values. Finally, Vertex Texture Fetches within vertex shaders read the encoded information for each vertex and decode it using bit-shifting/unpacking.
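
    For clarity, the decode the vertex shader performs can be written out in Python as the exact inverse of the packing sketch above (same hypothetical layout):

    ```python
    import numpy as np
    from PIL import Image

    def unpack_frame(path, n=39536, extent=4.0):
        """Recover positions/colours; mirrors the shader's bit-unpacking."""
        px = np.asarray(Image.open(path)).reshape(-1, 4).astype(np.uint16)
        p0, p1 = px[0:2 * n:2], px[1:2 * n:2]
        x = p0[:, 0] | (p0[:, 1] << 8)  # reassemble the 16-bit coordinates
        y = p0[:, 2] | (p0[:, 3] << 8)
        z = p1[:, 0] | (p1[:, 1] << 8)
        xyz = (np.stack([x, y, z], axis=1) / 65535.0 - 0.5) * extent
        c565 = (p1[:, 2] << 8) | p1[:, 3]
        rgb = np.stack([(c565 >> 11) << 3,          # expand RGB565 back to 8 bits
                        ((c565 >> 5) & 0x3F) << 2,
                        (c565 & 0x1F) << 3], axis=1).astype(np.uint8)
        return xyz.astype(np.float32), rgb
    ```

    In the game itself this runs per vertex in the shader, which is exactly why filtering must stay off: each fetch has to return the raw texel bytes untouched.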

     

    A lesson learned the hard way was not to use metallic fabrics or surfaces in an IR depth camera shoot, or you’ll have to deal with some data loss and very challenging artefacts.

  • Sound In Space

    It’s no secret that spatialised sound, especially when synchronised with visuals, is key to immersion in VR. While GNOSIS is no rhythm-action game, it relies heavily on audio to aid the player, who is exploring an enigmatic cyber-archive. To convey non-verbal clues, we used stylised audio waveform visualisations. They were inspired by the art of Oscilloscope Music, which involves drawing shapes (Lissajous figures) on an oscilloscope with the X/Y channels driven by the L/R channels of stereo audio.

    A three-channel, 46-second oscilloscope music audio waveform of an animated GNOSIS audiosculpture.

    Spatialised sound is key to immersion in VR.

    Using the recently released OsciStudio, created by Hansi Raber and Jerobeam Fenderson, we extended that technique into three dimensions to accommodate three-channel spatialised sound. Together with Dr. Jonathan Weinel, a researcher of A/V media and electronica, we created a series of synesthetic audiovisual sculptures in which the sound directly ‘paints’ animated visuals. The designed audio data was then pixel-packed and rendered with vertex and fragment shaders – see the figures below.

    A 4096 x 4096 pixel RGBA texture encoding 16 audio waveforms.

    Texture-encoded audio played back in-game via shaders as a 3D Lissajous effect synchronised with spatialised multichannel sound.

    See here for a video of the above.
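
    As a minimal sketch of the underlying idea – with frequencies and phases that are purely illustrative, not taken from the actual audiosculptures – the same three channels can double as both the audio and the XYZ path of the drawn figure:

    ```python
    import numpy as np

    sr, dur = 48000, 1.0                         # sample rate (Hz) and duration (s)
    t = np.arange(int(sr * dur)) / sr
    x = np.sin(2 * np.pi * 220 * t)              # channel 1 -> X
    y = np.sin(2 * np.pi * 330 * t + np.pi / 4)  # channel 2 -> Y
    z = np.sin(2 * np.pi * 440 * t)              # channel 3 -> Z

    audio = np.stack([x, y, z], axis=1)   # played back as three spatialised channels...
    vertices = audio.astype(np.float32)   # ...and drawn as a 3D Lissajous path
    ```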

  • Flying Solo

    Being a solo developer on GNOSIS certainly influenced most design decisions. It made perfect sense for me to reduce the asset-making burden by using generative techniques to create the game environments. While following the demoscene tradition of abstract, generative, audioreactive spaces created directly from information and code, I added a semantic/linguistic aspect – language, associations, culture.

    Using Python scripting, I mined the vast public domain ConceptNet5 graph database of concepts/words/names, following certain thematically relevant topics. The game uses that collection of 30,006 knowledge nodes, interlinked by 158,000 relationship graph edges, to randomly grow a unique, smaller graph network instance for every playthrough; a sketch of that growth process follows below.
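
    In GNOSIS the growing itself happens on the GPU, but the logic is simple enough to sketch on the CPU. A hypothetical Python version – random breadth-first expansion from a seed concept over the mined edge list, with all names invented for illustration – might look like this:

    ```python
    import random
    from collections import defaultdict

    def grow_subgraph(edges, seed, target=500):
        """Randomly grow a connected subgraph of ~target nodes from `seed`.

        edges: iterable of (node_a, node_b, relation) tuples from the mined archive
        """
        adj = defaultdict(list)
        for a, b, rel in edges:              # build an undirected adjacency map
            adj[a].append((b, rel))
            adj[b].append((a, rel))
        nodes, frontier, picked = {seed}, [seed], []
        while frontier and len(nodes) < target:
            node = frontier.pop(random.randrange(len(frontier)))  # random order
            for neighbour, rel in adj[node]:
                if neighbour not in nodes and len(nodes) < target:
                    nodes.add(neighbour)
                    frontier.append(neighbour)
                    picked.append((node, neighbour, rel))
        return nodes, picked

    # nodes, subgraph_edges = grow_subgraph(archive_edges, seed="gnosis")
    ```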

    An 8192 x 8192 32-bit RGBA texture encodes 158k connections of four types between 36k nodes, enabling the creation of a graph in a fraction of a second using shaders.

    Compute shaders make calculating millions of force vectors a trivial task at VR-required framerates.

    This graph du jour ties in with the gameplay puzzle elements and is available for self-paced exploration and discovery. Not surprisingly, this entire archive database ended up datapacked into a texture. All graph database operations in GNOSIS – querying nodes to find connected neighbours when building the random subgraph – would not be feasible without compute shaders.

    Pixel-packing amounts to the storage and data-delivery side of the GPGPU context, while compute shaders represent the processing side. They were also used to implement a force-driven graph simulation, which allows a randomly generated graph to self-organise over time using physics-based node interactions. Compute shaders make calculating the millions of force vectors required for this effect a trivial task at VR-required framerates.
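
    A single integration step of such a force-directed simulation is easy to sketch in numpy as a CPU stand-in – the constants are illustrative, and the real thing runs as a compute shader with one thread per node:

    ```python
    import numpy as np

    def layout_step(pos, edges, dt=0.02, k_rep=0.05, k_spring=0.5, rest=1.0):
        """One step of a force-directed layout.

        pos: (n, 3) float32 node positions; edges: (m, 2) int node-index pairs
        """
        d = pos[:, None, :] - pos[None, :, :]       # pairwise offsets (n, n, 3)
        dist = np.linalg.norm(d, axis=-1) + 1e-6    # avoid division by zero
        force = ((k_rep / dist ** 2)[..., None] * d / dist[..., None]).sum(axis=1)
        a, b = edges[:, 0], edges[:, 1]
        spring = pos[b] - pos[a]                    # spring along each edge
        length = np.linalg.norm(spring, axis=-1, keepdims=True) + 1e-6
        pull = k_spring * (length - rest) * spring / length
        np.add.at(force, a, pull)                   # edge endpoints attract...
        np.add.at(force, b, -pull)                  # ...each other symmetrically
        return pos + dt * force
    ```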

  • Synthesis Of The Arts

    A pattern emerges of representing data and moving it from one domain (visual, audio, symbolic) to another in order to leverage the massive parallelism of the GPU, creating synesthesia for the user at the same time. The terms ‘synesthetic’ and ‘psychedelic’ are often used to describe experimental VR titles such as Rez Infinite or Thumper, but both carry distracting connotations. An appreciative first-time player of GNOSIS at the Berlin exhibit taught me the German word Gesamtkunstwerk. That Romantic pursuit of a ‘synthesis of the arts’ may be relevant in bringing VR to its full potential.

    VR is intimately tied to human perception – injecting video straight into our eyes, messing with our proprioception, watching our every movement, tracking our gaze. In pursuit of perfectly photorealistic illusions to confound our brain, we’ve ended up creating increasingly bio-inspired machines, GPU architecture being a distant approximation of the human wetware. GPUs enable the images produced by convolutional neural network code like Google’s Deep Dream, whose digital hallucinations uncannily mirror the idiosyncrasies of our visual cortex.

     

    Inside A Living Mind

    Some players of GNOSIS shared impressions of being inside a living mind, as they were reminded of the way the mind processes information. Perhaps those were simply reactions to being immersed within the interconnected knowledge graph visualisation, but the remarks made me wonder whether having the game designed from the ground up to ‘live’ mostly on the GPU might have contributed to that effect.

    Clinically defined, synesthesia is the condition in which stimulating one sensory or cognitive pathway leads to experiences in another. If our minds were GPUs, seeing sounds and hearing colours would require no more than simply swapping one texture for another!

  • Doing VR Justice

    But let’s say we have the ‘total art’ of VR all figured out, with full control over the player’s reality. Where would we take them, and why? Every single developer out there will have a different answer... I suspect GNOSIS may not necessarily qualify as a game to some, which is why I refer to it as a VR music album or VR artefact – but the procedurally mixed collaborative soundtrack by five electronic artists is something we’re very excited about!

    If the player comes away with a consideration for the richness of thought that came before us, then I feel I've done VR some justice.

    It’s definitely not a passive experience either; it’s intended as an intellectually stimulating activity rather than plain consumption of media. Although the knowledge content presented to the user is broad and random, the user is nevertheless immersed in a complex construct of real-world ideas and relationships. Through a synergy of music, visuals and ideas, we hope to awaken curiosity and awareness rather than just mesmerise or desensitise with pointless sensory overload.

    If the player comes away from a 10-minute session of GNOSIS with a few new synonyms or a definition of an unfamiliar term – having perhaps travelled through a cluster of philosophers in the Lexicon Mode, triggering a reflection on the richness of thought that came before us – then I feel I've done VR technology some justice.

    Abstract synesthetic visualisation, GNOSIS.
