Can new computer vision techniques be used to craft a virtual reality environment that is more connected to our perception of the physical world?
This project was born out of my dislike for archetypal virtual environments that, in all of their hyperrealism, felt disconnected from our fuzzy perception of the real world.
To challenge this paradigm, I crafted an interactive VR environment by relinquishing creative control to computer vision algorithms designed to reconstruct real-world spaces from photographic data. The goal was a virtual environment that feels more immersive precisely because it is anchored in our perception of the physical world.
Liquefied Realities is built from digital reconstructions of demolition yards and construction sites in Toronto. Each site was photographed, converted into a 3D model, and arranged into an immersive environment in the Unreal game engine, which viewers could navigate and interact with using controllers.
In 2018, this project was included in the First Workshop on Computer Vision for Fashion, Art and Design at the European Conference on Computer Vision in Munich, Germany. In 2017, it was exhibited in the LAST Festival at Stanford University and in the Tech Art Fair at the Ontario Science Centre. This project was also used to create a music video for musician Julius Smack.
Sound design for this project was done by Liam Baker.