The work stages the process of uploading the human mind to digital space. At this stage, the mind has already been integrated into that space, meaning the participant can perceive all of the virtual objects within it.
2020 Ars Electronica In Kepler's Gardens
2019 SIGGRAPH Asia Best XR Tech Award
2019 Arte Laguna Prize
2018 Taipei Digital Art Awards
Created by Taiwanese artists Hu Chin-Hsiang and Tsai Bing-Hua, this piece combines mixed reality, LED lights, a wearable device, and fans to build an installation that uploads the human mind to digital space. Imagine the upload process: at first you can still see virtual objects in real space. As you see those objects and feel their influence (wind and vibration), you pass through an upwardly extending tunnel and the view shifts into a completely virtual space, yet you never know whether the upload has been completed.
Combining VR with a depth-aware camera, the work creates a real-time illusion of the virtual and the real interlacing in space; viewers even see that their own hands have become digital.
The light in the real space changes, and the interplay of sound, light, and virtual illusion pulls the audience away from reality; while experiencing it, you cannot tell what is real and what is not.
Touch is transmitted through the digital space, allowing the virtual image to directly affect the viewer.
The virtual image makes direct contact with the viewer in the digital space, and that contact is conveyed to the audience through the vibration-equipped clothing.
Unity 3D is used in conjunction with an Oculus Rift to let the audience experience the VR space, together with a ZED Mini camera so that viewers can see the real world while wearing the VR headset; real-time particle effects blend the virtual objects into that camera image, and the program generates endless tunnels and a space composed of blocks.
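The endless tunnel can be built by keeping a fixed pool of ring-shaped segments ahead of the viewer and recycling each segment once it falls behind. In the installation this happens inside Unity (in C#); the C++ sketch below only illustrates that recycling logic, and every name and value in it (ring count, spacing, viewer movement) is an assumption rather than the artists' code.

```cpp
#include <vector>
#include <cstdio>

// Minimal sketch of endless-tunnel logic: a fixed pool of ring segments is
// kept ahead of the viewer; segments that fall behind are moved to the front.
struct RingSegment {
    float z;  // position of this ring along the tunnel axis
};

class EndlessTunnel {
public:
    EndlessTunnel(int ringCount, float spacing) : spacing_(spacing) {
        for (int i = 0; i < ringCount; ++i)
            rings_.push_back({i * spacing});
    }

    // Call every frame with the viewer's current position along the tunnel.
    void update(float viewerZ) {
        for (auto& ring : rings_) {
            if (ring.z < viewerZ - spacing_) {
                // Recycle: jump this ring to the far end of the tunnel.
                ring.z += rings_.size() * spacing_;
            }
        }
    }

private:
    std::vector<RingSegment> rings_;
    float spacing_;
};

int main() {
    EndlessTunnel tunnel(20, 2.0f);        // 20 rings, 2 units apart (assumed values)
    for (float z = 0; z < 100; z += 0.5f)  // simulate the viewer moving forward
        tunnel.update(z);
    std::printf("tunnel keeps recycling segments ahead of the viewer\n");
    return 0;
}
```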
The wearable device contains 8 vibration motors; an Arduino Fio runs several vibration programs and receives signals from the computer through an XBee radio. Eight fans surround the audience; each fan is driven through an AC dimmer that adjusts the current in real time to control the air volume.
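A minimal sketch of the wearable firmware, assuming the computer sends a single byte over the XBee link naming the motor to pulse; the pin assignments, baud rate, and pulse length are assumptions, not the artists' actual program.

```cpp
// Arduino Fio sketch (C++): pulse one of 8 vibration motors when a byte
// arrives over the XBee serial link. Pins and timing are assumed values.
const int MOTOR_PINS[8] = {3, 5, 6, 9, 10, 11, 12, 13};
const unsigned long PULSE_MS = 200;  // how long each motor vibrates (assumed)

void setup() {
  Serial.begin(57600);               // XBee link baud rate (assumed)
  for (int i = 0; i < 8; i++) {
    pinMode(MOTOR_PINS[i], OUTPUT);
    digitalWrite(MOTOR_PINS[i], LOW);
  }
}

void loop() {
  if (Serial.available() > 0) {
    int idx = Serial.read();         // motor index 0-7 sent by the computer
    if (idx >= 0 && idx < 8) {
      digitalWrite(MOTOR_PINS[idx], HIGH);
      delay(PULSE_MS);
      digitalWrite(MOTOR_PINS[idx], LOW);
    }
  }
}
```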
Both the wearable device and the fans integrate the Serial and OSC signals through openFrameworks (ofxSerial, ofxGui, ofxPubSubOsc): Unity tells the oF app which motor to vibrate and which fan program number to play.
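A minimal sketch of that bridge, written here with the core ofxOsc receiver and ofSerial rather than the ofxPubSubOsc and ofxSerial addons the artists list, so that it stays self-contained; the OSC addresses, port, and serial device name are assumptions.

```cpp
// openFrameworks (C++) bridge: receive OSC from Unity and forward commands to
// the Arduino Fio over the XBee serial link. Addresses, port, device assumed.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        receiver.setup(9000);                            // OSC port Unity sends to (assumed)
        serial.setup("/dev/tty.usbserial-XBEE", 57600);  // XBee dongle (assumed)
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/motor") {
                // Unity names a motor (0-7); forward that index as one byte.
                serial.writeByte((unsigned char)m.getArgAsInt32(0));
            } else if (m.getAddress() == "/fan") {
                // The fan program number would be forwarded the same way,
                // over a second ofSerial connected to the AC dimmer controller.
                ofLogNotice() << "fan program " << m.getArgAsInt32(0);
            }
        }
    }

private:
    ofxOscReceiver receiver;
    ofSerial serial;
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new ofApp());
}
```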
There are also 8 groups of LED strips around the audience, based on the WS2812 driver IC. They are controlled through a PixelPusher, with the real-time patterns written in Processing.
Hardware List:
- Oculus Rift CV1
- ZED Mini camera
- Wearable device (XBee + vibration motors + Arduino Fio)
- LED strips (WS2812)
- Fan x16
- Speaker
- Computer x1 (i7, 16 GB RAM, GTX 1070 x2)