Experiments That Are Taking the Oculus Rift Beyond the Gaming Industry

Although the Oculus Rift is not yet available to the general public, hackers and developers have already built a rich subculture around the virtual reality headset, and they are not limiting themselves to gaming. The openness of the platform means developers can give free rein to their imagination and pair the headset with additional hardware.

In Salted Perception, an experiment by the German developer Undev, a Kinect is strapped to the Rift headset with cable ties. A companion program takes the camera signal, applies colorful psychedelic filters, and feeds it back into the Rift. The resulting image is distorted and abstract, but still workable: it creates a kind of “psychedelic reality” for the viewer. The Kinect mount is quite heavy, but the experiment hints at what augmented reality could become if future Rift models include a front-facing camera.
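
The basic idea behind such a passthrough is easy to prototype: grab a camera frame, distort its colors, and show it back to the wearer. The sketch below is only an illustration (not Undev’s code); it uses OpenCV with an ordinary webcam standing in for the Kinect and a simple hue rotation as the “psychedelic” filter.

```python
import cv2

# Minimal passthrough sketch: capture frames from a webcam (a stand-in for the
# Kinect), apply a crude "psychedelic" hue rotation, and display the result.
# In the real setup this output would be rendered inside the Rift, not a window.
cap = cv2.VideoCapture(0)
hue_shift = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Rotate the hue channel a little more on every frame for a swirling color effect.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hsv[:, :, 0] = (hsv[:, :, 0].astype(int) + hue_shift) % 180  # OpenCV hue range is 0-179
    hue_shift = (hue_shift + 2) % 180
    filtered = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

    cv2.imshow("psychedelic passthrough", filtered)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```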

Gender Swap, an exciting experiment by BeAnotherLab, also relies on a camera attached to the headset, allowing two people of different sexes to swap viewpoints and see themselves from the other person’s perspective. The illusion becomes even more convincing when the participants agree to mirror each other’s hand and leg movements. The Spanish studio behind the project says the idea is for the Rift to help foster empathy. A similar experiment could also let people in wheelchairs “walk”, or at least experience something close to walking.

Sagar Patel, an Indian developer formerly employed at Q-Games, built a pilot game for the Oculus Rift, Frequency Domain, which generates surreal landscapes out of music. Just import an audio file, and its waveforms sculpt abstract polygonal mountains and canyons around the viewer. Patel says that watching music visualized this way is incredibly fascinating. “Watching the music helps me focus on it and hear the entire array of sounds”, he says. He also plans to add a Leap Motion controller to introduce an additional element of interaction. “The Rift opens up new perspectives, and I can’t wait until people start coming up with new ways of interacting with the headset in combination with other kinds of devices.”
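
The underlying technique, carving terrain out of an audio signal, can be illustrated in a few lines. The sketch below is not Patel’s implementation: it simply splits a mono WAV file into short windows and turns each window’s spectrum into one row of a heightmap that a mesh generator could extrude into mountains and canyons.

```python
import wave
import numpy as np

def audio_to_heightmap(path, window=2048, width=64):
    """Turn a mono 16-bit WAV file into a 2-D heightmap (one row per audio window).

    Each window's magnitude spectrum is binned down to `width` columns,
    so louder frequencies become taller "mountains" in that row.
    """
    with wave.open(path, "rb") as wav:
        raw = wav.readframes(wav.getnframes())
        samples = np.frombuffer(raw, dtype=np.int16).astype(np.float32)

    n_rows = len(samples) // window
    heightmap = np.zeros((n_rows, width))
    for i in range(n_rows):
        chunk = samples[i * window:(i + 1) * window]
        spectrum = np.abs(np.fft.rfft(chunk))    # magnitude spectrum of this window
        bins = np.array_split(spectrum, width)   # reduce to `width` terrain columns
        heightmap[i] = [b.mean() for b in bins]

    # Normalize heights to [0, 1] so a renderer can scale them as it likes.
    return heightmap / (heightmap.max() + 1e-9)

# heights = audio_to_heightmap("track.wav")  # each row is one strip of terrain
```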

In SharkPunch, an experiment by Chaotic Moon, a Leap Motion controller is attached to the front of the headset. Placed on a table, this USB-stick-sized device lets you control software with hand gestures; mounted on the front of the Rift, it creates the feeling of “seeing” your own hands. SharkPunch is a pretty simple game, but it makes effective use of the otherwise fairly useless Leap Motion hardware. Razer’s Hydra is another way to simulate hand movement in virtual reality, although juggling its two physical controllers is much less convenient than the gesture control used in SharkPunch.
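
For developers who want to try a similar setup, the sketch below shows roughly what reading hand positions from a head-mounted Leap Motion might look like. It assumes the legacy Leap Motion v2 Python bindings and their HMD-optimization policy flag; it is an illustration, not SharkPunch’s code.

```python
import sys
import Leap  # legacy Leap Motion v2 SDK bindings, assumed to be on the Python path

class HandPrinter(Leap.Listener):
    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            pos = hand.palm_position  # millimetres, relative to the device
            print("palm at x=%.0f y=%.0f z=%.0f" % (pos.x, pos.y, pos.z))

listener = HandPrinter()
controller = Leap.Controller()
# Tell the SDK the device is mounted on a headset rather than lying on a desk,
# so tracking is optimized for the head-mounted orientation.
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)
controller.add_listener(listener)

sys.stdin.readline()  # keep the script alive until Enter is pressed
controller.remove_listener(listener)
```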


A group of students at the Royal College of Art in London developed an intriguing virtual reality tool named GravitySketch, built around a Wacom-style tablet. As you draw in the air, the motion of your hands is captured by an Arduino-based device, sent to a computer, and visualized in the headset. The result is three-dimensional shapes that float right in front of the viewer.
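
The data path is straightforward to prototype: the Arduino streams hand positions over a serial link, and the host collects them into a stroke for rendering. The sketch below only illustrates that idea with pyserial; the port name and the “x,y,z” line format are assumptions, not GravitySketch’s actual protocol.

```python
import serial  # pyserial

# Illustrative host-side reader: an Arduino (port name is an assumption) streams
# one "x,y,z" line per sample; we collect the points into a stroke that a
# renderer could draw as a floating 3-D line inside the headset.
port = serial.Serial("/dev/ttyACM0", 115200, timeout=1)
stroke = []  # list of (x, y, z) points making up the current line

while len(stroke) < 1000:
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        continue
    try:
        x, y, z = (float(v) for v in line.split(","))
    except ValueError:
        continue  # skip malformed lines
    stroke.append((x, y, z))

port.close()
print("captured %d points" % len(stroke))
```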


B.A.D., a drone from Intuitive Aerial, is described as “the world’s most advanced platform for aerial cinematography”. It lets filmmakers shoot scenes in the air without a helicopter. In the company’s latest experiment, its drones wirelessly transmitted live video to an Oculus Rift headset with a delay of about 120 milliseconds, which is low enough for good-quality viewing. Low-resolution webcams were used, but in theory the platform could carry more advanced optics. The technology could also be put to military use.
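
The receiving half of such a rig, pulling a live network video feed and displaying it to the wearer, can be prototyped with OpenCV, assuming an FFmpeg-enabled build. The stream URL below is an assumption, not Intuitive Aerial’s actual downlink.

```python
import cv2

# Illustrative receiver: open a network video stream and show frames as they
# arrive. In the real rig this window would be rendered inside the Rift.
stream = cv2.VideoCapture("udp://0.0.0.0:5000")  # requires FFmpeg support in OpenCV

while True:
    ok, frame = stream.read()
    if not ok:
        break
    cv2.imshow("drone feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

stream.release()
cv2.destroyAllWindows()
```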

One of the most significant problems with the Oculus Rift is the loss of peripheral vision: the sensation of wearing glasses never quite goes away. To correct this, Caleb Kraft of hackaday.com came up with an ingenious solution. He used a clone of Philips’ Ambilight technology (LEDs that mimic the colors of the film you are watching) and placed them around the periphery of the headset. The LEDs react to what is happening on screen, simulating peripheral vision. If Oculus fails to find a way to widen the headset’s narrow field of view, this could be quite a good solution.
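
Ambilight-style clones generally work by sampling the edges of the current frame and pushing the average colors to LEDs over a serial link. The sketch below shows that idea only; the serial port and the one-LED-per-side protocol are assumptions, not Kraft’s actual build.

```python
import time

import numpy as np
import serial              # pyserial
from PIL import ImageGrab  # Pillow screen capture (Windows/macOS)

PORT = "/dev/ttyUSB0"      # assumed serial port of the LED controller

def edge_colors(strip_px=40):
    """Average the colors of the left and right edges of the screen."""
    frame = np.array(ImageGrab.grab())[:, :, :3]  # H x W x RGB
    left = frame[:, :strip_px].reshape(-1, 3).mean(axis=0)
    right = frame[:, -strip_px:].reshape(-1, 3).mean(axis=0)
    return left.astype(int), right.astype(int)

with serial.Serial(PORT, 115200) as led:
    while True:
        left, right = edge_colors()
        # Hypothetical protocol: six raw bytes, RGB for the left LED then the right.
        led.write(bytes(list(left) + list(right)))
        time.sleep(0.05)  # ~20 updates per second is plenty for ambient light
```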