The Oculus Rift
This section explores the possibilities that virtual reality and an immersive experience can bring to the experiment.
One of the main aims of the overall project is to help the user connect with his/her own voice.
Being totally surrounded may help the user disconnect from reality and unlock the voice, especially if the voice is the only tool the user has to interact with the environment.
The experience is approached as a sort of game in which the user has to repel objects that would otherwise crash into him/her.
The interactive solution takes inspiration from everyday life. We are the target of a myriad of tiny problems that metaphorically fall on our heads; sometimes they chase us, and often we feel totally unable to react to them.
I think that introducing the user into a safe, alternative reality will help stimulate a reaction.
For this part I used the ofxOculusDK2 addon. I started from the basic example and adapted the whole environment to my needs: inserting lights, adding a voice-reactive texture that wraps around the external walls, and finally placing a solid floor that uses shaders.
The bullet physics world
One of the main elements of the project was implementing physics in the world, for both the Oculus and the Leap. To do this I used the ofxBullet addon, which wraps the Bullet physics library in a very efficient way. The trickiest part was matching the cameras of the three different frameworks (Bullet, Leap and Oculus). Then I recreated an audio-reactive floor which responds only to the user's voice (just the voice, not any other sound).
The voice tool
The only way the user can control the surrounding world is with the voice. The system uses ofxMaxim and Mel-frequency cepstral coefficients (MFCCs). In this way the objects react only to the voice and not to other sounds: clapping hands or whistling will intentionally not work.
To implement the algorithm in the project I wrote two main functions. I create two polylines: in the first one the points are generated from the index i and the coefficient mfccs[i]; in the second one I use mfccs[i] and mfccs[i+1] as the point coordinates. Then I get the centroid of each of these polygons. The centroids give me a Euclidean distance, which also turns out to approximate the average of the data for the kind of voice being used. This is an approximate method I found to characterise the type of voice in real time.
Another function gets the RMS (root mean square) of the voice signal.
The drawing hands
One of the most difficult parts of this project was the implementation of the Leap Motion in the Oculus Rift environment.
I managed to merge the two “worlds” together, but there are still problems with the Oculus Rift distortion, which doesn't match the Leap camera.
I started out from the examples of the ofxLeapMotion and ofxLeapMotion2 addons. I had to modify part of the code to merge some functionalities. I redesigned the hands and extended the addon so it can be integrated with VR headsets, using methods from the official SDK.
The app that uses the Leap Motion to draw, and to modify the drawing with the voice, works well as a standalone application. It has been extended with gestures that create objects in the field.
However, I decided to focus on the VR experience for now. Having the hands in the virtual environment would be helpful for creating shapes without keyboard commands, but openFrameworks is probably not the best tool for this. I might have to port everything to Unity in the future.