[Mind map: Technology Workflow, covering real-time video software (VVVV, Pure Data, Javascript, Quartz Composer, Max/MSP/Jitter), sensor technologies (GSR, EEG, pulse), virtual reality integration (MU toolkit, Unity, Oculus Rift), 3D modelling, a custom PC build and video compression (Hap).]
Technology Workflow

During my initial research into the different types of real-time video software I was using a Mac. Given that VVVV is only available for Microsoft Windows, I focused on software that I could test on my machine at the time. Now that I have moved over to PC I may follow up on VVVV if I run into issues with Max/MSP/Jitter. In the meantime, I need to limit my software choices to what works and to the visual languages I am most familiar with. I am already dealing with a wide range of software types and programming languages, and I am wary of the work involved in making changes to an already diverse workflow.

Quartz Composer

Quartz Composer is a node-based visual programming language that comes from Apple's Xcode development environment. One of the key reasons why I moved away from this approach is that most of the open source projects and communities developing for virtual reality are building on the Windows operating system.

Kineme | Serial IO

I tested a potentiometer with the Kineme Serial patch to demonstrate communication between Arduino and Quartz Composer, plugging a particle system into the string output from the Arduino to illustrate the functionality.
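
The test itself used Kineme's Serial patch rather than code, but for reference the same potentiometer read can be sketched in JavaScript with Johnny-Five (one of the options in the workflow map), assuming a board running StandardFirmata and the wiper on pin A0:

    // Stream a potentiometer reading from an Arduino via Johnny-Five.
    var five = require('johnny-five');
    var board = new five.Board();

    board.on('ready', function () {
      var pot = new five.Sensor({ pin: 'A0', freq: 100 }); // report every 100 ms
      pot.on('data', function () {
        console.log(this.value); // 0-1023, ready to drive a particle parameter
      });
    });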

Max/MSP/Jitter

My initial concern when experimenting with different real-time video software was being able to detect when a piece of film had finished playing and to immediately use sensor data to cue the next clip without a noticeable delay. This was easier said than done. In most cases, loading a video gives you access to its frame count, so you could write code that prepares the next video, based on sensor data, before or as the current one ends. However, I don't want to be limited to clips of the same length, and while looking for a more fluid method I found a more effective workaround in Jitter.

In addition, Max/MSP/Jitter offers easy communication via OSC and serial. Finally, for bringing these videos into VR, Max/MSP/Jitter offers a couple of approaches, from generating virtual environments entirely in the software to exporting video textures to the Unity engine. With these options in mind, and the stability shown across the various tests I have run, I have decided to make this the core real-time video software for my practice.
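
As a rough sketch of that workaround's logic, assuming a jit.movie object with its loopreport attribute enabled feeding a [js] object (the clip names and the BPM threshold here are placeholders):

    // When jit.movie (@loopreport 1) signals its loop point, pick the
    // next clip from the most recent sensor reading.
    var bpm = 0;

    function msg_int(v) {   // latest sensor value arriving as an int
        bpm = v;
    }

    function loopnotify() { // message sent by jit.movie at the loop point
        var next = (bpm > 80) ? 'clip_intense.mov' : 'clip_calm.mov';
        outlet(0, 'read', next); // tell jit.movie to load the next clip
    }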

OSC/Serial

Both of these methods allow real-time communication between Arduino-based sensors and Max/MSP/Jitter. OSC is used where writing over serial is not an option.
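
On the serial side, the [serial] object outputs one byte at a time, so the stream has to be grouped back into whole readings. A minimal [js] sketch of that parsing, assuming the Arduino prints one integer per line:

    // Rebuild newline-terminated integer readings from a stream of bytes.
    var buf = '';

    function msg_int(b) {
        if (b === 10) {                  // newline: a full reading has arrived
            outlet(0, parseInt(buf, 10));
            buf = '';
        } else if (b !== 13) {           // skip carriage returns
            buf += String.fromCharCode(b);
        }
    }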

Sensor Technologies

In order to obtain unconscious interaction I am currently looking at three sensor types: galvanic skin response (GSR), electroencephalogram (EEG) and pulse. All of these use either serial or OSC to send their data to Max/MSP/Jitter, where it can be programmed to create different kinds of interaction.

Electroencephalogram (EEG)

This is the most ambitious of the sensors I propose using, as it is still an incredibly experimental approach. Positioning and head movement from the Oculus Rift could potentially affect the interpreted data, so the extent to which I use EEG in my research is still under consideration. I am also concerned about whether legitimate meaning can be derived from my sensor data, especially given the experimental approaches I am taking with interactive narrative design. On the software side I am looking at Neuromore and Processing; on the hardware side, the Open BCI Ganglion (including 3D printing its headset) and the Emotiv Epoc+.

Pulse Sensor (BPM)

This approach is the most basic of the three. It is an entirely open source project that can print a BPM graph in Processing, but I favour serial printing the BPM variable so the data can easily be sent to Max/MSP/Jitter. There it can be extracted and worked into an interactive narrative system.
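
Once the BPM values arrive in Max, a small [js] stage can normalise them before they drive the narrative logic. A sketch, where the 50-120 BPM range is an assumed calibration rather than a measured one:

    // Map incoming BPM to a 0..1 control value for the narrative patch.
    var lo = 50, hi = 120; // assumed resting/active range

    function msg_int(bpm) {
        var norm = (bpm - lo) / (hi - lo);
        outlet(0, Math.max(0, Math.min(1, norm))); // clamped to 0..1
    }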

Virtual Reality Integration

When implementing virtual reality integration there are a number of considerations involved in bringing interactive video projects created in Max/MSP/Jitter into virtual environments. The following branches not only outline some key approaches in my practical research, but also indicate the relationship between each of these aspects, giving further insight into the development process.

MU Max-Unity Interoperability Toolkit

This toolkit was made by Virginia Tech back in 2008, but it is still the only toolkit that allows Jitter video to be sent to Unity as video textures. Given its age, most of the documentation I could find indicated that the toolkit had become deprecated and was no longer compatible with newer versions of Unity. However, while researching interactive VR installation art projects that use film, I came across more recent uses of MU. Since Unity's VR support was being used in these projects, I deduced that their creators had overcome the deprecation issues. Having reached out to a couple of them and heard back, I got some advice, but nothing concrete, as it seemed to be something none of the artists had documented. After an extensive period of tinkering with the scripts I have managed to get all aspects of the toolkit working with the newest version of Unity, and I plan on creating a resource so other practitioners can work with this toolkit once I have finalised some VR tests.

Netsend/Netreceive

The MU toolkit makes use of the netsend and netreceive objects created by Olaf Matthes to send and receive messages over a network. All the developer has to do is make sure that the ports match between Unity and Max.
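
As a hypothetical illustration of that pairing (the address and port numbers below are placeholders, not MU defaults), the Max side might be sent:

    connect 127.0.0.1 9000

to point netsend at the machine running Unity, while a netreceive object is given the port it should listen on for the return channel. The MU component in Unity then needs to be configured with the same port numbers in the opposite direction.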

Unity

Unity is primarily a game development engine, but it can also be used to build interactive or virtual environments beyond the function of a traditional game. Some of the key reasons I have decided to use Unity are that it has a built-in physics engine with gravity, offers collision detection so interactors can't pass through solid objects, and supports VR development.

Microphone Input

Given that I am considering building one of my installations around real-time video of the interactor being shown to them inside a VR environment, I need to consider how audio can be used to expand this experience. For this piece I have been considering hanging a condenser microphone from the ceiling and carrying its input into the VR environment as well. I have made a script that adds this functionality to Unity at a keypress.
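
The script itself is not reproduced here, but the core idea is small. A minimal sketch in Unity's JavaScript-style UnityScript, assuming the M key as the toggle, the default microphone, and an AudioSource on the same GameObject:

    // Route the default microphone into an AudioSource when M is pressed.
    #pragma strict

    function Update() {
        if (Input.GetKeyDown(KeyCode.M)) {
            var source : AudioSource = GetComponent.<AudioSource>();
            // null = default device; 10-second looping buffer at 44.1 kHz
            source.clip = Microphone.Start(null, true, 10, 44100);
            source.loop = true;
            source.Play();
        }
    }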

MU Unity Terrain Test

This initial test was to see how a basic terrain would work surrounded by video textures. After showing this to my supervisors, they advised making my approach more streamlined, focusing on the moving images rather than adding complex environment building into the mix.

MU Unity Screen Test

This test illustrates a more refined environment in which the video texture is displayed on the screen of a television. The television is an open source 3D model that I added to the environment in order to test how 3D models combined with a video texture affect framerate. Without the use of Hap, the system appears to struggle with more than one video texture at a time. This is an aspect that I plan on looking into further.

Unity First Person Controller Test

An initial test to learn how to import a basic Blender 3D model and to get to grips with the first person controller.

MU Interoperability Test

The first test indicating that object movement in Unity via Max/MSP/Jitter works perfectly, as per the test environment Virginia Tech offers, but at this stage I could not get Unity to receive the video textures at all.

MU Unity/Max Collision Framerate Test

This test was used to see if collision and movement had any impact on the framerate of video textures.

3D Modelling

Given that I might include some rudimentary 3D models, I have decided to expand my knowledge of 3D modelling.

Blender

Blender is an open source 3D modelling software in which you can create models that can be imported into both Max/MSP/Jitter and Unity. I have begun learning the basics of this in case using open source 3D models becomes too limiting.

Custom PC Build

To run efficiently the Oculus Rift requires an Oculus Ready PC: a high performance computer optimised for use with virtual reality headsets. Once Oculus released their recommended specifications, I opted to build my own computer using them as a guideline. This option was more cost-effective and gives me the ability to upgrade the system in the future. Meeting these requirements is vital to creating high-end virtual reality experiences that don't suffer from performance issues.

Oculus Rift

I recently received my Oculus Rift and have begun testing with it.

Video Compression

Hap

Hap is a GPU-accelerated codec that allows for increased playback performance when working with multiple video files. With CPU load reduced there are fewer issues with framerate drops, which results in a more fluid virtual reality experience. Initial tests indicate that this works, but more extensive testing is required.
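
On the Jitter side, Hap files are decoded on the GPU by the jit.gl.hap object rather than jit.movie. A minimal scripted sketch, assuming the jit.gl.hap package is installed, a render context named 'vidctx' already exists, and a placeholder filename; the attribute names mirror jit.movie's and should be checked against the object's reference:

    // Create a GPU-decoded Hap player bound to an existing render context.
    var hap = new JitterObject('jit.gl.hap', 'vidctx');
    hap.read('clip01_hap.mov'); // placeholder: a Hap-encoded movie file
    hap.loop = 1;               // loop playback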

openFrameworks

Given that I am less familiar with C++, and that I have found a favourable solution in Max/MSP/Jitter (where I can easily implement Javascript), I have decided not to use this software.

Processing

When testing video in Processing I found that it struggled to handle multiple video files while also receiving sensor input. The slowdown I experienced made me doubt the stability of using it as my primary approach.