Sense Stage

Marije Baalman, instructor.
Saturday, September 18, 9.30-13.30h (part I) +
Sunday, September 19, 10-14h (part II)
Location: .HBC
60 €
max. 16 participants
level: intermediate

The SenseStage research project has produced a low-cost, open-source wireless infrastructure for live performance and interactive, real-time environments.

The infrastructure consists of two parts. The first is a set of small wireless boards that can be used for both sensing and actuation, and that use the ZigBee protocol to form a mesh network. The second is a software protocol for communication between the programming environments commonly used to create interactive music, video and other media.

The protocol is based on OpenSoundControl and provides mechanisms for robustness and ease of use, so that exchanging data between collaborators and their software becomes a matter of sharing and using the data, rather than figuring out how to communicate it. We provide a host for this data-sharing network, as well as several clients that implement the protocol to communicate with the host, so that setting up the system is fast and fits easily into the user's workflow. The data-sharing network can also receive the data coming from the wireless sensor network, so that the components together form an integrated system.
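As a rough illustration of what such OSC-based data sharing involves, the sketch below uses only core SuperCollider classes (NetAddr, OSCdef). The port number and message addresses are invented placeholders for this example and do not reproduce the actual SenseWorld protocol:

```supercollider
// Hypothetical sketch of OSC-based data sharing using core SuperCollider
// classes; the port and address patterns are placeholders, not the actual
// SenseWorld DataNetwork protocol.
(
var host = NetAddr("127.0.0.1", 57120);  // assumed address of the data-sharing host

// A client announces itself and labels the data it will supply.
host.sendMsg('/announce', "myClient", "accelerometer", 3);

// Receive data that other clients have published to the network.
OSCdef(\dataIn, { |msg|
    var label = msg[1], values = msg[2..];
    ("received % : %".format(label, values)).postln;
}, '/data');

// Publish a frame of sensor values under this client's label.
host.sendMsg('/data', "accelerometer", 0.1, 0.5, -0.2);
)
```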
This data-sharing framework is implemented in SuperCollider, and provides many additional features for manipulating data, mapping it to busses on the audio server, and performing actions with the data (see the Quarks: SenseWorld - SenseWorld DataNetwork - SenseWorld MiniBee).
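To give a sense of the bus-mapping idea, the sketch below routes an incoming sensor value to a control bus and lets a synth read it as a modulation source. It uses only core SuperCollider classes rather than the SenseWorld quarks, and the OSC address and value range are assumptions for the example:

```supercollider
// Illustrative sketch with core SuperCollider classes (not the SenseWorld
// API): incoming OSC sensor data is written to a control bus, where a
// running synth picks it up as a filter-cutoff modulator.
s.waitForBoot({
    var sensorBus = Bus.control(s, 1);  // one control channel for one sensor value

    // Respond to a hypothetical OSC address carrying a single sensor value
    // assumed to lie in the range 0..1.
    OSCdef(\sensorIn, { |msg|
        var value = msg[1].asFloat.clip(0.0, 1.0);
        sensorBus.set(value);  // place the data on the audio server
    }, '/sensor/value');

    // A synth mapping the bus to the cutoff of a filtered noise source.
    SynthDef(\sensorMapped, { |bus|
        var ctl = In.kr(bus, 1);
        Out.ar(0, RLPF.ar(PinkNoise.ar(0.1), ctl.linexp(0, 1, 200, 4000), 0.2));
    }).add;

    s.sync;
    Synth(\sensorMapped, [\bus, sensorBus]);
});
```

The point of the bus indirection is that any number of synths can read the same sensor stream without each needing its own OSC responder.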


The workshop will begin with an introduction to the whole framework and how to work with it within SuperCollider. By the end of the workshop we should have created one or more collaborative projects. Some wireless sensing nodes with sensors and actuators attached will be available, but you are also free to bring your own controllers or interfaces (MIDI, HID devices, handmade), provided that you already have them working within SuperCollider (i.e. you are able to receive data from or transmit data to them), so that they can be hooked into the data-sharing network.
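As an example of that last requirement, the following sketch reads a MIDI controller with core SuperCollider classes and forwards its values over OSC. The destination port and address are hypothetical placeholders; in the workshop the data would go through the data-sharing client instead:

```supercollider
// Hypothetical sketch: read a MIDI controller in SuperCollider and forward
// its values over OSC. The target port and address are placeholders, not
// the actual SenseWorld DataNetwork interface.
(
var host = NetAddr("127.0.0.1", 57120);  // assumed host address

MIDIClient.init;
MIDIIn.connectAll;

// Forward every value of controller number 1 (mod wheel), scaled to 0..1.
MIDIdef.cc(\toNetwork, { |val, num, chan|
    host.sendMsg('/data', "modwheel", val / 127.0);
}, ccNum: 1);
)
```

If you can get a similar responder printing or sending values for your own device before the workshop, hooking it into the network should be straightforward.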
More information: http://sensestage.hexagram.ca


The workshop should be of interest to anyone who wants to work with real-time sensor data for live performance or interactive installations using SuperCollider, and who is looking for a framework to route, process and map the sensor data to sound and other media. Basic knowledge of SuperCollider and the ability to use it are required.