TD492/CS4xx Independent Project
2003 Advisor: Clare Congdon
2002 Advisor: Jim Thurston
Colby College, Waterville, ME
Chaosdevice: Theater/Dance
MIDI Lighting/Sound Control System
News
May 19, 2003
I gave the last presentation for this project on Monday, May 19, 2003 at 1pm in the light booth of Strider. Hopefully someone else will pick up this project next year, so we'll see more updates in the months to come!
May 16, 2003
An informal presentation of this system was given on Thursday, May 15, 2003. I presented the system as it was set up for CS Fest, using a MIDI keyboard and three sensors. I also showed how it worked with lights by using a scale model of the theater and several small lighting units.
Dec 20, 2002
I gave a second presentation to my TD335 class in late 2002. This presentation included more details on the client/server model, details on a cue scripting language I am developing, and further insights into the system's use at Colby, with two specific examples using Colby performances. The web version of this presentation is available.
May 19, 2002
I presented the system to Colby's Theater department, members of the Computer Science department, and several other interested parties, on Friday, May 18, 2002 at 1pm in Strider Theater. A web version of the Powerpoint presentation is available.
Project Statement
Lighting and sound control
is often limited to a stage manager calling cues, and a board operator
following the commands blindly. Finer control is desired during dance
pieces and dramatic scenes in plays. I plan to design a system allowing
a computer to track movement onstage and automatically execute lighting
and sound cues based on this movement.
From past experiences I've learned how difficult it can be to execute
a cue at exactly the right moment, especially when that cue requires several
actions on the part of the board operator. This system would allow the
actor to have control over the cues. For example, an actor could stroll
about the stage, and lighting units would track him. Movement would cue
a sound effect of wind blowing. Faster movement would result in louder
or more complicated sound effects, and a new color onstage. A large part
of this project will be researching sensing devices. Infrared sensors
would allow an actor to "trip" a cue: waving a hand could start
a series of cues, or entering a doorway could cause the room of a house
to illuminate. Motion sensors would recognize faster movement and change
the stage "look" accordingly.
The purpose of this project is not to replace the lighting or sound
operators. Instead, the system is designed for a specific scene in a performance
or dance piece, when control should be given to the actions of the performers.
The system will allow these moments to occur at the discretion of the
actor/dancer, not of the stage manager.
As this will be a complicated project, I intend to work toward
a series of goals, which will result in useful products over the semester.
For example, an early product will allow the lighting operator (or designer)
to have direct control over the lighting "look" onstage with
the pass of a mouse. Much of this semester will be spent planning and
determining what hardware must be purchased or built. The final product
may not emerge by May, but the research and development will be useful
to designers and stage managers alike.
Spring 2003 Goals & Accomplishments
Current Status
Going into Spring 2003, this project consisted of a Java application that could track mouse movement and execute lighting cues based on mouse positioning information. There was no communication between sensor devices and the system. I had a loose connection between a MIDI keyboard and the rest of the system: I could execute a Program Change on the keyboard and forward it on to the lighting console, changing the current cue. I had not done much research on how the ETC console interpreted MIDI commands, mostly because I was unaware of the differences between ETC's MIDI interpretation and the existing NoteOn/NoteOff formats used by my keyboard and other devices. The Java system was not stable: it used buggy hacks to work around Java's MIDI problems. (Most notably, there was no Java-native MIDI OUT -- the implementation was done in C, and then compiled to work with Java). The system would crash regularly and required frequent reboots.
Goals
My goals for the last semester are divided into two groups: research and experimentation. On the research end, find a performance group that uses sensor devices and learn more about how the system is used in productions. If possible, see such a production to get a feel for the integration of technology into the theater. In terms of technical research, I aim to learn how the ETC lighting console interprets MIDI so I may properly send cue commands to the console, without worrying about overloading and crashing it.
On the experimentation side, I aim to fully implement the system, eliminating enough bugs to make it stable enough to run some light cues. Next come the sensors: find a way to communicate with the CubeX through Java, routing sensor information to new classes I will create to handle large amounts of data. If possible, demo the system with a small play of my own creation, showing its features to the theater department. Use the system with Colby Dance Theater or another performance. Find a performance that would embrace a budding technology without worrying about minor problems. Finally, find someone who could take over next semester.
Accomplishments
As of late May, I can look back on my goals and feel satisfaction that most of them were accomplished. For research, I flew to Los Angeles and met with Jeff Burke at UCLA's HyperMedia labs, home of the Iliad production I mentioned in my April 2002 journal entry. (The Wired article is here). This was the cream of the crop of sensor devices, and I did not expect to follow in their (rather large) footsteps. Instead, I hoped to catch a sense of what technologies were currently being used, and what I might be able to implement back at Colby.
At UCLA, Max is used extensively when designing sound for theater, and quite a bit for lighting as well. While more experimental methods have been developed (see the Iliad link above), Max is embraced as a powerful system for manipulating MIDI. There are several courses being taught on using Max with sound design. After meeting with David Beaudry (check out his bio -- that's Max on the projection screen behind him!) and Jeff Burke (who designed the sensor systems for the Iliad project), I realized that using Java to develop the sensing systems would be reinventing the wheel. So I turned to Max.
With Max, integrating the sensors was made easier by Infusion Systems' iCube and oCube objects, which perfectly integrate the CubeX into the Max language. This made my job easier, though by no means complete. I was able to make the system stable, dealing with only the occasional system crash as I fixed problems related to the USB<->MIDI link. I was still left with the issues of learning the Max language, implementing my system, debugging it, and getting it working in a production at Colby -- no small feat, based on my attempts to do the same with Java!
I was never able to demonstrate the system using a small scene from a play. That is partly my fault, for not developing such a scene, but largely due to issues with cabling: we don't have the necessary wiring to run MIDI to the stage, and the CubeX must be onstage for the sensors to reach. However, I did provide examples of how I would augment existing scripts in earlier presentations, and I stand by those. My informal demonstrations described how fictitious events would occur onstage, though they were limited to breaking a laser beam and describing how an actor would do so in a scene.
My ETC research went well, though it hit a plateau when only half of the commands worked. I could manipulate submasters on the board, controlling lights directly, but I could not dim them, nor could I reliably change cues. In terms of demonstrating the system, however, being able to turn on a light was often enough to show how a cue might be used.
MIDI research took my project in another direction when it came time to present it during CS Fest in early May. I hooked up my MIDI keyboard, unused since my Java experimentations, and used it to demonstrate Max's live MIDI talents. Instead of turning on a light, I would have the system play a song, and the other sensors manipulated this song while it was playing. More information on this system is here.
The system is nearly ready for use in a small scene. Given more research into communication with the lighting console, another student could use it in a play with the Max patches I've developed. Between my documentation and the Max manual, learning the basics of the iCube object and basic Max programming should not be difficult.
Current Equipment List
- Lighting Console:
- ETC Insight 2x Lighting Control board (running Insight 3.03 software) with 256 dimmers, 1024 channels.
- Development Hardware
- Compaq Deskpro 6400EN PII-400 Workstation
- 128 MB RAM, 4GB HD
- Windows 2000 Workstation
- Apple PowerMac G4 Tower Workstation
- 384 MB RAM, 30GB HD
- MacOS 9.2
- Development Software
- Java 1.4 JDK [Win]
- Sun's Forte for Java 3.0, Community Edition [Win]
- Cycling74 Max/MSP 4.0 [Mac]
- MIDI Interface:
- MOTU FastLane 2x2 USB MIDI Interface with MIDI Thru
- Yamaha PSR-510 PortaTone MIDI Keyboard
- Sensing Equipment
- Infusion Systems I-CubeX Digitizer (Firmware 4.0)
- 1 TapTile sensor
- 1 SeeLaser sensor
- 1 SeeGreen sensor
- 1 Turn actuator
Sensing Technology
These are sensing products I've come across in my research.
- I-CubeX: Translates information from a variety of plug-in sensors into MIDI. Sensors range from thermal and light detection to pressure switches and electromagnetic fields. UVA has proven this device to be useful for this application. Its advantage lies in its modularity; dozens of sensors are available for it. However, it is also expensive: the basic system, with a few sensors, retails for $625.00. Sensors average $50 each, but range from $10 to over $300.
- Buchla Lightning II sensor: "a specialized MIDI controller that senses the position and movement of handheld wands and transforms this information to MIDI signals for expressive control of electronic musical instrumentation. In addition to functioning as a powerful MIDI controller, LIGHTNING II, with its self contained 32 voice synthesizer, comprises a complete, ready to play instrument."
- MIDI Dancer: "a wireless movement sensing system worn on the body of a dancer or other performer. It uses sensors to measure the flexion of up to eight joints on the dancer's body and then transmits the position of each of those joints to a computer off stage." The site also provides a list of equipment they use for many of their productions.
- softVNS is "a device that allows you to extract motion from live video images in real-time." Basically, it does frame subtraction on a live video feed, allowing immediate feedback from movement on stage. The older version used SCSI to talk to Macs, but the newer versions appear to use more standard connections.
- Interactive Dance Club uses a variety of sensors with the I-CubeX system to provide instant feedback for dancers. Very cool mixing of video and sound.
- Soundbeam, "...a distance-to-voltage-to-MIDI device which converts physical movements into sound by using information derived from interruptions of a stream of ultrasonic pulses." This site offers a product which does exactly what I need to do, except at a distance of 6 meters. For small-scale effects, it might be the right tool for the job. An array of them covering a wider area could be linked together.
- Drake Music Project: "We use music technology to create opportunities for people with physical disabilities to explore, compose and perform music." This would be the perfect toolkit if it weren't (a) prohibitively expensive; (b) designed for short-range applications. Still, it might be useful to contact the company for ideas.
- Glolab: Parts for pyroelectric infrared motion detectors. Sensors could be built using equipment like this. Problem is, most of these companies specialize in small sensing equipment designed for close-range accuracy.
- Many of these came from DT&Z Sensory Devices, a very useful site for researchers.
Java MIDI Control
- The article "Understanding and Using Java MIDI Audio" gives some sample code for having Java speak MIDI to devices. However, it is more bent on controlling a synthesizer than sending the raw MIDI control codes, as I'll be doing. Should I add in a sound generator down the line, however, this might be more useful.
- Earthweb provides "The Java Sound MIDI API", an article describing the ShortMessage object in Java. This object is what I will be using to send MIDI control codes to the various devices in the project (a short sketch follows this list).
- The site Linux MIDI & Sound Applications contains an enormous assortment of links to Linux MIDI applications. As I'm a Linux fan, and I find Linux to be much more stable than Windows as a production machine, this may be useful during further research and programming, should my programming platform shift to Linux.
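To make the ShortMessage idea above concrete, here is a minimal sketch (not my actual applet code) of sending a Note On and a Program Change to the default MIDI OUT device. The channel, key, and program numbers are placeholders rather than the values wired up in Strider:

    import javax.sound.midi.*;

    // Minimal sketch: send raw MIDI control codes with ShortMessage.
    public class MidiCueTest {
        public static void main(String[] args) throws Exception {
            Receiver out = MidiSystem.getReceiver();     // default MIDI OUT device

            ShortMessage noteOn = new ShortMessage();
            // Note On: channel 0 (MIDI channel 1), key 78, velocity 127
            noteOn.setMessage(ShortMessage.NOTE_ON, 0, 78, 127);
            out.send(noteOn, -1);                        // -1 = send immediately

            ShortMessage pc = new ShortMessage();
            // Program Change: channel 0, program 5 (second data byte is unused)
            pc.setMessage(ShortMessage.PROGRAM_CHANGE, 0, 5, 0);
            out.send(pc, -1);

            out.close();
        }
    }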
jMax Package
- jMax is a free software package from ircam (Institut de Recherche et Coordination Acoustique/Musique). jMax is written in Java, and is available for a few different platforms, including Win2k and OS X.
- General documentation is available from Stanford.
- Quick Reference guide is available from Artengine.
- Patches and other information are available from Mamalada.
- jMax official web page
- Latest Windows binaries
Progress Journal
Colby's Insight board seems to respond only selectively to MIDI commands. Here is my summary of commands given, and results achieved:
- NOTE ON (note, velocity): Turns on a submaster, which in turn controls whatever lights are assigned to that submaster. A NOTE ON command has a velocity element, too. Effect: turned on a submaster, as expected. The velocity element did not adjust the starting intensity of the submaster, as hoped.
- NOTE OFF (note, velocity): Turns off a submaster. Like NOTE ON, it also has a velocity element. Effect: turned off a submaster, as expected. The velocity element did nothing.
- PROGRAM CHANGE (x): Runs a cue using the C/D fader. If x is 0, advance to the next cue. Otherwise, load cue x and GO. Effect: the old Java program I wrote last year was able to run program changes without a hitch. With Max, however, I can send an initial PROGRAM CHANGE 0 (to advance cues), but then the board stops responding to further PCs.
- CONTROL CHANGE (x): Runs a cue using the A/B fader. If x is 0, advance to the next cue. Otherwise, load cue x and GO. Effect: same problem as PROGRAM CHANGE above.
- PITCH BEND (x, y): Adjusts light intensity based on x and y values; x sets a range and y fine-tunes that range. Effect: never saw any effect from this command. I was never able to adjust the intensity of a cue or a submaster.
From these results, I can conclude that I am either making improper assumptions about sending cues or the ETC does not respond properly to PROGRAM CHANGE commands. Needless to say, this is a frustrating problem, because I am unable to reliably change cues. I can still control submasters, but I do not appear to be able to dim them; only to turn them on and off.
Future experiments (and a conversation with ETC) may alleviate this problem. I've heard that getting tech support from ETC on older boards is rare, but it may be worth a shot.
May 6 was CS Fest 2003, the Computer Science Department's annual event for demonstrating student projects. I demonstrated the system by attaching my MIDI keyboard to the system. Using a system of mirrors, I projected a laser beam around the room, and then back to the SeeLaser sensor. When the beam was broken, the keyboard would play a quick rendition of "Clocks" by Coldplay. To make the demonstration interactive, attendees were invited to step on the TapTile sensor and to twist the Turn sensor. Stepping on the TapTile raised the pitch of the song, and turning the Turn sensor changed the current instrument being played.
The demonstration went well. I received many positive comments, mainly due to the fun of lasers. It was difficult to explain how the system was really to be used, since there was no light console in Mudd with which to fully demonstrate the system. I used several slides from my second PowerPoint presentation to show the sequence of events required for a cue to fire using my system.
This was my first time using the MIDI keyboard with the system since the Java version of my project. As such, I learned that Max allows me to quickly compose and record a MIDI song, which meant that I could play "Clocks" and almost immediately start looping it. This presents an interesting application: why not have dancers create a song onstage by tripping various sensors, and then start playing it back when they stopped dancing? The dancers could manipulate their new song in much the same way that I manipulated "Clocks" at CS Fest.
It turns out that sending messages to the ETC board was easier than I thought. I tried to use MIDI Show Control (MSC), but that never seemed to work properly. I was using a Sysex method in MAX to send a System Exclusive MIDI command down the wire to the ETC, and it didn't respond in the slightest.
Then I took a closer look at the ETC manual and realized that the MIDI commands were a lot simpler. Here is how ETC interprets the MIDI commands:
Command formats:
- Note off: <8n> <kk> <vv> -- turn off a submaster
- Note on: <9n> <kk> <vv> -- turn on a submaster
- Control change: <Bn> <kk> <vv>
- Program change: <Cn> <kk>
- Pitch bend: <En> <ll> <mm>
Key:
- n: MIDI channel number, 0-F (0-15)
- kk: key number, 0-7F (0-127)
- vv: velocity, 0-7F (0-127)
- ll: least significant 7 bits of the pitch bend value, 0-7F (0-127)
- mm: most significant 7 bits of the pitch bend value, 0-7F (0-127)
- 8, 9, B, C, and E are hex values, not variables. <8n> means 0x81 for a Note off on MIDI channel 1.
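Putting the table and key together, here is a small sketch of how I read the byte layout. The 0-based channel nibble and the submaster numbering are my own assumptions, not something checked against the Insight manual:

    public class EtcMessageBytes {
        public static void main(String[] args) {
            int n  = 0;                        // MIDI channel nibble (here, channel 1)
            int kk = 78;                       // key / submaster number
            int vv = 127;                      // velocity

            int noteOn  = 0x90 | n;            // <9n> -> 144 decimal: submaster on
            int noteOff = 0x80 | n;            // <8n> -> 128 decimal: submaster off

            // Pitch bend: a 14-bit value split into ll (low 7 bits) and mm (high 7 bits)
            int bend = 8192;                   // mid-point of the 0-16383 range
            int ll = bend & 0x7F;
            int mm = (bend >> 7) & 0x7F;

            System.out.println("Note on:    " + noteOn  + " " + kk + " " + vv);
            System.out.println("Note off:   " + noteOff + " " + kk + " " + vv);
            System.out.println("Pitch bend: " + (0xE0 | n) + " " + ll + " " + mm);
        }
    }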
April 7, 2003
Success! I have linked the I-CubeX to Max, and Max to the ETC lighting controller. Using a laser beam and a laser sensor, I can break a beam and a MIDI message will turn on a light.
Working ICube Patch. Turns a submaster on and off when laser beam is broken. How does this patch work? Let's look at each part. The top commands initialize the CubeX for one sensor. I have my CubeX object set up with 3 virtual sensors, but the other two are unused.
0 reset resets the Cube-X to all default values.
1 connect 1 connects virtual input to sensor 1. This mapping allows me to map a sensor plugged into one port to show up in another virtual port.
1 res 0 0 interval 50 sets the sampling interval to 50 milliseconds. Every 50ms, the system will check the sensor values.
1 on turns on virtual sensor 1. The iCube object receives input from those commands, which lets me modify its settings on the fly. If I want to change the refresh rate, that's simple to do while the system is running.
Notice the midiin and midiout objects connected above and below the iCube. midiin is connected to input port 1 of the FastLane, which is physically connected to the output of the CubeX. midiout is connected to output port 1 of the FastLane, which is physically connected to the input of the CubeX.
The left three boxes below the iCube show sensor values from the device. 9.92 is coming from the laser sensor. The other two are 0 because there are no other sensors connected. The circle will light up if the iCube has a configuration problem.
Trace the line from 9.92 down to the <=. This operator checks the value from the sensor and compares it to 1. If 9.92 is less than or equal to 1, then the <= operator will send a 1 to its output. Otherwise, a 0 is sent. In this case, a 0 is sent.
The select 1 function looks at the result of the <= function. If a 1 is received, the left output is followed. Otherwise, the right will be followed. In this case, 9.92 is not <= 1, so a 0 was sent: therefore, the right output will be followed. I placed a circle in the path to show which path was taken. Notice the right side is illuminated.
Let's ignore the onebang 1 for a moment. Follow the connection straight down to the Turning light OFF, and also to the right, where we see another circle and Sending OFF command to lights. These will both be executed at the same time, so we will see "Turning light OFF." printed in the MAX debugging window, and the 128 78 0 will also be executed. These numbers are sent to midiout, which is connected to output port 2 on the FastLane. It is physically connected to the ETC lighting console. These numbers correspond to 0x80 78 0, a Note off, which tells the console to turn submaster 78 off. This turns off the light.
Had we followed the left path, we would have sent a 144 78 127 down the wire to the ETC board, which would have turned on submaster 78. These digits correspond to 0x90 78 127, a Note on: the command to turn on a submaster.
The problem with having a refresh rate of 50 is that the <= operator will refresh every 50ms as it checks the value coming from the CubeX sensor. This means that a MIDI command will be sent every 50ms, since we are either turning a light on or off. I don't want to send too many MIDI commands to the ETC console, because sending too many messages can (and will) lock up the console, requiring a power cycle to fix it. This is why I added the onebang 1 commands.
The onebang command allows a command to pass through it only once. It will then ignore subsequent input until it receives a bang on the upper right input. It is, in effect, a simple gate to protect our devices from being flooded with redundant MIDI messages. The 1 after onebang means that it will allow the first input through without an initial bang. Input is sent to the output on the bottom left.
Let's look at those onebang commands now. On the right-hand onebang, notice that its output goes in three directions: to the print statement, to the numerical MIDI value, and to a circle. That circle then connects to the input of the left-hand onebang. The opposite is true for the left-hand side: input is sent to a print statement, the MIDI value, and to the right-side onebang.
Now trace through the system, starting at an initial value of 127, where the beam is solid. Max will hit the right-hand onebang every 50 milliseconds, because 127 is not <= 1. The first value passes through that onebang, prints the message, sends the MIDI code, and bangs the left-hand onebang to arm it. The right-hand onebang then prevents further output. The light is now off.
Break the beam. The sensor value goes to 0, sending control down the left-hand side. Our onebang is free to let input through, since it has received a bang and has never been used before. A message is printed, the MIDI code is sent, and the right-hand onebang receives a bang. The light goes on, but no more MIDI messages are sent because the gate is closed.
Once the beam is solid again, the right-hand side will execute, then close its gate and open the gate on the left-hand side. It's a simple system, but prevents a very real problem of sending too much information to the lighting console.
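For anyone more comfortable reading code than patches, here is a rough plain-Java sketch of the same idea: a threshold test plus a one-shot gate, so a MIDI message only goes out when the beam changes state, never on every 50ms sample. The sensor value, channel, and submaster number are stand-ins, and this is not a transcription of the actual patch:

    import javax.sound.midi.*;

    // Edge-detecting gate, a rough Java equivalent of the two onebang objects.
    public class BeamGate {
        private boolean beamBroken = false;              // last known state of the beam

        // Called with each new sensor sample (e.g. every 50ms).
        public void poll(double sensorValue, Receiver lights) throws InvalidMidiDataException {
            boolean broken = (sensorValue <= 1.0);       // same test as the <= 1 object
            if (broken == beamBroken) {
                return;                                  // no change: send nothing (gate is closed)
            }
            beamBroken = broken;                         // state changed: "re-arm" the other side

            ShortMessage msg = new ShortMessage();
            if (broken) {
                msg.setMessage(ShortMessage.NOTE_ON, 0, 78, 127);   // turn the submaster on
            } else {
                msg.setMessage(ShortMessage.NOTE_OFF, 0, 78, 0);    // turn the submaster off
            }
            lights.send(msg, -1);
        }
    }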
March 15, 2003
LED Actuator
I'm familiarizing myself with the sensors and the Java-based I-CubeX editor. It allows you to easily adjust the I-CubeX to the types of sensors plugged into it. Unfortunately, the Java editor is unstable, and did not always recognize the presence of the CubeX.
When it does work, the editor is very useful. I have no screenshots, but I'll describe the process in general:
For each sensor attached to the CubeX, create a sensor object, then tell the program what type of sensor you are using. It seems to know most of the sensors you can buy, and it allows you to add new ones with custom information. This information includes sensitivity data that allows the sensor to report with greater accuracy. I can also customize the refresh rate for the CubeX, which would be useful if I were reporting to a slower system that could not handle 50ms refresh rates of sensor data.
The system knew my TapTile sensor, and I set up input 1 accordingly. I could choose what type of message would be sent by the CubeX when sensor data was received. This is a cool concept: In lieu of using software with the CubeX, I could hook it up directly to a MIDI keyboard and issue Note on commands when pressure is applied to the tile.
Continue this process for each sensor, then save the information to the CubeX. When in standalone mode, it will retain this information and use it the next time it is powered on. In host mode, no information is stored, and the CubeX uses default settings for dealing with sensors.
[Update] Fortunately, I've found that this system is unnecessary for the mode I will be using with Max. With Max, I place the CubeX in "host" mode, which means that it retains no information about sensors once it is powered down. All instructions are given to it through Max. By using the loadbang command, I can cause certain operations to occur on Max startup, so I can force my Max patch to send the proper initialization commands before using the system.
January 15, 2003
The sensors are in! The theater department now has a laser beam ("Laser") with sensor ("SeeLaser"), an LED light ("Flash"), a knob ("Turn"), and a pressure sensor ("TapTile").
Laser Actuator
However, the ETC board does not support the range of MIDI that I expected. When they say they send messages via MIDI, they're referring more to the 5-pin DIN hardware than to the MIDI specification. The ETC lighting manual provides scant details on how MIDI is sent and received from the board, but I'm puzzling through it.
TapTile Sensor
Using a Yamaha PSR-510 MIDI keyboard, I rigged up a system to simulate sensor events passing down the wire:
- Keyboard sends PROGRAM CHANGE (x) command
- Java program receives the PC and notes the new program number, x.
- Java program sends a PC to the lighting console, based on x.
- ETC Lighting console receives PC and changes to cue x.
My example happened to use a PROGRAM CHANGE (x) on both ends of the system. It could have sent any type of MIDI command down the wire.
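In rough Java terms, the relay looked something like the sketch below: listen for PROGRAM CHANGE messages coming from the keyboard and re-send them to the console. Device selection is reduced to the system defaults here; a real run would have to pick the correct FastLane ports:

    import javax.sound.midi.*;

    // Sketch of the keyboard -> Java -> lighting console relay described above.
    public class ProgramChangeRelay {
        public static void main(String[] args) throws Exception {
            final Receiver console = MidiSystem.getReceiver();    // MIDI OUT toward the console
            Transmitter keyboard = MidiSystem.getTransmitter();   // MIDI IN from the keyboard

            keyboard.setReceiver(new Receiver() {
                public void send(MidiMessage message, long timeStamp) {
                    if (message instanceof ShortMessage) {
                        ShortMessage sm = (ShortMessage) message;
                        if (sm.getCommand() == ShortMessage.PROGRAM_CHANGE) {
                            int x = sm.getData1();                 // the new program number
                            System.out.println("PROGRAM CHANGE " + x + " -> console");
                            console.send(sm, -1);                  // forward it to the console
                        }
                    }
                }
                public void close() { }
            });

            Thread.sleep(60000);    // keep listening for a minute, then exit
        }
    }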
Saturday, Dec 14, 2002
- A
Flock of Words, a realtime video/animation piece controlled by dancers
wearing sensors. This composition premiered at New York University in
the spring of 1995. From its website:
- A Flock of Words uses custom computer software to analyze the music being performed by an ensemble of human players and guide the simultaneous presentation of animation, video, lighting effects, and computer music. The most important technical innovation of this piece is the real-time animation of graphic objects in response to the analysis of musical input, coupled with the simultaneous presentation of video, lighting, and algorithmically-generated computer music. The performance of a six-member ensemble is tracked in real time through a MIDI keyboard and a percussion sensor. During the performance, a computer program analyzes this information looking for musical attributes such as register, density, and articulation. The analysis machine then sends commands to two other computers, displaying video clips and generating animations of words from Elias Canetti's text Crowds and Power.
- Computer
Intelligence in the Theater discusses theater technology in use
at the Institute for Studies in the Arts (ISA) at Arizona State University.
From their website:
- The Intelligent Stage is a mediated performance space that responds to the actions of performers as they move. The system's primary sensing occurs through a program called EYES that analyzes video activity to understand what is happening on stage [Figure 1]. (Speech recognition is also being researched as another means of sensing). Sensing by the computer allows performers to control electronic theatrical elements such as sound, lighting, video, and slides through digitized video, photoelectric switches, contact switches, and many types of activities. Media responses occur through several controller computers that manipulate the theatrical electronic media.
- Synth Zone is "an attempt to ease the search for synthesizer and electronic music production resources on the Internet." It provides links to manufacturers' websites, a forum to discuss equipment, and tech support questions for a variety of subjects. Most importantly, however, it provides a large list of MIDI controllers, with links to their websites. Several technologies discussed earlier on this journal are linked on this page. A very useful resource for me while looking for controllers like the I-CubeX.
- Stage
Research, Inc. manufactures SFX ProAudio Show Control software,
an all-in-one solution for light and sound control for the theater (see
screenshots). Its
software allows complex cues to be chained together and run in parallel,
with a highly advanced scripting language. They provide many "success
stories" for their software.
- One
such story is presented by Martin Desjardins, a graduate of
Yale School of Drama, discussing
his work on Yale's production of Iphigenia at Aulis. As
Colby recently ran a production of Iph, this alternate
account provided another look at the production, and how its design
could be enhanced through the use of sophisticated sound control.
He even offers sound clips (in MP3 format) of some music effects!
- He writes: "From [the] opening moments, through the death of Iphigenia and the abandoning of Clytaemnestra on the shores of Aulis, sound and music are a constant presence in the production. In the nearly 90 minutes of theatre, more than 8GB of data streamed through SFX night after night, providing sound and music to all but three brief moments in the performance."
- Another story is presented by Paul Estby, resident designer at the Indiana Repertory Theater. They performed Macbeth in 1999 using the SFX system. Some of the technology used in their system -- like PCanywhere to monitor the systems -- closely resembles ideas I've presented to the Colby faculty while discussing my own project. They used an ETC Obsession console, similar in some respects to the Insight 2x we use at Colby. Unfortunately, their Obsession lacked the MIDI input required to connect their sound and lighting systems. Otherwise, they would have used SFX to send sound and lighting cues over MIDI to ensure proper timing.
- MIDI
Movement Module: "M3" is a system built by Dr. Michel
A. Lynch in 1993. It uses a wearable sensor apparatus to wirelessly
transmit information about a dancer's body to a receiving system. A
tiny computer fitted to the small of the dancer's back has wires running
to hands, feet, and other parts of the body, interpreting stimuli and
converting them to a digital signal. Many detailed diagrams of the system,
and pictures of the device in use, are provided. From the website:
- A data transmission channel is used to format a serial bit stream with suitable formatting to move the data via an FM radio transmitter to a receiving system. The entire sensor, computer and transmitter system is mounted on the dancer. The receiving system consists of an FM receiver, a 68HC11 computer and MIDI interface. Its purpose is to receive the serial data stream from the dancer and reformat it for input into the MAX system running on a Macintosh computer where the various sensor commands are reworked and elaborated for the musical purposes of the dance.
- The
Sensor Chair was designed in Fall 1994 for Media/Medium,
a mini-opera composed for magicians Penn & Teller. The instrument
takes a new approach to sensing body movement, though one I doubt anyone
would be thrilled to use onstage:
- ...the copper plate affixed to the top of the chair cushion is a transmitting antenna being driven at roughly 70 kHz. When a person is seated in the chair, they effectively become an extension of this antenna; their body acts as a conductor which is capacitively coupled into the transmitter plate.
- Pickups placed on poles around the chair interpret the distance between the performer's body and the sensors. This data is transmitted to a Macintosh computer, which interprets it and determines the performer's position.
- Technical drawings of the system are provided by Joseph Paradiso of the MIT Media Lab.
- The
Martial Arts Combat
Environment, created in part by Marc
Shahboz, uses wireless sensors in ways we've seen in other systems.
From the website:
- Wireless custom sensors placed on the body and on the weapons of the participants register data that control what is being seen and heard by the audience. Each performance is unique. The chance elements of combat feed the computer with information. Data is used in many different ways. It can be used immediately, plugged into an algorithm, stored or randomized to create the environment. Couple the computer generated sonic and graphic portions with the sights and sounds of live, martial arts combat, interactive lighting and room ambiance and you get a performance that must be experienced in person to be truly appreciated.
- Spitz, Inc. manufactures theater automation systems and lighting kits for planetaria and domes. They have designed a complex software package called ATM-4 Theater Automation System. This system takes the previous systems I've seen, which utilize scripting languages, and puts a GUI on top of this layer. The result is a multitrack editor that allows multiple simultaneous cues to be executed and edited in parallel. Though I've only seen reviews of the system and tiny screenshots, it looks very impressive to me. They even make a wireless system that allows remote operation of the system. =)
Wednesday, May 1, 2002
- "Borg of the Dance," an article in Wired News, discusses the use of technology in dance performances. Video-conferencing, motion capture, and sensing technologies -- such as MidiDancer -- are discussed.
- The Dance Multimedia Learning Center at Arizona State University has an interactive stage with motion sensors. These sensors are linked to their lighting and sound systems.
Thursday, April 25, 2002
Wired ran an article in 2001 describing UCLA's recent production of Homer's The Iliad, which incorporated digital projections, an online story for build-up, and interactive exhibits. UCLA's HyperMedia Lab designed the technology for the play.
After much haranguing of the Computer Science faculty here and much digging around on the 'net, I've concluded that the Java MIDI classes are still fairly bug-ridden. After adding a MIDI device selector, I've discovered that a fair amount of my test code worked all along; the Java synthesizer simply wasn't functioning properly. As always, the latest incarnation of the Applet is available here.
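The device selector itself amounts to enumerating the installed MIDI devices so the user can pick a real hardware port instead of the misbehaving built-in synthesizer. A bare-bones sketch (not the applet's actual code):

    import javax.sound.midi.*;

    // List every installed MIDI device, marking which can receive (OUT) or transmit (IN).
    public class ListMidiDevices {
        public static void main(String[] args) throws Exception {
            MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
            for (int i = 0; i < infos.length; i++) {
                MidiDevice device = MidiSystem.getMidiDevice(infos[i]);
                String direction = "";
                if (device.getMaxReceivers() != 0)    direction += " [OUT]";
                if (device.getMaxTransmitters() != 0) direction += " [IN]";
                System.out.println(i + ": " + infos[i].getName()
                        + " -- " + infos[i].getDescription() + direction);
            }
        }
    }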
Wire is an addition to Java 1.3/Windows. It provides a more direct interface for sending/receiving MIDI signals to/from external devices. Their package includes a "MIDI Thru" example, which will show incoming MIDI packets from external devices. Though I haven't done much with it yet, this package may prove useful later, when I'll be accepting incoming MIDI messages. Documentation for the new classes can be found here.
Java Music Projects lists 20-odd MIDI projects designed in Java. Packages include MIDI editors, sound processors, and low-level drivers to add functionality to the MIDI classes.
Thursday, April 18, 2002
Interactive museum exhibits, interaction between actor and video, and sensor-based interactive art and theater. All are discussed in an article from the IBM Systems Journal [local mirror: article] from 2000. Its authors hail from the MIT Media Laboratory, where they design "smart" technology and interactive multimedia.
Wednesday, April 10, 2002
- An
article in the
October, 2001 edition of Entertainment
Design magazine discusses the use of motion sensing technology to
track actors onstage.
- A Java-based system is described as making the tracking information available as a "subscription," allowing other devices to receive the information and process it to their own ends.
- A
Mac running the Cycling74 Max
software package (available only for MacOS) listens for this information
and sends the appropriate MIDI commands to the lighting and sound
controllers.
- This may be a better way to go, given the learning curve of sending System-Exclusive MIDI codes using Java.
- jMax is a free software package from ircam (Institut de Recherche et Coordination Acoustique/Musique), the makers of the Max package. jMax is written in Java, and is available for all platforms.
- See the new
jMax section of these pages for more information.
- I'm going to take a breather from Java and dive into jMax, in hopes that it will prove to be more useful for my purposes. Its stability is not guaranteed, but if things work out there's always Max itself.
Thursday, March 14, 2002
- The MIDI interface has arrived,
and I have written a Java applet to send MIDI commands to the lighting
console.
- Currently, the applet sends only PROGRAM CHANGE commands, allowing cues to be executed. However, I am researching System Exclusive (Sysex) message packets, which will allow me to use MIDI Show Control (MSC). It appears to offer finer control over cue selection and execution. I will post updates to the applet as it is a work in progress.
- The latest version of the applet can always be found here. It requires the Java plug-in to run, and may not function on Macs yet. (Macs don't have Java 1.4, which is the version I'm using).
Thursday,
February 28, 2002
- Yale's School of Drama produces Tech Brief, a publication aimed at technical managers in theater. Thanks to Jim Thurston for this link.
Wednesday, February 27, 2002
Research
- UVA has an Automated Lighting Studio site which provides details on their projects and an equipment list for their lighting system. Their Lighting Integrated Technology (LIT) project's objectives seem to parallel my own. They use the I-CubeX system, a sensing device from Infusion Systems.
- In my quest for MIDI hardware, I found the MIDI Adapter Zone, which contains a host of links to MIDI hardware and software companies.
- Many additions to the Sensing and Java MIDI sections.
Java + I/O (Input/Output)
- Java I/O is a 600-page book detailing the use of input/output streams in Java. It may be useful in the future as I look more closely at integrating sensors with the project. This page is a review of the book by the author; it is listed on Amazon here.
- Sun provides a web page for the Java Communications API, which "can be used to write platform-independent communications applications for technologies such as voice mail, fax, and smartcards." More importantly, it allows Java to communicate via serial port, which I may use if inputting signals via MIDI doesn't work as planned.
- JavaWorld has a dated article (1998) on the Java Comm API, explaining how to communicate with an RS-232 serial port using Java. This is how I would have my program speak to sensors that wouldn't happily speak MIDI instead.
- J3D.org has an article on using sensors with Java3D, Java's 3-dimensional API. This may be of use later, when I'm taking the input signals from sensors and translating them into the 3-D space of the stage.
Experimentation
- Through use of the above articles, I've experimented with the ShortMessage object, and I believe I'm ready to start sending MIDI codes to the lighting console. However, without having a MIDI interface attached to my computer, I cannot test the code locally.
Thursday,
February 21, 2002
Research
- Control Systems for Live Entertainment (John Huntington, 2000)
- Artistic
License: This site lists a MIDI<->DMX converter for
nearly $2,000 (!!). Converting MIDI to DMX to span technologies may
not be such a cost-effective goal. Complexities of timing and signal
conversion make this a bad choice.
- Laser Harp project, part of a CS class at Stanford. Many links to MIDI/wireless technologies. Lists a variety of useful links to everything from MIDI codes to Bluetooth wireless applications.
- Many additions to the Sensing and Java MIDI sections.
Experimentation
- Research into the Java Sound API, which I intend to use for this project. The API simplifies sending and receiving MIDI control signals in Java.
- Reference guide for Java Sound.
- I have experimented with writing code to track mouse motion; this will be used later in the project for tracking mouse movement over a stage plot. The experimental applet can be run here. A tutorial "How to Write a Mouse-Motion Listener" is here.
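The mouse-tracking experiment boils down to the pattern below: a MouseMotionListener on a panel standing in for the stage plot. This is a stripped-down sketch, not the experimental applet itself; the real code would map the coordinates to a lighting "look" instead of printing them:

    import java.awt.*;
    import java.awt.event.*;
    import javax.swing.*;

    // Report the cursor position over a panel standing in for the stage plot.
    public class StagePlotTracker {
        public static void main(String[] args) {
            JFrame frame = new JFrame("Stage plot");
            JPanel stage = new JPanel();
            stage.setPreferredSize(new Dimension(400, 300));

            stage.addMouseMotionListener(new MouseMotionAdapter() {
                public void mouseMoved(MouseEvent e) {
                    System.out.println("Mouse at " + e.getX() + ", " + e.getY());
                }
            });

            frame.getContentPane().add(stage);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        }
    }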
©2003 Samuel Shaw <sbshaw at alum dot colby dot edu>. Last Updated Thu, 11 September, 2008 0:47