Please note all research has moved to http://www.mrpauladams.com
The tangible interface places the emphasis on the physical movement of real-world objects that interact with an audiovisual software system. The design of these systems is often oriented towards providing a high level of accessibility and clearly perceivable outcomes from the process of user interaction.
The example here is of particular interest because it demonstrates a clear correlation between the actions of the user, the controller interface, and the generated audio response:
The authors describe this tool as follows:
The reactable is a multi-user electronic music instrument with a tabletop tangible user interface. Several simultaneous performers share complete control over the instrument by moving physical objects on a luminous table surface. By moving and relating these objects, representing components of a classic modular synthesizer, users can create complex and dynamic sonic topologies, with generators, filters and modulators, in a kind of tangible modular synthesizer or graspable flow-controlled programming language.
This instrument is being developed by a team of digital luthiers (Sergi Jordà, Martin Kaltenbrunner, Günter Geiger and Marcos Alonso), at the Music Technology Group within the Universitat Pompeu Fabra in Barcelona, Spain.
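The "graspable flow-controlled programming language" idea described above can be sketched in code. This is a minimal illustration with hypothetical class names, not the Reactable's actual implementation: table objects become nodes in a signal graph, and "connecting" them (as moving objects near each other does on the table) chains a generator through a filter.

```python
import math

class Generator:
    """A sine-wave generator object, like a generator block on the table."""
    def __init__(self, freq):
        self.freq = freq

    def sample(self, t):
        return math.sin(2 * math.pi * self.freq * t)

class Filter:
    """A one-pole low-pass filter object fed by an upstream unit."""
    def __init__(self, source, alpha=0.1):
        self.source = source  # the upstream object this filter is "connected" to
        self.alpha = alpha    # smoothing factor (0..1); lower = darker sound
        self.state = 0.0

    def sample(self, t):
        x = self.source.sample(t)
        self.state += self.alpha * (x - self.state)
        return self.state

# Chaining objects stands in for placing them in proximity on the surface.
osc = Generator(freq=440.0)
lp = Filter(osc, alpha=0.2)
signal = [lp.sample(n / 44100.0) for n in range(64)]
```

Moving an object would, in this sketch, simply rewire `source` or change a parameter such as `freq` — which is what makes the table a live, flow-controlled program rather than a fixed patch.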
NIME (New Interfaces for Musical Expression) is an annual international conference that presents new developments in human-computer interfaces.
The link presented here includes performances from the 2004 conference held at Shizuoka University of Art and Culture in Japan.
The examples included in the video clip place the emphasis on physical, gestural activity from the artist, including the simulation of playing a real-world instrument, playing actual instruments electronically connected to the computer interface, controlling sound parameters using data gloves, and generating sound transformations through the performer's physical movement.
This could well be a useful conference to attend; the next one is being held at New York University in June 2007. The submission deadline is the beginning of January, which may be too soon to demonstrate any workable prototypes.
Interesting from the perspective of navigating the interface's environment:
The video shows a prototype application of RhNav – Rhizome Navigation in action: a graphical navigation interface incorporating real-time website traffic and user behaviour analysis.
SONASPHERE is a kinetically driven interactive music environment. In SONASPHERE, functional units, such as sound samples, effects and mixers, are represented as small spherical ‘Objects’ floating within 3D Space. Connections are made between these objects, using simple mouse interaction, to establish signal stream networks. (Nao Tokui 2003)
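The "signal stream networks" Tokui describes can be pictured as a small graph of floating units. This is a hedged sketch with invented class and function names (`Unit`, `connect`, `upstream`), not Sonasphere's own code: each unit has a 3D position, and mouse-made connections determine which units feed which.

```python
class Unit:
    """A functional unit (sample, effect, mixer) floating in the 3D space."""
    def __init__(self, name, pos):
        self.name = name
        self.pos = pos        # (x, y, z) position of the sphere
        self.inputs = []      # upstream units connected into this one

def connect(src, dst):
    """Make a connection from src to dst, as mouse interaction would."""
    dst.inputs.append(src)

def upstream(unit):
    """List every unit whose signal eventually reaches this one."""
    seen = []
    stack = list(unit.inputs)
    while stack:
        u = stack.pop()
        if u not in seen:
            seen.append(u)
            stack.extend(u.inputs)
    return seen

sample = Unit("sample", (0.0, 1.0, 0.0))
effect = Unit("reverb", (1.0, 0.0, 0.5))
mixer  = Unit("mixer",  (0.5, 0.5, 0.0))
connect(sample, effect)   # sample → reverb
connect(effect, mixer)    # reverb → mixer
print([u.name for u in upstream(mixer)])  # → ['reverb', 'sample']
```

In the actual environment the kinetic simulation moves the spheres around, but the audible structure is exactly this connection graph.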
Not laptop-based, but interesting from the perspective of the relationship between the performers and the interface–sound combination:
8=8 is a group of 4 programmers = 4 composers = 4 VJs = 4 musicians = 4 artists. All four bring their own programming, visual and musical sensibilities to a collective instrument, the Abstract Machine Hypertable. By moving one’s hands over the surface of the Hypertable, images and sounds are generated, creating a unique opportunity for musical improvisation. 8=8 uses the Hypertable to perform singular programs/instruments in a concert=performance=demo context.
All programs are generated by the members of 8=8. There is no distinction between performer = musician = artist = programmer. All members have been exploring computer programming as an artistic medium for several years, notably through collaborative research at the Atelier Hypermedia (http://hypermedia.loeil.org).
Although the Hypertable was originally designed as an interface for algorithmic cinema, 8=8 quickly discovered its musical potential as an instrument. Each program is essentially an entire visual and musical software/instrument in and of itself. Some of the programs allow for direct manipulation of pre-recorded samples, other instruments generate sounds on-the-fly, while still others combine these two notions with the idea of composition taking place in real-time, on the surface of the table, and in front of the public. Some compositions=programs=instruments are visually baroque, others, on the contrary, are minimalist. All work off the principle that image+interaction pilots the sound generation.
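The principle that image+interaction pilots the sound generation can be sketched as a direct mapping from a tracked hand position to sound parameters. The function name and the pitch/amplitude ranges below are illustrative assumptions, not the 8=8 code:

```python
def hand_to_sound(x, y, width=640, height=480):
    """Map a tracked hand position (in pixels) to pitch and amplitude.

    Moving the hand across the surface (x) sweeps the pitch; moving it
    up the image (y) changes loudness. Ranges are assumed for illustration.
    """
    pitch_hz = 110.0 + (x / width) * (880.0 - 110.0)  # 110–880 Hz sweep
    amplitude = 1.0 - (y / height)                    # top of image = loudest
    return pitch_hz, amplitude

pitch, amp = hand_to_sound(320, 120)
print(pitch, amp)  # → 495.0 0.75
```

A real program of this kind would also derive the visuals from the same hand data, so that what the audience sees and what it hears are driven by one and the same gesture.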