Regenerative Music explores new physiological interfaces for musical instruments. The overall goal of this project is to investigate the creation of ``Regenerative Music''. In Regenerative Music, the computer, instead of taking only active cues from the musician, reads physiological signals (heart beat, respiration, brain waves via Brain-Computer Interfaces (BCI), etc.) from the musician/performer.
These signals are then used to alter the behaviour of the instrument itself. For instance, filter settings on the sound can be changed, and the musician responds by adjusting the way they play the instrument. The music will in turn generate an emotional response on the part of the musician/performer, and this emotional response will be detectable by the computer, which then alters the behaviour of the instrument further in response.
In this way, Regenerative Music looks at how the musician can learn to respond to this new physiologically driven instrument, as well as how the instrument can learn to infer the wishes of the musician through their physiological signals, in addition to the normal playing of the instrument. In a sense, the musician and instrument each play off each other, and together both can be viewed as a single ``instrument''. The choice of how to map physiological signals into instrument behaviour would be an artistic one, under the control of the musician.
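As a concrete illustration of such a mapping, the sketch below linearly maps heart rate onto a low-pass filter cutoff. The function name, the BPM range, and the cutoff range are all hypothetical choices made for illustration; the project itself leaves this mapping to the musician.

```python
# Hypothetical sketch: map a physiological signal (heart rate, in BPM)
# onto a low-pass filter cutoff frequency in Hz. All parameter ranges
# here are illustrative, not taken from the project itself.

def heart_rate_to_cutoff(bpm, bpm_min=50.0, bpm_max=150.0,
                         cutoff_min=200.0, cutoff_max=8000.0):
    """Linearly map heart rate to a filter cutoff, clamped to the BPM range."""
    bpm = max(bpm_min, min(bpm_max, bpm))          # clamp out-of-range readings
    t = (bpm - bpm_min) / (bpm_max - bpm_min)      # normalize to [0, 1]
    return cutoff_min + t * (cutoff_max - cutoff_min)

print(heart_rate_to_cutoff(50.0))    # lowest cutoff: 200.0 Hz
print(heart_rate_to_cutoff(150.0))   # highest cutoff: 8000.0 Hz
```

Any monotonic or even deliberately inverted mapping would serve equally well; the point is only that the mapping is explicit and under artistic control.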
In Regenerative Music the computer, instead of taking only active cues from the musician, reads physiological signals from the musician/performer. The music which the regenerative algorithm then creates will be heard by the musician/performer. It is hoped that the music will in turn generate an emotional response on the part of the musician/performer, and that this emotional response will be detectable by the computer, which can then alter the music in some way in response. Continuing in this fashion, it is clear that there is a well-defined feedback loop between the human and the computer. Humanistic Intelligence is defined as intelligence that arises from the human being in the feedback loop of a computational process in which the human and computer are inextricably intertwined. This sort of feedback interaction is what makes Regenerative Music of interest, and it is at the heart of Humanistic Intelligence.
In DECONcert 1, we hooked up the EEG signals of 48 people, which were collectively used to affect the audio environment. Signal averaging was used across groups to clean the signal and to look for collective alpha synchronization (which occurs, for instance, when people close their eyes).
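The averaging step can be sketched as follows. This is a minimal illustration, not the DECONcert implementation: the sampling rate, window length, and synthetic test data are all assumed. Averaging simultaneous samples across participants suppresses independent noise while preserving any rhythm (such as eyes-closed alpha at roughly 8-12 Hz) that the group exhibits together, and a discrete Fourier transform then estimates the power in the alpha band.

```python
import math
import random

# Illustrative sketch (not the DECONcert code): average EEG traces
# across participants, then estimate alpha-band (8-12 Hz) power with
# a direct discrete Fourier transform. FS is an assumed sample rate.

FS = 128  # assumed sampling rate, Hz

def group_average(signals):
    """Average simultaneous EEG samples across all participants."""
    n = len(signals)
    return [sum(s[i] for s in signals) / n for i in range(len(signals[0]))]

def band_power(signal, fs, f_lo=8.0, f_hi=12.0):
    """Sum of squared DFT magnitudes over the bins inside [f_lo, f_hi]."""
    N = len(signal)
    power = 0.0
    for k in range(1, N // 2):
        f = k * fs / N
        if f_lo <= f <= f_hi:
            re = sum(signal[i] * math.cos(2 * math.pi * k * i / N)
                     for i in range(N))
            im = -sum(signal[i] * math.sin(2 * math.pi * k * i / N)
                      for i in range(N))
            power += (re * re + im * im) / (N * N)
    return power

# Simulated eyes-closed state: all 48 participants share a 10 Hz rhythm,
# which survives averaging while their independent noise cancels out.
random.seed(0)
t = [i / FS for i in range(FS)]  # one second of data
group = [[math.sin(2 * math.pi * 10 * x) + random.gauss(0, 1) for x in t]
         for _ in range(48)]
avg = group_average(group)
alpha = band_power(avg, FS)      # large when the group synchronizes
```

Because the noise is independent across participants, its contribution to the average falls roughly as one over the square root of the group size, so the shared 10 Hz component dominates the alpha-band estimate.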
DECONcert utilized electroencephalogram (EEG) sensors which sensed electrical
activity produced in the brains of the participants. 48 participants were
equipped with EEG sensors, and the signals from the brains of the participants
were used as signals to alter a computationally controlled soundscape.
DECONcert allowed the participants to form a feedback loop with the
computational process of musical composition. The soundscape being generated
elicits a response from the participants, and the collective response from
the group of participants is sensed by the computer, which then alters the
music based upon this response. Again, the participants hear the music, and
again respond, and again the computer senses and further alters the sound. In
this way, collaborative biofeedback is used in conjunction with an
intelligent signal processing system to continually re-generate the music.
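The computer's side of this loop can be sketched abstractly. Everything here is invented for illustration: the toy response model (the group's collective alpha rises as the soundscape's tempo falls) and the simple threshold update rule stand in for the actual sensing and composition processes.

```python
# Minimal sketch of the collaborative biofeedback loop described above.
# Both functions are hypothetical stand-ins, not the DECONcert system.

def group_alpha(tempo):
    """Toy model of sensed collective alpha: the group is calmer at slower tempos."""
    return max(0.0, 1.0 - tempo / 200.0)

def update_tempo(tempo, alpha, target=0.6, step=10.0):
    """Computer side of the loop: nudge tempo to drive alpha toward a target."""
    if alpha < target:
        return max(40.0, tempo - step)   # slow down to encourage relaxation
    return min(200.0, tempo + step)      # speed up once the group is calm

tempo = 160.0
for _ in range(20):                      # sense, respond, alter, repeat
    alpha = group_alpha(tempo)           # sense the collective response
    tempo = update_tempo(tempo, alpha)   # alter the music accordingly
```

Under this toy model the loop settles near the tempo at which the group's alpha crosses the target, mirroring the continual sense-and-regenerate cycle described in the text.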