Conventional music notation (i.e. sheet music) has been in existence for hundreds of years. Although much maligned for its vast complexity and its association with the academic study of music, it has survived as one of the most popular forms of musical notation - probably the most popular. And in keeping with the 20th and 21st centuries, it has undergone some technological advancement: sheet music is now produced in digital formats (using programs such as Sibelius) and distributed worldwide via the internet, typically as downloadable PDFs.
But such success does not mean sheet music is without its rivals. One rival in particular comes from the growing community of electronic artistes creating their own forms of music notation, since notation for electronic music has not yet been standardised.
Here are some examples of the differing types of notation used for electronic music:
Due to technical constraints, I have provided links to images of the following types of notation.
I. Graphic scoring (image) © Hans-Christoph Steiner, 2004
II. Piano Roll Notation (image) © VoxNovus, 2008
Notation shown in reference to the keys on a piano: the length of each bar indicates how long a key is held down for. This notation is commonplace in MIDI editing software such as Cubase (see the sketch after this list for how such notation is typically represented as data).
III. Line staves showing relative pitch (image) © VoxNovus, 2008
IV. Prose Scores
Sometimes referred to as verbal, instruction or text notation, prose scores are used by non-reading musicians - and are intriguing because they employ no diagrams at all, just ordinary text. This leaves the interpretation of the scored music up to the performer, which is especially useful in experimental music - a genre where technology and electronic means of music-making play a key role.
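To make the piano-roll idea concrete, here is a minimal sketch (in Python) of how each bar on the roll might be represented as data. The NoteEvent structure, tick resolution and note numbers below are purely illustrative assumptions on my part - they are not taken from Cubase or any particular MIDI editor.

# A minimal, illustrative sketch of piano-roll data: each "bar" on the roll
# becomes a note event with a pitch, a start position and a length.
# The structure and tick resolution here are assumptions, not any editor's real format.
from dataclasses import dataclass

TICKS_PER_BEAT = 480  # assumed resolution: 480 ticks = one crotchet

@dataclass
class NoteEvent:
    pitch: int      # MIDI note number (60 = middle C) - the vertical position on the roll
    start: int      # left edge of the bar, in ticks
    duration: int   # length of the bar, i.e. how long the key is held down

# A tiny two-note phrase: middle C held for one beat, then the E above for half a beat.
phrase = [
    NoteEvent(pitch=60, start=0, duration=TICKS_PER_BEAT),
    NoteEvent(pitch=64, start=TICKS_PER_BEAT, duration=TICKS_PER_BEAT // 2),
]

for note in phrase:
    print(f"note {note.pitch}: starts at tick {note.start}, held for {note.duration} ticks")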
The key difference I have noticed between traditional music notation and the above forms is the degree of interpretation left to the performer. In traditional sheet music, markings such as legato and staccato play a large part in the way the performer translates what he/she sees into music. With electronic music notation, the interpretation appears to be almost entirely up to the performer: often all that is predetermined is the pitch, while things such as tempo, attack and release are left to the player - an aspect which is essential to experimental music.
Wednesday, 29 September 2010
What does the future hold for computer generated music in live performance?
From my research, it appears that the sky really is the limit for computer-generated music in live performance. From the humble days when a user could coax little more than simple tones out of a ZX81, electronic music has spread far and wide - and not just within music itself.
Institutions such as IRCAM (Institut de Recherche et Coordination Acoustique/Musique) are constantly blurring the boundaries that separate electronic music from areas such as education, society and the media.
Central to IRCAM's work is the development and modification of software for distribution across every field it touches. For example, it develops software for educational establishments - which in turn could give rise to a new wave of electronic artistes.
With the technology ever-changing and projects covering many different fields, there is limitless potential for live electronic music. One such project developed at IRCAM was WindSet, in which accurate virtual models of brass instruments were built for use by electronic artistes. The end result was the software package BRASS 2.0 (distributed by Arturia). More than just a library of samples, this piece of software takes into account the way the instruments are played and how they interact with each other, alongside many other useful functions (such as the option to place an instrument in a "virtual stereo space").
The software also includes accurate representations of how these instruments harmonise, along with the ability to edit and control the attack of the notes played and the use of mutes (both static and dynamic).
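To give a rough idea of what controlling the "attack" of a note means in practice, here is a tiny Python sketch that shapes a sine tone with a variable fade-in. It is purely illustrative and bears no relation to how BRASS 2.0 actually models its instruments.

# Illustrative only: the "attack" of a note is how quickly it reaches full volume.
# This sketch ramps the amplitude of a sine tone linearly over attack_s seconds.
import math

SAMPLE_RATE = 44100  # samples per second

def render_note(frequency, length_s, attack_s):
    samples = []
    total = int(length_s * SAMPLE_RATE)
    attack = max(1, int(attack_s * SAMPLE_RATE))
    for n in range(total):
        amplitude = min(1.0, n / attack)  # linear attack ramp, then full volume
        samples.append(amplitude * math.sin(2 * math.pi * frequency * n / SAMPLE_RATE))
    return samples

soft = render_note(440.0, 1.0, 0.3)    # slow attack: the note swells in gradually
hard = render_note(440.0, 1.0, 0.005)  # fast attack: the note starts almost instantly
print(len(soft), len(hard))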
BRASS 2.0 achieved critical acclaim after its release in 2006, and was described as "uncannily human in its response"1 in the May 2006 issue of Sound on Sound.
Such success is typical of projects developed at IRCAM, and with other projects such as Listen (completed in 2003, which "[provides] users with intuitive access to personalised and situated audio information spaces while they naturally explore everyday environments"2), new ways of reaching an audience - and perhaps even finding new ones - are appearing left, right and centre.
Gone are the days of unrealistic-sounding Cubase MIDI instruments being controlled by Fisher-Price-esque MIDI controllers - synthesised sounds are fast becoming more functional, more realistic and, most importantly of all, more usable across many different genres of music.
References:
1: http://www.arturia.com/evolution/en/company/clippings.html#Brass
2: http://www.ircam.fr/307.html?&L=1&tx_ircamprojects_pi1[showUid]=8&tx_ircamprojects_pi1[pType]=p&cHash=a272231854