Wednesday, 15 December 2010
Les Paul
Another legendary engineer, Les Paul helped pioneer multi-track recording and the solid-body electric guitar. His influence reached so far that Gibson dedicated an entire guitar range to him - arguably one of the most recognisable instruments in music today.
Les Paul Bio
http://gibson-lespaul-guitars.com/articles/les-paul-biography
Les Paul: The Original Circuit Bender written by Chris Edwards
http://kn.theiet.org/magazine/issues/0915/original-circuit-bender-0915.cfm
Eric Clapton playing a Gibson Les Paul Standard.
Slash playing a Gibson Les Paul Custom.
Tom Dowd
Tom Dowd was a very influential recording engineer. He pioneered many of the technological advances that have become standard hardware components - such as the motorised vertical fader.
Tom Dowd bio
http://www.soulwalking.co.uk/tom%20dowd.html
Tom Dowd dedication
http://mixonline.com/mag/audio_tom_dowd/
Blues Brothers - Rawhide
A clip that helped bring rhythm and blues back into vogue in the 80s and 90s (apologies for the quality).
Interesting links
Stax Records
http://www.soulsvilleusa.com/about-stax/
Sweet Soul Music: Rhythm and Blues and the Southern Dream of Freedom written by Peter Guralnick
http://shop.staxmuseum.com/browse.cfm/sweet-soul-music-by-peter-guralnick/4,52.html
Feel Like Going Home: Portraits in Blues and Rock 'n' Roll written by Peter Guralnick
http://www.amazon.com/Feel-Like-Going-Home-Portraits/dp/0316332720
About the author
http://en.wikipedia.org/wiki/Peter_Guralnick
An article on the influential Muscle Shoals Studio
http://www.npr.org/templates/story/story.php?storyId=1437161
A museum dedicated to the history of Motown
http://www.motownmuseum.com/mtmpages/index.html
B.B. King's own line of Gibson signature guitars
http://www2.gibson.com/Products/Electric-Guitars/ES/Gibson-Custom/BB-King-Lucille.aspx
An article discussing the Robert Johnson Devil story
http://www.mudcat.org/rj-dave.cfm
Inspiration
Enjoy!
Bobby Bland - Ain't No Love In the Heart of the City
http://www.youtube.com/watch?v=jVwJGyUbkow&feature=fvsr
Al Green - Let's Stay Together
http://www.youtube.com/watch?v=COiIC3A0ROM
Detroit Spinners - It's A Shame
http://www.youtube.com/watch?v=l5VaRSEvq8U
Albert King - Born Under A Bad Sign (live)
http://www.youtube.com/watch?v=BKY8KIt9kqc
The first track is sung by a blues singer called Bobby Bland. Relatively unknown in the UK, he is perhaps best known here for his pair of live recordings with blues guitarist B.B. King.
The second track is much more popular, due in no small part to its appearance in Pulp Fiction. Rolling Stone ranked it 60th on its list of the 500 Greatest Songs of All Time.
The third track is an archetypal Motown single, of the kind that helped popularise the genre.
The last track is a blues standard sung by blues guitarist Albert King; his recording has become one of the definitive versions.
The tracks above cover every genre that influenced my composition choices, and I hoped to represent each of them - Soul, Rhythm and Blues, Motown and Blues - in my own track. Had I had more time in the studio, I would have replaced the MIDI instruments with their live counterparts.
Reflection
I have really enjoyed this module. Creating music is a big passion of mine, and I was glad to be assigned this project as it took my mind off my other two modules, which I felt had no relevance to Music Technology (SM 2021 SME Marketing and IM 2030 Video & Sound Techniques).
I decided to create a modern-day soul instrumental (similar in style to Little Beaver), as I am a big fan of the genre - both its modern-day form and the soul of the 1960s and 1970s.
I started in Cubase, creating the MIDI backing of my composition, as I am well versed in that program's MIDI features. Despite my lack of keyboard-playing skill, I managed to play all the parts I needed.
I then imported the MIDI file into Logic Pro, with which I recorded the live parts of my composition.
However, I did run into trouble with this module. Every time I booked a Logic studio something went wrong: in my first session I was unable to record anything through the hardware, even with the assistance of a technician (who was equally perplexed by the problem). The second session was the most successful, despite my having to wait over an hour for the studio to be free because a class had overrun and would not let me in.
In the third and final session, in which I wanted to work on both this module and IM 2030 Video & Sound Techniques, I was unable to gain access to the facility because it had been double booked. The booking receipt the equipment stores staff had given me carried the wrong name and I.D. number, which made gaining access to the pre-booked studio impossible. Luckily, all I needed to do was bounce my composition down to a sound file, which I was able to do in the Macintosh labs. Unfortunately, this meant I could not get a true grasp of the levels of my various instruments, because I had to make my final mix on iPod headphones on an iMac rather than through studio monitors on a Mac Pro (which I would very much have preferred).
It was during our induction into the MPC studios that we found out one of the technicians is an Apple-certified Logic Pro instructor. Every member of my course felt he should have taught us this module, as he seemed to know more about the software than our existing tutors.
If I had been given more time on this module I would have tried to replace the software instruments with their live counterparts, and I would have recorded a vocal line (provided I had written lyrics in time). Perhaps I could even have presented it to my class by playing it live with other musicians, though that would have taken a lot of time to set up. I am presenting my composition at 4.30pm today - I will make another post about how it goes.
Monday, 13 December 2010
Studio time #2
I recorded the live parts of my track in the studio today: lead guitar, rhythm guitar and bass. I started on the mix-down but unfortunately ran out of time, because a seminar group overran by an hour and would not let me into the studio - even though I had booked it a week beforehand and had a booking receipt.
If I think this will make my submission late then I will apply for extra time before the deadline.
Tuesday, 7 December 2010
Studio time #1
Recording the MIDI parts took up most of today's session. When it came to recording live electric guitar, however, I ran into a few problems. After the usual troubleshooting - check the guitar isn't faulty, check the cable isn't faulty, check the D.I. is working, check the I/O and routing settings, and so on - I called a technician. Even after ten minutes of tinkering we couldn't figure out the problem, and by that point we were at the tail end of the session anyway.
I am considering applying for an extension on this assignment, as this is the first time I've been able to book the studios in the MPC since the start of the semester - not to mention the equipment not working in this instance. Next time I will be sure to ask for a studio adjacent to a live room, so I can record the guitar from a control room that has been properly set up for recording live music.
I'll upload a few screengrabs in the next post.
Thursday, 2 December 2010
Progress so far.
Thursday, 28 October 2010
Project B
So far I have created the chord pattern and drum track for the intro, choruses and a verse.
I have chosen these two programs because Cubase is the only MIDI sequencer I currently know how to use effectively, and because Logic runs on Macintosh computers, which process and record live music better than the Windows PCs I would otherwise have to use for Cubase. With Logic I can also record and edit at home, instead of having to be on campus to work.
I'll keep updating here with my progress.
Wednesday, 13 October 2010
Pure Data tasks done!
Above we have a quarter-tone scale, which is activated by clicking the bang in the top left corner. This not only sets off the first tone but also triggers the delay chain on the right-hand side, which adds 0.5 to the MIDI note value at each step - raising the pitch by a quarter tone (adding 1 each time would raise it by a semitone). There is an on/off switch for the audio output, a 'stop' button which halts the delay chain, and a send/receive pair of objects to clear up screen space.
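Since the patch itself only exists as a screengrab, here is a rough text-based analogue of the same idea in ChucK (the starting note and timing are my own choices, not taken from the patch):

// quarter-tone scale: add 0.5 to a MIDI note value at each step
SinOsc s => dac;              // sine oscillator into the audio output
0.3 => s.gain;
60.0 => float note;           // starting note (an assumption)
while (note <= 72.0)          // one octave of quarter tones
{
    Std.mtof(note) => s.freq; // MIDI value to frequency, like Pd's [mtof]
    250::ms => now;           // hold each tone, like the delay chain
    0.5 +=> note;             // +0.5 = a quarter tone (+1 would be a semitone)
}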
Above we have a logarithmic glissando. As in the quarter-tone scale patch, there is a delay chain (albeit somewhat smaller than before) which activates the frequencies in the message boxes, each following on logarithmically from the original one ('55'). Again there is an on/off function for the audio output and a stop function (this time wired).
Last but not least, here is a linear glissando. As with the previous two patches, a delay chain means only one bang has to be clicked. From the first bang click, the input into the [mtof] object (MIDI to frequency) increases by 1 (a semitone) at each step. There is also an on/off switch for the audio output and a stop function - this time using a send/receive pair.
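For comparison, here are rough ChucK sketches of both glissandi. I have assumed the logarithmic patch's message boxes double the frequency each time (55, 110, 220 and so on); the timings are again my own:

SinOsc s => dac;
0.3 => s.gain;

// logarithmic glissando: multiply the frequency by a constant ratio
55.0 => float freq;
while (freq <= 880.0)
{
    freq => s.freq;
    250::ms => now;
    2.0 *=> freq;             // assumed ratio between message boxes
}

// linear glissando: feed an increasing MIDI value through mtof
33.0 => float note;           // MIDI note 33 is 55 Hz (A1)
while (note <= 57.0)
{
    Std.mtof(note) => s.freq;
    250::ms => now;
    1.0 +=> note;             // +1 per step, a semitone, as in the patch
}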
I have enjoyed creating these patches, and as I said before I am going to play around a bit more with Pd to see what I can do with it.
Today's tutorial made me realise I am getting ahead of myself on this module, so it may take a back seat to the other two - I will still be posting every week, however!
Saturday, 9 October 2010
ChucK
Wednesday, 6 October 2010
Object-oriented Programming
It is safe to assume that most software with a user interface of menus, windows and similar icons has been designed from an object-oriented programming perspective.
Objects can take many different forms in PureData - from an audio output ([dac~]) to printing data ([print]). Objects have two important characteristics: attributes and behaviour.
Attributes: aspects/characteristics of an object.
Behaviour: the actions an object knows how to perform - each object has a list of messages it understands and responds to.
These two characteristics are important when working out classes. A class is a collection of objects that have similar attributes.
Example
Let's say we have two classes - BrassInstrument and StringInstrument - what attributes would the objects of these groups have?
BrassInstrument
1. Made of brass/similar metal.
2. Requires an airflow to create sound.
3. Sound can be controlled by valves, which direct airflow.
4. The sound produced does not resonate in the body of the instrument.
StringInstrument
1. Made of wood (entirely or mostly).
2. Requires string vibration to produce sound.
3. Sound can be controlled by fretting the strings, which changes the pitch each string produces.
4. The sound produced resonates in the body of the instrument.
5. Has to be played to produce sound.
What would their behaviours be?
BrassInstrument
1. Dynamics - the musician adjusts his/her playing (inputting data) to make the sound louder (the brass instrument responding to the data - behaving).
2. Note produced - the musician chooses what note to play; this is achieved by redirecting the airflow using the valves.
3. Articulation - the musician plays in a legato style, but changes to staccato (for example).
4. Expression - halfway through a performance, the musician starts to play with vibrato.
5. No output - the instrument is not being played.
6. Has to be played to produce sound.
Nos. 3 & 4 are also achieved by using the valves on a brass instrument.
StringInstrument
1. Dynamics - the musician bows strings harder or softer
2. Note produced - the musician chooses which string to play and where to fret it (raising the pitch by holding the string down against the neck) - if at all
3. Articulation - the musician plays notes for longer than indicated on sheet music (legato)
4. Expression - the musician bends the note he/she is playing over and over (vibrato).
5. No output - instrument is not being played.
6. Chords - with string instruments it is possible to produce two notes at different pitches simultaneously; a two-note chord is called a dyad. This is achieved by bowing more than one string at a time (double stopping)
7. Pizzicato playing - the notes can be plucked as well as bowed. This produces a different sound.
As you can see, although the classes above are similar, the behaviours of the objects within them are achieved in different ways (due to the differing nature of the objects themselves) - and as a result we get two different sounds, i.e. the sound of a brass instrument and the sound of a string instrument.
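To make the idea concrete outside PureData: ChucK, which I have been experimenting with, is itself object-oriented, so a hypothetical sketch of the two classes might look like this (all the names here are my own inventions):

class BrassInstrument
{
    // behaviour: respond to a "play" message; sound is shaped by valves
    fun void play(float midiNote)
    {
        <<< "brass: valves direct the airflow -", Std.mtof(midiNote), "Hz" >>>;
    }
}

class StringInstrument
{
    // the same message, achieved differently: a vibrating string
    // resonating in the instrument's body
    fun void play(float midiNote)
    {
        <<< "string: bowed string resonates -", Std.mtof(midiNote), "Hz" >>>;
    }
}

BrassInstrument trumpet;
StringInstrument violin;
trumpet.play(60.0);  // the same message...
violin.play(60.0);   // ...produces a different behaviour in each class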
States
The state of an object is determined by the combined values of that object's attributes.
For example, states for a Guitar would be:
1. Not being played.
2. A Bb note being played (for example)
And because they are both instruments, the states of a Saxophone would be the same - only the messages that control those states differ. For example, if we constructed a saxophone patch in PureData and sent it the message "palm mute", the saxophone would not know how to behave in response to it and the sound output would be unaffected. However, if we sent a guitar patch the same message, the sound would change, as a guitar patch would be programmed to behave appropriately in response to that message.
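Continuing the hypothetical ChucK sketch from above, the difference is easy to express - the Guitar class has "palm mute" in its list of known messages, while the Saxophone class does not:

class Guitar
{
    0 => int muted;   // an attribute; its value forms part of the state
    fun void message(string msg)
    {
        if (msg == "palm mute") 1 => muted;   // known message: state changes
    }
}

class Saxophone
{
    fun void message(string msg)
    {
        // "palm mute" is not a message this object knows,
        // so its state - and its sound - is unaffected
    }
}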
As you can imagine, there are many different classes and objects in PureData. One of the most important is the [dac~] object - this acts as an audio output and as such has two inputs, or inlets, which act as the left and right channels (for use with a stereo speaker system).
Another object in this class is [adc~]. This acts as the audio input. This means you can process an incoming signal through PureData.
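ChucK has direct counterparts to these two objects, also called adc and dac; chucking one to the other passes the incoming signal straight through, ready for processing on the way:

adc => Gain g => dac;                 // input -> gain stage -> output
0.5 => g.gain;                        // attenuate to avoid feedback
while (true) { 1::second => now; }    // keep the program running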
In a separate class are the number functions - *, /, +, -, pow.
These functions can be made into objects in PureData, and because they have similar attributes and behaviour they belong to the same class.
As we already know, we can send messages to objects. What we have not yet covered is that sending a message to an object changes its state. For example:
Above we have a simple oscillator patch, oscillating at 440 Hz. If we connect a message box to the oscillator object and type a number into it, we can change the state of the object (in execute mode) by clicking the message box. Thus:
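The same state change can be sketched in ChucK, where assigning to .freq plays the role of clicking the message box:

SinOsc osc => dac;
440.0 => osc.freq;   // initial state: oscillating at 440 Hz
2::second => now;
880.0 => osc.freq;   // the "message" moves the object to a new state
2::second => now;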
Tuesday, 5 October 2010
ChucK
Having completed the first two PureData tasks I moved on to the next two - but given the difficulty I have been having with them, I shall wait until my next Music Production II lecture to get some assistance. In the meantime I downloaded and installed ChucK - a program that uses the Command Line (Windows)/Terminal (Mac OS X) application to make music. I have made progress to the point where I can get sounds out of ChucK, but nothing extensive.
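Getting a first sound out of ChucK only takes a few lines. This is a sketch of the sort of thing I ran, not my exact code:

// a two-second test: connect a sine oscillator to the audio output
SinOsc s => dac;
0.5 => s.gain;       // set the volume
220.0 => s.freq;     // set the pitch in Hz
1::second => now;    // let it sound for one second
440.0 => s.freq;     // jump up an octave
1::second => now;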
I shall do a post in the Music Production II lecture tomorrow.
Friday, 1 October 2010
Progress so far...
1. Two simultaneous random melodies, and
2. Two different intervals using two bangs.
Here are screen grabs of the patches:
I decided to get an early start on these tasks as I have no prior knowledge of this kind of programming. I hope to get the other 2 patches on here soon.
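In the meantime, here is a rough sketch of the first task in ChucK rather than PureData - an analogy only, since the real patches are graphical, and the note ranges and timing are my own choices:

// two simultaneous random melodies, each in its own shred
fun void melody(float baseNote)
{
    SinOsc s => dac;
    0.3 => s.gain;
    while (true)
    {
        // pick a random note within an octave of the base note
        Std.mtof(baseNote + Math.random2(0, 12)) => s.freq;
        250::ms => now;
    }
}
spork ~ melody(48.0);   // first melody, around C3
spork ~ melody(60.0);   // second melody, around C4
10::second => now;      // let both run for ten seconds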
Wednesday, 29 September 2010
Differences between conventional musical notation and the derived notation for electronic music
For all sheet music's long success, it is not without rivals. One rival in particular is the growing community of electronic artistes creating their own music notation, as notation for electronic music has not yet been standardised.
Here are some examples of the differing types of notation for electronic music:-
Due to technical constraints, I have provided links to images of the following types of notation.
I. Graphic scoring (image) © Hans-Christoph Steiner, 2004
II. Piano Roll Notation (image) © VoxNovus, 2008
Notation shown in reference to the keys on a piano; the length of each bar indicates how long a key is held down. This notation is commonplace in MIDI editing software such as Cubase.
III. Line staves showing relative pitch (image) © VoxNovus, 2008
IV. Prose Scores
Sometimes referred to as verbal, instruction or text notation, prose scores can be used by non-reading musicians - and are intriguing in that they employ no diagrams, just ordinary text. This leaves the interpretation of the scored music up to the performer, which is especially useful in experimental music - a genre in which technology and electronic means of music-making play a key role.
The key difference I have noticed between traditional music notation and the forms above is the performer's interpretation of the music. In traditional sheet music, markings such as legato and staccato play a heavy part in how the performer translates what he/she sees into music. With electronic music notation, the interpretation appears to be almost entirely up to the performer: all that is pre-determined is the pitch, while tempo, attack and release are left to the player - an openness which is essential to experimental music.
What does the future hold for computer generated music in live performance?
Institutions such as IRCAM (Institut de Recherche et Coordination Acoustique/Musique) are constantly blurring the boundaries that separate electronic music from areas such as education, society and the media.
Central to IRCAM is the development and modification of software for distribution to every field it touches. For example, it is developing software for educational establishments - which in turn could give rise to a new wave of electronic artistes.
With the technology ever-changing and projects covering many different fields, there is limitless potential for live electronic music. One such project developed at IRCAM was WindSet, in which accurate virtual models of brass instruments were created for use by electronic artistes. The end result was the software package BRASS 2.0 (distributed by Arturia). More than just a library of samples, this software takes into account the way the instruments are played and how they interact with each other - alongside many other useful functions, such as the option to place an instrument in a "virtual stereo space".
This software includes accurate representations of how these instruments harmonise, and the ability to edit and control the attack of notes played and use of mutes (both static and dynamic).
BRASS 2.0 achieved critical acclaim after its release in 2006, and was described as "uncannily human in its response" [1] in the May 2006 issue of Sound on Sound.
Such success is standard for projects developed at IRCAM, and with other projects such as Listen (completed in 2003, which "[provides] users with intuitive access to personalised and situated audio information spaces while they naturally explore everyday environments" [2]), new ways of reaching your audience - and perhaps even finding new ones - are appearing left, right and centre.
Gone are the days of unrealistic-sounding Cubase MIDI instruments controlled by Fisher-Price-esque MIDI controllers - synthesised sounds are fast becoming more functional, more realistic and, most importantly of all, more usable across many different genres of music.
References:
[1] http://www.arturia.com/evolution/en/company/clippings.html#Brass
[2] http://www.ircam.fr/307.html?&L=1&tx_ircamprojects_pi1[showUid]=8&tx_ircamprojects_pi1[pType]=p&cHash=a272231854