Audio 101: the fundamentals
Sound is something we hear around the clock, and it is a fundamental way for us to communicate and make sense of everyday life. The range of things we hear and interpret through sound is remarkable. Here is a guide to break down the basics of audio production.
What is audio production?
Audio production is the process of producing and mastering different types of sound into a final project. There are four main stages in the process:
- Composition – this is the first step in making your project. In this stage you can write lyrics, play around with sounds, and experiment with different instruments and vocals. You have the freedom to be creative and form an idea of what it is you want to make.
- Tracking – this is the second step, where everything is recorded in the studio and then arranged in a particular way. Instruments and vocals can be recorded separately or together, depending on how composers like to work. Composers will go through a process called “comping”, which is essentially selecting the best takes of audio and combining them into a final track.
- Mixing it up – this is the third step, where the audio engineer ensures all pieces of audio are balanced and work together. This process is done in an audio editing software programme, and it is where the audio track can truly be transformed. Automation, equalisation, compression and various other techniques can be applied to make the track sound better.
- Mastering the track – this is the last step in the audio production process. Mastering is done in order to optimise playback on different media formats and to balance the sonic elements of the track. The audio needs to be the best possible quality for each format, as today there are several streaming services and devices that audio needs to be adapted to.
What is frequency?
Frequency in audio refers to the number of times a sound pressure wave repeats per second, measured in hertz (Hz); we perceive frequency as pitch. Sound waves can travel through air, water and other media. When the waves reach the ear, they cause the eardrum to vibrate, which is how we hear sound.
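To see what “repeats per second” means in practice, here is a minimal Python sketch. The 440 Hz tone (concert pitch A) and the 44.1 kHz sample rate are just example values: it generates one second of a sine wave and counts how many times the wave crosses zero on its way up, which matches the frequency.

```python
import math

SAMPLE_RATE = 44100   # samples per second (example value)
FREQUENCY = 440.0     # cycles per second: the pitch A above middle C

# One second of a sine wave: each sample is the wave's amplitude
# at that instant in time.
samples = [
    math.sin(2 * math.pi * FREQUENCY * (n / SAMPLE_RATE))
    for n in range(SAMPLE_RATE)
]

# A 440 Hz wave completes 440 full cycles in one second, so it
# crosses zero on the way up 440 times.
rising_crossings = sum(
    1 for a, b in zip(samples, samples[1:]) if a <= 0 < b
)
print(rising_crossings)  # 440
```

Doubling the frequency to 880 Hz would double the crossing count, which is exactly why a higher frequency sounds like a higher pitch.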
Low frequency is measured around 500 Hz or lower; these are the deeper sounds, down to roughly 20 Hz at the limit of human hearing. Turning up the bass on your stereo emphasises low-frequency waves. An example of low frequency would be the deep rumble of waves in the ocean.
Medium frequency in audio covers roughly 500 Hz to 2,000 Hz, the range where much of the human voice sits. (The similarly named radio “medium frequency” band, 300 kHz to 3 MHz, is a different concept, used for medium-wave broadcasting.)
High frequency is measured around 2,000 Hz and above. Most people cannot hear anything above roughly 16 to 20 kHz; such sounds still exist, they are simply outside the range of human hearing. An example of high frequency would be the shrill sound of a whistle being blown.
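The rough bands above can be sketched as a tiny Python helper. The cut-off values, 500 Hz for the top of the low band and 2,000 Hz for the start of the high band, follow the approximate figures in this guide and are not a formal standard:

```python
def frequency_band(freq_hz: float) -> str:
    """Classify an audio frequency using this guide's rough cut-offs."""
    if freq_hz <= 500:
        return "low"
    elif freq_hz < 2000:
        return "medium"
    else:
        return "high"

print(frequency_band(60))    # low    (bass rumble)
print(frequency_band(1000))  # medium (around the human voice)
print(frequency_band(8000))  # high   (whistle, cymbals)
```

Real mixing engineers slice the spectrum much more finely (sub-bass, low mids, presence and so on), but the three-band split is enough to reason about where a sound sits.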
What is a bitrate?
A bitrate is the amount of data transferred per second of audio, typically measured in kilobits per second (kbps). Bitrates are present in both audio production and video production. A bit is a single binary digit, a one or a zero, and digital audio is built up from these bits. The number of bits used to store each individual sample is known as the bit depth.
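To make the relationship concrete, here is a small Python sketch that works out the bitrate of uncompressed (PCM) audio from its sample rate, bit depth and channel count. The CD-style values of 44.1 kHz, 16-bit, stereo are chosen only as a familiar example:

```python
def pcm_bitrate(sample_rate_hz: int, bit_depth: int, channels: int) -> int:
    """Bits per second of uncompressed (PCM) audio."""
    return sample_rate_hz * bit_depth * channels

# CD-quality audio: 44,100 samples/s, 16 bits per sample, 2 channels.
bps = pcm_bitrate(44100, 16, 2)
print(bps)        # 1411200 bits per second
print(bps / 1000) # 1411.2 kbps, far above a typical 320 kbps MP3
```

The gap between 1,411 kbps and a typical streaming bitrate is what lossy codecs such as MP3 and AAC close by discarding detail the ear is least likely to miss.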
A sample rate refers to the number of times a sample of the audio is taken per second. For example, the most common sample rate in the professional industry is 44.1 kHz, which means a sample is taken 44,100 times every second. When the audio plays back, the sound is reassembled from those 44,100 samples per second.
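Two quick calculations show what that rate implies. The three-minute duration below is just an illustrative track length:

```python
SAMPLE_RATE = 44100  # samples per second (the common professional rate)

# A three-minute track at this rate:
duration_s = 180
total_samples = SAMPLE_RATE * duration_s
print(total_samples)  # 7938000 samples per channel

# Time between two consecutive samples:
sample_period_us = 1_000_000 / SAMPLE_RATE
print(round(sample_period_us, 2))  # about 22.68 microseconds
```

In other words, the recorder captures a fresh snapshot of the waveform roughly every 23 millionths of a second, which is why playback sounds continuous to the ear.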