A soundscape is an acoustic environment built from sound to evoke an environment for the listener. The term was coined by the Canadian composer Murray Schafer in the 1960s. Schafer was a naturalist composer who centred his work on realism. I took inspiration from Schafer's work; he wrote, "A soundscape is any collection of sounds, like a painting is a collection of visual actions." I find this fascinating: I believe soundscaping can portray an environment even better than a composed piece of music can. By capturing a sense of reality, soundscaping gives me, as the composer, an opportunity to immerse the listener in my work. Playing with realism makes listeners believe they are the main character, immersed in the feelings the composer is trying to create. When creating my audio podcast I chose to use soundscaping throughout the entire piece, aiming for the most realistic performance possible. Layering narrative audio on top makes the story clear and effective, almost as if the listener can paint the story inside their head. As I continued this creative process I found that the more immersive the piece became, the more it affected the listener.
Author: Eleanor Anderson
Stereo + Panning Throughout Radio
When working on radio projects I discovered the importance of panning and stereophonic sound. Stereo lets the listener hear a piece in a human's most natural form: multi-directionally! Although it was first demonstrated in 1881 in Paris using paired telephone transmissions, stereo only became popular when it was first broadcast on radio in 1961. It was then taken up in the late 1960s and '70s during the psychedelic movement in music; famously, bands such as Pink Floyd treated it as the norm on many of their albums.
In my experience, panning allows the most immersive experience possible. It adds another dimension to the sound, creating room for the listener to become truly immersed. Creating sound that reflects a human's natural state of hearing makes the piece realistic, and this realism leaves the listener open to the emotions the performance provokes. When creating car sounds in my piece I was able to pan them to create a realistic impression of cars passing. I find this useful as it gives me the chance to shatter reality and create a more atmospheric effect on the listener's mood and feelings. In future I will always make my art pieces in stereo, because it has been such a success.
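For readers curious how panning works under the hood, here is a minimal Python sketch (an illustration, not anything from my actual session) of the constant-power pan law that DAWs commonly use, which keeps perceived loudness steady as a sound sweeps from left to right:

```python
import math

def constant_power_pan(sample, pan):
    """Pan a mono sample into a stereo pair.

    pan ranges from -1.0 (hard left) to +1.0 (hard right).
    Constant-power panning keeps the total power of the two
    channels constant as the sound moves across the field.
    """
    angle = (pan + 1.0) * math.pi / 4.0   # maps pan to 0 .. pi/2
    left = sample * math.cos(angle)
    right = sample * math.sin(angle)
    return left, right

# A car sound dead centre: both channels sit about 3 dB down (~0.707),
# so the combined power matches a hard-panned signal.
l, r = constant_power_pan(1.0, 0.0)
```

At hard left (`pan=-1.0`) the right channel drops to zero; sweeping `pan` over time is exactly the "car passing" effect described above.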
During a recent lecture I learned the difference between Expressionism and Impressionism. Impressionist music first took hold around 1918: a sound that does not try to grab the listener's attention, aiming instead to fit a mood or aesthetic. Theodor Adorno described it as seeking to "eliminate all of traditional music's conventional elements, everything formulaically rigid". This contrasts with expressionist music, which presents different levels and pitches throughout the piece.
Admittedly, I came to realise that all the pieces I make are definitively expressionist works. From this, I would like to experiment more with impressionist music. It fascinates me how, despite there being fewer elements to attract your attention, it still holds the power to be something so atmospheric and moving. A blend of sounds can be as beautiful and thought-provoking as an expressionist piece can. Although expressionism makes it easier to create a specific mood through individual elements such as speed, pitch or instrumentation, impressionism leaves space for the listener to reflect. Without their attention being constantly grabbed, the free space and sound levels within the music send listeners into self-reflection, allowing them to find whatever mood they want. Its effects are more internal than those of expressionism.
In the future, I hope to create more impressionist sound pieces and study more examples of the genre.
Christina Wheeler's lecture made me reflect on her views on creative performance. Christina talked about the endless ways in which performance and sound art can be presented. She asked questions such as: where and how should the audience be positioned? Where should the performance be shown? How should they hear the sound, and where should it be played? In previous blog posts I have talked about the importance and effects spatiality can have upon sound, and how the room in which sound is played is an additional instrument in itself. In a study by Yueying Li, she claims that "an increasing number of theories and studies addressing spatial topics have argued that these two fields are closely linked and one would not be complete without the other". Without exploring spatiality, your sound would have no uniqueness or space within the audio. I truly believe this area has only been able to be fully explored thanks to the rise of technology. Before the 1900s sound was limited and concert halls stood as the heart of live performance. That generic positioning of audience and performer has been superseded thanks to new realms of technological advancement. With arrays of speakers and microphones, sound can be heard from different directions, meaning the audience themselves can also change direction. Advancements in film have created a parallel for sound, allowing artistic performances in any format or space.
These advancements have not only allowed space to affect the initial audio and recording; they have also allowed it to take effect during performance. Experimenting with your surroundings can change the way the audience perceives your piece. I find this interesting as I believe it can add another layer to my work and create a more immersive experience. Vicki Bennett showed us a clear example with her exhibition performance 'Gone, Gone Beyond', where she incorporated the visuals of a 360-degree screen. After all, the way your listener experiences your piece is the most important aim of creating it in the first place. I hope to use this in future performances to capture a true aesthetic and mood in line with my artwork. It will help me go that extra length of expanding the listener's mind and opening areas for them to feel my work rather than merely listen to it.
In conclusion, advancements in technology have broken down the barrier between audience and performer, creating room for everyone to experience art at its fullest capability.
Pro Tools – Lesson 6
1.) What audio file formats can be imported to Pro Tools without requiring conversion?
- WAV and AIFF do not require conversion
2.) What condition would cause a file in one of Pro Tools' native formats to require conversion on import?
- Any files that have a different sample rate from the session, regardless of format, must be converted.
3.) Name some common audio file formats that Pro Tools can convert on import.
- MP3
- Windows Media Audio (WMA)
- Sound Designer II (SD II)
- Audio Interchange File Compressed (AIFC)
- Waveform Audio File (WAV)
- Audio Interchange File Format (AIFF)
4.) What are some video file formats that can be imported by Pro Tools?
- QuickTime (MOV) and MXF are common video formats that Pro Tools can import.
5.) What is the difference between split stereo and interleaved stereo? Which is/are supported for importing into ProTools?
- Split stereo is two (or more) separate mono files, one per channel (left, right, etc.). Interleaved stereo is a single file that contains both the left and right channel information.
- Pro Tools supports importing both split stereo and interleaved stereo files.
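The difference is easiest to see with a small illustration, assuming simple integer sample values. Interleaving packs the two channels into one stream as L, R, L, R, ...:

```python
# Split stereo: one list of mono samples per channel.
left = [10, 11, 12]
right = [20, 21, 22]

def interleave(left, right):
    """Pack split-stereo channels into one interleaved frame stream."""
    out = []
    for l, r in zip(left, right):
        out.extend([l, r])
    return out

def deinterleave(frames):
    """Recover the split channels from an interleaved stream."""
    return frames[0::2], frames[1::2]

stereo = interleave(left, right)   # [10, 20, 11, 21, 12, 22]
assert deinterleave(stereo) == (left, right)
```

Either layout carries the same audio; it is purely a question of file packaging, which is why a DAW can accept both.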
6.) What is the difference between the Add button in the Import Audio dialogue box and the Copy button? Which button will force-copy the files into your session’s Audio Files folder?
- The Add/Add All button adds compatible files or clips to the Clip List without copying them to the Audio Files folder. The Copy/Copy All button adds them to the Clip List and copies the files into the Audio Files folder.
- The Copy button force-copies the files into your session's Audio Files folder.
7.) What happens when you use the Workspace browser to import audio that is not compatible with your session’s parameters (in other words, audio that requires conversion)? What happens when you import audio that does not require conversion?
- Non-compatible audio is automatically converted (and copied) when imported from the Workspace browser.
- Audio that does not require conversion is added to the session by reference, without being converted.
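To illustrate why a mismatched sample rate forces conversion, here is a crude linear-interpolation resampler in Python. Real converters (including Pro Tools') use much higher-quality filtering, so this is only a sketch of the idea:

```python
def resample_linear(samples, src_rate, dst_rate):
    """Crude linear-interpolation resampler (illustration only)."""
    if src_rate == dst_rate:
        return list(samples)
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * src_rate / dst_rate          # fractional source index
        j = int(pos)
        frac = pos - j
        nxt = samples[j + 1] if j + 1 < len(samples) else samples[j]
        out.append(samples[j] * (1 - frac) + nxt * frac)
    return out

# A 0.1 s clip at 44.1 kHz dropped into a 48 kHz session must be
# converted so that it still lasts 0.1 s at the session rate:
clip_441 = [0.0] * 4410                      # 0.1 s at 44.1 kHz
clip_480 = resample_linear(clip_441, 44100, 48000)   # 4800 samples
```

The sample count changes (4410 to 4800) but the duration does not; without this step the clip would play back at the wrong speed and pitch.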
8.)What steps are required to conduct a search for an audio file using the workspace browser?
- 1. Window menu > New workspace > Default
- 2. Select Advanced Search (Magnifying glass icon)
- 3. Choose the folder that you want to search by selecting it in the Locations pane on the right side of the window (in this case, choose “Audio” folder).
- 4. Click “Kind” and “File type” on the Search Column.
9.) How would you go about importing a QuickTime movie file to Pro Tools while simultaneously importing the audio embedded in the file?
- (Make sure that the Video Engine is enabled in the Playback Engine option)
- 1. File menu > Import > Video
- 2. Select “Import Audio from File” option in the Video Import Options dialog box
10.) How many video files can be associated with a standard Pro Tools session at once?
- Only One
Corruption of Childhood Innocence:
My piece is a reflection on the corruption of childhood innocence. It mirrors a short journey through life, showing how adulthood and added stimuli from the outside world can destroy innocence. I hoped to portray the negative effect this can have upon a life. In this post I will write about my journey and the reasons behind my creative process.
Originally I wanted to represent how childhood innocence blinds people to the class divide, and how your economic status is inevitable given where and to whom you are born. My idea was to take field recordings of private-school children playing and state-school children playing. However, this raised some moral barriers and was not entirely ethical. From this, I decided to base the piece on something a little more personal: my experiences entering adulthood and my perception of life as a whole.
The piece begins with very little sound or noise: a piano key and a low-pitched thudding in a rhythmic pattern. To me this evokes the sound of a heartbeat, with its connotations of new life. A heartbeat is something every human shares, no matter who you are! This represents a human stripped down to the bare essentials, naked of any outside influence or complexity. The sound is pure, with no other noise added. I have played the piano since the age of four and chose this instrument to represent the heartbeat because it is a personal part of my life. As the piece develops and the beating continues, extra noise is added, including a growing roar which slowly takes dominance of the piece. This is the corruption of the outside world creeping in and taking over your purity.
As the piece moves into its next part, I include a field recording taken from an old childhood video shot on my dad's camcorder. These videos are entirely personal to me, a true representation of my life growing up. I decided to include a recording of one of my childhood assemblies. It features many young voices, talking and general chatter, while we all play with each other. I liked this recording as I thought it fit the brief of my piece perfectly. The children (including me) are unaware of the camera recording them, so their conversations are natural and pure. In their eyes there is no outside influence on their conversation, which to me represents how childhood innocence depends on the absence of outside corruption. I added reverb and delay, along with automating the volume up and down throughout the recording. I used this to create a wave effect and intentionally show how, slowly over time, certain parts of your life take away that childhood innocence. Adulthood creeps up on your life, and from my experience it has not been sudden. Creating a wave effect on the sound highlights slow corruption and the process of puberty. Silent parts of the piece demonstrate how I began to feel lost and out of place through my teenage years. Although the piece carries on, the silence is lonely and creates a feeling of isolation, mirroring my emotions on my journey into adulthood.
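For anyone wondering what that "wave" volume effect looks like in signal terms, this is a rough Python sketch of a slow sine-wave volume envelope. It is an illustration of the idea only, not the actual automation I drew in my session:

```python
import math

def wave_envelope(samples, sample_rate, depth=0.5, rate_hz=0.25):
    """Apply a slow sine-wave volume envelope (a "wave" effect).

    depth   - how far the level dips (0 = no effect, 1 = down to silence)
    rate_hz - swells per second (0.25 = one full swell every 4 seconds)
    """
    out = []
    for n, s in enumerate(samples):
        lfo = math.sin(2 * math.pi * rate_hz * n / sample_rate)
        gain = 1.0 - depth * (0.5 + 0.5 * lfo)   # oscillates 1-depth .. 1
        out.append(s * gain)
    return out

# e.g. shaped = wave_envelope(recording, 44100) for a gentle 4-second swell
```

The slower the `rate_hz`, the more the changes creep up on the listener rather than arriving suddenly, which is exactly the gradual-corruption feeling described above.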
As a sufferer of anxiety and paranoia, I often refer to my head as a "wasps' nest". In the final part of the piece I wanted to present this, and the disarray that life inflicts on your mental state. The piece then begins to build in intensity through volume and added sounds. I introduce rhythmic beats and percussion to represent adult working life and the routines in which we bind ourselves. It becomes chaotic, with several different noises and sound effects drawn from synthesisers, MIDI inputs and field recordings. Banging sounds taken from field recordings add to this disordered texture, highlighting how the problems and stress of adult life shatter your innocence. The noise ranges in pitch, volume and type of recording, which I hoped would portray the several paths of influence on the brain and the disordered stimulus they can create. Adults face multiple types of stress, which is what I hoped this would indicate.
Finally, the childhood recordings return and build up as the chaos begins to die down. Despite everything, your childhood will always be embedded in your life: it becomes the root of your identity and stays with you mentally throughout your existence. To end the piece I chose to bring back the heartbeats from the opening sequence. I believe that, whatever your influences and position in life, everyone shares this process. By repeating the beginning I wanted to create a full-circle effect, mirroring the circle of life. It shows how everyone follows this process through, and the final heartbeats lead on into someone else's life.
Everyone is born innocent, and it is your life and society's choices that destroy this. The frustrations, stress and busyness of adult life take away your innocence bit by bit. Everyone shares this process; it is simply a cycle waiting to happen. My goal was to capture this in my piece, and I hope these visions come across when listening.
Pro Tools – Lesson 4
1.) What are some actions that can be initiated from The Dashboard?
- Create a new blank session on local storage. Create a new blank project, with or without cloud backup. Create a new session or project from a template. Open a session or project from a list of recently opened Pro Tools documents. Open a project that you created or are a collaborator on. Open a session from a connected storage location on your system.
2.) What is the difference between a session and a project in Pro Tools?
- A session file is saved to local storage. A project is stored remotely (on the cloud)
3.) What is required to create a project document? What are some reasons you might want to create a project instead of a session?
- An Avid account and an Internet connection. Reasons to use a project include: a. to protect your Pro Tools work against loss in the event of drive failure or other computer mishap; b. to be able to access your projects from anywhere with an Internet connection; c. to collaborate with other Pro Tools users anywhere in the world.
4.) What are some available options for parameter settings in the dashboard?
- Create from Template, Audio file type, Sample Rate, Bit Depth, I/O settings, Interleaved, Prompt for Location, Show on Start-up
5.) What audio file types are supported in Pro Tools? What is the default file type?
- WAV (BWF) and AIFF are supported; WAV is the default file type.
6.) What is the maximum sample rate supported in Pro Tools? What is the maximum bit depth?
- 192 kHz; 32-bit (floating point)
7.) What menu command lets you add tracks to your session? What keyboard shortcut can you use to access this command?
- The Track menu (Track > New). Cmd+Shift+N.
8.) How many tracks can you add to a session at one time?
- You can simultaneously add as many tracks, in as many different configurations, as your session will allow.
9.) Describe some primary track types that are available in Pro Tools. Describe the two types of folder tracks.
- Primary track types: Audio, MIDI, Instrument, Video, Auxiliary Input, VCA Master, Master Fader. Basic folders: purely for organisational purposes, essentially just containers for visually grouping sets of related tracks into a collapsible view. Basic folder tracks do not have any signals routed through them; aside from solo and mute functionality that propagates to their constituent tracks, they have no mixing controls. Routing folders: have all of the signal routing functionality of an Auxiliary Input track (audio input and output selectors, insert points and send routing), along with mixing controls (pan and volume) and all associated automation controls in the Mix and Edit windows. Routing folders are designed primarily for submixing and stem-mixing workflows, combining key features of Auxiliary Inputs and VCA Master tracks with folder behaviour for organising and managing sets of tracks.
10.) Which timebase do Audio tracks use by default? Which timebase do MIDI and Instrument tracks use by default?
- Audio tracks are Sample-based by default, while MIDI and Instrument Tracks are tick-based.
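The practical difference between the two timebases is easiest to see with a small calculation. This Python sketch (assuming Pro Tools' 960-ticks-per-quarter-note MIDI resolution) converts a tick-based position into a sample-based one for a given tempo and sample rate:

```python
TICKS_PER_QUARTER = 960   # Pro Tools' MIDI tick resolution

def ticks_to_samples(ticks, bpm, sample_rate):
    """Convert a tick-based (musical) position to a sample-based one.

    Tick positions shift in absolute time when the tempo changes;
    sample positions are fixed wall-clock locations.
    """
    quarters = ticks / TICKS_PER_QUARTER
    seconds = quarters * 60.0 / bpm
    return round(seconds * sample_rate)

# One bar of 4/4 (four quarter notes = 3840 ticks) at 120 BPM, 48 kHz:
bar = ticks_to_samples(3840, 120, 48000)   # 96000 samples = 2 seconds
```

Change the tempo to 60 BPM and the same 3840 ticks lands twice as far in samples, which is exactly why MIDI material follows tempo changes while sample-based audio stays put.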
11.) What happens to the Audio and MIDI data on a track when the track gets deleted from your session? Can the Track > Delete command be undone?
- When you delete tracks, your audio or MIDI clip data will remain in the Clip List, but your arrangement of clips on the deleted track (the track's playlist) will be lost. The Track > Delete command cannot be undone.
12.) Name the two types of cursors available in the Edit window. What is the difference between them?
- 1. The Playback cursor. 2. The Edit cursor. The Playback cursor is a solid, non-blinking line that moves across the screen during playback and indicates the current playback point. The Edit cursor is a blinking line that appears on a track playlist when you click with the Selector tool in a track.
13.) Which tool can be used to set the playback point by clicking directly on a track?
- Selector Tool
14.) What is the Playback Cursor Locator used for? Where will the Playback Cursor Locator appear (in what Ruler)?
- To help locate the playback cursor when it might have moved off screen after reaching the edge of the Edit window. It appears in the Main Timebase Ruler.
15.) What is the purpose of the Save As command? Which session will be opened after completing the Save As command – the original or the renamed copy
- It’s useful for saving a copy of a session under a different name or in a different drive location. The Save As command leaves the original session unchanged and allows you to continue working on the renamed copy. As such, it is particularly useful when experimenting, to save alternate versions of your work. This command is also useful for saving stages of your work under different names; by working this way you can always retrace your steps if you ever need to go back to an earlier stage of the project. It is the renamed copy that remains open to continue working on.
16.) What is the purpose of the Save As New Version command? What type of Pro Tools document does this command apply to?
- It provides similar benefits to the Save As command, but is available only when working on project documents
17.) How can you open a session after locating it in a workspace browser?
– Double-click
Pro Tools – Lesson 3
1.) What icon is used for the Zoomer tool in the Edit window? How can you use this tool to quickly zoom out, filling the Edit window with the longest track in the session?
- A magnifying glass. Double-click the Zoomer tool to zoom out so that the longest track in the session fills the Edit window (fn+F5 selects the Zoomer).
2.) Which Edit tool is represented by a hand icon? What is this tool used for?
- The Grabber tool. It is commonly used for arranging clips. You can use the Grabber to select an entire clip with a single mouse click; you can also use it to move a clip along the timeline within its current track, or to move clips between tracks.
3.) Which tool is active when the Trim, Selector, and Grabber icons are all selected (highlighted in blue) in the Edit window toolbar?
– The Smart Tool
4.) What are the four Edit modes in Pro Tools? How can you switch between them?
- Shuffle (fn+f1). Slip (fn+f2). Spot (fn+f3). Grid (fn+f4)
5.) Why should you use caution when editing synchronized material in Shuffle mode? When is Shuffle mode useful?
- Movements and edits made in Shuffle mode will cause timing changes for the media on affected tracks, so this mode should be used with caution when editing material that is synchronised to other tracks or aligned to a timing reference or tempo. It is useful as a way to make clips line up next to each other without overlapping or leaving silence between them; this can be convenient when you need to shorten a line of dialogue by removing a pause, cough, repeated word, or similar unwanted material.
6.) How does editing a clip in Slip mode affect the timing of other clips on the track?
- In Slip mode you can move, trim, cut, or paste clips freely within a track without affecting the placement of other clips on the track. All selections, clip movements and edit operations are unconstrained.
7.) When is it helpful to work in Spot mode? When it is helpful to work in Grid mode?
- Spot mode lets you move and trim clips using precise locations or durations specified in a dialogue box. In Grid mode selections, clip movements, and trim operations are constrained by the grid, i.e. it is useful for quantising material.
8.) What are some ways to set the Main Time Scale in Pro Tools?
- View > Main Counter, or the Main Time Scale pop-up menu.
9.) What are the two types of Rulers available in Pro Tools? What is the difference between them?
- Timebase rulers and Conductor rulers. Timebase rulers measure time in various ways (Bars|Beats, Minutes:Seconds, Samples, Timecode, Timecode 2, Feet+Frames). Conductor rulers contain events that map out locations, characteristics and changes within a session (Markers, Tempo, Meter, Key, Chords).
10.) What are some ways to hide Rulers that you do not need displayed in a session?
– View > Rulers, or Opt-click directly on a Ruler’s name in the Timeline display area.
11.) Which Pro Tools windows provide access to MIDI controls, such as Wait for Note, Metronome, and MIDI Merge?
- The MIDI Control section in the Edit and Transport windows
12.) What is the purpose of the Metronome button in the MIDI Controls area? What kind of track must be added to a session for the Metronome button to work?
- The Metronome button is used in conjunction with a click track and controls whether or not the click will be audible. When the Metronome button is active, a metronome click will sound during playback and recording, as specified by the settings in the Click/Countoff Options dialog box (Setup > Click/Countoff). Metronome playback requires a click track or other click source to be configured for your session.
13.) What are the two states or modes available for controlling the current session tempo? How can you switch between these modes?
- Tempo Map mode and Manual Tempo mode. You switch between them by toggling the Tempo Ruler Enable (Conductor) button in the MIDI Controls area: enabled follows the session's tempo map; disabled activates Manual Tempo mode.
14.) What is displayed by the Tempo field in the MIDI Controls area? What are some ways to set the session tempo using this field?
– The session’s current tempo, based on the play selection. In Manual Tempo mode (or when the session tempo has not yet been defined) you can enter a BPM value directly into this field. In addition, when the Tempo field is selected, you can tap in a tempo from a MIDI controller or from the computer keyboard using the T key.
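The tap-tempo idea is simple to sketch: average the time between taps and convert to beats per minute. A minimal Python illustration (an assumption about the general approach, not Avid's actual implementation):

```python
def tap_tempo(tap_times):
    """Estimate BPM from a list of tap timestamps (in seconds).

    Averages the intervals between consecutive taps, the way a
    DAW's tap-tempo field smooths out your key presses.
    """
    if len(tap_times) < 2:
        raise ValueError("need at least two taps")
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    avg = sum(intervals) / len(intervals)
    return 60.0 / avg

# Taps half a second apart correspond to 120 BPM:
bpm = tap_tempo([0.0, 0.5, 1.0, 1.5])   # 120.0
```

Averaging several intervals rather than using only the last one is what makes tapped tempos feel stable even when your timing wobbles slightly.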
Halim El-Dabh
An Egyptian sound artist who was a pioneer of electronic sound, known for his musique concrète and electroacoustic work. His 'Wire Recorded Piece' (1944) is considered the earliest piece of recorded tape music.
After reading the article on exhibition experiences in late twentieth-century Germany, I came to ask myself what works better for the art: including touch or not? Germany's history of strict dictatorship is well known, extending to only years before this article appeared. Towards the end of the century Germany took new steps towards freedom and began absorbing new waves of contemporary art into its lifestyle. Music became more experimental, and life found a new sense of freedom. However, with the Berlin Wall standing until 1989, Berlin still held a strict divide within society, highlighting the lasting presence and control of the state. The debate over whether viewers were allowed to touch the art tells me that West Germany still held strong control over society's actions. During Sehen und Hören at the Josef-Haubrich-Kunsthalle, Cologne, in 1974, the work was presented in a clinically white room with sealed windows, allowing no exposure to the outside world. The article states that this "allegedly allows optimal concentration on their perception". To me it feels as if the viewers were being forced to view the art in a certain way. Art is ambiguous and can sometimes be vague; people's perceptions are based on their positionality and life context. The art should speak a thousand languages and meanings, dependent on the viewer and what their life has entailed. Shutting out all other influence creates a strict and regimented environment, which contrasts with the idea of a new, free Germany. Its militant atmosphere mirrors the dictatorial past, allowing no flexibility during the exhibition. On the other hand, this could be viewed in another light: blocking out outside life during the exhibition could allow the viewer to forget existing oppression and the social ills of the outside world.
Shutting out existing life can almost create a sense of calm and focus, letting the viewer experience the exhibition to its full potential. To me, the contrast between the white walls and the art creates a sense of divide, mirroring Berlin's state at the time: the Berlin Wall and the split between East and West Germany. The art represents a sense of freedom away from strict rule, juxtaposed against the other side of the wall where rigid dictatorship still held sway. In this way the restrictions within the gallery may have had a greater effect, showing the viewer what they would have seen outside anyway; they do the work of the windows, making them unnecessary and almost turning it into an immersive experience, even if that was not the intention.
Thinking about my future work, I will decide whether or not to incorporate touch within my piece. As discussed earlier in my blog, I can clearly see the benefits of incorporating other senses to highlight the sound. However, I believe that which senses you include is entirely down to the theme and focus of the piece. I will go away, think about my upcoming performance, and decide whether or not to include this.