21 Days of Video Game Music – Day 1

I've had a little break from blogging for a while, but given the current situation I have a lot of time on my hands, so I've decided to do the Video Game Music Academy's "21 Days of Video Game Music" challenge!

I'll be writing a new piece of music every day for the next 21 days. Some of these will just be quick sketches or ideas, and some will be a bit more fleshed out or (hopefully) finished. I'll also write a short blog post here about each one, and if I use any novel or interesting (at least to me!) techniques I'll talk about them too. I'll be sharing them on Twitter as well, and you can follow the hashtag #21DaysofVGM to see what other folks taking part in the challenge are doing.

Here's today's piece:

Global Game Jam 2020

This weekend I went to my first ever Global Game Jam and I had a lovely time!
I had no team when I arrived, so I was put in a group with folks I hadn't met before. Considering we'd never worked together, we collaborated well and got a reasonably working demo of quite an ambitious idea finished. You can download and play the game we made here: https://globalgamejam.org/2020/games/airplane-graveyard-8


The theme this year was "repair", and we also tried to pick up a couple of the diversifiers, namely that the story was inspired by a lesser-known woman from history and that the real-world weather affected gameplay somehow. The basic narrative idea behind our game was that you are an airplane pilot whose plane has gone down over a mysterious archipelago, and you need to collect parts to repair your plane before a storm comes and washes you away. You can travel between the islands on a small boat. Some of the more complex mechanics (like the storm!) didn't make it into the game due to time constraints, but some of the ideas that we did get working were (in my opinion at least) quite interesting, notably that the gameplay is affected by the weather in the real world.

The islands are procedurally generated each time the map spawns, meaning they're different each time. We also decided not to include any original music in the game's overworld. I wrote a short theme for the menu screen, but other than that all of my work was sound design and a small amount of implementation. Inspired by Galaxy News Radio from Fallout 3, we decided that the music would come from a radio in the boat you travel around the islands on. I did my best to select music that felt appropriate to the time period (and storm- or plane-related), and tried to get some songs that were released in 1937 (the year of Amelia Earhart's disappearance, the inspiration behind our unnamed protagonist).

I picked four songs to play on the radio, and the programmers helped me get them to play back in a random order in Unity. Before this, Charlie (our writer) came up with the idea of news bulletins and weather forecasts which are affected or triggered by the weather somehow. He wrote four bulletins and we ran off to a side room to record them, along with eight wind reports/directions (north, north east, east, etc.). I then edited these into eight bulletins, each with a wind direction appended to the end. Jason (one of our programmers) then wrote some code which triggers each of these based on wind data collected from a weather API. Jason also used the wind data to create an in-game force which acts on the boat, pushing it away from whatever direction the wind is coming from.
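I haven't seen Jason's actual Unity code, but the wind logic can be sketched in a few lines. This is just my own illustration, in Python rather than C#, assuming the weather API returns the wind bearing in degrees (0 = north); the function names are made up for the example.

```python
import math

# The eight recorded wind-direction bulletins, clockwise from north.
DIRECTIONS = ["north", "north east", "east", "south east",
              "south", "south west", "west", "north west"]

def bulletin_for_wind(degrees):
    """Map a wind bearing (0-360, 0 = north) to the nearest of the
    eight recorded wind-direction bulletins."""
    index = round(degrees / 45) % 8
    return DIRECTIONS[index]

def wind_force(degrees, strength=1.0):
    """Return an (x, y) push on the boat, pointing away from the
    direction the wind is coming FROM."""
    radians = math.radians(degrees)
    # Wind from the north (0 degrees) pushes the boat south (negative y).
    return (-strength * math.sin(radians), -strength * math.cos(radians))
```

So a reading of 350 degrees would trigger the "north" bulletin, and the same bearing would feed the force that shoves the boat southwards.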

So, I took all these assets (the bulletins/weather reports and four songs) and processed them using a combination of iZotope's Trash 2 and some convolution reverb to make them sound like they were playing from a radio. I then handed the assets over to Jason, we hooked them up to his code in Unity, and it all worked! He also wrote some code to duck the music when the bulletins were playing; given more time I'd have liked to tweak this a little, as it didn't quite work as well as I'd hoped. I'm not super experienced with Unity, but I'm sure there's probably a simpler way of achieving this within the engine using internal bussing or something. We also simplified the mechanic by attaching the bulletins and weather reports together, but it would have been nice to keep these separate for more variety/flexibility at runtime.
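For anyone curious what "ducking" means here: while the bulletin plays, the music's gain is pulled down and then restored afterwards, ideally with a short fade so it isn't a hard jump. This isn't Jason's code, just a toy Python version of the idea operating on sample lists; the parameter values are invented.

```python
def duck(music, bulletin, duck_gain=0.25, fade=4):
    """Lower the music's gain while the bulletin track is non-silent,
    fading the gain change over `fade` samples instead of jumping."""
    out = []
    gain = 1.0
    step = (1.0 - duck_gain) / fade  # gain change per sample during a fade
    for m, b in zip(music, bulletin):
        target = duck_gain if b != 0 else 1.0
        if gain > target:
            gain = max(target, gain - step)
        elif gain < target:
            gain = min(target, gain + step)
        out.append(m * gain)
    return out
```

In a real engine you'd do this on a mixer bus (Unity's AudioMixer can duck one group based on another's level), which is presumably the "simpler way" I was imagining above.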

Aside from this, most of the assets/sound design were more straightforward but no less fun to create. I particularly enjoyed creating a loop of "plane being fixed" sounds for the engineer character on the island where the player spawns; I used various sources for this, but mostly a paper shredder, which sounded surprisingly good! I had a lovely time and will definitely go next year. Thanks to all of my team, especially Jason and Vikki for being patient with my endless questions about programming! I also had the chance to use Adam Croft's instant take suite in Reaper for the first time, and it is really awesome! It worked very well and was really simple to set up with key commands etc., so thanks Adam! I highly recommend it to all sound designers and editors using Reaper, especially if, like me, you're switching from Pro Tools and rely a lot on AudioSuite as part of your workflow.

First Game Jam!

Last week I went to my first game jam, Fuse Jam in Bristol, and had a great time! We made a little game where you float around on a boat and can raise and lower the water level by collecting pluses or minuses dotted around the world.

You can play or download the game that we made here: https://tidalloch.itch.io/tidal-loch

I made the music that plays in the background when you start, and quite a lot of the sound effects too; I collaborated on the audio with Starshine Audio.

It's the first time I've ever done any kind of audio work for games, and luckily folks were kind enough to help me with implementation in Unity and to explain some of the basics of how that works.

In terms of audio stuff, I created the music by making multiple seamlessly looping variations of a simple piano loop (made using Kontakt, Tritik's Krush, which is free and awesome, and iZotope's DDLY), and then telling Unity to play them in a random order.

The idea was to create the impression of the music being generative, without the serious work involved in doing that properly! I was pretty happy with the result.
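The trick, in other words, is just shuffling between pre-rendered variations, ideally never repeating the one that just played so it doesn't sound like a stuck loop. The Unity side was handled by others; here's the idea as a small Python sketch (the function is my own, not what actually shipped).

```python
import random

def loop_order(variations, length, rng=random):
    """Build a playlist of `length` loop variations chosen at random,
    never playing the same variation twice in a row."""
    playlist = []
    previous = None
    for _ in range(length):
        # Choose from every variation except the one we just played.
        choice = rng.choice([v for v in variations if v != previous])
        playlist.append(choice)
        previous = choice
    return playlist
```

With even three or four variations, listeners tend not to notice the repetition for quite a while, which is why this cheap approach passes for generative music.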

In terms of sound design I kept things pretty simple, due to time constraints and because my goal for the jam was more to meet people and learn a bit of Unity/implementation stuff than to improve my design chops.

I used Collected Transients' flusher library quite a lot though, especially for the water drain/fill sounds. Thanks to some of the folks at the jam I've now discovered Bitsy, so the next mission is to make a little game using that and then learn how to make audio work within it!

Learning Granular Synthesis Pt 1

In the last couple of weeks on my MA course, we've been learning about editing audio using phase-vocoder-based technologies and granular synthesis. This is, frankly, all a bit technical and mind-bending for me, but I think I'm starting to get my head around it, and I've been playing around with editing some of my field recordings to create ambient/drone pieces.

On the 2nd of November I was out and about around the Harbourside in Bristol making some field recordings.


Field Recording on the pontoon beneath Prince’s Street Bridge

I borrowed some kit from the university's asset store (Sound Devices 552, Rode NT4) and made some basic stereo recordings from various points around the Harbourside.

Here’s one of those recordings, after normalisation and trimming in Audacity.

As you can hear in the above clip, there's some pretty prominent violin playing from a guy to my right at the start (he was pretty good!).

I decided that it might be fun to try using some spectral editing to remove the partials that weren't his violin, with the aim of making his playing more prominent, hopefully without completely trashing the original sound in the process.

I used SPEAR to do this, which is pretty fun to play around with and free! Here’s the result.

As you can hear, I (somewhat) succeeded in making the violin more prominent; however, I also turned the rest of the recording into a strange, banshee-like sine wave fest. Not exactly what I was aiming for, but since SPEAR re-synthesises the sound from its partials, it's kind of impossible to avoid this if you remove lots of the quieter ones. At least it is for me; more practice and time with it will help, I'm sure.
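Why the sine wave fest? SPEAR models the sound as a set of sinusoidal partials and resynthesises by summing sines, so once the quieter partials are gone, only a handful of pure tones remain. A toy illustration (my own simplification in Python, not SPEAR's actual engine, which tracks partials over time):

```python
import math

def resynthesize(partials, n_samples, sample_rate=44100, threshold=0.0):
    """Sum a sine wave for each (freq_hz, amplitude) partial whose
    amplitude exceeds `threshold`. Dropping quiet partials leaves
    only a few pure tones, hence the 'sine wave fest'."""
    kept = [(f, a) for f, a in partials if a > threshold]
    out = []
    for n in range(n_samples):
        t = n / sample_rate
        out.append(sum(a * math.sin(2 * math.pi * f * t) for f, a in kept))
    return out, kept
```

Raise the threshold and the dense, noisy background (which additive models represent as many quiet partials) simply vanishes, leaving the exposed whistling tones I ended up with.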

Anyway, I then took this file and fed it into MacPOD (another cool piece of free software) and did some granular synthesis, mostly using snippets from the first 10 seconds or so where the violin is playing. Here's the result of that.
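The core granular idea is simple even if the results sound exotic: chop short windowed "grains" out of a source region and scatter them across an output buffer, overlapping as they land. This is a minimal sketch of that technique in Python (not MacPOD's actual algorithm, and the parameter values are just illustrative):

```python
import math
import random

def granulate(source, out_len, grain_len=2048, density=50, rng=random):
    """Scatter Hann-windowed grains read from random positions in
    `source` across an output buffer of `out_len` samples,
    summing where grains overlap (simple overlap-add)."""
    out = [0.0] * out_len
    # Hann window: fades each grain in and out to avoid clicks.
    window = [0.5 - 0.5 * math.cos(2 * math.pi * i / (grain_len - 1))
              for i in range(grain_len)]
    for _ in range(density):
        src = rng.randrange(len(source) - grain_len)  # where to read the grain
        dst = rng.randrange(out_len - grain_len)      # where it lands in the output
        for i in range(grain_len):
            out[dst + i] += source[src + i] * window[i]
    return out
```

Restricting `src` to the first 10 seconds of the file would give something like what I did here: a cloud built from tiny shuffled fragments of the violin.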

I'm pretty happy with this, as it's both substantially more listenable (to my ears at least) than the re-synthesis and far removed from the original field recording.

I'm currently playing around with Max, building a very basic granular synth patch. In part 2 I'll talk about that, share the patch, and hopefully some interesting sounds I've made with it!