- Making the algorithmic music for Uurnog - Nifflas
- Background
- I've always felt the game industry got the ability to play back entire recordings of musical pieces too early. Just as things were getting really interesting with music systems like iMUSE that could do incredible seamless transitions for games like Monkey Island 2 and Day of the Tentacle, music software that triggers the individual notes from within the game was beginning to be phased out.
- Today, the audio quality of game music is amazing, but it rarely adapts to the gameplay much more than crossfading between a set of layers. There are of course exceptions, such as FRACT OSC, Dyad, Panoramical and many recent Mario games.
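- The layer-crossfading approach mentioned above can be sketched roughly as a set of per-layer gain ramps driven by a single gameplay "intensity" value. This is a minimal illustration of the general technique, not any specific game's implementation; the layer names and the linear ramp are my own assumptions:

```python
# Sketch of layer-based adaptive music: each layer fades in over its
# own intensity window, so a gameplay "intensity" value (0.0 to 1.0)
# smoothly controls which layers are audible.

def layer_gain(intensity, fade_start, fade_end):
    """Linear gain ramp: 0.0 below fade_start, 1.0 above fade_end."""
    if intensity >= fade_end:
        return 1.0
    if intensity <= fade_start:
        return 0.0
    return (intensity - fade_start) / (fade_end - fade_start)

# Hypothetical layers: drums always on, bass fades in at mid
# intensity, lead only at high intensity.
LAYERS = {
    "drums": (0.0, 0.0),
    "bass":  (0.3, 0.6),
    "lead":  (0.7, 0.9),
}

def mix_gains(intensity):
    return {name: layer_gain(intensity, lo, hi)
            for name, (lo, hi) in LAYERS.items()}
```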
- Before making games, I spent years learning to compose music. Throughout most of my game developer career, however, I've not been on a development platform that allowed me to experiment with music as much as I wanted. All that changed when I switched to Unity and realized I finally had the tools I needed to create my own music software and integrate it with my games. Uurnog is my first release where all the in-game music is composed with my own software. The software is fully algorithmic, which means I don't compose by placing individual notes. Instead, various algorithms are configured through an inspector-like interface, and any field in this inspector can take input from what happens in the game.
- With this software, I hope to highlight some possibilities with dynamic music and maybe inspire more people to experiment with it.
- Decomposing Music
- The language in which we express music can limit the way we think about it. I wanted to approach music as a possibility space, much like the way I think about game and level design. The language in which music is usually expressed, however, is based on the idea that everything is static: notes, chords and keys have absolute names. Since I knew the "when", the "what" and the "how many" were all going to be dynamic and often random in my software, I had to throw away most of the words that usually describe music and look at everything from another angle.
- My substitute for a note became an "index", which is a floating point value. When an algorithm has picked out an index to play, the index is evaluated through a "scale" algorithm, which converts it into the final pitch. The scale objects often represent chords or entire scales, but sometimes go into experimental territory, like picking frequencies from the harmonic series.
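- As a rough illustration of the index idea (the class structure and the MIDI-style pitch numbering are my own assumptions; the article doesn't describe the actual implementation), a scale object might map a floating point index onto a pitch like this:

```python
# Sketch: an "index" is a float picked by a pattern algorithm; a
# "scale" object converts it into a final pitch. Here the scale is a
# repeating set of semitone offsets above a root note (MIDI numbering).

class Scale:
    def __init__(self, root, degrees):
        self.root = root          # e.g. MIDI note 60 = middle C
        self.degrees = degrees    # semitone offsets within one octave

    def pitch(self, index):
        """Convert a (possibly fractional) index into a pitch.
        Whole indices walk the scale degrees, wrapping into higher
        octaves; the fractional part is ignored in this sketch."""
        octave, degree = divmod(int(index), len(self.degrees))
        return self.root + 12 * octave + self.degrees[degree]

c_major = Scale(60, [0, 2, 4, 5, 7, 9, 11])
# index 0 -> 60 (C), index 2 -> 64 (E), index 7 -> 72 (C, octave up)
```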
- Because scales are defined separately from the indices, the software can swap out the key or chord of the currently playing music pattern in real time. Things like transposing the music on the fly, or even adding a new chord progression, become trivial, since nothing other than the scale objects needs to be manipulated.
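- A small sketch of that separation (again a hypothetical structure of my own, not Nifflas's actual code): swapping the scale object re-harmonizes a pattern without touching a single index.

```python
# Sketch: because indices are separate from scales, re-harmonizing a
# pattern means swapping the scale object; the index pattern itself
# is never edited.

def make_scale(root, degrees):
    """Return a function mapping a whole-number index to a MIDI pitch."""
    def pitch(index):
        octave, degree = divmod(int(index), len(degrees))
        return root + 12 * octave + degrees[degree]
    return pitch

pattern = [0, 2, 4, 2, 7, 4, 2, 0]   # the indices stay fixed

c_major = make_scale(60, [0, 2, 4, 5, 7, 9, 11])
a_minor = make_scale(57, [0, 2, 3, 5, 7, 8, 10])

# The same index pattern, played through two different scale objects:
in_c = [c_major(i) for i in pattern]
in_a = [a_minor(i) for i in pattern]  # instant key change, no note edits
```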
- What's Next
- Uurnog only features a first prototype of the music software, and currently it's entirely based on triggering individual notes through PlayScheduled. Along the way I learned about many things I could do much better, and I also need to pick up some C++ so that I can start using Unity's native audio plugin SDK and build my own synthesizers into the software.
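- Unity's AudioSource.PlayScheduled takes an absolute DSP time rather than playing immediately, so the core of a note scheduler built on it is converting beat positions into clock times slightly ahead of "now". A language-neutral sketch of that arithmetic (the function shape and lookahead value are my own assumptions):

```python
# Sketch of PlayScheduled-style timing: convert beat offsets into
# absolute DSP times, scheduled a little ahead of the current audio
# clock so triggering stays sample-accurate.

def schedule_times(dsp_now, bpm, beats, lookahead=0.1):
    """Return (beat, absolute_time) pairs for each beat offset,
    starting `lookahead` seconds after the current DSP time."""
    seconds_per_beat = 60.0 / bpm
    start = dsp_now + lookahead
    return [(b, start + b * seconds_per_beat) for b in beats]

# Four sixteenth-note hits at 120 BPM, from a made-up clock reading.
times = schedule_times(dsp_now=10.0, bpm=120, beats=[0, 0.25, 0.5, 0.75])
```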
- Developing the tool was way too much fun for me to stop now, and I may even take a break from making games and see if I can do music for somebody else's Unity game.