08/06/17 – More Practical Work

Since last time

I have finished working through the Wwise 201 certification, which is aimed at understanding interactive music methods. I thoroughly enjoyed this course and found it challenging and rewarding in equal measure. There were some elements which felt superfluous to my interests (making a sampled MIDI instrument, for instance), but it's good to put a pin in even those things for future reference.

Integrating Wwise with Unity

Through the course, you build up the musical interactions on a level of the Cube demo; however, towards the end I began building my own simple scene in Unity to better understand the integration of Wwise events and their implementation through code. Much like FMOD, Wwise integrates into Unity with its own set of objects and C# commands. The objects seem more powerful than FMOD's, though I haven't used FMOD in a while now, and admittedly the last time I updated it there were new objects which had been introduced for controlling parameters without code. The Wwise objects are similar: there are objects which load soundbanks and play events (on loading or at other specified timings), as well as objects which exist just to change switches, which is very useful. While it's possible to tailor Wwise events purely to change switches, it's nice to have a separate object which can do that as well.
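For my own reference, those components boil down to a handful of engine calls which can also be made directly from a script. A minimal sketch, assuming the Wwise Unity integration is installed (the bank, event and switch names here are placeholders rather than the ones from my project):

    using UnityEngine;

    public class WwiseBasics : MonoBehaviour
    {
        void Start()
        {
            // Load a soundbank by name (what an AkBank component does).
            AkBankManager.LoadBank("Main", false, false);

            // Post an event on this GameObject (what an AkEvent component does).
            AkSoundEngine.PostEvent("Play_Music", gameObject);

            // Set a switch: switch group first, then the switch within it
            // (what an AkSwitch component does).
            AkSoundEngine.SetSwitch("Music", "Intro", gameObject);
        }
    }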

Interactive music workflow

For the Unity scene I was building, I created a simple interactive music workflow in Wwise. It consists of a switch container called Music which contains five playlists: Intro, Pre-Verse, Verse, Verse End, and Bridge. Each of these playlist objects is assigned to a switch of the same name, with the exception of Verse End, which I'll explain shortly. Each playlist object contains two or three music segments, plays through each in turn, and loops on the final one in the list. So, for example, when the Pre-Verse section is triggered, it plays a first music segment containing 2 bars of harp line and piano without fill once, then goes on to a 2-bar segment of harp and piano with fill and loops there. When the switch is changed, say from Pre-Verse to Verse, the Verse segment begins at the exit cue (the end of the bar), along with any musical tails in the post-exit-cue section of the Pre-Verse segment, and adds a lead cello line; it totals 8 bars in length and loops indefinitely. If the switch is changed to Bridge (it can be changed to anything and will transition at the exit cue of the playing segment; the containers are structured in such a way that this will never seem jarring), a transition rule plays the Verse End segment to link the two sections, and then the Bridge section, which ends on a looped Post-Bridge music segment, similar to the Pre-Verse but with slightly different instrumentation.
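To be clear about what drives all this at runtime: the game only ever changes the switch, and Wwise resolves the exit cues and transition rules itself. A minimal sketch, assuming the switch group shares the container's Music name (the class name and musicObject field are illustrative assumptions):

    using UnityEngine;

    public class MusicStateDriver : MonoBehaviour
    {
        // Whatever GameObject the Music event was posted on.
        public GameObject musicObject;

        public void GoToVerse()
        {
            // Queues a transition which resolves at the exit cue
            // of the currently playing segment.
            AkSoundEngine.SetSwitch("Music", "Verse", musicObject);
        }

        public void GoToBridge()
        {
            // The transition rule described above inserts the Verse End
            // linking segment before the Bridge plays.
            AkSoundEngine.SetSwitch("Music", "Bridge", musicObject);
        }
    }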

This workflow was constructed using a track I had already written, and while there is a certain modularity to it, it was not written to be used as interactive music, so it is not quite as flexible as it could be. That being said, it's a fine piece for proof-of-concept work.

Putting work in Unity

Next on the to-do list was integrating the music and transitions within Unity. There were a few ways I could approach this. As detailed above, Wwise events can be used to change switches, but there are also bespoke switch-changing Wwise objects in Unity. Initially I created an event in Wwise called Game_Start which set the switch to Intro and then played the Music switch container, plus a new event for each switch. I then mounted Game_Start onto the player via an AkEvent object, and loaded the soundbank with an AkBank object. I then created three 3D objects (black monoliths) which were set as triggers, mounted them with the switch-changing AkEvents, and set them to respond to AkTriggerEnter, meaning that when the player passed through them (I'm still a little hazy on whether it's the Game_Start AkEvent or the AkBank on the player which does the actual triggering…) the switch would be changed. After testing, I changed this to retain only one event (Game_Start) and use the AkSwitch objects on the monoliths to change the switches instead, as this removed an unnecessary element (the switch events).
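The scripted equivalent of those trigger monoliths is only a few lines. A sketch of what the AkTriggerEnter-plus-AkSwitch combination amounts to (the class name and the "Player" tag are assumptions; the Music switch group is from my project):

    using UnityEngine;

    public class SwitchMonolith : MonoBehaviour
    {
        // Set per monolith in the Inspector, e.g. "Pre-Verse", "Verse", "Bridge".
        public string targetSwitch = "Verse";

        void OnTriggerEnter(Collider other)
        {
            // Assumes the player GameObject is tagged "Player".
            if (other.CompareTag("Player"))
                AkSoundEngine.SetSwitch("Music", targetSwitch, other.gameObject);
        }
    }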

Once this basic integration was up and working, I decided to work on changing RTPCs (real-time parameter controls) through the game world, first through pre-defined parameters, then through code.

RTPC control in Unity

Something unique to objects in the Interactive Music Hierarchy, and absent from the Actor-Mixer Hierarchy, is a Playback Speed control in the general settings of objects. I don't fully understand why this isn't present in the Actor-Mixer Hierarchy, and can envisage myself putting sound effects in the Music Hierarchy just to take advantage of it. But that is by the by. For this integration experiment, I decided that this would be a good variable to control, because there would be very clear feedback, and because I want to work somewhere along these lines for my final project (more on that later…).

I created a game parameter in Wwise called SlowFast, tied it to the built-in Distance parameter, and then in the RTPC tab of the Music switch container I created an RTPC with Playback Speed along the Y axis and SlowFast along the X. I then moved the first point of the RTPC's envelope to 0.25 on the Y and 0 on the X, and the second to 1.75 on the Y and 100 on the X (the picture below is from the code version, where I dispensed with distance as a parameter and used a simple 1–10 scale).

RTPC
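Assuming a straight line between those two points, the curve is just a linear interpolation, which works out as follows (a hypothetical helper, not code from the project):

    using UnityEngine;

    public static class RtpcCurve
    {
        // SlowFast 0..100 maps linearly onto Playback Speed 0.25..1.75,
        // matching the two curve points described above.
        public static float PlaybackSpeedFor(float slowFast)
        {
            return Mathf.Lerp(0.25f, 1.75f, slowFast / 100f);
        }
    }
    // e.g. PlaybackSpeedFor(50f) == 1.0f, i.e. normal speed at the midpoint.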

This meant that the further the listener was from the object controlling SlowFast, the faster the music would play, and the closer the listener got, the slower it would become. In Unity I found I had to mount the Game_Start event on an object in order for the distance parameter to work (I used a sphere with a scale of 200 to represent the limits of the parameter, given it has a radius of 100 units). And work it did; however, I realised that although I was successfully changing the parameter and therefore the playback speed of the music, I was no longer able to trigger the switch objects. Presumably this is because they are triggered by the Game_Start event, which was no longer on the player.
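Setting the parameter by hand makes it clearer what the built-in Distance parameter was doing for me. A sketch of the manual version (the class name and Transform fields are assumptions):

    using UnityEngine;

    public class DistanceToSlowFast : MonoBehaviour
    {
        public Transform listener;   // e.g. the player or camera
        public Transform emitter;    // the sphere carrying the music event

        void Update()
        {
            // Distance in Unity units, capped at the sphere's 100-unit radius.
            float d = Mathf.Min(
                Vector3.Distance(listener.position, emitter.position), 100f);
            AkSoundEngine.SetRTPCValue("SlowFast", d);
        }
    }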

While I was satisfied that I had successfully controlled an RTPC in the game, I was disappointed that in its current state the music transitions no longer functioned, so I set about implementing RTPC control through code in order to achieve both simultaneously (I recognise that I should have spent longer trying to achieve both behaviours just with game objects, but I suspect the problem is that I am relying on just one Wwise event).

RTPC control through C# code

C# code integration for Wwise seems… simpler than FMOD's. I'm sure that's a generalisation and, again, I haven't worked with FMOD for some time so things may have changed, but on a line-for-line basis there seems to be far less to do with Wwise integration. It's possible to load both soundbanks and events in C#, but that wasn't what I was exploring. Instead, I started writing a script, based on these tutorials, where if the player held down one key (K) the music would slow, and if they held down another (L) it would speed up. Changing an RTPC value in code is as simple as calling AkSoundEngine.SetRTPCValue with the name of the game parameter in Wwise followed by the value you wish to set it to. In the script I created a float called playbackSpeed and a public float called Speed, then wrote two if statements (one for K, one for L) which, while the key is held, apply playbackSpeed += or -= Speed * Time.deltaTime (depending on the key), followed by AkSoundEngine.SetRTPCValue("SlowFast", playbackSpeed); and presto, it worked. The player could still trigger transitions in the music by passing through the triggers, and could control playback speed with K and L. To add a little more visual feedback, I took the sphere I had used earlier, reduced its scale to 15, and gave it a simple orbit script, making it orbit around the central monolith like some black sun. I then added the above if statements to the orbit script, so that the speed of the orbit would be tied to the playback speed of the music. Simple but effective.
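Reconstructed from the description above, the control script looks roughly like this (the starting value and the clamp to a 1–10 range are assumptions based on the scale mentioned earlier):

    using UnityEngine;

    public class PlaybackSpeedControl : MonoBehaviour
    {
        public float Speed = 2f;           // rate of change per second, set in the Inspector
        private float playbackSpeed = 5f;  // current value on the 1-10 SlowFast scale

        void Update()
        {
            // Hold K to slow the music down, L to speed it up.
            if (Input.GetKey(KeyCode.K))
                playbackSpeed -= Speed * Time.deltaTime;
            if (Input.GetKey(KeyCode.L))
                playbackSpeed += Speed * Time.deltaTime;

            playbackSpeed = Mathf.Clamp(playbackSpeed, 1f, 10f);
            AkSoundEngine.SetRTPCValue("SlowFast", playbackSpeed);
        }
    }

And the orbit script with the same if statements folded in, so the sphere's orbit rate tracks the music (the centre Transform and base orbit speed are also assumptions):

    using UnityEngine;

    public class OrbitWithMusic : MonoBehaviour
    {
        public Transform centre;            // the central monolith
        public float baseOrbitSpeed = 20f;  // degrees per second at normal speed
        public float Speed = 2f;
        private float playbackSpeed = 5f;

        void Update()
        {
            if (Input.GetKey(KeyCode.K)) playbackSpeed -= Speed * Time.deltaTime;
            if (Input.GetKey(KeyCode.L)) playbackSpeed += Speed * Time.deltaTime;
            playbackSpeed = Mathf.Clamp(playbackSpeed, 1f, 10f);

            // Orbit the central monolith at a rate tied to the playback speed.
            transform.RotateAround(centre.position, Vector3.up,
                baseOrbitSpeed * playbackSpeed * Time.deltaTime);
        }
    }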

Since this was written, I have added a cymbal stinger to the transitions from the Intro and Pre-Verse sections to the Bridge, as these don't use the 2-bar Post-Verse section as a transition (the main Verse does). This is a nice little additional behaviour which adds just a tiny bit of difference, but enough to demonstrate the seed of something greater. The final iteration of the above implementation can be seen in the video below:

Here is a short audio recording of the parameter being controlled:

Technical understanding before the thesis

As detailed above and in previous posts, I have been working hard to learn Wwise and understand its integration in Unity. I know I want to build my final project with these tools, and have some rough ideas about what I want to explore with it (time manipulation, subversion of the expectations of the arcade genre); however, as Owen pointed out, I am still lacking a sound-focused question. Now that I have finished the Wwise certifications and have made a proof-of-concept Unity scene covering both interactive music integration and RTPC control through code, it is time to develop a sound-focused thesis which can tie together the work I have done so far. I have spoken to Sandy, and he says the Space Invaders-with-a-troubling-narrative idea I came up with would be easy to implement; however, it should remain a back-up until I have a thesis which either ties directly into it, or which lets me develop another game utilising the tools I have gathered thus far. Let the thinking begin.

Post-script – One final alteration to the scene

It occurred to me, while playing around with the hierarchy structure in Wwise, that I could introduce further chaos by assigning each individual music container a time-stretch effect, controlled by SlowFast, but with a hand-drawn, spiky envelope for each, speeding up and slowing down the audio at seemingly random intervals, as seen here:

The result is a curious one, where individual instruments speed up while others slow down, resulting in some loops finishing before others. It is chaotic, but also interesting. It is demonstrated in the Unity scene here: