Dynamic Music using Sequenced Sound Modules

Mitchell Boot (500767952) – Gameplay Engineering

Table of contents

  1. Introduction
  2. Goals and Scope
  3. Implementing SharpMik
  4. Event areas
  5. Rhythm Mechanics
  6. Conclusion
  7. Sources

Introduction

Many games these days make use of some form of dynamic music to fill out their soundscape. Usually this works by rendering multiple versions of the same song and dynamically switching and fading between them to create a piece of music that evolves with the gameplay. For this project I wanted to create a system like this, but using more old-school sound technology: tracker modules. These modules are made up of pitch instructions and sample data that are played back in real time to create the music, in contrast to the pre-rendered recordings that are mainly used today. Because the music is generated in real time, I can easily manipulate the playback of these files and create music that reacts to the environment and vice versa.

Goals and Scope

My goal for this project is to implement tracker module support in Unity and give developers tools to manipulate the playback of the music at runtime. This dynamic music is controlled through an AoE (area of effect) system, where different playback parameters are adjusted when the player enters and leaves certain areas. Later in the project I decided to increase this scope to also include the creation of rhythm-based mechanics. For this I took inspiration from the JRPG Mother 3 and had the rhythm mechanics take place in a turn-based battle system. All in all, the specific goals I have set for the project are:

  • Add real time module playback to Unity
  • Create an AOE system that changes how the module is played back when a given object enters it. The following elements can be altered by these events:
    • Loading up new music
    • Muting & Unmuting of channels
    • Changing the playback volume of channels
    • Changing the playback tempo
      • Tempo change should have a smoothing option
  • Create a system that can read out the currently playing note data and send it over to a given script. The note data that’s being sent over should contain:
    • The current song position
    • The currently playing instrument
    • The pitch that’s being played
    • The channel the instrument is being played in

Implementing SharpMik

For this project I’ve decided to make use of an open source C# module player called SharpMik. We can then hook this up to Unity’s AudioSource component to play back the sound data.
When playing back a module file, SharpMik returns the audio in the form of PCM data. PCM (pulse-code modulation) is a way of representing audio as a stream of amplitude samples; in Unity these samples are floats ranging between -1f and 1f. Each float represents the position the speaker cone needs to be at at that point in time; by playing these samples back in quick succession, the speaker vibrates at a very high frequency and produces the desired sound.
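To make this concrete, here is a small sketch (not part of the project code) that generates one second of a 440 Hz sine wave as PCM floats, the same representation SharpMik hands back:

```csharp
using UnityEngine;

// Sketch: generate one second of a 440 Hz sine wave as PCM floats.
float[] GenerateSine(int sampleRate = 44100, float frequency = 440f)
{
    float[] pcm = new float[sampleRate];
    for (int i = 0; i < pcm.Length; i++)
    {
        // each float is a speaker position between -1f and 1f
        pcm[i] = Mathf.Sin(2f * Mathf.PI * frequency * i / sampleRate);
    }
    return pcm;
}
```

Plotting such an array as a waveform gives exactly the kind of picture shown in Figure 1.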

Figure 1: PCM data of a sine wave represented in Audacity as a waveform

We can use this PCM audio data from SharpMik to play back the modules in Unity. Here we make use of OnAudioFilterRead(): Unity calls this function on the audio thread with a buffer of floats, and whatever we write into that buffer is played back through the attached AudioSource component.

void OnAudioFilterRead(float[] dataIO, int channelsRequired)
{
    if (player == null || driver == null)
        return;
 
    driver.Mix32f(dataIO, channelsRequired, finalGain);
}
 
public virtual void Mix32f(float[] dataIO, int channelsRequired, float gain)
{
    // We force 16-bit mixing, so we're working in 2-byte chunks.
    uint outLen = (uint)dataIO.Length * 2;
    float normalize = gain / 32768f;
    if (playerBuffer.Length < outLen)
        playerBuffer = new sbyte[outLen];
 
    uint inLen = WriteBytes(playerBuffer, outLen);
    for (uint w = 0, r = 0; r < inLen; ++w, r += 2)
    {
        dataIO[w] = ((playerBuffer[r] & 0xff) | (playerBuffer[r + 1] << 8)) * normalize;
    }
}

The dataIO array is what is ultimately played back in Unity. It is filled in the Mix32f function, which retrieves the mixed audio from SharpMik through the WriteBytes() function. Each pair of bytes is reassembled into a signed 16-bit sample, which is then multiplied by the desired playback volume (the gain variable) and divided by the 16-bit number range (32768). This ensures the PCM data ends up within the required -1f to 1f range; for example, a 16-bit sample value of 16384 normalizes to 0.5 at a gain of 1.

Lastly, this is all called from a ModAudioManager object. I implemented this using the Singleton pattern to make sure that only one instance of the audio manager can be active at a time. This also makes it possible to call functions on the manager from other objects without needing to store a reference to the ModAudioManager object and call GetComponent<>().

While some consider singletons bad practice, I found it appropriate here to give all objects access to the currently playing audio data, since I’d like objects such as NPCs, game managers and background objects to use it for things like rhythm mechanics and playing animations to the rhythm of the music.
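For reference, a minimal sketch of how such a singleton MonoBehaviour can be set up (the real ModAudioManager contains much more than this):

```csharp
using UnityEngine;

public class ModAudioManager : MonoBehaviour
{
    // globally accessible instance, e.g. ModAudioManager.Instance.SomeFunction()
    public static ModAudioManager Instance { get; private set; }

    void Awake()
    {
        if (Instance != null && Instance != this)
        {
            // a manager already exists, discard this duplicate
            Destroy(gameObject);
            return;
        }
        Instance = this;
    }
}
```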

Event Areas

With the SharpMik player set up we can properly play back our module files, so we can now use the player to adjust how the data is played back. Since I wanted a position-based system, I decided to make use of compound colliders in Unity: a Rigidbody component is attached to a parent object, and child objects hold the actual Collider components. When one of these colliders is triggered, the OnTriggerEnter() function on the parent object is called.
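As a sketch, such a compound trigger area could be built in code like this (names are illustrative; in the project the hierarchy is set up in the editor instead):

```csharp
using UnityEngine;

public static class EventAreaBuilder
{
    public static GameObject Build()
    {
        // parent object: holds the Rigidbody and receives the trigger callbacks
        var area = new GameObject("ModEventArea");
        var body = area.AddComponent<Rigidbody>();
        body.isKinematic = true; // trigger area only, no physics simulation

        // child object: holds the actual collider shape
        var hitbox = new GameObject("Hitbox");
        hitbox.transform.SetParent(area.transform);
        var box = hitbox.AddComponent<BoxCollider>();
        box.isTrigger = true;

        return area;
    }
}
```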

Figure 2: Initial settings of the ModEventArea object

When the player enters one of the hitboxes the OnTriggerEnter() function of the ModEventHandler script will be called. By adjusting these settings you can queue up new module files, mute/unmute any desired channels and adjust the tempo.

private void OnTriggerEnter(Collider other)
{
    //check if the collider has the selected layermask
    if ((target.value & (1 << other.gameObject.layer)) != 0)
    {
        //load in a new module when available
        if (moduleAsset != null) LoadModule();
 
        //destroy gameobject after loading the module when flag is set
        if (destroyOnTrigger) Destroy(gameObject);
 
        //muting and unmuting channels when there's entries in the array
        if (muteChannels.Length != 0) MuteChannels(muteChannels);
        if (unmuteChannels.Length != 0) UnmuteChannels(unmuteChannels);
 
        //change the tempo of the module when it's not set to 0
        if (tempo != 0) ChangeTempo(tempo);
    }
}
 
private void LoadModule()
{
    SharpModManager.Instance.ApplyModule(moduleAsset, doesLoop);
}

These functions ultimately call their respective counterparts on the ModAudioManager object, which handles the interaction between the Unity frontend and the SharpMik source code.
Using the OnTriggerExit() function it’s also possible to revert the muting/unmuting of channels and the tempo change when the corresponding flag is set in the inspector.
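A sketch of what that exit handler could look like (the revertOnExit and previousTempo fields are illustrative assumptions, not the exact names used in the project):

```csharp
private void OnTriggerExit(Collider other)
{
    // same layer-mask check as in OnTriggerEnter()
    if ((target.value & (1 << other.gameObject.layer)) != 0 && revertOnExit)
    {
        // swap the arrays to undo the muting/unmuting from OnTriggerEnter()
        if (muteChannels.Length != 0) UnmuteChannels(muteChannels);
        if (unmuteChannels.Length != 0) MuteChannels(unmuteChannels);

        // restore the tempo the module had before the area was entered
        if (tempo != 0) ChangeTempo(previousTempo);
    }
}
```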

Rhythm Mechanics

Now that I had a system where the music reacts to gameplay, I wanted to use the module player to have the game react to the music as well. For this I implemented the Observer pattern in the module player itself: functions from other objects and scripts can add themselves to a list of subscribers through the ModAudioManager. On every tick in which an instrument is playing, each subscribed function is called with a class holding the data for that tick.

private static List<Action<ModTickData>> subscribers = new List<Action<ModTickData>>();

public static void SubscribeTickUpdate(Action<ModTickData> action)
{
    subscribers.Add(action);
}

private static void NotifySubscribers(List<ModNoteData> modNoteData)
{
    //create new tickdata object to send to all the subscribed functions
    ModTickData modTickData = new ModTickData
    {
        patternPosition = s_Module.patpos,
        songPosition = s_Module.sngpos,
        noteData = modNoteData
    };
 
    foreach (var subscriber in subscribers){
        subscriber(modTickData);
    }
}
namespace SharpMik.Unity
{
    public class ModTickData
    {
        public int patternPosition;
        public int songPosition;
        public List<ModNoteData> noteData;
    }
 
    public class ModNoteData
    {
        public int sample;
        public int channel;
        public int pitch;
    }
}

The ModNoteData class is populated in the sample-handling section of the SharpMik ModPlayer script. With this we can trigger events whenever a specific sample is played, in a specific channel and at a specific pitch.
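As a rough sketch, the collection step inside the player could look like this (GetVoicePlayingSample and GetVoicePitch are hypothetical placeholders standing in for the real per-channel state lookups inside the SharpMik ModPlayer):

```csharp
// Hypothetical sketch of gathering the note data for the current tick.
List<ModNoteData> notes = new List<ModNoteData>();
for (int channel = 0; channel < numChannels; channel++)
{
    int sample = GetVoicePlayingSample(channel); // placeholder lookup
    if (sample >= 0) // a sample was (re)triggered this tick
    {
        notes.Add(new ModNoteData
        {
            sample = sample,
            channel = channel,
            pitch = GetVoicePitch(channel) // placeholder lookup
        });
    }
}
NotifySubscribers(notes);
```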

Another possible approach was to update a public modNoteData object and have each object poll it for changes every frame. This is easier to implement, but as more and more objects made use of the data, I noticed execution time starting to suffer. It was also redundant to have multiple objects constantly grab the same data and check whether it had actually changed. That’s why I ended up using the observer pattern: the data is only sent to each object when it is updated, which is much more efficient.

void Start()
{
    SharpModManager.Instance.SubscribeTickUpdate(SwapMaterial);
}
 
private void Update()
{
    if(compareIndex != index){
        meshRenderer.material = materials[index];
        compareIndex = index;
    }
}
 
private void SwapMaterial(ModTickData tickData)
{
    foreach(var note in tickData.noteData)
    {
        if (note.sample == reactToSample)
        {
            index++;
 
            if (index == materials.Length) index = 0;
        }
    }
}
Figure 3: Demonstration of the above code set up to react to the snare drum

Lastly, I used this to create a rhythm system. By placing an empty sample in the module you can create flags that the game can read and process. When one of these flags is hit, a timing window starts within which the player should press space. An issue with this, however, is that it doesn’t account for players pressing spacebar slightly before the flag is actually reached, which results in a seemingly dropped input.

I had two ideas for addressing this issue. Initially I simply added a flag one tick before the actual beat and started counting from there. This turned out to be unreliable: you lose a lot of control over when the timing window starts, and the length of the window grows or shrinks drastically with the tempo of the song.
Instead, I decided to simply delay the check for early inputs: when you press slightly before the flag, the game doesn’t react immediately but waits until the flag arrives to decide whether the input counts. In practice this delay is barely noticeable.

//code if the spacebar is hit within the timing window
//Input.GetKeyDown(KeyCode.Space) checks for a late hit within the timing window
//beatFrame is a check for the early hit within the timing window
if (currHitWindow != 0 && (Input.GetKeyDown(KeyCode.Space) || beatFrame))
{
    currHitWindow = 0;
    startWindowCount = false;
 
    //do damage to enemy
    enemyUnit.TakeDamage(playerUnit.damage);
    //update slider
    float sliderAmount =  (float)enemyUnit.currHP / (float)enemyUnit.maxHP - 1;
    enemyHealth.offsetMax = new Vector2(sliderAmount, enemyHealth.offsetMin.y);
    //play hit sound
    audioSource.Play();
}
 
//start timing window when the spacebar is activated or the current tick has a flag
if (beatFrame || Input.GetKeyDown(KeyCode.Space))
{
    startWindowCount = true;
    beatFrame = false;
}
 
if (startWindowCount)
{
    currHitWindow++;
    if (currHitWindow > hitWindow)
    {
        currHitWindow = 0;
        startWindowCount = false;
    }
}
 
//player gets 4 chances to hit an attack
if (hitAmount == 4)
{
    battleState = BattleState.OPPONENT;
}

Conclusion

In the end I managed to build a dynamic music system that reacts to the player, and a system that has the game react to the music using the module player. I’m quite happy with what I’ve done for this project, since this is something I’ve been wanting to look deeper into for quite some time. Sadly I didn’t get as much done as I had hoped; some planned features that are still missing are:

  • Channel volume control in the event system
  • A finished RPG battle system, quite a few of the mechanics had been finished but it wasn’t yet in a presentable state
  • A transition between the overworld section and the battle system to create a proper gameplay prototype

Sources
