Asterism


[Spinning moon]

Full Moon Dev Blog #02

16 May 2022


Curating visuals and events to music in Unity


This month I've been thinking about the code structure of Asterism, and how I can work with Unity in the most flexible and painless way throughout the project. I started by trying to identify the key systems that will need designing. While I want to leave room for the project to evolve creatively, I also want to put tools and code in place as early as possible that will help me find limitations to work within. I need to know what is technically possible with my resources, so that I can focus on being creative inside that framework.

I made some notes in Miro on the questions that I need to answer, or at least look into, to help me focus my thoughts on what the game is. To do this, I need to spend some time experimenting with each of the questions.

[Notes on Miro - planning systems]

In Asterism I want to curate the visuals and mechanics to a series of songs. Alternating song sections will have different perspectives, environments, etc., similar to the way that many music videos have recurring themes and parallel stories running through the verses and choruses, perhaps with a narrative shift in a bridge or a new section of the song.

So I want some way of defining these sections and cutting between them exactly in time to the music. It would also be great if I were able to sync certain visuals to the beat.

Switching between song sections


When I worked on a curated musical gameplay scene in Before I Forget, I only had to worry about one audio track with four jump-cuts, so I didn't spend much time making a robust scene-switching system. Below are screenshots of how the scene and code are set up for this.

[Before I Forget hallway scene in Unity]

[Before I Forget hallway scene code]

I created a list of intervals in seconds between each jump-cut, and a countdown coroutine that restarts each time a jump-cut happens. When the countdown completes, the next jump-cut fires, various objects turn on / off, and bits of code update. This works fine in practice and stays roughly in time with the music, since WaitForSeconds measures elapsed game time rather than frames. That said, a coroutine only resumes on the first frame after its duration has elapsed, so each wait can overshoot by up to a frame, and those small errors accumulate across repeated cuts. It would be much better if these jump-cuts were coupled to the audio time itself, rather than to 'game time'.
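The shape of that setup was roughly as follows. This is a minimal sketch with made-up names, not the actual Before I Forget code:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Events;

// Hypothetical sketch of the countdown approach: a hand-authored list of
// intervals, with an event fired at each jump-cut.
public class JumpCutCountdown : MonoBehaviour
{
    [SerializeField] private float[] intervals = { 4.5f, 8f, 3.25f, 6f };
    [SerializeField] private UnityEvent onJumpCut; // toggles objects, updates code

    private IEnumerator Start()
    {
        foreach (float interval in intervals)
        {
            yield return new WaitForSeconds(interval);
            onJumpCut.Invoke();
        }
    }
}
```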

When I wrote the code for my game Beat Skipping, I used a similar system to control when the scene switching occurs. This one accumulates Unity's Time.deltaTime, which again measures elapsed game time regardless of framerate. However, it is still not directly coupled to the audio system, and could drift out of sync on different computers.

[Beat Skipping code]
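The gist of that timer was roughly this; again a sketch with made-up names rather than the actual Beat Skipping code:

```csharp
using UnityEngine;

// Hypothetical sketch of the deltaTime accumulator approach.
public class SectionTimer : MonoBehaviour
{
    [SerializeField] private float[] sectionLengths = { 12f, 16f, 8f };
    private float timer;
    private int sectionIndex;

    private void Update()
    {
        if (sectionIndex >= sectionLengths.Length) return;

        timer += Time.deltaTime; // accumulate elapsed game time each frame
        if (timer >= sectionLengths[sectionIndex])
        {
            timer -= sectionLengths[sectionIndex]; // carry overshoot forward
            sectionIndex++;
            Debug.Log($"Switching to section {sectionIndex}");
        }
    }
}
```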

Since writing the above code, I've learnt about a useful Unity property, AudioSettings.dspTime, which is based on the actual number of samples processed by the audio system, so it always gives an accurate position in time through an audio track.
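Here's a minimal sketch of timing against the audio clock with it. Scheduling playback with PlayScheduled (rather than calling Play) means the exact start time is known on the audio clock; the class and field names are my own:

```csharp
using UnityEngine;

// Minimal sketch: measure song position on the audio clock, not in frames.
public class DspClock : MonoBehaviour
{
    [SerializeField] private AudioSource music;
    private double songStartDsp;

    private void Start()
    {
        // Schedule playback slightly ahead so we know the exact start time
        // on the audio clock, rather than whenever Play() happens to run.
        songStartDsp = AudioSettings.dspTime + 0.1;
        music.PlayScheduled(songStartDsp);
    }

    // Seconds into the track, measured by samples processed, not frames.
    public double SongPosition => AudioSettings.dspTime - songStartDsp;
}
```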

Something I did in both of these examples, and will probably stick with for Asterism, is organising scene sections under parent GameObjects in the hierarchy that can simply be switched on / off at the right time. With this method I can also easily switch between the character controllers and cameras for each section by nesting them under the same parent object.
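The switching itself then reduces to enabling one parent at a time; a hypothetical sketch of that pattern:

```csharp
using UnityEngine;

// Each section root contains its own character controller, camera and
// visuals, so enabling one parent swaps everything at once.
public class SectionSwitcher : MonoBehaviour
{
    [SerializeField] private GameObject[] sectionRoots;

    public void SwitchToSection(int index)
    {
        for (int i = 0; i < sectionRoots.Length; i++)
            sectionRoots[i].SetActive(i == index);
    }
}
```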

Events on the beat


Last year I did some work on a system to handle beat management for a song. It takes a little manual setup, but once the data is there it works pretty well at making things happen on the beat, as well as at switching between song sections using dspTime as mentioned above. The BPM of the song is set at the top, and then I create a list of all the sections of the song where anything related to the rhythm differs. For example, here the BPM is 90, and what I've called the 'beat rate' sets the fraction of a beat that things should trigger on, following something like the drum pattern. I then have a set of actions that other scripts can hook into, which will always fire on the beat.
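As a rough illustration of how a manager like this can be driven from the audio clock, here is a simplified sketch. The names and structure are illustrative, not my actual Song Manager code, which is shown in the screenshots below:

```csharp
using System;
using UnityEngine;

// Illustrative sketch of a dspTime-driven beat manager.
public class BeatManager : MonoBehaviour
{
    [Serializable]
    public class SongSection
    {
        public float startTime; // seconds into the track
        public float beatRate;  // fraction of a beat to trigger on, e.g. 1 or 0.5
    }

    [SerializeField] private AudioSource music;
    [SerializeField] private float bpm = 90f;
    [SerializeField] private SongSection[] sections;

    public event Action OnBeat; // other scripts subscribe to this

    private double songStartDsp;
    private double nextBeatTime;

    private void Start()
    {
        // Schedule playback so the start time is exact on the audio clock.
        songStartDsp = AudioSettings.dspTime + 0.1;
        music.PlayScheduled(songStartDsp);
    }

    private void Update()
    {
        double songTime = AudioSettings.dspTime - songStartDsp;
        if (songTime < 0) return;

        // Use the beat rate of whichever section the playhead is in.
        float beatRate = 1f;
        foreach (var section in sections)
            if (songTime >= section.startTime) beatRate = section.beatRate;

        double beatLength = 60.0 / bpm * beatRate;
        while (songTime >= nextBeatTime)
        {
            nextBeatTime += beatLength;
            OnBeat?.Invoke();
        }
    }
}
```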

[Song Manager class in the Unity inspector]

[Song Manager code]

This is the data for the song 'Transmission', which has a series of slow beeps in its intro, followed by a bassline and finally a drum beat. I did some visual testing a while back using the Song Manager to time events. You can see the beat rate triggering the visuals in the video below, where the radar spins 360 degrees every 'beat'.



This is a really useful system that I will probably continue to use for any repeating events and visuals in the game.

Unity Timeline


You know that feeling when you find a tool or piece of software that not only does everything you need it to, but also makes your job more enjoyable and opens up new possibilities you had never imagined? OK, maybe that's just me, but I'm not exaggerating when I say that I think Unity Timeline is The One. The one tool I need that feels like it was made for Asterism. I've discovered that I work in a pretty visual way with music, and find it really helpful to organise my thoughts about a song alongside a view of the soundwave or shape of the music. A talk I saw by Anna Meredith about her process a couple of years ago was super inspiring to me with regard to this way of thinking. Here's a great clip of her talking about her music sketches.

I'd been getting frustrated with my Song Manager script: it gives me no graphical representation of where the sections fall within the track, and I knew how manual and inflexible it would be for the full project. While it does work (and will probably still be useful for 'on-the-beat' events), it's such a fiddly setup. Ideally, I wanted a way to look at the soundwave of the track and say "at this point in the song, trigger this event". Unity's animation tool has a system for this called Animation Events, but as the name suggests it is geared towards animations rather than audio. I briefly looked into middleware such as Wwise, thinking that I might be able to use custom cues to send event triggers over to Unity at points in the music. I didn't get far enough to find out whether this is possible, and I wasn't keen on using middleware anyway, since that method also felt fiddly.

What I really wanted was a Unity tool that lets you place events along a visual representation of an audio track, uses dspTime, and works alongside my Song Manager script should I still want to trigger events on the beat. Well, it turns out this tool has existed since 2018! Unity Timeline is essentially a video editor inside Unity, perfectly suited to curating cinematic cutscenes or, in my case, music videos. I also found this brilliant tutorial that covers the basics of Timeline, and an incredible interactive example of it in action. I spent a few hours playing around with Timeline and looking into extending it further, as well as experimenting with it in a Biome Collective game jam (0:24 - 0:36), and I love everything about it so far. So the next step was to see if I could draft a whole playable song scene from Asterism using Timeline.
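To give a flavour of how Timeline can be extended, here's the kind of custom clip you can write with its PlayableAsset API. The class and field names are my own hypothetical example, not code from the tutorial:

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Hypothetical custom Timeline clip: drop it on a Playable Track and it
// fires when the playhead reaches it.
public class SectionEventClip : PlayableAsset
{
    public string sectionName;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<SectionEventBehaviour>.Create(graph);
        playable.GetBehaviour().sectionName = sectionName;
        return playable;
    }
}

public class SectionEventBehaviour : PlayableBehaviour
{
    public string sectionName;

    public override void OnBehaviourPlay(Playable playable, FrameData info)
    {
        // In a real scene this would toggle section roots, cameras, etc.
        Debug.Log($"Timeline entered section: {sectionName}");
    }
}
```

Usefully, the PlayableDirector's Update Method can also be set to DSP Clock, which ties the whole timeline to the same audio clock as dspTime.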

Draft scene for “Human”


This is my first attempt at a playable draft of one of the songs on Asterism, so I'll document my process as I go. I'm using the track 'Human' as my test song, as it's fairly short compared to some of the others and already has fairly strong visuals in my head. I've yet to record the vocals (or any of the instruments), but the song is generally about self-acceptance and opening up to someone who is supportive of you.

You can listen to the rough version of 'Human' here:


When thinking about the game and the visuals, I started by imagining the environment and motion. The player begins by looking up at a sky of planetary rings drifting over them while they stand, small and motionless, on the ground. As the song progresses they get caught in an updraft that spins them up and around until they are floating on top of the rings they were looking at earlier. They fly through them and look back at the planet spinning far away.

[Sketch of gameplay ideas for 'Human']

After this I had a brainstorming session with my partner James. We listened to the track and talked through where each section would start and finish, as well as its primary emotions and narrative. I had set some limitations in terms of input (player movement, camera rotation, one interact button) and the types of interactions that can occur, e.g. you can't pick up an object, you can only 'interact' with or trigger something by walking into it. This helped us come up with some more detailed ideas that fit with my initial sketch.
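In Unity terms, that walk-into-it rule boils down to a trigger volume; a minimal hypothetical example:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical "trigger by walking into it" interactable: a trigger
// collider that fires once when the player enters it.
public class WalkInTrigger : MonoBehaviour
{
    [SerializeField] private UnityEvent onTriggered;
    private bool fired;

    private void OnTriggerEnter(Collider other)
    {
        if (fired || !other.CompareTag("Player")) return;
        fired = true;
        onTriggered.Invoke();
    }
}
```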

I was very inspired recently by playing Pale Machine by Ben Esposito, as well as by thinking about games like Wario Ware, which focus on one simple interaction in a series of vignettes. Bitsy has also been a constant source of inspiration for me for several years, in the way that it keeps interaction to a minimum but allows enormous amounts of variation and exploration within that.

Something I am keen to design into Asterism is the ability for the player to express themselves through the mechanics, and allow them to experience different visual pathways on repeated playthroughs. I’m really excited about some of the initial ideas from this brainstorming session on how this might work, but perhaps I’ll save the details of this for a future blog post!

[More detailed sketch of gameplay ideas for 'Human']

From this sketch I was able to make a game design plan for the scene on Miro, which I’ve divided into the distinct gameplay sections of the song and added details of input, movement, camera etc. that I should be able to translate into Unity (for higher resolution right-click the image and select Open in New Tab).

[Miro plan for 'Human']

Lastly this month, I used these sketches and design plans to create a first pass of the playable scene for 'Human' in Unity. Here's a recording of this scene being played through.



The controls are limited to a combination of 'look', 'move' and 'interact', as indicated by the UI in the bottom-left corner. It's been a very useful exercise for figuring out whether the interaction ideas I had work in practice, and for testing with other people to find out what is or isn't enjoyable or clear. The visuals are all placeholders, but they're a useful starting point too, for getting a sense of scale and colour. Below are a couple of screenshots of the Timeline editor for this scene.

[Timeline editor for 'Human']

[Timeline editor for 'Human']

I'll be making these drafts of each song as I continue with development to test out the ideas, and then refining the gameplay from there. Over the next four weeks, though, I'm going to take a look at some of the other areas of the game, now that I'm happy with this initial workflow for the songs.


Sign up to my newsletter to get updates about the project!