Exploration: Interface Device as Instrument

How can we apply the principles of musical expression to create more expressive game controller input?

What if we deliberately applied musical articulations to game controller input?

What if the aspects of music were directly mapped to the inputs used to control a game, together with the resulting behaviour of the game itself? The aspects of music¹ are:

  • Pitch
  • Dynamics
  • Rhythm
  • Articulation
  • Timbre
  • Order

  • Pitch is the frequency of a musical tone.
  • Dynamics refers to the volume and/or velocity of a note.
  • Rhythm is a regularity of notes that indicates a pattern to follow.
  • Articulation is the intensity and the speed/duration of the variations with which a tone is played.
  • Timbre is the specific sound quality of the instrument playing a note.
  • Order is the composition of notes.

Pitch is a one-dimensional axis, and can be seen either as a ‘stepped’ axis (notes like A, Bb and C#) or as an analog axis (all the tones between A and A#, for example).

Chromatic scale, indicating stepped pitch variations (one-dimensional axis)

This could be mapped to the different available buttons (the A button for the A pitch, B for A#, etc.).

Pitch could easily be mapped to the different buttons on a game controller
Or could the analog stick control the pitch-axis, mapping the notes C, C#, D etc along the horizontal/vertical axis of the stick (ranging from 1.0 to -1.0)?
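Both ideas can be sketched in a few lines of code. This is a minimal sketch, not taken from any real controller API: the button names and the choice of a 12-note chromatic scale starting at C are assumptions for illustration.

```python
# A 12-note chromatic scale, starting at C.
CHROMATIC = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Stepped mapping: each face button triggers a fixed note
# (hypothetical button layout, for illustration only).
BUTTON_TO_NOTE = {"A": "A", "B": "A#", "X": "C", "Y": "C#"}

def note_from_stick(axis: float) -> str:
    """Quantize an analog stick axis (-1.0..1.0) to a chromatic note."""
    # Normalize -1.0..1.0 to 0.0..1.0, then pick one of the 12 steps.
    t = (axis + 1.0) / 2.0
    index = min(int(t * len(CHROMATIC)), len(CHROMATIC) - 1)
    return CHROMATIC[index]
```

With this sketch, pushing the stick fully left plays a C, fully right a B, and the notes in between are laid out along the axis.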

Dynamics refers to the volume and velocity of notes, so it seems almost natural to map this to pressure of the input device. Dynamics could make use of the pressure sensitivity of a certain button, where the player can make variations, like starting with a firm press, or smoothly transitioning from unpressed to pressed.

This abstract example illustrates how there could be variations in button presses when they have sufficient levels of pressure sensitivity
Some more examples for expression in gradually changing volume (taken from ‘animation easing curves’)
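As a sketch of how such easing curves could shape dynamics: raw button pressure (0.0..1.0) is passed through a curve before being used as note volume. The quadratic curves below follow the common ease-in/ease-out convention; applying them to button pressure is my own assumption, not an established controller feature.

```python
def ease_in_quad(p: float) -> float:
    return p * p                   # slow start, fast finish

def ease_out_quad(p: float) -> float:
    return 1.0 - (1.0 - p) ** 2    # fast start, slow finish

def pressure_to_volume(pressure: float, curve=ease_in_quad) -> float:
    """Shape raw button pressure (0.0..1.0) into a note volume."""
    pressure = max(0.0, min(1.0, pressure))  # clamp noisy sensor input
    return curve(pressure)
```

The same half-pressed button then produces a soft note with an ease-in curve and a loud one with an ease-out curve, giving two different ‘feels’ from identical hardware.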

A (music) volume pedal is used by rolling your foot on the pedal, gradually controlling the volume (from 0.0 to 1.0). This could be used for the Tremolo² effect. Can this be mapped to game controllers too?
Rhythm could be determined by the game, or the player could receive benefits from using inputs in a self-determined rhythm. But seeing how rhythm games are a genre of their own, it seems that this aspect of the overlap between music and games is already well represented.

The colored lines on the fretboard indicate if and how long a note should be held
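The ‘self-determined rhythm’ idea could be detected by scoring how evenly spaced a player’s button presses are. This is a hypothetical sketch of such a scoring function; the scoring formula and any bonus threshold are my own assumptions, not taken from an existing game.

```python
def rhythm_regularity(press_times: list[float]) -> float:
    """Score 0.0..1.0 for how evenly spaced a series of button presses is.

    A game could grant a bonus when the score passes some threshold.
    """
    if len(press_times) < 3:
        return 0.0  # not enough presses to establish a rhythm
    intervals = [b - a for a, b in zip(press_times, press_times[1:])]
    mean = sum(intervals) / len(intervals)
    if mean <= 0:
        return 0.0
    # Average deviation from the mean interval, relative to that mean.
    deviation = sum(abs(i - mean) for i in intervals) / len(intervals) / mean
    return max(0.0, 1.0 - deviation)
```

Perfectly regular presses score 1.0, and the score drops as the player drifts off their own beat.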

Articulation is part of the broader term dynamics, but focuses more on accent (putting emphasis on particular notes) and on the intensity with which particular notes are played. This website³ gives a good overview of the many different articulations a musician can use while playing notes.

Could we apply the “vibrato” technique to pressure sensitive game buttons?
Bass player Esperanza Spalding playing rhythmically by accentuating and articulating specific notes
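One way to answer the vibrato question in code: let the player wiggle a pressure-sensitive button around its midpoint, and bend the pitch with it. This is a sketch under assumptions of my own; the ±50 cent bend range is an illustrative choice, not a standard.

```python
def vibrato_pitch(base_freq_hz: float, pressure: float,
                  cents_range: float = 50.0) -> float:
    """Bend the pitch around a base note as the button pressure oscillates.

    pressure=0.5 plays the note unbent; moving the pressure toward 0.0 or
    1.0 bends the pitch down or up by up to cents_range cents.
    """
    cents = (pressure - 0.5) * 2.0 * cents_range
    # 1200 cents per octave; each octave doubles the frequency.
    return base_freq_hz * 2.0 ** (cents / 1200.0)
```

Oscillating the pressure a few times per second then produces the characteristic vibrato wobble around the held note.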

Timbre concerns the sound (the ‘final’ output) of the instrument: a C note played on a violin sounds different from the exact same C note on a piano. So Timbre could map to the output of the game, the audiovisual feedback. Just as one musical synthesizer can produce different kinds of sounds (simulated pianos, harpsichords, and even drums), the game controller itself (like Sony PlayStation’s DualShock or Nintendo’s WiiMote) does not change with every game. The buttons stay in the same place, but their interpretation by the game (which is what I’m comparing to Timbre) changes.
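The analogy can be sketched as code: the same physical input event is ‘voiced’ differently depending on which game receives it, just as one keypress on a synthesizer can be patched to a piano or a drum sound. The game classes and action names below are made up for illustration.

```python
# Two hypothetical games interpreting the same controller in their own way.
class RacingGame:
    def interpret(self, button: str) -> str:
        return "accelerate" if button == "A" else "brake"

class PlatformGame:
    def interpret(self, button: str) -> str:
        return "jump" if button == "A" else "attack"

def press(game, button: str) -> str:
    # The controller does not change; the interpretation does.
    return game.interpret(button)
```

Pressing A on the same controller then accelerates in one game and jumps in the other, which is the ‘timbre’ of each game applied to identical input.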

Part 2 of this article will focus more on answering the question:

How could all these mappings result in more player expression in games?
