Every sound so far is dry. No reflections, no echoes. No room. This lesson adds the room.
Repertoire: Firelink Shrine (Motoi Sakuraba, Dark Souls, 2011), Temple of Time (Manaka Kataoka, Zelda: BotW, 2017), Temple of Time (Koji Kondo, Zelda: OoT, 1998). Three approaches to making space the instrument.
Also: Fujimusume (traditional Nagauta/Kabuki). The shamisen and the spaces between its strikes.
Parallel composition: Your ambient piece gets reverb and delay. The koto starts living in a room.
What you already know: Everything from EDM.0-3. Patterns, drums, bass, waveforms, ADSR, filtering, layering, panning.
A clap, raw:
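A minimal sketch (assuming the default `cp` clap sample):

```
s("cp ~ ~ ~")
```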
The same clap in a room:
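The same pattern with reverb added (values are illustrative):

```
s("cp ~ ~ ~").room(0.5).size(0.8)
```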
.room() is how much reverb. .size() is how big the space. Listen to the tail after the transient.
Reverb simulates the reflections of sound in a space. .room(amount) sets how much signal enters the reverb. .size(dimension) sets the room size.
Try .room(0.9).size(0.95) for a cathedral. Try .room(0.3).size(0.2) for a bathroom. The size changes the character of the reflections.
Now put the koto in a room:
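Something like this, assuming a sparse koto pattern in the spirit of the earlier lessons (the `koto` sample ships with the default sample set):

```
note("e3 ~ g3 ~ b3 ~ ~ e4")
  .s("koto")
  .room(0.6).size(0.8)
```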
Play it without the .room() line first. Then add it back. The notes are the same. The space changes everything.
Reverb is many reflections blurred. Delay is distinct echoes:
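A sketch with one note and a delay line (values are illustrative):

```
note("e4 ~ ~ ~")
  .s("koto")
  .delay(0.5).delaytime(0.25).delayfeedback(0.5)
```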
One note. The delay repeats it, each echo quieter. .delay() sets the echo level. .delaytime() sets the gap. .delayfeedback() controls how many times it repeats.
.delay(amount) = wet level. .delaytime(seconds) = gap between echoes. .delayfeedback(amount) = how many echoes. Above 0.8, the echoes can run away.
Try .delaytime(0.5) for slower echoes. .delayfeedback(0.7) for a longer tail. Now combine both: add .room(0.6).size(0.8) to the delayed koto. Reverb on the echoes.
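The combined version might read like this (one possible set of values):

```
note("e4 ~ ~ ~")
  .s("koto")
  .delay(0.5).delaytime(0.5).delayfeedback(0.7)
  .room(0.6).size(0.8)
```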
The signal splits. Dry = the original. Wet = the processed copy. .room(0.5) means 50% wet. .delay(0.3) means 30% delay mixed in.
Delay has a feedback loop: output feeds back into input. Each pass is quieter. At 0.5, each echo is half the volume of the last. At 0.9, echoes barely die. At 1.0 or above, the echoes never decay: runaway feedback.
You now have reverb and delay. Listen to what happens when composers make space the primary compositional tool.
Dark Souls: Firelink Shrine (Motoi Sakuraba, 2011). A handful of notes in a massive acoustic space. The melody is almost nothing. The room does the rest:
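A sketch of the idea, not a transcription. The melody here is a placeholder; the soundfont name assumes Strudel's GM instruments are loaded:

```
note("e3 ~ ~ b3 ~ ~ ~ ~ g3 ~ ~ ~ ~ ~ ~ ~")
  .s("gm_acoustic_guitar_nylon")
  .room(0.8).size(0.9)
  .slow(2)
```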
Remove .room(0.8).size(0.9) and play it again. Without the space, it’s just notes. With it, it’s a place.
Zelda: Breath of the Wild: Temple of Time (Manaka Kataoka, 2017). Piano notes floating in silence with long delay tails:
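A sketch in that spirit (placeholder notes, illustrative values):

```
note("a4 ~ ~ ~ ~ ~ ~ ~ e5 ~ ~ ~ ~ ~ ~ ~")
  .s("piano")
  .delay(0.6).delaytime(0.75).delayfeedback(0.6)
  .room(0.5).size(0.9)
  .slow(2)
```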
Zelda: Ocarina of Time: Temple of Time (Koji Kondo, 1998). Choir pad, bells, sustained harmonic wash:
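A sketch of the architecture, with a filtered sawtooth standing in for the choir and a triangle wave standing in for the bells:

```
stack(
  // sustained wash: slow-breathing filtered pad
  note("[e3,b3,e4]").s("sawtooth").lpf(500)
    .attack(2).release(3).room(0.9).size(0.9).gain(0.3),
  // bell-like accents floating above, echoed
  note("e5 ~ b5 ~ ~ ~ g5 ~").s("triangle")
    .attack(0.01).release(1)
    .delay(0.4).delaytime(0.5).room(0.7).gain(0.4)
)
```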
Sakuraba: reverb on sparse guitar. Kataoka: delay on sparse piano. Kondo: sustained pad with reverb, bells with delay above it. Same tools. Three worlds.
For 30 years, Zelda games had melodic, hummable soundtracks. Every area had a song. Then Breath of the Wild shipped with almost nothing. Sparse piano notes. Long silences. Environmental sound.
Hajime Wakai (sound director): the world was the music. Wind, rain, footsteps, birds. The piano notes weren’t a soundtrack. They were punctuation. The silence between them was the composition.
The player was supposed to feel alone in a huge landscape. A full orchestral score would have broken that.
Kondo’s OoT Temple of Time fills every moment with sound. BotW’s fills it with the absence of sound. Both use .room(). The aesthetic decision is what differs.
In the Kabuki dance Fujimusume (The Wisteria Maiden), the shamisen plays sparse, percussive strikes with long silences between them. The strikes are sharp and bright. The silences are structural. The dancer moves in the spaces the music leaves.
Same principle as Firelink Shrine and BotW: the sound event matters, but the space around it matters more.
Nagauta (“long song”) is the musical form behind Kabuki dance. It combines shamisen, voice, and percussion ensemble. The percussion uses kakegoe (rhythmic shouts) between beats. The shouts aren’t decoration. They’re timing cues that live in the rests, shaping the negative space.
Your koto plays sparse notes with rests between them. Your shakuhachi breathes in the gaps. When you add .room() and .delay(), you’re doing what the Kabuki theater does acoustically: letting the space between events become part of the composition.
A pad is a sustained sound that fills the harmonic space. Long attack, long release. It fades in and fades out. The OoT Temple of Time used one. Here’s how to build it.
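A sketch matching that description (the .lpf(600) is one choice in the 300–1200 range discussed below):

```
note("[e3,b3,e4]")
  .s("sawtooth")
  .lpf(600)
  .attack(2).sustain(0.7).release(2)
  .room(0.8).size(0.9)
  .slow(2)
```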
Three notes stacked: E3, B3, E4. A sawtooth filtered down to warmth. The .attack(2) means it takes 2 seconds to fade in. The .release(2) means it lingers after the note ends. That slow breathing is what makes a pad a pad.
Change the chord: try [a3,c4,e4] for A minor. Try [d3,f3,a3] for D minor. Adjust .lpf() between 300 (dark, buried) and 1200 (bright, present). The filter is the pad’s personality.
Now layer it under a melody:
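One way to stack them (the melody line is a placeholder):

```
stack(
  // pad: holds the harmony underneath
  note("[e3,b3,e4]").s("sawtooth").lpf(600)
    .attack(2).sustain(0.7).release(2).room(0.8).gain(0.3),
  // melody: floats above the wash
  note("e5 ~ g5 ~ ~ b5 ~ ~").s("koto")
    .room(0.5).delay(0.3).delaytime(0.375)
)
```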
The pad holds the harmony. The melody floats above it. This is the OoT Temple of Time architecture: wash + detail.
Sawtooth (or square) wave. Filter it low (.lpf(400-800)). Long attack (.attack(1-3)). High sustain. Long release. Add .room() for space. Stack 2-3 notes for a chord. That’s a pad.
This is the transformation. Same notes from L0-L3. But now they live in a space.
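One way the full piece might read. The layer contents and values are illustrative, and gm_shakuhachi assumes Strudel's GM soundfonts are available:

```
stack(
  // 1. koto melody, decaying into reverb and delay trails
  note("e4 ~ g4 ~ ~ b4 ~ e5").s("koto")
    .room(0.7).size(0.8)
    .delay(0.4).delaytime(0.375).delayfeedback(0.4),
  // 2. shakuhachi breathing in the gaps
  note("~ ~ e5 ~ ~ ~ ~ ~").s("gm_shakuhachi")
    .room(0.8).gain(0.5).slow(2),
  // 3. bass drone holding the floor
  note("e2").s("sawtooth").lpf(300)
    .attack(2).release(2).room(0.9).gain(0.25),
  // 4. quiet pulse keeping time
  s("~ hh ~ hh").gain(0.2).room(0.3),
  // 5. distant bell accent, echoed
  note("b5 ~ ~ ~").s("triangle")
    .release(1).delay(0.5).delaytime(0.5).gain(0.3).slow(4)
)
```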
Five layers. The koto decays into reverb and delay trails. The shakuhachi breathes in the spaces. Compare this to the L0 version. Same notes. Different piece. The effects are the transformation.
Add a pad underneath: replace the bass drone with note("[e2,b2,e3]").s("sawtooth").lpf(400).attack(2).sustain(0.5).release(2).room(0.9).gain(0.2). Different foundation. Different feeling. Try removing the pulse entirely. Does the pad hold the piece on its own?
Two versions. The first: your ambient piece with everything from L0-L4a. The second: a beat underneath it. Play both. Decide which is yours.
Change the drone to [a2,c3,e3]. How does the mood shift? Raise .delayfeedback() on the koto to 0.6. The echoes last longer. Does it feel more spacious or more cluttered?

| tool | does | looks like |
|---|---|---|
| .room() | reverb amount (0-1) | .room(0.5) |
| .size() | room size (0-1) | .size(0.8) |
| .delay() | delay wet level (0-1) | .delay(0.4) |
| .delaytime() | gap between echoes (seconds) | .delaytime(0.5) |
| .delayfeedback() | echo decay (0-1, above 0.8 = danger) | .delayfeedback(0.3) |
| pad | sustained chord with long ADSR + filter | sawtooth + lpf + slow attack/release |
Next: Chords. Intervals, triads, and dark progressions.
Tracks that demonstrate this lesson’s concepts.
| artist | track | why |
|---|---|---|
| Motoi Sakuraba | Dark Souls: Firelink Shrine (2011) | (game) sparse guitar in massive reverb |
| Manaka Kataoka | Zelda BotW: Temple of Time (2017) | (game) piano notes in silence, delay as composition |
| Koji Kondo | Zelda OoT: Temple of Time (1998) | (game) choir pad + bells, the original atmospheric Zelda |
| Brian Eno | Music for Airports 1/1 (1978) | (ambient) the genre definition: tape loops at different lengths creating slowly shifting patterns |
| Boards of Canada | Music Has the Right to Children (1998) | (electronic) filtered, degraded, nostalgic textures |