Module 2
Formstorming
Weekly Activity Template · Interactive Sound · P5.js · Media Motion and the Body
By documenting these twenty-five distinct scenes, I captured the subtle transition from merely hearing daily noise to actively identifying valuable media assets all around the campus. Some sounds are purely functional, such as the constant hum of an air conditioner; others are brief interactions, like the sharp clink of a metal spatula or the mechanical click of a door button. This exercise, spanning objects from library chairs to telephone keypads, allowed me to build a rich sonic library while developing a disciplined habit of observation and respect for my surroundings.
Activity 1
✦
Activity 2
✦
Project 2
✦ P5 Interactive Audio Web Header
The Sound of Melting · Voice-driven Interaction
How did you engage with the project theme?
I engaged with the project theme by designing an interactive sound experience called The Sound of Melting. The concept treats sound as a physical force — something that can transform the material state of an object. I used a soft-serve ice cream cone as the interaction subject because it is familiar, thermally unstable, and emotionally charged. The cone is built from 65 overlapping semi-transparent circles. When users speak into their microphone, their voice drives three distinct physical responses: Smash (solid rupture), Melt (liquid droop), and Freeze (crystallisation through silence). Each mode gives the user a completely different interaction contract with the same visual form.
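The three voice-driven responses can be sketched as per-mode force functions of microphone amplitude. This is a minimal illustration, not the sketch's actual internals — the function names, parameters, and constants (`smashForce`, `strength`, `gravity`) are all hypothetical:

```javascript
// Hypothetical per-mode force functions driven by mic amplitude (0..1).
// Names and tuning constants are illustrative, not taken from the sketch.

// Smash: push a scoop circle radially outward from the cone's centre.
function smashForce(p, amp, cx, cy, strength = 8) {
  const dx = p.x - cx, dy = p.y - cy;
  const d = Math.hypot(dx, dy) || 1; // avoid divide-by-zero at the exact centre
  return { x: (dx / d) * amp * strength, y: (dy / d) * amp * strength };
}

// Melt: pull every scoop straight down, like gravity scaled by loudness.
function meltForce(amp, gravity = 4) {
  return { x: 0, y: amp * gravity };
}

// Freeze: hold still — no force while silence is maintained.
function freezeForce() {
  return { x: 0, y: 0 };
}
```

Dispatching to one of these each frame, based on the selected mode, keeps the three interaction contracts cleanly separated while the 65-circle form stays the same.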
What was successful?
The three-mode physics system worked effectively. Each mode responds to microphone amplitude in a visually distinct way — Smash explodes scoops radially outward, Melt droops them downward under gravity, and Freeze holds them completely still until silence is detected. The spring-return physics ensures that all particles always come back to their home positions, keeping the sketch recoverable after any interaction. The landing page with three mode-selection cards also worked well — it clearly communicates the concept before any sound is made and reduces the cognitive load of entering the experience. The overall colour system (pink for Smash, warm orange-red for Melt, grey for Freeze) helped users immediately feel the temperature and energy of each mode.
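The spring-return behaviour described above can be sketched as a damped spring pulling each particle back to a stored home position. The constants here are illustrative tuning values, not the ones used in the sketch:

```javascript
// Damped spring pulling a particle back to its stored home position.
// k (stiffness) and damping are illustrative values, not the sketch's own.
function springStep(p, k = 0.05, damping = 0.9) {
  p.vx += (p.homeX - p.x) * k; // spring force toward home
  p.vy += (p.homeY - p.y) * k;
  p.vx *= damping;             // damping bleeds off velocity so the particle settles
  p.vy *= damping;
  p.x += p.vx;
  p.y += p.vy;
}
```

Because the force always points toward home and the damping removes energy every frame, any displacement — whether from a smash or a melt — eventually settles back to the rest pose, which is what keeps the sketch recoverable.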
What was challenging?
The biggest technical challenge was the browser's user-gesture requirement: browsers suspend the Web Audio context — and with it microphone access — until the user interacts with the page, which caused a blank screen on first load. I solved this by adding a click-to-enable screen before the sketch starts. A second challenge was off-screen particles — at high amplitude, scoops flew off the canvas and never returned. I fixed this with a spring-return force on every particle. A third challenge was low contrast in Freeze mode: the grey scoops became nearly invisible against the light blue background, so I increased the stroke weight and darkened the stroke colour specifically for that mode. Finally, switching modes mid-interaction caused velocity carryover; I fixed this by resetting all particle velocities to zero on every mode switch.
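The velocity-carryover fix can be sketched as a mode switcher that zeroes every particle's motion on switch. The particle shape and names below are assumptions, not the sketch's actual code:

```javascript
// Switch interaction mode and clear residual motion so that, for example,
// a Smash explosion does not carry its velocities into Freeze.
// The particle/state shapes here are assumed for illustration.
function switchMode(particles, newMode, state) {
  state.mode = newMode;
  for (const p of particles) {
    p.vel.x = 0; // discard momentum from the previous mode
    p.vel.y = 0;
  }
}
```

Resetting velocity (rather than position) means the scoops glide home under the spring force instead of snapping, so a mode switch still reads as a continuous physical event.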
How did you explore the lecture content?
I drew on lecture content about the four dimensions of sound experience — physiological, cognitive, emotional, and behavioural. Voice as a physical input (physiological), immediate visual causality between sound and form change (cognitive), the affective charge of ice cream as a comfort object under stress (emotional), and the distinct interaction loop each mode creates through its physics parameters (behavioural). I also researched Bill Viola's The Messenger (1996) and Granular-Synthesis's Modell 5 (1994) as artist references. Viola's loop structure and body-as-transformation-site concept directly informed how I designed the rest state. Granular-Synthesis's idea that form emerges from massed identical units informed the 65-circle scoop architecture.
How does this website summarize your work and go beyond activities 1 & 2?
Activity 1 built a sonic library of 25 campus sounds and introduced the idea of voice as a force rather than a signal. Activity 2 explored visual responses to those sounds through 25 GIF iterations, narrowing the form toward the soft-serve cone and spring physics. The final project takes those discoveries further by combining all three outcomes into one coherent system: a multi-mode voice-driven sketch with a professionally designed landing page, physics tuned per mode, and a visual language consistent across all states. The project goes beyond the activities by introducing mode-switching architecture, semantic idle breathing (Freeze holds still, Melt sags, Smash breathes gently), and a fully published interactive experience on the Phoenix server.