Here is the video documentation for my latest project, the Audiovisual Sandbox. It was built using Processing for the Programming for Artists module at Goldsmiths.
Download the code from my GitHub.
My intent was to explore basic audiovisual interaction using Processing. I wanted
to build a ‘sandbox’ of particles that react to a sine wave passing through the
centre of the screen. When the particles collide with the sine wave, they bounce
off it. This meant I had to explore methods of collision detection for this project.
First, a waveform is created using the Minim library. This wave is output both sonically,
through the speakers, and visually, as a drawn animated waveform. It is worth noting that the
width of the image is equal to the buffer size: 1024. Therefore, each pixel across the screen
represents one sample.
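As a rough sketch of this idea (the names and parameters here are my own, not the project's actual code), a 1024-sample sine buffer means each screen column indexes one sample directly when the window is 1024 pixels wide:

```java
// Sketch: a 1024-sample sine buffer, so that with a 1024-pixel-wide
// window each x coordinate maps straight onto one sample.
public class WaveBuffer {
    static final int BUFFER_SIZE = 1024;   // == window width in the sketch

    // Fill a buffer with a sine wave at the given frequency/sample rate.
    static float[] sineBuffer(float freq, float sampleRate) {
        float[] buf = new float[BUFFER_SIZE];
        for (int i = 0; i < BUFFER_SIZE; i++) {
            buf[i] = (float) Math.sin(2 * Math.PI * freq * i / sampleRate);
        }
        return buf;
    }

    // With width == bufferSize, the lookup is a direct index.
    static float sampleAtPixel(float[] buf, int x) {
        return buf[x];
    }

    public static void main(String[] args) {
        float[] buf = sineBuffer(440f, 44100f);
        System.out.println("sample at pixel 249 = " + sampleAtPixel(buf, 249));
    }
}
```

In the real sketch the buffer comes from Minim rather than being generated by hand, but the pixel-to-sample correspondence works the same way.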
Two arrays of the particle class are also created: one white set and one red set. The white particles
spawn from the top of the screen, and the red from the bottom. This helps differentiate the particles
(since one large mass of particles can be difficult to observe), but it also aids debugging: it is
more apparent when a particle escapes to the other side of the waveform if it is a different colour.
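A minimal sketch of the two particle sets (field names and starting speeds are illustrative assumptions, not the project's actual class):

```java
// Sketch: particles spawn at the top (white) or bottom (red) of the
// screen and initially move towards the waveform in the centre.
public class Particle {
    static final int HEIGHT = 600;   // assumed window height

    float x, y;
    float ySpeed;
    final boolean isWhite;   // white = spawned at top, red = at bottom

    Particle(float x, boolean isWhite) {
        this.x = x;
        this.isWhite = isWhite;
        // White particles start at the top moving down; red particles at
        // the bottom moving up, so both head towards the central wave.
        this.y = isWhite ? 0 : HEIGHT;
        this.ySpeed = isWhite ? 2 : -2;
    }

    public static void main(String[] args) {
        Particle w = new Particle(100, true);
        Particle r = new Particle(100, false);
        System.out.println("white y=" + w.y + ", red y=" + r.y);
    }
}
```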
Then we begin checking for collisions. If a particle collides with the sides of the screen,
it rebounds. In future implementations I would like to include some sort of physics,
where there is a slight increase in speed immediately after bouncing off the wall, as would happen
in real life.
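The wall rebound can be sketched as follows (the clamping is my own addition to stop a fast particle tunnelling off screen; the actual project code may handle this differently):

```java
// Sketch: rebound off the left/right walls by inverting the horizontal
// speed, clamping the position so the particle stays on screen.
public class WallBounce {
    static final int WIDTH = 1024;  // assumed window width

    // Advance one frame and return { newX, newXSpeed }. A real sketch
    // would do the same for the top and bottom edges.
    static float[] step(float x, float xSpeed) {
        x += xSpeed;
        if (x < 0)     { x = 0;     xSpeed = -xSpeed; }
        if (x > WIDTH) { x = WIDTH; xSpeed = -xSpeed; }
        return new float[] { x, xSpeed };
    }

    public static void main(String[] args) {
        float[] s = step(2, -5);   // would leave the screen: bounces
        System.out.println("x=" + s[0] + ", xSpeed=" + s[1]);
    }
}
```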
Then we check for collisions against the waveform. First, we check whether the particle is in the ‘collision
area’ – the segment of the screen in which the waveform oscillates, and therefore where collisions
are possible. If it is, we take the particle’s x position and find its equivalent sample;
if xPos is 249, we look up sample 249 in the buffer. The value of this sample is then read (using
only the left audio channel for simplicity) and translated into a y position, matching where the
waveform is at that particular point on screen.
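The sample-to-y translation might look like this (the window height and pixel amplitude here are my own assumed constants, not the project's actual values):

```java
// Sketch: translate a sample value in [-1, 1] into a y coordinate,
// centred on the middle of the window.
public class WaveY {
    static final int HEIGHT = 600;        // assumed window height
    static final float WAVE_HEIGHT = 100; // assumed pixel amplitude of the drawn wave

    static float sampleToY(float sample) {
        // sample == 0 sits exactly on the centre line
        return HEIGHT / 2f + sample * WAVE_HEIGHT;
    }

    public static void main(String[] args) {
        System.out.println(sampleToY(0f));   // centre line: 300.0
        System.out.println(sampleToY(1f));   // full positive swing: 400.0
    }
}
```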
Now that we have both the x and y position of a point on the waveform, we can check whether the particle
is colliding with it. We check within a range (called the ‘margin’): if the particle and waveform positions
are within this range, we deem a collision to have occurred. Checking only whether the two values are
exactly equal would mean missing many collisions.
If a collision has occurred with a downward-moving white particle, we invert the y speed so that it travels upwards, back towards the top of the screen. Similarly, if the particle is an upward-moving red one, the y speed is inverted so it travels downwards. Inverting the y speed of an upward-moving white particle (or a downward-moving red one) would give an undesired result and cause glitching, since we want the particles to rebound off the wave and back towards the side of the screen they spawned on.
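The margin check and the direction-aware bounce can be sketched together like this (the margin value and method names are assumptions for illustration):

```java
// Sketch: collision against the wave uses a margin, and the y speed is
// only inverted when the particle is moving towards its opposite side,
// so particles always rebound back towards where they spawned.
public class WaveCollision {
    static final float MARGIN = 4;   // assumed tolerance in pixels

    static boolean hitsWave(float particleY, float waveY) {
        return Math.abs(particleY - waveY) <= MARGIN;
    }

    // isWhite: spawned at the top. Only a downward-moving white particle
    // (ySpeed > 0) or an upward-moving red one (ySpeed < 0) bounces;
    // a particle already moving away from the wave is left alone.
    static float bounce(float ySpeed, boolean isWhite) {
        if (isWhite && ySpeed > 0) return -ySpeed;
        if (!isWhite && ySpeed < 0) return -ySpeed;
        return ySpeed;
    }

    public static void main(String[] args) {
        System.out.println(hitsWave(302, 300)); // within margin: true
        System.out.println(bounce(2, true));    // white moving down: inverted
        System.out.println(bounce(-2, true));   // white moving up: unchanged
    }
}
```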
When collisions with the waveform occur, one of two samples is triggered (related to the particle’s colour).
Once we determine a collision has occurred, we stop checking for collisions against the waveform. Once a set period of time has passed (called the ‘delay’), checking resumes. This prevents the particles from hitting the same part of the waveform multiple times, which could cause them to glitch through to the other side of it.
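The delay can be sketched as a simple per-particle cooldown counter (counting in frames here is my own assumption; the actual sketch may measure time differently):

```java
// Sketch: after a collision, skip further wave checks until 'delay'
// frames have passed, preventing repeated hits on the same wave cycle.
public class CollisionCooldown {
    static final int DELAY = 30;     // assumed cooldown, in frames

    int framesSinceHit = DELAY;      // start ready to collide

    boolean canCollide() {
        return framesSinceHit >= DELAY;
    }

    void onCollision() { framesSinceHit = 0; }

    void tick() { framesSinceHit++; }  // call once per draw() frame

    public static void main(String[] args) {
        CollisionCooldown c = new CollisionCooldown();
        c.onCollision();
        System.out.println(c.canCollide());   // just hit: false
        for (int i = 0; i < 30; i++) c.tick();
        System.out.println(c.canCollide());   // cooldown elapsed: true
    }
}
```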
This sketch can produce a number of different results depending on the number and size of the particles,
as well as changes to the amplitude, frequency, and shape of the waveform. Try playing around with
these features to see what happens!
The code is a good representation of one possible implementation of my initial concept,
but it can definitely be refined and expanded. The main challenge was getting the particles to interact
properly with the waveform:
– Some particles still manage to escape to the other side of the waveform
– Slower particles can pass through several cycles of the sine wave once they have collided with it,
since collision is no longer being checked. Realistically the particles should continue to bounce
off the waveform, with a change in x direction as well as y direction.
This project only scratches the surface of the overall ‘audiovisual sandbox’ concept.
There are many ways this project could be developed, including (but not limited to) the following:
– Using tangents to determine at what angle the particle hits the waveform, and changing its direction accordingly
– Including an adaptable collision area that grows and shrinks along with the waveform’s amplitude
– Adding forces/physics, so that after a collision the particle speeds up as energy is transferred
to it. It would then return back to normal speed by simulating drag.
– Adjusting the margin and delay values so that they depend on the speed of the particle
– Having particles be attracted to the waveform instead of rebounding off it, with a force of attraction
proportional to the amplitude. It would be interesting to see whether you could observe the different
wave shapes, without drawing the wave, just by looking at the movement of the particles
– Enabling more control over the waveform, e.g. multiplying several sine waves together to create different
timbres and visual patterns
– Creating a version that reacts to microphone input rather than a synthesised wave. If
the microphone is stereo, you could even build another layer of interactivity, where blowing into
the microphone from the left side causes the particles to scatter over to the right side of the screen,
and vice versa