Dean Dot Dog

March 6, 2026

Morphisms

As a final project for the Gray Area Creative Code Intensive last year, I wanted to make something that explored how user interfaces create mappings between parameter spaces. That’s essentially what a user interface is: buttons, dials, switches, etc. define an input parameter space that gets translated into an output parameter space, such as pixels on a screen. What kinds of parameter spaces have interesting “shapes” to link together? What kind of user interface could I build that would be unexpected and dynamic, yet intuitive?

The resulting installation was a collection of three coupled systems: a sensor-equipped orb, a particle simulation, and a generative music patch.

A viewer encounters this scene as an orb set out on a pedestal, with the particle simulation playing on a projector or display in front of them and music playing from nearby speakers. The viewer can pick up and handle the orb, which acts as a controller for the surrounding visuals and audio. As they rotate the orb in their hands, the visuals and music evolve, reacting to the viewer.

This is what it looked like in action (apologies for the YouTube compression artifacts):

Pretty neat! The particle system falls into different curlicue attractors depending on its parameters. The behavior of the particles is deterministic, but highly non-linear and unpredictable, so it feels like a nice surprise when clear lines suddenly emerge from the noise. I also really liked the orb as a control input: it has no natural orientation, so every position feels as significant as any other, unlike knobs or sliders that are oriented around min and max values. Altogether, this creates a feeling like groping around in darkness: you are in control, but you don’t know where you are going or where you have been, getting little hotter/colder signals around the edges that lead you towards interesting positions. Overall I was very pleased with how the interaction design of this one came out.

Technical Design

Signal Flow

Inside the 3D-printed orb is an ESP32 microcontroller that broadcasts data collected from a BNO055 orientation sensor via OSC over Wi-Fi. The sensor measures rotational orientation as quaternions, as well as rotational velocity.

A Node server running on my laptop listens for the OSC messages, performs some light processing on the values, and re-broadcasts them to the other clients.
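The post doesn’t spell out what the “light processing” is, and the real server is JavaScript, but the general idea is easy to sketch. As a hypothetical illustration in Python, two plausible steps are normalizing the incoming quaternion to unit length and smoothing out sensor jitter before re-broadcast:

```python
import math

def normalize_quaternion(q):
    """Scale a quaternion (w, x, y, z) to unit length."""
    norm = math.sqrt(sum(c * c for c in q))
    return tuple(c / norm for c in q)

def smooth(prev, new, alpha=0.2):
    """Exponential moving average to damp jitter between sensor readings."""
    return tuple(p + alpha * (n - p) for p, n in zip(prev, new))
```

A function like `smooth` would be applied per message, carrying the previous output forward as `prev`; a low `alpha` trades responsiveness for stability.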

The particle simulation is JavaScript running in a browser, built on P5.js. While the system can appear complex, the math is actually fairly simple. First, initialize a few thousand points at random positions, then repeatedly apply a map function:

$$x_{t+1} = \sin(x_t^a - y_t^a + b)$$

$$y_{t+1} = \cos(c x_t y_t + d)$$

This takes a point \(x_t, y_t\) and moves it to its position in the next frame, \(x_{t+1}, y_{t+1}\). The four parameters \(a\), \(b\), \(c\), and \(d\) are linked to the four respective quaternion coordinates and are updated continuously as the Node server streams them over Socket.IO. Different values of these parameters produce very different emergent behavior, so as the orb rotates in a viewer’s hands the particle system transitions chaotically (but continuously!) between different attractor patterns and noise, or collapses to a point. Additionally, the simulation calculates variance statistics of the particle positions to measure how “noisy” the current state is and emits that back to the Node server to be converted to MIDI, so that the sound reacts to the state of the particles.
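The actual simulation is P5.js, but the update step and the variance statistic are compact enough to sketch in Python (function names are mine; note that for non-integer \(a\), a negative base raises a domain error in Python where JavaScript’s `Math.pow` would return NaN, so the demo below uses the integer exponent \(a = 2\)):

```python
import math
import random

def step(points, a, b, c, d):
    """Advance every particle one frame of the iterated map."""
    return [(math.sin(x**a - y**a + b), math.cos(c * x * y + d))
            for x, y in points]

def noisiness(points):
    """Variance of particle positions: a rough measure of how scattered
    (vs. collapsed onto an attractor) the current state is."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    return sum((x - mx) ** 2 + (y - my) ** 2 for x, y in points) / n

# Demo run with fixed parameters; in the installation the four values
# stream in continuously from the orb's quaternion instead.
points = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(2000)]
for _ in range(100):
    points = step(points, 2, 3.669, 2.0, 4.419)
```

Because every update passes through sin and cos, all particles stay inside the \([-1, 1]\) square no matter what the parameters do.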

The last piece is a generative ambient music composition running in Ableton Live. The coordinating Node server converts the quaternion data from the microcontroller and the noise statistics from the simulation into MIDI values that are mapped to various effects knobs in Ableton. Changing the orb’s rotation cross-fades between the different musical voices, quickly accelerating the orb triggers a pitch-shift effect, and a “noisy” state of the particle simulation introduces fuzz and distortion to the track.
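The exact scaling the server uses isn’t described; as a sketch, assuming a simple linear map from a sensor range onto the 0–127 range of a MIDI CC value (function name hypothetical):

```python
def to_midi_cc(value, lo, hi):
    """Clamp value to [lo, hi], then scale linearly onto MIDI CC range 0-127."""
    clamped = max(lo, min(hi, value))
    return round(127 * (clamped - lo) / (hi - lo))
```

Each quaternion component (in \([-1, 1]\)) and the noise statistic would pass through a mapping like this before being sent on to Ableton.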

Learnings and Additional Ideas

Background Inspiration

The seed of the idea came from this beautiful image I stumbled across made by Simone Conradi.

Conradi's Orbit

The image visualizes the behavior of an iterated map that takes a point \(x_t, y_t\) and moves it to a new position \(x_{t+1}, y_{t+1}\):

$$x_{t+1} = \sin(x_t^2 - y_t^2 + 3.669)$$

$$y_{t+1} = \cos(2 x_t y_t + 4.419)$$

You can make an image like this one by randomly initializing a few thousand points, repeatedly applying the map function while keeping track of every position they visit, and then building those positions up into something like a heatmap that shows where the points most commonly occur. Although the points are randomly initialized, they won’t stay randomly distributed over the space; they fall into attractor regions like many of the curlicues in the image above.
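The author’s original script isn’t shown; here is a minimal sketch of the procedure just described, using NumPy and a 2D histogram (the constants come from the equation above, everything else is illustrative):

```python
import numpy as np

def orbit_heatmap(b=3.669, d=4.419, n_points=2000, n_iters=500, bins=512):
    """Accumulate every position visited by the iterated map
    x' = sin(x^2 - y^2 + b), y' = cos(2xy + d) into a 2D heatmap."""
    pts = np.random.uniform(-1.0, 1.0, size=(n_points, 2))
    heat = np.zeros((bins, bins))
    for _ in range(n_iters):
        x, y = pts[:, 0], pts[:, 1]
        pts = np.column_stack((np.sin(x**2 - y**2 + b),
                               np.cos(2.0 * x * y + d)))
        # sin/cos keep every point inside [-1, 1], so no counts are lost
        h, _, _ = np.histogram2d(pts[:, 0], pts[:, 1],
                                 bins=bins, range=[[-1, 1], [-1, 1]])
        heat += h
    return heat
```

Plotting something like `np.log1p(heat)` with matplotlib’s `imshow` compresses the dynamic range so that faint orbit structure stays visible alongside the dense curlicues.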

This image was pretty cool, but I was immediately curious what other patterns would emerge if I tweaked the constants in that map function. I coded up a simple Python script to generate similar heatmap images and had a fun afternoon going back and forth between typing in new constants and generating pretty new images. From there I started thinking about how I might continuously move through this parameter space with real-time visualization of the resulting attractor shapes.