An experiment in nature-evoking art, physics, graphics, and interaction. Adapted into an interactive music-video collaboration on Max Cooper's Emergence project, featured on Experiments with Google and the front page of its Chrome Experiments collection. Developed further into a standalone visuals version for myriad live shows and events with audience interaction, and published open-source for reuse and learning.

The nature-evoking visuals are an emergent physical system of interacting particles: complex organic forms emerge spontaneously from simple interactions at the individual scale; in a fluid-like advection, particles recursively affect and are affected by the field of motion.

Executed via a low-level stack: stack.gl, WebGL 1.0, GLSL, and JavaScript. Custom-built partly as a learning exercise, it includes custom general-purpose GPU computing (GPGPU) methods; a keyframe-based animation system with keyboard-driven timeline editing; and audio-response triggers based on the audio waveform and Fast Fourier Transform (FFT), reacting to thresholds across time windows in a calculus-based interpretation.
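The GPGPU particle advection described above can be sketched in plain JavaScript: particle state lives in two buffers that are ping-ponged each step, the past state read while the next state is written, as one would do in WebGL 1.0 with paired textures and framebuffers. This is a minimal CPU analogy under assumed names and a toy flow field, not the project's actual code.

```javascript
// Minimal CPU sketch of the GPGPU ping-pong pattern: particle positions
// advect through a flow field, reading the past state and writing the next.
// In WebGL this would be two textures bound alternately as source and target.

// A toy flow field: a gentle rotation about the origin (illustrative only).
const flowAt = ([x, y]) => [-y, x];

// One simulation step: Euler-integrate each particle along the flow.
const step = (past, next, dt) => {
  for (let i = 0; i < past.length; ++i) {
    const [x, y] = past[i];
    const [u, v] = flowAt(past[i]);
    next[i] = [x + u * dt, y + v * dt];
  }
};

// Double-buffered state: swap roles each frame instead of copying.
let states = [[[1, 0], [0, 1]], [[0, 0], [0, 0]]];

for (let frame = 0; frame < 3; ++frame) {
  step(states[0], states[1], 0.1);
  states = [states[1], states[0]]; // ping-pong swap
}

console.log(states[0]); // latest particle positions
```

In the real GPGPU version the same swap applies to render targets: the shader samples the past-state texture and writes into the next-state texture, never reading and writing the same buffer in one pass.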
2018.01–2022.10
glsl-optical-flow

Uses the AudioContext APIs, so music and ambient sounds trigger visual reactions; this uses a GLSL calculus-based method on the raw audio data signals, triggering responses to changes surpassing chosen thresholds across any of multiple tiers of processed signals.

https://epok.tech/tendrils?loop_presets=5000
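The trigger idea can be sketched as follows: take the mean of a processed signal (waveform amplitude or an FFT band) over short time windows, then look at the window-to-window change (a discrete derivative), firing whenever that change surpasses a chosen threshold. All names and parameters here are illustrative assumptions, not the project's actual code; in a real Web Audio setup the signal would come from an AnalyserNode.

```javascript
// Sketch of a calculus-style audio trigger: compare windowed means of a
// signal over time and fire when the change (a discrete derivative)
// surpasses a chosen threshold.

const mean = (a) => a.reduce((s, v) => s + v, 0) / a.length;

// Split the signal into consecutive windows and take each window's mean.
const windowMeans = (signal, size) => {
  const means = [];
  for (let i = 0; i + size <= signal.length; i += size) {
    means.push(mean(signal.slice(i, i + size)));
  }
  return means;
};

// Fire a trigger at each window whose rise over the previous window
// exceeds `threshold`.
const triggers = (signal, size, threshold) => {
  const m = windowMeans(signal, size);
  const fired = [];
  for (let i = 1; i < m.length; ++i) {
    if (m[i] - m[i - 1] > threshold) fired.push(i);
  }
  return fired;
};

// A quiet signal with a sudden swell: expect one trigger at the jump.
const signal = [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9];
console.log(triggers(signal, 3, 0.5)); // → [ 2 ]
```

Running the same comparison across several tiers of processed signals (raw waveform, smoothed envelope, individual FFT bands) gives multiple independent triggers, each with its own window size and threshold.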
https://www.youtube.com/watch?v=Zqn-XnIyFpQ
https://www.youtube.com/watch?v=xAzJBpLp3Ys&t=1785s
https://www.youtube.com/watch?v=z8pUTbqRqRg