Contents in order of relevance:
Open project links for fuller details.
Some examples of past work relevant to the immersive arts context – real-time interactive installations, online digital experiences, VR and XR, immersive audio – created in the roles of Creative Developer and Technology Director.
AI Creative Technologist creating the app and AIs at the heart of Chorus of Light, Samsung's interactive immersive installation at the Vivid festival of light. Near the Sydney Opera House, for a month, thousands of visitors each night walked through a 3D grid of volumetric light effects and spoke to tablets at the centre, sharing their hopes for the future – each played back as a custom volumetric light effect filling the entire space. The web-app and AIs capture each person's speech in any language, translate it and speak it back in a generated voice for each language in turn, and interpret its emotion and meaning into categories, which sequence a custom effect sent to play on the lighting system – a full-stack JS and Node app: views and interaction; audio-verbal inference and translation via Google Cloud and Gemini, akin to Galaxy AI; combination and time-sequencing of custom light effects; real-time networking over OSC and WebSockets. Led this core tech from pitch to production, in collaboration with Amplify, Lightology, Susan Kosti, Ta-ku, and Eat the Elephant.
2024.03–2024.05
[CV++ Chorus of Light, installation, AI ~ project @ Samsung, Vivid, Amplify](https://epok-tech.notion.site/CV-Chorus-of-Light-installation-AI-project-Samsung-Vivid-Amplify-1476c002a33f80d0b393f4694596a62f)
https://www.youtube.com/watch?v=xAzJBpLp3Ys&t=1785s
https://www.youtube.com/watch?v=Zqn-XnIyFpQ
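As a rough illustration of the Chorus of Light pipeline's final hop – a minimal Node sketch, in which the host, port, OSC addresses, and message shapes are all hypothetical – relaying a categorised speech result from the web-app over WebSockets to the lighting system as an OSC cue:

```js
// Minimal sketch (hypothetical names/addresses): relay categorised speech
// results from the web-app to the lighting system as OSC cues.
const { WebSocketServer } = require('ws'); // real-time link to the web-app
const { Client } = require('node-osc'); // OSC link to the lighting system

const lights = new Client('192.168.0.10', 9000); // hypothetical lighting host
const wss = new WebSocketServer({ port: 8080 });

// Hypothetical mapping from interpreted emotion category to a light-effect cue.
const cues = { hope: '/effect/bloom', joy: '/effect/burst', calm: '/effect/drift' };

wss.on('connection', (ws) => {
  ws.on('message', (data) => {
    // e.g. an emotion category and strength from the inference step.
    const { emotion, intensity } = JSON.parse(data);
    const address = cues[emotion] ?? '/effect/default';
    // Sequence the cue on the grid: address, strength, duration (seconds).
    lights.send(address, intensity, 8, (err) => {
      if (err) { console.error('OSC send failed', err); }
    });
  });
});
```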
An experiment in nature-evoking art, physics, graphics, and interaction. Adapted into an interactive music-video collaboration on Max Cooper's Emergence project, featured on Experiments with Google and the front page of its Chrome Experiments collection. Developed further into a standalone visuals version for myriad live shows and events with audience interaction, and published open-source for reuse and learning. The nature-evoking visuals are an emergent physical system of interacting particles – complex organic forms emerge spontaneously from simple interactions at the individual scale; in a fluid-like advection, particles recursively affect and are affected by the field of motion. Executed via a low-level stack – stack.gl, WebGL 1.0, GLSL, JavaScript – custom-built partly as a learning exercise: custom General-Purpose GPU computing methods (GPGPU); a keyframe-based animation system with keyboard-driven timeline editing; audio-response triggers based on the audio waveform and Fast Fourier Transform (FFT), reacting to thresholds across time-windows in a calculus-based interpretation.
2018.01–2022.10
[CV++ ***Tendrils, interactive AV ~ project @ Max Cooper***](https://epok-tech.notion.site/CV-Tendrils-interactive-AV-project-Max-Cooper-1476c002a33f8020bdbfda70acb83563)
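A minimal sketch of the kind of FFT-based audio trigger described above, using the Web Audio AnalyserNode for the FFT; the band choice, window size, and threshold factor here are hypothetical, a rough stand-in for the calculus-based interpretation:

```js
// Watch a low band of the spectrum; fire when its level rises steeply
// relative to its recent mean across a rolling time-window.
const audioContext = new AudioContext();
const analyser = audioContext.createAnalyser();
analyser.fftSize = 2048;

// Connect a source, e.g. the microphone (assumes a browser environment).
navigator.mediaDevices.getUserMedia({ audio: true })
  .then((stream) => audioContext.createMediaStreamSource(stream).connect(analyser));

const bins = new Uint8Array(analyser.frequencyBinCount);
const history = []; // rolling window of recent band levels
const windowSize = 30; // frames, roughly 0.5s at 60fps
let armed = true;

function update(onTrigger) {
  analyser.getByteFrequencyData(bins);
  // Average a low band of the spectrum (hypothetical band choice).
  const level = bins.slice(0, 64).reduce((sum, v) => sum + v, 0) / 64;
  history.push(level);
  if (history.length > windowSize) { history.shift(); }
  const mean = history.reduce((sum, v) => sum + v, 0) / history.length;
  if (history.length === windowSize) {
    // Fire on a steep rise over the window's mean; re-arm once it falls back.
    if (armed && level > mean * 1.5) { onTrigger(level); armed = false; }
    else if (level < mean) { armed = true; }
  }
  requestAnimationFrame(() => update(onTrigger));
}

update((level) => console.log('audio trigger at level', level));
```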
Prototype-led exploration of the new WebXR APIs' potential for evocative, multi-sensory ways to explore physical products on Oculus Quest 2. Iterated numerous prototypes for OeO, a fictional demo e-bike brand; each explored aspects of VR storytelling, interaction, and immersion, culminating in a single immersive end-to-end experience of the e-bike and its features – evoking a feeling of weight when lifted via other senses, haptics, and physics; riding while steered smoothly on-course, avoiding motion sickness; pulling components apart with both hands into exploded views; gaze interactions encouraging up-close exploration and curiosity; locomotion and teleportation in the environment, with seamless scene changes at action points. Research raised creative and technical insights for the team and client throughout concept and development.
2020.12–2021.04
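One way such weight-evoking haptics can work on Quest controllers, sketched with the WebXR Gamepads haptics API; the mass value and handler wiring here are hypothetical, not the prototypes' exact code:

```js
// Pulse controller haptics in proportion to a virtual object's mass when
// lifted – one way to evoke weight through senses other than sight.
const BIKE_MASS_KG = 20; // hypothetical mass of the demo e-bike

function onGrab(inputSource, massKg = BIKE_MASS_KG) {
  const actuator = inputSource.gamepad?.hapticActuators?.[0];
  if (!actuator) { return; } // not all devices expose haptics
  // Heavier objects give a stronger pulse (clamped to [0, 1]).
  const intensity = Math.min(1, massKg / 25);
  actuator.pulse(intensity, 150); // GamepadHapticActuator.pulse(value, ms)
}

// Wired from a WebXR session's grab gesture, e.g.:
// xrSession.addEventListener('selectstart', (e) => onGrab(e.inputSource));
```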
As the first featured partner in Amplify's innovation initiative, in ***Futures x epok.tech*** I delivered the SpaceBeats prototype, leading a workshop for 3 teams of in-house creatives to explore and experiment with AI for creative use-cases. A sonic-spatial AI lets artists and fans co-create music communally through movement in space: a musician places melodies around a space using GPS; as the audience move through the space, their devices play new compositions generated by AI interpolating their coordinates between the musician's placed melodies, each person playing various musical parts and collaborations. The generated music branches from the artist's vision, influenced by the audience – a communal discourse mediated by AI. After weeks guiding the teams' explorations of creative concepts, research, and executions of diverse routes – women's football, multi-sensory event mementos, communal music co-creation – the latter emerged for me to develop into a working open-source prototype, with Google Magenta and TensorFlow generating the AI melodies and Fly.io serving the real-time socket.io connections.
2023.04–2023.09
[CV++ Futures: SpaceBeats, AI workshop, spatial-music ~ project @ Amplify](https://epok-tech.notion.site/CV-Futures-SpaceBeats-AI-workshop-spatial-music-project-Amplify-1476c002a33f80bb8719f236616be46e)
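A minimal sketch of the core interpolation idea, assuming @magenta/music's MusicVAE and two melodies (NoteSequences) placed at GPS anchors; the checkpoint, projection, and step count are illustrative choices, not the prototype's exact code:

```js
// Pick the MusicVAE interpolation nearest to a listener's position
// between two placed melodies, as in the SpaceBeats concept above.
import * as mm from '@magenta/music';

const CHECKPOINT =
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_4bar_small_q2';
const musicVAE = new mm.MusicVAE(CHECKPOINT);
const ready = musicVAE.initialize(); // load the model once

// Fraction of the way the listener stands from anchor A to anchor B,
// projected onto the line between the two placements (hypothetical scheme).
function progressBetween(a, b, listener) {
  const abLat = b.lat - a.lat, abLon = b.lon - a.lon;
  const apLat = listener.lat - a.lat, apLon = listener.lon - a.lon;
  const t = (apLat * abLat + apLon * abLon) / (abLat ** 2 + abLon ** 2);
  return Math.min(1, Math.max(0, t));
}

async function melodyAt(melodyA, melodyB, a, b, listener, steps = 8) {
  await ready;
  // MusicVAE.interpolate returns `steps` sequences blending A into B.
  const blends = await musicVAE.interpolate([melodyA, melodyB], steps);
  const index = Math.round(progressBetween(a, b, listener) * (steps - 1));
  return blends[index]; // play back via e.g. mm.Player on the device
}
```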
Creative-Technology Director of these rich sites showcasing HERE's vision for the future of location technologies – articles exploring "Autonomous World", "Articles and Content", and "VR", with an editorial UI tray smoothly overlaying slick WebGL effects that transform seamlessly between forms for each theme – alongside a rich WebVR experience: an immersive exploration of a futuristic cityscape storyline.
2017.07–2017.10
[***CV++ HERE Vision and VR, interactive 3D stories ~ projects @ B-Reel, HERE***](https://epok-tech.notion.site/CV-HERE-Vision-and-VR-interactive-3D-stories-projects-B-Reel-HERE-1476c002a33f801aaf25fac26b4b5996)
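A minimal sketch of one way such seamless form transitions can be driven, assuming a compiled WebGL program whose fragment shader mixes two effect forms by a `morph` uniform, e.g. `mix(formA, formB, morph)`; the theme names, targets, and easing factor are hypothetical:

```js
// Ease a `morph` uniform toward each theme's target form per frame,
// so switching themes transforms the effect smoothly rather than cutting.
const themeForms = { autonomousWorld: 0, articles: 0.5, vr: 1 }; // hypothetical targets
let morph = 0;
let target = 0;

// Called by the editorial UI tray when the reader switches theme.
function setTheme(name) { target = themeForms[name] ?? 0; }

// Per-frame update, given the WebGL context and the uniform's location.
function frame(gl, morphLocation) {
  morph += (target - morph) * 0.08; // ease toward the active theme's form
  gl.uniform1f(morphLocation, morph);
  requestAnimationFrame(() => frame(gl, morphLocation));
}
```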