Choreographic Interfaces
Choreographic Interfaces: Embodied Digital Interactions Through Movement-Driven Shader Systems
This project explores the intersection of movement, computer vision, and real-time visual transformation through an interactive digital environment. By tracking the position of the upper body in space, the system creates a responsive canvas where movement serves as both the input mechanism and the artistic expression.
Using a webcam and the PoseNet machine learning model, the interface tracks specific body parts (wrists and elbows) and maps each to a different visual effect created with GLSL shaders. As the user moves within the frame, their body directly manipulates these shader effects in real time, creating a dynamic visual composition that responds to their choreography.
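A minimal sketch of how this tracking could be wired up with ml5.js and p5.js is shown below; the variable names and structure are illustrative rather than the project's actual code.

```js
let video;
let poseNet;
let tracked = {}; // latest positions of the joints this interface cares about

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();

  // Load PoseNet and subscribe to pose estimates for each video frame
  poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (poses) => {
    if (poses.length === 0) return;
    const pose = poses[0].pose;
    // Keep only wrists and elbows, the joints mapped to shaders
    for (const part of ['leftWrist', 'rightWrist', 'leftElbow', 'rightElbow']) {
      tracked[part] = { x: pose[part].x, y: pose[part].y, confidence: pose[part].confidence };
    }
  });
}
```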
Keyboard Controls
While exploring the interface, you can use these keyboard shortcuts:
- D - Toggle debug overlay (shows tracking data)
- F - Toggle fullscreen mode
- T - Adjust influence duration
- M - Adjust influence strength
Click on any quadrant to enter fullscreen mode for that shader, or click again to return to the four-quadrant view.
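A rough sketch of how these controls might be handled in p5.js follows; the state variables and the stepping behavior for T and M are assumptions, since the exact adjustment increments aren't documented here.

```js
let debugVisible = false;
let focusedQuadrant = null;   // null = four-quadrant grid view
let influenceDuration = 1.0;  // assumed default, in seconds
let influenceStrength = 0.5;  // assumed default, 0..1

function keyPressed() {
  if (key === 'd' || key === 'D') debugVisible = !debugVisible;              // show/hide tracking data
  if (key === 'f' || key === 'F') fullscreen(!fullscreen());                 // toggle fullscreen
  if (key === 't' || key === 'T') influenceDuration = (influenceDuration % 3) + 0.5;     // assumed stepping
  if (key === 'm' || key === 'M') influenceStrength = (influenceStrength + 0.25) % 1.25; // assumed stepping
}

function mousePressed() {
  if (focusedQuadrant !== null) {
    focusedQuadrant = null;                 // second click returns to the grid
  } else {
    const col = mouseX < width / 2 ? 0 : 1;
    const row = mouseY < height / 2 ? 0 : 1;
    focusedQuadrant = row * 2 + col;        // 0..3, one index per shader quadrant
  }
}
```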
Concept & Inspiration
This project investigates the relationship between human movement and computational aesthetics. Drawing inspiration from dance notation systems, interactive art installations, and real-time visual performance tools, the Choreographic Interface creates a dialogue between physical expression and digital visual language.
Unlike traditional interfaces that rely on conscious input (clicking, typing), this project makes the entire body the input device. The result is an embodied interaction where movements create cascading visual effects, encouraging exploration through gesture and spatial positioning.
Technical Implementation
The system is built with:
- p5.js - For canvas handling and basic drawing functions
- ml5.js - Provides PoseNet functionality for body tracking
- WebGL & GLSL - For hardware-accelerated shader effects
- JavaScript - Manages the interaction between tracking and visualization
The interface is divided into four quadrants, each with a unique shader effect. When a tracked body part (wrist or elbow) enters a quadrant, it activates and controls the corresponding shader effect in real time.
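One plausible way to route a tracked joint to the quadrant it occupies is sketched below; `shaders` is assumed to be an array of four p5.Shader instances, and `joint.speed` is assumed to be computed from frame-to-frame position changes.

```js
// Map a normalized position (0..1) to a quadrant index:
// 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right (assumed layout).
function quadrantFor(nx, ny) {
  const col = nx < 0.5 ? 0 : 1;
  const row = ny < 0.5 ? 0 : 1;
  return row * 2 + col;
}

// Each frame, feed every tracked joint's position and velocity to the shader
// whose quadrant it currently occupies.
function updateQuadrants(tracked, shaders) {
  for (const joint of Object.values(tracked)) {
    const nx = joint.x / width;
    const ny = joint.y / height;
    const q = quadrantFor(nx, ny);
    shaders[q].setUniform('u_hand', [nx, ny]);            // uniform names are illustrative
    shaders[q].setUniform('u_velocity', joint.speed || 0);
  }
}
```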
Body Part to Shader Mapping
| Body Part | Shader Effect | Description |
|---|---|---|
| Left Wrist | Gradient Shader | Colorful gradients that respond to position and velocity |
| Right Wrist | Swirl Shader | Spiral distortion that follows movement |
| Left Elbow | Wave Shader | Ripple effects that emanate from position |
| Right Elbow | Kaleidoscope Shader | Symmetrical patterns that mirror movement |
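In code, this mapping could be expressed as a simple lookup table; the keys match ml5's PoseNet keypoint names, while the values are placeholder identifiers for the actual shader objects.

```js
// Assumed lookup from tracked joint to the shader it drives
const shaderForPart = {
  leftWrist:  'gradient',      // colorful gradients responding to position and velocity
  rightWrist: 'swirl',         // spiral distortion following movement
  leftElbow:  'wave',          // ripples emanating from the joint's position
  rightElbow: 'kaleidoscope',  // symmetrical patterns mirroring movement
};
```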
Shader Influence System
The most distinctive aspect of this interface is the "shader influence system," which allows effects to transfer between body parts based on movement. When a body part moves quickly, it temporarily influences another shader, creating a cascade of visual effects.
| Step | Source | Target | Influence |
|---|---|---|---|
| 1 | Right Wrist | Left Wrist | Swirl → Gradient |
| 2 | Left Wrist | Right Elbow | Gradient → Kaleidoscope |
| 3 | Right Elbow | Left Elbow | Kaleidoscope → Wave |
| 4 | Left Elbow | Right Wrist | Wave → Swirl |
This creates a circular relationship that allows effects to cascade through your body movements, encouraging exploration of the entire movement space.
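A sketch of how this velocity-triggered cascade might be implemented follows; the speed threshold, decay rate, and data structures are assumptions rather than values taken from the project.

```js
// Circular influence chain from the table above
const influenceTarget = {
  rightWrist: 'leftWrist',   // Swirl → Gradient
  leftWrist:  'rightElbow',  // Gradient → Kaleidoscope
  rightElbow: 'leftElbow',   // Kaleidoscope → Wave
  leftElbow:  'rightWrist',  // Wave → Swirl
};

const SPEED_THRESHOLD = 0.02; // assumed: normalized units per frame
const DECAY_PER_FRAME = 0.02; // assumed fade-out rate
let influence = {};           // target joint -> remaining influence strength (0..1)

function updateInfluence(tracked) {
  // Fast movement pushes a joint's effect onto its target shader
  for (const [part, joint] of Object.entries(tracked)) {
    if ((joint.speed || 0) > SPEED_THRESHOLD) {
      influence[influenceTarget[part]] = 1.0;
    }
  }
  // Influence fades over time so effects cascade and then settle
  for (const part of Object.keys(influence)) {
    influence[part] = Math.max(0, influence[part] - DECAY_PER_FRAME);
  }
}
```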
Shader Implementation
Each shader is implemented as a GLSL fragment shader that receives input parameters such as:
- Hand position (x, y) - Normalized coordinates of the controlling body part
- Velocity - How quickly the body part is moving
- Time - For animation effects
- Influence parameters - When another shader is influencing this one
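On the JavaScript side, these parameters would typically be passed to each fragment shader as uniforms via p5's `setUniform`; the uniform names below are illustrative, not the project's actual ones.

```js
// Per-frame uniform updates for one quadrant's shader
function applyUniforms(sh, joint, influenceAmount) {
  sh.setUniform('u_hand', [joint.x / width, joint.y / height]); // normalized position
  sh.setUniform('u_velocity', joint.speed || 0);                // movement speed
  sh.setUniform('u_time', millis() / 1000);                     // seconds since sketch start
  sh.setUniform('u_influence', influenceAmount);                // 0 when no other shader influences this one
}
```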
Future Directions
This project opens up several avenues for future exploration:
- Full-body tracking for more complex interactions
- Audio reactivity to integrate sound and movement
- Multi-user collaborative environments
- Recording and playback of movement sequences
- Integration with live performance contexts