Beyond Analog Metaphors

Most creative coding tools simulate analog hardware: virtual knobs, cables, circuit boards. This imposes constraints that don't exist in computation.


MayaFlux embraces true digital paradigms. Oscillators, patch cables, and envelope generators are pedagogical crutches borrowed from hardware whose constraints never applied to digital computation. Instead, MayaFlux builds on recursion, look-ahead processing, arbitrary precision, cross-domain data sharing, and computational patterns with no analog equivalent.


Polynomials sculpt data. Logic gates make creative decisions. Coroutines coordinate time itself.

What Makes MayaFlux Different

Unified Data Streams

Audio, visual, and control signals share the same numerical substrate. A node output routes to an RtAudio callback and a Vulkan push constant simultaneously, with no translation layer.
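To make the idea concrete, here is a minimal sketch of one signal fanned out to two domains. Every name here (`LfoNode`, `process`, the brightness parameter) is a hypothetical illustration of the shared-substrate idea, not MayaFlux's actual node API:

```cpp
#include <cmath>
#include <vector>

constexpr double kTwoPi = 6.283185307179586;

// A toy signal source: one sine value per tick.
struct LfoNode {
    double phase = 0.0;
    double freq  = 2.0;      // Hz
    double rate  = 48000.0;  // sample rate

    float tick() {
        float v = static_cast<float>(std::sin(kTwoPi * phase));
        phase += freq / rate;
        return v;
    }
};

// The same sample scales an audio buffer (audio domain) and sets a
// "brightness" value a renderer could consume (visual domain).
float process(LfoNode& lfo, std::vector<float>& audio, float& brightness) {
    float v = lfo.tick();
    for (float& s : audio) s *= 0.5f + 0.5f * v;   // audio-domain use
    brightness = 0.5f + 0.5f * v;                  // visual-domain use
    return v;
}
```

One number, two consumers, no translation layer: the "domain" lives in the consumer, not the data.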


Nexus: Spatial Entity Lifecycle

Fabric, Wiring, Emitter, Sensor, Agent. A spatial computation layer where entities perceive and influence audio and graphics simultaneously. Not a scene graph. No update loop. No component system.


Granular Synthesis as Data Analysis

Recordings decompose into populations of named, attributed grains. Sort by spectral centroid. Filter by variance. Reorder by any function you can express in code. The composition is an explicit analytical argument about the material, not a phasor and a scatter parameter.


Portal::Text

A glyph is a quad. A quad is four numbers. Four numbers can go anywhere. Text renders as a texture driven by the same node graph as audio and geometry: per-glyph oscillators, physics, GPU bindings.
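A sketch of that claim, assuming nothing about MayaFlux's vertex layout: a glyph quad is just positions and texture coordinates, and any signal can displace it per glyph:

```cpp
#include <array>
#include <cmath>

// Four numbers per corner: position (x, y) and texture coords (u, v).
struct Vertex { float x, y, u, v; };
using GlyphQuad = std::array<Vertex, 4>;

GlyphQuad make_quad(float x, float y, float w, float h) {
    return {{ {x, y, 0, 0},         {x + w, y, 1, 0},
              {x, y + h, 0, 1},     {x + w, y + h, 1, 1} }};
}

// A per-glyph oscillator: each glyph wobbles on its own phase.
void wobble(GlyphQuad& q, int glyph_index, double t) {
    float dy = 0.1f * static_cast<float>(std::sin(t + glyph_index));
    for (Vertex& v : q) v.y += dy;
}
```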


Physical Modelling Networks

ModalNetwork, WaveguideNetwork, and ResonatorNetwork implement physical modelling synthesis as first-class node graph citizens. Excitation, coupling, boundary conditions, and spatial routing are all live-configurable parameters.
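For readers unfamiliar with the technique, here is the textbook digital waveguide (a Karplus-Strong string), the kind of structure a WaveguideNetwork node generalizes. This is the standard algorithm, not MayaFlux's implementation:

```cpp
#include <cstddef>
#include <vector>

// A plucked string: a circular delay line whose length sets the pitch,
// with a lowpass reflection filter at the boundary.
struct Waveguide {
    std::vector<float> line;   // delay line = the traveling wave
    std::size_t pos = 0;
    float damping;             // boundary loss at the "bridge"

    explicit Waveguide(std::size_t period, float damping_ = 0.996f)
        : line(period, 0.0f), damping(damping_) {}

    void excite(float amp) {               // pluck: alternating burst
        for (std::size_t i = 0; i < line.size(); ++i)
            line[i] = (i % 2 ? amp : -amp);
    }

    float tick() {                         // one output sample
        std::size_t next = (pos + 1) % line.size();
        float out = line[pos];
        // two-point average = lowpass reflection; energy decays each pass
        line[pos] = damping * 0.5f * (line[pos] + line[next]);
        pos = next;
        return out;
    }
};
```

In a network formulation, excitation, damping, and the coupling between lines all become routable parameters rather than constants.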


Cross-Modal Node Bindings

Node outputs bind directly to GPU shader parameters. Audio envelopes, spectral data, and control signals reach the GPU through the same node API, with no bridging code.


Live Coding with Lila

An embedded Clang interpreter evaluates arbitrary C++23 at runtime via LLVM ORC JIT. Latency is one buffer cycle. Algorithms change without stopping audio or tearing down the graphics context. Used in production during a 20-minute live set on a Steam Deck.


Coroutine Temporal Control

C++20 coroutines are the scheduling primitive. SampleDelay, FrameDelay, and MultiRateDelay awaiters let temporal intent cross domain boundaries. Time is compositional material, not a callback interval.


Lock-Free Architecture

No mutexes in the real-time path. atomic_ref, CAS-based dispatch, and lock-free registration lists coordinate all four execution contexts: RtAudio callbacks, Vulkan render threads, async input backends, and user coroutines.
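The registration-list idea can be sketched as a classic CAS push onto a shared singly linked list: a thread adds itself without ever taking a lock, so the real-time path can never block on a mutex held elsewhere. Names here are illustrative, not MayaFlux internals:

```cpp
#include <atomic>

struct Registration {
    int id;
    Registration* next;
};

std::atomic<Registration*> g_head{nullptr};

// Lock-free push: retry the compare-exchange until our node is linked in.
void register_node(Registration* r) {
    r->next = g_head.load(std::memory_order_relaxed);
    while (!g_head.compare_exchange_weak(
        r->next, r,
        std::memory_order_release,      // publish r->next before linking
        std::memory_order_relaxed)) {
        // on failure, r->next now holds the current head; just retry
    }
}

int count_registered() {
    int n = 0;
    for (auto* p = g_head.load(std::memory_order_acquire); p; p = p->next) ++n;
    return n;
}
```

Removal and traversal under concurrent mutation need more machinery (e.g. deferred reclamation), which is where the snapshot-publication pattern mentioned elsewhere comes in.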


Live Signal Matrix

IOManager provides a unified interface across the full signal matrix: audio files, video files, live camera devices, and image assets. SamplingPipeline adds polyphonic multi-cursor playback with independent speed and looping per voice.
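Polyphonic multi-cursor playback reduces to a simple idea: many independent read positions over one buffer. This sketch assumes hypothetical `Cursor` fields and a nearest-sample read; it illustrates the concept, not SamplingPipeline's actual interface:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One voice: its own position, speed, and loop behavior.
struct Cursor {
    double pos = 0.0;     // fractional read position in samples
    double speed = 1.0;   // playback rate (2.0 = octave up)
    bool loop = true;
    bool active = true;
};

// Sum all active voices for one output sample, advancing each cursor
// at its own rate. A real pipeline would interpolate between samples.
float mix_tick(const std::vector<float>& buf, std::vector<Cursor>& voices) {
    float out = 0.0f;
    for (Cursor& c : voices) {
        if (!c.active) continue;
        out += buf[static_cast<std::size_t>(c.pos) % buf.size()];
        c.pos += c.speed;
        if (c.pos >= static_cast<double>(buf.size())) {
            if (c.loop) c.pos = std::fmod(c.pos, static_cast<double>(buf.size()));
            else        c.active = false;
        }
    }
    return out;
}
```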


MIDI and HID Input

RtMidi and HIDAPI backends feed an async InputManager that dispatches to InputNode instances in the node graph. Controllers, sensors, and custom HID devices become signal sources with the same routing API as any other node.

Philosophy

Domain is decided last

Sound, visuals, and control signals are all numbers. A camera is a position and two matrices. A light is a position and a falloff function. A voice is a cursor into a buffer. Domain vocabulary describes what something is used for, not what it is computationally. MayaFlux does not enshrine that vocabulary in its type system.


Same structure, different outputs

A gravitational attractor and a reverb send are the same computational structure pointed at different outputs. MayaFlux has no light type, no force type, no camera type. It has positioned entities, functions that fire, and outputs that route to audio, geometry, or GPU descriptors. What the entity is in domain terms is entirely the user's imagination applied to pure math.


Don't name what doesn't need a name

Vulkan does not have a camera. It has a push constant slot. If you put a matrix there, the vertex shader reads it. MayaFlux exposes this as-is. Not naming it is not a missing feature. It is a deliberate refusal to add abstraction that adds nothing but the illusion of safety.


Code as Creative Material

Data transformation is the creative act. Programming is compositional structure, not plumbing. When tools protect you from complexity, they also protect you from possibility.


Time as Structure

Temporal relationships are part of artistic expression, not implementation detail. Coroutines are the scheduling primitive. Time is material.

Built From Necessity

MayaFlux wasn't built to improve on existing tools. It was built because they could not support the work already happening.

  • 15+ years of interdisciplinary performance across Chennai, Delhi, and the Netherlands
  • Production audio engineering for Unreal Engine 5 and Metro: Awakening VR
  • Experimental creative computing education
  • Live performance under pressure: 0.3 was developed in parallel with a 20-minute live coding set on a Steam Deck at TOPLAP Bengaluru. Four pieces. The framework and the performance were simultaneous.

Current Status

Version 0.3.0, April 2026. Alpha. Stable core.


Stable: Audio processing, lock-free node graphs, coroutine scheduler, channel routing with crossfade transitions.


Stable: Vulkan 1.3 dynamic rendering, multi-window, geometry nodes, texture pipeline, compute shaders, GPU readback.


Stable: Nexus spatial entity layer: Fabric, Wiring, Emitter, Sensor, Agent, SpatialIndex3D with lock-free snapshot publication.


Stable: Granular synthesis with offline grain attribution, SamplingPipeline for polyphonic multi-cursor playback.


Stable: Portal::Text: FreeType glyph rendering as GPU texture, driven by the same node graph as audio and geometry.


Stable: Live camera input (Linux, macOS, Windows), video file playback, image loading via IOManager.


Stable: MIDI and HID input backends, async dispatch, InputNode infrastructure.


Stable: Lila, LLVM ORC JIT for live C++23 evaluation at sub-buffer latency.


Next (0.4): Nexus and agent build-out. Kinesis as first-class analysis namespace: Eigen-level linear algebra, FluCoMa-level audio analysis primitives.


Planned (0.5): Native UI framework, building on existing Region and WindowContainer infrastructure.

Ready to Explore?

MayaFlux is for creators who've outgrown callback-driven thinking and want unified streams across audio, visual, and algorithmic composition.