The MIDI Machine

A serverless SaaS built with Rust compiled to WebAssembly and deployed with AWS Amplify.

At a glance

Start date: 2022-01-25
Language(s): Rust, JavaScript, Python
GitHub Repo(s):
URL: https://midi.alteredbrainchemistry.com

Origins

This project began as a capstone concept in LSU’s Digital Media Art Engineering program, exploring how visual input could be transformed into musical output. My role focused on designing the procedural music‑generation component and improving the structure and performance of the existing Python prototype.

To support this, I analyzed the original implementation—which included several dense and hard‑to‑trace sections—and refactored it into a clearer, more modular architecture. The combined codebase was roughly 1000 lines; after restructuring, I reduced it to about 500 lines while preserving full functionality and improving maintainability.

I also optimized the image processing pipeline by replacing disk‑based intermediate storage with in‑memory operations, significantly improving performance and responsiveness.

During this process, I discovered that the prototype's melody‑sequencing logic relied on a shuffled sequence derived from image data, introducing nondeterministic behavior. I designed my music‑generation system to work coherently with this mechanism by defining chord types, scale‑appropriate harmonies, and a chord‑selection process that maps the shuffled values to musically valid choices in the key of C minor.
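The mapping idea can be sketched as follows. This is a minimal illustration, not the project's actual chord tables: the triad list and the modulo mapping rule are assumptions, showing only how an arbitrary shuffled value can be reduced to a scale‑valid choice in C minor.

```rust
/// Diatonic triads in C natural minor, as MIDI note numbers from middle C (60).
/// Illustrative table, not the project's actual chord vocabulary.
const C_MINOR_TRIADS: [[u8; 3]; 7] = [
    [60, 63, 67], // i    (C  Eb G)
    [62, 65, 68], // ii°  (D  F  Ab)
    [63, 67, 70], // III  (Eb G  Bb)
    [65, 68, 72], // iv   (F  Ab C)
    [67, 70, 74], // v    (G  Bb D)
    [68, 72, 75], // VI   (Ab C  Eb)
    [70, 74, 77], // VII  (Bb D  F)
];

/// Map an arbitrary shuffled value to a musically valid chord by reducing it
/// modulo the number of diatonic triads, so any input yields an in-key chord.
fn pick_chord(value: u32) -> [u8; 3] {
    C_MINOR_TRIADS[(value as usize) % C_MINOR_TRIADS.len()]
}
```

Because every table entry is already in key, the shuffled sequence can be as chaotic as it likes; the reduction step guarantees a harmonically valid result.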

Emergent Behavior

An early loop‑ordering quirk unexpectedly produced rich polyphonic textures. After analyzing the behavior, I corrected the underlying logic and incorporated the resulting musical patterns into the final design, giving the system a distinctive melodic character.

Rust Refactor

To achieve real-time performance and broader deployability, I reimplemented the entire system in Rust using the midly crate. I built lightweight abstractions for note insertion and MIDI finalization, enabling the Rust version to generate MIDI files in under a second. The project also compiles to WebAssembly, allowing it to run directly in the browser.
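The note‑insertion and finalization abstraction can be sketched roughly as below. This is a std‑only stand‑in (the actual project builds on the `midly` crate, whose types differ): notes go in with musical timing, and finalization produces a time‑ordered on/off event stream.

```rust
/// A simplified MIDI-like event. Illustrative; `midly` uses its own event types.
#[derive(Debug, Clone, PartialEq)]
struct Event {
    tick: u32,     // absolute time in ticks
    on: bool,      // true = note-on, false = note-off
    key: u8,       // MIDI note number
    velocity: u8,
}

#[derive(Default)]
struct Track {
    events: Vec<Event>,
}

impl Track {
    /// Insert a note as a paired note-on / note-off event.
    fn insert_note(&mut self, start: u32, duration: u32, key: u8, velocity: u8) {
        self.events.push(Event { tick: start, on: true, key, velocity });
        self.events.push(Event { tick: start + duration, on: false, key, velocity: 0 });
    }

    /// Finalize: sort events by tick, placing note-offs before note-ons at the
    /// same tick so a retriggered note is not immediately silenced.
    fn finalize(mut self) -> Vec<Event> {
        self.events.sort_by_key(|e| (e.tick, e.on)); // false (off) sorts first
        self.events
    }
}
```

The appeal of this shape is that generation code only ever calls `insert_note`; all ordering concerns are deferred to a single finalization pass.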

I then replaced the original image-derived note sequence mechanism with a hash-seeded RNG (ChaCha20). This approach dramatically improves performance: hashing the input is O(n) in its length and each RNG sample is O(1), whereas deriving randomness from image analysis scales with the image dimensions, O(w × h). The new method provides deterministic, reproducible melodic structures at significantly lower computational cost.
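A rough sketch of the hash-seed-then-sample pattern, using only the standard library: the project itself uses a ChaCha20 RNG, so the tiny SplitMix64 generator here is a stand-in chosen purely to keep the example dependency-free, and the `melody` helper and scale table are illustrative.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Hash any input (O(n) in its length) down to a 64-bit seed.
/// DefaultHasher::new() uses fixed keys, so this is reproducible within a build.
fn seed_from<T: Hash + ?Sized>(input: &T) -> u64 {
    let mut h = DefaultHasher::new();
    input.hash(&mut h);
    h.finish()
}

/// SplitMix64: a minimal deterministic RNG; each sample is O(1).
/// Stand-in for the ChaCha20 RNG used in the real project.
struct SplitMix64(u64);

impl SplitMix64 {
    fn next(&mut self) -> u64 {
        self.0 = self.0.wrapping_add(0x9E3779B97F4A7C15);
        let mut z = self.0;
        z = (z ^ (z >> 30)).wrapping_mul(0xBF58476D1CE4E5B9);
        z = (z ^ (z >> 27)).wrapping_mul(0x94D049BB133111EB);
        z ^ (z >> 31)
    }
}

/// Hypothetical helper: seed from arbitrary input, then sample notes from a
/// C natural minor scale. Same input always yields the same melody.
fn melody(seed_input: &str, len: usize) -> Vec<u8> {
    let scale = [60u8, 62, 63, 65, 67, 68, 70]; // C natural minor
    let mut rng = SplitMix64(seed_from(seed_input));
    (0..len)
        .map(|_| scale[(rng.next() % scale.len() as u64) as usize])
        .collect()
}
```

The key property is that total cost is one O(n) hash plus O(1) per note, independent of how large the seeding input (e.g. an image) is.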

Chord Pruning

The Rust version introduces a feature for shaping the harmonic “vocabulary” of the MIDI Machine. The vocabulary is the set of chords the system can choose from during generation. Chord pruning allows this set to be restricted to chords that fit a selected scale.

When pruning is enabled, the system initializes all possible chords for the chosen chord types across every root note. It then filters out any chords containing notes outside the selected scale, leaving only scale‑compatible options. Because the pruning process uses hash‑based collections, the resulting chord sets are sorted to ensure deterministic ordering. As an optional feature, sorting can be disabled to introduce additional randomness at the cost of reproducibility.
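The pruning pass can be sketched like this, under assumed interval tables (the real system supports more chord types than the two shown): enumerate every chord type on every root, discard chords containing out-of-scale pitch classes, and sort the hash-set-backed survivors for deterministic ordering.

```rust
use std::collections::HashSet;

/// Chord types as semitone intervals from the root (illustrative subset).
const CHORD_TYPES: [[u8; 3]; 2] = [
    [0, 4, 7], // major triad
    [0, 3, 7], // minor triad
];

/// Build every chord of every type on all 12 roots, keep only those whose
/// pitch classes all lie in `scale`, then sort so the HashSet's arbitrary
/// iteration order cannot leak into the output.
fn prune(scale: &[u8]) -> Vec<[u8; 3]> {
    let allowed: HashSet<u8> = scale.iter().copied().collect();
    let mut set: HashSet<[u8; 3]> = HashSet::new();
    for root in 0..12u8 {
        for ty in CHORD_TYPES {
            let chord = [
                (root + ty[0]) % 12,
                (root + ty[1]) % 12,
                (root + ty[2]) % 12,
            ];
            if chord.iter().all(|pc| allowed.contains(pc)) {
                set.insert(chord);
            }
        }
    }
    let mut kept: Vec<[u8; 3]> = set.into_iter().collect();
    kept.sort(); // skip this sort to trade reproducibility for hash-order variety
    kept
}
```

For C natural minor (pitch classes 0, 2, 3, 5, 7, 8, 10), this keeps exactly the six diatonic major and minor triads; omitting the final `sort()` is where the optional non-deterministic mode comes from.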

Another possible implementation uses bitwise operations over pitch-class masks, but that approach enforces full determinism and leaves no room for the optional non‑deterministic mode. Because the current design supports both deterministic and non‑deterministic behavior, the bitwise method was not used here.