GIMLeT – Gestural Interaction Machine Learning Toolkit
Updated Nov 2, 2024 · Max
"Dicy2 for Max" is a Max package implementing interactive agents using machine-learning to generate musical sequences that can be integrated into musical situations ranging from the production of structured material within a compositional process to the design of autonomous agents for improvised interaction. Check also our plugin for Ableton live !
A set of Max abstractions designed for computing motion descriptors from raw motion capture data in real time.
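The abstractions themselves run inside Max; purely as an illustration of what "computing motion descriptors from raw motion capture data" can mean, the Python sketch below derives per-frame speed, acceleration magnitude, and an overall quantity-of-motion value from a stream of 3D marker positions. The sampling rate, the specific descriptors, and the NumPy-based implementation are assumptions for this sketch, not the toolkit's code.

```python
# Minimal motion-descriptor sketch (illustrative, not the Max abstractions):
# kinematic descriptors from an (N, 3) array of raw marker positions.
import numpy as np

FRAME_RATE = 100.0  # assumed mocap sampling rate in Hz
DT = 1.0 / FRAME_RATE

def motion_descriptors(positions: np.ndarray) -> dict:
    """Compute simple kinematic descriptors from raw 3D positions."""
    velocity = np.gradient(positions, DT, axis=0)      # first derivative
    acceleration = np.gradient(velocity, DT, axis=0)   # second derivative
    speed = np.linalg.norm(velocity, axis=1)           # per-frame speed
    return {
        "speed": speed,
        "acc_magnitude": np.linalg.norm(acceleration, axis=1),
        "quantity_of_motion": float(np.sum(speed)),
    }

# Example: 200 frames of a synthetic marker trajectory
frames = np.cumsum(np.random.randn(200, 3) * 0.01, axis=0)
print(motion_descriptors(frames)["speed"][:5])
```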
Model View Controller in Max
Repository for the TouchBox
A student collaboration for a class at HKU: a standalone system that uses cloth as an interface between human touch and a musical/sound coding system.