Timbre Space Control

layout.png

Using a machine learning model, I derived a low-dimensional control space that allows quick navigation and expressive control of high-dimensional synthesizer parameter sets. Here, six sliders set a point in the six-dimensional parameter space, and a two-dimensional joystick moves about that point along the primary timbral directions learned by the model.
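The slider/joystick mapping described above can be sketched as follows. This is a hypothetical illustration, not the model from [1]: the direction vectors, scale factor, and parameter ranges are assumptions chosen for clarity.

```python
import numpy as np

def map_controls(slider_vals, joy_x, joy_y, dir_x, dir_y, scale=0.1):
    """Return a 6-D synth parameter vector: the base point set by the
    sliders, offset along two learned timbral directions by the joystick."""
    base = np.asarray(slider_vals, dtype=float)        # point set by the six sliders
    offset = scale * (joy_x * dir_x + joy_y * dir_y)   # move in learned directions
    return np.clip(base + offset, 0.0, 1.0)            # keep parameters in [0, 1]

# Example with two made-up unit direction vectors in the 6-D space
dir_x = np.array([1, 0, 0, 0, 0, 0], dtype=float)
dir_y = np.array([0, 1, 0, 0, 0, 0], dtype=float)
params = map_controls([0.5] * 6, joy_x=0.5, joy_y=-0.5, dir_x=dir_x, dir_y=dir_y)
```

In the real system the two directions would come from the learned model rather than being axis-aligned, but the control flow is the same: sliders pick the operating point, the joystick explores its timbral neighborhood.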

A full technical description of the machine learning model is available in [1] and [2]; the user study is detailed in [3] and [4].

Timbre-space model uses a joystick’s X and Y axes to navigate in a 6-D parameter space.

[1] Jeff Gregorio and Youngmoo Kim. 2019. Augmenting Parametric Synthesis with Learned Timbral Controllers. Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 431–436. http://doi.org/10.5281/zenodo.3673025

[2] https://github.com/JeffGregorio/TimbreMap

[3] Jeff Gregorio and Youngmoo Kim. 2021. Evaluation of Timbre-Based Control of a Parametric Synthesizer. Proceedings of the International Conference on New Interfaces for Musical Expression, Online and NYU Shanghai. http://nime2021.org/program/#/paper/203

[4] Jeff Gregorio. 2021. Timbre Space Learning for Augmentation of Musical Audio Synthesizer Interfaces. (Unpublished doctoral dissertation). Drexel University, Philadelphia, PA, USA.