Google's Project Magenta researchers have developed a hardware instrument that serves as the physical interface for the NSynth algorithm, which uses a deep neural network to generate new sounds. Magenta uses machine-learning tools to enable artists to create art and music in new ways.
The researchers teamed with Google Creative Labs to create the NSynth Super instrument, which lets musicians play sounds that the NSynth algorithm generates by blending four different source sounds. NSynth Super has a touch screen that users drag their fingers across to play the sounds.
The instrument uses 16 original source sounds across a range of 15 pitches, which were recorded in a studio and then fed into the NSynth algorithm. The algorithm generated more than 100,000 new sounds, which were loaded into the experience prototype. Musicians choose the source sounds they want to explore and drag a finger across the touchscreen to combine their acoustic qualities.
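To make the blending idea concrete, here is a minimal sketch, not the actual NSynth Super code: NSynth represents each source sound as a learned embedding and creates in-between sounds by interpolating those embeddings, so a touch position between four corner sounds can be mapped to bilinear mixing weights. The function name blend_embeddings, the corner labels, and the normalized (x, y) coordinates are illustrative assumptions; the step where NSynth's decoder synthesizes audio from the interpolated embedding is omitted.

    import numpy as np

    def blend_embeddings(corners, x, y):
        """Bilinearly interpolate four corner embeddings.

        corners: dict with keys 'top_left', 'top_right', 'bottom_left',
                 'bottom_right', each a 1-D numpy embedding vector.
        x, y:    touch position normalized to the range [0, 1].
        """
        top = (1 - x) * corners["top_left"] + x * corners["top_right"]
        bottom = (1 - x) * corners["bottom_left"] + x * corners["bottom_right"]
        return (1 - y) * top + y * bottom

    # Example: four random 16-dimensional vectors stand in for encoded source sounds.
    rng = np.random.default_rng(0)
    corners = {k: rng.normal(size=16) for k in
               ("top_left", "top_right", "bottom_left", "bottom_right")}
    mixed = blend_embeddings(corners, x=0.25, y=0.75)  # weighted toward the bottom-left sound
    print(mixed.shape)  # (16,)

In this sketch, moving the finger toward a corner of the touchscreen shifts the mixing weights toward that corner's source sound, which is the interaction the instrument's touch interface exposes.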
From Tech Xplore
Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA