This is a Brain Music EEG headset. Credit: Osaka University

Music, more than any other art form, is a beautiful mix of science and emotion: it follows near-mathematical patterns to draw feelings out of its audience. Machines that make music focus on these patterns but give little consideration to the emotional response of their listeners. An international research team led by Osaka University, together with Tokyo Metropolitan University, imec in Belgium, and Crimson Technology, has released a new machine-learning device that detects the emotional state of its listeners and produces new songs that elicit new feelings.

“Most machine songs depend on an automatic composition system,” says Masayuki Numao, professor at Osaka University. “They are preprogrammed with songs but can only make similar songs.”

Numao and his team of scientists wanted to enhance the interactive experience by feeding the user’s emotional state to the machine. Users listened to music while wearing wireless headphones fitted with brain wave sensors. The sensors picked up EEG readings, which the robot used to compose music.
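The article does not describe how the EEG readings are turned into music, so the following Python sketch is purely illustrative: it assumes a pipeline in which band powers are extracted from a short EEG window, converted into rough valence and arousal scores, and then mapped to simple composition parameters. The frequency bands, heuristics, and parameter names are assumptions, not Osaka University’s actual method.

```python
# Hypothetical EEG-to-music pipeline: EEG window -> emotional estimate -> musical parameters.
# All heuristics and mappings here are illustrative assumptions.

import numpy as np

SAMPLE_RATE = 256  # Hz, assumed headset sampling rate


def band_power(signal, low, high, fs=SAMPLE_RATE):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()


def estimate_emotion(eeg_window):
    """Map a short EEG window to rough (valence, arousal) scores in [0, 1].

    Heuristic only: a higher beta-to-alpha ratio is treated as higher arousal,
    and relative alpha power is used as a stand-in for valence.
    """
    alpha = band_power(eeg_window, 8, 12)
    beta = band_power(eeg_window, 13, 30)
    theta = band_power(eeg_window, 4, 7)
    total = alpha + beta + theta + 1e-9
    arousal = beta / (alpha + beta + 1e-9)
    valence = alpha / total
    return float(valence), float(arousal)


def music_parameters(valence, arousal):
    """Translate the emotional estimate into simple composition parameters."""
    return {
        "tempo_bpm": int(60 + 80 * arousal),        # calmer listener -> slower tempo
        "mode": "major" if valence > 0.4 else "minor",
        "velocity": int(50 + 70 * arousal),         # MIDI-style loudness
    }


if __name__ == "__main__":
    # Stand-in for a real EEG stream: 2 seconds of synthetic signal.
    rng = np.random.default_rng(0)
    window = rng.normal(size=2 * SAMPLE_RATE)
    v, a = estimate_emotion(window)
    print(music_parameters(v, a))
```

In a real system the synthetic window would be replaced by a live stream from the headset, and the parameter dictionary would drive whatever composition engine sits downstream; the point of the sketch is only the shape of the loop the article describes.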


“We preprogrammed the robot with songs, but added the brain waves of the listener to make new music,” says Numao. He found that users were more engaged with the music when the system could detect their brain patterns.

Numao envisions a number of societal benefits to a human-machine interface that considers emotions. “We can use it in health care to motivate people to exercise or cheer them up.”

The device was on display at the 3rd Wearable Expo in Tokyo, Japan, last January.

Source: Osaka University