Researchers, including one from India, have carried out a study in which they identified which song participants were listening to by decoding the participants' brain signals.

The team of researchers from Brazil, India, Germany and Finland conducted the study at Brazil's D'Or Institute for Research and Education, using a magnetic resonance (MR) machine to read participants' minds with up to 85% accuracy.

The study, published in the journal Scientific Reports, improved on earlier decoding techniques and paved the way for new research on reconstructing auditory imagination and inner speech.

In the experiment, six volunteers listened to 40 pieces of music spanning classical, rock, pop, jazz and other genres.

The neural fingerprint of each song on each participant's brain was captured by the MR machine while a computer learned to identify the brain patterns elicited by each musical piece.

Musical features such as tonality, dynamics, rhythm and timbre were taken into account by the computer.

The computer identified the correct song with up to 85% accuracy, a marked improvement over previous studies.
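The matching step described above can be pictured as a nearest-neighbor classifier: each song has a stored "neural fingerprint", and a new brain measurement is assigned to the song whose fingerprint it correlates with best. The sketch below is purely illustrative, with invented random fingerprints and noise levels; the actual study worked with real fMRI data and its own trained model.

```python
import random

random.seed(0)

N_SONGS = 40  # the study used 40 musical pieces
DIM = 50      # hypothetical feature dimension (stand-in for tonality, dynamics, rhythm, timbre...)

# Hypothetical stored "neural fingerprints": one feature vector per song.
fingerprints = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_SONGS)]

def correlation(a, b):
    """Pearson correlation between two equal-length vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def identify(pattern):
    """Return the index of the song whose fingerprint best matches the observed pattern."""
    return max(range(N_SONGS), key=lambda i: correlation(pattern, fingerprints[i]))

# Simulate a noisy brain measurement of song 7 and decode it.
true_song = 7
observed = [x + random.gauss(0, 0.3) for x in fingerprints[true_song]]
print(identify(observed))
```

With noise small relative to the signal, the correlation with the true fingerprint dominates all others and the song is recovered; as noise grows, accuracy drops, which is why the study's 85% figure is a measure of how distinct the elicited brain patterns are.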

In the future, studies on brain decoding and machine learning could enable forms of communication that do not depend on any written or spoken language.

“Machines will be able to translate our musical thoughts into songs,” said Sebastian Hoefle, a researcher at the D’Or Institute and a Ph.D. student at the Federal University of Rio de Janeiro, Brazil.

Looking ahead, Hoefle hopes to answer questions such as: “What musical features make some people love a song while others don’t? Is our brain adapted to prefer a specific kind of music?”

In the clinical domain, this technology could be used to enhance brain-computer interfaces in order to establish communication with locked-in syndrome patients.