Can computers write music that portrays human feelings? A BYU graduate student is working on a project that does just that.
Kristine Monteith is both a music therapist and a PhD student in computer science at BYU. The goal of her research is to make computers more emotionally aware.
"One of the main questions in artificial intelligence is ‘Can you get a computer to act like a human?'" she said.
Monteith has created a program that lets a user type in an emotion, such as fear; the computer then writes an original song based on that emotion.
"It's not going to replace Bach and Beethoven anytime soon, but it's decent."
So if you're happy, your computer kicks out a happy tune.
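The article doesn't describe how Monteith's system works internally, but the basic idea of mapping an emotion label to musical parameters can be sketched. Below is a toy illustration (not her actual method): a hypothetical lookup table ties each emotion to a scale, tempo, and register, and a melody is sampled from those constraints as MIDI note numbers.

```python
import random

# Hypothetical emotion-to-parameter table (illustrative values only):
# scale = pitch intervals above the base note, tempo in BPM, base = MIDI root.
EMOTION_PARAMS = {
    "happy": {"scale": [0, 2, 4, 5, 7, 9, 11], "tempo": 140, "base": 60},  # major, brisk
    "sad":   {"scale": [0, 2, 3, 5, 7, 8, 10], "tempo": 70,  "base": 48},  # minor, slow, low
    "fear":  {"scale": [0, 1, 4, 6, 7, 8, 11], "tempo": 110, "base": 48},  # dissonant set
}

def generate_melody(emotion, length=16, seed=None):
    """Return (tempo, list of MIDI note numbers) for a named emotion."""
    params = EMOTION_PARAMS[emotion.lower()]
    rng = random.Random(seed)
    # Draw each note from the emotion's pitch set above the base register.
    notes = [params["base"] + rng.choice(params["scale"]) for _ in range(length)]
    return params["tempo"], notes

tempo, melody = generate_melody("fear", seed=42)
```

A real system would learn these associations from labeled music rather than hard-coding them, but the sketch shows the input-output shape: emotion in, playable note sequence out.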
She's also trying to apply the program in her undergraduate field of music therapy. Eventually she hopes to create songs that raise or lower a patient's blood pressure during therapy. This could help calm a patient who is, for instance, hyperventilating or having an anxiety attack.
She can also sync the program with text, for instance one of Aesop's fables, so the music matches the emotional content of the words as they are read, heightening the experience for the reader.
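Matching music to a story implies tagging each passage with an emotion first. The article doesn't say how this is done; a minimal, assumed approach is a keyword lookup per sentence, sketched below (the keyword lists and emotion labels are invented for illustration).

```python
# Toy emotion tagger: assign each sentence an emotion via keyword matching.
# A real system would use trained sentiment/emotion classification instead.
EMOTION_KEYWORDS = {
    "fear":  {"wolf", "afraid", "danger", "cried"},
    "happy": {"laughed", "safe", "rejoiced"},
    "sad":   {"alone", "lost", "wept"},
}

def tag_sentences(text):
    """Split text on periods and pair each sentence with a guessed emotion."""
    tagged = []
    for sentence in text.split("."):
        sentence = sentence.strip()
        if not sentence:
            continue
        words = set(sentence.lower().split())
        emotion = "neutral"
        for emo, keywords in EMOTION_KEYWORDS.items():
            if words & keywords:  # any keyword present in the sentence
                emotion = emo
                break
        tagged.append((sentence, emotion))
    return tagged

tags = tag_sentences("The boy cried wolf. The villagers laughed.")
```

Each tagged sentence could then cue a matching generated passage, so the score shifts as the story's mood does.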