A few seemingly simple questions can open the door to much bigger ones. That observation comes from Christina Vanden Bosch der Nederlanden of the University of Toronto. A cellist from an early age who went on to become a neuroscientist, she had long wondered how the brain tells music apart from speech. "We know that from the age of 4, children can and easily do make a clear distinction between music and language," she recalls. "While this seems clear enough, there is little or no data asking children to make these kinds of distinctions." She addresses this question in a study presented at the annual meeting of the Cognitive Neuroscience Society (CNS) in San Francisco.
Contrasting results between infants and adults
The work stems from an experiment with 4-month-old infants: they listened to words and songs, delivered either in the sing-song voice typically used to address babies or in a monotone. Meanwhile, the researchers recorded the brain's electrical activity using electroencephalography (EEG). They found that infants track sentences more successfully when they are spoken rather than sung, but in adults the finding is reversed: adults integrate words better when they are sung. The researchers also note that pitch and rhythm affect brain activity. According to them, a lack of pitch stability is an important acoustic cue for directing attention in infants. As Christina Vanden Bosch der Nederlanden explains, pitch stability can help a listener identify a song, and, conversely, instability tells the infant that someone is speaking to them, not singing.
Understanding how people distinguish music from speech
In an online experiment, the scientist and her colleagues asked children and adults to qualitatively describe how music and language differ. "Both children and adults described characteristics such as tempo, pitch, and rhythm as important features for differentiating between speech and song," she says. "It gave me a huge data set that says a lot about how people think music and language differ acoustically, and also about how the functional roles of music and language differ in our daily lives."
Future clinical applications?
Understanding the relationship between music and language "can help explore fundamental questions of human cognition, such as why humans need music and speech, and how humans communicate and interact with each other through these forms," believes Andrew Chang, another participant in the CNS meeting. These results also open a path toward new clinical applications: music could be of interest as an alternative form of verbal communication for people with aphasia, for example, those who have lost part or all of their ability to speak.