By Nicholas C. Firth
Created Out of Mind and UCL researcher Nicholas Firth presents findings from a new experiment, which harnesses wearable technologies to explore people’s relationship with both familiar and unfamiliar music, and what this might mean for people living with dementias.
In the 1982 film Blade Runner (and recent sequel), the fictional Voight-Kampff machine was used to test whether an individual was human or android. This machine measured respiration, blush response, heart rate and eye movement, whilst asking questions designed to evaluate levels of empathy.
At Created out of Mind we have also set up a machine to examine people’s physical responses to emotional stimuli. However, the purpose of our experiment ‘Play it Again’ is to explore how people living with and without dementias react to familiar and unfamiliar music.
Music has a long-standing connection with dementia, and often people living with dementias experience music in a similar way to people who do not have cognitive decline.
Like the Voight-Kampff machine, our experiment measures heart rate and eye movement, as well as perspiration, body temperature and facial expressions, to examine differences between familiar and unfamiliar music.
“We hope to communicate something of the richness of response that many of us - whether living with dementia or without - have to music.”
A few weeks ago we invited our first participant, a musician, to come and take part in our experiment. Mary, as we’ll call her, was a willing research participant and quickly got suited up! This included putting an Empatica E4, a device rather like a high-tech Fitbit, on each of her wrists. These devices measure:
- Electrodermal activity (perspiration levels)
- Heart rate
- Body temperature
- Blood volume pulse
- Acceleration, or movement
Each of these measures contains information about how the body is reacting to its environment. For example, there is a known relationship between electrodermal activity (a measure of the skin’s electrical conductance, which rises with perspiration) and emotional arousal. In this experiment we expect to see higher levels of emotional arousal for familiar music than for unfamiliar music.
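To illustrate the kind of comparison this expectation implies, here is a minimal sketch in Python. The numbers are entirely made up for illustration, and `arousal_difference` is a hypothetical helper, not part of our actual analysis pipeline:

```python
from statistics import mean

# Illustrative (made-up) electrodermal activity readings, one mean
# value per 12-second clip; these are not real experimental data.
eda_familiar = [0.42, 0.47, 0.45, 0.50, 0.44]
eda_unfamiliar = [0.31, 0.35, 0.30, 0.33, 0.34]

def arousal_difference(familiar, unfamiliar):
    """Difference in mean electrodermal activity between familiar
    and unfamiliar clips. A positive value is consistent with higher
    emotional arousal for familiar music."""
    return mean(familiar) - mean(unfamiliar)

diff = arousal_difference(eda_familiar, eda_unfamiliar)
print(diff > 0)  # a positive difference suggests higher arousal for familiar clips
```

In practice the raw wristband signal would need cleaning and averaging per clip before a comparison like this; the sketch only shows the final step.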
Once the wristbands were on, Mary sat down in front of our eye-tracker, which would measure her pupil size during the experiment, and a video camera, which would film her.
The experiment took around 12 minutes, during which time Mary listened to thirty-one 12-second clips of songs, answering two questions about each. She was highly familiar (five out of five on our scale) with all but one of our well-known pop songs, and highly unfamiliar (zero out of five) with our lesser-known pop songs, which were matched for musical style. Music clearly knows no age: our familiar songs were aimed at people between 55 and 70 years of age, but Mary, who is considerably younger than this, knew many of them, which was great!
Physiological responses vary depending on the person, the context and the type of music, so we could not predict exactly how Mary’s measures would change between familiar and unfamiliar music. However, based on previous work, we did expect to see higher electrodermal activity and larger pupil size for familiar music than for unfamiliar music. What we actually saw was a higher average pupil size during the first five seconds of each song, after which pupil size dropped for the latter part of the clips (below-left graph).
We also saw consistently higher electrodermal activity when Mary listened to familiar music, which matched our expectations (below-right graph). Mary’s results did show some inconsistencies over time; in this case that doesn’t mean she’s an android, but it does need further investigation!
As well as capturing the more traditional physiological measures, we also decided to measure how Mary’s facial expressions changed throughout the experiment using Microsoft's Cognitive Services. The emotion recognition tool combines video with facial recognition and gives a prediction (expressed as a percentage likelihood) of which of eight emotional states the face is showing: neutral, happiness, sadness, contempt, surprise, disgust, anger and fear.
Mary's most common emotion was neutral, accounting for just over 70% of the experiment, which is unsurprising given the conditions. Her second and third most common were happiness and sadness, at 13% and 12% respectively. As the only other person in the room, I was grateful to see that her contempt, disgust, anger and fear scores were negligible throughout the experiment.
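Proportions like these can be derived by labelling each video frame with its most likely emotion and counting the labels. The sketch below uses made-up per-frame likelihoods (only three of the eight states, for brevity), and `emotion_proportions` is an illustrative helper rather than part of the Cognitive Services API:

```python
from collections import Counter

# Illustrative per-frame emotion predictions (made-up numbers): each
# frame is a dict of likelihoods over emotional states.
frames = [
    {"neutral": 0.8, "happiness": 0.1, "sadness": 0.1},
    {"neutral": 0.6, "happiness": 0.3, "sadness": 0.1},
    {"neutral": 0.2, "happiness": 0.7, "sadness": 0.1},
    {"neutral": 0.7, "happiness": 0.1, "sadness": 0.2},
]

def emotion_proportions(frames):
    """Label each frame with its most likely emotion, then report
    the share of frames assigned to each label."""
    labels = [max(frame, key=frame.get) for frame in frames]
    counts = Counter(labels)
    return {emotion: n / len(frames) for emotion, n in counts.items()}

print(emotion_proportions(frames))  # e.g. mostly "neutral" in this toy data
```

With a real recording there would be thousands of frames, but the aggregation step is the same.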
We can use Mary’s answers to the questions about how familiar she was with the music to create two sets of clips, highly familiar and highly unfamiliar, and look at the differences in her emotion predictions for each. From the graphs below we can see that, on average, Mary was happier listening to familiar songs than unfamiliar songs, in particular one of her own songs. I should note that this wasn’t self-indulgence on her part: she was completely unaware that I was going to spring this song on her, and I think her laughter was picked up by the emotion recognition! Similarly, there was a decrease in neutrality and sadness when she listened to familiar rather than unfamiliar music.
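The familiarity split described above can be sketched as follows. The clip records and the `mean_happiness_by_familiarity` helper are hypothetical, and the ratings reuse the 0–5 familiarity scale from the experiment:

```python
from statistics import mean

# Illustrative records (made-up numbers): per-clip familiarity rating
# on the 0-5 scale, and the average happiness likelihood for that clip.
clips = [
    {"familiarity": 5, "happiness": 0.25},
    {"familiarity": 5, "happiness": 0.30},
    {"familiarity": 0, "happiness": 0.08},
    {"familiarity": 0, "happiness": 0.11},
]

def mean_happiness_by_familiarity(clips):
    """Split clips into highly familiar (rated 5) and highly
    unfamiliar (rated 0) sets and average happiness for each."""
    familiar = [c["happiness"] for c in clips if c["familiarity"] == 5]
    unfamiliar = [c["happiness"] for c in clips if c["familiarity"] == 0]
    return mean(familiar), mean(unfamiliar)

fam, unfam = mean_happiness_by_familiarity(clips)
print(fam > unfam)  # True for this made-up data, mirroring the pattern we saw
```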
Currently our results are not conclusive (or “statistically significant”) due to the small number of experiments we have run, but the researchers at Created out of Mind are encouraged by the positive responses we have observed so far. We hope that we can use the data collected during our Hub residency at Wellcome Collection to communicate something of the richness of response that many of us - whether living with dementia or without - have to music, and the valuable impact it has on our lives.
Created Out of Mind project 'Play it again' explores how familiar music can provoke a measurable physiological response in people living with and without dementias. In particular, we are measuring responses to music in people who have difficulty communicating their familiarity with music, to test the effectiveness of music in creating a positive response. This work is led by Created Out of Mind collaborator and UCL Computer Scientist Nicholas Firth.
If you would like to find out more information, then please contact the Created Out of Mind team at: firstname.lastname@example.org