A1 Journal article (refereed)
What makes music memorable? Relationships between acoustic musical features and music-evoked emotions and memories in older adults (2021)

Salakka, I., Pitkäniemi, A., Pentikäinen, E., Mikkonen, K., Saari, P., Toiviainen, P., & Särkämö, T. (2021). What makes music memorable? Relationships between acoustic musical features and music-evoked emotions and memories in older adults. PLoS ONE, 16(5), Article e0251692. https://doi.org/10.1371/journal.pone.0251692

JYU authors or editors

Publication details

All authors or editors: Salakka, Ilja; Pitkäniemi, Anni; Pentikäinen, Emmi; Mikkonen, Kari; Saari, Pasi; Toiviainen, Petri; Särkämö, Teppo

Journal or series: PLoS ONE

eISSN: 1932-6203

Publication year: 2021

Publication date: 14/05/2021

Volume: 16

Issue number: 5

Article number: e0251692

Publisher: Public Library of Science (PLoS)

Publication country: United States

Publication language: English

DOI: https://doi.org/10.1371/journal.pone.0251692

Publication open access: Openly available

Publication channel open access: Open Access channel

Publication is parallel published (JYX): https://jyx.jyu.fi/handle/123456789/75697


Background and objectives
Music has a unique capacity to evoke both strong emotions and vivid autobiographical memories. Previous music information retrieval (MIR) studies have shown that the emotional experience of music is influenced by a combination of musical features, including tonal, rhythmic, and loudness features. Here, our aim was to explore the relationship between music-evoked emotions and music-evoked memories and how musical features (derived with MIR) can predict them both.

Healthy older adults (N = 113, age ≥ 60 years) participated in a listening task in which they rated a total of 140 song excerpts, comprising folk songs and popular songs from the 1950s to the 1980s, on five domains measuring the emotional (valence, arousal, emotional intensity) and memory (familiarity, autobiographical salience) experience of the songs. A set of 24 musical features was extracted from the songs using computational MIR methods. Principal component analyses were applied to reduce multicollinearity, resulting in six core musical components, which were then used to predict the behavioural ratings in multiple regression analyses.
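The analysis pipeline described above (feature extraction, PCA to reduce multicollinearity, then multiple regression on the component scores) can be sketched as follows. This is an illustrative example using synthetic data; scikit-learn stands in for whatever MIR and statistics toolchain the authors actually used, and the feature matrix and rating vector here are random stand-ins, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in: 140 song excerpts x 24 correlated musical features,
# generated from 6 hidden "core" dimensions plus noise
n_songs, n_features = 140, 24
latent = rng.normal(size=(n_songs, 6))
loadings = rng.normal(size=(6, n_features))
X = latent @ loadings + 0.1 * rng.normal(size=(n_songs, n_features))

# Standardize, then reduce multicollinearity with PCA
# (6 components, mirroring the six core musical components in the study)
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=6)
components = pca.fit_transform(X_std)  # shape: (140, 6)

# Regress a behavioural rating (here a synthetic "emotional intensity")
# on the component scores in a multiple regression
y = components @ np.array([0.5, -0.3, 0.2, 0.0, 0.1, 0.0])
y = y + 0.2 * rng.normal(size=n_songs)
model = LinearRegression().fit(components, y)
r_squared = model.score(components, y)  # variance in ratings explained
```

Because the PCA scores are mutually orthogonal, the regression coefficients are free of the multicollinearity that the raw 24 features would introduce.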

All correlations between behavioural ratings were positive and ranged from moderate to very high (r = 0.46–0.92). Emotional intensity showed the highest correlations with both autobiographical salience and familiarity. In the MIR data, three musical components measuring salience of the musical pulse (Pulse strength), relative strength of high harmonics (Brightness), and fluctuation in the frequencies between 200–800 Hz (Low-mid) predicted both music-evoked emotions and memories. Emotional intensity (and, to a lesser extent, valence) mediated the predictive effect of the musical components on music-evoked memories.
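The mediation finding above (a musical component predicts memory ratings partly through emotional intensity) can be sketched with a Baron and Kenny style decomposition into direct and indirect effects. Again this uses synthetic data; the variable names, effect sizes, and the simple OLS approach are assumptions for illustration, not the study's actual model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 140

# Synthetic stand-in: one musical component score per song, a mediator
# ("emotional intensity") and an outcome ("autobiographical salience")
component = rng.normal(size=(n, 1))
intensity = 0.8 * component[:, 0] + 0.3 * rng.normal(size=n)
salience = 0.7 * intensity + 0.1 * component[:, 0] + 0.3 * rng.normal(size=n)

# Step 1: total effect of the component on the memory rating
c_total = LinearRegression().fit(component, salience).coef_[0]

# Step 2: effect of the component on the mediator (a-path)
a = LinearRegression().fit(component, intensity).coef_[0]

# Step 3: outcome regressed on component AND mediator
# gives the direct effect (c') and the mediator's effect (b-path)
Xm = np.column_stack([component[:, 0], intensity])
c_direct, b = LinearRegression().fit(Xm, salience).coef_

# Indirect (mediated) effect; for OLS, c_total = c_direct + a * b exactly
indirect = a * b
```

A large indirect effect relative to the direct effect, as constructed here, is the pattern consistent with emotional intensity carrying most of the components' influence on music-evoked memories.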

The results suggest that music-evoked emotions are strongly related to music-evoked memories in healthy older adults and that both music-evoked emotions and memories are predicted by the same core musical features.

Keywords: emotions; music; music psychology; memory (cognition); memories (mental objects); older people

Free keywords: emotions; music cognition; music perception; memory; bioacoustics; elderly; entropy; regression analysis

Contributing organizations

Ministry reporting: Yes

Reporting Year: 2021

Preliminary JUFO rating: 1

Last updated on 2022-06-17 at 10:27