Unravelling individual rhythmic abilities using machine learning

Simone Dalla Bella*, Stefan Janaqi, Charles-Etienne Benoit, Nicolas Farrugia, Valentin Bégel, Laura Verga, Eleanor E Harding, Sonja A. Kotz*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Humans can easily extract the rhythm of a complex sound, like music, and move to its regular beat, as in dance. These abilities are modulated by musical training and vary significantly in untrained individuals. The causes of this variability are multidimensional and typically hard to grasp in single tasks. To date, we lack a comprehensive model capturing the rhythmic fingerprints of both musicians and non-musicians. Here we harnessed machine learning to extract a parsimonious model of rhythmic abilities, based on behavioral testing (with perceptual and motor tasks) of individuals with and without formal musical training (n = 79). We demonstrate that variability in rhythmic abilities and their link with formal and informal music experience can be successfully captured by profiles including a minimal set of behavioral measures. These findings highlight that machine learning techniques can be employed successfully to distill profiles of rhythmic abilities, and ultimately shed light on individual variability and its relationship with both formal musical training and informal musical experiences.
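The abstract does not specify which algorithm was used to derive the parsimonious profiles. As a purely illustrative sketch, the Python snippet below shows one common way to reduce a battery of perceptual and motor measures to a minimal informative subset: recursive feature elimination with cross-validation. The feature names, the synthetic stand-in data, and the choice of logistic regression are assumptions for demonstration, not the authors' pipeline.

```python
# Illustrative sketch only: one way to distill a minimal ("parsimonious")
# set of behavioral measures that separates musically trained from
# untrained participants. Feature names, synthetic data, and the use of
# recursive feature elimination with logistic regression are assumptions
# for demonstration, not the pipeline reported in the paper.
import numpy as np
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical perceptual and motor rhythm measures.
features = [
    "beat_perception_dprime",    # sensitivity in a beat-alignment task
    "unpaced_tapping_cv",        # variability of spontaneous tapping
    "paced_tapping_async",       # mean asynchrony to a metronome
    "tempo_adaptation_slope",    # adjustment to tempo changes
    "music_tapping_cv",          # variability when tapping to music
]

# Synthetic stand-in data: 79 participants, binary musical-training label.
n = 79
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, len(features)))
X[:, 0] += 0.8 * y            # make two of the measures weakly informative
X[:, 2] -= 0.6 * y

# Recursive feature elimination with cross-validation keeps only the
# measures that contribute to out-of-sample classification.
selector = RFECV(
    estimator=LogisticRegression(max_iter=1000),
    step=1,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    scoring="accuracy",
)
selector.fit(StandardScaler().fit_transform(X), y)

kept = [name for name, keep in zip(features, selector.support_) if keep]
print("Minimal set of measures:", kept)
print("Number of measures retained:", selector.n_features_)
```

With real behavioral data, the retained measures would constitute the kind of compact rhythmic profile the abstract describes; the cross-validation step guards against keeping measures that only fit the training sample.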
Original language: English
Article number: 1135
Journal: Scientific Reports
Volume: 14
Issue number: 1
DOIs
Publication status: Published - 11 Jan 2024
