Syncposium 2023


The Syncposium kicks off in 2023 as a concise scientific event, bringing together renowned international experts who have engaged in extensive research on rhythm as a fundamental dimension of human life. The nature of the topic is intrinsically multidisciplinary, lying at the intersection of neuroscience, psychology, movement sciences, and musicology.

The format of this event is designed to be compact, condensing a series of keynote lectures into two half-day sessions. During this time, speakers will have ample opportunity to elucidate their perspectives. In contrast to larger scientific conferences, our goal is to provide an agile organization that is both budget-friendly and accessible, with free attendance for all. The event’s brief duration is intended to accommodate attendees’ schedules. By offering a hybrid format, we aim to reach a diverse audience, enabling them to fully benefit from the rich content. The discussion around the day’s topics will be enhanced through extended Q&A sessions, round tables, and an on-site reception for informal scientific discussions. The event can host up to 120 attendees onsite and provide unlimited access to online streaming.

The theme of this inaugural edition, hosted by the IPEM Institute for Systematic Musicology at Ghent University (Belgium), is ‘Current Perspectives on the Modelling of Rhythmic Interactions’. Examining the spectrum of human behaviours, ranging from walking and dancing to ensemble music playing, it becomes apparent how intricately complex rhythmic patterns evolve over time. However, systematically investigating the underlying mechanisms and fundamental dynamics of these patterns is a challenging endeavour that necessitates simplification. Our discussions will aim to demystify the concept of modelling and contemplate its core principles beyond complex mathematical formulas. We will explore how scientists navigate complexity, the perspectives they adopt, the assumptions underlying their approaches, and the limitations and biases that arise. Our speakers’ presentations will revolve around the frameworks shaping their research, from brain networks and brain-body-environment interactions to interpersonal coordination and musical interactions.

We envision the Syncposium as a continuously evolving dialogue, striving not only to comprehend, but also to shape future trajectories of the study of human rhythmic interactions.



20/09/2023 (15:00 – 19:00)

21/09/2023 (09:00 – 13:00)


Online: Zoom link received upon registration

In person:

  • Room ‘Blauwe Vogel’
  • De Krook Library
  • Platteberg 11
  • 9000 Ghent, BE

Register (free of charge)


Day 1 – Rhythm in social interactions

20/09/2023, 15:00 – 19:00

Precise yet flexible rhythmic interpersonal coordination, as in musical group performance, requires individuals to anticipate and adapt to each other’s action timing. While these processes are based on fundamental sensory-motor mechanisms, there is large variation in coordination skills between individuals. A growing body of research has used computational modeling approaches to explore behavioral and brain correlates of these individual differences. I will give an overview of studies that have used the Adaptation and Anticipation Model (ADAM) to investigate rhythmic coordination skills in different domains, including human participants interacting with other humans, computer-controlled virtual partners, and robots, as well as in neuroimaging experiments and lesion studies.
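The adaptation mechanism at the heart of models of this kind can be illustrated with a minimal simulation. The sketch below is not the published ADAM implementation; it shows only the basic linear phase-correction idea on which such models build, with an illustrative correction gain `alpha` and motor-noise level:

```python
import numpy as np

def simulate_phase_correction(n_taps=50, period=500.0, alpha=0.5,
                              motor_noise=10.0, seed=0):
    """Simulate a tapper who corrects a fraction `alpha` of each
    asynchrony against an isochronous metronome (period in ms).
    Illustrative sketch only, not the ADAM model itself."""
    rng = np.random.default_rng(seed)
    metronome = np.arange(n_taps) * period   # metronome onset times
    taps = np.zeros(n_taps)
    taps[0] = 30.0                           # start 30 ms late
    for n in range(n_taps - 1):
        asynchrony = taps[n] - metronome[n]
        # next inter-tap interval: base period minus a corrected
        # fraction of the current asynchrony, plus timing noise
        iti = period - alpha * asynchrony + rng.normal(0, motor_noise)
        taps[n + 1] = taps[n] + iti
    return taps - metronome                  # asynchronies over time

asyncs = simulate_phase_correction(alpha=0.5)
```

With 0 < alpha < 2 the initial asynchrony decays toward a noise-limited steady state; fitting such a gain to observed tap series is one way individual differences in coordination skill can be quantified.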

Peter Keller is Professor of Neuroscience in the Center for Music in the Brain and the Department of Clinical Medicine at Aarhus University (Denmark), with a joint appointment in the MARCS Institute for Brain, Behaviour and Development at Western Sydney University (Australia). His background includes music performance and composition as well as scientific research, with degrees in Music and Psychology from the University of New South Wales (Australia). His research addresses the psychological and neurophysiological underpinnings of human interaction in musical contexts.

Music is fundamentally a social phenomenon, in that we listen to, synchronize to, and make music together. In many contemporary styles of music, much of the musical material is improvised on the spot, which requires the involved musicians to agree on predictive structures such as meter and tonality. In this lecture, I will briefly introduce the theory of predictive processing of music as a framework for understanding the fundamental brain processes underlying musical perception and action and propose how this theory may be extended to account for the dynamics underlying collective music making.

Professor Peter Vuust, Ph.D., is a unique combination of top-level jazz musician and scientist. He leads the Danish National Research Foundation's center "Music In the Brain" (MIB) and holds joint appointments as full professor at the Royal Academy of Music and the Department of Clinical Medicine at Aarhus University. He has published more than 150 scientific papers in high-ranking international journals, among them the review "Music in the brain" in Nature Reviews Neuroscience (March 2022), centered on the theory of predictive coding of music. He uses state-of-the-art brain-scanning techniques such as fMRI, PET, EEG and MEG alongside behavioral measures, and is a world-leading expert in the field of music and the brain, a research field he has single-handedly built up in Denmark as leader of MIB, which currently employs more than 30 researchers. Among many other grants, he has received DKK 98 million (~US $15 million) as PI from the Danish National Research Foundation.

In addition, Prof. Vuust is a renowned jazz bassist and composer, leading the Peter Vuust Quartet with Alex Riel, Lars Jansson and Ove Ingemarsson, which has released seven records so far. He has also played on more than 100 recordings and been a sideman for international jazz stars such as Lars Jansson, Tim Hagans, John Abercrombie, Dave Liebman and many more. He received the Jazz Society of Aarhus' "Gaffel" prize in 2009, and his album "September Song" was widely acclaimed by reviewers and nominated for a Danish Music Award in 2014. As professor at the Royal Academy of Music in Aarhus, Denmark, he has taught electric and acoustic bass as well as music theory, ear training and ensemble playing. He has given many keynote talks and masterclasses at international conferences and institutions on topics ranging from the neuroscience of music to improvisation and composition. He has written three monographs: "Polyrhythm and -meter in modern jazz: a study of Miles Davis' Quintet from the 1960s", "Music on the Brain", and, most recently, a book on musical leadership.

Understanding rhythmic entrainment is essential for understanding adaptive, expressive and empathic behavior in humans. Here we develop a methodological approach that builds on Kuramoto-like dynamical equations (ODEs) for the phase flow in a dyadic rhythmic entrainment setup, similar to the 'drifting metronomes' paradigm developed in the PhD thesis of Mattia Rosso. The ODEs are used as the predictor engine in a regression approach, allowing estimation of the coupling strengths and phase delays. Using these estimates, several hypotheses are tested, unravelling ingredients of rhythmic entrainment such as metronome adherence, leader/follower gaps and phase delays. A smooth regression analysis based on the posterior predictions of the ODEs offers a dynamic viewpoint, reflected in the instantaneous period representations of the phase flow. The results show that human rhythmic entrainment follows dynamics that depend on the coupling strength and phase delay of a particular interaction setup. This finding suggests a very precise brain component for future neuroscientific work on dyadic rhythmic entrainment and beyond.
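As a rough illustration of the kind of Kuramoto-like phase equations referred to here (a generic textbook form, not the actual analysis pipeline of the study), the following sketch integrates two mutually coupled phase oscillators, each with its own coupling strength K and phase delay δ:

```python
import numpy as np

def simulate_dyad(omega=(2 * np.pi * 1.0, 2 * np.pi * 1.1),
                  coupling=(1.5, 1.5), delay=(0.0, 0.0),
                  dt=0.001, duration=20.0):
    """Euler-integrate two delay-shifted Kuramoto phase equations:
        dθ1/dt = ω1 + K1·sin(θ2 − θ1 − δ1)
        dθ2/dt = ω2 + K2·sin(θ1 − θ2 − δ2)
    Returns the two phase trajectories θ1(t), θ2(t)."""
    n = int(duration / dt)
    th1, th2 = np.zeros(n), np.zeros(n)
    for i in range(n - 1):
        th1[i + 1] = th1[i] + dt * (omega[0]
                     + coupling[0] * np.sin(th2[i] - th1[i] - delay[0]))
        th2[i + 1] = th2[i] + dt * (omega[1]
                     + coupling[1] * np.sin(th1[i] - th2[i] - delay[1]))
    return th1, th2

th1, th2 = simulate_dyad()
rel = th2 - th1  # relative phase: settles to a constant lag when locked
```

With these parameters the frequency detuning is smaller than the total coupling, so the pair phase-locks with a constant relative phase; asymmetric K or δ values produce the leader/follower gaps mentioned in the abstract.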

Marc Leman is Methusalem research professor in systematic musicology, director of IPEM (Institute for Psychoacoustics and Electronic Music) in the Department of Musicology, and head of the Department of Art History, Musicology, and Theatre Studies at Ghent University, Belgium. He founded the ASIL (Art and Science Interaction Lab) at De Krook (inaugurated in 2017). His research concerns (embodied) music interaction and associated epistemological and methodological issues. He has more than 450 publications, among them several monographs (e.g. "Embodied music cognition and mediation technology", MIT Press, 2007; "The expressive moment", MIT Press, 2016) and co-edited works (e.g. "Musical gestures: sound, movement, and meaning", Routledge, 2010; "The Routledge companion to embodied music interaction", 2017). He teaches courses in Music Psychology (BA III) and Music Interaction and Technology (MA) at UGent. He has supervised or co-supervised several large national (MAMI…) and European projects (MEGA, EBRAMUS, BEATHEALTH,…), and has consulted for the EU Commission, national research councils, institutions and projects. He has supervised more than 40 PhD students and was editor-in-chief of the Journal of New Music Research (until 2004). From 2016 to 2018 he was a member of the Scientific Advisory Committee (SAC) of Science Europe. Recent projects include Expressive Music Interaction: Methusalem II (2015–2021), Expressive Timing: Foundations of expressive timing control in music (2014–2018), and Conbots (EU ICT, started February 2020).

In my talk, I will discuss the development of embodied virtual agents (EVAs) for musical interaction. By implementing low-dimensional oscillator models, EVAs are capable of adaptive and anticipatory synchronization with real-time human rhythmical input. The use of EVAs in music art and science will be discussed. For music science, EVAs form a novel methodological tool to advance research on the principles of joint musical coordination and experiences. For music art, EVAs and virtual environments open up new creative and artistic possibilities. Experiments and applications will be presented to illustrate this potential of EVAs.
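One generic way an oscillator model can give an agent adaptive synchronization is to let it adjust both its phase and its intrinsic frequency toward a periodic input. The sketch below uses a standard adaptive-frequency phase-oscillator rule for illustration; it is an assumption-laden simplification, not necessarily the model implemented in the EVAs discussed here:

```python
import numpy as np

def adaptive_agent(stim_freq=2 * np.pi * 1.2, omega0=2 * np.pi * 1.0,
                   K=2.0, eps=0.5, dt=0.001, duration=30.0):
    """Phase oscillator that adapts its phase AND its intrinsic
    frequency omega toward a periodic stimulus with phase Ω·t:
        dθ/dt = ω + K·sin(Ωt − θ)      (phase correction)
        dω/dt = ε·K·sin(Ωt − θ)        (frequency adaptation)
    Generic illustrative rule, not a specific published EVA model.
    Returns the trajectory of the agent's frequency ω(t)."""
    n = int(duration / dt)
    theta, omega = 0.0, omega0
    omegas = np.empty(n)
    for i in range(n):
        err = np.sin(stim_freq * (i * dt) - theta)
        theta += dt * (omega + K * err)
        omega += dt * (eps * K * err)
        omegas[i] = omega
    return omegas

omegas = adaptive_agent()  # omega drifts from 1 Hz toward the 1.2 Hz input
```

Because the agent internalizes the stimulus frequency rather than merely reacting to phase errors, it keeps tracking (anticipating) the beat even as the input tempo drifts, which is the behavior needed for real-time interaction with human rhythmic input.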

Pieter-Jan Maes is professor in systematic musicology at IPEM, Ghent University, Belgium. He holds a master's degree in musicology and did his PhD on embodied music listening and mediation technologies. He previously worked as a postdoctoral fellow at McGill University, Montreal, in the Department of Music Research and the Department of Psychology, and has been involved in several national and international research projects.

My research focuses on human embodied interaction with, and via, music; "the body in action." Through empirical music research, I aim to better understand the dynamical coordination processes observed in the bodily and brain activity of humans engaged in music performance and listening. I consider the study of coordination processes a gateway towards a better understanding of the powerful (inter)subjective experiences that music can evoke. In my empirical research, I focus on innovations on two related levels. First, I want to improve the computational analysis and modelling of bodily behaviour and responses, in relation to subjectively felt experiences and to musical characteristics and structures. I am particularly interested in exploring time-series analysis methods within the framework of dynamical systems theory. Second, I investigate the potential of extended reality (XR) technologies to innovate stimulus creation and control in empirical experiments, including the use of embodied virtual avatars and agents. I fully believe in the potential of XR to bridge the requirements of ecological validity and controllability in experimental research, as well as in its potential to create "impossible" stimuli. I coordinate the state-of-the-art research infrastructure located at De Krook in Ghent: the Art and Science Interaction Lab (ASIL). The vision is to establish ASIL as a unique research hub that proactively drives innovation in humanities and human-science research, as well as in the cultural-creative sector, by relying on the possibilities of XR technologies.

Day 2 – Rhythm in the brain

21/09/2023, 09:00 – 13:00

Why do humans spontaneously dance to music? To test the hypothesis that motor dynamics reflect predictive timing during music listening, we built melodies with varying degrees of rhythmic predictability. Magnetoencephalography data showed that while auditory regions track the rhythm of melodies, intrinsic neural dynamics at delta (1.4 Hz) and beta (20-30 Hz) rates in the dorsal auditory pathway embody the experience of groove. Critically, neural dynamics are organized along this pathway in a spectral gradient, with the left sensorimotor cortex acting as a hub coordinating groove-related delta and beta activity. Combined with predictions of a neurodynamic model, this indicates that spontaneous motor engagement during music listening is a manifestation of predictive timing effected by interaction of neural dynamics along the dorsal auditory pathway.

Benjamin Morillon is a cognitive neuroscientist, specialized in human auditory neurophysiology and the sequential encoding of information in the brain. His research approach involves integrating cognitive psychology, multi-modal functional neuroimaging and computational modeling to understand how humans process auditory information. His primary focus is on exploring the interplay between auditory and motor neural systems, analyzing how motor routines and cortical motor activity influence auditory processing and behavior. He also studies auditory hemispheric asymmetry, investigating the similarities and differences between the two auditory cortices and their relationship to efficient auditory neural coding.

When listening to music, people of all cultures tend to spontaneously perceive and move the body along with a periodic beat. However, the nature of neural processes that support beat perception remains largely unknown. Here, we review a recent line of research investigating the neural basis of this perceptual phenomenon by capitalizing on electrophysiological recordings of brain activity. This research provides converging evidence that beat perception involves mechanisms that transform the rhythmic stimulus into a temporally recurrent format with emphasized beat periodicity. This transformation takes place already in the auditory cortex, and seems to be present in human adults and infants as well as non-human primates. The resulting “periodized” format may provide a basis further driving coordinated musical behaviors.

Sylvie Nozaradan, MD PhD, is head of the Rhythm & Brains lab based in Brussels, and an Associate Professor at the Institute of Neuroscience of UCLouvain, Belgium since September 2018. The same year, she was awarded an ERC Starting Grant from the European Research Council to develop a research line on rhythm and the brain. Previously, she received an ARC Discovery Early Career Researcher Award from the Australian Research Council, which gave her the opportunity to start developing her research independently for three years at the MARCS Institute, Western Sydney University (Australia), with the mentorship of Pr. Peter Keller. She has a PhD degree in neuroscience from UCLouvain (supervisor: Pr. André Mouraux) and the BRAMS, Montreal, Canada (supervisor: Pr. Isabelle Peretz), where she investigated neural entrainment to musical rhythm. She has a dual background in music (Master in piano, Conservatoire Royal de Bruxelles, Belgium) and science (medical doctor, UCLouvain).

It is well established that cortico-cerebellar-cortical circuitry monitors motor behaviour, but recent evidence shows that this circuitry is similarly engaged in the temporal encoding of basic and more complex (multi)sensory information. Consequently, cerebellar computations might be universal with regard to the temporal encoding of motor and (multi)sensory information, as (i) such information stimulates and monitors cortical information processing, (ii) cerebellar-thalamic output might be a source of endogenous activity that predicts the outcome of cortical information processing, and (iii) this output may provide a temporal frame for the binding of information. I will discuss our current conceptual thinking as well as empirical evidence in support of these considerations, with an extension to social interactions.

Sonja Kotz is a translational cognitive neuroscientist investigating temporal, rhythmic, and formal predictions and control mechanisms in audition, music, and speech across the lifespan, in animal models, and in patients (PD, stroke, tinnitus, psychosis, dyslexia). In her research she utilizes a wide range of behavioral and neuroimaging methods (M/EEG, s/f/rsMRI, TMS). She heads the Neuropsychology section at the Faculty of Psychology and Neuroscience at Maastricht University, The Netherlands, and holds several honorary professorships (Leipzig, Lisbon). She is a senior/associate editor for several high-impact journals in the field (e.g., Imaging Neuroscience, Cortex, Neurobiology of Language).

Our previous clinical trials suggest that musical interventions positively impact the overall wellbeing of patients with neurocognitive disorders such as Alzheimer's disease, vascular dementia, or mixed dementia. These interventions not only benefit patients at the emotional, cognitive and behavioral level, but also reduce distress among caregivers. Based on these findings, we previously suggested that the positive effects of rhythmic stimulation in patients with neurodegenerative disease could be attributed to the ability to move in synchrony with music. However, past research has been inconclusive as to whether decline of cognitive functioning affects rhythmic performance and socio-emotional engagement in music synchronisation. Moreover, the underlying mechanism through which musical interventions provide benefits in degenerative diseases remains unknown. To address this question, we will present findings from a series of clinical studies. Aged participants with varying degrees of cognitive disorders, including those with subjective complaints, were assessed while performing a joint rhythmic task with a musician, using two types of auditory sequences (metronome vs. music). Our data provide evidence that, regardless of the level of global functioning of elderly patients, music promoted sensorimotor synchronisation (SMS), albeit to varying degrees. However, our research revealed a decline in SMS consistency that correlated with cognitive functioning (i.e. Mini-Mental State Examination (MMSE) scores), suggesting that rhythmic dysfunction is associated with the development of neurocognitive disorders. We also found an effect of aging on SMS consistency, particularly when participants tapped to music as opposed to a metronome. Lastly, we examined whether the social context plays a role in patients' ability to synchronize their movements with musical rhythms, and investigated whether the social aspect could enhance patients' motor skills and socio-emotional engagement.
All these results will be discussed in relation to the beneficial effects of music-based interventions in aging and neurodegenerative disease.

Lise Hobeika, a cognitive neuroscientist based at the Paris Brain Institute and the University of Lille, explores the interplay between auditory perception, emotions, and cognition. After obtaining her master's degree at the École Normale Supérieure (ENS) in Paris, she completed her PhD at IRCAM (Institute for Research and Coordination in Acoustics/Music) on the effect of social contexts on auditory space perception. Her current research focuses on the positive impact of music interventions on clinical populations, specifically examining the role of rhythmic stimulation in improving emotional, social, and cognitive functioning in aging and in neurocognitive disorders. More recently, she has started studies on the cerebral dimensions of tinnitus.

Séverine Samson is a cognitive neuropsychologist and a professor of Psychology at the University of Lille in France, and has recently been affiliated with the Hearing Institute in Paris. While maintaining clinical activity in neuropsychology in an Epilepsy unit (Pitié-Salpêtrière Hospital, Paris), she developed neuropsychology training programs at the University of Lille. Her research focuses on musical perception, memory, and emotions using methods taken from psychophysics, cognitive psychology and neuroimagery. She addresses her research questions by analysing different neuropathologies of epileptic, degenerative, developmental and sensory origin and uses music as a framework for understanding the functioning of human cognition and interpersonal coordination. This evolution has led her to experimentally investigate potential therapeutic applications of music in the rehabilitation of cognitive and affective neurological disorders. The multi-disciplinary approach used combines clinical research with the experimental rigor of basic research, at the interface of art, science and cognition.