Science & Technology Series: Research Spotlight

Capturing Similarity in Music

Dr Tillman Weyde, head of City's Music Informatics Research Group (MIRG), is leading a team of researchers building an “Integrated Audio-Symbolic Model of Music Similarity” (ASyMMuS). The team is developing a framework and conducting experiments on integrating melodic and structural models with audio-based similarity models.
by John Stevenson (Senior Communications Officer)

Similarity is central to music: it serves as a benchmark for testing theories of music perception and cognition, is useful in practical applications such as music retrieval and recommendation, and is an overarching question touching many aspects of music.

City researchers in the Department of Computer Science, participating in the inter-university ASyMMuS project, are modelling the concept of similarity in music by developing an initial framework and conducting experiments into the integration of symbolic, melodic and structural similarity models with audio-based models. The model being developed will capture various notions of similarity and will be evaluated using the cultural information usually found in music collections (such as genre, style and composer) as well as user annotations and similarity ratings, like those captured with the Spot the Odd Song Out game developed at MIRG by Dr Daniel Wolff.
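To make the idea concrete, here is a minimal, purely illustrative sketch in Python (with hypothetical helper names, not the project's actual model) of how a symbolic, melody-based distance and an audio-based, timbre-oriented distance might be combined into a single similarity score, whose weighting could in principle be tuned against user similarity ratings:

```python
import numpy as np

def melodic_distance(pitches_a, pitches_b):
    """Toy symbolic distance: compare interval histograms of two melodies,
    each given as a sequence of MIDI pitch numbers."""
    def interval_hist(pitches):
        intervals = np.diff(pitches)
        hist, _ = np.histogram(intervals, bins=np.arange(-12.5, 13.5))
        return hist / max(hist.sum(), 1)
    return float(np.abs(interval_hist(pitches_a) - interval_hist(pitches_b)).sum())

def audio_distance(features_a, features_b):
    """Toy audio distance: Euclidean distance between mean feature vectors
    (e.g. MFCC frames), a crude proxy for timbral similarity."""
    return float(np.linalg.norm(features_a.mean(axis=1) - features_b.mean(axis=1)))

def joint_similarity(pitches_a, pitches_b, features_a, features_b, w_symbolic=0.5):
    """Blend the two distances and map the result to a (0, 1] similarity.
    The weight w_symbolic is the kind of parameter that user similarity
    ratings could be used to fit."""
    d = (w_symbolic * melodic_distance(pitches_a, pitches_b)
         + (1.0 - w_symbolic) * audio_distance(features_a, features_b))
    return 1.0 / (1.0 + d)
```

In practice the melodies would come from symbolic sources or automatic transcription and the audio features from standard signal analysis; the model being built in ASyMMuS is, of course, far richer than a weighted sum.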

Theories of musical similarity have mostly centred on symbolic representation, addressing musical structures in areas such as melodic and harmonic development; however, aspects of expressive performance such as micro-timing and timbre are often ignored. Audio-based models can capture the nuanced details of timbre and dynamics, but little of the musical structure as it unfolds over time. The research in ASyMMuS builds on the recent Digital Music Lab (DML) project, which harnesses recent advances in audio transcription and alignment to create analysis tools for music collections with audio and symbolic content.

Principal investigator Dr Weyde said: “Large datasets of acoustic and symbolic music data, generated or aligned through the DML and ‘Optical Music Recognition from Multiple Sources’ Big Data projects, encourage an approach that combines symbolic and audio-based analyses into a joint similarity model. This opens up scope for new research tools in music information retrieval and musicology, as the interaction between symbolic structure and acoustic information such as timbral texture has rarely been addressed”.
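As a rough illustration of how such a joint model could be checked against the cultural information in a collection, the following hypothetical sketch (again, not the project's evaluation protocol) measures how often a similarity model's nearest neighbour shares a track's genre label:

```python
import numpy as np

def genre_agreement(similarity, genres):
    """Fraction of tracks whose most similar other track has the same genre.
    similarity[i, j] is a model's similarity between tracks i and j;
    genres is the corresponding list of genre labels."""
    similarity = np.asarray(similarity, dtype=float)
    n = len(genres)
    hits = 0
    for i in range(n):
        sims = similarity[i].copy()
        sims[i] = -np.inf               # never count a track as its own neighbour
        nearest = int(np.argmax(sims))  # index of the most similar other track
        hits += int(genres[nearest] == genres[i])
    return hits / n

# Hypothetical three-track example:
sim = [[1.0, 0.8, 0.2],
       [0.8, 1.0, 0.3],
       [0.2, 0.3, 1.0]]
print(genre_agreement(sim, ["baroque", "baroque", "jazz"]))  # two of three neighbours match -> ~0.667
```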

Dr Weyde added: “This could reveal aspects that have been unexplained or unnoticed so far. If successful, this project might contribute to breaking the glass ceiling in music recommendation.”

The ASyMMuS project was launched by City, Lancaster University and University College London with support from the British Library. The £76,779 project is funded by the Arts and Humanities Research Council (AHRC) as part of its investment in Big Data. Dr Daniel Wolff and Dr Emmanouil Benetos from City are also involved, collaborating with co-investigators Dr Nicholas Gold of University College London, Dr Alan Marsden of Lancaster University and Dr Aquiles Alencar-Brayner of the British Library.

Definition
Music Informatics

Music Informatics is the study of computational models of music analysis, music generation, and music information retrieval.
