Name that tune: A pilot study in finding a melody from a sung query

Bryan Pardo*, Jonah Shifrin, William Birmingham

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

45 Scopus citations


We have created a system for music search and retrieval. A user sings a theme from the desired piece of music. The sung theme (query) is converted into a sequence of pitch intervals and rhythms. This sequence is compared to musical themes (targets) stored in a database. The top pieces are returned to the user in order of similarity to the sung theme. We describe, in detail, two different approaches to measuring similarity between database themes and the sung query. In the first, queries are compared to database themes using standard string-alignment algorithms; here, similarity between target and query is determined by edit cost. In the second approach, pieces in the database are represented as hidden Markov models (HMMs). The query is treated as an observation sequence, and a target is judged similar to the query if its HMM has a high likelihood of generating that sequence. In this article we report our approach to the construction of a target database of themes, the encoding and transcription of user queries, and the results of preliminary experimentation with a set of sung queries. Our experiments show that neither approach is clearly superior to the other, although string matching has a slight advantage. Moreover, neither approach surpasses human performance.
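The string-alignment approach described above can be sketched with a standard dynamic-programming edit distance over pitch-interval sequences. This is a minimal illustration, not the authors' implementation: the cost values, the integer pitch-interval encoding, and the helper names (`alignment_cost`, `rank_targets`) are assumptions for the example, and rhythm information is omitted for brevity.

```python
def alignment_cost(query, target, ins=1.0, delete=1.0, sub=2.0):
    # Standard string alignment (edit distance) between two sequences of
    # pitch intervals (semitone differences between successive notes).
    # Lower cost means the target is more similar to the sung query.
    # Costs here (ins/delete/sub) are illustrative, not from the paper.
    m, n = len(query), len(target)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * delete          # delete all remaining query intervals
    for j in range(1, n + 1):
        d[0][j] = j * ins             # insert all remaining target intervals
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            match = 0.0 if query[i - 1] == target[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + delete,    # skip a query interval
                          d[i][j - 1] + ins,       # skip a target interval
                          d[i - 1][j - 1] + match) # align the two intervals
    return d[m][n]

def rank_targets(query, themes):
    # Return theme names ordered by ascending alignment cost,
    # i.e. most similar database theme first.
    return sorted(themes, key=lambda name: alignment_cost(query, themes[name]))
```

For example, a query whose pitch-interval sequence exactly matches a stored theme has cost 0 and is ranked first; a query missing one note incurs only a single insertion cost, so near-miss singing still retrieves the right piece.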

Original language: English (US)
Pages (from-to): 283-300
Number of pages: 18
Journal: Journal of the American Society for Information Science and Technology
Issue number: 4
State: Published - Feb 15 2004

ASJC Scopus subject areas

  • Software
  • Information Systems
  • Human-Computer Interaction
  • Computer Networks and Communications
  • Artificial Intelligence
