

1 of 19 Tempo Induction and Beat Tracking for Audio Signals
MUMT 611, February 2005, Assignment 3
Paul Kolesnik

2 of 19 Contents
- Introduction and Definitions
- Overview of Works
- Summary of Common Techniques
- Conclusion

3 of 19 Introduction: Concepts
- Tempo: the rate at which a musical piece is played, in score time units per real time unit (e.g. beats per minute)
- Beat: a unit in the sequence of impulses that defines the tempo of a musical piece
  - Has no exclusive definition; it can be ambiguous and context-dependent (e.g. 60 bpm vs. 120 bpm)
  - Has a period (the inter-beat interval) and a phase (the estimated beat position in the score); a small numeric illustration follows this slide
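The period/phase distinction can be made concrete with a minimal sketch (not part of the original slides; Python, with illustrative function names):

```python
def inter_beat_interval(tempo_bpm):
    """Convert a tempo in beats per minute to an inter-beat interval (period) in seconds."""
    return 60.0 / tempo_bpm

def beat_times(tempo_bpm, phase, duration):
    """Predicted beat positions (in seconds) for a constant tempo.
    phase is the time of the first beat; duration is the length of the excerpt."""
    period = inter_beat_interval(tempo_bpm)
    times, t = [], phase
    while t < duration:
        times.append(t)
        t += period
    return times

# 120 bpm gives a 0.5 s period; the 60 bpm reading of the same piece would give 1.0 s
print(inter_beat_interval(120))                      # 0.5
print(beat_times(120, phase=0.25, duration=3.0))     # [0.25, 0.75, 1.25, ...]
```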

4 of 19 Introduction
- Tempo Induction: the process of estimating the basic tempo from musical data
- Beat Tracking
  - Definition: the process of extracting beat information from a musical performance (based on tempo information); differentiated from score following by the absence of a score
  - Applications: performance analysis, perceptual modeling, audio content analysis and transcription, performance synchronization

5 of 19 Introduction: Beat Tracking of Musical Data
- MIDI: a symbolic representation; the information needed for beat tracking (e.g. note onsets) is encoded directly in the data
- Audio: requires preprocessing to extract a symbolic representation of the signal

6 of 19 Overview of Works
- Schloss (1985)
  - One of the earliest works
  - Onsets detected as peaks in the slope of the amplitude envelope of a high-pass filtered signal (HFC analysis); a sketch of this approach follows this slide
- Allen and Dannenberg (1991)
  - Defined period and phase; introduced the concept of beam search using multiple tempo / beat hypotheses
  - Not clear whether MIDI or audio is used as input
- Goto and Muraoka (1995-2001)
  - Extensive work on beat-tracking systems with and without drums, combined in 2001
  - Early system (with drums): examines the frequencies of the snare and bass drums, finds onsets and matches them to a set of pre-stored drum patterns
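A minimal Python sketch (not from the original slides) of the kind of envelope-slope onset detection described for Schloss (1985): high-pass filter the signal, take the frame-wise amplitude envelope, and pick peaks in its slope. The cutoff frequency, frame sizes and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter

def envelope_slope_onsets(x, sr, hop=512, win=1024, threshold=0.1):
    """Rough onset detector: peaks in the slope of the amplitude envelope
    of a high-pass filtered signal (in the spirit of Schloss 1985)."""
    # High-pass filter to emphasise percussive energy
    b, a = butter(2, 1000.0 / (sr / 2.0), btype='high')
    y = lfilter(b, a, x)
    # Frame-wise amplitude envelope (RMS)
    env = np.array([np.sqrt(np.mean(y[i:i + win] ** 2))
                    for i in range(0, len(y) - win, hop)])
    # Slope of the envelope; keep only rising parts
    slope = np.diff(env)
    slope[slope < 0] = 0.0
    # Peak picking: local maxima above a fraction of the largest slope
    onsets = []
    for n in range(1, len(slope) - 1):
        if slope[n] > slope[n - 1] and slope[n] >= slope[n + 1] \
                and slope[n] > threshold * slope.max():
            onsets.append(n * hop / sr)   # onset time in seconds
    return onsets
```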

7 of 19 Overview of Works
- Goto and Muraoka (ctd.)
  - Disadvantage: limited to a specific musical style; advantage: highly successful for that style
  - Later system (without drums): uses higher-level musical knowledge (chord changes) to determine low-level beat indications
  - Both systems were combined into a single system that uses a combination of drum indications, chord changes and onset indications
  - The systems are based on a multiple-agent architecture, with each agent predicting beat times using a different strategy

8 of 19 Overview of Works
- Goto and Muraoka (ctd.)
  - Musical-knowledge rules used by the system
  - Onset-related rules
    (a-1) A frequent inter-onset interval is likely to be an inter-beat interval (a toy version of this rule is sketched after the next slide)
    (a-2) Onset times tend to coincide with beat times (i.e. sounds are likely to occur on beats)
  - Chord-related rules
    (b-1) Chords are more likely to change on beat times than at other positions

9 of 19 Overview of Works
- Goto and Muraoka (ctd.)
  - Chord-related rules (ctd.)
    (b-2) Chords are more likely to change on half-note times than at other positions
    (b-3) Chords are more likely to change at the beginnings of measures than at other half-note positions
  - Drum pattern-related rules
    (c-1) The beginning of the input drum pattern indicates a half-note time
    (c-2) The input drum pattern has an appropriate inter-beat interval
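Rule (a-1) can be illustrated with a simple histogram heuristic (not the authors' implementation; the bin width and interval range are illustrative assumptions):

```python
import numpy as np

def likely_inter_beat_interval(onset_times, bin_width=0.02, min_ioi=0.25, max_ioi=1.5):
    """Toy version of rule (a-1): the most frequent inter-onset interval
    (within a plausible range) is taken as a candidate inter-beat interval."""
    onsets = np.sort(np.asarray(onset_times))
    # Collect intervals between all nearby pairs of onsets, not just adjacent ones
    iois = []
    for i in range(len(onsets)):
        for j in range(i + 1, len(onsets)):
            d = onsets[j] - onsets[i]
            if d > max_ioi:
                break
            if d >= min_ioi:
                iois.append(d)
    if not iois:
        return None
    # Histogram the intervals and return the centre of the most populated bin
    bins = np.arange(min_ioi, max_ioi + bin_width, bin_width)
    counts, edges = np.histogram(iois, bins=bins)
    k = int(np.argmax(counts))
    return 0.5 * (edges[k] + edges[k + 1])

# Example: onsets roughly every 0.5 s suggest an inter-beat interval near 0.5 s (120 bpm)
print(likely_inter_beat_interval([0.0, 0.5, 1.02, 1.5, 2.01, 2.49]))
```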

10 of 19 Overview of Works
- Scheirer (1998)
  - Based on tuned resonators
  - The signal is split into 6 frequency bands; amplitude envelopes are extracted and passed through a bank of 150 comb filters (each representing a possible tempo on a discretized scale); the outputs are summed across frequency bands; the highest value determines the tempo and phase (a single-band sketch follows this slide)
  - Problem: the spacing of the filters relative to the tempi they represent
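A minimal sketch of the comb-filter resonator idea, assuming a single amplitude envelope rather than Scheirer's six bands; the feedback gain and tempo grid are illustrative assumptions:

```python
import numpy as np

def comb_filter_energy(envelope, sr, tempo_bpm, alpha=0.9):
    """Output energy of one comb-filter resonator tuned to tempo_bpm.
    y[n] = alpha * y[n - T] + (1 - alpha) * x[n], with T the beat period in samples."""
    T = max(1, int(round(sr * 60.0 / tempo_bpm)))
    y = np.zeros(len(envelope))
    for n in range(len(envelope)):
        fb = y[n - T] if n >= T else 0.0
        y[n] = alpha * fb + (1.0 - alpha) * envelope[n]
    return float(np.sum(y ** 2))

def pick_tempo(envelope, sr, tempi=np.arange(60, 181)):
    """Evaluate a bank of comb filters over a grid of candidate tempi and return
    the tempo whose resonator responds most strongly (single band for brevity;
    in Scheirer's system the responses are summed across the six bands, and the
    phase is read from the state of the winning filter)."""
    energies = [comb_filter_energy(envelope, sr, t) for t in tempi]
    return int(tempi[int(np.argmax(energies))])
```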

11 of 19 Overview of Works
- Dixon (2001)
  - Accepts audio or symbolic (MIDI) data
  - Two stages of processing
    - Tempo induction (examines the times between pairs of onsets)
    - Beat tracking (determines the period / alignment of the beats)

12 of 19 Overview of Works
- Tempo Induction
  - Examines the times between pairs of note onsets
  - Uses a clustering algorithm to determine significant clusters of inter-onset intervals (see the sketch after this slide)
  - Each cluster represents a hypothetical tempo
  - Output: a list of ranked tempo hypotheses
  - For audio, significant preprocessing (onset detection) is needed; this is done using the amplitude envelope techniques described in Schloss (1985)
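A simplified sketch of inter-onset-interval clustering in the spirit of Dixon's tempo induction (not his exact algorithm; the cluster width and interval ceiling are assumptions):

```python
import numpy as np

def ioi_clusters(onset_times, max_ioi=2.0, width=0.025):
    """Group inter-onset intervals into clusters and rank them by size.
    Each cluster's mean interval is a hypothetical beat period."""
    onsets = np.sort(np.asarray(onset_times))
    iois = [onsets[j] - onsets[i]
            for i in range(len(onsets))
            for j in range(i + 1, len(onsets))
            if onsets[j] - onsets[i] <= max_ioi]
    clusters = []                                  # each cluster is a list of intervals
    for ioi in sorted(iois):
        for c in clusters:
            if abs(ioi - np.mean(c)) <= width:     # close enough to join this cluster
                c.append(ioi)
                break
        else:
            clusters.append([ioi])                 # otherwise start a new cluster
    # Rank hypotheses: larger clusters first; report mean interval, tempo, and support
    ranked = sorted(clusters, key=len, reverse=True)
    return [(float(np.mean(c)), 60.0 / float(np.mean(c)), len(c)) for c in ranked]

# Example: onsets near a 0.5 s grid give a top-ranked hypothesis around 120 bpm
print(ioi_clusters([0.0, 0.5, 1.0, 1.51, 2.0, 2.49])[:3])
```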

13 of 19 Overview of Works
- Beat Tracking
  - Tempo induction calculates the frequency / period of the beat
  - Beat tracking calculates the phase
  - This is done using a multiple-hypothesis search, with the best output score determining the identified beat phase
  - Each hypothesis is pursued by a beat tracking agent, which predicts a beat time and matches it to rhythmic events, adjusts its hypothesis, creates a new one, or is deleted when two agents reach identical hypotheses

14 of 19 Overview of Works
- Beat Tracking (ctd.)
  - For each hypothesis generated by tempo induction, a group of agents is created to track the piece at that tempo (a simplified single-agent sketch follows this slide)
  - Works on the assumption that at least one event in the initial section of the music (5 s) coincides with a beat time
  - Agent adjustments, creations and deletions take place based on the analyzed information
  - Decisions are made based on how evenly the events are spaced, how often they match the expected beat times, and the salience of the matched events
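A very simplified single-agent sketch, assuming one tempo hypothesis (period) and one phase hypothesis (first beat); the tolerance window, phase adjustment and scoring are illustrative, not Dixon's actual agent logic:

```python
def track_beats(onsets, period, first_beat, window=0.07):
    """Predict beat times from a (period, phase) hypothesis, match them to onsets
    within a tolerance window, nudge the phase towards matched onsets, and
    accumulate a score for matched events."""
    beats, score, t = [], 0.0, first_beat
    end = max(onsets) if onsets else 0.0
    while t <= end + period:
        # Find the onset closest to the predicted beat time
        nearest = min(onsets, key=lambda o: abs(o - t)) if onsets else None
        if nearest is not None and abs(nearest - t) <= window:
            score += 1.0                     # reward a matched event
            t = 0.5 * (t + nearest)          # adjust the phase towards the onset
        beats.append(t)
        t += period
    return beats, score

# Run one agent per (period, phase) hypothesis and keep the best-scoring one
onsets = [0.02, 0.51, 0.99, 1.50, 2.02, 2.51]
beats, s = track_beats(onsets, period=0.5, first_beat=0.02)
print(s, [round(b, 2) for b in beats])
```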

15 of 19 Overview of Works
- Musical Salience
  - How significant a particular event is, based on higher-level knowledge of the musical context in which it takes place
  - Examples of salience factors (a toy combination is sketched after this slide)
    - Note duration
    - Simultaneous note density
    - Amplitude
    - Pitch
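One way to picture how such factors might combine, as a toy weighted sum only; the weights, the MIDI pitch scaling and the preference for lower pitches are assumptions, not the scheme used in the surveyed systems:

```python
def salience(duration, num_simultaneous, amplitude, pitch,
             w_dur=1.0, w_dens=0.5, w_amp=1.0, w_pitch=0.3):
    """Toy salience score combining the four factors listed above."""
    # Longer, louder events sounded together with more notes score higher;
    # lower pitches (e.g. bass notes) are weighted slightly higher here.
    pitch_term = (127 - pitch) / 127.0       # assuming MIDI pitch in 0-127
    return (w_dur * duration
            + w_dens * num_simultaneous
            + w_amp * amplitude
            + w_pitch * pitch_term)

# A long, loud bass note scores higher than a short, quiet high note
print(salience(1.0, 3, 0.9, 40), salience(0.1, 1, 0.2, 84))
```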

16 of 19 Overview of Works
- Davies and Plumbley (2004)
  - A real-time audio beat-tracking system that allows for tempo changes and different musical styles
  - Performs onset detection using high-frequency-content and complex-domain algorithms
  - Tempo induction and beat alignment (phase) estimation are similar to the techniques implemented in Scheirer (1998) and Dixon (2001)
  - Uses an autocorrelation function to determine the beat period, and comb filters for beat phase detection (an autocorrelation sketch follows this slide)
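A minimal sketch of autocorrelation-based beat period estimation from an onset detection function, assuming the ODF is already computed at some frame rate; the tempo range is an illustrative assumption:

```python
import numpy as np

def beat_period_from_autocorrelation(odf, frame_rate, min_bpm=60, max_bpm=200):
    """Estimate the beat period from an onset detection function (ODF) by
    autocorrelation: the lag with the strongest correlation inside the
    plausible tempo range is taken as the beat period."""
    x = np.asarray(odf, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode='full')[len(x) - 1:]   # keep non-negative lags only
    lo = int(frame_rate * 60.0 / max_bpm)               # shortest allowed period (frames)
    hi = int(frame_rate * 60.0 / min_bpm)               # longest allowed period (frames)
    lag = lo + int(np.argmax(ac[lo:hi + 1]))
    period_s = lag / frame_rate
    return period_s, 60.0 / period_s                    # period in seconds, tempo in bpm
```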

17 of 19 Summary of Common Concepts
- Audio tempo induction and beat tracking are more complex than for MIDI because of the audio signal's non-symbolic nature
- The note onset is the most popular element used for tempo induction
- Common tempo induction techniques include HFC analysis for signals with drums present, and complex-frequency analysis as a more general tool
- Higher-level musical-knowledge rules can facilitate the tempo induction process

18 of 19 Summary of Common Concepts
- Beat tracking of the signal involves determining the phase (alignment) of the beat, and is done using onset detection data and tempo induction (period estimation) data
- Multiple-hypothesis techniques (introduced as a beam search by Allen and Dannenberg 1991) are commonly used and are superior to the single-solution approach of early systems, since they allow the system to recover from tempo induction and beat tracking errors

19 of 19 Conclusion
- HTML Bibliography: http://www.music.mcgill.ca/~pkoles
- Questions

