Listeners can easily identify complex periodicities such as the rhythms that normally occur in musical performances, even though these periodicities may be distributed over several interleaved time scales. At the simplest level, the pulse is the basic unit of temporal structure, the foot tapping "beat". Such pulses are typically gathered together in performance into groupings that correspond to metered measures, and these groupings often cluster to form larger structures corresponding to musical "phrases". Such patterns of grouping and clustering can continue through many hierarchical levels, and many of these may be readily perceptible to an attentive listener.
This paper presents a psychoacoustically based method of data reduction motivated by the desire to analyze the rhythm of musical performances. The reduced data are then analyzed by Periodicity Transforms (which are based on projections onto "periodic subspaces") to locate periodicities. These periodicities represent the rhythm at several levels, including the "pulse", the "measure", and larger structures such as musical "phrases". The implications (and limitations) of such automated grouping of rhythmic features are discussed. The method is applied to a number of musical examples, its output is compared to that of the Fourier Transform, and both are compared to a more traditional "musical" analysis of the rhythm. Unlike many methods of rhythm analysis, the techniques can be applied directly to the digitized performance (i.e., a soundfile) and do not require a musical score or a MIDI transcription. Several examples are presented that highlight both the strengths and weaknesses of the approach.
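To give a rough sense of what a projection onto a "periodic subspace" involves (this is only an illustrative sketch, not the authors' implementation; the function names and the energy-fraction measure are assumptions), the following Python fragment projects a discrete signal onto the subspace of p-periodic signals by averaging all samples that share the same index modulo p, and reports the fraction of the signal's energy captured by each candidate period:

```python
import numpy as np

def project_onto_periodic_subspace(x, p):
    """Project signal x onto the subspace of p-periodic signals.

    The least-squares projection replaces each sample by the average of
    all samples sharing its index modulo p.
    """
    n = len(x)
    proj = np.zeros(n)
    for phase in range(p):
        idx = np.arange(phase, n, p)
        proj[idx] = x[idx].mean()
    return proj

def periodicity_strengths(x, max_period):
    """Return the fraction of signal energy captured by each candidate period."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # remove the trivial period-1 component
    total = np.dot(x, x) or 1.0
    return {p: np.dot(proj := project_onto_periodic_subspace(x, p), proj) / total
            for p in range(2, max_period + 1)}
```

Applied to a reduced rhythmic signal extracted from a performance, the periods with the largest energy fractions would be candidates for the pulse, the measure, and longer phrase-level groupings. Note that any multiple of a strong period captures at least as much energy (the p-periodic subspace is contained in the kp-periodic one), so selecting a compact set of "best" periods requires more care than simply ranking by energy fraction.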
For more information, see:
W. A. Sethares and T. W. Staley, "Meter and Periodicity in Musical Performance", Journal of New Music Research. A slightly raw pdf version is available for download.