music21.search.base

base classes for searching scores.

Functions

music21.search.base.approximateNoteSearch(thisStream, otherStreams)

Searches the list of otherStreams and returns an ordered list of matches; each matched stream gains a new matchProbability property showing how well it matches.

>>> s = converter.parse("tinynotation: 4/4 c4 d8 e16 FF a'4 b-")
>>> o1 = converter.parse("tinynotation: 4/4 c4 d8 e GG a' b-4")
>>> o1.id = 'o1'
>>> o2 = converter.parse("tinynotation: 4/4 d#2 f A a' G b")
>>> o2.id = 'o2'
>>> o3 = converter.parse("tinynotation: 4/4 c8 d16 e32 FF32 a'8 b-8")
>>> o3.id = 'o3'
>>> l = search.approximateNoteSearch(s, [o1, o2, o3])
>>> for i in l:
...    print("%s %r" % (i.id, i.matchProbability))
o1 0.666666...
o3 0.333333...
o2 0.083333...
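
Because the list is ordered best match first, the top hit can be taken directly:

>>> l[0].id
'o1'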
music21.search.base.approximateNoteSearchNoRhythm(thisStream, otherStreams)

Searches the list of otherStreams, ignoring rhythm, and returns an ordered list of matches; each matched stream gains a new matchProbability property showing how well it matches.

>>> s = converter.parse("tinynotation: 4/4 c4 d8 e16 FF a'4 b-")
>>> o1 = converter.parse("tinynotation: 4/4 c4 d8 e GG a' b-4")
>>> o1.id = 'o1'
>>> o2 = converter.parse("tinynotation: 4/4 d#2 f A a' G b")
>>> o2.id = 'o2'
>>> o3 = converter.parse("tinynotation: 4/4 c4 d e GG CCC r")
>>> o3.id = 'o3'
>>> l = search.approximateNoteSearchNoRhythm(s, [o1, o2, o3])
>>> for i in l:
...    print("%s %r" % (i.id, i.matchProbability))
o1 0.83333333...
o3 0.5
o2 0.1666666...
music21.search.base.approximateNoteSearchOnlyRhythm(thisStream, otherStreams)

Searches the list of otherStreams, comparing only rhythm, and returns an ordered list of matches; each matched stream gains a new matchProbability property showing how well it matches.

>>> s = converter.parse("tinynotation: 4/4 c4 d8 e16 FF a'4 b-")
>>> o1 = converter.parse("tinynotation: 4/4 c4 d8 e GG a' b-4")
>>> o1.id = 'o1'
>>> o2 = converter.parse("tinynotation: 4/4 d#2 f A a' G b")
>>> o2.id = 'o2'
>>> o3 = converter.parse("tinynotation: 4/4 c4 d e GG CCC r")
>>> o3.id = 'o3'
>>> l = search.approximateNoteSearchOnlyRhythm(s, [o1, o2, o3])
>>> for i in l:
...    print("%s %r" % (i.id, i.matchProbability))
o1 0.5
o3 0.33...
o2 0.0
music21.search.base.approximateNoteSearchWeighted(thisStream, otherStreams)

Searches the list of otherStreams using a comparison that weights both pitch and rhythm, and returns an ordered list of matches; each matched stream gains a new matchProbability property showing how well it matches.

>>> s = converter.parse("tinynotation: 4/4 c4 d8 e16 FF a'4 b-")
>>> o1 = converter.parse("tinynotation: 4/4 c4 d8 e GG2 a' b-4")
>>> o1.id = 'o1'
>>> o2 = converter.parse("tinynotation: 4/4 AAA4 AAA8 AAA16 AAA16 AAA4 AAA4")
>>> o2.id = 'o2'
>>> o3 = converter.parse("tinynotation: 4/4 c8 d16 e32 FF32 a'8 b-8")
>>> o3.id = 'o3'
>>> o4 = converter.parse("tinynotation: 4/4 c1 d1 e1 FF1 a'1 b-1")
>>> o4.id = 'o4'
>>> l = search.approximateNoteSearchWeighted(s, [o1, o2, o3, o4])
>>> for i in l:
...    print("%s %r" % (i.id, i.matchProbability))
o3 0.83333...
o1 0.75
o4 0.75
o2 0.25
music21.search.base.mostCommonMeasureRythms(streamIn, transposeDiatonic=False)

Returns a sorted list of dictionaries of the most common rhythms in a stream, where each dictionary contains:

number: the number of times the rhythm appears
rhythm: the rhythm found (with the pitches of the first instance of the rhythm transposed to C5)
measures: a list of measures containing the rhythm
rhythmString: a string representation of the rhythm (see translateStreamToStringOnlyRhythm)

>>> bach = corpus.parse('bwv1.6')
>>> sortedRhythms = search.mostCommonMeasureRythms(bach)
>>> for dict in sortedRhythms[0:3]:
...     print('no: %d %s %s' % (dict['number'], 'rhythmString:', dict['rhythmString']))
...     print('bars: %r' % ([(m.number, 
...                               str(m.getContextByClass('Part').id)) 
...                            for m in dict['measures']]))
...     dict['rhythm'].show('text')
...     print('-----')
no: 34 rhythmString: PPPP
bars: [(1, 'Soprano'), (1, 'Alto'), (1, 'Tenor'), (1, 'Bass'), (2, ...), ..., (19, 'Soprano')]
{0.0} <music21.note.Note C>
{1.0} <music21.note.Note A>
{2.0} <music21.note.Note F>
{3.0} <music21.note.Note C>
-----
no: 7 rhythmString: ZZ
bars: [(13, 'Soprano'), (13, 'Alto'), ..., (14, 'Bass')]
{0.0} <music21.note.Note C>
{2.0} <music21.note.Note A>
-----
no: 6 rhythmString: ZPP
bars: [(6, 'Soprano'), (6, 'Bass'), ..., (18, 'Tenor')]
{0.0} <music21.note.Note C>
{2.0} <music21.note.Note B->
{3.0} <music21.note.Note B->
-----
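
The returned dictionaries can also be indexed directly; for instance, pulling the count and rhythm string of the most common rhythm found above:

>>> sortedRhythms[0]['number']
34
>>> sortedRhythms[0]['rhythmString']
'PPPP'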
music21.search.base.rhythmicSearch(thisStream, searchStream)

Takes two streams: the first is the stream to be searched, and the second is a stream of elements whose rhythms must match it. Returns a list of indices at which successful matches begin.

Searches are made based on quarterLength; thus a dotted sixteenth note and an eighth in a 4:3 quadruplet will match each other.
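
To see why, compare the quarterLengths directly; a small sketch using music21's duration module, where Tuplet(4, 3) means four notes in the time of three:

>>> d1 = duration.Duration('16th')
>>> d1.dots = 1
>>> d2 = duration.Duration('eighth')
>>> d2.appendTuplet(duration.Tuplet(4, 3))
>>> d1.quarterLength == d2.quarterLength == 0.375
True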

Example 1: First we will set up a simple stream for searching:

>>> thisStream = converter.parse("tinynotation: 3/4 c4. d8 e4 g4. a8 f4. c4.").flat
>>> thisStream.show('text')
{0.0} <music21.clef.TrebleClef>
{0.0} <music21.meter.TimeSignature 3/4>
{0.0} <music21.note.Note C>
{1.5} <music21.note.Note D>
{2.0} <music21.note.Note E>
{3.0} <music21.note.Note G>
{4.5} <music21.note.Note A>
{5.0} <music21.note.Note F>
{6.5} <music21.note.Note C>    
{8.0} <music21.bar.Barline style=final>

Now we will search for all dotted-quarter/eighth elements in the Stream:

>>> searchStream1 = stream.Stream()
>>> searchStream1.append(note.Note(quarterLength = 1.5))
>>> searchStream1.append(note.Note(quarterLength = .5))
>>> l = search.rhythmicSearch(thisStream, searchStream1)
>>> l
[2, 5]
>>> stream.Stream(thisStream[5:7]).show('text')
{3.0} <music21.note.Note G>
{4.5} <music21.note.Note A>

Slightly more advanced search: we will look for any instance of an eighth note, followed by a note (or other element) of any length, followed by a dotted quarter note. Again we will find two instances; this time we will tag them both with a lyric of "*" and then show the original stream:

>>> searchStream2 = stream.Stream()
>>> searchStream2.append(note.Note(quarterLength = .5))
>>> searchStream2.append(search.Wildcard())
>>> searchStream2.append(note.Note(quarterLength = 1.5))
>>> l = search.rhythmicSearch(thisStream, searchStream2)
>>> l
[3, 6]
>>> for found in l:
...     thisStream[found].lyric = "*"
>>> thisStream.show()
../_images/searchRhythmic1.png

Now we can test the search on a real dataset and show the types of preparation needed to make it most likely to succeed. We will look through the first movement of Corelli's Trio Sonata op. 3 no. 1 (F major) to see how much more common the first search term (dotted-quarter, eighth) is than the second (eighth, anything, dotted-quarter). In fact, my hypothesis was wrong: the first term turns out to be eight times as common as the second! (N.B.: rests are counted here as well as notes.)

>>> grave = corpus.parse('corelli/opus3no1/1grave')
>>> term1results = []
>>> term2results = []
>>> for p in grave.parts:
...    pf = p.flat.stripTies()  # consider tied notes as one long note
...    temp1 = search.rhythmicSearch(pf, searchStream1)
...    temp2 = search.rhythmicSearch(pf, searchStream2)
...    for found in temp1: term1results.append(found)
...    for found in temp2: term2results.append(found)
>>> term1results
[0, 7, 13, 21, 42, 57, 64, 66, 0, 5, 7, 19, 21, 40, 46, 63, 0, 8, 31, 61, 69, 71, 73, 97]
>>> term2results
[5, 29, 95]
>>> float(len(term1results))/len(term2results)
8.0
music21.search.base.translateDiatonicStreamToString(inputStreamOrIterator, returnMeasures=False)

Translates a Stream or StreamIterator of Notes and Rests only into a string, encoding only the .step (no accidental or octave) and whether the note is slower, faster, or the same speed as the previous note.

Skips all but the first note of a tie and collapses consecutive rests into a single rest.

Each note gets one byte:

A-G = note of same length as previous
H-N = note of longer length than previous
O-U = note of shorter length than previous
Z = rest

>>> s = converter.parse("tinynotation: 3/4 c4 d8~ d16 r16 FF8 F#8 a'8 b-2.")
>>> streamString = search.translateDiatonicStreamToString(s.recurse().notesAndRests)
>>> print(streamString)
CRZFFAI
>>> len(streamString)
7

If returnMeasures is True, also returns an array of measure numbers, where each entry gives the measure number of the object at that character position:

>>> streamString2, measures = search.translateDiatonicStreamToString(s.recurse().notesAndRests, 
...                                    returnMeasures=True)
>>> streamString == streamString2
True
>>> measures
[1, 1, 1, 1, 1, 2, 2]
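
Because the result is an ordinary string, matches can be found with plain substring search, and the parallel measures array maps a match position back to a measure; a small sketch using the values above:

>>> pos = streamString.find('ZF')  # a rest followed by a same-length note
>>> pos
2
>>> measures[pos]
1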
music21.search.base.translateDurationToBytes(n)

Takes a note.Note object and translates its duration to a one-byte representation.

Currently returns a single byte: the log of the quarterLength, fitted to the range 1-127 (the example below inverts the formula).

>>> n = note.Note("C4")
>>> n.duration.quarterLength = 3  # dotted half
>>> trans = search.translateDurationToBytes(n)
>>> trans
'_'
>>> (2**(ord(trans[0])/10.0))/256  # approximately 3
2.828...
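
The forward mapping implied by that inversion is roughly the base-2 logarithm of the quarterLength, scaled and offset to fit 1-127; a sketch (the exact rounding and clamping inside music21 may differ):

>>> import math
>>> ql = 3.0  # dotted half
>>> b = int(math.log2(ql * 256) * 10)  # scale log2 of the quarterLength into 1-127
>>> b
95
>>> chr(b)
'_'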
music21.search.base.translateIntervalsAndSpeed(inputStream, returnMeasures=False)

Translates a Stream (not StreamIterator) of Notes and Rests only into a string, encoding only the chromatic distance from the last note and whether the note is slower, faster, or the same speed as the previous note.

Skips all but the first note of a tie and collapses consecutive rests into a single rest.

Each note gets one byte, encoding chromatic intervals from -13 to 13 (intervals larger than an octave are clamped to 13 or -13).

>>> s = converter.parse("tinynotation: 3/4 c4 d8~ d16 r16 F8 F#8 a'8 b-2.")
>>> sn = s.flat.notesAndRests.stream()
>>> streamString = search.translateIntervalsAndSpeed(sn)
>>> print(streamString)
Ib RH<9
>>> len(streamString)
7

If returnMeasures is True, also returns an array of measure numbers, as in translateDiatonicStreamToString:

>>> streamString2, measures = search.translateIntervalsAndSpeed(sn, returnMeasures=True)
>>> streamString == streamString2
True
>>> measures
[1, 1, 1, 1, 1, 2, 2]
music21.search.base.translateNoteTieToByte(n)

Takes a note.Note object and returns a one-byte representation of its tie status: 's' for a start tie, 'e' for a stop tie, 'c' for a continue tie, and '' (the empty string) if there is no tie.

>>> n = note.Note("E")
>>> search.translateNoteTieToByte(n)
''
>>> n.tie = tie.Tie("start")
>>> search.translateNoteTieToByte(n)
's'
>>> n.tie.type = 'continue'
>>> search.translateNoteTieToByte(n)
'c'
>>> n.tie.type = 'stop'
>>> search.translateNoteTieToByte(n)
'e'
music21.search.base.translateNoteToByte(n)

Takes a note.Note object and translates it to a single-byte representation.

Currently returns the chr() of the note's MIDI number, or chr(127) for rests.

>>> n = note.Note("C4")
>>> b = search.translateNoteToByte(n)
>>> b
'<'
>>> ord(b) 
60
>>> ord(b) == n.pitch.midi
True

Chords are currently matched on their first Note only (or treated as a Rest if there is none).
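
For example (a sketch relying on the first-note behavior just described), a chord whose first pitch is E4 (MIDI 64) would come out as chr(64):

>>> c = chord.Chord(['E4', 'G4', 'C5'])
>>> search.translateNoteToByte(c)
'@'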

music21.search.base.translateNoteWithDurationToBytes(n, includeTieByte=True)

Takes a note.Note object and translates it to a three-byte representation.

Currently returns the chr() of the note's MIDI number (or chr(127) for rests), followed by the log of the quarterLength (fitted to 1-127, see the formula above), followed by 's', 'c', or 'e' if includeTieByte is True and there is a tie.

>>> n = note.Note("C4")
>>> n.duration.quarterLength = 3  # dotted half
>>> trans = search.translateNoteWithDurationToBytes(n)
>>> trans
'<_'
>>> (2**(ord(trans[1])/10.0))/256  # approximately 3
2.828...
>>> n.tie = tie.Tie('stop')
>>> trans = search.translateNoteWithDurationToBytes(n)
>>> trans
'<_e'
>>> trans = search.translateNoteWithDurationToBytes(n, includeTieByte=False)
>>> trans
'<_'
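
The first two bytes should agree with the single-purpose translators above; a small cross-check:

>>> trans[0] == search.translateNoteToByte(n)
True
>>> trans[1] == search.translateDurationToBytes(n)
True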
music21.search.base.translateStreamToString(inputStreamOrIterator, returnMeasures=False)

takes a stream (or streamIterator) of notesAndRests only and returns a string for searching on.

>>> s = converter.parse("tinynotation: 3/4 c4 d8 r16 FF8. a'8 b-2.")
>>> sn = s.flat.notesAndRests
>>> streamString = search.translateStreamToString(sn)
>>> print(streamString)
<P>F<)KQFF_
>>> len(streamString)
12

(The rest is encoded as the unprintable chr(127), which is why only 11 of the 12 characters display above.)
music21.search.base.translateStreamToStringNoRhythm(inputStream, returnMeasures=False)

takes a stream or streamIterator of notesAndRests only and returns a string for searching on, using translateNoteToByte.

>>> s = converter.parse("tinynotation: 4/4 c4 d e FF a' b-")
>>> sn = s.flat.notesAndRests
>>> search.translateStreamToStringNoRhythm(sn)
'<>@)QF'
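
Each byte is just the chr() of the note's MIDI number, as with translateNoteToByte, so the first character decodes back to middle C:

>>> ord('<')
60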
music21.search.base.translateStreamToStringOnlyRhythm(inputStream, returnMeasures=False)

takes a stream or streamIterator of notesAndRests only and returns a string for searching on.

>>> s = converter.parse("tinynotation: 3/4 c4 d8 e16 FF8. a'8 b-2.")
>>> sn = s.flat.notesAndRests
>>> streamString = search.translateStreamToStringOnlyRhythm(sn)
>>> print(streamString)
PF<KF_
>>> len(streamString)  
6
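
Aligning the string with the notes above shows that 'F' encodes the eighth and '<' the sixteenth, so a duration pattern can be located with plain substring search:

>>> streamString.find('F<')  # an eighth followed by a sixteenth
1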

Wildcard

class music21.search.base.Wildcard

An object that may have some properties defined and others not; it matches any single object in a music21 stream, equivalent to the regular expression ".".

>>> wc1 = search.Wildcard()
>>> wc1.pitch = pitch.Pitch("C")
>>> st1 = stream.Stream()
>>> st1.append(note.Note("D", type='half'))
>>> st1.append(wc1)    
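
Such a stream can then serve as a search pattern. As a sketch (reusing thisStream from the rhythmicSearch examples above), this half-note-plus-anything pattern should find no matches there, since thisStream contains no half notes:

>>> search.rhythmicSearch(thisStream, st1)
[]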


WildcardDuration

class music21.search.base.WildcardDuration(*arguments, **keywords)

A wildcard duration: it may define a duration of its own, but the search methods here will recognize that it is a wildcard of some sort.

TODO: Write
