Further specification: Data set
Pointers: Contact Luc De Raedt (Luc.DeRaedt@cs.kuleuven.ac.be)
Data complexity: 1800 notes
The task in this application is to induce theories for predicting MIDI files from the musical analysis of a score.
MIDI stands for Musical Instrument Digital Interface, a standard first published in 1983 by the International MIDI Association (IMA). MIDI does not have the same quality as a CD, as its encoding is much simpler. The encodings are also specific to each instrument or synthesizer. W.r.t. Ampico, MIDI is a significant advance: it has, for example, 128 levels to encode force instead of Ampico's 14, and it applies not only to the piano but to any digital instrument. The underlying ideas, however, are very similar. We employed MIDI because no Ampico device was available to us.
Our experiments with MIDI were all carried out on a performance by Carl Verbraeken of Mendelssohn's Lied Ohne Worte. Two important changes were made w.r.t. the Ampico data. First, the data were extended to also include the non-melody notes, as just playing the melodic notes on a piano is not interesting. Second, the right pedal of the piano was taken into account; when this pedal is applied, the strings are not damped. As the left pedal was always applied in our case study, nothing special was done for this pedal. Other changes were made so that the Ampico data format could be reused. Beats in MIDI (in our study) were counted in 384 ticks, in Ampico in 120 ticks. Velocity in MIDI is related to volume and is counted in 128 levels, whereas Ampico's force is encoded in 14 levels. We found it convenient to reduce the 384 ticks to 120 and the 128 levels to 14; this is useful for recycling the encoding and for comparing the results. (In this respect, our work is closer to Ampico than to MIDI, despite the fact that our predictions are used to generate a MIDI file.)
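The reduction from MIDI resolutions to the Ampico ranges can be sketched as follows. The exact mapping used in the study is not stated, so this is a minimal sketch assuming simple proportional rounding; the function names are illustrative, not from the original.

```python
# Hedged sketch of the resolution reduction described above.
# Assumption: a plain proportional rescaling with rounding; the
# study's actual mapping may differ.

def midi_ticks_to_ampico(ticks: int) -> int:
    """Rescale timing from MIDI's 384 ticks per beat to Ampico's 120."""
    return round(ticks * 120 / 384)

def midi_velocity_to_ampico(velocity: int) -> int:
    """Map a MIDI velocity (0..127, 128 levels) onto Ampico's 14
    force levels (0..13)."""
    return round(velocity * 13 / 127)
```

For example, a full beat of 384 MIDI ticks maps to 120 Ampico ticks, and the maximum MIDI velocity 127 maps to the top Ampico force level 13.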
New predicates were defined to capture the pedal and the non-melodic notes; otherwise, the same data representation format as above was employed.