Python MIDI

Python, for all its amazing ability out of the box, does not provide you with an easy means to manipulate MIDI data. There are probably about ten different python packages out there that accomplish some part of this goal, but there is nothing that is totally comprehensive.

This toolkit aims to fulfill this goal. In particular, it strives to provide a high-level framework that is independent of hardware. It tries to offer a reasonable object granularity so that MIDI streams are painless to manipulate, sequence, record, and play back. It's important to have a good concept of time, and the event framework provides automatic hooks so you don't have to convert ticks to wall-clock time yourself, for example.

This MIDI Python toolkit represents about two years of scattered work. If you are someone like me, who has spent a long time looking for a Python MIDI framework, then this might be a good fit. It's not perfect, but it has a large feature set to offer.

Features

  • High level class types that represent individual MIDI events.
  • A multi-track aware container that allows you to manage your MIDI events.
  • A tempo map that actively keeps track of tempo changes within a track.
  • A reader and writer, so you can read and write your MIDI tracks to disk.

Installation

Follow the normal procedure for Python module installation:

python setup.py install

Examine a MIDI File

To examine the contents of a MIDI file, run:

$ mididump.py mary.mid

This will print out a representation of "Mary had a Little Lamb" as executable python code.

Example Usage

Building a MIDI File from scratch

It is easy to build a MIDI track from scratch.

import midi
# Instantiate a MIDI Pattern (contains a list of tracks)
pattern = midi.Pattern()
# Instantiate a MIDI Track (contains a list of MIDI events)
track = midi.Track()
# Append the track to the pattern
pattern.append(track)
# Instantiate a MIDI note on event, append it to the track
on = midi.NoteOnEvent(tick=0, velocity=20, pitch=midi.G_3)
track.append(on)
# Instantiate a MIDI note off event, append it to the track
off = midi.NoteOffEvent(tick=100, pitch=midi.G_3)
track.append(off)
# Add the end of track event, append it to the track
eot = midi.EndOfTrackEvent(tick=1)
track.append(eot)
# Print out the pattern
print pattern
# Save the pattern to disk
midi.write_midifile("example.mid", pattern)

A MIDI file is represented as a hierarchical set of objects. At the top is a Pattern, which contains a list of Tracks, and a Track is a list of MIDI Events.

The MIDI Pattern class inherits from the standard python list, so it supports all list features such as append(), extend(), slicing, and iteration. Patterns also contain global MIDI metadata: the resolution and MIDI Format.

The MIDI Track class also inherits from the standard python list. It does not have any special metadata like Pattern, but it does provide a few helper functions to manipulate all events within a track.

There are 27 different MIDI Events supported. In this example, three different MIDI events are created and added to the MIDI Track:

  1. The NoteOnEvent captures the start of a note, like a piano player pushing down on a piano key. The tick is when this event occurred, the pitch is the note value of the key pressed, and the velocity represents how hard the key was pressed.
  2. The NoteOffEvent captures the end of a note, just like a piano player removing her finger from a depressed piano key. Once again, the tick is when this event occurred, the pitch is the note that is released, and the velocity has no real-world analogy and is usually ignored. NoteOnEvents with a velocity of zero are equivalent to NoteOffEvents.
  3. The EndOfTrackEvent is a special event, used to indicate to MIDI sequencing software when the song ends. When creating Patterns with multiple Tracks, you only need one EndOfTrack event for the entire song. Most MIDI software will refuse to load a MIDI file if it does not contain an EndOfTrack event.

You might notice that the EndOfTrackEvent has a tick value of 1. This is because MIDI represents ticks in relative time. The actual tick offset of the MidiTrackEvent is the sum of its tick and all the ticks from previous events. In this example, the EndOfTrackEvent would occur at tick 101 (0 + 100 + 1).
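
If you prefer to work with absolute offsets, you can convert a track in place. A minimal sketch, assuming Track provides the make_ticks_abs() helper:

import midi
track = midi.Track()
track.append(midi.NoteOnEvent(tick=0, velocity=20, pitch=midi.G_3))
track.append(midi.NoteOffEvent(tick=100, pitch=midi.G_3))
track.append(midi.EndOfTrackEvent(tick=1))
# Convert the relative ticks to absolute offsets in place.
track.make_ticks_abs()
# The three events now report ticks 0, 100, and 101.
print track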

Side Note: What is a MIDI Tick?

The problem with ticks is that, on their own, they tell you nothing about when an event occurs; you also need two other pieces of information: the resolution and the tempo. The code handles these details for you, so all you have to do is think in terms of milliseconds, or in ticks if you care about the beat.

A tick represents the lowest-level resolution of a MIDI track. Tempo is always expressed in Beats per Minute (BPM), which is the same thing as Quarter notes per Minute (QPM). The Resolution is also known as Pulses per Quarter note (PPQ); it is equivalent to Ticks per Beat (TPB).

Tempo is set by two things. First, a saved MIDI file encodes an initial Resolution and Tempo; you use these values to initialize the sequencer timer. The Resolution should be considered static for a track, and for the sequencer as well. During playback, the MIDI file may contain sequenced (that is, timed) Tempo change events, which modulate the Tempo at the time they specify. The Resolution, however, cannot change from its initial value during playback.

Under the hood, MIDI represents Tempo as microseconds per beat. In other words, to store a Tempo you convert it to Microseconds per Beat. If the Tempo were 120 BPM, the python code to convert to microseconds per beat looks like this:

>>> 60 * 1000000 / 120
500000

This says the Tempo is 500,000 microseconds per beat. This, in combination with the Resolution, will allow you to convert ticks to time. If there are 500,000 microseconds per beat, and the Resolution is 1,000, then how much time is one tick?

>>> 500000 / 1000
500
>>> 500 / 1000000.0
0.00050000000000000001

In other words, one tick represents 0.0005 seconds, or half a millisecond. Increase the Resolution and this number gets smaller; decrease the Resolution and it gets larger. The same goes for Tempo.
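
The same arithmetic can be wrapped in a small helper. This is only a sketch, and it assumes a single fixed tempo for the whole track; tempo change events would have to be applied piecewise:

# Sketch: convert a tick offset to seconds, assuming one fixed tempo.
def ticks_to_seconds(ticks, resolution, bpm):
    microseconds_per_beat = 60.0 * 1000000 / bpm
    microseconds_per_tick = microseconds_per_beat / resolution
    return ticks * microseconds_per_tick / 1000000.0

print ticks_to_seconds(1, resolution=1000, bpm=120)   # 0.0005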

Although MIDI encodes Time Signatures, they have no impact on the Tempo. However, here is a quick refresher on Time Signatures:

http://en.wikipedia.org/wiki/Time_signature

Reading our Track back from Disk

It's just as easy to load your MIDI file from disk.

import midi
pattern = midi.read_midifile("example.mid")
print pattern
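
Since Pattern and Track behave like ordinary lists, you can walk the loaded data directly. A short sketch that prints every note-on event in the file, using the tick, pitch, and velocity attributes:

import midi
pattern = midi.read_midifile("example.mid")
for track in pattern:
    for event in track:
        if isinstance(event, midi.NoteOnEvent):
            print event.tick, event.pitch, event.velocity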

Sequencer

If you use this toolkit under Linux, you can take advantage of ALSA's sequencer. There is a SWIG wrapper and a high-level sequencer interface that hides the ALSA details as best it can. This sequencer understands the higher-level Event framework and converts these Events to structures accessible to ALSA. It tries to do as much of the hard work for you as possible, including adjusting the queue for tempo changes during playback. You can also record MIDI events, and with the right set of calls, the ALSA sequencer will timestamp your MIDI tracks at the moment the event triggers an OS hardware interrupt. The timing is extremely accurate, even though you are using Python to manage it.

I am extremely interested in supporting OS X and Win32 sequencers as well, but I need a platform expert who can help me. Are you that person? Please contact me if you would like to help.

Scripts for Sequencer

To examine the hardware and software MIDI devices attached to your system, run the mididumphw.py script.

$ mididumphw.py
] client(20) "OP-1 Midi Device"
]   port(0) [r, w, sender, receiver] "OP-1 Midi Device MIDI 1"
] client(129) "__sequencer__"
] client(14) "Midi Through"
]   port(0) [r, w, sender, receiver] "Midi Through Port-0"
] client(0) "System"
]   port(1) [r, sender] "Announce"
]   port(0) [r, w, sender] "Timer"
] client(128) "FLUID Synth (6438)"
]   port(0) [w, receiver] "Synth input port (6438:0)"

In the case shown, qsynth is running (client 128), and a hardware synthesizer is attached via USB (client 20).

To play the example MIDI file, run the midiplay.py script.

midiplay.py 128 0 mary.mid

Website, support, bug tracking, development etc.

You can find the latest code on the home page: https://github.com/vishnubob/python-midi/

You can also check for known issues and submit new ones to the tracker: https://github.com/vishnubob/python-midi/issues/

Thanks

I originally wrote this to drive the electro-mechanical instruments of Ensemble Robot, a Boston-based group of artists, programmers, and engineers. This API, however, has applications beyond controlling this equipment. For more information about Ensemble Robot, please visit:

http://www.ensemblerobot.org/

Contributors

akonradi, chao-mu, craffel, ebattenberg, eli-b, erriez, evojimmy, exowanderer, hskim08, kieranmenor, lechuck42, ryneches, saccharomyces, tdhsmith, vishnubob


python-midi's Issues

Question: converting from ticks to milliseconds

Hi,
I'm having some trouble using the framework to do what I want:
I read a MIDI file and convert its ticks to absolute, but now I want to convert those ticks to milliseconds.

The documentation is confusing me: in several places it mentions the importance of a good time model and how conversions taking into account tempo and resolution are taken care of by the framework, but I don't see how to get the millisecond value for an event at all. As for calculating it with the formula, it's not clear to me how to use the TempoMap or EventStreamIterator classes to track changing tempos. What are these streams they expect as a parameter? And what window length is required for the EventStreamIterator?

So the question is: how can I get a millisecond time value (relative to the start of the piece) of an event?

Question: how to show the current beat in realtime

I want to create something which can show the current beat while a song is playing back. It doesn't have to play back the actual song, but it needs to be synchronized with the song playback. For example, if I start the program and the song at the same time, the program would show the current beat in the song playing back.

It seems like it could be done with the sequencer, although some guidance / examples would be helpful. Looks like according to the README, there's only an ALSA sequencer working, but I did see some code for an OSX sequencer -- is that known to work?

Also, does it seem feasible without a sequencer so I could avoid messing around with drivers?

mididump.py does not load entire file

I can't seem to get midiplay.py to play the whole file. It stops at 45 seconds in. mididump.py has the same issue. It doesn't load my file completely before dumping that file.

My file is about 1min 30secs.

Question: dump midi commands in realtime

How can I output the raw MIDI commands to stdout?
Use case: I am implementing a little toy that understands noteon, noteoff and alloff commands, but I only have a serial port to communicate with it. I want to send the raw commands with the correct timing to make it play.
Thanks in advance for your help

test_mary will always pass!

I think you were the victim of a cut-and-paste bug. Lines 22-23 of tests.py read:

event1 = pattern1[track_idx][event_idx]
event2 = pattern1[track_idx][event_idx]

but it should be

event1 = pattern1[track_idx][event_idx]
event2 = pattern2[track_idx][event_idx]

The way you have it, event1 == event2, so the test will always pass.

Btw, thanks for this most excellent work. It has been very useful to me.

Installation on Linux has unnecessary swig dependency for Python-only functionality

When I attempt to do a pip install on Ubuntu, there's an error because I don't have swig installed. So then I do sudo apt-get install swig, and try again, and it works.

However, my application is not actually using those features of the library that require swig.

So possibly it would make sense to split this library into two parts: the first with the pure Python functionality, and the second with the Alsa sequencer and the swig dependency.

If this issue is fixed, along with my previous issue (PyPI downloadability), then this library would get closer to the "it just works" stage, with perhaps the Python 3 issue being the only other problem.

Add the ability to find matching NoteOff event

From #56:

The easiest (but not efficient) method that comes to mind right now would be to make a helper function findMatchingNoteOff(track, noteOn) that scanned for the first instance in track of a NoteOffEvent with the same channel and pitch as noteOn, and occurring after its absolute tick value.

I'm opening an issue for this since it feels like having this in the library would be useful to others. PR to follow soon.
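
A rough sketch of that helper, assuming the track's ticks have already been converted to absolute values (for example with make_ticks_abs()) and ignoring the note-on-with-velocity-zero convention:

import midi

def findMatchingNoteOff(track, noteOn):
    # Return the first NoteOffEvent on the same channel and pitch that
    # occurs at or after the NoteOnEvent's absolute tick, or None.
    for event in track:
        if (isinstance(event, midi.NoteOffEvent)
                and event.channel == noteOn.channel
                and event.pitch == noteOn.pitch
                and event.tick >= noteOn.tick):
            return event
    return None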

Best way to deepcopy a track

I'm trying to get a deep copy of a track via:

copy.deepcopy(track)

but I'm getting this error:

  File "/Users/tleyden/Development/climate_music/src/transformer.py", line 52, in transform
    track_copy = copy.deepcopy(track)
  File "/Users/tleyden/DevLibraries/anaconda/lib/python2.7/copy.py", line 190, in deepcopy
    y = _reconstruct(x, rv, 1, memo)
  File "/Users/tleyden/DevLibraries/anaconda/lib/python2.7/copy.py", line 351, in _reconstruct
    item = deepcopy(item, memo)
  File "/Users/tleyden/DevLibraries/anaconda/lib/python2.7/copy.py", line 190, in deepcopy
    y = _reconstruct(x, rv, 1, memo)
  File "/Users/tleyden/DevLibraries/anaconda/lib/python2.7/copy.py", line 346, in _reconstruct
    setattr(y, key, value)
  File "/Users/tleyden/DevLibraries/anaconda/lib/python2.7/site-packages/midi/events.py", line 125, in set_velocity
    self.data[1] = val
AttributeError: data

As an alternative, I'm writing my own kludge and just propagating the events I care about:

def copy_track(track):
    """
    copy.deepcopy() didn't work, so I hand rolled this as a workaround
    """
    track_copy = midi.Track()
    for event in track:
        if isinstance(event, midi.NoteOnEvent):
            on = midi.NoteOnEvent(tick=event.tick, velocity=event.velocity, pitch=event.pitch)
            track_copy.append(on)
        if isinstance(event, midi.NoteOffEvent):
            off = midi.NoteOffEvent(tick=event.tick, velocity=event.velocity, pitch=event.pitch)
            track_copy.append(off)
    return track_copy

If it's not feasible to make copy.deepcopy() work, is there a more elegant way of rolling my own?
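
One generic, if blunt, alternative is sketched below. It assumes that every event's repr is executable Python (which is how mididump prints patterns), so each event can be rebuilt from its own printed representation:

import midi

def copy_track(track):
    # Rebuild every event from its printable representation, e.g.
    # "midi.NoteOnEvent(tick=0, channel=0, data=[42, 38])".
    track_copy = midi.Track()
    for event in track:
        track_copy.append(eval(repr(event)))
    return track_copy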

Note off events of this MIDI file are not parsed

Only note-on events are read from this MIDI file: http://download.jzy3d.org/debug-deb_passMINp_align.mid

A third-party piece of software (Ableton Live) can load the file properly, including the note-off events.

When the MIDI clip is exported from Ableton, it can be properly read by python-midi (see the file as exported by live : http://download.jzy3d.org/debug-poliner-debussy.mid)

To reproduce:

p = midi.read_midifile("../data/debug-deb_passMINp_align.mid")
print "tracks : " + repr(len(p))
print p[0]
print p[1][0:20]

Will output :

tracks : 2
midi.Track(
[midi.TimeSignatureEvent(tick=0, data=[4, 2, 24, 8]),
midi.SetTempoEvent(tick=0, data=[9, 39, 192]),
midi.EndOfTrackEvent(tick=0, data=[])])
midi.Track(
[midi.NoteOnEvent(tick=0, channel=0, data=[42, 38]),
midi.NoteOnEvent(tick=22, channel=0, data=[42, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[49, 32]),
midi.NoteOnEvent(tick=22, channel=0, data=[49, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[57, 34]),
midi.NoteOnEvent(tick=22, channel=0, data=[57, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[49, 33]),
midi.NoteOnEvent(tick=22, channel=0, data=[49, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[44, 42]),
midi.NoteOnEvent(tick=22, channel=0, data=[44, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[49, 34]),
midi.NoteOnEvent(tick=22, channel=0, data=[49, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[59, 38]),
midi.NoteOnEvent(tick=22, channel=0, data=[59, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[49, 36]),
midi.NoteOnEvent(tick=22, channel=0, data=[49, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[45, 44]),
midi.NoteOnEvent(tick=22, channel=0, data=[45, 0]),
midi.NoteOnEvent(tick=38, channel=0, data=[54, 40]),
midi.NoteOnEvent(tick=22, channel=0, data=[54, 0])])

The MIDI file is taken from the training set of the Poliner & Ellis dataset: http://labrosa.ee.columbia.edu/projects/piano/

Multiple connections

Hi,

First of all, thanks for this nice Python library. I'm currently using it to write a tool that integrates my monome (http://monome.org/) as a MIDI interface in my live looping setup. It starts to work :) . During development I came across the following issues:

  • It does not seem possible to connect one output port of a sequencer to multiple input ports. Using jack, the ALSA connections are shown, but MIDI events are only sent to the last port that has been connected.
  • I noticed that I cannot setup connections via jack (or I guess aconnect). These need to be made via the python objects. Is there a workaround for that?
  • Can I make multiple input/output ports?

best regards :)
Alex

class AfterTouchEvent extended with pitch and value functions

Hi all,

Thanks for developing this wonderful Python module! I've extended the AfterTouchEvent class with new get_pitch, set_pitch, get_value and set_value functions in events.py:

events.py:

class AfterTouchEvent(Event):
    slots = ['pitch', 'value']
    statusmsg = 0xA0
    length = 2
    name = 'After Touch'

    def get_pitch(self):
        return self.data[0]
    def set_pitch(self, val):
        self.data[0] = val
    pitch = property(get_pitch, set_pitch)

    def get_value(self):
        return self.data[1]
    def set_value(self, val):
        self.data[1] = val
    value = property(get_value, set_value)

Now I can write in an application:

if (event.name == "After Touch"):
    print "Channel %d" % (event.channel+1)
    print "Note: %d" % event.get_pitch()
    print "Value: %d" % event.get_value()

Can someone review and commit this code?
Thanks!

How to copy Event object?

I want to extract a part of a midi file and save it on a file

I am trying to copy events from one pattern to another, but it seems deepcopy and copy aren't supported.

NoteOn events almost always get tick=0

This is a great library, but I think I found an issue:
When parsing MIDI files, most - if not all - NoteOn events get a tick value of 0.
This is not correct. It should follow the timeline and increase the tick.
Can this be fixed?

example:
midi.ControlChangeEvent(tick=3072, channel=1, data=[7, 75]),
midi.NoteOnEvent(tick=0, channel=1, data=[78, 127]),
midi.NoteOffEvent(tick=384, channel=1, data=[78, 64]),
midi.NoteOnEvent(tick=0, channel=1, data=[76, 127]),
midi.NoteOffEvent(tick=384, channel=1, data=[76, 64]),
midi.NoteOnEvent(tick=0, channel=1, data=[74, 127]),
midi.NoteOffEvent(tick=384, channel=1, data=[74, 64]),
midi.NoteOnEvent(tick=0, channel=1, data=[73, 127]),

a comparable perl-based parser (midicsv.exe) does it right:
2, 3072, Control_c, 1, 7, 75
2, 3072, Note_on_c, 1, 78, 127
2, 3456, Note_off_c, 1, 78, 64
2, 3456, Note_on_c, 1, 76, 127
2, 3840, Note_off_c, 1, 76, 64
2, 3840, Note_on_c, 1, 74, 127
2, 4224, Note_off_c, 1, 74, 64
2, 4224, Note_on_c, 1, 73, 127
2, 4608, Note_off_c, 1, 73, 64
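
For what it's worth, the tick values python-midi reports are relative, so a running sum reproduces the absolute column that midicsv prints. A small sketch, with "song.mid" standing in as a placeholder for the file above:

import midi
pattern = midi.read_midifile("song.mid")
for track in pattern:
    absolute = 0
    for event in track:
        absolute += event.tick
        print absolute, event
# For the track above this prints 3072, 3072, 3456, 3456, 3840, 3840, ...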

Question: Converting timestamps to ticks

Hi, I am trying to convert mic input into a monophonic MIDI file. I have a script that does the pitch tracking for me. I ran it on a wav file and it gives me a dictionary of timestamps and pitches (which I converted into MIDI numbers). How should I go about creating a MIDI file from this?
I looked up the formula for converting time in seconds to ticks:
ticks = (timestamp * bpm) / 60
It looks like the tick value should always increase with time, but I found that ticks sometimes decrease and sometimes increase for successive NoteOnEvents.
Could you help me out please?
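
For comparison, that formula also needs the resolution (ticks per beat), and the tick stored on each event is relative to the previous event, which is the usual reason the raw values appear to jump around. A sketch, assuming a fixed tempo and a list of (seconds, pitch) pairs sorted by time:

import midi

def timestamps_to_track(notes, bpm, resolution):
    # notes: list of (seconds, pitch) pairs, sorted by time.
    # Note-off events are omitted here for brevity.
    track = midi.Track()
    previous_tick = 0
    for seconds, pitch in notes:
        absolute_tick = int(seconds * (bpm / 60.0) * resolution)
        # Ticks are written relative to the previous event.
        track.append(midi.NoteOnEvent(tick=absolute_tick - previous_tick,
                                      velocity=90, pitch=pitch))
        previous_tick = absolute_tick
    track.append(midi.EndOfTrackEvent(tick=1))
    return track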

How to get the start time and duration of NoteEvent?

Hi vishnubob, I'm trying to use this library to parse a MIDI file in my project, but when I run this example I only get NoteOnEvents. Where are the NoteOffEvents? And can you show me how to get the start time and the duration of a NoteEvent?

thanks!

Questions on mididump.py

I downloaded this America the Beautiful.midi, and ran:

mididump.py america_the_beautiful_three_instruments.midi

which gave me this output

and I had a few questions:

  1. In this line: midi.NoteOnEvent(tick=720, channel=0, data=[62, 126]), what do the list elements in the data field represent?
  2. Why are there duplicate events? this event seems identical to this event
  3. Why aren't the events sorted by tick?
  4. In Side Note: What is a MIDI Tick?, you mention:

First, a saved MIDI file encodes an initial Resolution and Tempo

In the dumped file, I'm only seeing a resolution:

midi.Pattern(format=1, resolution=240, tracks=\

Where should I look for the Tempo?

Repo lacks versioning

The repository, as it currently stands, has no proper versioning system. This makes it hard for packagers (such as myself; I am packaging this for Arch Linux) to package "stable" versions of the code, because it's hard to know which commits are stable and which aren't. Therefore, I ask that you tag your releases with a specific version number to make releases easier.

Question: Is there a way I can extract pitches from MIDI?

I want to perform some mathematical operations on the pitches of a MIDI file. Is there a way I can extract the pitches from a MIDI file, store them in a list, perform some operations, and write them back to their corresponding events?
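
A rough sketch of that round trip (read the file, shift every note's pitch in place, write it back), assuming the pitch property on note events and with input.mid and output.mid as placeholder filenames:

import midi
pattern = midi.read_midifile("input.mid")
for track in pattern:
    for event in track:
        if isinstance(event, (midi.NoteOnEvent, midi.NoteOffEvent)):
            event.pitch = min(127, event.pitch + 2)   # transpose up a whole step
midi.write_midifile("output.mid", pattern)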

Support for MMC or CC

If I understand correctly, there isn't currently support for CC or MMC messages. Is that correct, and are there any plans to add that yet?

Question on how to change the pitch of certain notes (per-track)

I'm new to midi, so excuse the n00b question.

Here's a snippet of the actual algorithm I'm trying to implement:

Each measure of four beats represents a decade in time, starting from the year 1300A.D. and ending at 2015.

We used a simple algorithm in which a rise or fall in the value of each climate variable is matched by a rise or fall of equal magnitude (measured as a %) in the pitch of the corresponding musical voice. The reference value used in this model is the pre-industrial (1300-1760) average for temperature and CO2 concentrations. So, for example, if the temperature value rises by 10% as measured against the reference value over a decade, then the pitch of the trumpet will increase by 10% for the four beats of the corresponding measure in the score.

My current plan is to:

  • Read an input midi file
  • For each track
    • Create a new "modified" track that corresponds to this track
    • Read every note in the track
    • If the criteria above requires that the note must be changed:
      • Modify the note by changing the pitch a half-step up or down (are there any helper functions for doing this?)
      • Otherwise, copy the note over to the modified track as-is
  • Write the modified tracks to an output midi file

Do you have any examples of iterating through all notes in a track? If not, can you at least point me to the function/method to call? What I had in mind was basically something similar to the mxm python-midi example

As far as grouping the notes into groups of 4 beats, is there anything provided by the library to make this easier? I guess what I mean is: is there a "wrapper event" that invokes a callback after X number of beats and passes in all the notes from those beats? Or would I have to handle that on my own?

By the way, this is for an art project to bring awareness about Climate Change.

Calling Track.make_ticks_abs twice gives incorrect tick values

Calling Track.make_ticks_abs has a similar problematic behavior.

I would expect the second call to either of these functions to do nothing, as the ticks are already absolute/relative.

This is the current behavior for Track t:

print t
[midi.NoteOnEvent(tick=0, channel=0, data=[45, 95]),
midi.NoteOffEvent(tick=960, channel=0, data=[45, 80]),
midi.NoteOnEvent(tick=0, channel=0, data=[57, 95]),
midi.NoteOffEvent(tick=480, channel=0, data=[57, 80])]
t.make_ticks_abs()
print t #correct
[midi.NoteOnEvent(tick=0, channel=0, data=[45, 95]),
midi.NoteOffEvent(tick=960, channel=0, data=[45, 80]),
midi.NoteOnEvent(tick=960, channel=0, data=[57, 95]),
midi.NoteOffEvent(tick=1440, channel=0, data=[57, 80])]
t.make_ticks_abs()
print t #incorrect
[midi.NoteOnEvent(tick=0, channel=0, data=[45, 95]),
midi.NoteOffEvent(tick=960, channel=0, data=[45, 80]),
midi.NoteOnEvent(tick=1920, channel=0, data=[57, 95]),
midi.NoteOffEvent(tick=3360, channel=0, data=[57, 80])]

Missing files?

When I run sudo python setup.py install it fails with the error:
swig -python -o src/sequencer_alsa/sequencer_alsa_wrap.c src/sequencer_alsa/sequencer_alsa.i
Unable to execute swig: No such file or directory.

I checked the src/sequencer_alsa/ directory and there is no sequencer_alsa_wrap.c, but there is a sequencer_alsa.i

Installation problems with pip

Hi,

it seems that there are some problems when I try to install the midi package via pip.

Here is the output:

$ pip install midi
Collecting midi
  Could not find a version that satisfies the requirement midi (from versions: )
  Some externally hosted files were ignored as access to them may be unreliable (use --allow-external midi to allow).
No matching distribution found for midi

Interestingly, doing a pip search midi lists the package.

When I clone the repo and run pip install -e . from inside the directory, the installation finishes. However, when I install it like this I cannot import midi (the scripts/ commands also fail for the same reason):

$ python
>>> import midi
ImportError: No module named midi

or the script

$ mididump.py data/confuta.mid 
Traceback (most recent call last):
  File "/usr/local/bin/mididump.py", line 6, in <module>
    exec(compile(open(__file__).read(), __file__, 'exec'))
  File "/home/ckoerner/software/python-midi/scripts/mididump.py", line 5, in <module>
    import midi
ImportError: No module named midi

When I install the package in the recommended way (python setup.py install), everything works fine. However, I would really like to put the package into my requirements.txt file and install it with pip.

Best,
Christoph

Unknown Meta MIDI Event

In my file I have some weird MIDI events. How do I get the parser to just skip the incorrect events?

My MIDI editor can read the file just fine, so I know that it can still be read.

setup.py not working, missing characters.

I was trying to install the module, as I plan to do some work with MIDI files and it seemed useful, but I couldn't get setup.py to install. For some reason, in the version I downloaded at least, the brackets around the print statement on line 65 were missing, and so it wouldn't work. When I added them back in, it worked fine.

Python 3

Any plans on Python 3 compatibility?

Is README out of date?

I'm trying to use the examples in README, but using the new_stream function gives

AttributeError: 'module' object has no attribute 'new_stream'

Looking through the tests and the source, it looks like new_stream has been replaced by the Pattern class?

Octave is incorrect for constants

If you run the following, you would expect to get a result of 60 (that being the correct MIDI value for C4); however, it actually returns the value for C3.

>>> import midi
>>> midi.C_4
48

This seems to be a fault with how the globals are generated in constants.py that causes every note name constant to be shifted down one octave.
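
For reference, under the common convention where middle C is C4 and has MIDI note number 60, the note number works out to 12 * (octave + 1) + semitone. A quick check of that arithmetic shows the one-octave offset:

# Note numbers under the "middle C = C4 = 60" convention.
def note_number(octave, semitone):
    return 12 * (octave + 1) + semitone

print note_number(4, 0)   # 60 -- the expected value for C4
print note_number(3, 0)   # 48 -- what midi.C_4 currently returns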

Which license?

This looks really handy and I'd like to package it for the NixOS distro. I just wonder what license to put down for it. Thanks!

the readme examples don't work

import midi

t = midi.new_stream(resolution=120, tempo=200)

AttributeError Traceback (most recent call last)
in ()
----> 1 t = midi.new_stream(resolution=120, tempo=200)

AttributeError: 'module' object has no attribute 'new_stream'

Bug in Event.is_event() method?

Hi! I recently got a chance to try your code out, and it seems that the Event.is_event() method is unable to catch several valid MIDI events such as NoteOn and NoteOff. I believe this is due to a faulty mask for the bitwise AND operation with the statusmsg variable.

In line 97 of events.py it reads:
return (cls.statusmsg == (statusmsg & 0xF0))

Since you're comparing with a statusmsg variable of the parent class set to 0x0, it should be:
return (cls.statusmsg == (statusmsg & 0x0F)) (note the inverted mask: 0xF0 becomes 0x0F)

Without this modification, an event like NoteOn, which has a signature of 0x90, would never get caught, because 0x90 & 0xF0 equals 0x90 instead of 0x0 and the method will return False. In fact, a NoteOn event actually raises an 'Unknown MIDI Event' exception.

Is that correct, or did I misunderstand the code?

Cheers,
Mauricio

python-rtmidi integration

It would be useful to be able to send midi events encoded as midi messages into python-rtmidi output ports, and to parse midi messages received from python-rtmidi input ports as midi events.
It would be especially useful on the Windows platform, where the sequencer module isn't supported.

What do you think?

SetTempoEvent data interpretation?

I am confused as to how one might make use of the SetTempoEvent data; I can't determine how the three values listed in each event are related to the tempo; for example, with a pattern resolution of 384, an event where the tempo should be set to 60 bpm, and therefore approximately 2604 ms/tick, has data=[15, 66, 64]. What correlation do those numbers have to the tempo?
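
The three data bytes are the big-endian microseconds-per-quarter-note value described in the tick side note above; decoding them gives exactly 60 BPM here. A quick sketch of the arithmetic:

mpqn = (15 << 16) | (66 << 8) | 64    # 1000000 microseconds per beat
bpm = 60.0 * 1000000 / mpqn           # 60.0 BPM
microseconds_per_tick = mpqn / 384.0  # ~2604, i.e. about 2.6 milliseconds per tick
print mpqn, bpm, microseconds_per_tick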

Documentation??

Am I just missing documentation somewhere? It's almost impossible to do anything not described in your example without it. For example, I'm trying to figure out how to use SetTempoEvent, but I can't tell what any of the arguments are supposed to be. Help!

Question: how to set a note to "Minus C2"

Apparently virtual keyboards go down to "minus c2". How would I specify a note with this pitch in python-midi?

Conversation snippet with the composer I'm working with:

Minus C2 is the lowest note. The virtual MIDI keyboards - not real ones - go all the way down to -C2. A0 is the low note on a piano, although I think now that is -A1.

C2 is too high- that's one octave below Middle C these days.

You are starting 4 octaves too high.

midilisten and midiplay examples nonfunctional, undocumented

My best guesses about how to actually use the example scripts fail and indicate no solution. For example:

[russell@peregrin scripts (master) ]$ ./midiplay 0 0 ../mary.mid
Traceback (most recent call last):
  File "./midiplay", line 27, in
    seq.subscribe_port(client, port)
  File "/home/russell/opt/lib/python2.7/site-packages/midi/sequencer/sequencer.py", line 430, in subscribe_port
    self._subscribe_port(subscribe)
  File "/home/russell/opt/lib/python2.7/site-packages/midi/sequencer/sequencer.py", line 115, in _subscribe_port
    if err < 0: self._error(err)
  File "/home/russell/opt/lib/python2.7/site-packages/midi/sequencer/sequencer.py", line 74, in _error
    raise RuntimeError, msg
RuntimeError: ALSAError[-1]: Operation not permitted

Now what?

Generated MIDI file doesn't start playing at the beginning of file.

Perhaps I'm overlooking some details, but I wanted to see if anyone knew of this. When I generate MIDI files using this API, I'm noticing that the MIDI files don't begin playing until approx. half a second into the file. Is there a fix or workaround for this particular problem? Is this a bug, or am I misunderstanding the usage of the library?

I'm appending NoteOn events with a tick of 0 while the files are set to tick_relative = true. I'm going to try making the files using an absolute timing and see if that solves my issues.

I'm playing the files using Quicktime Player 7 and Max. I'm going to see if other programs reproduce the problem as well.

Note Values Mapping

Dear Vishnu,

First of all, thanks for this fantastic library. It's really good, and altogether different from the rest of the open-source stuff out there.

I am trying to capture, to a certain degree, the mappings of the MIDI note notations to their numerical values. I see that they are already there within the package; however, what I would precisely like to know is what alphabetical notation you are using to map those numerical values (for instance, what's the difference between Bb_1, Bs_1, and B_1?). What I am interpreting is that B_1, B_2, ... B_N corresponds to the 'B' in the Nth octave. Now, what I am having trouble interpreting is what Bb_1 and Bs_1 are.

Hope you can help me out in this.

Thanks.

Saif
