msaf's Introduction

Music Structure Analysis Framework

A Python framework to analyze music structure.

Documentation

See https://msaf.readthedocs.io for a complete reference manual and introductory tutorials.

Installation

From the root folder, type:

pip install .

(Note: you may need to create and activate a Python virtual environment with python -m venv .venv and source .venv/bin/activate first, depending on your system configuration).

Demonstration Notebook

You can follow a thorough example on this titanic Jupyter Notebook.

Citing MSAF

Nieto, O., Bello, J. P., Systematic Exploration Of Computational Music Structure Research. Proc. of the 17th International Society for Music Information Retrieval Conference (ISMIR). New York City, NY, USA, 2016 (PDF).

Credits

Created by Oriol Nieto ([email protected]).

msaf's People

Contributors

andimarafioti, ax-le, carlthome, fortunto2, jblsmith, jzq2000, keunwoochoi, maeich, maldil, pswoodworth, stefan-balke, urinieto, wangsix

msaf's Issues

Can't run quickstart example and/or "Run MSAF" notebook because the estimations are not saved.

When I try to run either the quickstart example or the "Run MSAF" notebook I get basically the same traceback:

---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-1-483106be7d2c> in <module>()
     15 
     16 # 4. Evaluate the results
---> 17 evals = msaf.eval.process(audio_file)
     18 print(evals)

/usr/local/lib/python3.4/dist-packages/msaf/eval.py in process(in_path, boundaries_id, labels_id, annot_beats, framesync, feature, hier, save, out_file, n_jobs, annotator_id, config)
    371         # Single File mode
    372         evals = [process_track(in_path, boundaries_id, labels_id, config,
--> 373                                annotator_id=annotator_id)]
    374     else:
    375         # Collection mode

/usr/local/lib/python3.4/dist-packages/msaf/eval.py in process_track(file_struct, boundaries_id, labels_id, config, annotator_id)
    274 
    275     one_res = compute_gt_results(est_file, ref_file, boundaries_id, labels_id,
--> 276                                  config, annotator_id=annotator_id)
    277 
    278     return one_res

/usr/local/lib/python3.4/dist-packages/msaf/eval.py in compute_gt_results(est_file, ref_file, boundaries_id, labels_id, config, bins, annotator_id)
    184     # Read estimations with correct configuration
    185     est_inter, est_labels = io.read_estimations(est_file, boundaries_id,
--> 186                                                 labels_id, **config)
    187 
    188     # Compute the results and return

/usr/local/lib/python3.4/dist-packages/msaf/input_output.py in read_estimations(est_file, boundaries_id, labels_id, **params)
     75     """
     76     # Open file and read jams
---> 77     jam = jams.load(est_file)
     78 
     79     # Find correct estimation

/usr/local/lib/python3.4/dist-packages/jams/core.py in load(path_or_file, validate, strict, fmt)
    210     """
    211 
--> 212     with _open(path_or_file, mode='r', fmt=fmt) as fdesc:
    213         jam = JAMS(**json.load(fdesc))
    214 

/usr/lib/python3.4/contextlib.py in __enter__(self)
     57     def __enter__(self):
     58         try:
---> 59             return next(self.gen)
     60         except StopIteration:
     61             raise RuntimeError("generator didn't yield") from None

/usr/local/lib/python3.4/dist-packages/jams/core.py in _open(name_or_fdesc, mode, fmt)
    140                 mode = '{:s}t'.format(mode)
    141 
--> 142             with open_map[ext](name_or_fdesc, mode=mode) as fdesc:
    143                 yield fdesc
    144 

FileNotFoundError: [Errno 2] No such file or directory: '../datasets/Sargon/estimations/01-Sargon-Mindless.jams'

This basically means that the estimations are never written to the jams file. I researched a bit and found that on August 3rd, 2016, Oriol Nieto commented out lines 358-360 of run.py:

        # Save estimations
        # msaf.utils.ensure_dir(os.path.dirname(file_struct.est_file))
        # io.save_estimations(file_struct, est_times, est_labels,
        #                     boundaries_id, labels_id, **config)

These lines should save the estimations, and if I uncomment them the examples work perfectly.

I am filing this as an issue because: 1) it seems odd to me that no one has brought it up (quick remark: if you already have written files, the system loads those, so this error won't appear); and 2) the commit message is "Fixing writing folder for MIREX."

Maybe there's something more to this than what I found. Is msaf writing files in another way and my system is failing? Can I explicitly write them?

run_msaf.py tries to evaluate when it is not supposed to do so.

Hi, I was testing msaf in collection mode and had this error message.
There are two audio files, and msaf produced a .json and a .jams file for each in features/ and estimations/, but nothing in references/.

ewert-server:keunwoo keunwoo$ /Users/keunwoo/Downloads/msaf/examples/run_msaf.py temp -bid foote -lid fmc2d -save
Mar 23 09:12:44  python[11009] <Error>: Set a breakpoint at CGSLogError to catch errors as they are logged.
Mar 23 09:12:44  python[11009] <Error>: This user is not allowed access to the window system right now.
_RegisterApplication(), FAILED TO establish the default connection to the WindowServer, _CGSDefaultConnection() is NULL.
Mar 23 09:12:44  python[11009] <Warning>: CGSConnectionByID: 0 is not a valid connection ID.
Mar 23 09:12:44  python[11009] <Warning>: CGSConnectionByID: 0 is not a valid connection ID.
Mar 23 09:12:44  python[11009] <Warning>: CGSConnectionByID: 0 is not a valid connection ID.
Mar 23 09:12:44  python[11009] <Warning>: CGSConnectionByID: 0 is not a valid connection ID.
Mar 23 09:12:44  python[11009] <Warning>: CGSConnectionByID: 0 is not a valid connection ID.
Mar 23 09:12:44  python[11009] <Warning>: CGSConnectionByID: 0 is not a valid connection ID.
Mar 23 09:12:44  python[11009] <Warning>: Invalid Connection ID 0
/Library/Python/2.7/site-packages/matplotlib/__init__.py:872: UserWarning: axes.color_cycle is deprecated and replaced with axes.prop_cycle; please use the latter.
  warnings.warn(self.msg_depr % (key, alt_key))
2016-03-23 09:12:44,972: INFO: Segmenting temp/audio/1.wav
2016-03-23 09:12:44,972: INFO: Loading audio file 1.wav
2016-03-23 09:12:57,657: INFO: Computing Harmonic Percussive source separation...
2016-03-23 09:13:14,147: INFO: Computing Spectrogram...
2016-03-23 09:13:14,788: INFO: Computing Constant-Q...
2016-03-23 09:13:20,237: INFO: Computing MFCCs...
2016-03-23 09:13:20,252: INFO: Computing HPCPs...
2016-03-23 09:13:24,391: INFO: Computing Tonnetz...
2016-03-23 09:13:24,945: INFO: Estimating Beats...
2016-03-23 09:13:25,549: INFO: Saving the JSON file in temp/features/1.json
2016-03-23 09:13:25,580: WARNING: No annotated beats
2016-03-23 09:13:29,032: INFO: Writing results in: temp/estimations/1.jams
/Library/Python/2.7/site-packages/pandas/tseries/timedeltas.py:55: RuntimeWarning: tp_compare didn't return -1 or -2 for exception
  value = value.astype('timedelta64[ns]', copy=False)
2016-03-23 09:13:29,501: INFO: Segmenting temp/audio/2.wav
2016-03-23 09:13:29,501: INFO: Loading audio file 2.wav
2016-03-23 09:13:56,653: INFO: Computing Harmonic Percussive source separation...
2016-03-23 09:14:31,967: INFO: Computing Spectrogram...
2016-03-23 09:14:33,481: INFO: Computing Constant-Q...
2016-03-23 09:14:45,480: INFO: Computing MFCCs...
2016-03-23 09:14:45,520: INFO: Computing HPCPs...
2016-03-23 09:14:54,507: INFO: Computing Tonnetz...
2016-03-23 09:14:55,673: INFO: Estimating Beats...
2016-03-23 09:14:56,880: INFO: Saving the JSON file in temp/features/2.json
2016-03-23 09:14:56,978: WARNING: No annotated beats
2016-03-23 09:15:04,192: INFO: Writing results in: temp/estimations/2.jams
2016-03-23 09:15:05,095: INFO: Evaluating 2 tracks...
2016-03-23 09:15:05,095: WARNING: No references for file: temp/references/1.jams
2016-03-23 09:15:05,095: WARNING: No references for file: temp/references/2.jams
2016-03-23 09:15:05,099: INFO: 2 tracks analyzed
Traceback (most recent call last):
  File "/Users/keunwoo/Downloads/msaf/examples/run_msaf.py", line 135, in <module>
    main()
  File "/Users/keunwoo/Downloads/msaf/examples/run_msaf.py", line 128, in main
    msaf.eval.process(args.in_path, **params)
  File "/Users/keunwoo/Library/Python/2.7/lib/python/site-packages/msaf-0.0.4-py2.7.egg/msaf/eval.py", line 396, in process
    print_results(results)
  File "/Users/keunwoo/Library/Python/2.7/lib/python/site-packages/msaf-0.0.4-py2.7.egg/msaf/eval.py", line 31, in print_results
    logging.info(results["HitRate_3F"])
  File "/Library/Python/2.7/site-packages/pandas/core/frame.py", line 1787, in __getitem__
    return self._getitem_column(key)
  File "/Library/Python/2.7/site-packages/pandas/core/frame.py", line 1794, in _getitem_column
    return self._get_item_cache(key)
  File "/Library/Python/2.7/site-packages/pandas/core/generic.py", line 1079, in _get_item_cache
    values = self._data.get(item)
  File "/Library/Python/2.7/site-packages/pandas/core/internals.py", line 2843, in get
    loc = self.items.get_loc(item)
  File "/Library/Python/2.7/site-packages/pandas/core/index.py", line 1437, in get_loc
    return self._engine.get_loc(_values_from_object(key))
  File "pandas/index.pyx", line 134, in pandas.index.IndexEngine.get_loc (pandas/index.c:3824)
  File "pandas/index.pyx", line 154, in pandas.index.IndexEngine.get_loc (pandas/index.c:3704)
  File "pandas/hashtable.pyx", line 697, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:12349)
  File "pandas/hashtable.pyx", line 705, in pandas.hashtable.PyObjectHashTable.get_item (pandas/hashtable.c:12300)
KeyError: 'HitRate_3F'

name 'unicode' is not defined

Hi, sorry but I have
this error:

root@xxxx:~/msaf/examples# python3 ./run_msaf.py MarcoKuell_-_Epic_Trailer.mp3  -bid sf -lid fmc2d
/usr/local/lib/python3.4/dist-packages/librosa-0.4.1-py3.4.egg/librosa/core/audio.py:37: UserWarning: Could not import scikits.samplerate. Falling back to scipy.signal
/usr/local/lib/python3.4/dist-packages/matplotlib-1.5.1_1018.gc259a55-py3.4-linux-x86_64.egg/matplotlib/__init__.py:877: UserWarning: axes.color_cycle is deprecated and replaced with axes.prop_cycle; please use the latter.
  warnings.warn(self.msg_depr % (key, alt_key))
2016-01-24 21:46:55,856: INFO: Loading audio file MarcoKuell_-_Epic_Trailer.mp3
2016-01-24 21:49:37,940: INFO: Computing Harmonic Percussive source separation...
2016-01-24 21:49:43,221: INFO: Computing Spectrogram...
2016-01-24 21:49:43,936: INFO: Computing Constant-Q...
2016-01-24 21:50:26,809: INFO: Computing MFCCs...
2016-01-24 21:50:26,825: INFO: Computing HPCPs...
2016-01-24 21:51:13,819: INFO: Computing Tonnetz...
2016-01-24 21:51:14,129: INFO: Estimating Beats...
2016-01-24 21:51:14,560: INFO: Saving the JSON file in /root/music/features/MarcoKuell_-_Epic_Trailer.json
2016-01-24 21:51:14,603: WARNING: No annotated beats
segment_open
Traceback (most recent call last):
  File "./run_msaf.py", line 135, in <module>
    main()
  File "./run_msaf.py", line 116, in main
    res = func.process(args.in_path, **params)
  File "/usr/local/lib/python3.4/dist-packages/msaf-0.0.3-py3.4.egg/msaf/run.py", line 371, in process
    boundaries_id, labels_id, **config)
  File "/usr/local/lib/python3.4/dist-packages/msaf-0.0.3-py3.4.egg/msaf/input_output.py", line 419, in save_estimations
    ann.append(time=bound_inter[0], duration=dur, value=unicode(value))
NameError: name 'unicode' is not defined
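
The builtin unicode only exists on Python 2, so this call in save_estimations fails under Python 3. A minimal compatibility sketch (not the actual MSAF fix; the value below is made up) that makes a call like unicode(value) work on both versions:

import sys

if sys.version_info[0] >= 3:
    unicode = str            # Python 3: str is already unicode

value = 6.0
print(unicode(value))        # prints '6.0' on both Python 2 and 3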

Add annot_beats documentation.

I think the annot_beats option is quite useful, but the complete lack of documentation makes me think that it may have some problems.

I got it to work just by writing a jams file like this (following this example):

import jams

# track_duration and beat_times are assumed to come from elsewhere (e.g., a beat tracker).
jam = jams.JAMS()
jam.file_metadata.duration = track_duration
beat_a = jams.Annotation(namespace='beat')
beat_a.annotation_metadata = jams.AnnotationMetadata(data_source='librosa beat tracker')

# Add beat timings to the annotation record.
# The beat namespace does not require value or confidence fields,
# so we can leave those blank.
for t in beat_times:
    beat_a.append(time=t, duration=0.0)

# Store the new annotation in the jam
jam.annotations.append(beat_a)
jam.save('test.jams')

The jams file then needs to be saved in the 'references' folder of the dataset, with the same name as the audio file.

It is important to note that the features folder does not have to contain a feature file for the track you wish to process with the external beats. I also think a 'force-recalculate-features' option would be cool.

Questions:
Is the annot_beats option tested and does it work well? Is it documented somewhere and I just missed it? Would a little tutorial be useful? I could make a proper example with the beat extraction.

input argument of msaf.process

Hi, I'd like to use msaf as a Python module.
Is there an argument to disable saving the JSON file after a segmentation? It would also be helpful to have a list of the arguments that msaf.process() accepts.
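
For reference, a sketch of a call with keyword arguments taken from the run.py signature quoted in a traceback elsewhere on this page (in_path, annot_beats, feature, framesync, boundaries_id, labels_id, hier, sonify_bounds, plot, n_jobs, annotator_id, config, out_bounds, out_sr); whether any of them disables the JSON feature caching is not clear from the signature alone, and the audio path below is hypothetical:

import msaf

boundaries, labels = msaf.process("audio/track.wav",
                                  feature="mfcc",
                                  boundaries_id="foote",
                                  labels_id="fmc2d",
                                  framesync=False,
                                  plot=False)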

Make "ground-truth" boundaries great again

In beat-sync mode (the default), ground-truth boundaries need to be aligned to the closest beat time so that label algorithms can later use these boundaries. However, this can produce misalignments that are confusing in the results. We should fix this.
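
A minimal sketch of the alignment step described above (not MSAF's actual implementation): snap each ground-truth boundary to the closest estimated beat time.

import numpy as np

def snap_to_beats(bound_times, beat_times):
    """Snap each boundary time to the closest beat time."""
    beat_times = np.asarray(beat_times)
    return np.array([beat_times[np.argmin(np.abs(beat_times - t))]
                     for t in bound_times])

print(snap_to_beats([0.0, 10.2, 31.7], [0.0, 0.5, 10.0, 10.5, 31.5, 32.0]))
# -> [ 0.  10.  31.5]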

Error with jams.save running single file mode

I'm attempting to run a song through single file mode, and it can't seem to make it through saving the estimations. The issue appears to come in when calling jams.save(file_struct.est_file). I can't seem to figure out if it's an issue with msaf or with jams. The traceback is below:

~/msaf/examples$./run_msaf.py Anna\ Sun.mp3 -bid cnmf -lid fmc2d -f hpcp -s
2015-11-04 15:54:07,219: INFO: Loading audio file Anna Sun.mp3
2015-11-04 15:54:27,048: INFO: Computing Harmonic Percussive source separation...
2015-11-04 15:54:38,629: INFO: Computing Spectrogram...
2015-11-04 15:54:39,087: INFO: Computing Constant-Q...
2015-11-04 15:54:44,226: INFO: Computing MFCCs...
2015-11-04 15:54:44,271: INFO: Computing HPCPs...
2015-11-04 15:54:48,193: INFO: Computing Tonnetz...
2015-11-04 15:54:48,925: INFO: Estimating Beats...
2015-11-04 15:54:49,705: INFO: Saving the JSON file in features/Anna Sun.json
2015-11-04 15:54:49,763: WARNING: No annotated beats
2015-11-04 15:55:00,664: INFO: Sonifying boundaries in out_bounds.wav...
Traceback (most recent call last):
  File "./run_msaf.py", line 135, in <module>
    main()
  File "./run_msaf.py", line 116, in main
    res = func.process(args.in_path, **params)
  File "/usr/local/lib/python2.7/site-packages/msaf-0.0.3-py2.7.egg/msaf/run.py", line 369, in process
    boundaries_id, labels_id, **config)
  File "/usr/local/lib/python2.7/site-packages/msaf-0.0.3-py2.7.egg/msaf/input_output.py", line 421, in save_estimations
    jam.save(file_struct.est_file)
  File "build/bdist.macosx-10.11-intel/egg/jams/core.py", line 1170, in save
  File "build/bdist.macosx-10.11-intel/egg/jams/core.py", line 1200, in validate
  File "build/bdist.macosx-10.11-intel/egg/jams/core.py", line 474, in validate
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 478, in validate
    cls(schema, *args, **kwargs).validate(instance)
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 122, in validate
    for error in self.iter_errors(*args, **kwargs):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 291, in properties_draft4
    schema_path=property,
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 42, in items
    for error in validator.descend(item, items, path=index):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 203, in ref
    for error in validator.descend(instance, resolved):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 291, in properties_draft4
    schema_path=property,
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 328, in oneOf_draft4
    errs = list(validator.descend(instance, subschema, schema_path=index))
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 203, in ref
    for error in validator.descend(instance, resolved):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 42, in items
    for error in validator.descend(item, items, path=index):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 203, in ref
    for error in validator.descend(instance, resolved):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 291, in properties_draft4
    schema_path=property,
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 114, in descend
    for error in self.iter_errors(instance, schema):
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/validators.py", line 98, in iter_errors
    for error in errors:
  File "/Library/Python/2.7/site-packages/jsonschema-2.5.1-py2.7.egg/jsonschema/_validators.py", line 80, in minimum
    failed = instance < minimum
TypeError: Cannot cast ufunc less input from dtype('float64') to dtype('<m8[ns]') with casting rule 'same_kind'

Problems searching inside a JAMS file

There's a problem when searching inside a JAMS file that contains annotations with a specific key that can be either None or any other type. I reported this in the official JAMS repo (here).

This is particularly problematic in MSAF, where you can have a JAMS file containing several estimations for an algorithm, some of which may have labels_id=None and others labels_id=algorithm_id.

While this is getting fixed on JAMS, I will update the code with Brian's suggestion.

VMO Algorithm Issues

The new Variable Markov Oracle algorithm (included in this PR) didn't pass the unit tests because of two reasons:

  • The results of the main vmo call did not follow the expected format: len(my_bounds) == len(my_labels) + 1. To fix this, I simply removed the last label (since it seemed that it was always repeated; see the sketch after this list). @wangsix, could you confirm this is correct?

  • When calling VMO to obtain the labels only (i.e., when we have used another algorithm to compute the boundaries), the previously computed boundaries were ignored. This was an MSAF problem, which I fixed by making sure we use the old boundaries in the interface. @wangsix, do you think VMO could be improved by including previously computed boundaries when obtaining the labels only? If not, the current synchronization of labels to the previously computed boundaries should suffice.
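
A minimal sketch of the first fix, with made-up boundary and label values (this is not the actual vmo code):

my_bounds = [0, 12, 40, 77, 100]
my_labels = [0, 1, 0, 2, 2]  # last label repeated by the vmo call

# MSAF expects one more boundary than labels, so drop the trailing duplicate.
if len(my_labels) == len(my_bounds):
    my_labels = my_labels[:-1]

assert len(my_bounds) == len(my_labels) + 1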

OLDA's "Audio file too short!" warning is misleading

In OLDA's segmenter.py there's a warning for short files (both in 277-290 and 340-345):

        try:
            # Load and apply transform
            W = load_transform(self.config["transform"])
            F = W.dot(F)

            # Get Segments
            kmin, kmax = get_num_segs(dur)
            est_idxs = get_segments(F, kmin=kmin, kmax=kmax)
        except:
            # The audio file is too short, only beginning and end
            logging.warning("Audio file too short! "
                            "Only start and end boundaries.")
            est_idxs = [0, F.shape[1] - 1]

and

        try:
            # Load and apply transform
            W = load_transform(self.config["transform"])
            F = W.dot(F)

            # Get Segments
            kmin, kmax = get_num_segs(dur)

            # Run algorithm layer by layer
            est_idxs = []
            est_labels = []
            for k in range(kmin, kmax):
                S, cost = get_k_segments(F, k)
                est_idxs.append(S)
                est_labels.append(np.ones(len(S) - 1) * -1)

                # Make sure that the first and last boundaries are included
                assert est_idxs[-1][0] == 0 and \
                    est_idxs[-1][-1] == F.shape[1] - 1, "Layer %d does not " \
                    "start or end in the right frame(s)." % k

                # Post process layer
                est_idxs[-1], est_labels[-1] = \
                        self._postprocess(est_idxs[-1], est_labels[-1])
        except:
            # The audio file is too short, only beginning and end
            logging.warning("Audio file too short! "
                            "Only start and end boundaries.")
            est_idxs = [np.array([0, F.shape[1] - 1])]
            est_labels = [np.ones(1) * -1]

I found that at the moment there is an issue between librosa and sklearn that makes the olda algorithm fail (it should be fixed soon, though), and this logging warns about something completely unrelated to the actual problem. Maybe someone familiar with the olda algorithm could estimate how long a file needs to be for it to work?

Error when loading jams file from datasets in msaf

The error I mentioned to you a few weeks ago. It did not happen with the jams files from the jams repository.

jam_path = '.../msaf/datasets/Isophonics/references/Isophonics_1-16 Beat It.jams'
salami = pyjams.load(jam_path)


TypeError                                 Traceback (most recent call last)
<ipython-input-…> in <module>()
      1 jam_path = '~/Documents/GitHub/msaf/datasets/Isophonics/references/Isophonics_1-16 Beat It.jams'
----> 2 jams_test = pyjams.load(jam_path)

~/Documents/Github/jams/pyjams/pyjams.pyc in load(filepath)
     73 def load(filepath):
     74     """Load a JAMS Annotation from a file."""
---> 75     return JAMS(**json.load(open(filepath, 'r')))
     76
     77

TypeError: __init__() got an unexpected keyword argument 'tags'

how to interpret the label

I see the label result like this:
8.0, 7.0, 0.0, 1.0, 1.0, 1.5, 1.0, 1.0, 1.0, 1.0, 8.0

and sometimes like this:
8.0, 1.0, 2.0, 1.0, 2.0, 7.0, 2.0, 8.0

What is the meaning of those numbers? Do different label methods return labels with different meanings?

scipy.linalg.eig breaks in OLDA code

For some reason, if I use scipy instead of numpy in this line, Python crashes on OS X, no matter how many processes I am using:
https://github.com/urinieto/msaf/blob/master/algorithms/olda/segmenter.py#L65

I tried to replicate the problem in this little script:
https://github.com/urinieto/msaf/blob/master/algorithms/olda/sandbox/test_linalg.py

This script reads the exact data that makes Python crash on OS X. However, this script only crashes if the number of processes is > 1. Weird.

Any help? (Especially @bmcfee :))

Make features computation more modular

Right now the code is a bit spaghetti. It should be easier to add new features. Also, we should only compute the feature that is actually going to be used at the precise moment it is needed, and store it accordingly.

Add default parameters when segmenting audio

It might be good to select a default algorithm when none is specified. Probably the Structural Features algorithm, since it gives relatively good results and is quite fast.
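
A minimal sketch of what that default selection could look like (assumptions: the "sf" identifier for Structural Features, as used in commands elsewhere on this page, and a hypothetical audio path; this is not how MSAF currently behaves):

import msaf

boundaries_id = None                  # e.g., nothing passed on the command line
if boundaries_id is None:
    boundaries_id = "sf"              # fall back to Structural Features

boundaries, labels = msaf.process("audio/track.wav", boundaries_id=boundaries_id)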

Stop log-normalizing everything

There is a serious bug that was log-normalizing all the features by default in these algorithms:

  • cnmf
  • fmc2d (this should be fine)
  • foote
  • sf

Only PCP features should be log-normalized (if desired); the rest should simply be normalized.
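
A sketch of that intended behaviour (hypothetical helper and feature identifiers, not the actual MSAF code):

import numpy as np

def normalize_features(F, feature_id, log_pcp=True):
    """Log-compress only PCP-like features; just scale everything else."""
    F = np.asarray(F, dtype=float)
    if feature_id in ("pcp", "hpcp") and log_pcp:
        F = np.log1p(F)                          # log-compression for chroma-like features
    return F / (F.max() + np.finfo(float).eps)   # simple max-normalization for all features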

OLDA vs Scluster Hierarchies

The hierarchies should be consistent in terms of lowest / highest level. Right now, OLDA returns the highest level (upper scale in SALAMI) in the last position of the array, while Scluster returns it in the first position.

building algorithms/cc

Hi Guys

I have ventured into the land of Python (ignorantly) on my Ubuntu 14.04 machine (Python compiled with gcc 4.9), convinced by @urinieto.

I do seem to have some problems compiling algorithms/cc; I hope you can help.

I had some issues with size_t and had to include <cstddef> in TempoTrackV2.h, since apparently newer versions of gcc have issues with size_t (just FYI).

Now I've run into another issue which is a bit trickier, and I hope you can help out.

I do the following:
~/downloads/msaf/algorithms/cc# python setup.py build_ext --force --inplace

running build_ext
building 'cc_segmenter' extension
C compiler: x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC

compile options: '-Idsp/segmentation -I. -Iinclude -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c'
extra options: '-DUSE_PTHREADS'
x86_64-linux-gnu-gcc: dsp/phasevocoder/PhaseVocoder.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: thread/Thread.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: dsp/rateconversion/Decimator.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: dsp/chromagram/ConstantQ.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: dsp/tonal/TonalEstimator.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
In file included from dsp/tonal/TonalEstimator.cpp:16:0:
dsp/tonal/TonalEstimator.h: In member function ‘void ChromaVector::printDebug()’:
dsp/tonal/TonalEstimator.h:35:21: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
   for (int i = 0; i < size(); i++)
                     ^
dsp/tonal/TonalEstimator.h: In member function ‘void TCSVector::printDebug()’:
dsp/tonal/TonalEstimator.h:71:21: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
   for (int i = 0; i < size(); i++)
                     ^
x86_64-linux-gnu-gcc: dsp/tempotracking/TempoTrack.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
dsp/tempotracking/TempoTrack.cpp: In member function ‘void TempoTrack::initialise(TTParams)’:
dsp/tempotracking/TempoTrack.cpp:74:18: warning: unused variable ‘winPre’ [-Wunused-variable]
     unsigned int winPre = Params.WinT.pre;
                  ^
dsp/tempotracking/TempoTrack.cpp:75:18: warning: unused variable ‘winPost’ [-Wunused-variable]
     unsigned int winPost = Params.WinT.post;
                  ^
dsp/tempotracking/TempoTrack.cpp: In member function ‘double TempoTrack::tempoMM(double*, double*, int)’:
dsp/tempotracking/TempoTrack.cpp:181:12: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
  for (a=1;a<=numelem;a++)
            ^
dsp/tempotracking/TempoTrack.cpp: In member function ‘void TempoTrack::createPhaseExtractor(double*, unsigned int, double, unsigned int, unsigned int)’:
dsp/tempotracking/TempoTrack.cpp:590:21: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
  for(  int i = 0; i < scratchLength; i++ )
                     ^
dsp/tempotracking/TempoTrack.cpp:597:19: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
  for(int i = 0; i < scratchLength; i ++)
                   ^
dsp/tempotracking/TempoTrack.cpp: In member function ‘int TempoTrack::phaseMM(double*, double*, unsigned int, double)’:
dsp/tempotracking/TempoTrack.cpp:637:23: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     for( int i = 0; i < winLength; i++ )
                       ^
dsp/tempotracking/TempoTrack.cpp:646:28: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
  for(int i = 1 + (o - 1); i< winLength; i += (p + 1))
                            ^
dsp/tempotracking/TempoTrack.cpp: In member function ‘std::vector<int> TempoTrack::process(std::vector<double>, std::vector<double>*)’:
dsp/tempotracking/TempoTrack.cpp:823:62: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
                     for (int i = 0; i < TTLoopIndex + 3 && i < TTFrames; ++i) {
                                                              ^
dsp/tempotracking/TempoTrack.cpp:827:62: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
                     for (int i = 0; i < TTLoopIndex + 3 && i < TTFrames; ++i) {
                                                              ^
dsp/tempotracking/TempoTrack.cpp:715:18: warning: unused variable ‘DFCLength’ [-Wunused-variable]
     unsigned int DFCLength = m_dataLength + m_winLength;
                  ^
x86_64-linux-gnu-gcc: dsp/transforms/FFT.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: dsp/segmentation/Segmenter.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
dsp/segmentation/Segmenter.cpp: In function ‘std::ostream& operator<<(std::ostream&, const Segmentation&)’:
dsp/segmentation/Segmenter.cpp:23:20: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
  for (int i = 0; i < s.segments.size(); i++)
                    ^
x86_64-linux-gnu-gcc: dsp/tempotracking/TempoTrackV2.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
dsp/tempotracking/TempoTrackV2.cpp: In member function ‘void TempoTrackV2::calculateBeats(const std::vector<double>&, const std::vector<double>&, std::vector<double>&)’:
dsp/tempotracking/TempoTrackV2.cpp:465:20: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     if (startpoint >= backlink.size()) startpoint = backlink.size()-1;
                    ^
x86_64-linux-gnu-gcc: maths/CosineDistance.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
maths/CosineDistance.cpp: In member function ‘double CosineDistance::distance(const std::vector<double>&, const std::vector<double>&)’:
maths/CosineDistance.cpp:37:23: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
         for(int i=0; i<v1.size(); i++)
                       ^
x86_64-linux-gnu-gcc: dsp/signalconditioning/DFProcess.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
dsp/signalconditioning/DFProcess.cpp: In member function ‘void DFProcess::removeDCNormalize(double*, double*)’:
dsp/signalconditioning/DFProcess.cpp:183:31: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     for( unsigned int i = 0; i< m_length; i++)
                               ^
x86_64-linux-gnu-gcc: dsp/mfcc/MFCC.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: maths/KLDivergence.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
x86_64-linux-gnu-gcc: dsp/segmentation/cluster_segmenter.c
x86_64-linux-gnu-gcc: dsp/segmentation/ClusterMeltSegmenter.cpp
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
dsp/segmentation/ClusterMeltSegmenter.cpp: In member function ‘void ClusterMeltSegmenter::extractFeaturesConstQ(const double*, int)’:
dsp/segmentation/ClusterMeltSegmenter.cpp:159:38: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     if (!window || window->getSize() != fftsize) {
                                      ^
dsp/segmentation/ClusterMeltSegmenter.cpp: In member function ‘virtual void ClusterMeltSegmenter::segment()’:
dsp/segmentation/ClusterMeltSegmenter.cpp:338:25: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     if (features.size() < histogramLength) return;
                         ^
dsp/segmentation/ClusterMeltSegmenter.cpp:345:23: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     for (int i = 0; i < features.size(); i++)
                       ^
dsp/segmentation/ClusterMeltSegmenter.cpp:349:31: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
             for (int j = 0; j < features[0].size(); j++)
                               ^
dsp/segmentation/ClusterMeltSegmenter.cpp:378:23: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     for (int i = 0; i < features.size(); i++)
                       ^
dsp/segmentation/ClusterMeltSegmenter.cpp: In function ‘int median(std::vector<int>&)’:
dsp/segmentation/ClusterMeltSegmenter.cpp:389:48: error: ‘nth_element’ was not declared in this scope
     nth_element(v.begin(), v.begin()+n, v.end());
                                                ^
dsp/segmentation/ClusterMeltSegmenter.cpp: In member function ‘void ClusterMeltSegmenter::makeSegmentation(int*, int)’:
dsp/segmentation/ClusterMeltSegmenter.cpp:422:27: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
         for (int i = 0; i < annotBounds.size() - 1; i++) {
                           ^
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
dsp/segmentation/ClusterMeltSegmenter.cpp: In member function ‘void ClusterMeltSegmenter::extractFeaturesConstQ(const double*, int)’:
dsp/segmentation/ClusterMeltSegmenter.cpp:159:38: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     if (!window || window->getSize() != fftsize) {
                                      ^
dsp/segmentation/ClusterMeltSegmenter.cpp: In member function ‘virtual void ClusterMeltSegmenter::segment()’:
dsp/segmentation/ClusterMeltSegmenter.cpp:338:25: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     if (features.size() < histogramLength) return;
                         ^
dsp/segmentation/ClusterMeltSegmenter.cpp:345:23: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     for (int i = 0; i < features.size(); i++)
                       ^
dsp/segmentation/ClusterMeltSegmenter.cpp:349:31: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
             for (int j = 0; j < features[0].size(); j++)
                               ^
dsp/segmentation/ClusterMeltSegmenter.cpp:378:23: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
     for (int i = 0; i < features.size(); i++)
                       ^
dsp/segmentation/ClusterMeltSegmenter.cpp: In function ‘int median(std::vector<int>&)’:
dsp/segmentation/ClusterMeltSegmenter.cpp:389:48: error: ‘nth_element’ was not declared in this scope
     nth_element(v.begin(), v.begin()+n, v.end());
                                                ^
dsp/segmentation/ClusterMeltSegmenter.cpp: In member function ‘void ClusterMeltSegmenter::makeSegmentation(int*, int)’:
dsp/segmentation/ClusterMeltSegmenter.cpp:422:27: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
         for (int i = 0; i < annotBounds.size() - 1; i++) {
                           ^
error: Command "x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -D_FORTIFY_SOURCE=2 -g -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Idsp/segmentation -I. -Iinclude -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c dsp/segmentation/ClusterMeltSegmenter.cpp -o build/temp.linux-x86_64-2.7/dsp/segmentation/ClusterMeltSegmenter.o -DUSE_PTHREADS" failed with exit status 1

I can see that I have a problem with
dsp/segmentation/ClusterMeltSegmenter.cpp:389:48: error: ‘nth_element’ was not declared in this scope nth_element(v.begin(), v.begin()+n, v.end());

where nth_element seems to be the culprit. It might again be an issue with the newer version of the gcc compiler. Any suggestions?

Jams deprecation of to_interval_values() breaks eval.py

See the deprecation change made in the JAMS 0.3.0 API for to_interval_values() here: marl/jams#152

This change is raising the following exception when running the tutorial:

  File "MSAF.py", line 18, in <module>
    evals = msaf.eval.process(audio_file)
  File "/usr/local/lib/python2.7/dist-packages/msaf/eval.py", line 373, in process
    annotator_id=annotator_id)]
  File "/usr/local/lib/python2.7/dist-packages/msaf/eval.py", line 276, in process_track
    config, annotator_id=annotator_id)
  File "/usr/local/lib/python2.7/dist-packages/msaf/eval.py", line 182, in compute_gt_results
    ref_inter, ref_labels = ann.data.to_interval_values()
AttributeError: 'SortedListWithKey' object has no attribute 'to_interval_values'
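
A sketch of the updated call (assuming jams >= 0.3.0; the file path and namespace query below are hypothetical): the interval/label pairs now come from the Annotation object itself rather than from its .data attribute.

import jams

jam = jams.load("estimations/track.jams")
ann = jam.search(namespace="segment_open")[0]
ref_inter, ref_labels = ann.to_interval_values()   # replaces ann.data.to_interval_values()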

IndexError: failed to coerce slice entry of type long to integer

Hi,

When executing msaf.process(filepath) I get the following traceback error. This was not happening before, but is now starting to happen very frequently, when processing a list of audio files.

How should I fix this? Thank you for your help @urinieto @keunwoochoi !

Traceback (most recent call last):
  File "app.py", line 157, in <module>
    process_files()
  File "app.py", line 102, in process_files
    boundaries, _ = msaf.process(file)
  File "/usr/local/lib/python2.7/dist-packages/msaf/run.py", line 338, in process
    annotator_id=annotator_id)
  File "/usr/local/lib/python2.7/dist-packages/msaf/run.py", line 189, in run_algorithms
    if config["features"].features.shape[0] <= msaf.config.minimum_frames:
  File "/usr/local/lib/python2.7/dist-packages/msaf/base.py", line 392, in features
    self._compute_all_features()
  File "/usr/local/lib/python2.7/dist-packages/msaf/base.py", line 339, in _compute_all_feature
s
    self._framesync_features = self.compute_features()
  File "/usr/local/lib/python2.7/dist-packages/msaf/features.py", line 225, in compute_features
    audio_harmonic, _ = self.compute_HPSS()
  File "/usr/local/lib/python2.7/dist-packages/msaf/base.py", line 108, in compute_HPSS
    return librosa.effects.hpss(self._audio)
  File "/usr/local/lib/python2.7/dist-packages/librosa/effects.py", line 86, in hpss
    stft = core.stft(y)
  File "/usr/local/lib/python2.7/dist-packages/librosa/core/spectrum.py", line 164, in stft
    y = np.pad(y, int(n_fft // 2), mode='reflect')
  File "/usr/local/lib/python2.7/dist-packages/numpy/lib/arraypad.py", line 1451, in pad
    pad_iter_a), method, axis)
  File "/usr/local/lib/python2.7/dist-packages/numpy/lib/arraypad.py", line 799, in _pad_ref
    ref_chunk1 = arr[ref_slice]
IndexError: failed to coerce slice entry of type long to integer

Change Algorithm Interface

Let's do it like the Features module and use Mixins to differentiate between the types of algorithms.

checking file existence

As I once mentioned in the comments of #15, the run() process doesn't check whether the file actually exists at the given path. When there's no file, msaf.process() doesn't show any error but returns an empty array.
Wouldn't it be better if it checked whether the file/folder exists? I'd send a PR if you think so.
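
A minimal sketch of the check the issue proposes, with a hypothetical audio path (this is not current MSAF behaviour): fail early instead of silently returning empty results.

import os
import msaf

audio_file = "audio/track.wav"
if not os.path.exists(audio_file):
    raise IOError("Audio file not found: %s" % audio_file)
boundaries, labels = msaf.process(audio_file)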

No module named _backend_gdk

I get this error when trying to import the module, and after some research it looks like it has to do with my matplotlib distribution not being compiled to work with gtk+. So. Why does this library even need gdk support? How does one get this on mac?

Reimplement Constrained Clustering and SI-PLCA

Right now the Constrained Clustering and the Shift-Invariant Probabilistic Latent Component Analysis are found in the msaf-gpl repo due to license issues.

It would be fun to re-implement them in Python under an MIT license, which would also avoid having the msaf-gpl repo (which is probably a bit confusing).

Improvements to scluster

Some experiments on the SPAM dataset have turned up a few quirks in the Laplacian segmentation method that shouldn't be too hard to fix. Problematic tracks include:

  • SALAMI_478
  • SALAMI_302
  • SALAMI_108
  • SALAMI_838

One persistent issue that I'm seeing is a tendency on long tracks to concentrate all of the clustering on a small region in the middle of the track, while leaving the vast majority of the remainder labeled as one component. This is probably an error in bandwidth estimation, but it deserves some careful diagnostics.

Another issue, potentially unrelated, is due to some high-frequency noise in the laplacian eigenvectors. I've seen this in a few random tracks, and it's easily fixed by applying a temporal median filter to the eigenvectors before clustering. This is already implemented in the librosa gallery version but I haven't pushed it upstream anywhere yet.
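
A minimal sketch of that smoothing step (assumptions: eigvecs is an (n_frames, k) array of Laplacian eigenvectors; this is not the librosa gallery code itself):

import numpy as np
import scipy.ndimage

def smooth_eigenvectors(eigvecs, size=9):
    """Median-filter each eigenvector along the time axis before clustering."""
    return scipy.ndimage.median_filter(eigvecs, size=(size, 1))

smoothed = smooth_eigenvectors(np.random.randn(500, 5))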

I'll probably send a PR for this some time in the near future, so if you want to collect more suggestions in this thread, I'd appreciate it.

No module named _backend_gdk

I get this error when trying to import the module, and after some research it looks like it has to do with my matplotlib distribution not being compiled to work with gtk+. So. Why does this library even need a visualization toolbox?

*.jams does not exis.

Hi, new problem:
this happens when using music that is not from a dataset.

Reference file ../references/via.jams does not exis. You must have annotated references to run evaluations.


Do I need to run Sonic Visualizer manually to create the annotations?

ValueError: Could not convert "None" to boolean

Hi,

I'm just trying msaf (version 0.1.1) out, but I am getting this error: ValueError: Could not convert "None" to boolean
The full stack trace is here:

/usr/lib/pymodules/python2.7/matplotlib/rcsetup.py:378: UserWarning: tk.pythoninspect is obsolete, and has no effect
  warnings.warn("tk.pythoninspect is obsolete, and has no effect")
Traceback (most recent call last):
  File "app.py", line 7, in <module>
    import urllib, urllib2, msaf
  File "/usr/local/lib/python2.7/dist-packages/msaf/__init__.py", line 20, in <module>
    from . import features
  File "/usr/local/lib/python2.7/dist-packages/msaf/features.py", line 19, in <module>
    import librosa
  File "/usr/local/lib/python2.7/dist-packages/librosa/__init__.py", line 18, in <module>
    from . import display
  File "/usr/local/lib/python2.7/dist-packages/librosa/display.py", line 30, in <module>
    _matplotlibrc = copy.deepcopy(mpl.rcParams)
  File "/usr/lib/python2.7/copy.py", line 190, in deepcopy
    y = _reconstruct(x, rv, 1, memo)
  File "/usr/lib/python2.7/copy.py", line 358, in _reconstruct
    y[key] = value
  File "/usr/lib/pymodules/python2.7/matplotlib/__init__.py", line 808, in __setitem__
    cval = self.validate[key](val)
  File "/usr/lib/pymodules/python2.7/matplotlib/rcsetup.py", line 95, in validate_bool_maybe_none
    raise ValueError('Could not convert "%s" to boolean' % b)
ValueError: Could not convert "None" to boolean

All that I'm doing in my python file is importing msaf. Can you please help me with this?

Thanks!

Add Perceptual Precision-Recall Weighting

Add the metric of the perceptual weighting of Precision-Recall for boundary detection, based on the paper:

Nieto, O., Farbood, M., Jehan, T., Bello, J.P., Perceptual Analysis of the F-Measure to Evaluate Section Boundaries in Music. Proc. of the 15th International Society for Music Information Retrieval Conference (ISMIR). Taipei, Taiwan, 2014. PDF

Labeling each hierarchical level

We should be able to apply a given label algorithm to all the levels of the detected boundaries of a given hierarchical algorithm.

Why can some of my music get results while some cannot?

When I use msaf to segment music, some tracks work well but others do not. I use the same format for all of them (e.g. wav), and the tracks are of similar length. But when I process the tracks that don't work, msaf gets into an infinite loop. What could the problem be?

Benchmark of algorithms

Hi,
do you have any benchmark results for the algorithms? Accuracy (or any other measure), complexity, or whatever.
If not, could you recommend one? I'd like to get segmentations for 10k songs, and I can accept some degradation in performance for significantly lower complexity.

I don't get the right result.

I tried both single file mode and collection mode,
but it doesn't work well.
Please let me know how to solve it.

root@53b38a:/data/segmenter/msaf/examples# ./run_msaf.py audio/09_-_Girl.wav -bid cnmf -lid fmc2d -f hpcp
/usr/local/lib/python2.7/dist-packages/matplotlib/__init__.py:872: UserWarning: axes.color_cycle is deprecated and replaced with axes.prop_cycle; please use the latter.
  warnings.warn(self.msg_depr % (key, alt_key))
Traceback (most recent call last):
  File "./run_msaf.py", line 135, in <module>
    main()
  File "./run_msaf.py", line 116, in main
    res = func.process(args.in_path, **params)
  File "/usr/local/lib/python2.7/dist-packages/msaf-0.0.3-py2.7.egg/msaf/run.py", line 369, in process
    boundaries_id, labels_id, **config)
  File "/usr/local/lib/python2.7/dist-packages/msaf-0.0.3-py2.7.egg/msaf/input_output.py", line 420, in save_estimations
    jam.save(file_struct.est_file)
  File "/usr/local/lib/python2.7/dist-packages/jams/core.py", line 1186, in save
    self.validate(strict=strict)
  File "/usr/local/lib/python2.7/dist-packages/jams/core.py", line 1219, in validate
    valid &= ann.validate(strict=strict)
  File "/usr/local/lib/python2.7/dist-packages/jams/core.py", line 865, in validate
    raise SchemaError(str(invalid))
jams.exceptions.SchemaError: 6.000000000000001e-09 is not of type u'string'

Failed validating u'type' in schema[u'properties'][u'value']:
{u'type': u'string'}

On instance[u'value']:
6.000000000000001e-09
root@53538a:/data/segmenter/msaf/examples# ./run_msaf.py . -f mfcc -bid foote
/usr/local/lib/python2.7/dist-packages/matplotlib/__init__.py:872: UserWarning: axes.color_cycle is deprecated and replaced with axes.prop_cycle; please use the latter.
  warnings.warn(self.msg_depr % (key, alt_key))
2015-12-09 17:21:09,462: INFO: Segmenting ./audio/09_-_Girl.wav
2015-12-09 17:21:09,662: INFO: Writing results in: ./estimations/09_-_Girl.jams
Traceback (most recent call last):
  File "./run_msaf.py", line 135, in <module>
    main()
  File "./run_msaf.py", line 116, in main
    res = func.process(args.in_path, **params)
  File "/usr/local/lib/python2.7/dist-packages/msaf-0.0.3-py2.7.egg/msaf/run.py", line 379, in process
    annotator_id=annotator_id) for file_struct in file_structs[:])
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 804, in __call__
    while self.dispatch_one_batch(iterator):
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 662, in dispatch_one_batch
    self._dispatch(tasks)
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 570, in _dispatch
    job = ImmediateComputeBatch(batch)
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 183, in __init__
    self.results = batch()
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 72, in __call__
    return [func(*args, **kwargs) for func, args, kwargs in self.items]
  File "/usr/local/lib/python2.7/dist-packages/msaf-0.0.3-py2.7.egg/msaf/run.py", line 265, in process_track
    boundaries_id, labels_id, **config)
  File "/usr/local/lib/python2.7/dist-packages/msaf-0.0.3-py2.7.egg/msaf/input_output.py", line 420, in save_estimations
    jam.save(file_struct.est_file)
  File "/usr/local/lib/python2.7/dist-packages/jams/core.py", line 1186, in save
    self.validate(strict=strict)
  File "/usr/local/lib/python2.7/dist-packages/jams/core.py", line 1219, in validate
    valid &= ann.validate(strict=strict)
  File "/usr/local/lib/python2.7/dist-packages/jams/core.py", line 865, in validate
    raise SchemaError(str(invalid))
jams.exceptions.SchemaError: 0.0 is not of type u'string'

Failed validating u'type' in schema[u'properties'][u'value']:
{u'type': u'string'}

On instance[u'value']:
0.0

msaf.process calling

When I call the msaf.process function, if I have already loaded the music using librosa.load, can I use that audio directly?

Also, how do I use the annot_beats and framesync parameters?

global name 'ds_name' is not defined

Hi, I first installed msaf
and caught this error:

NameError                                 Traceback (most recent call last)
<ipython-input-13-1b3d08e88056> in <module>()
      2 # Set plot = True to plot the results
      3 boundaries, labels = msaf.process(audio_file, boundaries_id="foote", 
----> 4                                   labels_id="fmc2d", plot=True)

/home/rustam/anaconda2/lib/python2.7/site-packages/msaf/run.pyc in process(in_path, annot_beats, feature, framesync, boundaries_id, labels_id, hier, sonify_bounds, plot, n_jobs, annotator_id, config, out_bounds, out_sr)
    345         if plot:
    346             plotting.plot_one_track(file_struct, est_times, est_labels,
--> 347                                     boundaries_id, labels_id, ds_name)
    348 
    349         # Save estimations

NameError: global name 'ds_name' is not defined

conda Python 2.7
Ubuntu 16
msaf installed with pip (msaf-0.1.0-py2-none-any.whl)
Running this example:
https://github.com/urinieto/msaf/blob/master/examples/compute_features.py
