Source code and supplementary information for IJCNN2021.
The supplementary information is available at: https://kinwaicheuk.github.io/IJCNN2021.github.io/
The source code is available under the `codes` folder.
The MAPS dataset can be downloaded via https://amubox.univ-amu.fr/index.php/s/iNG0xc5Td1Nv4rR.
Download and unzip the dataset into the `codes` folder.
After unzipping all the files, the `ENSTDkAm1` and `ENSTDkAm2` folders should be merged into a single folder named `ENSTDkAm`.
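For example, on Linux or macOS the two folders can be merged from the command line as follows (a minimal sketch; the relative paths are assumptions and depend on where you unzipped the dataset):

```bash
# Hypothetical paths - adjust to your local layout
mkdir -p ENSTDkAm
cp -r ENSTDkAm1/* ENSTDkAm2/* ENSTDkAm/
```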
Run the following scripts to train different models:

```bash
python train_original.py with <args>
python train_fast_local_attent.py with <args>
python train_simple.py with <args>
```
The following arguments are available (see the example invocation after the list):

- `device`: Which device to use. Can be `cpu`, `cuda:0`, or any other device available on your machine. Default: `cuda:0`.
- `LSTM`: Train the model with or without the LSTM layer. Either `True` or `False`. Default: `True`.
- `onset_stack`: Train the model with or without the onset stack. Either `True` or `False`. Default: `True`.
- `batch_size`: The batch size. Default: `16`.
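For example, a training run might look like the following, assuming the scripts accept sacred-style `with key=value` overrides (the value syntax is an assumption; the scripts above only show `with <args>`):

```bash
python train_original.py with device=cuda:0 LSTM=True onset_stack=False batch_size=8
```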
The following arguments are for `train_fast_local_attent.py` only (an example follows the list):

- `w_size`: The attention window size. Default: `30`.
- `attention_mode`: Which feature to attend to. Either `onset`, `activation`, or `spec`. Default: `onset`.
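A hypothetical run of the local-attention model, with parameter values chosen purely for illustration:

```bash
python train_fast_local_attent.py with w_size=50 attention_mode=spec
```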
The PyTorch dataset class `MAPS()` inside each script will process and prepare the dataset if you are running it for the first time.
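As a rough illustration, and assuming `MAPS` follows the standard `torch.utils.data.Dataset` interface, a first run could look like the sketch below. The import path and constructor call are assumptions, not the actual API; check the training scripts for the real signature.

```python
from torch.utils.data import DataLoader

# Hypothetical import: in practice the class is defined inside each training script.
from train_original import MAPS

dataset = MAPS()  # the first instantiation processes and caches the dataset
loader = DataLoader(dataset, batch_size=16, shuffle=True)

for batch in loader:
    # each batch can now be fed to the model's training step
    break
```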
The pretrained model weights can be downloaded here.
The bash files contain all the commands needed to obtain the results reported in the paper. Run the following bash scripts to get all the results:

```bash
bash get_table1.sh
bash get_figure1.sh
bash get_table2.sh
```

The accuracy reports and the MIDI files will be saved in the `results` folder.