
grt's People

Contributors

adamrankin, cyberluke, damellis, fruchtzwerg94, narner, nebgnahz, ngillian-google, nickgillian, pixelflipping, rogerzanoni, romainguillemot, royshil, sakulstra, sgrignard, terziman, xiangzhai


grt's Issues

Tiny issue in DTW.cpp

[Problem]:
If the test sample is identical to a training sample, DTW returns distance 0 and likelihood NaN, which is wrong. For example, if I take some of the training data as test data (so the test sample == training sample), this problem occurs.

[Reason]:
At line 410 of DTW.cpp, in the function bool DTW::predict_(MatrixDouble &inputTimeSeries):

for(UINT k=0; k<numTemplates; k++){
    //Perform DTW
    classDistances[k] = computeDistance(templatesBuffer[k]....);
    classLikelihoods[k] = 1.0 / classDistances[k];
    ....
}

Here classDistances[k] can be 0, and it is then used as the divisor on the next line.

[My suggestion]:
Change:
classLikelihoods[k] = 1.0 / classDistances[k];
to:
classLikelihoods[k] = (classDistances[k] == 0 ? INT_MAX : 1.0 / classDistances[k]);

Failed to resize matrix, rows and cols == zero!

Hello, sorry to bother you. I'm getting that error when I try to train a continuous HMM model. Any idea what might be causing it? I can train the model with a smaller data set, but with the original one it throws that error. The data set has 36 dimensions, 842 training examples and 8 classes, and is about 15 MB. A DTW model has been trained with no errors on this data set.

Model Accuracy changing values

Hello again, sorry to throw questions here; I don't know where else to ask them since the forum is down.

I'm training a prediction model using a random forest, and I just noticed that with the same dataset and the same parameters it gives me different accuracy values, in a range that I cannot ignore (1-7%). Is this normal behavior? For example, I ran the program with a specific configuration and it gave me 91.4896%; then I ran it again without changing anything and it showed 95.8347%; then I did the same and got 93.7367%.

saveModelToFile(std::fstream &file) is deprecated

Hi, I use the master branch. Visual Studio 2015 created grt.lib and the examples with no problems. Now I want to try your basic example and the linker has problems. There are a lot of places where saveModelToFile is used instead of save. I'm following your discussion, so I know that you changed it. Do you want me to replace it and send a pull request?

Severity Code Description Project File Line Suppression State
Error C4996 'GRT::MLBase::saveModelToFile': saveModelToFile(std::fstream &file) is deprecated, use save(std::fstream &file) instead NanoSuitSrv c:\bitbucket\grt\grt\coremodules\FeatureExtraction.h 107

Only the last stage of `featureDataReady` is effective

Only the last stage of featureDataReady is considered when training a pipeline on TimeSeriesClassificationData. In the loop:

for(UINT moduleIndex=0; moduleIndex<featureExtractionModules.size(); moduleIndex++)

the pipeline iterates through all feature extraction modules and featureDataReady is overwritten every time. This would be a problem in scenarios like the following:

    GRT::FFT fft(512, 128, 1, GRT::FFT::HAMMING_WINDOW, true, false);
    GRT::FFTFeatures fft_feature(256);

    pipeline.addFeatureExtractionModule(fft);
    pipeline.addFeatureExtractionModule(fft_feature);
    pipeline.setClassifier(SVM(SVM::LINEAR_KERNEL, SVM::C_SVC, true, true));

Although the FFT result isn't ready during the first 512 samples, FFTFeatures is computed anyway and the flag is then set to true.

Allow mixing of pre-processing and feature extraction modules?

There are times when it might be useful to mix these, e.g. taking the average over time of the result of an FFTFeatures module. Is there a reason that pre-processing and feature extraction need to be distinct? Aren't they both just functions from a vector to another vector?

getting started with accelerometer-based recognition

Hi Nick,

I've just gotten GRT set up and started working with it. I'm using wireless accelerometers, worn on the wrist for example, and would like to do some gesture recognition. At present, these are 3-axis accelerometers, but I'll soon be getting some 9-axis ones and I have the sensor fusion code ready so I'll be able to extract attitude as well.

I've successfully modified the DTW example and gotten some things to work, training and classifying simple movements with the accelerometer values! Very exciting. For my application, I would ideally like to recognise the same gesture performed at a wide range of speeds, which makes using non-normalized accelerometer values problematic. I tried using an HMM in the pipeline but got pretty much all false positives; however, I was only trying to train and recognise a single gesture, so more than likely I was doing something wrong.

I'd just like to get your advice on some possible next steps given the kind of measures I'm working with, accelerometry and (soon) attitude. Would it be worth trying the MovementTrajectoryFeature extraction with another classifier like SVM or even again with DTW? Do you think using attitude instead of accelerometry would yield better results? Just looking for some guidance if you have the time.

Much appreciated.

--Alex

Continuous HMM crashes when the downsample factor is larger than the input matrix

The predict_(MatrixDouble &obs) method of the continuous HMM classifier crashes unless the downsample factor is smaller than the number of rows of the given matrix.
This is not really a bug, but I expected the matrix predict function to behave the same way as the vector predict function (which uses a circular buffer to hold real-time data). Given a batch of real-time data, one may want to process it as a whole and just get the last predicted class; this could be done by setting the circular buffer directly, but since it is a protected attribute, that is not possible.

Polymorphism with classifiers

Hello Nick,

Since the forum is down, I thought it best to message you here.

I was planning on using several different classifiers at the same time. For this purpose I defined an array of Classifiers and assigned different subtypes to each index. However, when trying to load the stored models I realised that the function called resides in MLBase and not in each of the appropriate classifiers. Upon inspection I noticed that this function is not virtual and just returns false. I guess it wasn't intended to be used that way. Would you mind letting me know how I could achieve polymorphism? Could it be that I need an array of pipelines? I just want to use the library the way it was intended to be used; I don't want to go against its design principles.

Thanks a lot.
M

Way to get distance from prediction to training classes in units of the null rejection threshold?

I may just be confused, but it seems like the null rejection coefficients supplied to a classifier (w/ setNullRejectionCoeff()) are in different units / scales than the class distances (reported by getClassDistances()). Is there a way to get class distances in numbers that are comparable to the null rejection coefficient supplied? Specifically I've been looking at the ANBC classifier, but maybe it's similar for others?

(It looks like there's a way to directly specify the null rejection thresholds, rather than the null rejection coefficient, but that's not really suitable for my needs because picking the right numbers requires too much knowledge of how the thresholds are calculated from the coefficient and about the actual training data.)

Allow relabeling of training samples in GUI.

In the GUI, it's easy to accidentally forget to change the training class label before recording a new sample. It would be nice if there was a way to edit the label for a recorded sample. One thought would be to enable editing in the "Class Label" text box in the Timeseries Plot sub-tab.

arm architecture

Hi there

Thank you for this library: it is proving very useful. I was wondering: based on the information you released, the library has been tested and built for Linux, OS X and Windows; is there any plan to make it work on ARM as well?

Thanks a lot

Alternative classifiers using DTW as the distance function?

It looks like the DTW classifier picks a representative or exemplar training sample from each class, then does prediction based on the distance to each class's exemplar sample. It might be nice to specify an alternative strategy for prediction, e.g. KNN using DTW as the distance function, or taking the average distance to all the samples in each class.

For instance, in doing gesture recognition, it might be nice to be able to supply training samples from multiple people (or multiple styles from one person), and have the additional data do more than just (potentially) change the choice of exemplar for each class.

Failed to find timeseries header!

I am having trouble loading a data set into LabelledTimeSeriesClassificationData; a "Failed to find timeseries header" error is shown. I have already checked the format of the file and it's fine. Any ideas why I'm getting this?

Proposal: Buildsystem change

Would you accept a patch changing your buildsystem to CMake?

I'm using your library in Linux and Windows environments. It works fine on Linux (and the same makefile should work on macOS, I suppose), but I'm having some problems with nmake on Windows (I eventually created a solution and built grt with it instead of using the makefile). CMake would ease maintenance across platforms, since it can generate Makefiles, Xcode projects and Visual Studio solutions (I haven't tested it yet, but I will make sure it all works before sending the patch).

Pros:
Less maintenance
Less platform specific code in the buildsystem
Cleaner build system
???

Cons:
A new dependency (cmake)
???
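A minimal sketch of what such a port could look like (hypothetical target name and source layout, not an actual patch):

```cmake
# Hypothetical minimal CMakeLists.txt: one file from which CMake can generate
# Makefiles, an Xcode project, or a Visual Studio solution.
cmake_minimum_required(VERSION 3.5)
project(grt CXX)

# Assumed source layout; the glob would be adjusted to the real tree.
file(GLOB_RECURSE GRT_SOURCES GRT/*.cpp)
add_library(grt ${GRT_SOURCES})
target_include_directories(grt PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
```

The same file then drives every platform, e.g. `cmake -G "Visual Studio 10 2010" ..` on Windows or a plain `cmake ..` for Makefiles on Linux.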

Issue with the GUI not saving training data

Hi Nick,

Whenever I run the Processing Classification Example / GRT GUI following the steps in the tutorial, an error, "WARNING: Failed to save training data saved to file", appears in the Info Text box in the GUI.

Additionally, when I click on "Save" in the Data Manager tab, the dialog allows me to click on the "save" button, but when I click on "Load", no file is shown.

I'm running OS X 10.11.1. Thank you!

Current GUI version not working with QT 5.4.2

Hey Nick, I just wanted to let you know that building the GUI project fails with the newest Qt 5.4.2 version. I'm running Ubuntu 15.10.

I managed to make the GUI compile under Qt 5.4.2 by removing all instances of "QApplication::UnicodeUTF8" in the ui_* header files, changing the <QtGui/...> includes to <QtWidgets/...>, and adding QT += widgets to the .pro file.

There might be some cleaner backwards-compatible fix for Qt, but I for one did not find it.
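The changes described above can be sketched as a .pro fragment (standard Qt 4 to Qt 5 migration; not a tested patch against the GRT GUI project):

```qmake
# Qt 5 moved the widget classes out of QtGui into the new QtWidgets module,
# so the module must be requested explicitly...
QT += core gui widgets

# ...and includes updated in the sources, e.g.:
#   #include <QtGui/QMainWindow>   becomes   #include <QtWidgets/QMainWindow>
```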

Tiny issue in ClassificationData::spiltDataIntoKFolds()

[Problem]
At the start of the function:

//K can not be zero
if( K > totalNumSamples ){
    errorLog << "spiltDataIntoKFolds(const UINT K,const bool useStratifiedSampling) - K can not be zero!" << std::endl;
    return false;
}

The intent is to compare K with zero, but the comparison uses totalNumSamples.

[Suggestion]
Change totalNumSamples to 0 to fix the problem.

HighPassFilter

Hi,

I asked my high pass filter for its classType and got an empty string.

Lines 34 and 35 of HighPassFilter.cpp have an issue.

preProcessingType = classType;
preProcessingType = "HighPassFilter";

This is supposed to be

classType = "HighPassFilter";
preProcessingType = classType;

Chikashi.

how to use GRT in android project?

Hi, Nick,
I have seen the jni folder in your GRT project. Does that mean it can be used from Java? How can I compile the source? Can you provide a detailed introduction?

ubuntu 14.04

Hey,
is there currently a way to build the GUI for Ubuntu 14.04?
Regards,
Lukas
Edit: sorry, I didn't see there is a forum.

GUI Build under archlinux

I built and installed the grt library; the test program works (the one that prints the grt version). Now I am trying to build the GUI with Qt Creator and I keep getting this error:

Project ERROR: grt development package not found

The Qt version is 5.6.0; I tried with version 5.5.1 and got the same error.

CSV File format

Hello, I am trying to load a data set that is in a CSV file, but a "No header found" error keeps popping up. Where can I get an example of the format a CSV file must have to load successfully into a classification data object?

Cannot load file in Regression mode

This can be reproduced every time:

[ERROR LRD] loadDatasetFromCSVFile(...) - The number of columns in the CSV file (1) does not match the number of input dimensions plus the number of target dimensions (10)

TEST CASE (actual file):

GRT_LABELLED_REGRESSION_DATA_FILE_V1.0
DatasetName: NOT_SET
InfoText:
NumInputDimensions: 8
NumTargetDimensions: 2
TotalNumTrainingExamples: 412
UseExternalRanges: 0
RegressionData:
-0.07 0.84 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.84 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.84 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.84 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.84 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.84 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.04 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.53 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.06 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.07 0.83 0.05 -0.54 0.99 1.25 1.2 0.12 0 0
-0.04 -0.55 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.55 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.55 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.04 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.83 0.99 1.25 1.2 0.12 1 0
-0.03 -0.54 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0.01 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
-0.03 -0.53 0 -0.84 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.46 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.42 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.57 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.56 0.99 1.25 1.2 0.12 1 0
0.51 0.47 0.43 -0.56 0.99 1.25 1.2 0.12 1 0
0.66 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.22 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.22 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.66 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.21 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.67 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.21 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.2 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.2 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.2 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.2 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.2 -0.68 0.99 1.25 1.2 0.12 1 1
0.67 -0.2 -0.2 -0.68 0.99 1.25 1.2 0.12 1 1

Linux build broken

Hello,
it seems the grt-gui Linux build has been broken since 54ecf04:

```
grt/gui/GRT/mainwindow.h:280: error: 'SwipeDetector' in namespace 'GRT' does not name a type
     GRT::SwipeDetector swipeDetector;
     ^
```
Regards,
Lukas

Issues building in Visual Studio 2010

Hi!

I'm trying to build GRT in Visual Studio 2010 and have encountered a few issues (I just copied the entire source code into my openFrameworks 0074 application).

1- Every occurrence of "std::isinf" or "std::isnan" throws an error. Deleting the "std::" namespace qualifier seems to resolve the issue.

2- In the declaration of "bool MLBase::MLBase::map(VectorDouble inputVector){ return map_( inputVector ); }" in MLBase.cpp, the "MLBase::" qualifier is repeated.

3- I'm getting an ambiguous call to an overloaded function on "const UINT numBatches = (UINT)ceil( numTrainingSamples/batchSize );" at line 171 of BernoulliRBM.cpp.

None of the overloads returns an unsigned integer, but they do return:

long double ceil(long double)
float ceil(float)
double ceil(double)

I don't know which one is intended here.
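One way to resolve the ambiguity is to cast the operands to double before dividing, which both selects the double overload of std::ceil and avoids truncating integer division before the ceil is applied. This is only a minimal sketch with a hypothetical helper function, not the actual BernoulliRBM code:

```cpp
#include <cmath>

// Hypothetical stand-in for the batch-count computation: casting the
// operands to double selects the double overload of std::ceil, and also
// prevents the integer division numTrainingSamples/batchSize from
// truncating before the ceil is taken.
unsigned int computeNumBatches(unsigned int numTrainingSamples, unsigned int batchSize) {
    return static_cast<unsigned int>(
        std::ceil(static_cast<double>(numTrainingSamples) / static_cast<double>(batchSize)));
}
```

Note that the original expression would also have been wrong even with a non-ambiguous ceil, since the integer division truncates first.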

Hello Nick

Hello Nick

Sorry to pollute your GitHub with that dying project (https://github.com/Ractiv): software for Ractiv's TouchPlus 3D stereo camera, powered by the Etron 3D depth-map camera controller, eSP870.

If our community of backers and buyers at http://gharbi.me/ractiv/ could provide you with a unit, I wonder if you could find a little time to help us try to make your gesture recognition toolkit work on that device?

Indeed your work is as impressive as this one https://github.com/OpenGP/htrack

I've asked Umar Nizamani for his help since he has hacked some of eSP870 codes last year, see : https://github.com/umarniz/TouchPlusLib

I've also asked Matthew H. Fogle for his knowledge on csharp, see : https://github.com/shadowmite/TouchPlusCMDR

I will try to contact also Anastasia Tkach at https://github.com/OpenGP/htrack to see if we could find some help with them too.

As Umar seems to be working now on Google's TensorFlow software (artificial intelligence, no?), see: https://medium.com/@Rapchik , I guess one day a mute person will be able to talk to a computer using sign language! (Maybe using Ractiv's device, who knows?)

And, as we're watching space and real time, why not try to detect the famous gravitational waves? I'm getting into ramblings, as usual for me, but I'm a dreamer and a joker!

Thanks in advance for your response, even if it's negative; I know time is precious!

Best Regards,

touchhope

a french buyer at http://gharbi.me/ractiv/

Post Scriptum : To end with ramblings again, I love these supposed words from Albert Einstein : "Do not ask me to repeat what I have learned. Ask me rather what I think!"

Problem compiling project on Xcode

Hi

This is probably a rather basic question but I am just starting out with C++ on Xcode and would rather appreciate your help.

I have followed your instructions at http://www.nickgillian.com/wiki/pmwiki.php/GRT/Install#Xcode
to try out your DTW example. A screenshot of the completed setup is shown here:
(screenshot: completed Xcode project setup, 2016-07-02)

However, when I tried to build the project, I received 16 errors, each stating there are undefined symbols for architecture x86_64.

(screenshot: the 16 linker errors, 2016-07-02)

Do you know what is the reason behind that? Any help will be very much appreciated. Thank you!

Gitter chat room

I think it'd be a nice idea to create a Gitter chat room to share information, collaborate, ask questions...

Support binning of FFT results?

For instance, you might want to take an FFT of, say, each 1024 samples but group the resulting frequency spectrum into, say, 32 bins (instead of 512).
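The grouping described above could look something like the following. This is only a minimal sketch with a hypothetical helper (not part of GRT's FFT module), assuming the spectrum size is an exact multiple of the bin count:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: reduce a magnitude spectrum (e.g. 512 values from a
// 1024-point FFT) down to numBins bins by averaging consecutive groups of
// frequency values. Assumes spectrum.size() is an exact multiple of numBins.
std::vector<double> binSpectrum(const std::vector<double>& spectrum, std::size_t numBins) {
    const std::size_t groupSize = spectrum.size() / numBins;
    std::vector<double> bins(numBins, 0.0);
    for (std::size_t i = 0; i < numBins; ++i) {
        for (std::size_t j = 0; j < groupSize; ++j)
            bins[i] += spectrum[i * groupSize + j];
        bins[i] /= static_cast<double>(groupSize);  // mean magnitude per bin
    }
    return bins;
}
```

Summing instead of averaging (or using the per-group maximum) would be an equally reasonable design choice, depending on what the downstream classifier expects.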

Automatic Gesture Spotting

Dear Nick,

I'm sorry for asking this question here, but I tried to reach the forum at http://www.nickgillian.com/forum/ and it's not working.

My question is about the following article
http://www.nickgillian.com/wiki/pmwiki.php/GRT/AutomaticGestureSpotting

Does the GRT toolkit support gesture spotting in the sense that it can detect the beginning and the end of a gesture automatically, or, as stated in the article, can it only differentiate between gesture and non-gesture patterns?

Thank you, and sorry for the irrelevance.

OSC in GUI bug

When using just classification mode, the Processing example works fine (keys 'r', '[', ']'). But when switching to timeseries mode, the OSC messages are received but the GUI is not updated and new samples are not saved, so it's impossible to train on a dataset in timeseries mode.

Windows build question

Hi, I want to build GRT for multiple platforms: Linux, Mac, and Windows. I guess I will have to do it OS by OS; it cannot be done from Linux alone, for example. I had been using just the GRT GUI and Processing. Now I had to install Windows 10 to get Visual Studio 2015, because it cannot be installed on Windows 7 and Windows 7 has no C++11 compiler.

I did everything from readme:

  • select the project in the Visual Studio 'Solution' window, right click and select 'Properties'
  • select the 'C/C++', 'General', 'Additional Include Directories' and add the path to the folder containing the GRT source code and main GRT header (GRT.h)
  • select the 'Linker','General', 'Additional Library Directories' and add the path to the folder containing the GRT static library (grt.lib, this should be in a folder called Debug in your temporary build directory)
  • select the 'Linker','Input', 'Additional Dependencies' and add the GRT static library (grt.lib)
  • click 'Apply', you should now be able to build your custom project and link against the GRT

But after launching, there was a problem with a missing grt.dll. The solution was to copy grt.dll from the build's Debug directory to the Visual Studio project's Debug folder. I don't know whether this is right, as I seem to be the first one pointing out this behavior. Of course, I have added grt.lib and its path in the Linker settings; it won't compile without it. But it is quite strange that I have to use both the static and the dynamic library.

```cpp
#include <GRT.h>
using namespace GRT;

int main (int argc, const char * argv[])
{
    //Print the GRT version
    cout << "GRT Version: " << GRTBase::getGRTVersion() << endl;

    return EXIT_SUCCESS;
}
```

...also, in the code above, cout won't work: "using namespace std;" is missing. And even after adding it, the Debug console will print only cerr, not cout. This is what I found on stackoverflow.com:
"
The question is very clear. How use std::cout to debug a non-console application in Visual Studio.

The answer is very clear: you cannot. That is, Visual Studio does not support std::cout as debug tool for non-console applications.

This is a serious limitation of Visual Studio, probably a failure to meet the C++ standard even. I find it very sad to see disinformative "answers" here trying to hide this defect of their precious Visual Studio.
"

So I cannot see anything from GRT, hahaha :D ... Do you think NetBeans with Cygwin and g++ would work on Windows? Because Microsoft is a big pile of shit.

FFTFeatures returning the bottom N frequency values instead of the top values.

Hi Nick!

I am trying to use your library to detect some temporal gestures. I decided that a good way to extract features from a temporal sign is to use the information from the FFT and the FFTFeatures. After trying to use it, I realized that there might be something wrong with the FFTFeatures.cpp class. As stated in the documentation, the last feature that FFTFeatures returns should be the top N frequency values. Instead, the function is returning the bottom N frequency values. After digging into the code, I realized that the error is in the file grt/GRT/FeatureExtractionModules/FFT/FFTFeatures.cpp, at line 28, where the following method is defined:

bool sortIndexDoubleDecendingValue(IndexedDouble i,IndexedDouble j) { return (i.value<j.value); }

And it should be like this:

bool sortIndexDoubleDecendingValue(IndexedDouble i,IndexedDouble j) { return (i.value>j.value); }
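To see why the comparison direction matters: with '>' the comparator produces a genuinely descending order, so the first N elements after sorting are the top N values. A minimal sketch, using a stand-in struct rather than GRT's actual IndexedDouble type:

```cpp
#include <algorithm>
#include <vector>

// Minimal stand-in for GRT's IndexedDouble, just to illustrate the comparator.
struct IndexedDouble { int index; double value; };

// With '>' the sort is descending, so the first N elements are the
// top N frequency values, as the documentation describes.
bool sortIndexDoubleDescendingValue(IndexedDouble i, IndexedDouble j) {
    return i.value > j.value;
}

std::vector<IndexedDouble> sortDescending(std::vector<IndexedDouble> v) {
    std::sort(v.begin(), v.end(), sortIndexDoubleDescendingValue);
    return v;
}
```

With the original '<' comparator, the same sort would leave the smallest values at the front, which is exactly the bottom-N behavior reported above.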

Thanks!

Michel

There's some problem with the forum (sorry for the offtopic)

Hello Nick!
First, thank you for the GRT; it is an awesome tool!
I have almost no experience in ML, and now I have some questions and need some info about how to get started with your toolkit. I think the forum could answer many of my questions, but it is not working now. Are there any plans to restore it? Or maybe you have a dump of the forum content? It would be very helpful for newbies like me.
Thank you again!

GRT.h missing inclusion?

Hello.

I tried to test the SOMQuantizer today, but its header file (SOMQuantizer.h) is not included in "GRT.h", even though RBMQuantizer.h and the other Feature Extraction Module headers are included properly. I am wondering whether this is on purpose.

thanks.
Chikashi Miyama

Continuous HMM

Hi there

I am trying to use the continuous HMM, but I couldn't understand the format of the Time Series Classification Data. I have used HMMs before with another tool, and the data format was similar to the Classification Data.

Thanks a lot

bug in ClassLabelTimeoutFilter

In the ClassLabelTimeoutFilter::filter function, if predictedClassLabel equals iter->getClassLabel() and timerReached() is false, filteredClassLabel will be set to 0, but matchFound does not change. After this, the result will be incorrect and the post-processing does not work well.
So adding "matchFound = true;" after "filteredClassLabel = 0;" may fix the bug.
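The intended behavior of the proposed fix can be sketched as follows. This is a simplified stand-alone model of the filter logic (hypothetical types and function, not GRT's actual implementation):

```cpp
// Simplified model of one timer entry in the filter's list.
struct TimerEntry { unsigned int classLabel; bool timerReached; };

// When the incoming label matches an active timer that has not yet timed
// out, the filter should output 0 (suppress the label) AND record that a
// match was found, so the label is not also treated as a brand-new event.
unsigned int filterLabel(unsigned int predictedClassLabel, const TimerEntry& entry, bool& matchFound) {
    matchFound = false;
    unsigned int filteredClassLabel = predictedClassLabel;
    if (predictedClassLabel == entry.classLabel) {
        matchFound = true;                 // the proposed fix: mark the match in both branches
        if (!entry.timerReached) {
            filteredClassLabel = 0;        // still inside the timeout window: suppress
        }
    }
    return filteredClassLabel;
}
```

Without setting matchFound in the suppressed branch, the caller would treat the repeated label as unmatched and restart the timeout, which matches the incorrect behavior described above.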
