
shark's Introduction

Shark is a fast, modular, general-purpose, open-source C++ machine
learning library.

Shark is licensed under the GNU Lesser General Public License. Please
see the files COPYING and COPYING.LESSER, or visit
http://www.gnu.org/licenses.

Any application of the SHARK code toward military research and use is
expressly against the wishes of the SHARK development team.


INSTALLATION / DOCUMENTATION
----------------------------

The entry point to the Shark library documentation is located at
doc/index.html. For installation instructions, please click on
"Getting started" on that page. Short version of the installation
guide: issue "ccmake ." in the main directory to select your build
options, then issue "make" in the main directory -- you should be
done (assuming Boost and CMake are installed). See the documentation
for detailed instructions.

BUILDING THE DOCUMENTATION: To build the documentation yourself (e.g.,
because you do not have internet access and need to read it locally
during installation), see doc/README.txt


FILE STRUCTURE
--------------


README.txt          This file (residing in the root directory of
                    the Shark library).

CMakeLists.txt      Definitions for the CMake build system.

include/            This directory and its sub-directories hold
                    all include files of the library. Note that
                    some functionality is implemented in lower-
                    level Impl/ folders and inline .inl files.

lib/                The Shark library is placed in this directory.
                    In the source code distribution this directory
                    is initially empty; the library is placed here
                    as the result of compilation. Binary
                    distributions already contain the library,
                    pre-built in release mode.

doc/                All documentation files are found in this
                    sub-directory. In packaged versions of Shark
                    the html documentation is pre-built; the
                    repository provides the corresponding sources.
                    The documentation contains technical reference
                    documents for all classes and functions as well
                    as a collection of introductory and advanced
                    tutorials.

doc/index.html      Entry point to the Shark documentation.

examples/           The examples directory contains example
                    use-cases of the most important algorithms
                    implemented in Shark. Besides exemplifying
                    powerful learning algorithms, these programs
                    are intended as starting points for
                    experimentation with the library. The
                    executables corresponding to the C++ example
                    programs are found in examples/bin/.

Test/               Shark comes with a large collection of unit
                    tests, all of which reside inside the Test
                    directory.

bin/                The binaries of the Shark unit tests are placed
                    here. Once the CMake build system is set up
                    (with the "ccmake" command or equivalent) the
                    whole test suite can be executed with the
                    command "make test", issued in the Shark root
                    directory.

src/                Source files of the Shark library. Note that
                    from Shark version 3 onwards large parts of the
                    library are templated and therefore header-only.

gpl-3.0.txt         GNU general public license, version 3.


Note:
Depending on the type of Shark distribution (binary or source
package, or current repository snapshot), not all of these files
and directories are present.



PACKAGE STRUCTURE
-----------------

The organization of the include/ directory reflects the structure of
the Shark library. It consists of the following modules:


GENERAL INFRASTRUCTURE:

LinAlg              Data structures and algorithms for typical
                    linear algebra computations. For (dense and
                    sparse) vector and matrix classes Shark relies
                    on Boost uBLAS. Many higher level algorithms
                    (such as singular value decomposition) are
                    still implemented by the library itself.
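The higher-level routines mentioned here can be illustrated without any of Shark's uBLAS-based types. The following standalone sketch (not Shark code; `dominantEigenvalue` is a name chosen for this example) runs power iteration to estimate the dominant eigenvalue of a small symmetric matrix:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Power iteration for the dominant eigenvalue of a small symmetric
// matrix A. Standalone illustration; Shark builds its higher-level
// algorithms on top of uBLAS vector/matrix types instead.
double dominantEigenvalue(const std::vector<std::vector<double>>& A,
                          int iterations) {
    std::size_t n = A.size();
    std::vector<double> v(n, 1.0);   // arbitrary nonzero start vector
    double lambda = 0.0;
    for (int it = 0; it < iterations; ++it) {
        std::vector<double> w(n, 0.0);
        for (std::size_t i = 0; i < n; ++i)       // w = A * v
            for (std::size_t j = 0; j < n; ++j)
                w[i] += A[i][j] * v[j];
        double norm = 0.0;
        for (double wi : w) norm += wi * wi;
        norm = std::sqrt(norm);
        for (std::size_t i = 0; i < n; ++i) v[i] = w[i] / norm;
        lambda = norm;               // ||A v|| for a unit vector v
    }
    return lambda;
}
```

For diag(2, 1) the iteration converges to the larger eigenvalue, 2.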

Statistics          This component is new in Shark 3. It wraps the
                    capabilities of Boost accumulators, and it
                    provides tools that appear regularly in machine
                    learning, such as the Mann-Whitney U-test (also
                    known as the Wilcoxon rank-sum test).
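As a self-contained illustration of the statistic named above (this is not Shark's API; the Statistics module wraps its own implementation, and `mannWhitneyU` is a name invented here), the Mann-Whitney U statistic can be computed from pooled midranks:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Mann-Whitney U statistic for sample a versus sample b.
// Ties are handled with midranks. Illustrative sketch only.
double mannWhitneyU(std::vector<double> a, std::vector<double> b) {
    struct Obs { double value; bool fromA; };
    std::vector<Obs> pooled;
    for (double v : a) pooled.push_back({v, true});
    for (double v : b) pooled.push_back({v, false});
    std::sort(pooled.begin(), pooled.end(),
              [](const Obs& x, const Obs& y) { return x.value < y.value; });

    double rankSumA = 0.0;
    for (std::size_t i = 0; i < pooled.size(); ) {
        std::size_t j = i;
        while (j < pooled.size() && pooled[j].value == pooled[i].value) ++j;
        double midrank = 0.5 * ((i + 1) + j);  // average rank of tie group
        for (std::size_t k = i; k < j; ++k)
            if (pooled[k].fromA) rankSumA += midrank;
        i = j;
    }
    // U = rank sum of sample a minus its minimum possible rank sum.
    return rankSumA - a.size() * (a.size() + 1) / 2.0;
}
```

For samples {1, 2, 3} versus {4, 5, 6} this yields U = 0; reversing the samples yields the maximum, U = 9.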


LEARNING INFRASTRUCTURE:

Core                The core module is the central place for all
                    top-level interfaces. In addition it holds a
                    few infrastructure classes, such as exceptions.

Data                The data module hosts data containers that have
                    been specifically designed for the needs of
                    machine learning code. Also, data can be
                    imported and exported from and to different
                    standard machine learning data file formats.


MACHINE LEARNING:

Models              Models are adaptive systems, the architectures
                    on top of which (machine) learning happens.
                    Shark features a rich set of models, from simple
                    linear maps to (feed-forward and recurrent)
                    neural networks, support vector machines, and
                    different types of trees. Models can also be
                    concatenated with data format converters and
                    other models.
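The concatenation idea can be sketched generically: the output of one adaptive map feeds the next. Shark provides its own concatenation mechanism, so the `concatenate` helper below is purely illustrative:

```cpp
#include <cassert>
#include <functional>

// Generic composition of two maps, mimicking the idea of chaining a
// converter or model in front of another model. Not Shark API.
template <typename A, typename B, typename C>
std::function<C(A)> concatenate(std::function<B(A)> first,
                                std::function<C(B)> second) {
    return [first, second](A x) { return second(first(x)); };
}
```

Chaining a scaling step into an affine step gives a single callable map, just as concatenated models behave like one model.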

ObjectiveFunctions  This module collects different types of cost,
                    fitness, or objective functions for learning.
                    These include data-dependent error
                    functions based on simple loss functions,
                    cross-validation, area under the ROC curve, and
                    different objectives used for model selection.
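As one concrete instance of the objectives listed above, the area under the ROC curve can be computed by pair counting: the fraction of (positive, negative) pairs in which the positive example scores higher, with ties counted as one half. This is an illustrative sketch, not Shark's implementation:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// AUC by pair counting. Assumes labels contain at least one 1
// (positive) and one 0 (negative). Illustrative only.
double areaUnderCurve(const std::vector<double>& scores,
                      const std::vector<int>& labels) {
    double wins = 0.0;
    std::size_t pairs = 0;
    for (std::size_t i = 0; i < scores.size(); ++i) {
        if (labels[i] != 1) continue;
        for (std::size_t j = 0; j < scores.size(); ++j) {
            if (labels[j] != 0) continue;
            ++pairs;
            if (scores[i] > scores[j]) wins += 1.0;
            else if (scores[i] == scores[j]) wins += 0.5;
        }
    }
    return wins / pairs;
}
```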

Algorithms          All actual learning algorithms reside in this
                    module. There are two main groups of learning
                    algorithms, namely iterative optimizers and
                    more specialized model trainers. General
                    optimizers are organized into direct search
                    and gradient-based optimization. Specialized
                    algorithms for linear programming (a part of
                    GLPK, the GNU linear programming kit) and
                    quadratic programming for training of non-linear
                    support vector machines are included. Shark
                    also ships with algorithms for efficient
                    nearest neighbor search.
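The nearest-neighbor contract can be shown with a brute-force linear scan; Shark's efficient search structures exist to replace exactly this loop with something faster. Standalone sketch, not Shark code:

```cpp
#include <cassert>
#include <cstddef>
#include <limits>
#include <vector>

using Point = std::vector<double>;

// Return the index of the stored point closest to query q
// (squared Euclidean distance; a linear scan for illustration).
std::size_t nearestNeighbor(const std::vector<Point>& data, const Point& q) {
    std::size_t best = 0;
    double bestDist = std::numeric_limits<double>::infinity();
    for (std::size_t i = 0; i < data.size(); ++i) {
        double d = 0.0;
        for (std::size_t k = 0; k < q.size(); ++k)
            d += (data[i][k] - q[k]) * (data[i][k] - q[k]);
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best;
}
```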

Unsupervised        This module contains the Shark implementation
                    of restricted Boltzmann machines (RBMs),
                    a recent experimental feature of Shark.



We use external libraries, images, and code:

* Shark uses the Doxygen documentation system (http://www.doxygen.org).
  Doxygen is available under the LGPL, as is Shark.
* Shark also uses the Sphinx documentation system (http://sphinx.pocoo.org/).
  Sphinx is available under the 2-clause (simplified/Free) BSD license
  (http://www.opensource.org/licenses/bsd-license.php).
* The Shark documentation links between Sphinx and Doxygen using Doxylink
  (http://pypi.python.org/pypi/sphinxcontrib-doxylink), written
  by Matt Williams and released under a 2-clause (simplified/Free) BSD
  license (http://www.opensource.org/licenses/bsd-license.php).
* The website header is derived from the Mollio (http://mollio.org/) set
  of html/css templates. Mollio is licensed under the GPLv2
  (http://www.gnu.org/licenses/gpl-2.0.html).
* The page icon in the local table of contents is one of Nicolas Gallagher's
  pure CSS GUI icons (http://nicolasgallagher.com/pure-css-gui-icons/).
  Nicolas Gallagher's work is licensed under the GNU GPLv2
  (http://www.gnu.org/licenses/gpl-2.0.html).
* The download icon is from the Steel System Icons set by Uriy1966
  (iconarchive.com):
  http://www.iconarchive.com/show/steel-system-icons-by-uriy1966/Download-icon.html
  It is free for non-commercial use.

shark's People

Contributors: asjafischer, bjornbugge, christian-igel, didacr, egomeh,
elehcim, ghisvail, haozeke, htrocks, jakobht, m-p-t, tglas, ulfgard,
wrigleyster


shark's Issues

error during make

Hi,

I cloned the shark repository on my Ubuntu 14, 64-bit machine and followed the installation instructions. I got an error at the beginning of the make process:

mani mani-thinkpad-t430s -desktop-shark-build_009

But I can see the clapack.h header file in /usr/include/atlas/clapack.h.

P.S. I have installed all the required packages before installing Shark.

Progress output for trainers

This has come up a few times. When a trainer runs for a long time, one might want some progress indication showing that the program is not stalling. How can we achieve this in a way that lets the user decide how much output they want, while also allowing for trainer-specific output? Maybe some kind of user-settable callback that takes a property tree with some info?
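A minimal sketch of the callback idea floated here, with every name hypothetical (`ToyTrainer`, `ProgressInfo`, and `onProgress` are invented for this example; a plain key/value map stands in for the suggested property tree):

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

// Hypothetical shape for trainer progress reporting: the trainer
// periodically hands a small key/value record to a user-set hook,
// which decides how much to report. Not Shark API.
using ProgressInfo = std::map<std::string, double>;
using ProgressCallback = std::function<void(const ProgressInfo&)>;

struct ToyTrainer {
    ProgressCallback onProgress;  // user-settable; may stay empty
    void train(int iterations) {
        for (int it = 1; it <= iterations; ++it) {
            double objective = 1.0 / it;  // stand-in for real work
            if (onProgress)               // report only if a hook is set
                onProgress({{"iteration", double(it)},
                            {"objective", objective}});
        }
    }
};
```

Trainer-specific output would then just be extra keys in the record, and a user who wants silence simply leaves the hook unset.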

Test failures with 3.1.0

Hi guys,

I have refreshed the Debian packaging with the latest 3.1.0 release. I am experiencing a few test failures, some of them due to tolerance errors. Reports for all architectures are available here. Could you have a look at it and help investigate these errors?

Thanks,
Ghis

Debian packaging

Dear Shark developers,

I am trying to revive the effort of submitting a package for Shark to the Debian archive. There has been an ITP filed here at the beginning of the year, but activity seems to have ceased.

I am aware that you guys already have a debianization upstream, but it looks insufficient to me as per current Debian standards. Should you be interested in a submission to Debian, please answer my questions below; otherwise, drop me a comment that you are not.

I have got a few questions:

  • What is the versioning logic of your software? Are you using semantic versioning? From the content of your list file, the SOVERSION is set to be the same as VERSION, which conveys the message that you intend to break the ABI even for incremental patch versions. If you don't care about ABI compatibility, it is best left at zero. For now, I am assuming this is the case.
  • Do you have a publication associated with the software? If so, could you please paste a citation string and a BibTeX reference for it, so I can include it in the package metadata.

I might have more questions as packaging progresses, so please don't close this issue until I am done.

Many thanks,

Testsuite failures with HDF5 enabled.

I have started rolling out some Debian package builds for 3.0.1, but the HDF5 tests fail with the following error:

        Start 124: Data_HDF5
124/167 Test #124: Data_HDF5 ....................................***Failed    0.07 sec
Running 4 test cases...
unknown location(0): fatal error in "BasicTests": std::exception: [loadIntoMatrix] open file name: ./Test/test_data/testfile_for_import.h5 (FAILED)
unknown location(0): fatal error in "CscTests": std::exception: [loadIntoMatrix] open file name: ./Test/test_data/testfile_for_import.h5 (FAILED)
unknown location(0): fatal error in "OneDimension": std::exception: [loadIntoMatrix] open file name: ./Test/test_data/testfile_for_import.h5 (FAILED)
expected: [importHDF5] Get data set(data/dummy) info from file(./Test/test_data/testfile_for_import.h5). got: [loadIntoMatrix] open file name: ./Test/test_data/testfile_for_import.h5 (FAILED)
/<<BUILDDIR>>/shark-3.0.1+ds1/Test/Data/HDF5Tests.cpp(264): error in "NegativeTests": incorrect exception shark::Exception is caught
expected: [loadIntoMatrix][./Test/test_data/testfile_for_import.h5][data/three_dimension] Support 1 or 2 dimensions, but this dataset has at least 3 dimensions. got: [loadIntoMatrix] open file name: ./Test/test_data/testfile_for_import.h5 (FAILED)
/<<BUILDDIR>>/shark-3.0.1+ds1/Test/Data/HDF5Tests.cpp(279): error in "NegativeTests": incorrect exception shark::Exception is caught
expected: [loadIntoMatrix] DataType doesn't match. HDF5 data type in dataset(./Test/test_data/testfile_for_import.h5::data/data1): 1, size: 8 got: [loadIntoMatrix] open file name: ./Test/test_data/testfile_for_import.h5 (FAILED)
/<<BUILDDIR>>/shark-3.0.1+ds1/Test/Data/HDF5Tests.cpp(294): error in "NegativeTests": incorrect exception shark::Exception is caught

*** 6 failures detected in test suite "CoreHDF5TestModule"

resulting in an FTBFS. I could provide a package build without HDF5 enabled, whilst this issue is investigated. The complete build log is available here.

Cheers,
Ghis

Linking error on Mac OS X El Capitan

I have installed the shark library on Ubuntu and linked it with libshark_debug.so and liblapack_atlas.so. liblapack_atlas.so was in libatlas-base-dev package.

Now I want to do the same on Mac OS X, but I cannot find liblapack_atlas.so. The linker stops with this error:

Undefined symbols for architecture x86_64:
"_cblas_dgemm", referenced from:
shark::blas::binding::gemm()

Which package should I install (using brew, for instance) to be able to link my C++ program with liblapack_atlas.so?

Feature importance in random forest

I'm working on a classification problem with a multi-class dataset. I'm trying to fit a random forest and tried to extract feature importances. However, when I ran it and extracted the importance variable, it contained nothing.

Could someone provide some directions on how to use the feature importance option correctly, with an example? Thanks a lot.

Rethink second order methods

Second-order methods are not very useful as the matrices are too large. We should instead think about approximation schemes which are inexpensive to compute in our framework but still include enough information about the local curvature.

This will require a change of the definition of what a second-order derivative in Shark is.

A possible candidate is a generalization of Levenberg-Marquardt, which can still be stored efficiently for large networks. However, this will force the approximation of the Hessian to be positive definite, which is not strictly required, e.g. for our Trust-Region-Newton, which can handle indefinite matrices just fine. Also, Levenberg-Marquardt only takes the curvature of the objective into account and not the curvature of the model -- is there maybe a better approximation?

In any case this will require an extension of our model and objective function interfaces.

A possible application is writing a trainer for linear models which makes use of second-order information, by combining trust-region CG with the Levenberg-Marquardt approximation.
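The Levenberg-Marquardt approximation discussed in this issue can be sketched on a scalar least-squares fit y ≈ theta * x, where the damped J^T J term makes the Hessian approximation positive definite by construction. This is an illustrative toy (the function name and setup are invented here), not Shark code:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Damped Gauss-Newton (Levenberg-Marquardt) steps for fitting the
// scalar model y = theta * x by least squares. The Hessian is
// approximated by J^T J + lambda, which is always positive definite.
double levenbergMarquardtFit(const std::vector<double>& x,
                             const std::vector<double>& y,
                             double lambda, int steps) {
    double theta = 0.0;
    for (int s = 0; s < steps; ++s) {
        double g = 0.0;        // gradient J^T r
        double h = lambda;     // damped curvature J^T J + lambda
        for (std::size_t i = 0; i < x.size(); ++i) {
            double r = theta * x[i] - y[i];  // residual
            g += x[i] * r;
            h += x[i] * x[i];
        }
        theta -= g / h;        // damped Gauss-Newton step
    }
    return theta;
}
```

On data generated by y = 2x the iteration converges to the least-squares solution theta = 2.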

Warnings with VS2013

I synced with the latest build from "master" and integrated OpenBLAS. There are plenty of warnings:

------ Rebuild All started: Project: ZERO_CHECK, Configuration: Release x64 ------
Checking Build System
CMake does not need to re-run because F:/Binaries/VS2013/Shark/CMakeFiles/generate.stamp is up-to-date.
CMake does not need to re-run because F:/Binaries/VS2013/Shark/include/CMakeFiles/generate.stamp is up-to-date.
CMake does not need to re-run because F:/Binaries/VS2013/Shark/src/CMakeFiles/generate.stamp is up-to-date.
CMake does not need to re-run because F:/Binaries/VS2013/Shark/examples/CMakeFiles/generate.stamp is up-to-date.
CMake does not need to re-run because F:/Binaries/VS2013/Shark/Test/CMakeFiles/generate.stamp is up-to-date.
------ Rebuild All started: Project: shark, Configuration: Release x64 ------
Building Custom Rule F:/GItSources/Shark/src/CMakeLists.txt
CMake does not need to re-run because F:\Binaries\VS2013\Shark\src\CMakeFiles\generate.stamp is up-to-date.
CARTTrainer.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
F:\GItSources\Shark\src\Algorithms\CARTTrainer.cpp(338): warning C4244: '=' : conversion from 'size_t' to 'double', possible loss of data
F:\GItSources\Shark\src\Algorithms\CARTTrainer.cpp(341): warning C4244: 'argument' : conversion from 'size_t' to 'double', possible loss of data
F:\GItSources\Shark\src\Algorithms\CARTTrainer.cpp(487): warning C4244: 'argument' : conversion from 'size_t' to 'double', possible loss of data
F:\GItSources\Shark\src\Algorithms\CARTTrainer.cpp(532): warning C4244: 'initializing' : conversion from 'size_t' to 'double', possible loss of data
CMA.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
F:\GItSources\Shark\src\Algorithms\DirectSearch\CMA.cpp(214): warning C4244: '=' : conversion from 'size_t' to 'double', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/syev.hpp(58) : see reference to function template instantiation 'void shark::blas::bindings::syev<C,shark::blas::vector>(shark::blas::matrix_expression &,shark::blas::vector_expressionshark::blas::vector &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/LinAlg/eigenvalues.h(90) : see reference to function template instantiation 'void shark::blas::kernels::syev<C,shark::blas::vector>(shark::blas::matrix_expression &,shark::blas::vector_expressionshark::blas::vector &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(130) : see reference to function template instantiation 'void shark::blas::eigensymm<C,C,shark::blas::vector>(const shark::blas::matrix_expression &,shark::blas::matrix_expression &,shark::blas::vector_expressionshark::blas::vector &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(96): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(102): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/potrf.hpp(90): warning C4244: 'argument' : conversion from '_int64' to 'const int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/potrf.hpp(67) : see reference to function template instantiation 'int shark::blas::bindings::potrf<Triangular,shark::blas::matrix<double,shark::blas::column_major>>(shark::blas::matrix_containershark::blas::matrix<double,shark::blas::column_major> &,boost::mpl::true
)' being compiled
with
[
Triangular=shark::blas::lower
]
F:\GItSources\Shark\include\shark/LinAlg/Cholesky.h(78) : see reference to function template instantiation 'size_t shark::blas::kernels::potrfshark::blas::lower,shark::blas::matrix<double,shark::blas::column_major>(shark::blas::matrix_containershark::blas::matrix<double,shark::blas::column_major> &)' being compiled
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(180) : see reference to function template instantiation 'void shark::blas::choleskyDecomposition<C,shark::blas::matrix<double,shark::blas::column_major>>(const shark::blas::matrix_expression &,shark::blas::matrix_expressionshark::blas::matrix<double,shark::blas::column_major> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/trmv.hpp(130): warning C4244: 'argument' : conversion from '_int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/trmv.hpp(64) : see reference to function template instantiation 'void shark::blas::bindings::trmv<false,false,C,shark::blas::vector>(const shark::blas::matrix_expression &,shark::blas::vector_expressionshark::blas::vector &,boost::mpl::true
)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(185) : see reference to function template instantiation 'void shark::blas::kernels::trmv<false,false,C,shark::blas::vector>(const shark::blas::matrix_expression &,shark::blas::vector_expressionshark::blas::vector &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(209) : see reference to function template instantiation 'void shark::blas::triangular_prodshark::blas::lower,C,shark::blas::vector(const shark::blas::matrix_expression &,shark::blas::vector_expressionshark::blas::vector &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(224) : see reference to function template instantiation 'void shark::MultiVariateNormalDistributionCholesky::generateshark::RealVector,shark::RealVector(Vector1 &,Vector2 &) const' being compiled
with
[
Vector1=shark::RealVector
, Vector2=shark::RealVector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(136): warning C4244: 'argument' : conversion from '_int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemv.hpp(71) : see reference to function template instantiation 'void shark::blas::bindings::gemv<C,shark::blas::vector,shark::blas::vector>(const shark::blas::matrix_expression &,const shark::blas::vector_expressionshark::blas::vector &,shark::blas::vector_expressionshark::blas::vector &,double,boost::mpl::true
)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(29) : see reference to function template instantiation 'void shark::blas::kernels::gemv<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expressionshark::blas::matrix<double,shark::blas::column_major> &,const shark::blas::vector_expression &,shark::blas::vector_expression &,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(70) : see reference to function template instantiation 'void shark::blas::detail::axpy_prod_impl<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expressionshark::blas::matrix<double,shark::blas::column_major> &,const shark::blas::vector_expression &,shark::blas::vector_expression &,bool,double,shark::blas::linear_structure)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(212) : see reference to function template instantiation 'void shark::blas::axpy_prod<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expressionshark::blas::matrix<double,shark::blas::column_major> &,const shark::blas::vector_expression &,shark::blas::vector_expression &,bool,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '_int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<shark::blas::matrix_row<const shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>>,shark::blas::vector_reference<const shark::blas::vector>>(const shark::blas::vector_expression<shark::blas::matrix_row<const shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>>> &,const shark::blas::vector_expression<shark::blas::vector_reference<const shark::blas::vector>> &,double &,boost::mpl::true
)' being compiled
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<shark::blas::matrix_row<const shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>>,shark::blas::vector_reference<const shark::blas::vector>,value_type>(const shark::blas::vector_expression<shark::blas::matrix_row<const shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>>> &,const shark::blas::vector_expression<shark::blas::vector_reference<const shark::blas::vector>> &,result_type &)' being compiled
with
[
result_type=value_type
]
f:\gitsources\shark\include\shark\linalg\blas\matrix_expression.hpp(628) : see reference to function template instantiation 'double shark::blas::inner_prod<shark::blas::matrix_row<const shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>>,shark::blas::vector_reference<const shark::blas::vector>>(const shark::blas::vector_expression<shark::blas::matrix_row<const shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>>> &,const shark::blas::vector_expression<shark::blas::vector_reference<const shark::blas::vector>> &)' being compiled
f:\gitsources\shark\include\shark\linalg\blas\matrix_expression.hpp(587) : see reference to function template instantiation 'double shark::blas::matrix_vector_binary_traits<C,shark::blas::vector>::matrix_vector_prod::apply<shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>,shark::blas::vector_reference<const shark::blas::vector>>(const shark::blas::matrix_expression<shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>> &,const shark::blas::vector_expression<shark::blas::vector_reference<const shark::blas::vector>> &,size_t)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\matrix_expression.hpp(587) : see reference to function template instantiation 'double shark::blas::matrix_vector_binary_traits<C,shark::blas::vector>::matrix_vector_prod::apply<shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>,shark::blas::vector_reference<const shark::blas::vector>>(const shark::blas::matrix_expression<shark::blas::matrix_reference<const shark::blas::matrix<double,shark::blas::row_major>>> &,const shark::blas::vector_expression<shark::blas::vector_reference<const shark::blas::vector>> &,size_t)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\matrix_expression.hpp(586) : while compiling class template member function 'double shark::blas::matrix_vector_binary1<E1,E2,shark::blas::matrix_vector_binary_traits<E1,E2>::matrix_vector_prod>::operator ()(unsigned __int64) const'
with
[
E1=shark::blas::matrix<double,shark::blas::row_major>
, E2=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels/vector_assign.hpp(34) : see reference to function template instantiation 'double shark::blas::matrix_vector_binary1<E1,E2,shark::blas::matrix_vector_binary_traits<E1,E2>::matrix_vector_prod>::operator ()(unsigned __int64) const' being compiled
with
[
E1=shark::blas::matrix<double,shark::blas::row_major>
, E2=shark::blas::vector
]
F:\GItSources\Shark\src\Algorithms\DirectSearch\CMA.cpp(269) : see reference to class template instantiation 'shark::blas::matrix_vector_binary1<E1,E2,shark::blas::matrix_vector_binary_traits<E1,E2>::matrix_vector_prod>' being compiled
with
[
E1=shark::blas::matrix<double,shark::blas::row_major>
, E2=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
CMSA.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex<float>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex<float>'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex<double>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex<double>'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/syev.hpp(58) : see reference to function template instantiation 'void shark::blas::bindings::syev<C,shark::blas::vector>(shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/LinAlg/eigenvalues.h(90) : see reference to function template instantiation 'void shark::blas::kernels::syev<C,shark::blas::vector>(shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(130) : see reference to function template instantiation 'void shark::blas::eigensymm<C,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(96): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(102): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/potrf.hpp(90): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/potrf.hpp(67) : see reference to function template instantiation 'int shark::blas::bindings::potrf<Triangular,shark::blas::matrix<double,shark::blas::column_major>>(shark::blas::matrix_container<shark::blas::matrix<double,shark::blas::column_major>> &,boost::mpl::true_)' being compiled
with
[
Triangular=shark::blas::lower
]
F:\GItSources\Shark\include\shark/LinAlg/Cholesky.h(78) : see reference to function template instantiation 'size_t shark::blas::kernels::potrf<shark::blas::lower,shark::blas::matrix<double,shark::blas::column_major>>(shark::blas::matrix_container<shark::blas::matrix<double,shark::blas::column_major>> &)' being compiled
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(180) : see reference to function template instantiation 'void shark::blas::choleskyDecomposition<C,shark::blas::matrix<double,shark::blas::column_major>>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/trmv.hpp(130): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/trmv.hpp(64) : see reference to function template instantiation 'void shark::blas::bindings::trmv<false,false,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(185) : see reference to function template instantiation 'void shark::blas::kernels::trmv<false,false,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(209) : see reference to function template instantiation 'void shark::blas::triangular_prod<shark::blas::lower,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(224) : see reference to function template instantiation 'void shark::MultiVariateNormalDistributionCholesky::generate<shark::RealVector,shark::RealVector>(Vector1 &,Vector2 &) const' being compiled
with
[
Vector1=shark::RealVector
, Vector2=shark::RealVector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(136): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemv.hpp(71) : see reference to function template instantiation 'void shark::blas::bindings::gemv<C,shark::blas::vector,shark::blas::vector>(const shark::blas::matrix_expression<C> &,const shark::blas::vector_expression<shark::blas::vector> &,shark::blas::vector_expression<shark::blas::vector> &,double,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(29) : see reference to function template instantiation 'void shark::blas::kernels::gemv<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(70) : see reference to function template instantiation 'void shark::blas::detail::axpy_prod_impl<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,bool,double,shark::blas::linear_structure)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(212) : see reference to function template instantiation 'void shark::blas::axpy_prod<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,bool,double)' being compiled
with
[
C=shark::blas::vector
]
ElitistCMA.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex<float>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex<float>'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex<double>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex<double>'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/syev.hpp(58) : see reference to function template instantiation 'void shark::blas::bindings::syev<C,shark::blas::vector>(shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/LinAlg/eigenvalues.h(90) : see reference to function template instantiation 'void shark::blas::kernels::syev<C,shark::blas::vector>(shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(130) : see reference to function template instantiation 'void shark::blas::eigensymm<C,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(96): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(102): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/potrf.hpp(90): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/potrf.hpp(67) : see reference to function template instantiation 'int shark::blas::bindings::potrf<Triangular,shark::blas::matrix<double,shark::blas::column_major>>(shark::blas::matrix_container<shark::blas::matrix<double,shark::blas::column_major>> &,boost::mpl::true_)' being compiled
with
[
Triangular=shark::blas::lower
]
F:\GItSources\Shark\include\shark/LinAlg/Cholesky.h(78) : see reference to function template instantiation 'size_t shark::blas::kernels::potrf<shark::blas::lower,shark::blas::matrix<double,shark::blas::column_major>>(shark::blas::matrix_container<shark::blas::matrix<double,shark::blas::column_major>> &)' being compiled
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(180) : see reference to function template instantiation 'void shark::blas::choleskyDecomposition<C,shark::blas::matrix<double,shark::blas::column_major>>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/trmv.hpp(130): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/trmv.hpp(64) : see reference to function template instantiation 'void shark::blas::bindings::trmv<false,false,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(185) : see reference to function template instantiation 'void shark::blas::kernels::trmv<false,false,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(209) : see reference to function template instantiation 'void shark::blas::triangular_prod<shark::blas::lower,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(224) : see reference to function template instantiation 'void shark::MultiVariateNormalDistributionCholesky::generate<shark::RealVector,shark::RealVector>(Vector1 &,Vector2 &) const' being compiled
with
[
Vector1=shark::RealVector
, Vector2=shark::RealVector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(136): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemv.hpp(71) : see reference to function template instantiation 'void shark::blas::bindings::gemv<C,shark::blas::vector,shark::blas::vector>(const shark::blas::matrix_expression<C> &,const shark::blas::vector_expression<shark::blas::vector> &,shark::blas::vector_expression<shark::blas::vector> &,double,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::column_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(29) : see reference to function template instantiation 'void shark::blas::kernels::gemv<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(70) : see reference to function template instantiation 'void shark::blas::detail::axpy_prod_impl<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,bool,double,shark::blas::linear_structure)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/Statistics/Distributions/MultiVariateNormalDistribution.h(212) : see reference to function template instantiation 'void shark::blas::axpy_prod<C,shark::blas::matrix<double,shark::blas::column_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::column_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,bool,double)' being compiled
with
[
C=shark::blas::vector
]
FisherLDA.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex<float>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex<float>'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex<double>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex<double>'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
F:\GItSources\Shark\src\Algorithms\FisherLDA.cpp(116): warning C4244: 'argument' : conversion from 'unsigned __int64' to 'double', possible loss of data
F:\GItSources\Shark\src\Algorithms\FisherLDA.cpp(131): warning C4244: 'argument' : conversion from 'size_t' to 'double', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/syev.hpp(58) : see reference to function template instantiation 'void shark::blas::bindings::syev<C,shark::blas::vector>(shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/LinAlg/eigenvalues.h(90) : see reference to function template instantiation 'void shark::blas::kernels::syev<C,shark::blas::vector>(shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\src\Algorithms\FisherLDA.cpp(59) : see reference to function template instantiation 'void shark::blas::eigensymm<C,C,shark::blas::vector>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &,shark::blas::vector_expression<shark::blas::vector> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(89): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(96): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/syev.hpp(102): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemm.hpp(161): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemm.hpp(72) : see reference to function template instantiation 'void shark::blas::bindings::gemm<C,shark::blas::matrix_transpose<C>,C>(const shark::blas::matrix_expression<C> &,const shark::blas::matrix_expression<shark::blas::matrix_transpose<C>> &,shark::blas::matrix_expression<C> &,double,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(135) : see reference to function template instantiation 'void shark::blas::kernels::gemm<C,C,shark::blas::matrix_transpose<C>>(const shark::blas::matrix_expression<C> &,const shark::blas::matrix_expression<shark::blas::matrix_transpose<C>> &,shark::blas::matrix_expression<C> &,double)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Models/LinearModel.h(180) : see reference to function template instantiation 'void shark::blas::axpy_prod<C,C,shark::blas::matrix_transpose<C>>(const shark::blas::matrix_expression<C> &,const shark::blas::matrix_expression<shark::blas::matrix_transpose<C>> &,shark::blas::matrix_expression<C> &,bool,double)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Models/LinearModel.h(177) : while compiling class template member function 'void shark::LinearModel<shark::RealVector>::eval(const shark::blas::matrix<double,shark::blas::row_major> &,shark::blas::matrix<double,shark::blas::row_major> &) const'
F:\GItSources\Shark\include\shark/Algorithms/Trainers/AbstractTrainer.h(68) : see reference to class template instantiation 'shark::LinearModel<shark::RealVector>' being compiled
F:\GItSources\Shark\include\shark/Algorithms/Trainers/FisherLDA.h(80) : see reference to class template instantiation 'shark::AbstractTrainer<shark::LinearModel<shark::RealVector>,unsigned int>' being compiled
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemm.hpp(161): warning C4244: 'argument' : conversion from 'shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>::difference_type' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemm.hpp(72) : see reference to function template instantiation 'void shark::blas::bindings::gemm<shark::blas::matrix_transpose<C>,C,shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>>(const shark::blas::matrix_expression<shark::blas::matrix_transpose<C>> &,const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>> &,double,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(135) : see reference to function template instantiation 'void shark::blas::kernels::gemm<shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>,shark::blas::matrix_transpose<C>,C>(const shark::blas::matrix_expression<shark::blas::matrix_transpose<C>> &,const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>> &,double)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Models/LinearModel.h(204) : see reference to function template instantiation 'void shark::blas::axpy_prod<shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>,shark::blas::matrix_transpose<C>,C>(const shark::blas::matrix_expression<shark::blas::matrix_transpose<C>> &,const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<shark::blas::dense_matrix_adaptor<double,shark::blas::row_major>> &,bool,double)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\include\shark/Models/LinearModel.h(193) : while compiling class template member function 'void shark::LinearModel<shark::RealVector>::weightedParameterDerivative(const shark::blas::matrix<double,shark::blas::row_major> &,const shark::RealMatrix &,const shark::State &,shark::RealVector &) const'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(136): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemv.hpp(71) : see reference to function template instantiation 'void shark::blas::bindings::gemv<C,shark::blas::vector,shark::blas::vector>(const shark::blas::matrix_expression<C> &,const shark::blas::vector_expression<shark::blas::vector> &,shark::blas::vector_expression<shark::blas::vector> &,double,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(29) : see reference to function template instantiation 'void shark::blas::kernels::gemv<C,shark::blas::matrix<double,shark::blas::row_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::row_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(70) : see reference to function template instantiation 'void shark::blas::detail::axpy_prod_impl<C,shark::blas::matrix<double,shark::blas::row_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::row_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,bool,double,shark::blas::linear_structure)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\src\Algorithms\FisherLDA.cpp(70) : see reference to function template instantiation 'void shark::blas::axpy_prod<C,shark::blas::matrix<double,shark::blas::row_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::row_major>> &,const shark::blas::vector_expression<C> &,shark::blas::vector_expression<C> &,bool,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/potrf.hpp(90): warning C4244: 'argument' : conversion from '__int64' to 'const int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/potrf.hpp(67) : see reference to function template instantiation 'int shark::blas::bindings::potrf<Triangular,shark::blas::matrix<double,shark::blas::row_major>>(shark::blas::matrix_container<shark::blas::matrix<double,shark::blas::row_major>> &,boost::mpl::true_)' being compiled
with
[
Triangular=shark::blas::lower
]
F:\GItSources\Shark\include\shark/LinAlg/Cholesky.h(78) : see reference to function template instantiation 'size_t shark::blas::kernels::potrf<shark::blas::lower,shark::blas::matrix<double,shark::blas::row_major>>(shark::blas::matrix_container<shark::blas::matrix<double,shark::blas::row_major>> &)' being compiled
f:\gitsources\shark\include\shark\linalg\Impl/solveSystem.inl(125) : see reference to function template instantiation 'void shark::blas::choleskyDecomposition<C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\Impl/solveSystem.inl(169) : see reference to function template instantiation 'void shark::blas::solveSymmPosDefSystemInPlace<System,C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &)' being compiled
with
[
System=shark::blas::SolveAXB
, C=shark::blas::matrix<double,shark::blas::row_major>
]
F:\GItSources\Shark\src\Algorithms\FisherLDA.cpp(141) : see reference to function template instantiation 'void shark::blas::solveSymmPosDefSystem<shark::blas::SolveAXB,C,C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &,const shark::blas::matrix_expression<C> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/trsm.hpp(113): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
F:\GItSources\Shark\include\shark/LinAlg/BLAS/kernels/trsm.hpp(64) : see reference to function template instantiation 'void shark::blas::bindings::trsm<false,false,C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &,boost::mpl::true_)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\Impl/solveTriangular.inl(107) : see reference to function template instantiation 'void shark::blas::kernels::trsm<false,false,C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\Impl/solveTriangular.inl(71) : see reference to function template instantiation 'void shark::blas::solveTriangularSystemInPlace<shark::blas::SolveAXB,shark::blas::lower,C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\Impl/solveTriangular.inl(129) : see reference to function template instantiation 'void shark::blas::detail::solveTriangularCholeskyInPlace<C,shark::blas::matrix<double,shark::blas::row_major>>(const shark::blas::matrix_expression<C> &,Arg &,shark::blas::SolveAXB)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
, Arg=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\Impl/solveSystem.inl(127) : see reference to function template instantiation 'void shark::blas::solveTriangularCholeskyInPlace<System,C,C>(const shark::blas::matrix_expression<C> &,shark::blas::matrix_expression<C> &)' being compiled
with
[
System=shark::blas::SolveAXB
, C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/trsm.hpp(114): warning C4267: 'initializing' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/trsm.hpp(121): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
AbstractLineSearchOptimizer.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex<float>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex<float>'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex<double>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex<double>'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '__int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<C,C>(const shark::blas::vector_expression<C> &,const shark::blas::vector_expression<C> &,double &,boost::mpl::true_)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<C,C,value_type>(const shark::blas::vector_expression<C> &,const shark::blas::vector_expression<C> &,result_type &)' being compiled
with
[
C=shark::blas::vector
, result_type=value_type
]
f:\gitsources\shark\include\shark\algorithms\gradientdescent\Impl/dlinmin.inl(194) : see reference to function template instantiation 'double shark::blas::inner_prod<C,C>(const shark::blas::vector_expression<C> &,const shark::blas::vector_expression<C> &)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/Algorithms/GradientDescent/LineSearch.h(103) : see reference to function template instantiation 'void shark::detail::dlinmin<shark::RealVector,shark::RealVector,const shark::LineSearch::ObjectiveFunction>(VectorT &,const VectorU &,double &,DifferentiableFunction &,double,double)' being compiled
with
[
VectorT=shark::RealVector
, VectorU=shark::RealVector
, DifferentiableFunction=const shark::LineSearch::ObjectiveFunction
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
BFGS.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex<float>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex<float>'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex<double>' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex<double>'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '_int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,double &,boost::mpl::true
)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<C,C,value_type>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,result_type &)' being compiled
with
[
C=shark::blas::vector
, result_type=value_type
]
F:\GItSources\Shark\src\Algorithms\GradientDescent\BFGS.cpp(51) : see reference to function template instantiation 'double shark::blas::inner_prod<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(136): warning C4244: 'argument' : conversion from '_int64' to 'const int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemv.hpp(71) : see reference to function template instantiation 'void shark::blas::bindings::gemv<C,shark::blas::vector,shark::blas::vector>(const shark::blas::matrix_expression &,const shark::blas::vector_expression<shark::blas::vector> &,shark::blas::vector_expression<shark::blas::vector> &,double,boost::mpl::true
)' being compiled
with
[
C=shark::blas::matrix<double,shark::blas::row_major>
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(29) : see reference to function template instantiation 'void shark::blas::kernels::gemv<C,shark::blas::matrix<double,shark::blas::row_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::row_major>> &,const shark::blas::vector_expression &,shark::blas::vector_expression &,double)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\operation.hpp(70) : see reference to function template instantiation 'void shark::blas::detail::axpy_prod_impl<C,shark::blas::matrix<double,shark::blas::row_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::row_major>> &,const shark::blas::vector_expression &,shark::blas::vector_expression &,bool,double,shark::blas::linear_structure)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\src\Algorithms\GradientDescent\BFGS.cpp(54) : see reference to function template instantiation 'void shark::blas::axpy_prod<C,shark::blas::matrix<double,shark::blas::row_major>,C>(const shark::blas::matrix_expression<shark::blas::matrix<double,shark::blas::row_major>> &,const shark::blas::vector_expression &,shark::blas::vector_expression &,bool,double)' being compiled
with
[
C=shark::blas::vector
]
CG.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '_int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,double &,boost::mpl::true
)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<C,C,value_type>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,result_type &)' being compiled
with
[
C=shark::blas::vector
, result_type=value_type
]
f:\gitsources\shark\include\shark\algorithms\gradientdescent\Impl/dlinmin.inl(194) : see reference to function template instantiation 'double shark::blas::inner_prod<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/Algorithms/GradientDescent/LineSearch.h(103) : see reference to function template instantiation 'void shark::detail::dlinmin<shark::RealVector,shark::RealVector,const shark::LineSearch::ObjectiveFunction>(VectorT &,const VectorU &,double &,DifferentiableFunction &,double,double)' being compiled
with
[
VectorT=shark::RealVector
, VectorU=shark::RealVector
, DifferentiableFunction=const shark::LineSearch::ObjectiveFunction
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
LBFGS.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '_int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,double &,boost::mpl::true
)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<C,C,value_type>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,result_type &)' being compiled
with
[
C=shark::blas::vector
, result_type=value_type
]
F:\GItSources\Shark\src\Algorithms\GradientDescent\LBFGS.cpp(78) : see reference to function template instantiation 'double shark::blas::inner_prod<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
Rprop.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
TrustRegionNewton.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '_int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,double &,boost::mpl::true
)' being compiled
with
[
C=shark::blas::vector
]
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<C,C,value_type>(const shark::blas::vector_expression &,const shark::blas::vector_expression &,result_type &)' being compiled
with
[
C=shark::blas::vector
, result_type=value_type
]
F:\GItSources\Shark\src\Algorithms\GradientDescent\TrustRegionNewton.cpp(53) : see reference to function template instantiation 'double shark::blas::inner_prod<C,C>(const shark::blas::vector_expression &,const shark::blas::vector_expression &)' being compiled
with
[
C=shark::blas::vector
]
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
KMeans.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
LDA.cpp
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(85): warning C4190: 'lapack_make_complex_float' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(714) : see declaration of 'std::complex'
E:\ThirdPartyLibraries_VS2013\64Bit\openblas\cmake..\include\lapacke.h(101): warning C4190: 'lapack_make_complex_double' has C-linkage specified, but returns UDT 'std::complex' which is incompatible with C
E:\Program_Files\Microsoft Visual Studio 12.0\VC\include\complex(846) : see declaration of 'std::complex'
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(47): warning C4244: 'argument' : conversion from 'double' to 'const float', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(69): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemv.hpp(70): warning C4244: 'argument' : conversion from 'double' to 'const std::complex::_Ty', possible loss of data
F:\GItSources\Shark\src\Algorithms\LDA.cpp(70): warning C4244: 'argument' : conversion from 'size_t' to 'double', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4244: 'argument' : conversion from '_int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/dot.hpp(69) : see reference to function template instantiation 'void shark::blas::bindings::dot<shark::blas::matrix_row<shark::RealMatrix>,shark::blas::matrix_row<shark::RealMatrix>>(const shark::blas::vector_expression<shark::blas::matrix_row<shark::RealMatrix>> &,const shark::blas::vector_expression<shark::blas::matrix_row<shark::RealMatrix>> &,double &,boost::mpl::true
)' being compiled
F:\GItSources\Shark\include\shark/LinAlg/BLAS/vector_expression.hpp(474) : see reference to function template instantiation 'void shark::blas::kernels::dot<shark::blas::matrix_row<shark::RealMatrix>,shark::blas::matrix_row<shark::RealMatrix>,value_type>(const shark::blas::vector_expression<shark::blas::matrix_row<shark::RealMatrix>> &,const shark::blas::vector_expression<shark::blas::matrix_row<shark::RealMatrix>> &,result_type &)' being compiled
with
[
result_type=value_type
]
F:\GItSources\Shark\src\Algorithms\LDA.cpp(104) : see reference to function template instantiation 'double shark::blas::inner_prod<shark::blas::matrix_row<shark::RealMatrix>,shark::blas::matrix_row<shark::RealMatrix>>(const shark::blas::vector_expression<shark::blas::matrix_row<shark::RealMatrix>> &,const shark::blas::vector_expression<shark::blas::matrix_row<shark::RealMatrix>> &)' being compiled
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/dot.hpp(107): warning C4267: 'argument' : conversion from 'size_t' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels\openblas/gemm.hpp(161): warning C4244: 'argument' : conversion from 'int64' to 'int', possible loss of data
f:\gitsources\shark\include\shark\linalg\blas\kernels/gemm.hpp(72) : see reference to function template instantiation 'void shark::blas::bindings::gemm<C,shark::blas::matrix_transpose,C>(const shark::blas::matrix_expression &,const shark::blas::matrix_expression<shark::blas::matrix_transpose

SteadyStateMOCMA does not implement read() and write()

The SteadyStateMOCMA implements a serialize() function but not the read() and write() functions. Since AbstractOptimizer states that it implements the ISerializable interface, the functionality in serialize() should live in read()/write() instead.

Incremental hypervolume, dominance and least contributor computation

In many cases we only need to add a small set of points (or a single point) to a current set and query its hypervolume contribution, or update the hypervolume of the set. Similarly, we often need to query whether a new point is dominated.

All of these cases can be computed much faster using an incremental algorithm.

Initialization of SMS-EMOA

In the call to SMSEMOA::doInit, if startingPoints.size() is greater than mu, line 197 is reached with numPoints==0:
https://github.com/Shark-ML/Shark/blob/InterfaceEvo/include/shark/Algorithms/DirectSearch/SMS-EMOA.h#L197
The call to Rng::discrete is then fed the interval (0,-1), and since -1 is a std::size_t it wraps around to a very large number (18446744073709551615). The index picked is therefore, with high probability, out of bounds when used on the next line. @Ulfgard

Fundamental Matrix structure

In order to start work on adding Eigen / ViennaCL support to Shark, I would need to change the basic Matrix structure. Can someone point out the best place to begin this?

Potential overlinkage problem

This is the content of the SharkConfig.cmake file generated from v3.0.1

# - Config file for the Shark package
# It defines the following variables
#  SHARK_INCLUDE_DIRS - include directories for SHARK
#  SHARK_LIBRARIES    - libraries to link against
#  SHARK_LIBRARY_DIRS - path to the libraries
set(SHARK_INSTALL_PREFIX /usr)

set(SHARK_INCLUDE_DIRS "${SHARK_INSTALL_PREFIX}/include;/usr/include;/usr/include")
set(SHARK_LIBRARY_DIRS "${SHARK_INSTALL_PREFIX}/lib")

# Our library dependencies (contains definitions for IMPORTED targets)
include("${SHARK_INSTALL_PREFIX}/lib/x86_64-linux-gnu/cmake/Shark/SharkTargets.cmake")

# The Shark version number
set(SHARK_VERSION_MAJOR "3")
set(SHARK_VERSION_MINOR "0")
set(SHARK_VERSION_PATCH "0")

# The C and C++ flags added by Shark to the cmake-configured flags.
SET(SHARK_REQUIRED_C_FLAGS "-fopenmp")
SET(SHARK_REQUIRED_CXX_FLAGS "-fopenmp")
SET(SHARK_REQUIRED_EXE_LINKER_FLAGS "")
SET(SHARK_REQUIRED_SHARED_LINKER_FLAGS "")
SET(SHARK_REQUIRED_MODULE_LINKER_FLAGS "")

# The location of the UseShark.cmake file.
SET(SHARK_USE_FILE "${SHARK_INSTALL_PREFIX}/lib/x86_64-linux-gnu/cmake/Shark/UseShark.cmake")

set(SHARK_LIBRARIES "/usr/lib/x86_64-linux-gnu/libboost_system.so;/usr/lib/x86_64-linux-gnu/libboost_date_time.so;/usr/lib/x86_64-linux-gnu/libboost_filesystem.so;/usr/lib/x86_64-linux-gnu/libboost_program_options.so;/usr/lib/x86_64-linux-gnu/libboost_serialization.so;/usr/lib/x86_64-linux-gnu/libboost_thread.so;/usr/lib/x86_64-linux-gnu/libboost_unit_test_framework.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libsz.so;/usr/lib/x86_64-linux-gnu/libz.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5_cpp.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libsz.so;/usr/lib/x86_64-linux-gnu/libz.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5_hl.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libsz.so;/usr/lib/x86_64-linux-gnu/libz.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/libblas/libblas.so;shark")

I am puzzled by the value of SHARK_LIBRARIES. The Shark DSO and headers do not require all these dependencies, do they? For instance, the Boost unit_test dependency is likely only required for building the test suite, not for using the library.

Could you please review this and come up with a sane selection of dependencies required just for using Shark?

Many thanks,

SegFault in Ubuntu 15.10 Desktop

I have a problem using Shark on Ubuntu 15.10. On Ubuntu 14 everything seems to work, but the same C++ program on Ubuntu 15.10 gives me a segmentation fault and my program crashes.

Program received signal SIGSEGV, segmentation fault in
_GLOBAL__sub_I_KernelBasisDistance.cpp () from /usr/local/lib/libshark.so.0

HDF5 test failure on Debian.

Whilst working on the packaging for Debian, I encountered the following test failure concerning the HDF5 component:

123/167 Test #124: Data_HDF5 ....................................***Failed    0.20 sec
Running 4 test cases...
/home/ghislain/debian/packages/build-area/shark-3.0.0+ds1/Test/Data/HDF5Tests.cpp(247): error in "NegativeTests": incorrect exception shark::Exception is caught

*** 1 failure detected in test suite "CoreHDF5TestModule"

Any idea where it could come from?

(The build machine is using an up-to-date version of Debian Testing)

Build project doc error

Visual Studio 2015.

cannot open tag file D:/Shark-3.1.0/doc/doxygen_pages/tag_files/all.tag for writing

write computational kernel for LU decomposition

Not super important, but it might be useful to have a proper kernel that uses the ATLAS/LAPACK backend if possible. This might require changes in the permutation class to make it compatible with LAPACK or ATLAS.

Attaching the student label because this is relatively easy.

Complex Capabilities of LinAlg

The following has to be done for complex support:

  1. Add proper conj() and herm() sets of methods. How can we do this without losing the storage scheme? We need to carry a flag that helps us dispatch in the called kernels.
  2. Test everything.

v3.0.1 compilation errors with Visual Studio 2015

I'm trying to build v3.0.1 on Windows 7 64-bit with Visual Studio 2015 Update 1, x64 Release configuration, and I'm getting this error:

C:\Shark\src\Models\RNNet.cpp(42): error C2668: 'shark::size': ambiguous call to overloaded function

Test 126 - Data_HDF5 fails

Hi,
When you do

make test

you get

126 - Data_HDF5 (Failed)

because the test looks for an input file:

m_exampleFileName("./Test/test_data/testfile_for_import.h5")

What would be the best way to handle this?
The test works fine if called from the Shark root directory.
Cheers,
Christian

BLAS Block-Expressions

The shark::blas component needs a better way to represent block expressions, that is, expressions that cannot be computed effectively in an element-wise fashion but only block-wise. The current workaround is that block expressions like prod(A,B) compute and return their result. However, this requires an assumption about the optimal return type, for which the algorithm might not have enough information. It also requires more intermediate results than necessary; for example,

noalias(C)+=prod(A,B)

can be computed without any intermediate results by

axpy_prod(A,B,C,false);

It becomes worse with chained expressions such as

noalias(C)+=prod(A,B)+prod(D,E)

The solution is to return a new type of expression that represents blocked access. In the cases above, this could be resolved by C::plus_assign calling a new method like prod(A,B)::plus_assign_to(C), which can then dispatch to axpy_prod(A,B,C,false) or one of the other kernels.

To do this, expressions might require a new tag like "evaluation_type", which helps distinguish between blocked and element-wise expressions.

The optimal interface is a problem, though, since with this approach

subrange(prod(A,B),start,end,start,end)

would need to dispatch the subset operation to prod(A,B), which requires a complicated interface. A workaround would be to not allow proxies of blocked expressions.

OpenBLAS support

I would like to submit a patch that would enable usage of OpenBLAS. How do I go about doing this?

KNN with other kernels

Can some examples be made available for implementing KNN with kernels other than the LinearKernel?

Remove override of CMAKE_BUILD_TYPE

Taken from the root CMakeLists.txt, the following snippet:

string( TOLOWER ${CMAKE_BUILD_TYPE} CMAKE_BUILD_TYPE )
if(${CMAKE_BUILD_TYPE}  MATCHES "debug")
    #we want to be warned in debug mode
    if(UNIX)
#       if(CMAKE_COMPILER_IS_GNUCC)
        set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall")
#       endif()
    endif()
endif()

is bad, as it overrides the value of CMAKE_BUILD_TYPE and prevents CMake from doing its magic when using the RelWithDebInfo configuration (useful for Debian packaging). Please consider an alternative solution, such as using a different variable to store the lowercased string.

BLAS simplification rules

The basic idea is the following:

Assume we have an expression

noalias(x)=prod(prod(A,B)+C,v)

with A, B, C being matrices and x and v being vectors. Let us further assume that prod actually returned an expression instead of a result, as per #7. Computing the result in the straightforward way as

D=prod(A,B)+C
noalias(x)=prod(D,v)

is extremely expensive, as we perform the whole costly matrix-matrix computation. It also requires storing D in memory. Reordering the expression to

prod(A,prod(B,v))+prod(C,v)

would resolve this problem. In most cases the user can do this reordering himself, but there are some cases where this is not a good option. For example, assume we want to use

approxsolveSymmPosDefSystemInPlace(prod(A,B)+C,x);

Here the user cannot reorder, and instead the algorithm itself would need to realize that it has to compute the intermediate result of the expression, as otherwise it will constantly recompute prod(A,B)+C. In many cases, computing the result even once would be more expensive than running the whole algorithm on the simplified matrix.

The solution to this is expression simplifiers, which take an expression and simplify it. This would also allow proxies to be used in conjunction with #7, as we could reorder the expression so that the proxies are applied to the arguments (in a meaningful way) instead.

Lapack issue: undefined symbol: clapack_dpotrf

Hey guys

I tried to compile Shark on my Ubuntu (14.04) machine and ran into an error while compiling one of the examples. LAPACK on my machine does not include dpotrf (undefined symbol: clapack_dpotrf).

I assume it is necessary to build ATLAS myself, instead of using the available package on Ubuntu, if I need dpotrf (and maybe other functions)? Can someone please confirm this?

Also has anyone used Intel's MKL with Shark?

Thanks!

Licensing mismatch in public header files

Despite the project being advertised as LGPL-3, some public headers are licensed under GPL-3:

licensecheck -r --copyright include/ | grep " GPL"
include/shark/LinAlg/Impl/eigensymm.inl: *No copyright* GPL (v3 or later)
include/shark/LinAlg/Impl/pivotingRQ.inl: GPL (v3 or later)
include/shark/LinAlg/Impl/solveSystem.inl: GPL (v3 or later)
include/shark/LinAlg/Impl/solveTriangular.inl: GPL (v3 or later)
include/shark/LinAlg/Impl/svd.inl: GPL (v3 or later)
include/shark/LinAlg/Impl/Cholesky.inl: GPL (v3 or later)
include/shark/Models/RecurrentStructure.h: GPL (v3 or later)
include/shark/Models/Impl/LinearModel.inl: GPL (v3 or later)
include/shark/Unsupervised/RBM/Neuronlayers/BipolarLayer.h: *No copyright* GPL (v3 or later)
include/shark/Data/Impl/Dataset.inl: *No copyright* GPL (v3 or later)
include/shark/ObjectiveFunctions/Impl/ErrorFunction.inl: *No copyright* GPL (v3 or later)
include/shark/ObjectiveFunctions/Impl/SparseAutoencoderError.inl: *No copyright* GPL (v3 or later)
include/shark/ObjectiveFunctions/Impl/NoisyErrorFunction.inl: *No copyright* GPL (v3 or later)
include/shark/Core/Shark.h.in: GPL (v3 or later)
include/shark/Algorithms/GradientDescent/Impl/dlinmin.inl: GPL (v3 or later)
include/shark/Algorithms/GradientDescent/Impl/wolfecubic.inl: GPL (v3 or later)

others have missing licensing information:

licensecheck -r --copyright include/ | grep "UNKNOWN"
include/shark/LinAlg/BLAS/fwd.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/matrix_set.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/expression_types.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/matrix.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/vector_expression.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/operation.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/io.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/permutation.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/vector_sparse.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/detail/iterator.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/detail/functional.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/matrix_expression.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/vector.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/matrix_sparse.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/triangular_matrix.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/lu.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/kernels/matrix_assign.hpp: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/kernels/gotoblas/cblas_inc.h: *No copyright* UNKNOWN
include/shark/LinAlg/BLAS/kernels/vector_assign.hpp: *No copyright* UNKNOWN
include/shark/Statistics/Tests/WilcoxonRankSumTest.h: *No copyright* UNKNOWN
include/shark/Models/LinearClassifier.h: *No copyright* UNKNOWN
include/shark/Models/SigmoidModel.h: *No copyright* UNKNOWN
include/shark/Models/LinearNorm.h: *No copyright* UNKNOWN
include/shark/Unsupervised/RBM/Problems/Shifter.h: *No copyright* UNKNOWN
include/shark/Unsupervised/RBM/Problems/BarsAndStripes.h: *No copyright* UNKNOWN
include/shark/Unsupervised/RBM/Problems/MNIST.h: *No copyright* UNKNOWN
include/shark/Unsupervised/RBM/Problems/DistantModes.h: *No copyright* UNKNOWN
include/shark/Unsupervised/RBM/Impl/ConvolutionalEnergyGradient.h: *No copyright* UNKNOWN
include/shark/Unsupervised/RBM/Impl/AverageEnergyGradient.h: *No copyright* UNKNOWN
include/shark/Data/Impl/Statistics.inl: *No copyright* UNKNOWN
include/shark/Data/Impl/CVDatasetTools.inl: *No copyright* UNKNOWN
include/shark/ObjectiveFunctions/Benchmarks/GSP.h: *No copyright* UNKNOWN
include/shark/Rng/GlobalRng.h: *No copyright* UNKNOWN
include/shark/Algorithms/DirectSearch/FastNonDominatedSort.h: *No copyright* UNKNOWN
include/shark/Algorithms/DirectSearch/Grid.h: *No copyright* UNKNOWN

Please consider updating the licensing of all these files accordingly. In particular, the mixing of GPL-3 and LGPL-3 source makes the licensing of the resulting binaries unnecessarily ambiguous.

Inconsistent encoding in CMSA{.h,.cpp} files

The CMSA files use a different encoding (unknown) from the rest of the project (ASCII):

file -bi include/shark/Algorithms/DirectSearch/CMSA.h
text/x-c; charset=unknown-8bit

Please consider converting the files to a known encoding such as UTF-8 if ASCII is not an option. The current encoding prevents standard Unix text-processing tools from operating on these files.
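A conversion could be done with iconv; ISO-8859-1 as the source charset is only a guess (plausible for German umlauts in author names), so inspect the actual bytes first. The temp-file setup below exists only to keep the sketch self-contained:

```shell
# Sketch: recode a Latin-1 file to UTF-8 with iconv. ISO-8859-1 is an
# assumption -- inspect the actual bytes first (e.g. with `hexdump -C`).
src=$(mktemp)
printf 'Sch\xf6nauer\n' > "$src"          # stand-in for the CMSA.h content
iconv -f ISO-8859-1 -t UTF-8 "$src" > "$src.utf8"
mv "$src.utf8" "$src"
file -bi "$src"                            # now reports charset=utf-8
rm -f "$src"
```

Applied to the real files, this would be `iconv -f ISO-8859-1 -t UTF-8 CMSA.h > CMSA.h.utf8` followed by a rename, once the source charset is confirmed.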

Data standardization

Hi,

I hope someone knowledgeable could help me with normalizing data to a standard distribution. Based on my first reading of the documentation, I have put together the following attempt. Obviously, it did not work. I would like to train a random forest on the normalized data and generate predictions with the resulting classifier, but I am lost with the proper declaration of the data types.

    ClassificationDataset training_data, test_data;

    importCSV(training_data, training, LAST_COLUMN, ' ');
    importCSV(test_data, test, LAST_COLUMN, ' ');

    Normalizer<RealVector> normalizer_train, normalizer_test;
    NormalizeComponentsUnitInterval<RealVector> normalizing_train(), normalizing_test();
    normalizing_train.train(normalizer_train, training_data);
    normalizing_test.train(normalizer_test, test_data);

    ClassificationDataset normalizedData_train = transformInputs(training_data, normalizer_train),
                      normalizedData_test = transformInputs(test_data, normalizer_test);

Installation in Windows 10

I followed the installation procedure from this page:

http://image.diku.dk/shark/sphinx_pages/build/html/rest_sources/getting_started/installation.html

It did not complete at the Configure step in the CMake GUI. This is the error output:
The C compiler identification is unknown
The CXX compiler identification is unknown
Check for working C compiler using: Visual Studio 14 2015 Win64
Check for working C compiler using: Visual Studio 14 2015 Win64 -- broken
CMake Error at C:/Program Files (x86)/CMake/share/cmake-3.5/Modules/CMakeTestCCompiler.cmake:61 (message):
The C compiler "C:/mingw64/bin/gcc.exe" is not able to compile a simple
test program.

It fails with the following output:

Change Dir: C:/Shark/Build/CMakeFiles/CMakeTmp

Run Build Command:"C:/Program Files (x86)/MSBuild/14.0/bin/MSBuild.exe"
"cmTC_f7cd4.vcxproj" "/p:Configuration=Debug" "/p:VisualStudioVersion=14.0"

Microsoft (R) Build Engine version 14.0.25123.0

Copyright (C) Microsoft Corporation. All rights reserved.

Build started 4/4/2016 9:08:33 AM.

Project "C:\Shark\Build\CMakeFiles\CMakeTmp\cmTC_f7cd4.vcxproj" on node 1
(default targets).

C:\Program Files
(x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Platforms\x64\PlatformToolsets\v140\Toolset.targets(36,5):
error MSB8036: The Windows SDK version 8.1 was not found. Install the
required version of Windows SDK or change the SDK version in the project
property pages or by right-clicking the solution and selecting "Retarget
solution". [C:\Shark\Build\CMakeFiles\CMakeTmp\cmTC_f7cd4.vcxproj]

Done Building Project
"C:\Shark\Build\CMakeFiles\CMakeTmp\cmTC_f7cd4.vcxproj" (default targets)
-- FAILED.

Build FAILED.

"C:\Shark\Build\CMakeFiles\CMakeTmp\cmTC_f7cd4.vcxproj" (default target)
(1) ->

(Desktop_PlatformPrepareForBuild target) ->

C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Platforms\x64\PlatformToolsets\v140\Toolset.targets(36,5): error MSB8036: The Windows SDK version 8.1 was not found. Install the required version of Windows SDK or change the SDK version in the project property pages or by right-clicking the solution and selecting "Retarget solution". [C:\Shark\Build\CMakeFiles\CMakeTmp\cmTC_f7cd4.vcxproj]





  0 Warning(s)

  1 Error(s)

Time Elapsed 00:00:00.57

CMake will not be able to correctly generate this project.
Call Stack (most recent call first):
CMakeLists.txt:9 (project)

Configuring incomplete, errors occurred!
See also "C:/Shark/Build/CMakeFiles/CMakeOutput.log".
See also "C:/Shark/Build/CMakeFiles/CMakeError.log".

Set up continuous integration for C++11

In order to support C++11, we have to build with that switch activated. Travis CI ships rather outdated packages, which makes everything harder, e.g. getting a proper CMake script in place.

rework hypervolume in DirectSearch

Currently, there are several algorithmic variants spread across a small number of files/routines. We should split them up to make them easier to test (e.g. testing the brute-force against the smart algorithm in 3D).

Rewrite pgm.h to use iostreams and no printf

Currently we have to work around it and use a lot of casts that are not needed, so rewrite it and make it simpler.

The implementation should also be moved into the library, as it is template-independent.

Potential memory bug

When I run Memcheck on a simple sample program that runs a few iterations of the CMA algorithm I get the following in the output:
==5452== Mismatched free() / delete / delete []
==5452== at 0x4C2C2BC: operator delete(void*) (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==5452== by 0x63CFD86: syev (syev.hpp:64)
==5452== by 0x63CFD86: void shark::blas::bindings::syev<shark::blas::matrix<double, shark::blas::row_major>, shark::blas::vector >(shark::blas::matrix_expression<shark::blas::matrix<double, shark::blas::row_major> >&, shark::blas::vector_expressionshark::blas::vector&) (syev.hpp:83)
==5452== by 0x63D025C: syev<shark::blas::matrix<double, shark::blas::row_major>, shark::blas::vector > (syev.hpp:56)
==5452== by 0x63D025C: void shark::blas::eigensymm<shark::blas::matrix<double, shark::blas::row_major>, shark::blas::matrix<double, shark::blas::row_major>, shark::blas::vector >(shark::blas::matrix_expression<shark::blas::matrix<double, shark::blas::row_major> > const&, shark::blas::matrix_expression<shark::blas::matrix<double, shark::blas::row_major> >&, shark::blas::vector_expressionshark::blas::vector&) (eigenvalues.h:90)
==5452== by 0x63C95C8: update (MultiVariateNormalDistribution.h:131)
==5452== by 0x63C95C8: shark::CMA::updatePopulation(std::vectorshark::Individual<shark::blas::vector<double, double, shark::blas::vector >, std::allocatorshark::Individual<shark::blas::vector<double, double, shark::blas::vector > > > const&) (CMA.cpp:312)
[ .... rest of stacktrace not Shark ... ]

unify MOO evaluation routine

Currently all MOO algorithms come with the same testing routine. Put it into a shared file and reference it from the tests, like test_function in the single-objective case.

C++11 incompatibility between shark::size and std::size

From the mailing list:

"Compiling shark with VS14 and boost 1.59 I get an error "'shark::size' ambiguous call to overloaded function". The ambiguity is between std::size and shark::size, and in the affected files both "using namespace std" and "using namespace shark" are defined. Adding namespace shark to size() solves the problem, but it would be nice to not have to manually do this. "

shark::size was introduced because, at the time, std::size was not available. As Shark does not officially support C++11 yet, we can defer the fix until it's time.

Shark build for VS2015 (VS/VC++14)

Hi, I am having trouble building Shark 3.0.0 for VS2015.

I have generated the VS project files from CMake and proceeded to build the shark project; however, there have been various linker and other errors. Can someone provide a working list of instructions for properly building Shark 3.0.0 for VS2015? (I have searched far and wide for complete installation instructions and found none.)

That would be most appreciated.

Model serialization to an HDF5 file

Can we have an option to do the following:
a.) Serialize the model into a string
b.) Convert the string into a char array
c.) Save the char array into an HDF5 file, given that we can optionally build HDF5 into Shark?

Improve Shark using C++11

It's time to move on.

[x] Update Travis-Ci to support C++11 #59
[x] replaced boost/type_traits by <type_traits>
[x] replaced boost::shared_ptr by std::shared_ptr
[x] replaced BOOST_FOREACH by range-based for
[ ] replace boost::random by the standard <random> library; also, while we are at it, make that part of shark better...
[ ] More issues to come

Unresolved symbol - CVFolds

Having built shark.dll, I tried to build the CVFolds example. With a previous version of the code, it built and ran without any problems. However, now there are some link errors.
Error 1 error LNK2019: unresolved external symbol "__declspec(dllimport) public: __cdecl shark::IRpropPlus::IRpropPlus(void)" (_imp??0IRpropPlus@shark@@qeaa@XZ) referenced in function "double __cdecl trainProblem(class shark::LabeledData<class shark::blas::vector,class shark::blas::vector > const &,class shark::LabeledData<class shark::blas::vector,class shark::blas::vector > const &,double)" (?trainProblem@@YANAEBV?$LabeledData@V?$vector@N@blas@shark@@v123@@Shark@@0n@Z) F:\Binaries\VS2013\Shark\examples\CVFolds.obj CVFolds
Error 2 error LNK2019: unresolved external symbol "__declspec(dllimport) public: virtual void __cdecl shark::IRpropPlus::init(class shark::AbstractObjectiveFunction<class shark::blas::vector,double> &,class shark::blas::vector const &)" (_imp?init@IRpropPlus@shark@@UEAAXAEAV?$AbstractObjectiveFunction@V?$vector@N@blas@shark@@n@2@AEBV?$vector@N@blas@2@@z) referenced in function "double __cdecl trainProblem(class shark::LabeledData<class shark::blas::vector,class shark::blas::vector > const &,class shark::LabeledData<class shark::blas::vector,class shark::blas::vector > const &,double)" (?trainProblem@@YANAEBV?$LabeledData@V?$vector@N@blas@shark@@v123@@Shark@@0n@Z) F:\Binaries\VS2013\Shark\examples\CVFolds.obj CVFolds
Error 3 error LNK2019: unresolved external symbol "__declspec(dllimport) public: virtual void __cdecl shark::IRpropPlus::step(class shark::AbstractObjectiveFunction<class shark::blas::vector,double> const &)" (_imp?step@IRpropPlus@shark@@UEAAXAEBV?$AbstractObjectiveFunction@V?$vector@N@blas@shark@@n@2@@z) referenced in function "double __cdecl trainProblem(class shark::LabeledData<class shark::blas::vector,class shark::blas::vector > const &,class shark::LabeledData<class shark::blas::vector,class shark::blas::vector > const &,double)" (?trainProblem@@YANAEBV?$LabeledData@V?$vector@N@blas@shark@@v123@@Shark@@0n@Z) F:\Binaries\VS2013\Shark\examples\CVFolds.obj CVFolds
Error 4 error LNK1120: 3 unresolved externals F:\Binaries\VS2013\Shark\bin\Release\CVFolds.exe 1 1 CVFolds

DeepNetworkTraining.cpp fails in debug mode

Error is

terminate called after throwing an instance of 'shark::Exception'
what(): size mismatch: inputs.size2() == inputSize()

It is likely that this is due to an error in the parameter list of the ImpulseNoiseModel.

Compile Error

At the top of the shark/Data/Dataset.h file, there is an example of accessing the elements of a Data object:

/// typedef Data<RealVector> Set;
/// Set data;
/// for(Set::element_iterator pos=data.elemBegin(); pos != data.elemEnd(); ++pos){
///     std::cout << *pos << " ";
///     Set::element_reference ref = *pos;
///     ref *= 2;
///     std::cout << *pos << std::endl;
/// }

I cannot make this example work! The compiler shows this error message:

‘element_iterator’ is not a member of ‘Set {aka shark::Data<shark::blas::vector<double> >}’

Documentation target failing with error: cannot open tag file

I just checked out version 3.0.1, and I cannot get the documentation target to build without an error during the Doxygen docs generation.

mkdir build && cd build
cmake -DBUILD_DOCUMENTATION=ON ..
make doc

output:

[...]
error: cannot open tag file /home/ghislain/debian/packages/shark/doc/doxygen_pages/tag_files/all.tag for writing
Exiting...
doc/CMakeFiles/doc.dir/build.make:49: recipe for target 'doc/CMakeFiles/doc' failed
make[3]: *** [doc/CMakeFiles/doc] Error 1
CMakeFiles/Makefile2:9627: recipe for target 'doc/CMakeFiles/doc.dir/all' failed
make[2]: *** [doc/CMakeFiles/doc.dir/all] Error 2
CMakeFiles/Makefile2:9639: recipe for target 'doc/CMakeFiles/doc.dir/rule' failed
make[1]: *** [doc/CMakeFiles/doc.dir/rule] Error 2
Makefile:3561: recipe for target 'doc' failed
make: *** [doc] Error 2
