
### This is a test repository. Thanks to the CS department of Cornell; all of the files were downloaded from https://www.cs.cornell.edu/~kilian/code/code.html

Large Margin Nearest Neighbors

[Image: LMNN cake]

(Thanks to John Blitzer, who gave me this cake for my 30th birthday.)

This is a MATLAB implementation of Large Margin Nearest Neighbor (LMNN), a metric learning algorithm first introduced by Kilian Q. Weinberger, John C. Blitzer, and Lawrence K. Saul in 2005. LMNN improves k-nearest neighbor classification by learning a generalized Euclidean (Mahalanobis) metric d(x_i, x_j) = sqrt((x_i - x_j)' M (x_i - x_j)), where the positive semi-definite matrix M is learned specifically for nearest neighbor classification. For more details on the solver, see the 2009 JMLR paper.
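Equivalently, writing M = L'L, the learned metric is plain Euclidean distance after the linear map x -> Lx. A minimal sketch of this relationship (the variable names below are illustrative only, not part of the package API):

#!matlab
% Minimal sketch: the LMNN metric is Euclidean distance after a learned
% linear transformation L, i.e. M = L'*L is the Mahalanobis matrix.
xi = randn(5, 1);                        % two example points in R^5
xj = randn(5, 1);
L  = eye(5);                             % stand-in for the matrix LMNN learns
M  = L' * L;                             % induced Mahalanobis matrix
d  = sqrt((xi - xj)' * M * (xi - xj));   % generalized Euclidean distance
% Equivalent formulation: d == norm(L * (xi - xj))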

(The current version is 3.0.1.)

Usage:

To see a working demo, please run (inside the MATLAB console):

Version 3:

#!matlab
setpaths3
cd demos
isoletdemo

Version 2:

#!matlab
install
demo
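To use the learned metric on your own data, the general pattern is to train on a labeled set and then run kNN in the transformed space. The following is a hypothetical sketch, not the package's exact API (see the demo scripts for the real training calls and expected data layout); it uses knnsearch from MATLAB's Statistics and Machine Learning Toolbox and an identity stand-in for the learned matrix:

#!matlab
% Hypothetical sketch: kNN classification under a metric learned by LMNN.
% L is a placeholder for the linear map returned by the solver; consult
% the demos for the actual training call and data conventions.
xTr = randn(100, 5);  yTr = randi(3, 100, 1);  % toy training set (rows = points)
xTe = randn(20, 5);                            % toy test points
L   = eye(5);                                  % replace with the learned map
idx  = knnsearch(xTr * L', xTe * L', 'K', 3);  % 3-NN in the transformed space
yHat = mode(yTr(idx), 2);                      % majority vote over the neighbors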

Credit:

If you use this code in scientific work, please cite:

#!bibtex
@article{weinberger2009distance,
  title={Distance metric learning for large margin nearest neighbor classification},
  author={Weinberger, K.Q. and Saul, L.K.},
  journal={The Journal of Machine Learning Research},
  volume={10},
  pages={207--244},
  year={2009},
  publisher={JMLR.org}
}

Changelog:

  • update 04/08/2015
    • added a simple implementation of Neighbourhood Component Analysis (NCA)
  • update 03/31/2015
    • released version 3.0.0
    • Version 3 uses the squared LMNN loss (which is differentiable), and the optimization uses Mark Schmidt's LBFGS implementation
  • update 03/27/2015
    • released version 2.6.0
    • this version has a much simpler, clearer and faster gradient computation
    • other small bug fixes
  • update 02/26/2015
    • fixed a bug in SOD.m (for very large data sets)
    • added a simple bandit algorithm to choose between SOD.m and SODmex.c during runtime
    • Added sd2b.c, which speeds up gradient computation for smallish data sets.
    • Added findLMNNparams.m for automatic tuning of LMNN parameters
  • update 01/23/2015
    • Released version 2.5.1
    • moved to Bitbucket as the new host
    • Fixed a bug
  • update 04/23/2014
    • Released version 2.5:
    • introduced the new parameter "subsample" (subsamples 10% of constraints by default)
    • improved the convergence criteria
  • update 10/04/2013
    • fixed a small but critical bug in applypca.m (this function is an optional pre-processing step)
  • update 09/17/2013 - Version 2.4.1:
    • Set the default validation parameter to 0.2
    • Now performs cross-validation over maxstepsize automatically
  • update 07/26/2013
    • Version 2.4:
    • Added GB-LMNN
    • New demo.m (now including GB-LMNN)
    • Made small changes to LMNN (mostly usability)
    • Parallelized some C functions with OpenMP
    • Thanks to Gao Huang (Tsinghua University) for helping with the GB-LMNN implementation
  • update 11/13/2012
    • Fixed a bug that prevented execution with even values for k.
  • update 11/01/2012
    • Added optional 'diagonal' version to learn diagonal matrices
  • update 09/19/2012
    • Added 32-bit Windows binaries (Thanks to Ya Shi)
  • update 09/18/2012
    • Added parameter 'outdim' to easily specify the output dimensionality
    • Small fixes in the mtree code, which broke compilation on some Windows machines
    • Speedup in findimps3Dm by substituting some repmats with bsxfun (somehow these had been overlooked)
  • update 09/13/2012
    • Small fix to setpaths.m script
    • Rearranged files to ensure that the mexed files are in the path.
    • updated demo
  • update 09/06/2012
    • Small fix to install.m script
  • update 08/23/2012
    • Removed mex files that are no longer faster than their MATLAB equivalents (MATLAB has become a lot faster over the years)
    • Updated mtrees to compile on Windows computers and no longer use deprecated libraries
    • Removed all BLAS / LAPACK dependencies
    • Renamed knnclassify.m to knncl.m (as the former clashed with the implementation from the Statistics Toolbox)
    • (Many thanks to Jake Gardner, who helped a lot with tidying up the code.)

(LMNN is different from the work by Domeniconi et al. (2005), which has a very similar title.)
