
Comments (4)

ablaom commented on June 12, 2024

This sounds like a reasonable suggestion. If we add this here, then it would be good to have a PR to update this list at LearnAPI.jl.

My vote is for "neural networks". It's true that Flux models don't have to be neural networks (any differentiable function will do), but "gradient descent methods" is probably too unfamiliar to beginners, and it catches things like gradient tree boosters (e.g. XGBoost) that we likely don't want to put in that bucket, despite the valid interpretation.

While I acknowledge its naturalisation into the machine learning vernacular, I've never been fond of the term "deep learning". Neural networks have been around since at least the 1960s, so the term seems redundant. It's as if we started calling polynomials of degree 17 "deep functions". How deep is deep, anyway? And lots of things are functions that are not polynomials. Of course we all internalise these terms in different (often unconscious) ways, and this is just my subjective view.

At least 95% of the time, MLJFlux models will be garden-variety neural networks. Into this box also go, unambiguously, the models listed below:

 (name = AutoEncoder, package_name = BetaML, ... )
 (name = MultitargetNeuralNetworkRegressor, package_name = BetaML, ... )
 (name = NeuralNetworkClassifier, package_name = BetaML, ... )
 (name = NeuralNetworkRegressor, package_name = BetaML, ... )
 (name = KernelPerceptronClassifier, package_name = BetaML, ... )
 (name = PerceptronClassifier, package_name = BetaML, ... )
 (name = PerceptronClassifier, package_name = MLJScikitLearnInterface, ... )

I may have missed some others.

from mlj.jl.

EssamWisam commented on June 12, 2024

I agree with what you said about "gradient descent methods", and that "neural networks" is more well-defined and generic than "deep learning". That said, I can share my internal definition: it's deep learning if the neural network is deep enough that the universal approximation theorem holds, i.e., there is at least one hidden layer.
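As an aside, the one-hidden-layer threshold already buys real expressive power. A small illustrative sketch (plain Python with hand-picked weights, not MLJ/Flux code): a network with a single ReLU hidden layer can represent f(x) = |x| exactly, which no single neuron with a monotone activation can do.

```python
def one_hidden_layer(x, W1, b1, w2, b2):
    """Feedforward net with one ReLU hidden layer: w2 . relu(W1 x + b1) + b2."""
    hidden = [max(0.0, sum(w * xj for w, xj in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# Two hidden units suffice for f(x) = |x|, since relu(x) + relu(-x) = |x|.
abs_net = lambda x: one_hidden_layer([x], [[1.0], [-1.0]], [0.0, 0.0],
                                     [1.0, 1.0], 0.0)
```

Here `abs_net(3.0)` returns `3.0` and `abs_net(-2.5)` returns `2.5`, matching |x| everywhere.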

Likewise, I believe most of what can be built with Flux is some sort of neural network (even if not exactly a feedforward one). As for your list, I think it's sufficiently comprehensive, as it includes the four models exposed in MLJFlux plus AutoEncoder. Like the perceptron, linear and logistic regression can be viewed as special cases of a neural network (e.g., a single neuron); the perceptron, however, is more closely tied to the history of neural networks.
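The "special case" reading can be made concrete: with a sigmoid activation, a single neuron computes exactly the logistic-regression prediction, and with the identity activation it computes the linear-regression prediction. A hypothetical plain-Python sketch (not MLJ/Flux code):

```python
import math

def neuron(w, b, x, activation):
    """One artificial neuron: activation(w . x + b)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return activation(z)

sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
identity = lambda z: z

# Sigmoid neuron == logistic regression's P(y = 1 | x); zero weights give 0.5.
p = neuron([0.0, 0.0], 0.0, [1.0, 2.0], sigmoid)

# Identity neuron == linear regression's prediction w . x + b = 2 + 3 + 1.
yhat = neuron([2.0, 3.0], 1.0, [1.0, 1.0], identity)
```

Swapping the activation is all that separates the two classical models, which is exactly the sense in which they sit inside the neural-network box.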

EssamWisam commented on June 12, 2024

Gentle reminder.

ablaom commented on June 12, 2024

Done in upcoming PR. Thanks for the reminder.
