Comments (6)

mylo19 commented on June 14, 2024

I came across the same issue.

I suggest changing the (otherwise really helpful) tutorial so that it doesn't include examples that use default parameters for GA and DE. These examples were the first I tried when familiarizing myself with the package, and it was a bit frustrating to get weird results even when copying the code verbatim. It took me a while to realise that the problem might be with the parameters and not with the syntax I used.

ForceBru commented on June 14, 2024

Looks like I got it. Apparently, when I specify one initial point, the entire population will be just copies of this point:

"""
initial_population(method, individual::AbstractVector)
Initialize population by replicating the `individual` vector.
"""
initial_population(method::M, individual::I; kwargs...) where {M<:AbstractOptimizer, I<:AbstractVector} =
[copy(individual) for i in 1:population_size(method)]

And then there's probably not enough variability for mutation or crossover to change anything, so the population doesn't change.
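
This stalls DE in particular. A quick sketch of why (this uses the textbook DE/rand/1 mutation rule, not code from the package): with an all-identical population, every difference vector is zero, so every mutant equals its parent.

# What initial_population produces from a single point: identical copies
pop = [copy([1.0, 2.0]) for _ in 1:10]
F = 0.8                            # DE differential weight
x1, x2, x3 = pop[1], pop[2], pop[3]
v = x1 .+ F .* (x2 .- x3)          # DE/rand/1 mutant; x2 == x3, so v == x1
@assert v == x1                    # the population can never move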

If I use BoxConstraints, then even DE starts working.
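
For anyone trying this, a hedged sketch of what that looks like (the objective and bounds are my choices; double-check the optimize signature against your version of the package):

using Evolutionary

f(x) = sum(abs2, x)                                 # simple convex test function
cons = BoxConstraints(fill(-5.0, 3), fill(5.0, 3))  # per-dimension bounds
x0 = zeros(3)
# With box constraints the initial population is sampled inside the box,
# so DE gets the diversity it needs
res = Evolutionary.optimize(f, cons, x0, DE(populationSize=50))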

wildart commented on June 14, 2024

Thanks for testing the package in this mode. I would never try to use these algorithms without specific parameters, but I understand that for someone new to evolutionary optimization the experience can be frustrating.

You are correct, default parameters are useless. But there is no way to set default parameters that work for every model, because the operators are population dependent. If the population is represented by binary strings, you need operations specific to binary strings; the same goes for numerical vectors, and so on. I think the best option is to terminate the optimization (with a useful message) when no-op operators are used.

The same problem exists with the initial population. Most evolutionary algorithms require sufficient randomness in the population for proper optimization; having the whole population at the same value totally defeats the technique. Adding some noise to a point-initialized population may work much better.
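
A minimal sketch of that idea (a hypothetical helper, not part of the package; it reuses the population_size accessor from the source quoted above):

# Hypothetical: replicate the individual, but jitter each copy with Gaussian noise
initial_population_noisy(method, individual::AbstractVector; sigma=0.1) =
    [individual .+ sigma .* randn(length(individual))
     for _ in 1:Evolutionary.population_size(method)]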

If you use BoxConstraints in the optimize call, then the population is sampled within the box, which noticeably improves the optimization results.

ForceBru commented on June 14, 2024

default parameters are useless ... there is no way to set default parameters that work for every model, because the operators are population dependent

Maybe there shouldn't be any default parameters then? Having "useless" default parameters:

  1. is confusing, and
  2. leads to incorrect examples in the documentation: Evolutionary.optimize(x->-sum(x), BitVector(zeros(3)), GA()) reports [0,0,0] as the minimum, but it isn't.

The no-argument constructor could throw an error, for example:

DE() = error("There is no way to set default parameters that work for every model, because the operators are population dependent, so please set the parameters manually.")

It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.
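
For comparison, here is that BitVector example with the operators actually set; a sketch based on the operator names in the package docs (treat the exact names and settings as assumptions to check against your version):

using Evolutionary

# -sum(x) over 3 bits is minimized at [1,1,1]; a configured GA should find it
res = Evolutionary.optimize(
    x -> -sum(x),
    BitVector(zeros(3)),
    GA(populationSize = 50,
       selection = tournament(3),   # non-trivial selection
       crossover = SPX,             # single-point crossover
       mutation  = flip))           # bit-flip mutation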

On the other hand, the default DE() seems to work fine when I use BoxConstraints, so default params don't seem all that useless...


Having the whole population at the same value totally defeats the technique.

Does this mean that the optimize methods that accept an individual aren't particularly useful? With ordinary gradient-based methods, the initial guess is extremely important, so my first instinct was to specify the initial individual.

Also the documentation recommends this: Evolutionary.optimize(f, x0, CMAES()), and it actually works with CMAES, even though the population should consist of copies of x0.

wildart commented on June 14, 2024

It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.

Exactly my point: performing evolutionary optimization correctly is not only about the initial guess, but also about which mutation operations are used and how they are applied to the population, i.e. the rates.

Does this mean that the optimize methods that accept an individual aren't particularly useful? With ordinary gradient-based methods, the initial guess is extremely important, so my first instinct was to specify the initial individual.

In the current implementation, it is problematic. I think more randomness needs to be added in that case.

Also the documentation recommends this: Evolutionary.optimize(f, x0, CMAES()), and it actually works with CMAES, even though the population should consist of copies of x0.

By default, CMAES initializes its parameters with a rather involved procedure that was refined over many years of research. CMAES becomes very fragile with incorrect parameters, and it requires a good understanding of both the algorithm and the problem to tune it for best performance.
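
The short version of why identical copies of x0 don't hurt it: CMA-ES doesn't evolve the population directly; each generation it samples fresh candidates from a Gaussian around the current mean, so diversity is regenerated every step. Roughly (textbook sampling rule, not the package's internals):

using LinearAlgebra

m, sigma = zeros(3), 0.5            # distribution mean and step size
C = Matrix{Float64}(I, 3, 3)        # covariance matrix, adapted over time
# candidates ~ m + sigma * N(0, C); the starting point only sets the mean
candidates = [m .+ sigma .* (cholesky(C).L * randn(3)) for _ in 1:10]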

jnsbck commented on June 14, 2024

Same here. In my case I was using Evolutionary.jl with Optimization.jl, where you have to set an x0 when defining the OptimizationProblem.

I got it to work by initializing with an initial population rather than a single individual x0. I sampled from a uniform distribution, but I could also imagine just adding randn noise to x0.

using Distributions, Evolutionary, Optimization, OptimizationEvolutionary

NP = 100
de = Evolutionary.DE(populationSize=NP)
lb, ub = 0.0, 1.0
U = Uniform(lb, ub)
# Seed the solver with a randomized population instead of replicating one x0
x0 = Evolutionary.initial_population(de, [rand(U, 1) for i in 1:NP])

f = OptimizationFunction(foo)   # foo is the objective, defined elsewhere
prob = Optimization.OptimizationProblem(f, x0, p, lb=lb, ub=ub)   # p: extra parameters
sol = solve(prob, de)
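
The randn variant mentioned above would be something like this (my sketch, same NP as above):

# Jitter a single starting point instead of sampling uniformly
x0_point = fill(0.5, 1)
pop = [x0_point .+ 0.05 .* randn(length(x0_point)) for _ in 1:NP]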
