Comments (6)
I came across the same issue.
I suggest you change the (otherwise really helpful) tutorial so that it does not include examples that use default parameters in GA and DE. These examples were the first I tried when familiarizing myself with the package, and it was a bit frustrating to get weird results even when copying the code verbatim. It took me some time to realise that the problem might be with the parameters rather than with the syntax I used.
from evolutionary.jl.
Looks like I got it. Apparently, when I specify one initial point, the entire population will be just copies of this point (see Evolutionary.jl/src/api/utilities.jl, lines 58 to 64 at 81d8a72). And then there's probably not enough variability for mutation or crossover to change anything, so the population doesn't change.
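To see why a population of identical copies stalls DE, here is a minimal Python sketch (an illustration, not Evolutionary.jl's actual code) of the classic DE/rand/1 mutation, v = a + F*(b - c): when every member equals x0, every difference vector b - c is zero and the mutant is just x0 again.

```python
import random

def de_mutant(pop, i, F=0.8):
    """DE/rand/1 mutation: v = a + F * (b - c) for three distinct
    population members a, b, c, chosen with indices different from i."""
    idxs = [j for j in range(len(pop)) if j != i]
    a, b, c = (pop[j] for j in random.sample(idxs, 3))
    return [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]

random.seed(0)
x0 = [1.0, 2.0, 3.0]

# Population built as copies of a single initial point:
clones = [list(x0) for _ in range(10)]
# Every difference vector b - c is zero, so the mutant equals x0.
assert de_mutant(clones, 0) == x0

# A randomly spread population produces genuinely new trial points:
spread = [[xi + random.gauss(0, 0.5) for xi in x0] for _ in range(10)]
assert de_mutant(spread, 0) != x0
```

With clones, DE's trial points never leave x0; with any spread in the population, the same operator immediately generates new candidates.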
If I use BoxConstraints, then even DE starts working.
Thanks for testing the package in this mode. I would never try to use these algorithms without specific parameters, but I understand that for someone new to evolutionary optimization the experience can be frustrating.
You are correct, the default parameters are useless. But there is no way to set default parameters for every model, because the operators are population dependent: if the population is represented by binary strings, you need binary-string-specific operators, the same goes for numerical functions, and so on. I think the best way is to terminate the optimization (with a useful message) when no-op operators are used.
The same problem applies to the initial population. Most evolutionary algorithms require sufficient randomness in the population for proper optimization; filling the whole population with the same value totally defeats the optimization technique. Adding some noise to a point-initialized population may work much better.
If you use BoxConstraints in the optimization call, then the population is sampled within the box, which benefits the optimization results.
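The noise idea can be shown language-agnostically; this is a small Python sketch (not the Evolutionary.jl API), where a single starting point is jittered into a diverse initial population:

```python
import random

def jittered_population(x0, n, sigma=0.1):
    """Build an initial population by adding Gaussian noise to a single
    starting point, instead of copying it n times verbatim."""
    return [[xi + random.gauss(0.0, sigma) for xi in x0] for _ in range(n)]

random.seed(42)
pop = jittered_population([0.5, -1.0], n=50, sigma=0.2)

# The population is centred near x0, but no two members are identical,
# so difference-based operators have something to work with.
assert len(pop) == 50
assert len({tuple(ind) for ind in pop}) == 50
```

The sigma here plays the same role as the box width under BoxConstraints: it controls how much of the search space the initial population covers.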
the default parameters are useless ... there is no way to set default parameters for every model because the operators are population dependent
Maybe there shouldn't be any default parameters then? Having "useless" default parameters is:
- confusing, and
- leads to incorrect examples in the documentation:
Evolutionary.optimize(x->-sum(x), BitVector(zeros(3)), GA())
says that [0,0,0] is the minimum, but it's not.
The no-argument constructor could throw an error, for example:
DE() = error("There is no way to set default parameters for every model because operators are population dependent; please set the parameters manually.")
It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.
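A tiny Python sketch (hypothetical, not Evolutionary.jl's GA implementation) illustrates the problem with no-op operators: with identity crossover and mutation, selection can only copy existing individuals, so the search can never do better than the best member of the initial population.

```python
import random

def ga_step(pop, fitness, crossover=lambda a, b: (a, b),
            mutate=lambda x: x):
    """One GA generation with tournament selection. With the identity
    (no-op) crossover and mutation used as defaults here, no new points
    are ever created; selection can only copy existing ones."""
    def select():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b
    nxt = []
    while len(nxt) < len(pop):
        c1, c2 = crossover(select(), select())
        nxt.extend([mutate(c1), mutate(c2)])
    return nxt[:len(pop)]

random.seed(1)
f = lambda x: -(x[0] ** 2)          # maximize, so the optimum is x = 0
pop = [[random.uniform(-5, 5)] for _ in range(20)]
best0 = max(f(x) for x in pop)
for _ in range(30):
    pop = ga_step(pop, f)
# With no-op operators, the best fitness can never exceed the best
# individual of the *initial* population:
assert max(f(x) for x in pop) <= best0
```

Real crossover and mutation are exactly what let the population escape this initial set, which is why "useless" defaults are so misleading for GA in particular.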
On the other hand, the default DE() seems to work fine when I use BoxConstraints, so the default params don't seem all that useless...
Filling the whole population with the same value totally defeats the optimization technique.
Does it mean that optimize methods that accept the indiv parameter aren't particularly useful? With ordinary gradient-based methods, the initial guess is extremely important, so my first instinct was to specify the initial individual.
Also the documentation recommends this: Evolutionary.optimize(f, x0, CMAES()), and it actually works with CMAES, even though the population should consist of copies of x0.
It's especially strange with GA, where the default mutation and crossover operations are no-ops, but mutation and crossover seem to be the entire point of genetic algorithms.
Exactly my point: performing evolutionary optimization correctly is not only about the initial guess, but also about which mutation operations are used and how they are applied to the population, i.e. the rates.
Does it mean that optimize methods that accept the indiv parameter aren't particularly useful? With ordinary gradient-based methods, the initial guess is extremely important, so my first instinct was to specify the initial individual.
In the current implementation, it is problematic. I think more randomness needs to be added in that case.
Also the documentation recommends this: Evolutionary.optimize(f, x0, CMAES()), and it actually works with CMAES, even though the population should consist of copies of x0.
By default, CMAES initializes its parameters with a very involved procedure that was refined over many years of research. CMAES becomes very fragile with incorrect parameters, and it requires a good understanding of both the algorithm and the problem to tune it for the best performance.
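To illustrate why a single x0 can be enough for CMAES-style methods, here is a heavily simplified evolution strategy in Python (no covariance or step-size adaptation, so not real CMA-ES): each generation samples a fresh Gaussian cloud around the current mean, so the algorithm needs no diversity in the initial "population" at all.

```python
import random

def simple_es(f, x0, sigma=0.5, popsize=20, iters=80):
    """Very simplified (mu, lambda)-style evolution strategy: every
    generation samples a fresh cloud around the current mean, so a
    single starting point suffices. Real CMA-ES additionally adapts
    the covariance matrix and step size; this sketch only decays sigma."""
    mean = list(x0)
    for _ in range(iters):
        cloud = [[m + random.gauss(0, sigma) for m in mean]
                 for _ in range(popsize)]
        cloud.sort(key=f)                       # minimize f
        elite = cloud[: popsize // 4]
        mean = [sum(x[i] for x in elite) / len(elite)
                for i in range(len(mean))]
        sigma *= 0.95                           # crude step-size decay
    return mean

random.seed(0)
sol = simple_es(lambda x: sum(xi ** 2 for xi in x), [3.0, -2.0])
assert sum(xi ** 2 for xi in sol) < 1.0
```

Because new candidates come from sampling around the mean rather than from recombining existing individuals, a degenerate initial population does not stall this family of methods the way it stalls DE or a GA.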
Same here. In my case I was using Evolutionary.jl with Optimization.jl, where you have to set an x0 when defining the OptimizationProblem.
Got it to work by initializing with an initial_population rather than an individual x0. I sampled from a uniform distribution, but I could also imagine just adding randn to x0.
using Evolutionary, Optimization, Distributions   # Distributions provides Uniform

NP = 100
de = Evolutionary.DE(populationSize=NP)
lb, ub = 0.0, 1.0
U = Uniform(lb, ub)
# build a randomized initial population instead of a single point
x0 = Evolutionary.initial_population(de, [rand(U, 1) for i in 1:NP])
f = OptimizationFunction(foo)                     # foo is the objective function
prob = Optimization.OptimizationProblem(f, x0, p; lb=lb, ub=ub)   # p: problem parameters
sol = solve(prob, de)