wildart / Evolutionary.jl
Evolutionary & genetic algorithms for Julia
License: Other
Hello,
Thank you for putting this package together. Unfortunately, I am having trouble getting the tutorial example to work. The algorithm seems to be stuck at the initial values.
res = Evolutionary.optimize(x -> -sum(x),
BitVector(zeros(3)),
GA(),
Evolutionary.Options(iterations=10, show_trace=true))
Here are the results:
Iter Function value
0 0
* time: 8.416175842285156e-5
1 0
* time: 0.0001971721649169922
2 0
* time: 0.00025010108947753906
3 0
* time: 0.00028705596923828125
4 0
* time: 0.0003161430358886719
5 0
* time: 0.0003452301025390625
6 0
* time: 0.00038123130798339844
7 0
* time: 0.000408172607421875
8 0
* time: 0.0004360675811767578
9 0
* time: 0.0004742145538330078
10 0
* time: 0.0005102157592773438
11 0
* time: 0.0005371570587158203
* Status: success
* Candidate solution
Minimizer: [false, false, false]
Minimum: 0
Iterations: 11
* Found with
Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]
If I understand correctly, the minimizer should be [true,true,true] with a minimum of -3. I tried changing parameters of GA and in Options, but to no avail. Am I doing something incorrectly?
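For what it's worth, one plausible cause (assuming the default GA() operators in this version are effectively identity functions) is that the population never changes, so the search stays at the initial values. A sketch of an explicit configuration; the operator names (tournament, singlepoint, flip) are taken from the package's documented operators but are an assumption for this version:

```julia
# Hypothetical fix: spell out binary-friendly operators instead of relying
# on GA() defaults, which may be no-ops for crossover and mutation.
res = Evolutionary.optimize(x -> -sum(x),
                            BitVector(zeros(3)),
                            GA(populationSize = 100,
                               selection = tournament(3),
                               crossover = singlepoint,
                               mutation = flip),
                            Evolutionary.Options(iterations = 10, show_trace = true))
```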
A PkgEval run for a Julia pull request which changes the generated numbers for rand(a:b) indicates that the tests of this package might fail on Julia 1.5 (and on the current Julia master branch).
Also, you might be interested in using the new StableRNGs.jl registered package, which provides guaranteed stable streams of random numbers across Julia releases.
Apologies if this is a false positive. Cf. https://github.com/JuliaCI/NanosoldierReports/blob/ab6676206b210325500b4f4619fa711f2d7429d2/pkgeval/by_hash/52c2272_vs_47c55db/logs/Evolutionary/1.5.0-DEV-87d2a04de3.log
Hi,
I have been exploring Julia's GA capabilities, looking at GeneticAlgorithms and Evolutionary packages. I really enjoy the simple interface presented by Evolutionary and it also seems to be more advanced in terms of development. Great work and thank you for your efforts.
However, I am having trouble figuring out how I can implement a binary GA. I am using a knapsack problem as a simple illustrative example. I've written a fitness function which looks like this:
mass = [1, 5, 3, 7, 2, 10, 5]
utility = [1, 3, 5, 2, 5, 8, 3]
function fitness(n::Vector{Int})
total_mass = sum(mass .* n)
return (total_mass <= 20) ? sum(utility .* n) : 0
end
but when I try to run the GA as follows
ga(fitness, 7)
I get
ERROR: wrong number of arguments
in anonymous at /home/colliera/.julia/v0.3/Evolutionary/src/ga.jl:23
in ga at /home/colliera/.julia/v0.3/Evolutionary/src/ga.jl:66
Looking at the debug output from ga()
I see that the vectors being passed into my fitness function are floating point rather than binary (or at least a vector of integer 0 or 1). I'm not sure whether this is the source of the ERROR above, but it's certainly something that I need to resolve in order to get my problem up and running.
Is there support for binary chromosomes? Have I missed something?
Best regards,
Andrew.
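In the v0.3-era API, a binary initial population can apparently be requested through the initPopulation keyword, as a later example in this thread does; a sketch under that assumption:

```julia
# Sketch: seed the GA with Bool vectors so chromosomes stay binary.
# Keyword names follow the old ga() API used elsewhere in this thread.
best, = ga(fitness, 7,
           initPopulation = n -> rand(Bool, n),
           populationSize = 50,
           mutationRate = 0.2,
           selection = roulette)
```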
For GA, based on the default parameters (tol = 0.0, tolIter = 10), I would expect the algorithm to terminate after 10 iterations with no improvement in the objective, and I feel that this is logical default behaviour. However, in practice, early stopping does not occur without setting tol to a positive value.
This is a very neat repo for evolutionary algorithms! I really like it, but please allow me to offer two suggestions to make the interface better:
for (count, runtime_info) in enumerate(optimize(objfunc, individual, args...))
# inspect your program
@show runtime_info.best
# exit condition
(count > 100 || runtime_info.precision < 1e-10) && break
end
These changes have practical importance:
Making populate! a function, as in PR #26, can help people implement parallelism (e.g. MPI) much more easily. Keeping functions short helps with open issues #15 and #2, so that people can overload shorter functions like select(::ProblemType, current_population, current_fitness) to customize the strategy for selecting parents, or generate(::ProblemType, parents) to generate the new generation within bounds.
Instead of a call_back function, many people want to inspect their program during optimization, and an iterator provides full access to the runtime information. It also removes the need for iterations, tol and verbose, because people can decide when to break the loop intuitively.
See this PR for details (this is a demo, not a real PR; a real PR requires more discussion): #26
Trying to use tournament selection gives the error: MethodError: no method matching Array{Int64,N} where N(::Int64)
Hi there!
I am a bit of a beginner in GA, so sorry for the question that might be very basic. I use the code below (sorry, it is hard for me to make a MWE as it is part of a package I am writing). The function f has been tested independently.
function _optimize(icn, X; ga = _icn_ga, metric = hamming, pop_size = 100)
f(weigths) = _loss(X, icn, weigths, metric)
_icn_ga = GA(populationSize = pop_size, crossoverRate = 0.8, ɛ = .05, selection = tournament, crossover = singlepoint, mutation = flip)
# problem also occurs with GA() and pop_size = 50
pop = generate_population(icn, pop_size)
optimize(f, pop, _icn_ga)
end
I see very strange behavior: when f is called, the weights argument is a vector of zeros, even though my initial population does not contain such individuals.
So, my question is the following: am I missing some setting that makes the GA provide only all-zero individuals?
Side question, is it possible to provide a predicate for viable individuals at each generation so that mutation and crossover are redone if the individual is not viable?
I have been trying to store custom values in the trace. When I don't override the trace! function, I get:
fieldnames(typeof(oc)) = (:iteration, :value, :metadata)
for the value of oc.
If I override the trace! function and set store_trace to true, then no matter what I add to the dictionary I get
fieldnames(typeof(tr)) = ()
for the value of tr that I can see gets passed in in this case.
I feel like this should be possible.
Maybe one solution would be to give the callback function an intermediate state of the results such that this would work:
function callback(intermediate_res)
items = Evolutionary.iterations(intermediate_res)
best_indv = Evolutionary.minimizer(intermediate_res)
best_fit = minimum(intermediate_res)
.
.
.
return false
end
First, thanks for building and sharing such an exciting and promising package!
I tried your SymbolicRegression example: https://github.com/wildart/Evolutionary.jl/blob/master/examples/SymbolicRegression.ipynb
In section 1.3.1 Linear, the solution found is much more complex than your example. It goes on for pages. ;-)
Maybe maxdepth is no longer respected?
Thank you!
Christian
Hello,
I am trying to optimise an array of booleans:
init = ()->Bool[rand(Bool) for el in tc]
opts = Evolutionary.Options(iterations=500)
mthd = GA(populationSize=150, crossoverRate=0.5, mutationRate=0.2, selection=susinv, crossover=uniform(), mutation=scramble())
Random.seed!(10);
result = Evolutionary.optimize(f1, init, mthd, opts)
but I keep getting this error:
ERROR: LoadError: MethodError: no method matching scramble()
Closest candidates are:
scramble(!Matched::T) where T<:(AbstractArray{T,1} where T) at /home/nsalvi/.julia/packages/Evolutionary/BIk3j/src/mutations.jl:225
I am a bit lost, can anyone help me with that?
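Judging from the method signature in the error message, scramble appears to be the mutation operator itself (it takes the individual directly), so it should probably be passed without calling it. A hedged sketch of the fix, keeping the rest of the call as in the original:

```julia
# Hypothetical fix: pass the mutation function itself, not its return value.
# scramble(recombinant) is the mutation applied to one individual.
mthd = GA(populationSize = 150, crossoverRate = 0.5, mutationRate = 0.2,
          selection = susinv, crossover = uniform(), mutation = scramble)
```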
Hi,
is there a way to implement hard constraints using Evolutionary.jl?
Thanks.
Example:
best, = ga(x->-sum(x),
5,
initPopulation=n->rand(Bool, n),
iterations=10000,
mutationRate = 0.2,
selection = roulette
)
I think this is because you maximize the inverse of the provided objective function. Is there a reason to do this rather than just directly minimizing the function? (Or I guess, maximizing the negation of the function)
When I add the package, Julia gives me an error. I thought it is about the version of the package.
(v1.3) pkg> add Evolutionary
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package UnPack [3a884ed6]:
UnPack [3a884ed6] log:
├─possible versions are: [0.1.0, 1.0.0-1.0.1] or uninstalled
├─restricted to versions 1.0.1 by an explicit requirement, leaving only versions 1.0.1
└─restricted by compatibility requirements with Evolutionary [86b6b26d] to versions: 0.1.0 — no versions left
└─Evolutionary [86b6b26d] log:
├─possible versions are: [0.3.0, 0.4.0, 0.5.0, 0.6.0-0.6.1] or uninstalled
└─restricted to versions * by an explicit requirement, leaving only versions [0.3.0, 0.4.0, 0.5.0, 0.6.0-0.6.1]
Hi, I am wondering how I can get the status from the result variable. Is there a function I can call on result to return the status (success or failure)? Thanks.
It seems that the parameters lowerBounds and upperBounds in the genetic algorithm have no effect. Without those bounds, the search space for integer variables is too big and the algorithm fails to find a solution. Any chance those parameters can be implemented?
Hi :) very nice library, thanks. I'm wondering if you're planning on updating this to work with 1.0 any time soon? Currently it doesn't compile (the first error I get is that Void is no longer a type, but there might be more).
Thanks for your work. And there is an error in my code, could you please point out the mistakes?
fitFunc(w) = w[1] - w[2] + w[3] - w[4] + w[5] - w[6]
lx = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
ux = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
con_w = w[1] + w[2] + w[3] + w[4] + w[5] + w[6]
lc = [1.0]
uc = [1.0]
mthd = GA(populationSize = 3, ɛ = 0.03, selection = susinv, crossover = intermediate(0.7), mutation = gaussian(2))
con = PenaltyConstraints(1e3, lx, ux, lc, uc, con_w)
opts = Evolutionary.Options(iterations = 15, successive_f_tol = 25, show_trace = true)
res_Final = Evolutionary.optimize(fitFunc, con, con.bounds, mthd, opts)
m = Evolutionary.minimizer(res_Final)
minimum(res_Final)
There is an error at the line calling optimize: "Error: objects of type Float64 are not callable".
In detail:
LoadError: MethodError: objects of type Float64 are not callable
in expression starting at D:\gaspipeline_sto.jl:212
value(::PenaltyConstraints{Float64,Float64}, ::NonDifferentiable{Float64,Array{Float64,1}}, ::Array{Float64,1}) at constraints.jl:138
update_state!(::NonDifferentiable{Float64,Array{Float64,1}}, ::PenaltyConstraints{Float64,Float64}, ::Evolutionary.GAState{Float64,Array{Float64,1}}, ::Array{Array{Float64,1},1}, ::GA, ::Int64) at ga.jl:98
optimize(::NonDifferentiable{Float64,Array{Float64,1}}, ::PenaltyConstraints{Float64,Float64}, ::Array{Array{Float64,1},1}, ::GA, ::Evolutionary.Options{Nothing}, ::Evolutionary.GAState{Float64,Array{Float64,1}}) at optimize.jl:56
optimize(::NonDifferentiable{Float64,Array{Float64,1}}, ::PenaltyConstraints{Float64,Float64}, ::Array{Array{Float64,1},1}, ::GA, ::Evolutionary.Options{Nothing}) at optimize.jl:40
optimize(::Function, ::PenaltyConstraints{Float64,Float64}, ::NLSolversBase.ConstraintBounds{Float64}, ::GA, ::Evolutionary.Options{Nothing}) at optimize.jl:31
top-level scope at gaspipeline_sto.jl:212
include_string(::Function, ::Module, ::String, ::String) at loading.jl:1088
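A likely cause, judging from the stack trace: PenaltyConstraints expects the constraint to be a function of w returning a vector, but con_w here is bound to a plain value. A hedged sketch of the fix:

```julia
# The constraint must be callable; it should return a vector whose length
# matches lc and uc.
con_w(w) = [w[1] + w[2] + w[3] + w[4] + w[5] + w[6]]
con = PenaltyConstraints(1e3, lx, ux, lc, uc, con_w)
```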
I'm trying to find a Genetic Algorithms implementation for a project I'm working on, so have been looking through the code. Is mutation applied in-place, or do I misunderstand? On one hand, the code is clearly discarding the return value from mutation:
Lines 87 to 94 in c7fb224
But on the other hand, the sample mutations in mutations.jl (and the default mutation (x -> x)) all return a value and aren't suffixed with a !.
Hello,
We are trying to provide Evolutionary.jl as the backend optimizer for the cost function in DiffEqParamEstim.jl; see SciML/DiffEqParamEstim.jl#25. The present release is not compatible with Julia 0.5 and 0.6. Do you plan to release a new version which provides support for recent Julia versions?
I am happy to help :)
Hi, I am wondering whether there is a time_limit keyword in the optimize settings to control the maximum running time of the algorithm?
Hi,
thanks for the cool package.
I recently started to do some research within the field of Evolutionary Strategies.
indiv = [0.0,0.0]
lowerbound = [0.0,0.0]
upperbound = [20.0,20.0]
function fit(x)
if(sum(x) == 4.0)
return -1
else
return +1
end
end
Evolutionary.optimize(fit,lowerbound,upperbound,indiv,CMAES(),Evolutionary.Options(iterations = 1000))
> Status = success
> Candidate solution
>   Minimizer = [6.216071822584697, 2.911369416556109]
>   Minimum = 1.0
>   Iterations = 12
> Found with
>   Algorithm: (15,30)-CMA-ES
I do not understand why the Algorithm yields infeasible solutions, returns the status "success" and always stops after 12 Iterations.
Is there a way to mutate the genotype so that the mutation is a multiple of a certain step/number? The mutation should do something like this:
step = 0.25: [1.495763498, 1.23478634] --> [1.75, 1.23478634]. Is there such an algorithm in this package?
I really hope I was able to explain my problems understandably.
Thanks
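There doesn't seem to be a built-in operator for this, but a custom mutation along these lines could snap genes to multiples of a step. Everything here (the name stepped, the closure style, the convention that a mutation maps an individual to a mutated individual) is an assumption for illustration:

```julia
# Hypothetical step-quantized mutation: perturb one gene by a whole number
# of steps, then snap it to the nearest multiple of `step`.
function stepped(step::Float64)
    function mutate(recombinant::AbstractVector{Float64})
        i = rand(1:length(recombinant))
        recombinant[i] = round((recombinant[i] + step * rand(-2:2)) / step) * step
        return recombinant
    end
    return mutate
end
```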
This issue follow #74 (now closed as the real problem has been identified).
Is it possible to provide a predicate for viable individuals at each generation so that mutation and crossover are remade if the individual is not viable?
(btw, is there a place to ask questions and chat about Evolutionary.jl?)
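Lacking built-in support, one generic workaround is to wrap the mutation operator in a rejection loop; viable is a user-supplied predicate, and all names here are a sketch rather than package API:

```julia
# Retry mutation until the predicate accepts the individual, with a cap
# on retries to avoid an infinite loop on hard constraints.
function constrained(mutation, viable; retries = 100)
    function mutate(recombinant)
        for _ in 1:retries
            candidate = mutation(copy(recombinant))
            viable(candidate) && return candidate
        end
        return recombinant   # give up: keep the parent unchanged
    end
    return mutate
end
```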
There are some errors using this with Julia 0.5.0. If you change the following lines in Evolutionary.jl
and ga.jl, it seems to work fine.
Evolutionary.jl:22: typealias Individual Union{Vector, Matrix, Function, Void}
ga.jl:17: lowerBounds::Union{Void, Vector} = nothing,
ga.jl:18: upperBounds::Union{Void, Vector} = nothing,
The latest registered version in the JuliaRegistries is v0.1.2, which is not compatible with Julia 1.0 or higher. Please register the latest version, v0.2.0.
Edit: Changed METADATA.jl to JuliaRegistries.
Hi Art,
I've spent another few hours playing with Evolutionary and I'm completely sold. This is really great! I love the fact that so much can be achieved through such a deceptively simple interface!
I have previously worked quite a lot with R's GA package. One of the features that I really enjoyed there was the fact that you can plot the convergence of the optimal and average fitness of the population. Something like this: http://imgur.com/v8Krt1Z. Would it be possible to implement something similar with Evolutionary? I realise that this would require you to store interim results for each generation, but I think that in terms of understanding the performance of the GA and being able to compare different crossover/mutation algorithms this would be very valuable.
Best regards,
Andrew.
After entering pkg> add https://github.com/wildart/Evolutionary.jl.git#v0.2.0, I get the error message:
ERROR: syntax: extra token "https" after end of expression
One can specify the ConstraintBounds as the optimizer input; however, it is more likely designed for Lagrange multipliers. It does take effect during population generation of CMAES:
Line 147 in af63c95
Currently, I need bounds to which generated populations can be clipped, like
clip!(generated_population, bounds)
Do you think it is proper to add a new bounds field to CMAES? Naively, the bounds type should contain a vector of upper bounds and a vector of lower bounds. Any comments?
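The proposed clip! could be as simple as an element-wise clamp over the generated population; a sketch of the suggestion, not existing package code:

```julia
# Clamp every individual in-place to the element-wise [lower, upper] box.
function clip!(population::AbstractVector{<:AbstractVector}, lower, upper)
    for individual in population
        @. individual = clamp(individual, lower, upper)
    end
    return population
end
```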
I see the following error, might be due to some dependencies
On running -> Pkg.test("Evolutionary")
INFO: Testing Evolutionary
Running tests:
FAILED: sphere.jl
LoadError: LoadError: MethodError: no method matching Union(::Type{Array{T,1}}, ::Type{Array{T,2}}, ::Type{Function}, ::Type{Void})
Closest candidates are:
Union{T}(::Any) at sysimg.jl:53
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/src/Evolutionary.jl, in expression starting on line 22
while loading /home/rishab/.julia/v0.5/Evolutionary/test/sphere.jl, in expression starting on line 2
FAILED: rosenbrock.jl
LoadError: UndefVarError: es not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/rosenbrock.jl, in expression starting on line 20
FAILED: schwefel.jl
LoadError: UndefVarError: cmaes not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/schwefel.jl, in expression starting on line 19
FAILED: rastrigin.jl
LoadError: UndefVarError: es not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/rastrigin.jl, in expression starting on line 27
FAILED: n-queens.jl
LoadError: UndefVarError: inversion not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/n-queens.jl, in expression starting on line 28
INFO: Evolutionary tests passed
The tests seem to pass, yet the source of the errors above is unknown to me.
Hey there,
I'm using Evolutionary, great package, so thanks for developing it. I was wondering if it's possible, via a callback or something like that, to save all individuals and their associated fitness function values? Since I'm only solving a 2-dimensional problem, and in fact very often, it would be nice to get an idea of how multi-modal my function is; thus, I would like to plot the fitness function over the 2D parameter space.
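One package-agnostic workaround is to log every evaluation from inside the objective itself, so the optimizer never needs to expose its internals; fitness below is a stand-in for the user's objective:

```julia
# Record every evaluated individual and its fitness via a closure.
fitness(x) = sum(abs2, x)           # stand-in objective for the sketch
history = Tuple{Vector{Float64}, Float64}[]
function logged_fitness(x)
    fx = fitness(x)
    push!(history, (copy(x), fx))   # copy: the optimizer may mutate x
    return fx
end
# Pass `logged_fitness` to Evolutionary.optimize, then scatter-plot `history`.
```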
Hey, we wanted to add Evolutionary.jl as one of the optimization backends in DiffEqFlux.jl https://github.com/SciML/DiffEqFlux.jl/issues/186, but not having a single function to call for different algorithms makes it a little awkward to handle. If you think this would be a good idea I could start working on a PR for this?
Currently, CMAES only works with one-dimensional individuals. It's possible to accept multi-dimensional individuals and vectorize them, keeping the dimensional information in the state and reshaping an individual into its original form when the minimizer is requested. Currently, a matrix of individuals is transformed into a vector of vectors, which doesn't make much sense.
I can't get the mixed-integer example code to run as is. I get UndefVarError for MixedTypePenaltyConstraints, WorstFitnessConstraints and MILX.
Consider parallelization of the algorithms in multiple modes:
I'm currently looking at a problem I'm trying to apply GA to which has 2D individuals. This implementation only supports 1D individuals but I don't see any reason not to support individuals of arbitrary dimension.
Major changes required are to getIndividual and population generation. The easy way to do this would probably be to change the dimensionality parameter N to instead specify the type of an individual.
What are your thoughts?
Provide non-Greek (in English letters) constructor parameters as a fallback. See #54.
Hey @wildart, I just wanted to let you know that this package generates a crazy amount of warnings on 0.4 due to indexing with non-integers. So much so that it generated 16 MB of deprecation warnings before a timeout killed it on PackageEvaluator. Something to look into if you get a chance!
I have been comparing performance of CMAES() with BlackBoxOptim on a 50-dimensional problem. BlackBoxOptim has been giving qualitatively better results, and even when I initialise CMAES() with the solution from BBO, the candidate solution it (CMAES) produces (after 1500 iterations) is an order of magnitude worse (in terms of objective) than the initial value.
Am I missing something?
I am just using the basic res=Evolutionary.optimize(f, b, CMAES())
Thanks for any help!
PS this is just a test example of lasso linear regression
Currently the initial σ of CMAES is fixed to 1.0. I am about to submit a PR to allow users to specify it manually. Wondering if it is a good move?
Hi, thanks for the package!
I'm running into a couple of errors with PM and MIPM:
using Evolutionary
f(x) = x[1]^2 + 1
x0 = [0]
res = Evolutionary.optimize(f, x0, GA(selection=ranklinear(1), mutation=PM([-1.0], [3.0])))
ERROR: AssertionError: Need to set p_int for integer variables
I think this is because this line needs to pass a dummy value for p_float before p:
Evolutionary.jl/src/mutations.jl
Line 189 in 8555f34
When I try with MIPM I get this:
julia> res = Evolutionary.optimize(f, x0, GA(selection=ranklinear(1), mutation=MIPM([-1.0], [3.0])))
ERROR: InexactError: Int64(-0.9692105629432188)
Which I think is because this line will result in a float always:
Evolutionary.jl/src/mutations.jl
Line 209 in 8555f34
The second problem goes away when the initial condition is a vector of floats - should it always be this or is the intention for it to work with a vector of ints?
Cheers
Hello,
in Julia 0.5.2 the using instruction gives this error:
julia> using Evolutionary
ERROR: LoadError: MethodError: no method matching Union(::Type{Array{T,1}}, ::Type{Array{T,2}}, ::Type{Function}, ::Type{Void})
Closest candidates are:
  Union{T}(::Any) at sysimg.jl:53
 in include_from_node1(::String) at .\loading.jl:488
 in eval(::Module, ::Any) at .\boot.jl:234
 in require(::Symbol) at .\loading.jl:415
while loading C:\Users\a.peano\.julia\v0.5\Evolutionary\src\Evolutionary.jl, in expression starting on line 22
Do you have any idea on how to bypass it?
Thanks
This issue is used to trigger TagBot; feel free to unsubscribe.
If you haven't already, you should update your TagBot.yml to include issue comment triggers. Please see this post on Discourse for instructions and more details. If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!
I've been trying to run GA on my custom structure, Organism, but it's giving the following error:
result = Evolutionary.optimize(
fitness,
() -> init_organism("msa_data/temp.tfa"),
GA(
populationSize = 50,
))
Error
MethodError: no method matching NLSolversBase.NonDifferentiable(::typeof(Main.workspace3.fitness), ::Main.workspace3.Organism)
Closest candidates are:
NLSolversBase.NonDifferentiable(::Any, ::Any, !Matched::AbstractArray) at C:\Users\aadim\.julia\packages\NLSolversBase\QPnui\src\objective_types\nondifferentiable.jl:21
NLSolversBase.NonDifferentiable(::Any, ::TF, !Matched::TX, !Matched::Array{Int64,1}) where {TF, TX} at C:\Users\aadim\.julia\packages\NLSolversBase\QPnui\src\objective_types\nondifferentiable.jl:3
NLSolversBase.NonDifferentiable(::Any, ::Any, !Matched::AbstractArray, !Matched::Union{Real, AbstractArray}) at C:\Users\aadim\.julia\packages\NLSolversBase\QPnui\src\objective_types\nondifferentiable.jl:21
...
optimize(::Function, ::Evolutionary.NoConstraints, ::Function, ::Evolutionary.GA, ::Evolutionary.Options{Nothing})@optimize.jl:30
[email protected]:13[inlined]
top-level scope@Local: 1
The init_organism function initializes an object of type Organism and returns it, while the fitness function takes an Organism as input and returns a Number as the fitness value. I haven't defined the crossover and mutation functions for now, as I only want to see if the GA is being initialized well.
function fitness(organism::Organism)
sum = 0
if !ismissing(organism.alignment)
alignments = collect(values(organism.alignment))
for i=(1:length(alignments)-1), j=(i+1:length(alignments))
a = alignments[i]
b = alignments[j]
sum += pairwise_score(
a, b,
matrix=organism.scoring_matrix.matrix,
gop=organism.gop, gep=organism.gep)
end
end
sum
end
After a thorough search, I'm unable to figure out the way to solve this problem/error. If anyone could give me some helpful pointers to solve this error message, I'd be grateful. Thanks
Is there any interest in a wide variety of selection methods?
I have some code around that implements Stochastic Acceptance and Truncated Rank-Based selection as well as Unbiased Tournament selection.
Thanks for this package, it looks great.
We have the implementation of CMA-ES and ES etc on the todo list for the BlackBoxOptim.jl (BBO) package but since this seems to be a nice implementation I wonder if you would be ok with us basing our implementation of them on your code?
My interpretation is that your license does allow it but I want to make sure anyway. I see value in both having this smaller and more direct implementation and including it in a framework/lib like BBO where one can easily switch between different optimization algorithms.
My code has the following first line:
using Evolutionary
Then it appears that error:
WARNING: deprecated syntax "typealias Strategy Dict{Symbol,Any}" at C:\Users\Vilmar.julia\v0.6\Evolutionary\src\Evolutionary.jl:22.
Use "const Strategy = Dict{Symbol,Any}" instead.
WARNING: deprecated syntax "typealias Individual Union(Vector,Matrix,Function,Nothing)" at C:\Users\Vilmar.julia\v0.6\Evolutionary\src\Evolutionary.jl:23.
Use "const Individual = Union(Vector,Matrix,Function,Nothing)" instead.
ERROR: LoadError: LoadError: MethodError: no method matching Union(::Type{Array{T,1} where T}, ::Type{Array{T,2} where T}, ::Type{Function}, ::Type{Void})
Stacktrace:
[1] include_from_node1(::String) at .\loading.jl:576
while loading C:\Users\Vilmar.julia\v0.6\Evolutionary\src\Evolutionary.jl, in expression starting on line 22
while loading C:\Users\Vilmar\Documents\Otimizacao Comb\Programas\knapsack_ga.jl, in expression starting on line 1
What should I do?
As described in https://discourse.julialang.org/t/ann-plans-for-removing-packages-that-do-not-yet-support-1-0-from-the-general-registry/ we are planning on removing packages that do not support 1.0 from the General registry. This package has been detected to not support 1.0 and is thus slated to be removed. The removal of packages from the registry will happen approximately a month after this issue is open.
To transition to the new Pkg system using Project.toml, see https://github.com/JuliaRegistries/Registrator.jl#transitioning-from-require-to-projecttoml.
To then tag a new version of the package, see https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app.
If you believe this package has erroneously been detected as not supporting 1.0 or have any other questions, don't hesitate to discuss it here or in the thread linked at the top of this post.
I would think that when setting up the GA with a setting utilizing elite, the best fit in each population should be guaranteed to survive. It seems it is possible to overwrite, for example, the best fit with one that isn't. I would expect that if we select a certain percentage or number of elites to move to the next population, the best n of them would be available. I have seen multiple runs where my initial population contains the best known to date, yet the best in a later population is actually worse than the best I set in the initial population.
I can trace the problem in ga.jl to the code below. The subs = rand(1:populationSize) and offspring[subs] = population[fitidx[i]] can easily overwrite the most elite individual with one that isn't the best. I had a run where the best of a population was at location 10 and was assigned to position 19 in offspring; then another elite that wasn't as good as the best was assigned over that location 19. Now the best in the population doesn't move forward.
if elite > 0
for i in 1:elite
subs = rand(1:populationSize)
debug && println("ELITE $(fitidx[i])=>$(subs): $(population[fitidx[i]]) => $(offspring[subs])")
offspring[subs] = population[fitidx[i]]
end
end
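A minimal sketch of a fix (not the package's actual code): copy the i-th best into the i-th offspring slot, so distinct elites can never collide:

```julia
# Deterministic slots guarantee that no elite overwrites a better one.
if elite > 0
    for i in 1:elite
        offspring[i] = population[fitidx[i]]
    end
end
```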
I am trying to use the package, but I get the following error when I try to call using Evolutionary:
LoadError: LoadError: UndefVarError: Nothing not defined
while loading C:\Users\thiem\.julia\v0.5\Evolutionary\src\Evolutionary.jl, in expression starting on line 22
while loading In[1], in expression starting on line 1
in include_from_node1(::String) at .\loading.jl:488
in eval(::Module, ::Any) at .\boot.jl:234
in require(::Symbol) at .\loading.jl:415
julia> using Evolutionary
[ Info: Precompiling Evolutionary [86b6b26d-c046-49b6-aa0b-5f0f74682bd6]
julia> opt = CMAES(μ=5, λ=20)
CMAES(5, 20, NaN, NaN, NaN)
julia> Evolutionary.optimize(x->sum(x.^2), randn(2), opt)
* Status: success
* Candidate solution
Minimizer: [2.4612980555252555e-7, 4.3533606672211497e-7]
Minimum: 2.500973721704058e-13
Iterations: 49
* Found with
Algorithm: CMAES
julia> using Evolutionary
[ Info: Precompiling Evolutionary [86b6b26d-c046-49b6-aa0b-5f0f74682bd6]
julia> opt = CMAES(μ=5, λ=20)
CMAES{Float64}(5, 10, NaN, NaN, NaN, NaN, 1.0, 1.0, [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
julia> Evolutionary.optimize(x->sum(x.^2), randn(2), opt)
* Status: success
* Candidate solution
Minimizer: [-0.004843917777738457, 0.19663907163798566]
Minimum: 0.038690388034086344
Iterations: 491
* Found with
Algorithm: (5,10)-CMA-ES