
Evolutionary.jl's People

Contributors

0x47, aahaselgrove, abhigupta768, agctute, daviehh, giggleliu, haydensolo, honeypot95, juliatagbot, matago, miguelbiron, roger-luo, timholy, tyde, vchuravy, wildart


Evolutionary.jl's Issues

Cannot get tutorial example to work

Hello-

Thank you for putting this package together. Unfortunately, I am having trouble getting the tutorial example to work. The algorithm seems to be stuck at the initial values.

res = Evolutionary.optimize(x -> -sum(x),
                            BitVector(zeros(3)),
                            GA(),
                            Evolutionary.Options(iterations=10, show_trace=true))

Here are the results:

Iter     Function value
     0                0
 * time: 8.416175842285156e-5
     1                0
 * time: 0.0001971721649169922
     2                0
 * time: 0.00025010108947753906
     3                0
 * time: 0.00028705596923828125
     4                0
 * time: 0.0003161430358886719
     5                0
 * time: 0.0003452301025390625
     6                0
 * time: 0.00038123130798339844
     7                0
 * time: 0.000408172607421875
     8                0
 * time: 0.0004360675811767578
     9                0
 * time: 0.0004742145538330078
    10                0
 * time: 0.0005102157592773438
    11                0
 * time: 0.0005371570587158203

 * Status: success

 * Candidate solution
    Minimizer:  [false, false, false]
    Minimum:    0
    Iterations: 11

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

If I understand correctly, the minimizer should be [true,true,true] with a minimum of -3. I tried changing parameters of GA and in Options, but to no avail. Am I doing something incorrectly?
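
One likely explanation (a guess, not a confirmed answer): the default GA() mutation is a no-op (x -> x, as noted in the in-place mutation issue further down), so a binary population never changes. A sketch of the same call with explicit binary operators (operator names as used elsewhere in this tracker; exact availability depends on the package version):

res = Evolutionary.optimize(x -> -sum(x),
                            BitVector(zeros(3)),
                            GA(populationSize = 50,
                               selection = tournament(3),  # assumed tournament-selection factory
                               crossover = singlepoint,    # single-point crossover for bit strings
                               mutation  = flip),          # bit-flip mutation
                            Evolutionary.Options(iterations = 10, show_trace = true))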

possible test failure in upcoming Julia version 1.5

A PkgEval run for a Julia pull request which changes the numbers generated by rand(a:b) indicates that the tests of this package might fail on Julia 1.5 (and on Julia's current master branch).

Also, you might be interested in using the new StableRNGs.jl registered package, which provides guaranteed stable streams of random numbers across Julia releases.

Apologies if this is a false positive. Cf. https://github.com/JuliaCI/NanosoldierReports/blob/ab6676206b210325500b4f4619fa711f2d7429d2/pkgeval/by_hash/52c2272_vs_47c55db/logs/Evolutionary/1.5.0-DEV-87d2a04de3.log

Support for binary chromosomes?

Hi,

I have been exploring Julia's GA capabilities, looking at GeneticAlgorithms and Evolutionary packages. I really enjoy the simple interface presented by Evolutionary and it also seems to be more advanced in terms of development. Great work and thank you for your efforts.

However, I am having trouble figuring out how I can implement a binary GA. I am using a knapsack problem as a simple illustrative example. I've written a fitness function which looks like this:

mass    = [1, 5, 3, 7, 2, 10, 5]
utility = [1, 3, 5, 2, 5,  8, 3]

function fitness(n::Vector{Int})
    total_mass = sum(mass .* n)
    return (total_mass <= 20) ? sum(utility .* n) : 0
end

but when I try to run the GA as follows

ga(fitness, 7)

I get

ERROR: wrong number of arguments
 in anonymous at /home/colliera/.julia/v0.3/Evolutionary/src/ga.jl:23
 in ga at /home/colliera/.julia/v0.3/Evolutionary/src/ga.jl:66

Looking at the debug output from ga() I see that the vectors being passed into my fitness function are floating point rather than binary (or at least a vector of integer 0 or 1). I'm not sure whether this is the source of the ERROR above, but it's certainly something that I need to resolve in order to get my problem up and running.

Is there support for binary chromosomes? Have I missed something?

Best regards,
Andrew.
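
For reference, a minimal sketch of a binary knapsack setup with the ga() interface of this era (keyword names taken from other issues in this tracker; the operator choices are assumptions, and the fitness sign may need flipping depending on whether ga() maximizes or minimizes):

mass    = [1, 5, 3, 7, 2, 10, 5]
utility = [1, 3, 5, 2, 5,  8, 3]

# Accept any 0/1 vector rather than restricting the argument to Vector{Int}.
fitness(n) = (sum(mass .* n) <= 20) ? sum(utility .* n) : 0

best, = ga(fitness, 7,
           initPopulation = n -> rand(Bool, n),  # binary chromosomes
           mutationRate = 0.2,
           iterations = 1000)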

Early stopping doesn't behave as expected

For GA, based on the default parameters (tol = 0.0, tolIter = 10), I would expect the algorithm to terminate after 10 iterations with no improvement on the objective, and I feel that this is logical default behaviour. However, in practice, early stopping does not occur without setting tol to a positive value.

Advice on making the interfaces extensible

This is a very neat repo for evolutionary algorithms! I really like it, but please allow me to offer two pieces of advice to make the interface better:

  1. Split the bodies of functions into multiple shorter functions; this is always feasible in Julia.
  2. Provide an iterator interface to the algorithms, like:
for (count, runtime_info) in enumerate(optimize(objfunc, individual, args...))
    # inspect your program
    @show runtime_info.best
    # exit condition
    (count > 100 || runtime_info.precision < 1e-10) && break
end

These changes have practical importance:

  1. Making functions shorter lets us dispatch on and overload specific functionality. For example, making populate! a function, as in PR #26, would help people implement parallelism (e.g. MPI) much more easily. Keeping functions short also helps with the open issues #15 and #2, since people could overload small functions like select(::ProblemType, current_population, current_fitness) to customize the parent-selection strategy, or generate(::ProblemType, parents) to generate the new generation within bounds (a sketch follows at the end of this issue).
  2. The iterator interface can be regarded as a clean version of a callback function: many people want to inspect their program during optimization, and an iterator provides full access to the runtime information.
  3. The iterator interface also requires fewer input parameters, such as iterations, tol, and verbose, because people can decide intuitively when to break out of the loop.

See this PR for details (it is a demo rather than a real PR; a real one would require more discussion):
#26
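
Below is a rough sketch of what the dispatch-based split from point 1 could look like (illustrative type and function names only, not the package's actual API):

abstract type ProblemType end
struct Knapsack <: ProblemType end

# Generic defaults the package could ship:
select(::ProblemType, population, fitness) = sortperm(fitness)[1:2]      # pick two parents by fitness
generate(::ProblemType, parents) = map(copy, parents)                    # clone parents as offspring

# A user-defined problem overloads only the pieces it needs:
select(::Knapsack, population, fitness) = rand(1:length(population), 2)  # custom parent selection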

[help] A bit at a loss on what happens with my initial population

Hi there!

I am a bit of a beginner with GAs, so sorry if this question is very basic. I use the code below (sorry, it is hard for me to make an MWE, as it is part of a package I am writing). The function f has been tested independently.

function _optimize(icn, X; ga = _icn_ga, metric = hamming, pop_size = 100)
    f(weigths) = _loss(X, icn, weigths, metric)

    _icn_ga = GA(populationSize = pop_size, crossoverRate = 0.8, ɛ = .05, selection = tournament, crossover = singlepoint, mutation = flip)
    # problem also occurs with GA() and pop_size = 50

    pop = generate_population(icn, pop_size)

    optimize(f, pop, _icn_ga)
end

I am seeing some very strange behavior: when f is called, the weights argument is an all-zeros vector, even though my initial population does not contain any such individuals.

So, my question is the following: am I missing some setting that makes the GA produce only all-zero individuals?

Side question: is it possible to provide a predicate for viable individuals at each generation, so that mutation and crossover are redone if an individual is not viable?
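
On the side question: nothing here suggests a built-in predicate hook, but one possible workaround is to wrap the mutation operator so that it retries until the predicate accepts the result (a hypothetical helper, illustrative only):

# Hypothetical wrapper: retry `mutate` until `isviable` accepts the result, with a retry cap.
function viable_mutation(mutate, isviable; retries = 100)
    return function (genome)
        for _ in 1:retries
            candidate = mutate(copy(genome))
            isviable(candidate) && return candidate
        end
        return genome   # give up and keep the original genome
    end
end

# e.g. GA(...; mutation = viable_mutation(flip, w -> !all(iszero, w)))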

can't store iters, fitness, and best minimizer via callback

I have been trying to store

  1. the current number of iterations,
  2. the best fitness found so far, and
  3. the genome of the individual with that fitness

through a callback function, and have been trying to work with the trace to get the right information into the callback.

When I don't override the trace! function, in the callback I get
fieldnames(typeof(oc)) = (:iteration, :value, :metadata)
for the value of oc.

If I override the trace! function and set store_trace to true, then no matter what I add to the metadata dictionary I get
fieldnames(typeof(tr)) = ()
for the tr value that I can see being passed in that case.

I feel like this should be possible.

Maybe one solution would be to give the callback function an intermediate state of the results such that this would work:

function callback(intermediate_res)
    iters = Evolutionary.iterations(intermediate_res)
    best_indv = Evolutionary.minimizer(intermediate_res)
    best_fit = minimum(intermediate_res)
    .
    .
    # code to save the values
    .
    return false
end

error with scramble mutation

Hello,

I am trying to optimise an array of booleans:

init = ()->Bool[rand(Bool) for el in tc]
opts = Evolutionary.Options(iterations=500)
mthd = GA(populationSize=150, crossoverRate=0.5, mutationRate=0.2, selection=susinv, crossover=uniform(), mutation=scramble())
Random.seed!(10);
result = Evolutionary.optimize(f1, init, mthd, opts)

but I keep getting this error:

ERROR: LoadError: MethodError: no method matching scramble()
Closest candidates are:
  scramble(!Matched::T) where T<:(AbstractArray{T,1} where T) at /home/nsalvi/.julia/packages/Evolutionary/BIk3j/src/mutations.jl:225

I am a bit lost, can anyone help me with that?
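
Judging from the method signature in the error, scramble in this version appears to be the mutation function itself rather than a factory, so a hedged guess at the fix is to pass it uncalled:

# Hypothetical fix inferred from the error: pass `scramble` itself, not `scramble()`.
mthd = GA(populationSize=150, crossoverRate=0.5, mutationRate=0.2,
          selection=susinv, crossover=uniform(), mutation=scramble)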

ga fails to minimize functions with negative range

Example:

best, = ga(x->-sum(x), 
          5,
          initPopulation=n->rand(Bool, n),
          iterations=10000,
          mutationRate = 0.2,
          selection = roulette
          )

I think this is because you maximize the inverse of the provided objective function. Is there a reason to do this rather than just directly minimizing the function? (Or I guess, maximizing the negation of the function)
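
To illustrate the distinction (a sketch, not the package's code): a fitness defined as 1/f breaks down for selection schemes such as roulette when f is negative, whereas -f stays well behaved.

f(x) = -sum(x)                 # objective to minimize; negative whenever any gene is true
x = [true, true, true]

inverse_fitness = 1 / f(x)     # -1/3: a negative "fitness", unusable as a roulette probability
negated_fitness = -f(x)        #  3:   nonnegative, safe to maximize with roulette selection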

Can not add the package

When I add the package, Julia gives me an error. I think it is related to the package's version requirements.

(v1.3) pkg> add Evolutionary
 Resolving package versions...
ERROR: Unsatisfiable requirements detected for package UnPack [3a884ed6]:
 UnPack [3a884ed6] log:
 ├─possible versions are: [0.1.0, 1.0.0-1.0.1] or uninstalled
 ├─restricted to versions 1.0.1 by an explicit requirement, leaving only versions 1.0.1
 └─restricted by compatibility requirements with Evolutionary [86b6b26d] to versions: 0.1.0 — no versions left
   └─Evolutionary [86b6b26d] log:
     ├─possible versions are: [0.3.0, 0.4.0, 0.5.0, 0.6.0-0.6.1] or uninstalled
     └─restricted to versions * by an explicit requirement, leaving only versions [0.3.0, 0.4.0, 0.5.0, 0.6.0-0.6.1]

Return of status

Hi, I am wondering how I can get the status from the result variable. Is there a function that can be called on the result to return the status (success or failure)? Thanks.

ga() parameters `lowerBounds` and `upperBounds` do nothing

It seems that the lowerBounds and upperBounds parameters of the genetic algorithm have no effect. Without those bounds, the search space for integer variables is too big and the algorithm fails to find a solution. Any chance those parameters could be implemented?

1.0 compatibility

Hi :) very nice library, thanks. I'm wondering if you're planning on updating this to work with 1.0 any time soon? Currently it doesn't compile (the first error I'm getting is that Void is no longer a type, but there might be more).

Error: objects of type Float64 are not callable

Thanks for your work. There is an error in my code; could you please point out the mistake?

fitFunc(w)=w[1]-w[2]+w[3]-w[4]+w[5]-w[6]

lx = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
ux = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
con_w = w[1]+w[2]+w[3]+w[4]+w[5]+w[6]
lc = [1.0]
uc = [1.0]

mthd = GA(populationSize = 3, ɛ = 0.03, selection = susinv, crossover = intermediate(0.7), mutation = gaussian(2))
con = PenaltyConstraints(1e3, lx, ux, lc, uc, con_w)
Evolutionary.ConstraintBounds(lx,ux,lc,uc), con_w)
opts = Evolutionary.Options(iterations=15, successive_f_tol=25, show_trace=true)

res_Final = Evolutionary.optimize(fitFunc, con, con.bounds, mthd, opts)
m = Evolutionary.minimizer(res_Final)
minimum(res_Final)

There is an error at the call to the optimize function: "Error: objects of type Float64 are not callable".
In detail:
LoadError: MethodError: objects of type Float64 are not callable
in expression starting at D:\gaspipeline_sto.jl:212
value(::PenaltyConstraints{Float64,Float64}, ::NonDifferentiable{Float64,Array{Float64,1}}, ::Array{Float64,1}) at constraints.jl:138
update_state!(::NonDifferentiable{Float64,Array{Float64,1}}, ::PenaltyConstraints{Float64,Float64}, ::Evolutionary.GAState{Float64,Array{Float64,1}}, ::Array{Array{Float64,1},1}, ::GA, ::Int64) at ga.jl:98
optimize(::NonDifferentiable{Float64,Array{Float64,1}}, ::PenaltyConstraints{Float64,Float64}, ::Array{Array{Float64,1},1}, ::GA, ::Evolutionary.Options{Nothing}, ::Evolutionary.GAState{Float64,Array{Float64,1}}) at optimize.jl:56
optimize(::NonDifferentiable{Float64,Array{Float64,1}}, ::PenaltyConstraints{Float64,Float64}, ::Array{Array{Float64,1},1}, ::GA, ::Evolutionary.Options{Nothing}) at optimize.jl:40
optimize(::Function, ::PenaltyConstraints{Float64,Float64}, ::NLSolversBase.ConstraintBounds{Float64}, ::GA, ::Evolutionary.Options{Nothing}) at optimize.jl:31
top-level scope at gaspipeline_sto.jl:212
include_string(::Function, ::Module, ::String, ::String) at loading.jl:1088
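
A hedged reading of the stack trace: value(::PenaltyConstraints, ...) tries to call the constraint, but con_w above is a plain Float64 expression (and references an undefined w). A sketch of the likely fix, keeping the constructor call from the snippet above and assuming the constraint must be a callable returning one value per entry of lc/uc:

# Hypothetical fix: define the constraint as a function of the decision vector.
con_w(w) = [w[1] + w[2] + w[3] + w[4] + w[5] + w[6]]   # one value per entry of lc/uc (assumption)
con = PenaltyConstraints(1e3, lx, ux, lc, uc, con_w)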

In-place mutation

I'm trying to find a Genetic Algorithms implementation for a project I'm working on, so have been looking through the code. Is mutation applied in-place, or do I misunderstand? On one hand, the code is clearly discarding the return value from mutation:

# Perform mutation
for i in 1:populationSize
    if rand() < mutationRate
        debug && println("MUTATED $(i)>: $(offspring[i])")
        mutation(offspring[i])
        debug && println("MUTATED >$(i): $(offspring[i])")
    end
end

But on the other hand, the sample mutations in mutations.jl (and the default mutation (x -> x)) all return a value and aren't suffixed with a !.
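
For reference, the two conventions being contrasted look roughly like this (illustrative only):

# In-place convention: the operator modifies its argument, so the return value may be discarded.
mutate!(x::AbstractVector{Bool}) = (i = rand(1:length(x)); x[i] = !x[i]; x)

# Out-of-place convention: the operator returns a new vector, so the caller must assign it back,
# e.g. offspring[i] = mutation(offspring[i]).
mutate(x::AbstractVector{Bool}) = (y = copy(x); i = rand(1:length(y)); y[i] = !y[i]; y)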

New version release?

Hello,
We are trying to provide Evolutionary.jl as a backend optimizer for the cost function in DiffEqParamEstim.jl; see SciML/DiffEqParamEstim.jl#25. The present release is not compatible with Julia 0.5 and 0.6. Do you plan to release a new version which supports recent Julia versions?
I am happy to help :)

Help wanted: solution to very simple problem?

Hi,

thanks for the cool package.

I recently started to do some research within the field of Evolutionary Strategies.

I have a question regarding a very simple problem:

indiv = [0.0,0.0]
lowerbound = [0.0,0.0]
upperbound = [20.0,20.0]

function fit(x)
    if(sum(x) == 4.0)
        return -1
    else
        return +1
    end
end

Evolutionary.optimize(fit,lowerbound,upperbound,indiv,CMAES(),Evolutionary.Options(iterations = 1000))

>Status = success
>Candidate solution
> Minimizer = [6.216071822584697, 2.911369416556109]
> Minimum = 1.0
> Iterations = 12

>Found with
> Algorithm: (15,30)-CMA-ES

I do not understand why the algorithm yields infeasible solutions, returns the status "success", and always stops after 12 iterations.

Another question:

Is there a way to mutate the genotype so that the mutation is a multiple of a certain step size?
The mutation should do something like this:
with step = 0.25: [1.495763498, 1.23478634] --> [1.75, 1.23478634]. Is there such an algorithm in this package?
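
One way to get the step-constrained behavior described above is a custom mutation that snaps a randomly chosen gene onto the step grid (a hypothetical helper; it assumes mutation operators simply receive and modify the genome vector):

function step_mutation(step::Float64)
    function mutate(genome::AbstractVector{Float64})
        i = rand(1:length(genome))
        genome[i] = round(genome[i] / step) * step   # snap gene i to the nearest multiple of `step`
        return genome
    end
    return mutate
end

# e.g. step_mutation(0.25)([1.495763498, 1.23478634]) can yield [1.5, 1.23478634]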

I really hope I was able to explain my problems understandably.

Thanks

Errors with Julia 0.5.0

There are some errors using this with Julia 0.5.0. If you change the following lines in Evolutionary.jl
and ga.jl, it seems to work fine.

Evolutionary.jl:22: typealias Individual Union{Vector, Matrix, Function, Void}
ga.jl:17: lowerBounds::Union{Void, Vector} = nothing,
ga.jl:18: upperBounds::Union{Void, Vector} = nothing,

Plotting interim results

Hi Art,

I've spent another few hours playing with Evolutionary and I'm completely sold. This is really great! I love the fact that so much can be achieved through such a deceptively simple interface!

I have previously worked quite a lot with R's GA package. One of the features that I really enjoyed there was the fact that you can plot the convergence of the optimal and average fitness of the population. Something like this: http://imgur.com/v8Krt1Z. Would it be possible to implement something similar with Evolutionary? I realise that this would require you to store interim results for each generation, but I think that in terms of understanding the performance of the GA and being able to compare different crossover/mutation algorithms this would be very valuable.

Best regards,
Andrew.

How to set `bounds` properly?

One can pass ConstraintBounds as an optimizer input; however, it seems to be designed mainly for Lagrange multipliers.

It does not take effect during population generation in CMAES:

offspring[i] = parent + σ * B * D * z[:,i]

Currently, I need bounds that the generated population can be clipped to, something like

clip!(generated_population, bounds)

Do you think it is appropriate to add a new bounds field to CMAES? Naively, the bounds type would contain a vector of upper bounds and a vector of lower bounds. Any comments?
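
A minimal sketch of the proposed clipping, assuming hypothetical names and simple element-wise clamping of each individual to the box bounds:

struct BoxBounds{T}
    lower::Vector{T}
    upper::Vector{T}
end

function clip!(population::AbstractVector{<:AbstractVector}, bounds::BoxBounds)
    for individual in population
        @. individual = clamp(individual, bounds.lower, bounds.upper)
    end
    return population
end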

Test showing error on master package

I see the following errors; they might be due to some dependencies.

On running -> Pkg.test("Evolutionary")

INFO: Testing Evolutionary
Running tests:
FAILED: sphere.jl
LoadError: LoadError: MethodError: no method matching Union(::Type{Array{T,1}}, ::Type{Array{T,2}}, ::Type{Function}, ::Type{Void})
Closest candidates are:
Union{T}(::Any) at sysimg.jl:53
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/src/Evolutionary.jl, in expression starting on line 22
while loading /home/rishab/.julia/v0.5/Evolutionary/test/sphere.jl, in expression starting on line 2
FAILED: rosenbrock.jl
LoadError: UndefVarError: es not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/rosenbrock.jl, in expression starting on line 20
FAILED: schwefel.jl
LoadError: UndefVarError: cmaes not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/schwefel.jl, in expression starting on line 19
FAILED: rastrigin.jl
LoadError: UndefVarError: es not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/rastrigin.jl, in expression starting on line 27
FAILED: n-queens.jl
LoadError: UndefVarError: inversion not defined
in macro expansion; at /home/rishab/.julia/v0.5/Evolutionary/test/runtests.jl:25 [inlined]
in anonymous at ./:?
in include_from_node1(::String) at ./loading.jl:488
in process_options(::Base.JLOptions) at ./client.jl:265
in _start() at ./client.jl:321
while loading /home/rishab/.julia/v0.5/Evolutionary/test/n-queens.jl, in expression starting on line 28
INFO: Evolutionary tests passed

"The test seems to be passed yet the source of errors is unknown to me"

Plot Fitness Function for all evaluated points

Hey there,
I'm using Evolutionary; great package, so thanks for developing it. I was wondering if it's possible, via a callback or something like that, to save all individuals and their associated fitness values? Since I'm only solving a 2-dimensional problem, and doing so very often, it would be nice to get an idea of how multi-modal my function is, so I would like to plot the fitness function over the 2D parameter space.

CMAES: Accept multi-dimensional individuals

Currently, CMAES only works with one-dimensional individuals. It's possible to accept multi-dimensional individuals and vectorize them, keeping the dimensional information in the state and reshaping an individual into its original form when the minimizer is requested.

Currently, a matrix of individuals is transformed into a vector of vectors, which doesn't make much sense.
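
A sketch of the vectorize/reshape round trip described above (plain Julia, not the package's internals):

individual = rand(3, 4)            # a 2-D individual
shape = size(individual)           # keep the dimensional information in the state
flat = vec(individual)             # optimize over the flattened vector internally
restored = reshape(flat, shape)    # reshape when the minimizer is requested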

Parallelization

Consider parallelization of the algorithms in multiple modes:

  • Single process (core)
  • Multi-threading
  • Multi-process (multi-core)
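
For instance, the multi-threading mode could parallelize fitness evaluation over the population, roughly like this (a sketch, not the package's implementation):

using Base.Threads

# Evaluate the objective for every individual on the available threads.
function evaluate!(fitness::AbstractVector, f, population::AbstractVector)
    @threads for i in eachindex(population)
        fitness[i] = f(population[i])
    end
    return fitness
end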

Support for multidimensional individuals

I'm currently looking at a problem I'm trying to apply GA to which has 2D individuals. This implementation only supports 1D individuals but I don't see any reason not to support individuals of arbitrary dimension.

Major changes required are to getIndividual and population generation. The easy way to do this would probably be to change the dimensionality parameter N to instead specify the type of an individual.

What are your thoughts?

Deprecation warnings on 0.4

Hey @wildart, I just wanted to let you know that this package generates a crazy amount of warnings on 0.4 due to indexing with non-integers. So much so that it generated 16 MB of deprecation warnings before a timeout killed it on PackageEvaluator. Something to look into if you get a chance!

Solution worse than initial guess

I have been comparing performance of CMAES() with BlackBoxOptim on a 50-dimensional problem. BlackBoxOptim has been giving qualitatively better results, and even when I initialise CMAES() with the solution from BBO, the candidate solution it (CMAES) produces (after 1500 iterations) is an order of magnitude worse (in terms of objective) than the initial value.

Am I missing something?

I am just using the basic res=Evolutionary.optimize(f, b, CMAES())

Thanks for any help!

PS this is just a test example of lasso linear regression

CMAES sigma parameter

Currently the initial σ of CMAES is fixed at 1.0. I am about to submit a PR to allow users to specify it manually.

Wondering if it is a good move?

Errors using PM and MIPM mutations

Hi, thanks for the package!

I'm running into a couple of errors with PM and MIPM

using Evolutionary
f(x) = x[1]^2 + 1
x0 = [0]
res = Evolutionary.optimize(f, x0, GA(selection=ranklinear(1), mutation=PM([-1.0], [3.0])))
ERROR: AssertionError: Need to set p_int for integer variables

I think this is because this line needs to pass a dummy value for p_float before p:

return mipmmutation(lower, upper, p)

When I try with MIPM I get this

julia> res = Evolutionary.optimize(f, x0, GA(selection=ranklinear(1), mutation=MIPM([-1.0], [3.0])))
ERROR: InexactError: Int64(-0.9692105629432188)

Which I think is because this line will result in a float always:

S = u .^ (1 ./ P) # random var following power distribution

The second problem goes away when the initial condition is a vector of floats - should it always be this or is the intention for it to work with a vector of ints?

Cheers

Issue while loading in Julia 0.5.2

Hello,
in Julia 0.5.2 the using instruction gives this error:
julia> using Evolutionary
ERROR: LoadError: MethodError: no method matching Union(::Type{Array{T,1}}, ::Type{Array{T,2}}, ::Type{Function}, ::Type{Void})
Closest candidates are:
  Union{T}(::Any) at sysimg.jl:53
 in include_from_node1(::String) at .\loading.jl:488
 in eval(::Module, ::Any) at .\boot.jl:234
 in require(::Symbol) at .\loading.jl:415
while loading C:\Users\a.peano\.julia\v0.5\Evolutionary\src\Evolutionary.jl, in expression starting on line 22

Do you have any idea on how to bypass it?

Thanks

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

no method matching NLSolversBase.NonDifferentiable

I've been trying to run GA on my custom structure, Organism, but it's giving the following error:

result = Evolutionary.optimize(
        fitness,
        () -> init_organism("msa_data/temp.tfa"),
        GA(
            populationSize = 50,
        ))

Error

MethodError: no method matching NLSolversBase.NonDifferentiable(::typeof(Main.workspace3.fitness), ::Main.workspace3.Organism)

Closest candidates are:

NLSolversBase.NonDifferentiable(::Any, ::Any, !Matched::AbstractArray) at C:\Users\aadim\.julia\packages\NLSolversBase\QPnui\src\objective_types\nondifferentiable.jl:21

NLSolversBase.NonDifferentiable(::Any, ::TF, !Matched::TX, !Matched::Array{Int64,1}) where {TF, TX} at C:\Users\aadim\.julia\packages\NLSolversBase\QPnui\src\objective_types\nondifferentiable.jl:3

NLSolversBase.NonDifferentiable(::Any, ::Any, !Matched::AbstractArray, !Matched::Union{Real, AbstractArray}) at C:\Users\aadim\.julia\packages\NLSolversBase\QPnui\src\objective_types\nondifferentiable.jl:21

...

optimize(::Function, ::Evolutionary.NoConstraints, ::Function, ::Evolutionary.GA, ::Evolutionary.Options{Nothing})@optimize.jl:30
[email protected]:13[inlined]
top-level scope@Local: 1

The init_organism function initializes an object of type Organism and returns it, while the fitness function takes an Organism as input and returns a Number as fitness value. I haven't defined the crossover and mutation functions for now, as I only want to see if the GA is being initialized well.

function fitness(organism::Organism)
	sum = 0
	if !ismissing(organism.alignment)
		alignments = collect(values(organism.alignment))
		for i=(1:length(alignments)-1), j=(i+1:length(alignments))
			a = alignments[i]
			b = alignments[j]
			sum += pairwise_score(
				a, b, 
				matrix=organism.scoring_matrix.matrix, 
				gop=organism.gop, gep=organism.gep)
		end
	end
	sum
end

After a thorough search, I'm unable to figure out how to solve this error. If anyone could give me some helpful pointers, I'd be grateful. Thanks!

Selection methods

Is there any interest in a wide variety of selection methods?

I have some code around that implements Stochastic Acceptance and Truncated Rank-Based selection as well as Unbiased Tournament selection.

Re-use code for BlackBoxOptim.jl?

Thanks for this package, it looks great.

We have the implementation of CMA-ES and ES etc on the todo list for the BlackBoxOptim.jl (BBO) package but since this seems to be a nice implementation I wonder if you would be ok with us basing our implementation of them on your code?

My interpretation is that your license does allow it but I want to make sure anyway. I see value in both having this smaller and more direct implementation and including it in a framework/lib like BBO where one can easily switch between different optimization algorithms.

Error while utilizing Julia 0.6.2 on Windows 10

My code has the following first line:
using Evolutionary

Then it appears that error:

WARNING: deprecated syntax "typealias Strategy Dict{Symbol,Any}" at C:\Users\Vilmar.julia\v0.6\Evolutionary\src\Evolutionary.jl:22.
Use "const Strategy = Dict{Symbol,Any}" instead.

WARNING: deprecated syntax "typealias Individual Union(Vector,Matrix,Function,Nothing)" at C:\Users\Vilmar.julia\v0.6\Evolutionary\src\Evolutionary.jl:23.
Use "const Individual = Union(Vector,Matrix,Function,Nothing)" instead.
ERROR: LoadError: LoadError: MethodError: no method matching Union(::Type{Array{T,1} where T}, ::Type{Array{T,2} where T}, ::Type{Function}, ::Type{Void})
Stacktrace:
[1] include_from_node1(::String) at .\loading.jl:576
while loading C:\Users\Vilmar.julia\v0.6\Evolutionary\src\Evolutionary.jl, in expression starting on line 22
while loading C:\Users\Vilmar\Documents\Otimizacao Comb\Programas\knapsack_ga.jl, in expression starting on line 1

What should I do?

Info about upcoming removal of packages in the General registry

As described in https://discourse.julialang.org/t/ann-plans-for-removing-packages-that-do-not-yet-support-1-0-from-the-general-registry/ we are planning on removing packages that do not support 1.0 from the General registry. This package has been detected to not support 1.0 and is thus slated to be removed. The removal of packages from the registry will happen approximately a month after this issue is open.

To transition to the new Pkg system using Project.toml, see https://github.com/JuliaRegistries/Registrator.jl#transitioning-from-require-to-projecttoml.
To then tag a new version of the package, see https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app.

If you believe this package has erroneously been detected as not supporting 1.0 or have any other questions, don't hesitate to discuss it here or in the thread linked at the top of this post.

The best elite could be overwritten.

I would think that when setting up the GA with elitism, the best-fit individual in each population should be guaranteed to survive. It seems it is possible to overwrite, for example, the best-fit individual with one that isn't.

I would expect that if we are selecting a certain percentage or number of elite individuals to move to the next population, the best n of them would be available.

In multiple runs where my initial population contains the best individual known to date, I have seen the best of a later population actually be worse than the best I placed in the initial population.

I can trace the problem in ga.jl to the code below. The subs = rand(1:populationSize) and offspring[subs] = population[fitidx[i]] lines can easily overwrite the most elite individual with one that isn't the best.

I had a run where the best of the population was at position 10 and was assigned to position 19 in offspring; then another elite individual that wasn't as good as the best was assigned over that same position 19, so the best of the population did not move forward.

if elite > 0
    for i in 1:elite
        subs = rand(1:populationSize)
        debug && println("ELITE $(fitidx[i])=>$(subs): $(population[fitidx[i]]) => $(offspring[subs])")
        offspring[subs] = population[fitidx[i]]
    end
end
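
One possible fix (a sketch, not the package's actual code) is to copy the elite into dedicated, non-overlapping offspring slots so that no elite individual can overwrite another:

# Hypothetical fix: reserve the first `elite` offspring slots for the best individuals.
if elite > 0
    for i in 1:elite
        offspring[i] = population[fitidx[i]]
    end
end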

Unable to use Package on julia v0.5

I am trying to use the package, but I get the following error when I call using Evolutionary:

LoadError: LoadError: UndefVarError: Nothing not defined
while loading C:\Users\thiem\.julia\v0.5\Evolutionary\src\Evolutionary.jl, in expression starting on line 22
while loading In[1], in expression starting on line 1

 in include_from_node1(::String) at .\loading.jl:488
 in eval(::Module, ::Any) at .\boot.jl:234
 in require(::Symbol) at .\loading.jl:415

CMAES optimizer is not working properly in recent versions

0.4.0

julia> using Evolutionary
[ Info: Precompiling Evolutionary [86b6b26d-c046-49b6-aa0b-5f0f74682bd6]

julia> opt = CMAES(μ=5, λ=20)
CMAES(5, 20, NaN, NaN, NaN)

julia> Evolutionary.optimize(x->sum(x.^2), randn(2), opt)

 * Status: success

 * Candidate solution
    Minimizer:  [2.4612980555252555e-7, 4.3533606672211497e-7]
    Minimum:    2.500973721704058e-13
    Iterations: 49

 * Found with
    Algorithm: CMAES

master branch

julia> using Evolutionary
[ Info: Precompiling Evolutionary [86b6b26d-c046-49b6-aa0b-5f0f74682bd6]

julia> opt = CMAES(μ=5, λ=20)
CMAES{Float64}(5, 10, NaN, NaN, NaN, NaN, 1.0, 1.0, [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])

julia> Evolutionary.optimize(x->sum(x.^2), randn(2), opt)

 * Status: success

 * Candidate solution
    Minimizer:  [-0.004843917777738457, 0.19663907163798566]
    Minimum:    0.038690388034086344
    Iterations: 491

 * Found with
    Algorithm: (5,10)-CMA-ES
