
GaussianFilters.jl

GaussianFilters implements methods to define and run Kalman, Extended Kalman, Unscented Kalman, and Gaussian-Mixture Probability Hypothesis Density Filters on simulated data. It also implements simulation functions for the Kalman-class filters.

Documentation

The documentation for the package can be found here: https://sisl.github.io/GaussianFilters.jl/latest

Installation

GaussianFilters can be installed by running:

using Pkg
Pkg.add("GaussianFilters")

Basic Usage

Basic usage consists of defining appropriate models, constructing an appropriate filter, and running the filter with known actions on some measurement data.

using GaussianFilters, LinearAlgebra

# dynamics model
A = [1 0.1; 0 1]
B = [0; 1]
W = [0.5 0; 0 0.5]
dmodel = LinearDynamicsModel(A, B, W)

# measurement model
measure(x, u) = LinearAlgebra.norm(x, 2)
V = [0.01]
omodel = NonlinearObservationModel(measure, V)

# filtering given some action and measurement
ukf = UnscentedKalmanFilter(dmodel, omodel)

# placeholder action and measurement, for illustration only
action = [0.0]
measurement = 0.0

b0 = GaussianBelief([0.0, 0.0], [1.0 0.0; 0.0 1.0])
b1 = update(ukf, b0, action, measurement)

See documentation and examples for more details.

Examples

Examples notebooks can be found in the notebooks folder:

Kalman Filter Example

Extended Kalman Filter Example

Unscented Kalman Filter Example

GM-PHD Object Surveillance Example

GM-PHD Aircraft Carrier Example

Contributors

benkmoore, benqlange, divbyzerofordummies, duncaneddy, jamgochiana, johannes-fischer, lassepe, maximebouton, radioflash, zprihoda, zsunberg


Issues

Consistency of `update` with POMDPs.jl

[Suggestion (4) of series starting in #14]

It would be really useful to make this interoperable with POMDPs.jl!! One thing that would help for that is consistency of the update function. Both POMDPs.update and GaussianFilters.update are very similar, except for the argument ordering:

POMDPs.update(filter, b, action, measurement)
GaussianFilters.update(b, action, measurement, filter)

We chose to put the filter first because it is the object that has the most "ownership" over the update function, i.e. the person who writes the filter type is most likely to write the corresponding update function. Would you consider switching the order of the arguments in this package to be consistent? That way people could easily switch between this and, for example, ParticleFilters.jl, and filters from this package could have plug and play compatibility with POMDPs.jl!! :)
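As a hypothetical illustration (the `AbstractFilter` supertype and the four-argument method are assumed, not confirmed package API), a one-line shim could even let both orders coexist during a deprecation period:

```julia
# Hypothetical shim: accept the POMDPs.jl argument order and forward to
# the existing GaussianFilters order.
update(filter::AbstractFilter, b, action, measurement) =
    update(b, action, measurement, filter)
```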

Additional filter implementations (DD2)

Some time ago, I played around with the DD2 Kalman filter--see "New developments in state estimation for nonlinear systems" by Nørgaard, Poulsen, Ravn in Automatica 36 (11), 1627-1638 (doi:10.1016/s0005-1098(00)00089-3; it's on scihub).

From what I can tell, this is kinda similar to the UKF filter (a nonlinear extension of classic Kalman filtering, not relying on explicitly supplied derivatives). It is based on Stirling interpolation, and claims better performance (specifically covariance estimation) than the UKF.

There is some Matlab reference code, but ready-to-use library implementations are not common (that I found).

I would base any work on the current UKF implementation (because some implementation details seem quite similar to me).
I also have some real world data to benchmark them against each other.

Would it make sense to add an implementation to this package?
Does anyone have any input/opinion on this (interesting/redundant/pointless/belongs elsewhere)?

Leaky abstraction of PHDFilter

The PHDFilter currently seems a bit out of place. It is not an AbstractFilter, and its update method expects a different signature:

function update(phd::PHDFilter, b0::GaussianMixture,

That is, it does not have the usual update(filter, belief, action, observation) layout and thus all the functionality of src/simulate.jl simply does not apply. It would be nice to streamline this, or remove the PHDFilter from this package, or find a useful abstraction for it.
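One low-effort option, sketched here under the assumption that the PHD update has no use for an action argument, would be a forwarding method with the common layout:

```julia
# Sketch: give PHDFilter the common update(filter, belief, action, observation)
# layout by forwarding to the existing method and discarding the action.
# Assumes an existing update(phd, b0, measurements) method.
update(phd::PHDFilter, b0::GaussianMixture, action, measurements) =
    update(phd, b0, measurements)
```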

Type instability from keyword argument

[Suggestion (3) in series started in #14]

I noticed here:

return return_prediction ? (bp, bn) : bn
that you return different types based on the value of an argument.

This results in type instability which can have significant performance effects. In the example below, the performance is reduced by a factor of 100 because of this.

(when there is no keyword present in the call, the compiler can deduce the return type, but if either b=true or b=false is present, it can't find the exact type; it knows only that it is a Union{Int64, Tuple{Int64,Int64}})

julia> function f(a; b = false)
           if b
               return a^2, a^3
           else
               return a^2
           end
       end
f (generic function with 1 method)

julia> using BenchmarkTools

julia> @benchmark f(2, b=false)
BenchmarkTools.Trial: 
  memory estimate:  0 bytes
  allocs estimate:  0
  --------------
  minimum time:     2.095 ns (0.00% GC)
  median time:      2.264 ns (0.00% GC)
  mean time:        2.313 ns (0.00% GC)
  maximum time:     16.213 ns (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1000

julia> @benchmark f(2, b=$false)
BenchmarkTools.Trial: 
  memory estimate:  0 bytes
  allocs estimate:  0
  --------------
  minimum time:     2.170 ns (0.00% GC)
  median time:      2.536 ns (0.00% GC)
  mean time:        2.501 ns (0.00% GC)
  maximum time:     474.652 ns (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1000

julia> @benchmark f(2)
BenchmarkTools.Trial: 
  memory estimate:  0 bytes
  allocs estimate:  0
  --------------
  minimum time:     0.029 ns (0.00% GC)
  median time:      0.032 ns (0.00% GC)
  mean time:        0.034 ns (0.00% GC)
  maximum time:     13.271 ns (0.00% GC)
  --------------
  samples:          10000
  evals/sample:     1000

julia> @code_warntype f(2)
Variables
  #self#::Core.Compiler.Const(f, false)
  a::Int64

Body::Int64
1 ─ %1 = Main.:(#f#3)(false, #self#, a)::Int64
└──      return %1

julia> @code_warntype f(2, b=false)
Variables
  #unused#::Core.Compiler.Const(getfield(Main, Symbol("#kw##f"))(), false)
  @_2::NamedTuple{(:b,),Tuple{Bool}}
  @_3::Core.Compiler.Const(f, false)
  a::Int64
  b::Bool
  @_6::Bool

Body::Union{Int64, Tuple{Int64,Int64}}
1 ─ %1  = Base.haskey(@_2, :b)::Core.Compiler.Const(true, false)
│         %1
│         (@_6 = Base.getindex(@_2, :b))
└──       goto #3
2 ─       Core.Compiler.Const(:(@_6 = false), false)
3 ┄       (b = @_6)
│   %7  = (:b,)::Core.Compiler.Const((:b,), false)
│   %8  = Core.apply_type(Core.NamedTuple, %7)::Core.Compiler.Const(NamedTuple{(:b,),T} where T<:Tuple, false)
│   %9  = Base.structdiff(@_2, %8)::Core.Compiler.Const(NamedTuple(), false)
│   %10 = Base.pairs(%9)::Core.Compiler.Const(Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}(), false)
│   %11 = Base.isempty(%10)::Core.Compiler.Const(true, false)
│         %11
└──       goto #5
4 ─       Core.Compiler.Const(:(Base.kwerr(@_2, @_3, a)), false)
5 ┄ %15 = Main.:(#f#3)(b, @_3, a)::Union{Int64, Tuple{Int64,Int64}}
└──       return %15
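One way to restore type stability, sketched on the toy function above, is to move the flag into the type domain with `Val`, so each branch becomes its own method with a concrete return type:

```julia
# Type-stable variant of the toy example: dispatch on Val instead of
# branching on a runtime Bool. Each method has one concrete return type.
f_stable(a, ::Val{true})  = (a^2, a^3)
f_stable(a, ::Val{false}) = a^2
f_stable(a) = f_stable(a, Val(false))
```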

Use Immutable Types

[Suggestion (2) in the series started in #14]

Usually, when a mutable object is instantiated, it allocates memory on the heap, which, in certain cases where caching is important, like inner loops or within Monte Carlo Tree Search, etc. can make them much slower. I think you could achieve significant speedup by making all of the types that you can immutable, especially things like GaussianBelief that are likely to be created and recreated often :)
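As a minimal illustration (types invented for the example), the change is usually just dropping the `mutable` keyword:

```julia
# Mutable: each instance is typically heap-allocated.
mutable struct MutableBelief2D
    μ1::Float64
    μ2::Float64
end

# Immutable: instances of this bits type can live on the stack and be
# freely copied; "updating" means constructing a new value.
struct Belief2D
    μ1::Float64
    μ2::Float64
end
```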

Use of multiple dispatch and nothing

I think there are a couple type checks and if statements that could be handled by multiple dispatch as well as some checks for boolean that could use nothing or missing instead.

Examples of lines that could use nothing:
https://github.com/sisl/GaussianFilters.jl/blob/master/src/kf.jl#L57
https://github.com/sisl/GaussianFilters.jl/blob/master/src/kf.jl#L56

Examples of lines that could use multiple dispatch:
https://github.com/sisl/GaussianFilters.jl/blob/master/src/ukf.jl#L27
https://github.com/sisl/GaussianFilters.jl/blob/master/src/ukf.jl#L218
https://github.com/sisl/GaussianFilters.jl/blob/master/src/ukf.jl#L225
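A generic sketch of both patterns (all types and functions here are invented for illustration, not the package's actual code):

```julia
abstract type Model end
struct LinearModel    <: Model end
struct NonlinearModel <: Model end

# Multiple dispatch replaces `if m isa LinearModel ...` branches:
propagate(::LinearModel, x)    = x
propagate(::NonlinearModel, x) = x .^ 2

# `nothing` plus dispatch replaces a `has_control::Bool` flag:
apply_control(x, ::Nothing) = x
apply_control(x, u)         = x .+ u
```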

All struct fields should be concrete types

If you want to maximize performance, making structs immutable is a good first step. However, to get even better performance, you should also avoid non-concrete field types. These things are sometimes quite subtle, e.g. this

W::Symmetric{c}

is a problem because Symmetric has two template parameters, e.g.:

julia> Symmetric(A)
3×3 Symmetric{Int64,Array{Int64,2}}:
 1  2  3
 2  5  6
 3  6  8

julia> typeof(Symmetric(A))
Symmetric{Int64,Array{Int64,2}}

Therefore Symmetric{a} where {a} is not a concrete type, and using it as a field type makes code that uses the corresponding struct unnecessarily slow. A better thing to do here is to use a bounded type parameter, e.g.:

struct LinearDynamicsModel{TA<:AbstractArray, TB<:AbstractArray,
                TW<:Symmetric} <: DynamicsModel
    A::TA
    B::TB
    W::TW
end

Another instance of this problem is the use of Function as a field type, since Function is also an abstract type.
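The same fix applies to function fields; a sketch (struct and field names invented for illustration):

```julia
# Parameterizing on the function type keeps the field concrete, so calls
# to `h` can be specialized rather than going through dynamic dispatch.
struct ObservationModelSketch{F<:Function, TV<:AbstractMatrix}
    h::F
    V::TV
end
```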

Allow any `AbstractArray`

Wooohoo!! I am extremely excited about this package!! I have been hoping for a good EKF package that uses ForwardDiff for a while. This really does look like a great start! :) I think that I will get a lot of use out of it in classes I teach and research I do, and, if it's well-executed, I think it will be something that helps people transition from MATLAB to Julia.

Since I'm probably going to rely on this a lot, I am going to give a few suggestions - I hope they don't seem antagonistic; I just want to make sure the package is as good as possible!

The first suggestion is to allow the use of AbstractArrays everywhere. This will allow people to use their own, more efficient, array types. In particular, using StaticArrays for small arrays can completely avoid the allocations that happen with built-in arrays and result in 10x or even 100x speedups.

This is very simple to support - for example GaussianBelief would look like this:

struct GaussianBelief{V<:AbstractVector,M<:AbstractMatrix}
    μ::V
    Σ::M
end
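With a definition like that, a caller could plug in StaticArrays; a hypothetical usage sketch (assuming the parameterized GaussianBelief above):

```julia
using StaticArrays

# Static arrays avoid heap allocation for small, fixed-size beliefs.
b = GaussianBelief(SVector(0.0, 0.0), SMatrix{2,2}(1.0, 0.0, 0.0, 1.0))
```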

Filters should not implement `Base.step`

To add to the wish-list that @zsunberg started in #14 here is a small suggestion from my side:

IMHO you should not implement Base.step for the filter because the notion of "step" here is different from what Base refers to. By importing Base.step ...

... you add a method to the Base.step function object which has the following documented behavior.

Base.step — Function.

step(r)

Get the step size of an AbstractRange object.

.. which is more of a "stepsize" and arguably a horrible name to be occupied by Base in the first place.

Add Interacting Multiple Models

It would be a nice addition to this package.
Reference:
Bar-Shalom, Yaakov, X. Rong Li, and Thiagalingam Kirubarajan. Estimation with applications to tracking and navigation: theory algorithms and software. John Wiley & Sons, 2004.

Parameterize abstract filters

function predict(b0::GaussianBelief, u::Vector{a},
                 filter::ExtendedKalmanFilter) where a <: Number
    # Motion update
    # Linear motion
    if filter.d isa LinearDynamicsModel

ExtendedKalmanFilter should be parameterized by DynamicsModel and ObservationModel types.
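A sketch of what that parameterization might look like (supertype, field, and helper names are assumed from the snippet above, not actual package code):

```julia
# Carrying the model types as parameters lets dispatch pick the linear or
# nonlinear prediction path at compile time.
struct ExtendedKalmanFilterSketch{D<:DynamicsModel, O<:ObservationModel}
    d::D
    o::O
end

# A specialized method replaces the runtime `filter.d isa LinearDynamicsModel` check:
predict(b0, u, filter::ExtendedKalmanFilterSketch{<:LinearDynamicsModel}) =
    predict_linear(b0, u, filter.d)  # hypothetical helper
```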

RNG never used when sampling initial state

In simulate.jl, a random state is sampled from the Gaussian Belief:

    # make initial state
    s0 = rand(rng, b0)

However, the rng passed into the rand() function isn't actually used to seed experiments; in kf_classes.jl, the rand() function is defined as

function Base.rand(rng::AbstractRNG, b::GaussianBelief)
    return b.μ + cholesky(b.Σ).L * randn(size(b.Σ,1))
end

Should there be two extensions of rand() - one that dispatches on the rng for simulation reproducibility and one that does not? For example,

function Base.rand(rng::AbstractRNG, b::GaussianBelief)
    return b.μ + cholesky(b.Σ).L * randn(rng, size(b.Σ,1))
end

function Base.rand(b::GaussianBelief)
    return b.μ + cholesky(b.Σ).L * randn(size(b.Σ,1))
end

Dependency version conflicts

The Julia package registry implicitly added some [compat] information for all the dependencies of this package.
Those compat bounds are now out of date, although everything works when the compat checks are circumvented.

This is a problem because these implicit dependencies are practically invisible and this leads to version conflicts
that are really annoying to debug, especially for newer users.

From what I found out, the Julia package registry made compat information mandatory for all dependencies (meaning it can't just be removed there).

IMO, the best way to solve this would be to adapt/update the compat info from the registry and integrate it directly in Project.toml, because this would be much more transparent for users and solve current version conflicts!

Alternatively, the "hidden" compat info that the Julia package registry provides for
GaussianFilters.jl could also be changed without touching this project.

TL;DR:
I could do either:

  • pull request for this project's Project.toml (preferable)
  • pull request for the registry

Are pull requests for additional filter implementations or API extensions acceptable in general (because I have some ideas, but I'm unsure whether this project is still actively maintained)?

Also, thanks for the amazing work. This project was already super useful to me!

Allow to use custom jacobian implementation.

For prototyping, ForwardDiff.jacobian is very good but it still allocates significantly more memory than a hand-written Jacobian.

F = ForwardDiff.jacobian(μ -> filter.d.f(μ, u), b0.μ)

H = ForwardDiff.jacobian(μ -> filter.o.h(μ, u), bp.μ)

I think the user should be allowed to overload these functions by implementing something like GaussianFilters.linear_obsmodel and GaussianFilters.linear_dynamics. You can provide a default implementation based in ForwardDiff.jacobian of these methods and for the linear cases, you can provide a simple identity. By this means, you can avoid the manual type checking and let dispatch do the magic.
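A sketch of such an overload point, using the `linear_dynamics` name proposed above (the default and the specialization are illustrative, not existing package API):

```julia
using ForwardDiff

# Default: automatic differentiation via ForwardDiff.
linear_dynamics(f, μ, u) = ForwardDiff.jacobian(m -> f(m, u), μ)

# A user with an analytic Jacobian could specialize for their own
# dynamics wrapper type instead of paying the AD cost, e.g.:
# linear_dynamics(f::MyDynamics, μ, u) = analytic_jacobian(f, μ, u)
```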

Interface with POMDPs belief updaters

The ParticleFilters.jl package is excellent because it can also interface with the POMDPs belief updaters—I'm wondering if this is something you've intended for this package too.
