Comments (14)
Not to clutter up the ideas here, but I wanted to make sure that @sglyon and @albop knew what you are up to, since they work on DSLs.
In the long run, it sure would be nice to be able to handle difference equations (i.e. a difference rather than a derivative operator) in this sort of DSL, for my dream Dolo2.jl which ditches Python :-)
from modelingtoolkit.jl.
Hi there! Interesting project indeed and I'm curious about it.
@jlperla: I don't know what you are talking about. Dolo.jl doesn't depend on Python. It depends on a serialization file in YAML, though, and has some algorithms in common with dolo.py. But you are not suggesting removing the dependence on math, are you? ;-) I'll create an issue to provide an option to write models in Julia or with a DSL and assign it to you :-)
This isn't the right place to have the discussion, but I will do it anyways!
- What you have written in Dolo is a language-agnostic domain-specific language (DSL) specified in YAML. You then have an interpreter written in Python, Julia, or whatever. It is along the same lines as AMPL or Dynare.
- Then any time you want to add features, you need to add them first to your language and then to the underlying implementations in Julia and Python. If something is easy in Julia but not in Python? Tough. You want to add some new Monte Carlo integration, GPU support, or high-precision floating point? Add it to the YAML specification and implement it in both Julia and Python. Want to add a new prior distribution for simulating, or support a new commercial optimizer? Implement it yourself.
- Hence Python matters: as long as you support it, you are held back by it as the lowest common denominator for available features.
What I am suggesting is a radical redesign: build an EDSL (embedded DSL) in Julia itself. Think along the lines of https://github.com/JuliaApproximation/ApproxFun.jl or https://github.com/JuliaOpt/JuMP.jl or https://github.com/yebai/Turing.jl/wiki/Get-started or https://github.com/JuliaStats/GLM.jl
Why not have that discussion in the DSL thread?
SciML/DifferentialEquations.jl#261
This is just about derivatives...
https://github.com/JuliaDiff/DiffRules.jl might be useful.
It would be nice if this was generally extensible.
> It would be nice if this was generally extensible.

I think that's the key property. I had an implementation which didn't have it, so I scrapped it. I think it can be done by allowing the definition of a dispatch for a function. I almost have it.
I put in a basic derivative expansion API but right now it requires a weird dispatch and doesn't take into account the chain rule. We likely need something slightly different here but I don't know what.
If the purpose is to calculate Jacobians, Hessians, etc., why not do this at a later stage by feeding Julia expressions to Calculus.jl or similar? That is, before the construction of the `ODEFunction` we will have expressions for the RHS of each equation, so we can just generate the corresponding derivatives, no?
I think it would be harder to write functions that feed everything to Calculus.jl and then convert back to our AST than to just write a simple mechanism on our AST. We want to do things like index reduction of DAEs, conversion to first-order ODEs, etc. directly on our ASTs, and it would be bad in the long run to hack that onto Calculus.jl (which really doesn't offer much here: one page of derivative rules? We can just copy those over).
OK. A dual number system (storing `Variable`s and their partials) would be relatively easy if we have a list of derivative rules, since most functions and operators are already defined, but we would need to interpret the AST (i.e. using `Variable`s as values).
Yeah, that's pretty much what this is doing right now. I just need to figure out how to make it do the chain rule, and how to make it easy for users to declare derivatives of their function by some variable (i.e. for `d sin(x)/dt` we need to have them say it should be `cos(x)*dx/dt`). I'm not sure about the best way to do that when you scale to multiple arguments.
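[Editorial aside: the symbolic chain rule being described here can be sketched in a few lines of Python over a toy expression tree. All names (`Var`, `Call`, `derivative`) are hypothetical illustrations, not ModelingToolkit's API; this only handles unary functions, which is exactly the multi-argument scaling problem mentioned above.]

```python
# Toy symbolic differentiation with the chain rule:
# d f(g(x)) / dx = f'(g(x)) * dg/dx, with one rule per primitive.
class Var:
    def __init__(self, name): self.name = name
    def __repr__(self): return self.name

class Call:
    def __init__(self, op, *args): self.op, self.args = op, args
    def __repr__(self): return f"{self.op}({', '.join(map(repr, self.args))})"

# Derivative rules for primitives (only sin here, for illustration).
RULES = {"sin": lambda a: Call("cos", a)}

def derivative(expr, wrt):
    if isinstance(expr, Var):
        return 1 if expr is wrt else 0
    outer = RULES[expr.op](*expr.args)     # f'(g(x))
    inner = derivative(expr.args[0], wrt)  # dg/dx (unary case only)
    return Call("*", outer, inner)

x = Var("x")
print(derivative(Call("sin", x), x))  # *(cos(x), 1)
```

The hard part the comment alludes to is generalizing `inner` from the unary case to a sum over all arguments' partials.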
But with a dual number you do not need to worry about the chain rule. For example, `+` and `sin` would be something like

```julia
+(x::Dual, y::Dual) = Dual(x.val + y.val, x.partials .+ y.partials)
sin(x::Dual) = Dual(sin(x.val), cos(x.val) .* x.partials)
```

So in the end the derivatives are stored in `partials`. This allows calculating Jacobians and sensitivity equations, but maybe `d sin(x)/dt` needs a different approach, I am not sure.
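[Editorial aside: the dual-number idea above is language-agnostic; here is a minimal runnable Python sketch of the same two definitions, with hypothetical names. Seeding `partials` with 1 for the variable of interest makes the chain rule happen automatically.]

```python
import math

class Dual:
    """A value together with its partial derivatives."""
    def __init__(self, val, partials):
        self.val, self.partials = val, partials
    def __add__(self, other):
        return Dual(self.val + other.val,
                    [a + b for a, b in zip(self.partials, other.partials)])

def dual_sin(x):
    # The chain rule is applied implicitly in the partials slot.
    return Dual(math.sin(x.val), [math.cos(x.val) * p for p in x.partials])

x = Dual(0.0, [1.0])        # seed: dx/dx = 1
y = dual_sin(x) + x         # f(x) = sin(x) + x, so f'(0) = cos(0) + 1 = 2
print(y.val, y.partials)    # 0.0 [2.0]
```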
That's just different, and for different applications. Dual numbers actually do perform the chain rule, in the dual part. But the issue is that this doesn't build functions. We want the ASTs so we can do further transformations on them. For example, take a 4th-order ODE. What's the inverse of the Jacobian of the transformation of the 4th-order ODE to a system of 1st-order ODEs? You can generate functions of functions to do this, but then it'll be doing all of the calculations at runtime, and it'll have to do the full calculation at every step.
If we instead build the AST, we can build functions that directly compute the result. This is of course much, much faster, which is why `@ode_def` does well.
Of course there's a tradeoff: autodiff does scale better to large systems of variables. One way we avoid the problem here is by allowing our variables to be vectors, so we aren't writing out the AST for every component but instead doing the vector calculus. This helps manage the explosion for many sizable models with structure. In addition, with ParameterizedFunctions.jl we saw that this scaled very well for systems with sparse Jacobians, which are quite common in diffeq and nonlinear-solving applications.
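[Editorial aside: the "build the AST, then emit a direct function" point can be illustrated with a trivial Python sketch. Pretend a symbolic pass already computed `d(sin(x))/dt = cos(x)*dx` once, ahead of time; the emitted function then pays no symbolic or function-composition cost at runtime. All names are hypothetical.]

```python
import math

# Hypothetical output of a symbolic differentiation pass, done once:
derivative_src = "math.cos(x) * dx"

# Compile it into a plain function; every call now computes directly.
d_sin = eval(f"lambda x, dx: {derivative_src}")

print(d_sin(0.0, 2.0))  # 2.0  (cos(0) * 2)
```

The dual-number alternative would instead carry the chain rule through every arithmetic operation at every step.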
When the systems get large enough we would need to make use of autodiff. That simply means that some transformations won't be possible, or we just add a separate transform function that does them via autodiff. Additionally, users could tell it to avoid differentiating complicated functions. If you do something like

```julia
@register complicated_function(x)
d_complicated_function(x) = ForwardDiff.derivative(complicated_function, x)
@register d_complicated_function(x)
@derivative complicated_function(x) d_complicated_function(x)
```

that would tell it not to try differentiating past `complicated_function` and instead use autodiff on it (and we can make that simpler with a macro, of course). This way we can more precisely apply autodiff to smaller functions to make the calculations even cheaper, even when it's required. It also lets users say when 1-D numerical diff could be useful (and our simplification routines can then ensure it is done only once per function).
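[Editorial aside: the registration mechanism proposed above amounts to a lookup table that the symbolic pass consults before recursing into a function. A hedged Python sketch of that idea, with hypothetical names and a numeric-difference fallback standing in for `ForwardDiff`:]

```python
import math

DERIVATIVE_RULES = {}  # function -> user-registered derivative

def register_derivative(f, df):
    """Tell the differentiator to stop at f and use df instead."""
    DERIVATIVE_RULES[f] = df

def derivative_of(f, x, h=1e-6):
    if f in DERIVATIVE_RULES:
        return DERIVATIVE_RULES[f](x)       # registered rule wins
    return (f(x + h) - f(x - h)) / (2 * h)  # fallback: central difference

def complicated(x):
    return math.exp(x) * math.sin(x)

# d/dx [e^x sin x] = e^x (sin x + cos x)
register_derivative(complicated, lambda x: math.exp(x) * (math.sin(x) + math.cos(x)))

print(derivative_of(complicated, 0.0))  # 1.0
```

The symbolic engine never looks inside `complicated`; it just splices in the registered derivative, which is exactly what the `@register`/`@derivative` pair is meant to do.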
Basic derivative support with automatic chain rule is complete.