
Comments (23)

kdomino commented on July 29, 2024

Acknowledged, the correction will be done shortly

from datagencopulabased.jl.

lrnv commented on July 29, 2024

In fact, you use ::Float64 restrictions all over the package. If there is no need for them, simply change that to ::Real and it'll be a lot better (as it would also allow Rationals, or really any numeric type). Overspecifying is not a good thing in Julia :)

In case you have a technical issue, like an algorithm you rely on that is not implemented for anything other than Float64 (e.g. a piece of C or C++ code? I did not check all your code), maybe I could help you translate it if you want!
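For illustration (the generator name here is hypothetical, not taken from the package), relaxing such a restriction looks like this:

```julia
# Overspecified: only Float64 arguments are accepted.
gen_restricted(u::Float64, θ::Float64) = (u^(-θ) - 1) / θ

# Relaxed to ::Real: Float64, BigFloat, Rational, Int, ... all work.
gen_relaxed(u::Real, θ::Real) = (u^(-θ) - 1) / θ

gen_relaxed(0.5, 2.0)     # Float64 result: 1.5
gen_relaxed(1//2, 2)      # exact Rational arithmetic: 3//2
gen_relaxed(big"0.5", 2)  # BigFloat precision
```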

kdomino commented on July 29, 2024

For the Clayton copula I still have some problems with quantile(::Gamma{BigFloat}, ::BigFloat) yielding
MethodError: no method matching gammainvcdf(::BigFloat, ::BigFloat, ::BigFloat)
I will think about how to deal with it.

lrnv commented on July 29, 2024

Well, I think you might have the same problem for other Archimedean copulas that use the same trick (the quantile of the distribution corresponding to the radial part).

Note that for these, Distributions.jl depends on StatsFuns.jl, which imports functions from R that usually call machine code, hence Float64 only.

You could use SpecialFunctions.jl, which provides a pure Julia implementation. For this one, it's here: https://juliamath.github.io/SpecialFunctions.jl/stable/functions_list/#SpecialFunctions.gamma_inc_inv

It seems like there are some others for other distributions. We could also ask Distributions.jl to depend on SpecialFunctions rather than StatsFuns, but I don't know whether they would object to that.
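As a generic fallback, a quantile can always be obtained by inverting the CDF numerically in pure Julia; a minimal bisection sketch (not from either package) that stays in whatever precision the inputs carry:

```julia
# Invert a monotone-increasing CDF by bisection.
# Works for any Real type (Float64, BigFloat, ...) since only +, /, < are used.
function quantile_bisect(cdf, p, lo, hi; tol=sqrt(eps(float(p))))
    while hi - lo > tol
        mid = (lo + hi) / 2
        cdf(mid) < p ? (lo = mid) : (hi = mid)
    end
    return (lo + hi) / 2
end

# Example with the Exponential(1) CDF, whose exact quantile is -log(1 - p):
expcdf(x) = 1 - exp(-x)
quantile_bisect(expcdf, 0.5, 0.0, 50.0)  # ≈ log(2)
```

This is much slower than a dedicated inverse like gamma_inc_inv, but it is a usable stopgap while the specialized methods exist only for Float64.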

kdomino commented on July 29, 2024

thanks

kdomino commented on July 29, 2024

For some Archimedean copulas I use a low-level implementation; BigFloat works for Gumbel, Frank, AMH and nested Gumbel. At first glance I have similar problems with gamma_inc_inv(), but I will look deeper into it. I just need some time.

kdomino commented on July 29, 2024

In SpecialFunctions there is:

gamma_inc_inv(a::AbstractFloat, p::AbstractFloat, q::AbstractFloat) = throw(MethodError(gamma_inc_inv,(a,p,q,"")))

I think they will make it work for ::AbstractFloat at some point in the future.

lrnv commented on July 29, 2024

Lol, I did not see that. So are we stuck?

kdomino commented on July 29, 2024

Perhaps I can implement it at a low level, but not right now.

lrnv commented on July 29, 2024

Well, the right thing to do is to call the method for AbstractFloat, and let it specialize by itself on the type of numbers that enters your functions.

kdomino commented on July 29, 2024

You mean convert types to Float64 if necessary?

lrnv commented on July 29, 2024

Exactly the opposite. Write code that fails on other float types because the gamma_inc_inv method does not exist for them. Then open an issue on the type-provider package to tell them that this special function should be implemented.

kdomino commented on July 29, 2024

Ok, I will do it

kdomino commented on July 29, 2024

The same happens with Distributions.NegativeBinomial(). I will check whether there is a similar function in SpecialFunctions and do the same.

lrnv commented on July 29, 2024

I think the SpecialFunctions.jl package is quite standard, and that if the implementation for a certain type does not exist, you should let it fail gracefully with the right error. This is IMHO the right thing to do. I'll open an issue on SpecialFunctions.
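The pattern, sketched with a hypothetical function name (this mirrors the gamma_inc_inv fallback quoted earlier in the thread; it is not actual SpecialFunctions code):

```julia
# Concrete Float64 kernel (placeholder arithmetic standing in for a real routine):
myspecial(a::Float64, p::Float64) = a + p

# Catch-all for other float types: fail loudly with an informative MethodError
# instead of silently converting to Float64 and losing precision.
myspecial(a::AbstractFloat, p::AbstractFloat) = throw(MethodError(myspecial, (a, p)))

myspecial(1.0, 2.0)              # works: 3.0
# myspecial(big"1.0", big"2.0")  # throws MethodError until a BigFloat method exists
```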

kdomino commented on July 29, 2024

Ok thanks. I need to get dependencies right first.

kdomino commented on July 29, 2024

Done; the tests will fail due to gamma_inc_inv for BigFloats.

lrnv commented on July 29, 2024

Thinking about this package again, two years older and with much-less-Julia-beginner eyes, I noticed that you use type annotations everywhere, and even some that are not concrete types:

function arch_gen(copula::String, r::Matrix{T}, θ::Real; rng::AbstractRNG) where T <: Real

This is really bad practice, because θ::Real is abstractly typed: the compiler will not be able to compile the function, and you'll get awful type inference at runtime.

Moreover, if you just remove the type annotations altogether, using e.g.,

function arch_gen(copula, r, θ; rng)
     T = eltype(r)

it will:

  • Still work exactly the same way
  • Make your code cleaner
  • Allow your code to be used even in ways you did not intend (e.g., changing the types of the numbers, as we were talking about two years ago)
  • Allow any type of vectors and matrices to be passed to your functions, including StaticArrays.jl or other things…
  • And finally, allow type inference to pass through everything, which will result in much better performance.

This is just an example, as I noticed most of your functions are strictly typed. Maybe we could consider untyping 95% of them.
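For illustration, the untyped pattern with eltype recovery (using a hypothetical helper, not the package's actual arch_gen):

```julia
# No type annotations: accepts any array (Matrix, view, StaticArrays.jl, ...)
# with any element type; eltype recovers T where it is actually needed.
function scale_by_theta(r, θ)
    T = eltype(r)
    return r .* T(θ)  # convert θ once, to the array's element type
end

scale_by_theta([1.0 2.0; 3.0 4.0], 2)   # Matrix{Float64}
scale_by_theta(big.([1.0, 2.0]), 1//2)  # BigFloat vector, Rational θ
```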


Randomly reading, I also found this one:

function levyel(θ::Real, u1::Real, u2::Real)

This is awful for performance in a function that will be called many times, and just removing the three ::Real will make it a lot faster.


More generally speaking, why do you need so many type restrictions in every function? Most of your functions do not make use of them. Just let the functions fail if the user does not provide what they should (and explain in the docstrings, as you already did, what should be provided).

kdomino commented on July 29, 2024

Hello,

Thanks for such long-standing interest. Basically I was taught that type declaration is the Julia custom and speeds up the computation. However, in general that may not be the case.

I will take a look at the internal functions like levyel(θ::Real, u1::Real, u2::Real) first.

lrnv commented on July 29, 2024

Basically I was taught that type declaration is the Julia custom and speeds up the computation.

This is completely false.

Type restrictions (using <:) on method definitions do not harm computational speed. However, these restrictions never make the computations faster either: the compiler uses them to choose which method to call, not to compile the method (the method executed is always compiled for the concrete types of the arguments at the call site). On the other hand, too strict a restriction might prevent other people from using your functions on their own types (granted, sometimes that is the goal).

However, non-concrete type annotations (using ::T where T is a non-concrete type) are catastrophic for performance: the compiler will never specialize the call site and will always redispatch, which is like not compiling at all. Consider the function f(x::Real) = x + 1 and look at it through @code_warntype f(Real(1)), compared with the function f(x) = x + 1 and @code_warntype f(1): you will see the problem.

Lastly, the custom is not to annotate types at all unless really necessary: it lets multiple dispatch really shine, and this maximizes code reuse.
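The distinction can be seen directly (function names hypothetical):

```julia
f_restricted(x::Real) = x + 1  # abstract annotation: used only for dispatch
f_generic(x) = x + 1           # no annotation at all

# With concrete arguments, both compile to the same fast machine code.
f_restricted(1) == f_generic(1) == 2

# The inference problem appears once values are only known abstractly,
# e.g. when they sit in an abstractly typed container:
xs_abstract = Real[1, 2.0, 3//1]  # Vector{Real}: elements are boxed
xs_concrete = [1.0, 2.0, 3.0]     # Vector{Float64}: fully inferable
sum(f_generic, xs_concrete)       # fast, type-stable loop
sum(f_generic, xs_abstract)       # dynamic dispatch on every element
```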

I can make a PR if you want.

kdomino commented on July 29, 2024

OK, so I removed the types from the sampleunivdists.jl file and pushed. Nightly Julia uses different RNG seeding, so the tests based on random sampling will fail. Apart from that, however, it looks to be working. You can fork from here and we can work. It is a rather busy time for me, but sometimes I can find a few hours between tasks.

lrnv commented on July 29, 2024

I do not know if I forked before or after your change, but I'm on it :)

kdomino commented on July 29, 2024

OK, it is good. Feel free to contact me at [email protected]
