
Comments (9)

jrevels commented on May 16, 2024

Thanks for bringing this to my attention!

It seems the problem stems from calling f on a GradientNumber in another function's scope, which is weird, since f(::GradientNumber) is type-stable when called in global scope:

julia> using ForwardDiff

julia> f(x) = 2x
f (generic function with 1 method)

julia> gn = ForwardDiff.GradientNumber(1.0, 1.0)
ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}(1.0,ForwardDiff.Partials{Float64,Tuple{Float64}}((1.0,)))

julia> @code_warntype f(gn)
Variables:
  x::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}
  ########tup#7076#7078#7081#7084::Tuple{Float64}
  ########x#7077#7079#7082#7085::Int64
  ######_var0#7080#7083#7086::Tuple{Float64}

Body:
  begin  # none, line 1:
      $(Expr(:boundscheck, false))
      ######_var0#7080#7083#7086 = (top(tuple))((Base.box)(Base.Float64,(Base.mul_float)((Base.getfield)((top(getfield))((top(getfield))(x::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}},:partials)::ForwardDiff.Partials{Float64,Tuple{Float64}},:data)::Tuple{Float64},1)::Float64,(Base.box)(Float64,(Base.sitofp)(Float64,2)::Any)::Float64)::Any)::Float64)::Tuple{Float64}
      goto 1
      ######_var0#7080#7083#7086 = $(Expr(:boundscheck, :((top(getfield))(Base,:pop))))
      1:
      return $(Expr(:new, ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}, :((Base.box)(Base.Float64,(Base.mul_float)((top(getfield))(x::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}},:value)::Float64,(Base.box)(Float64,(Base.sitofp)(Float64,2))::Float64))::Float64), :($(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :(######_var0#7080#7083#7086::Tuple{Float64}))))))
  end::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}

From the above, we know that f on a GradientNumber is type-stable...

julia> a(x) = ForwardDiff.GradientNumber(x, one(x))
a (generic function with 1 method)

julia> @code_warntype a(1.0)
Variables:
  x::Float64
  ##grad#7159::Tuple{Float64}

Body:
  begin  # none, line 1:
      return $(Expr(:new, ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}, :(x::Float64), :($(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :((top(tuple))((Base.box)(Float64,(Base.sitofp)(Float64,1))::Float64)::Tuple{Float64}))))))
  end::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}

Okay, so construction of the GradientNumber is also type-stable, as expected...

julia> b(f, x) = f(a(x))
b (generic function with 1 method)

julia> @code_warntype b(f, 1.0)
Variables:
  f::F
  x::Float64
  ####grad#7159#7166::Tuple{Float64}

Body:
  begin  # none, line 1:
      return (f::F)($(Expr(:new, ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}, :(x::Float64), :($(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :((top(tuple))((Base.box)(Float64,(Base.sitofp)(Float64,1))::Float64)::Tuple{Float64})))))))::Any
  end::Any

...aaaaand there we go (b is essentially what ForwardDiff does inside of derivative). So it seems like, though it's theoretically type-stable, Julia's type inference can't deduce the type of f(::GradientNumber) when it's called outside of global scope.
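A quick way to confirm this without reading lowered code is `Base.return_types`, which reports what inference deduces for a given call signature (this is a sketch reusing the `f` and `b` definitions above; the `Any` result is what the Julia 0.4 transcript above demonstrates):

```julia
using ForwardDiff

f(x) = 2x
b(f, x) = f(ForwardDiff.GradientNumber(x, one(x)))

# On Julia 0.4, inference only sees the abstract `Function` type for the
# passed-in `f`, so the deduced return type is Any:
Base.return_types(b, (Function, Float64))
```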

This problem persists even if we inline f and throw in a type assertion:

julia> using ForwardDiff

julia> @inline f{T}(x::T) = (2*x)::promote_type(Int, T)
f (generic function with 1 method)

julia> b(f, x) = f(ForwardDiff.GradientNumber(x, one(x)))
b (generic function with 1 method)

julia> @code_warntype b(f, 1.0)
Variables:
  f::F
  x::Float64
  ##grad#7047::Tuple{Float64}

Body:
  begin  # none, line 1:
      return (f::F)($(Expr(:new, ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}, :(x::Float64), :($(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :((top(tuple))((Base.box)(Float64,(Base.sitofp)(Float64,1))::Float64)::Tuple{Float64})))))))::Any
  end::Any

Weird stuff. I'll have to play around with this more.

from forwarddiff.jl.

jrevels commented on May 16, 2024

Also, note that:

julia> c(x) = f(a(x))
c (generic function with 1 method)

julia> @code_warntype c(1.0)
Variables:
  x::Float64
  ####grad#7586#7589::Tuple{Float64}

Body:
  begin  # none, line 1:
      return (Main.f)($(Expr(:new, ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}, :(x::Float64), :($(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :((top(tuple))((Base.box)(Float64,(Base.sitofp)(Float64,1))::Float64)::Tuple{Float64})))))))::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}
  end::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}

So, generally, it appears that this is properly type-inferred:

g(x) = f(x)

But this isn't:

g(f, x) = f(x) 

In practice:

julia> g(f, x) = f(x)
g (generic function with 1 method)

julia> @code_warntype g(sin, 1)
Variables:
  f::F
  x::Int64

Body:
  begin  # none, line 1:
      return (f::F)(x::Int64)::Any
  end::Any

Looks like it's time to go issue hunting in Base...


mlubin commented on May 16, 2024

Type inference for passed-in functions has never worked and won't work for a while. See http://numericextensionsjl.readthedocs.org/en/latest/functors.html


jrevels commented on May 16, 2024

@mlubin Do you know if there's an issue to track this in Base? I ran into this a very long time ago but didn't really look into it past "I'll just write inlined loops rather than higher-order functions."

@Ken-B You can get better type inference for the example you gave by using a functor type:

julia> immutable F end

julia> F(x) = 2x
F

julia> @code_warntype ForwardDiff.derivative(F, 1.0)
Variables:
  f::Type{F}
  x::Float64
  ##f#7449::Type{F}
  ##x#7450::Float64
  ###s50#7451::Type{Void}
  ##result#7452::ForwardDiff.ForwardDiffResult{ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}}
  ####grad#7448#7453::Tuple{Float64}

Body:
  begin  # /Users/jarrettrevels/.julia/ForwardDiff/src/api/derivative.jl, line 24:
      GenSym(0) = call(f::Type{F},$(Expr(:new, ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}, :(x::Float64), :($(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :((top(tuple))((Base.box)(Float64,(Base.sitofp)(Float64,1))::Float64)::Tuple{Float64})))))))::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}
      ##result#7452 = $(Expr(:new, ForwardDiff.ForwardDiffResult{ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}}, GenSym(0)))
      return (ForwardDiff.first)((top(getfield))((top(getfield))((top(getfield))(##result#7452::ForwardDiff.ForwardDiffResult{ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}},:data)::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}},:partials)::ForwardDiff.Partials{Float64,Tuple{Float64}},:data)::Tuple{Float64})::Float64
  end::Float64

Note that the closure-generating version of derivative will still be poorly inferred:

julia> g = ForwardDiff.derivative(F)
d (generic function with 1 method)

julia> @code_warntype g(1.0)
Variables:
  x::Float64

Body:
  begin  # /Users/jarrettrevels/.julia/ForwardDiff/src/api/derivative.jl, line 43:
      return (ForwardDiff.derivative)(f::Type{T},x::Float64,A)::Any
  end::Any

There might be ways to write functions to accept functor types such that the above would be inferred correctly...I'll look into it. It might be a worthy optimization to make if functor types are necessary to enable correct type inference.
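One possible shape for such an optimization, sketched in Julia 0.4 syntax: instead of returning an anonymous closure, wrap the functor type in a parametric immutable so the returned callable itself has a concrete type that dispatch can specialize on. `DerivativeClosure` is an illustrative name, not part of ForwardDiff's API:

```julia
using ForwardDiff

immutable F end
F(x) = 2x

# Hypothetical wrapper: the functor type is baked into the wrapper's
# type parameter, so calls go through concrete dispatch rather than an
# opaque ::Function field.
immutable DerivativeClosure{G} end

Base.call{G}(::DerivativeClosure{G}, x) = ForwardDiff.derivative(G, x)

const dF = DerivativeClosure{F}()
dF(1.0)  # dispatches on the concrete type, so inference can succeed
```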


jrevels commented on May 16, 2024

Just pushed a small patch that enables better type inference when using the closure-generating derivative method.

This whole issue comes down to optimizations that haven't been made yet in Base, but at least there is a workaround here for now (i.e. functor types). Closing this as there's nothing more to do here.


Ken-B commented on May 16, 2024

Type inference for passed-in functions has never worked and won't work for a while.

@jrevels @mlubin It seems that with the latest functions rewrite (JuliaLang/julia#13412) in Julia 0.5, this type inference for functions does work now (same example from above):

julia> g(f,x) = f(x)
g (generic function with 2 methods)

julia> @code_warntype g(sin, 1.0)
Variables:
  #self#::#g
  f::Base.#sin
  x::Float64

Body:
  begin  # none, line 1: # math.jl, line 137:
      GenSym(1) = (top(ccall))((top(tuple))("sin",Base.Math.libm)::Tuple{ASCIIString,ASCIIString},Base.Math.Float64,(top(svec))(Base.Math.Float64)::SimpleVector,x::Float64,0)::Float64
      return (Base.Math.nan_dom_err)(GenSym(1),x::Float64)::Float64
  end::Float64

Could we reopen this issue?

(ps: thanks for looking at this in the past)


Ken-B commented on May 16, 2024

Looking a bit deeper, there also seems to be a type inference regression on Julia 0.5. Taking the first analysis of @jrevels above, we no longer see correct type inference for multiplication by a GradientNumber. It can't figure out the type parameters:

julia> f(x) = 2x
f (generic function with 1 method)

julia> gn = ForwardDiff.GradientNumber(1.0, 1.0)
ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}(1.0,ForwardDiff.Partials{Float64,Tuple{Float64}}((1.0,)))

julia> @code_warntype f(gn)
Variables:
  #self#::#f
  x::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}}
  ######_var0#8212#8214#8216::Tuple{Float64}
  ########types#8208#8213#8215#8217::Tuple{Type{Int64}}

Body:
  begin  # none, line 1:
      $(Expr(:inbounds, false)) # /Users/ken/.julia/v0.5/ForwardDiff/src/GradientNumber.jl, line 87:
      $(Expr(:inbounds, false)) # /Users/ken/.julia/v0.5/ForwardDiff/src/GradientNumber.jl, line 86:
      $(Expr(:inbounds, false)) # /Users/ken/.julia/v0.5/ForwardDiff/src/ForwardDiff.jl, line 71: # /Users/ken/.julia/v0.5/ForwardDiff/src/ForwardDiff.jl, line 64: # promotion.jl, line 121: # promotion.jl, line 122:
      $(Expr(:inbounds, :pop))
      $(Expr(:inbounds, false)) # /Users/ken/.julia/v0.5/ForwardDiff/src/Partials.jl, line 117:
      $(Expr(:inbounds, false)) # /Users/ken/.julia/v0.5/ForwardDiff/src/Partials.jl, line 191: # /Users/ken/.julia/v0.5/ForwardDiff/src/Partials.jl, line 170:
      $(Expr(:inbounds, true))
      ######_var0#8212#8214#8216 = (top(tuple))((Base.box)(Base.Float64,(Base.mul_float)((Base.getfield)((top(getfield))((top(getfield))(x::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}},:partials)::ForwardDiff.Partials{Float64,Tuple{Float64}},:data)::Tuple{Float64},1)::Float64,(Base.box)(Float64,(Base.sitofp)(Float64,3)))))::Tuple{Float64}
      goto 1
      ######_var0#8212#8214#8216 = $(Expr(:inbounds, :((top(getfield))(Base,:pop))))
      1: 
      $(Expr(:inbounds, :pop))
      $(Expr(:inbounds, :pop))
      $(Expr(:inbounds, :pop))
      $(Expr(:inbounds, :pop))
      return (ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}})((Base.box)(Base.Float64,(Base.mul_float)((top(getfield))(x::ForwardDiff.GradientNumber{1,Float64,Tuple{Float64}},:value)::Float64,(Base.box)(Float64,(Base.sitofp)(Float64,3)))),$(Expr(:new, ForwardDiff.Partials{Float64,Tuple{Float64}}, :(######_var0#8212#8214#8216::Tuple{Float64}))))::ForwardDiff.GradientNumber{N,T,C}
  end::ForwardDiff.GradientNumber{N,T,C}

Notice the {N,T,C} in the end instead of {1,Float64,Tuple{Float64}} with Julia 0.4.


KristofferC commented on May 16, 2024

#75

Also: JuliaLang/julia#14294


Ken-B commented on May 16, 2024

Thanks, @KristofferC, I just saw it myself and was about to edit my post. I should have looked before posting, sorry for that noise.

Still, I believe we should reopen this issue now that type inference for passed-in functions works (after #75 gets fixed, of course), since we might get a type-stable derivative.
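The reason this works on Julia ≥ 0.5 (a sketch, restating the effect of JuliaLang/julia#13412): every function is the sole instance of its own concrete singleton type, so a higher-order function compiles a fresh specialization per callee and inference sees through the call:

```julia
g(f, x) = f(x)

# typeof(sin) is a concrete singleton type (printed as Base.#sin),
# not just the abstract Function, so g specializes on it:
typeof(sin) <: Function                        # true
Base.return_types(g, (typeof(sin), Float64))   # [Float64], not [Any]
```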

