
reactant.jl's People

Contributors

avik-pal, github-actions[bot], mofeing, pangoraw, vchuravy, wsmoses


reactant.jl's Issues

xla RuntimeError on differentiating simple neural network

using Reactant, Lux, Random, Enzyme # , Optimisers

function loss_fn(model, ps, st, data)
    y, st_new = model(data, ps, st)
    return sum(y) # , st_new, (;)
end

model = Dense(10, 5, tanh)

data = rand(Float32, 10, 3)

ps, st = Lux.setup(Xoshiro(0), model)

reactant_ps = Reactant.make_tracer(IdDict(), ps, (), Reactant.ArrayToConcrete, nothing)
reactant_st = Reactant.make_tracer(IdDict(), st, (), Reactant.ArrayToConcrete, nothing)
reactant_data = Reactant.make_tracer(IdDict(), data, (), Reactant.ArrayToConcrete, nothing)

reactant_loss_fn = Reactant.compile(
    loss_fn, (model, reactant_ps, reactant_st, reactant_data))

function gradient_loss_fn(model, ps, st, data)
    dps = Enzyme.make_zero(ps)
    Enzyme.autodiff(Enzyme.Reverse, loss_fn, Active, Const(model),
        Duplicated(ps, dps), Const(st), Const(data))
    return dps
end


gradient_loss_fn(model, ps, st, data)  # works

reactant_gradient_loss_fn = Reactant.compile(
    gradient_loss_fn, (model, reactant_ps, reactant_st, reactant_data))

# Lux.Experimental.single_train_step(AutoReactant(), loss_fn, data, ts)
terminate called after throwing an instance of 'xla::XlaRuntimeError'
  what():  UNKNOWN: <unknown>:0: error: Reduction function must return a scalar or tuple of scalars but returns shape: f32[5]: 
<unknown>:0: note: see current operation: "func.return"(%11, %10) : (tensor<1x5xf32>, tensor<10x5xf32>) -> ()

Semantically incorrect definition of `Base.similar`

I found this code:

Reactant.jl/src/Reactant.jl, lines 122 to 124 at commit 1cd144f:

function Base.similar(x::ConcreteRArray{T,Shape,N}, ::Type{T2}) where {T,Shape,N,T2}
    return ConcreteRArray{T,Shape,N}(x.data)
end

This code is wrong: Base.similar is not used to convert the eltype of an array, but to create a new array (by default with the same size and eltype, both configurable) with undefined content.

Instead, Base.convert should be used for this.

Furthermore, this definition ignores T2 and just returns an array aliasing the same data?
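
For comparison, the base Julia semantics on plain arrays (similar allocates fresh, uninitialized storage; convert changes the element type while keeping the values):

julia> x = rand(Float32, 2, 3);

julia> similar(x, Float64)            # new, uninitialized 2×3 Matrix{Float64}

julia> convert(Matrix{Float64}, x)    # same values, element type converted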

Track closures' captured variables

From Julia documentation,

A closure is simply a callable object with field names corresponding to captured variables. For example, the following code:

function adder(x)
    return y->x+y
end

is lowered to (roughly):

struct ##1{T}
    x::T
end

(_::##1)(y) = _.x + y

function adder(x)
    return ##1(x)
end

It shouldn't be hard to add support for closures. We just need to correctly track the captured variables in Reactant.make_mlir_fn, right?
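
A minimal sketch of how the captured variables could be enumerated, assuming plain reflection over the closure object is enough (illustrative only; make_mlir_fn would then trace each captured field like any other argument):

f = adder(3)                       # callable struct capturing x = 3
for name in fieldnames(typeof(f))
    captured = getfield(f, name)   # the captured variable (here, x)
    # hand `captured` to the tracer alongside the explicit arguments
end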

Random crashes on printing `ConcreteRArray` content result of a call

I'm randomly running into a crash when printing the result of a stablehlo.einsum call right after it returns.

I believe it might be a problem with buffer synchronization because:

  1. Execution takes the same time regardless of matrix size
  2. The error is more prone to appear for larger matrix sizes
  3. The error still happens even if I call XLA.await on the result buffer and XLA.is_ready returns true

Also, the problem might be linked to using struct types, as I've been unable to recreate the error when working directly on arrays.

julia> f(a′, b′)
2048×2048 Tensor{Float64, 2, Reactant.ConcreteRArray{Float64, (2048, 2048), 2}}:
   0.0    233.947  239.041  242.471  244.843  233.684  234.742  242.731  241.478  235.834  246.115  228.468  162.586   153.2       241.457   238.066   238.748   231.795   250.899   Error showing value of type Tensor{Float64, 2, Reactant.ConcreteRArray{Float64, (2048, 2048), 2}}:
ERROR: ArgumentError: can't repeat a string -1 times
Stacktrace:
  [1] repeat(s::String, r::Int64)
    @ Base ./strings/substring.jl:263
  [2] repeat
    @ Base ./strings/substring.jl:260 [inlined]
  [3] print_matrix_row(io::IOContext{Base.TTY}, X::AbstractVecOrMat, A::Vector{Tuple{Int64, Int64}}, i::Int64, cols::Vector{Int64}, sep::String, idxlast::Int64)
    @ Base ./arrayshow.jl:118
  [4] _print_matrix(io::IOContext{Base.TTY}, X::AbstractVecOrMat, pre::String, sep::String, post::String, hdots::String, vdots::String, ddots::String, hmod::Int64, vmod::Int64, rowsA::UnitRange{Int64}, colsA::UnitRange{Int64})
    @ Base ./arrayshow.jl:254
  [5] print_matrix(io::IOContext{Base.TTY}, X::Tensor{Float64, 2, Reactant.ConcreteRArray{Float64, (2048, 2048), 2}}, pre::String, sep::String, post::String, hdots::String, vdots::String, ddots::String, hmod::Int64, vmod::Int64)
    @ Base ./arrayshow.jl:171
  [6] print_matrix(io::IO, X::AbstractVecOrMat, pre::AbstractString, sep::AbstractString, post::AbstractString, hdots::AbstractString, vdots::AbstractString, ddots::AbstractString, hmod::Integer, vmod::Integer)
    @ Base ./arrayshow.jl:171 [inlined]
  [7] print_array
    @ ./arrayshow.jl:358 [inlined]
  [8] show(io::IOContext{Base.TTY}, ::MIME{Symbol("text/plain")}, X::Tensor{Float64, 2, Reactant.ConcreteRArray{Float64, (2048, 2048), 2}})
    @ Base ./arrayshow.jl:399
  [9] (::REPL.var"#55#56"{REPL.REPLDisplay{REPL.LineEditREPL}, MIME{Symbol("text/plain")}, Base.RefValue{Any}})(io::Any)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:273
 [10] with_repl_linfo(f::Any, repl::REPL.LineEditREPL)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:569
 [11] display(d::REPL.REPLDisplay, mime::MIME{Symbol("text/plain")}, x::Any)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:259
 [12] display(d::REPL.REPLDisplay, x::Any)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:278
 [13] display(x::Any)
    @ Base.Multimedia ./multimedia.jl:340
 [14] #invokelatest#2
    @ ./essentials.jl:887 [inlined]
 [15] invokelatest
    @ ./essentials.jl:884 [inlined]
 [16] print_response(errio::IO, response::Any, show_value::Bool, have_color::Bool, specialdisplay::Union{Nothing, AbstractDisplay})
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:315
 [17] (::REPL.var"#57#58"{REPL.LineEditREPL, Pair{Any, Bool}, Bool, Bool})(io::Any)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:284
 [18] with_repl_linfo(f::Any, repl::REPL.LineEditREPL)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:569
 [19] print_response(repl::REPL.AbstractREPL, response::Any, show_value::Bool, have_color::Bool)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:282
 [20] (::REPL.var"#do_respond#80"{Bool, Bool, REPL.var"#93#103"{REPL.LineEditREPL, REPL.REPLHistoryProvider}, REPL.LineEditREPL, REPL.LineEdit.Prompt})(s::REPL.LineEdit.MIState, buf::Any, ok::Bool)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:911
 [21] (::VSCodeServer.var"#103#106"{REPL.var"#do_respond#80"{Bool, Bool, REPL.var"#93#103"{REPL.LineEditREPL, REPL.REPLHistoryProvider}, REPL.LineEditREPL, REPL.LineEdit.Prompt}})(mi::REPL.LineEdit.MIState, buf::IOBuffer, ok::Bool)
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/repl.jl:122
 [22] #invokelatest#2
    @ Base ./essentials.jl:887 [inlined]
 [23] invokelatest
    @ Base ./essentials.jl:884 [inlined]
 [24] run_interface(terminal::REPL.Terminals.TextTerminal, m::REPL.LineEdit.ModalInterface, s::REPL.LineEdit.MIState)
    @ REPL.LineEdit /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/LineEdit.jl:2656
 [25] run_frontend(repl::REPL.LineEditREPL, backend::REPL.REPLBackendRef)
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:1312
 [26] (::REPL.var"#62#68"{REPL.LineEditREPL, REPL.REPLBackendRef})()
    @ REPL /gpfs/apps/MN5/GPP/JULIA/1.10.0/INTEL/share/julia/stdlib/v1.10/REPL/src/REPL.jl:386

MWE

using Reactant
using Cassette

struct Tensor{T,N,A<:AbstractArray{T,N}} <: AbstractArray{T,N}
    data::A
    inds::Vector{Symbol}
end

Tensor(data::A, inds::AbstractVector{Symbol}) where {T,N,A<:AbstractArray{T,N}} = Tensor{T,N,A}(data, inds)

Base.parent(x::Tensor) = x.data
Base.size(t::Tensor) = size(parent(t))
Base.@propagate_inbounds Base.getindex(t::Tensor, i...) = getindex(parent(t), i...)

n = 2048
a = Tensor(rand(n,n), [:i, :j]);
b = Tensor(rand(n,n), [:j, :k])

a′ = Tensor(Reactant.ConcreteRArray(a.data), a.inds);
b′ = Tensor(Reactant.ConcreteRArray(b.data), b.inds);

contract(a,b) = a.data * b.data
function contract(a::Tensor{Ta,Na,Aa}, b::Tensor{Tb,Nb,Ab}) where {Ta,Na,Aa<:Reactant.TracedRArray,Tb,Nb,Ab<:Reactant.TracedRArray}
    ia = collect(a.inds)
    ib = collect(b.inds)
    i = ia ∩ ib

    ic::Vector{Symbol} = setdiff(ia ∪ ib, i)

    T = Base.promote_eltype(a, b)
    mlirty = Reactant.MLIR.IR.Type(T)

    op_a = parent(a).mlir_data
    op_b = parent(b).mlir_data
    rsize = (size(a.data, 1), size(b.data, 2))
    result_0 = Reactant.MLIR.IR.TensorType(rsize, mlirty)
    einsum_config = Reactant.MLIR.IR.Attribute("$(join(ia)),$(join(ib))->$(join(ic))")

    result = Reactant.MLIR.IR.result(Reactant.MLIR.Dialects.stablehlo.einsum(op_a, op_b; result_0, einsum_config))

    data = Reactant.TracedRArray{T,rsize,length(ic)}((), result)
    _res = Tensor(data, ic)
    return _res
end

f = Reactant.compile(contract, (a′,b′))

f(a′,b′)

Without Tensor, it seems to work

using Reactant
using Cassette

n = 2048
a = rand(n, n);
b = rand(n, n);

a′ = Reactant.ConcreteRArray(a);
b′ = Reactant.ConcreteRArray(b);

matmul(a,b) = a * b

function matmul(a::Reactant.TracedRArray{Ta,Sa,Na}, b::Reactant.TracedRArray{Tb,Sb,Nb}) where {Ta,Tb,Sa,Sb,Na,Nb}
    T = Base.promote_type(Ta, Tb)
    mlirty = Reactant.MLIR.IR.Type(T)
    rsize = (Sa[1], Sb[2])
    result_0 = Reactant.MLIR.IR.TensorType(rsize, mlirty)
    einsum_config = Reactant.MLIR.IR.Attribute("ij,jk->ik")

    result = Reactant.MLIR.IR.result(Reactant.MLIR.Dialects.stablehlo.einsum(a.mlir_data, b.mlir_data; result_0, einsum_config))

    return Reactant.TracedRArray{T,rsize,2}((), result)
end

Cassette.overdub(ctx::Reactant.TraceCtx, f::typeof(matmul), args...; kwargs...) = f(args...; kwargs...)

f = Reactant.compile(matmul, (a′,b′))

f(a′,b′)

What is a device to the RArray?

Alternatively, allow constructing RArrays with a device.

I don't see a device argument to ConcreteRArray. Is this currently not implemented?

Also, can we directly pass a Ptr to the data for ConcreteRArray? For example, if we have a CuArray input to a Lux model, I want to wrap it as a ConcreteRArray without paying the cost of a copy.

Inconsistent semantics of element-wise application of array functions compared to Julia

In Julia, exp, cos, sin, and other such functions, when applied to matrix inputs, do not return the element-wise application of the function but the "matrix" version of those functions (e.g. the matrix exponential). However, Reactant generates the element-wise application of those functions.

In order to keep consistency with Julia semantics, we should change this.

Example

julia> Reactant.@code_hlo sin(c)
Module:
module attributes {transform.with_named_sequence} {
  func.func @main(%arg0: tensor<2x3xf64>) -> tensor<2x3xf64> {
    %0 = stablehlo.sine %arg0 : tensor<2x3xf64>
    return %0 : tensor<2x3xf64>
  }
}

julia> f(x) = sin.(x) # need to define this way because `@code_hlo` doesn't like broadcasting
f (generic function with 1 method)

julia> Reactant.@code_hlo f(c)
Module:
module attributes {transform.with_named_sequence} {
  func.func @main(%arg0: tensor<2x3xf64>) -> tensor<2x3xf64> {
    %0 = stablehlo.sine %arg0 : tensor<2x3xf64>
    return %0 : tensor<2x3xf64>
  }
}
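
For reference, base Julia distinguishes the two on its own arrays; the matrix version is computed via the eigendecomposition rather than element-wise:

julia> A = rand(2, 2);

julia> sin(A)     # matrix sine; what `sin` on a traced array should lower to

julia> sin.(A)    # element-wise sine; what Reactant currently emits in both cases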

[Eager Mode] Broadcasting `ConcreteRArray`

Not needed for compilation, but useful if we want to allow more operations in eager mode.

Broadcasting doesn't preserve Array type

julia> using Reactant

julia> Reactant.ConcreteRArray(rand(10, 3))
10×3 Reactant.ConcreteRArray{Float64, (10, 3), 2}:
 0.241899   0.892     0.344609
 0.658603   0.978326  0.0684158
 0.154474   0.506356  0.737901
 0.665978   0.635999  0.312166
 0.955006   0.613347  0.834144
 0.261397   0.361902  0.758361
 0.838547   0.939787  0.191731
 0.336063   0.649798  0.305722
 0.0539672  0.754429  0.444449
 0.964148   0.524124  0.628569

julia> Reactant.ConcreteRArray(rand(10, 3)) .+ 1
10×3 Matrix{Float64}:
 1.60667  1.2144   1.15662
 1.79119  1.18538  1.81092
 1.01788  1.73736  1.10406
 1.88946  1.51043  1.7367
 1.11927  1.46719  1.82236
 1.18602  1.32522  1.6286
 1.12319  1.79284  1.85603
 1.42769  1.33347  1.96626
 1.40952  1.39544  1.52497
 1.23689  1.28314  1.1096

Currently expected since we don't have the broadcast style overloads.
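
A minimal sketch of the missing overloads, following the standard Base.Broadcast interface (illustrative only: materialization here round-trips through a host Array and scalar indexing, whereas a real implementation would lower to device operations):

struct ConcreteRArrayStyle{N} <: Base.Broadcast.AbstractArrayStyle{N} end

# allow dimensionality changes during style promotion
ConcreteRArrayStyle{N}(::Val{M}) where {N,M} = ConcreteRArrayStyle{M}()

Base.BroadcastStyle(::Type{<:Reactant.ConcreteRArray{T,Shape,N}}) where {T,Shape,N} =
    ConcreteRArrayStyle{N}()

# allocate the destination as a ConcreteRArray so `cr .+ 1` stays a ConcreteRArray
function Base.similar(bc::Base.Broadcast.Broadcasted{ConcreteRArrayStyle{N}}, ::Type{T}) where {N,T}
    return Reactant.ConcreteRArray(Array{T}(undef, map(length, axes(bc))...))
end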

Crash on `Reactant.compile`

When trying to run the tests on my aarch64 macOS machine, I get the following fatal error on the line

f=Reactant.compile(fastmax, (a,))

Precompiling project...
  7 dependencies successfully precompiled in 20 seconds. 122 already precompiled.
     Testing Running tests...
Test Summary: | Pass  Total  Time
Layout        |    5      5  0.1s
F0504 21:01:25.451342 38805431 simple_orc_jit.cc:310] Check failed: target_machine != nullptr 
*** Check failure stack trace: ***
    @        0x1601222b0  absl::lts_20230802::log_internal::LogMessage::SendToLog()
    @        0x160121cac  absl::lts_20230802::log_internal::LogMessage::Flush()
    @        0x16012268c  absl::lts_20230802::log_internal::LogMessageFatal::~LogMessageFatal()
    @        0x1601226ac  absl::lts_20230802::log_internal::LogMessageFatal::~LogMessageFatal()
    @        0x15c6435c0  xla::cpu::SimpleOrcJIT::InferTargetMachineForJIT()
    @        0x15be5668c  xla::cpu::CpuCompiler::RunHloPasses()
    @        0x15be2c130  xla::TfrtCpuClient::Compile()
    @        0x15be2ceb8  xla::TfrtCpuClient::Compile()
    @        0x15b960608  ClientCompile
    @        0x122db1ccc  0x0
    @        0x122db27d0  0x0
    @        0x122b80ce4  0x0
    @        0x122b880bc  0x0
    @        0x1015c23b4  do_call
    @        0x1015c08ec  eval_body
    @        0x1015c0e08  eval_body
    @        0x1015c0e08  eval_body
    @        0x1015c11d0  jl_interpret_toplevel_thunk
    @        0x1015d96f8  jl_toplevel_eval_flex
    @        0x1015d94c0  jl_toplevel_eval_flex
    @        0x1015da3a8  ijl_toplevel_eval_in
    @        0x130aa1ab8  japi1_include_string_81107.4
    @        0x1015a7cb4  ijl_apply_generic
    @        0x13125fcbc  japi1__include_81115.4
    @        0x1015c23b4  do_call
    @        0x1015c0bb0  eval_body
    @        0x1015c11d0  jl_interpret_toplevel_thunk
    @        0x1015d96f8  jl_toplevel_eval_flex
    @        0x1015d94c0  jl_toplevel_eval_flex
    @        0x1015da3a8  ijl_toplevel_eval_in
    @        0x130aa1ab8  japi1_include_string_81107.4
    @        0x1015a7cb4  ijl_apply_generic
    @        0x13125fcbc  japi1__include_81115.4
    @        0x1015c23b4  do_call
    @        0x1015c0bb0  eval_body
    @        0x1015c11d0  jl_interpret_toplevel_thunk
    @        0x1015d96f8  jl_toplevel_eval_flex
    @        0x1015d94c0  jl_toplevel_eval_flex
    @        0x1015da3a8  ijl_toplevel_eval_in
    @        0x13172133c  jlplt_ijl_toplevel_eval_in_25935.3
    @        0x1016057b0  true_main
    @        0x1016056a4  jl_repl_entrypoint
    @        0x100b93f6c  main
    @        0x18d02a0e0  start

[46954] signal (6): Abort trap: 6
in expression starting at /Users/mofeing/Developer/Reactant.jl/test/basic.jl:7
__pthread_kill at /usr/lib/system/libsystem_kernel.dylib (unknown line)
Allocations: 5647750 (Pool: 5642220; Big: 5530); GC: 7
ERROR: Package Reactant errored during testing (received signal: 6)

I'm opening an issue because I don't see this error on CI.

Crash on precompiling `ReactantNNlibExt`

When trying to run the tests, I get the following error:

Precompiling project...
  ✗ Reactant → ReactantNNlibExt
  1 dependency successfully precompiled in 7 seconds. 127 already precompiled.
  1 dependency had output during precompilation:
┌ Reactant
│  error in running finalizer: TypeError(func=Symbol("ccall: first argument not a pointer or valid constant expression"), context="", expected=Ptr{T} where T, got=(:PjRtBufferFree, 0x000000008fc4a150))
│  ijl_type_error_rt at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/rtutils.c:119
│  ijl_type_error at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/rtutils.c:127
│  #5 at /Users/mofeing/Developer/Reactant.jl/src/XLA.jl:114
│  unknown function (ip: 0x105c080e3)
│  _jl_invoke at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/gf.c:0 [inlined]
│  ijl_apply_generic at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/gf.c:3077
│  run_finalizer at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/gc.c:318
│  jl_gc_run_finalizers_in_list at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/gc.c:408
│  run_finalizers at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/gc.c:454
│  ijl_atexit_hook at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/init.c:299
│  jl_repl_entrypoint at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-HL2F7YQ3XH.0/build/default-honeycrisp-HL2F7YQ3XH-0/julialang/julia-release-1-dot-10/src/jlapi.c:732
└  
     Testing Running tests...
libc++abi: terminating due to uncaught exception of type xla::XlaRuntimeError: NOT_FOUND: Could not find registered platform with name: "cuda". Available platform names are: 

[42937] signal (6): Abort trap: 6
in expression starting at /Users/mofeing/Developer/Reactant.jl/test/runtests.jl:1
__pthread_kill at /usr/lib/system/libsystem_kernel.dylib (unknown line)
Allocations: 2908 (Pool: 2900; Big: 8); GC: 0
ERROR: Package Reactant errored during testing (received signal: 6)

Fix precompilation in macOS

When running the example in the README, it crashes with the following error.

julia> f=Reactant.compile(sinsum_add, (input1,input2))
F0626 06:23:06.397895 67363613 simple_orc_jit.cc:311] Check failed: target_machine != nullptr 
*** Check failure stack trace: ***
    @        0x30f082cc0  absl::lts_20230802::log_internal::LogMessage::SendToLog()
    @        0x30f0826bc  absl::lts_20230802::log_internal::LogMessage::Flush()
    @        0x30f08309c  absl::lts_20230802::log_internal::LogMessageFatal::~LogMessageFatal()
    @        0x30f0830bc  absl::lts_20230802::log_internal::LogMessageFatal::~LogMessageFatal()
    @        0x30b62c0b0  xla::cpu::SimpleOrcJIT::InferTargetMachineForJIT()
    @        0x30b5976ec  xla::cpu::CpuCompiler::RunHloPasses()
    @        0x30b4328f0  xla::TfrtCpuClient::Compile()
    @        0x30b433678  xla::TfrtCpuClient::Compile()
    @        0x30af44610  ClientCompile
    @        0x1463a015c  0x0
    @        0x135b5ccd0  0x0
    @        0x135b840bc  0x0
    @        0x10309a484  do_call
    @        0x103098c80  eval_body
    @        0x1030992a0  jl_interpret_toplevel_thunk
    @        0x1030b17e4  jl_toplevel_eval_flex
    @        0x1030b15ac  jl_toplevel_eval_flex
    @        0x1030b15ac  jl_toplevel_eval_flex
    @        0x1030b15ac  jl_toplevel_eval_flex
    @        0x1030b2494  ijl_toplevel_eval_in
    @        0x1382f2268  japi1_eval_user_input_91784.4
    @        0x136bb6474  julia_YY.run_replYY.59_91871.3
    @        0x13606e5d4  julia_YY.1013_82830.3
    @        0x10308e1c8  jl_f__call_latest
    @        0x138301228  julia_run_main_repl_82800.4
    @        0x1030dd910  true_main
    @        0x1030dd804  jl_repl_entrypoint
    @        0x10266bf6c  main
    @        0x1951020e0  start

[92763] signal (6): Abort trap: 6
in expression starting at REPL[5]:1
__pthread_kill at /usr/lib/system/libsystem_kernel.dylib (unknown line)
Allocations: 8223320 (Pool: 8215026; Big: 8294); GC: 13
fish: Job 1, 'julia --project=.' terminated by signal SIGABRT (Abort)

The error seems to originate from XLA's and Julia's LLVM symbols getting mixed.

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Macro to pretty print the stable hlo code

Something similar to @code_llvm, but printing the generated StableHLO code.

This is probably not hard: the module is already available inside the compile function; it would just be nice to expose it.
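
A sketch of the intended usage, mirroring @code_llvm (the @code_hlo spelling is just a suggestion here, borrowed from the examples that appear elsewhere in this list):

julia> c = Reactant.ConcreteRArray(rand(2, 3));

julia> Reactant.@code_hlo sum(c)   # prints the generated StableHLO module instead of compiling and running it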

Define `similar` for Reactant.ConcreteRArray

The in-place Lux functions currently fail because similar constructs a plain Matrix:

using Reactant, Lux, Random
using Test

# Generate some data for the XOR problem: vectors of length 2, as columns of a matrix:
noisy = rand(Float32, 2, 1000)                                        # 2×1000 Matrix{Float32}
truth = [xor(col[1] > 0.5, col[2] > 0.5) for col in eachcol(noisy)]   # 1000-element Vector{Bool}

# Define our model, a multi-layer perceptron with one hidden layer of size 3:
model = Chain(Dense(2 => 3, tanh),   # activation function inside layer
    BatchNorm(3), Dense(3 => 2), softmax)
ps, st = Lux.setup(Xoshiro(123), model)

using BenchmarkTools

origout, _ = model(noisy, ps, st)
@show origout[3]
@btime model($noisy, $ps, $st)

cmodel = Reactant.make_tracer(IdDict(), model, (), Reactant.ArrayToConcrete, nothing)
cps = Reactant.make_tracer(IdDict(), ps, (), Reactant.ArrayToConcrete, nothing)
cst = Reactant.make_tracer(IdDict(), st, (), Reactant.ArrayToConcrete, nothing)
cnoisy = Reactant.ConcreteRArray(noisy)

f = Reactant.compile((a, b, c, d) -> a(b, c, d), (cmodel, cnoisy, cps, cst))

Minimal version

using Reactant

noisy = rand(Float32, 2, 1000)
cnoisy = Reactant.ConcreteRArray(noisy)
similar(cnoisy)  # Matrix
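
A minimal sketch of a definition that would keep the Reactant type, reusing the ConcreteRArray(::Array) constructor used throughout these issues (a real fix might allocate directly on the device instead of round-tripping through a host Array):

function Base.similar(x::Reactant.ConcreteRArray{T}, ::Type{T2}) where {T,T2}
    # fresh, uninitialized storage; no aliasing of x.data
    return Reactant.ConcreteRArray(Array{T2}(undef, size(x)...))
end

Base.similar(x::Reactant.ConcreteRArray{T}) where {T} = similar(x, T)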

Display releases on GitHub?

Since this package is evolving fast, having releases show up on GitHub (not just in the General registry) would be very useful, especially for tracking changes.

Pipeline Error Reactant + Enzyme

I reduced the Lux version down to just Enzyme + Reactant:

using Reactant, Enzyme

function simple_reduce(w, x)
    y = sum(w * x; dims=1)
    return sum(y)
end

w = Reactant.ConcreteRArray(randn(Float32, 10, 3))
x = Reactant.ConcreteRArray(randn(Float32, 3, 5))

simple_reduce_xla = Reactant.compile(simple_reduce, (w, x))

simple_reduce_xla(w, x) # Works

function simple_reduce_grad(w, x)
    dw = Enzyme.make_zero(w)
    dx = Enzyme.make_zero(x)
    Enzyme.autodiff(
        Enzyme.Reverse, simple_reduce, Active, Duplicated(w, dw), Duplicated(x, dx))
    return dw, dx
end

simple_reduce_grad_xla = Reactant.compile(simple_reduce_grad, (w, x))

simple_reduce_grad_xla(w, x)
error: size of operand dimension 0 (5) is not equal to 1 or size of result dimension 0 (10)
Pipeline failed

If you replace the function with the following, it works:

function simple_reduce(w, x)
    return sum(w * x)
end

I think it is stemming from the reduce operation being generated incorrectly; see LuxDL/Lux.jl#665 (comment) (I couldn't reduce that further yet).
