Comments (3)
The problem is that `inv` doesn't accept sufficiently exotic AbstractMatrix types:
julia> HH = view(P,:,[1,2,4],:)
3×3×100 view(lazystack(::Vector{Matrix{Float64}}), :, [1, 2, 4], :) ...
julia> inv(HH[:,:,1]) # inv(::Matrix)
3×3 Matrix{Float64}:
6.7336 -18.7875 21.6366
-4.88027 1.76205 0.148484
2.17953 -14.5834 16.9896
julia> inv(@view HH[:,:,1]) # same error as @cast
ERROR: MethodError: no method matching factorize(::SubArray{Float64, 2, LazyStack.Stacked{...
julia> @cast invH[i,j,k] := inv(HH[:,:,k])[i,j] lazy=false
3×3×100 Array{Float64, 3}:
`@cast` is quite lazy by default, which saves allocations, and scalar broadcasting tends to work fine with lazy types. But after a few operations you can end up with a very complicated wrapper type, which may be slow to work with, or, as here, fail completely. `lazy=false` should opt out of all of this.
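A minimal sketch of the same workaround in plain Base Julia, without TensorCast: wrapped array types can fall through the cracks of `inv`/`factorize` dispatch, but materialising the wrapper with `collect` always yields a plain `Matrix`, which `inv` handles. (The array `A` below is a made-up example, not from the issue.)

```julia
A = [2.0 1.0; 1.0 3.0]
V = view(A, :, [2, 1])   # a SubArray with an index vector, one kind of "exotic" wrapper
B = inv(collect(V))      # collect produces a plain Matrix{Float64}, so inv is safe
@assert B * collect(V) ≈ [1.0 0.0; 0.0 1.0]   # B really is the inverse
```

This is essentially what `lazy=false` does for you: force a plain `Array` before applying the function.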
from tensorcast.jl.
Thanks for the response.
What do you mean by "slow to deal with"? I assumed that resolving the type would not affect the runtime (after the first run).
Yes, complicated types alone aren't a problem. But complicated array types are, partly because accessing their elements involves more steps (like translating indices for views, and following different pointers for slices), and partly because they prevent the use of optimised routines which expect a plain Array:
julia> @btime sum(x) setup=(x=randn(30,30));
min 247.120 ns, mean 249.191 ns (0 allocations)
julia> @btime x*x setup=(x=randn(30,30));
min 1.483 μs, mean 3.319 μs (1 allocation, 7.19 KiB)
julia> using LazyStack
julia> @btime sum(x) setup=(x=view(lazystack([randn(30) for _ in 1:30]), 30:-1:1, :));
min 4.411 μs, mean 4.446 μs (0 allocations) # complicated to access each element
julia> @btime x*x setup=(x=lazystack([randn(30) for _ in 1:30])); # won't use BLAS
min 13.375 μs, mean 16.052 μs (3 allocations, 27.69 KiB)
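As a sketch of the common fix for the `x*x` case above: materialise the lazy stack once into a plain `Matrix` (here with Base's `stack`, available in Julia ≥ 1.9; `reduce(hcat, ...)` works on older versions), so the multiplication dispatches to the BLAS path.

```julia
cols = [randn(30) for _ in 1:30]
X = stack(cols)        # one eager copy: a plain 30×30 Matrix{Float64}
Y = X * X              # plain Matrix * Matrix uses the optimised BLAS routine
@assert Y isa Matrix{Float64} && size(Y) == (30, 30)
```

You pay one allocation up front, but every subsequent operation sees a simple `Array`.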
I don't think there's a great general rule for when such costs outweigh the cost of making an array. This package's defaults are quite lazy, with cases like this in mind:
julia> @btime abs2.(Compat.stack(x)) setup=(x=[randn(30) for _ in 1:30]);
min 900.615 ns, mean 3.502 μs (2 allocations, 14.38 KiB)
julia> @btime abs2.(lazystack(x)) setup=(x=[randn(30) for _ in 1:30]);
min 549.690 ns, mean 1.710 μs (1 allocation, 7.19 KiB)
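The saving in the lazy version comes from broadcast fusion: `abs2.` over any AbstractArray runs as one fused loop, so a free lazy wrapper skips the intermediate copy and only the output is allocated. A Base-only sketch, using a view as a stand-in for `lazystack`:

```julia
cols = [collect(1.0:3.0) for _ in 1:2]
M = reduce(hcat, cols)   # eager: allocates a 3×2 Matrix first
L = view(M, :, :)        # lazy wrapper: no copy made
out = abs2.(L)           # fused broadcast; allocates only `out` itself
@assert out == abs2.(M)  # same result, one fewer intermediate array
```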