juliaperf / LIKWID.jl

Julia wrapper for the performance monitoring and benchmarking suite LIKWID.
Home Page: https://juliaperf.github.io/LIKWID.jl/dev/
License: MIT License
As far as I can see, there is no group switch in the example code at https://juliaperf.github.io/LIKWID.jl/dev/examples/perfmon/, which is why only the first group gets measured (`|` is the group delimiter). You would need to run your benchmark code twice: once for the first group, then "switch groups" and run your benchmark again for the second group measurement. The two groups have different `gid`s, so when reading the results/metrics, you have to specify the right `gid`.
Originally posted by @TomTheBear in #28 (comment)
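To make the group handling concrete: `|` separates groups in an event-set string, so a string like the one in the marker example further below defines three groups, each of which gets its own `gid` and has to be measured in its own pass. A minimal pure-Julia sketch of the splitting; the calls named in the comments are illustrative usage under that two-pass scheme, not a tested recipe:

```julia
# '|' is the group delimiter in a LIKWID event-set string.
eventstr = "FLOPS_DP|L2|INSTR_RETIRED_ANY:FIXC0"
groups = split(eventstr, '|')
@show groups   # three separate groups

# Each group would be registered separately and yield its own `gid`,
# roughly (hypothetical usage of the PerfMon bindings):
#   gids = [PerfMon.add_event_set(String(g)) for g in groups]
# The benchmark then has to be re-run once per `gid`, switching the
# active group in between, and results must be read with the matching `gid`.
```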
julia> using CUDA
julia> CUDA.functional()
true
julia> using LIKWID
julia> LIKWID.gpusupport()
true
julia> NvMon.init([0])
true
julia> gid = NvMon.add_event_set("FLOPS_DP")
1
julia> NvMon.setup_counters(gid)
true
julia> NvMon.start_counters()
true
julia> NvMon.stop_counters()
signal (11): Segmentation fault
in expression starting at REPL[13]:1
nvmon_perfworks_stopCounters at /upb/departments/pc2/groups/pc2-mitarbeiter/bauerc/n2/easybuild/software/likwid-gpu/5.2.1-GCC-11.2.0/lib/liblikwid.so (unknown line)
nvmon_stopCounters at /upb/departments/pc2/groups/pc2-mitarbeiter/bauerc/n2/easybuild/software/likwid-gpu/5.2.1-GCC-11.2.0/lib/liblikwid.so (unknown line)
nvmon_stopCounters at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/LibLikwid.jl:1924 [inlined]
stop_counters at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/nvmon.jl:212
unknown function (ip: 0x1523ee1247b4)
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
do_call at /buildworker/worker/package_linux64/build/src/interpreter.c:126
eval_value at /buildworker/worker/package_linux64/build/src/interpreter.c:215
eval_stmt_value at /buildworker/worker/package_linux64/build/src/interpreter.c:166 [inlined]
eval_body at /buildworker/worker/package_linux64/build/src/interpreter.c:587
jl_interpret_toplevel_thunk at /buildworker/worker/package_linux64/build/src/interpreter.c:731
jl_toplevel_eval_flex at /buildworker/worker/package_linux64/build/src/toplevel.c:885
jl_toplevel_eval_flex at /buildworker/worker/package_linux64/build/src/toplevel.c:830
eval_body at /buildworker/worker/package_linux64/build/src/interpreter.c:550
eval_body at /buildworker/worker/package_linux64/build/src/interpreter.c:516
jl_interpret_toplevel_thunk at /buildworker/worker/package_linux64/build/src/interpreter.c:731
jl_toplevel_eval_flex at /buildworker/worker/package_linux64/build/src/toplevel.c:885
jl_toplevel_eval_in at /buildworker/worker/package_linux64/build/src/toplevel.c:944
eval at ./boot.jl:373 [inlined]
eval_user_input at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:150
repl_backend_loop at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:246
start_repl_backend at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:231
#run_repl#47 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:364
run_repl at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:351
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
#930 at ./client.jl:394
jfptr_YY.930_45169.clone_1 at /opt/software/pc2/EB-SW/software/JuliaHPC/1.7.2-fosscuda-2022a-linux-x86_64/lib/julia/sys.so (unknown line)
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
jl_f__call_latest at /buildworker/worker/package_linux64/build/src/builtins.c:757
#invokelatest#2 at ./essentials.jl:716 [inlined]
invokelatest at ./essentials.jl:714 [inlined]
run_main_repl at ./client.jl:379
exec_options at ./client.jl:309
_start at ./client.jl:495
jfptr__start_38732.clone_1 at /opt/software/pc2/EB-SW/software/JuliaHPC/1.7.2-fosscuda-2022a-linux-x86_64/lib/julia/sys.so (unknown line)
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
true_main at /buildworker/worker/package_linux64/build/src/jlapi.c:559
jl_repl_entrypoint at /buildworker/worker/package_linux64/build/src/jlapi.c:701
main at julia (unknown line)
__libc_start_main at /lib64/libc.so.6 (unknown line)
unknown function (ip: 0x400808)
Allocations: 17039085 (Pool: 17034285; Big: 4800); GC: 11
Segmentation fault (core dumped)
To reproduce / for copy-paste:
using LIKWID
LIKWID.gpusupport()
NvMon.init([0])
gid = NvMon.add_event_set("FLOPS_DP")
NvMon.setup_counters(gid)
NvMon.start_counters()
NvMon.stop_counters()
Not sure yet whether this is an issue with likwid, LIKWID.jl, or the likwid installation on Noctua 2.
(cc @TomTheBear)
Like `@perfmon`, but only analyzing marked regions.
EDIT: The particular test cases work fine if I start Julia as
likwid-perfctr -C 0 -g FLOPS_SP -G 0 -W FLOPS_SP -m julia
However, the tests still fail (log given in a post below).
I am getting the following issues when trying to use LIKWID. Starting the Julia REPL as usual with `$ julia` (which might be wrong for LIKWID.jl?) and executing `] test LIKWID`, I get
Testing Running tests...
[ Info: No CUDA/GPU found.
Libdl.find_library("libcuda") = ""
filter(contains("cuda"), lowercase.(Libdl.dllist())) = String[]
[ Info: CUDA.versioninfo():
┌ Warning: Unsuccessful!
└ @ Main ~/.julia/packages/LIKWID/pbxmY/test/runtests.jl:27
ErrorException("could not load symbol \"cuDriverGetVersion\":\n/opt/julia-1.7.1/bin/julia: undefined symbol: cuDriverGetVersion")
┌ Error: Exception while generating log record in module Main at /home/arpit/.julia/packages/LIKWID/pbxmY/test/runtests.jl:34
│ exception =
│ could not load library "liblikwid"
│ liblikwid.so: cannot open shared object file: No such file or directory
│ Stacktrace:
│ [1] dlopen(s::String, flags::UInt32; throw_error::Bool)
│ @ Base.Libc.Libdl ./libdl.jl:117
│ [2] dlopen (repeats 2 times)
│ @ ./libdl.jl:117 [inlined]
│ [3] dlopen(f::LIKWID.var"#8#9", args::String; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
│ @ Base.Libc.Libdl ./libdl.jl:144
│ [4] dlopen
│ @ ./libdl.jl:142 [inlined]
│ [5] _check_likwid_gpusupport
│ @ ~/.julia/packages/LIKWID/pbxmY/src/misc.jl:51 [inlined]
│ [6] gpusupport()
│ @ LIKWID ~/.julia/packages/LIKWID/pbxmY/src/misc.jl:45
│ [7] top-level scope
│ @ logging.jl:361
│ [8] include(fname::String)
│ @ Base.MainInclude ./client.jl:451
│ [9] top-level scope
│ @ none:6
│ [10] eval
│ @ ./boot.jl:373 [inlined]
│ [11] exec_options(opts::Base.JLOptions)
│ @ Base ./client.jl:268
│ [12] _start()
│ @ Base ./client.jl:495
└ @ Main ~/.julia/packages/LIKWID/pbxmY/test/runtests.jl:34
ERROR: LoadError: could not load library "liblikwid"
liblikwid.so: cannot open shared object file: No such file or directory
Stacktrace:
[1] dlopen(s::String, flags::UInt32; throw_error::Bool)
@ Base.Libc.Libdl ./libdl.jl:117
[2] dlopen (repeats 2 times)
@ ./libdl.jl:117 [inlined]
[3] dlopen(f::LIKWID.var"#8#9", args::String; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ Base.Libc.Libdl ./libdl.jl:144
[4] dlopen
@ ./libdl.jl:142 [inlined]
[5] _check_likwid_gpusupport
@ ~/.julia/packages/LIKWID/pbxmY/src/misc.jl:51 [inlined]
[6] gpusupport()
@ LIKWID ~/.julia/packages/LIKWID/pbxmY/src/misc.jl:45
[7] top-level scope
@ ~/.julia/packages/LIKWID/pbxmY/test/runtests.jl:37
[8] include(fname::String)
@ Base.MainInclude ./client.jl:451
[9] top-level scope
@ none:6
in expression starting at /home/arpit/.julia/packages/LIKWID/pbxmY/test/runtests.jl:37
ERROR: Package LIKWID errored during testing
I am able to run `perfctr.jl` and `saxpy.jl` (removing the GPU part for now) correctly, i.e., with commands like `likwid-perfctr -C 0 -g FLOPS_SP -G 0 -W FLOPS_SP -m julia --project=. saxpy.jl`, and I am also able to run everything I tried from https://github.com/RRZE-HPC/likwid. Another place I get the same error is when I try, within the REPL,
julia> topo = LIKWID.get_cpu_topology()
ERROR: could not load library "liblikwid"
liblikwid.so: cannot open shared object file: No such file or directory
Stacktrace:
[1] topology_init
@ ~/.julia/packages/LIKWID/pbxmY/src/LibLikwid.jl:712 [inlined]
[2] init_topology()
@ LIKWID ~/.julia/packages/LIKWID/pbxmY/src/topology.jl:3
[3] get_cpu_topology()
@ LIKWID ~/.julia/packages/LIKWID/pbxmY/src/topology.jl:111
[4] top-level scope
@ REPL[3]:1
while running `$ likwid-topology` in a shell gives the correct output. Here is my `versioninfo()` output:
Julia Version 1.7.1
Commit ac5cc99908 (2021-12-22 19:35 UTC)
Platform Info:
OS: Linux (x86_64-pc-linux-gnu)
CPU: Intel(R) Core(TM) i7-3770 CPU @ 3.40GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-12.0.1 (ORCJIT, ivybridge)
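For what it's worth, the `could not load library "liblikwid"` errors above just mean the dynamic linker cannot find `liblikwid.so` from inside the Julia process, even though the shell finds the likwid tools. A small diagnostic sketch using Julia's stdlib (the extra search path in the comment is only an example, not the required location):

```julia
using Libdl

# LIKWID.jl ultimately dlopen()s "liblikwid"; find_library mirrors
# the same search, returning "" if the library cannot be located.
lib = Libdl.find_library("liblikwid")
if lib == ""
    # Typical fix: export LD_LIBRARY_PATH=<likwid prefix>/lib before
    # starting Julia, or probe candidate directories explicitly:
    lib = Libdl.find_library("liblikwid", ["/usr/local/lib"])
end
lib == "" && @warn "liblikwid not found; set LD_LIBRARY_PATH before starting Julia"
```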
This would be nice to have.
See https://github.com/RRZE-HPC/likwid/wiki/LIKWID-and-Nvidia-GPUs and https://github.com/RRZE-HPC/likwid/wiki/TutorialNvidiaGPUs.
https://github.com/RRZE-HPC/pylikwid
"Priority" list:
likwid_markerGetRegion(regionTag, nevents, events, time, count)
(see pylikwid.markergetregion) to query results interactively?

Including this code twice at the REPL gives me a segfault:
using LIKWID
using LinearAlgebra
a = randn(1000, 1000)
function f()
a*a
end
peak = LinearAlgebra.peakflops() # FLOP/s
using STREAMBenchmark
bw = memory_bandwidth(write_allocate=false)[1] * (1024^2) #byte/s
# intensity is FLOP per byte
roof(intensity) = min(peak, intensity*bw)
NUM_THREADS = 1
cpustr = join(collect(0:NUM_THREADS-1), ",")
LIKWID.LIKWID_THREADS(cpustr)
# the location the marker file will be stored
LIKWID.LIKWID_FILEPATH(joinpath(@__DIR__, "likwid_marker.out"))
# Use the access daemon
LIKWID.LIKWID_MODE(1)
# Overwrite registers (if they are in use)
LIKWID.LIKWID_FORCE(true)
# Debug level
LIKWID.LIKWID_DEBUG(0)
# Events to measure
LIKWID.LIKWID_EVENTS("FLOPS_DP|L2|INSTR_RETIRED_ANY:FIXC0")
using LIKWID: PerfMon
Marker.init_nothreads()
PerfMon.init()
Marker.registerregion("total")
Marker.startregion("total")
for _ = 1:100
f()
end
Marker.stopregion("total")
nevents, events, time, count = Marker.getregion("total")
Marker.close()
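The roofline pieces in the snippet above reduce to simple arithmetic. A standalone sketch with made-up peak and bandwidth numbers (the real script obtains them from `LinearAlgebra.peakflops()` and STREAMBenchmark.jl):

```julia
# Hypothetical machine numbers, purely for illustration:
peak = 50.0e9          # FLOP/s
bw   = 20.0e9          # byte/s

# Roofline model: attainable performance at arithmetic intensity I (FLOP/byte).
roof(I) = min(peak, I * bw)

# Ridge point: intensity above which the machine becomes compute-bound.
ridge = peak / bw      # 2.5 FLOP/byte here

@show roof(1.0)        # memory-bound: 2.0e10 FLOP/s
@show roof(10.0)       # compute-bound: capped at peak
```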
https://github.com/SciML/SciMLStyle
Example ThreadPinning.jl, see here: https://github.com/carstenbauer/ThreadPinning.jl/tree/main/test
When likwid fails to identify the CPU, it returns an only partially initialized CPU info struct. However, during topology resolution (here:
Lines 79 to 80 in 5d2cd59
Reproducible on master for likwid and LIKWID.jl, as well as for both latest releases.
using LIKWID
using CUDA
LIKWID.init_topology_gpu()
N = 10_000
a = 3.141f0 # Float32
x = CUDA.rand(Float32, N)
y = CUDA.rand(Float32, N)
z = CUDA.zeros(Float32, N)
saxpy!(z, a, x, y) = z .= a .* x .+ y
saxpy!(z, a, x, y); # warmup
metrics, events = @nvmon "FLOPS_SP" saxpy!(z, a, x, y);
gives
julia> metrics, events = @nvmon "FLOPS_SP" saxpy!(z, a, x, y);
Group: FLOPS_SP
┌────────────────────────────────────────────────────┬─────────┐
│ Event │ GPU 1 │
├────────────────────────────────────────────────────┼─────────┤
│ SMSP_SASS_THREAD_INST_EXECUTED_OP_FADD_PRED_ON_SUM │ 0.0 │
│ SMSP_SASS_THREAD_INST_EXECUTED_OP_FMUL_PRED_ON_SUM │ 0.0 │
│ SMSP_SASS_THREAD_INST_EXECUTED_OP_FFMA_PRED_ON_SUM │ 10000.0 │
└────────────────────────────────────────────────────┴─────────┘
┌─────────────────────┬────────────┐
│ Metric │ GPU 1 │
├─────────────────────┼────────────┤
│ Runtime (RDTSC) [s] │ 1.84467e10 │
│ SP [MFLOP/s] │ 1.0842e-12 │
└─────────────────────┴────────────┘
The metric values don't make sense!?
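The raw FFMA event actually looks plausible: `saxpy!` performs one fused multiply-add per element, so for N = 10_000 one expects exactly 10_000 FFMA instructions (2N = 20_000 FLOPs), which matches the event table. It is the derived metrics that are off; note that the reported runtime of ~1.84467e10 s is suspiciously close to 2^64 nanoseconds, which smells like a wrapped or uninitialized timer (an observation, not a confirmed diagnosis). A pure-Julia sanity check:

```julia
N = 10_000
ffma_expected  = N        # one FMA per element of z .= a .* x .+ y
flops_expected = 2 * N    # each FMA counts as a multiply plus an add

@assert ffma_expected == 10_000   # matches the FFMA event above
@assert flops_expected == 20_000

# 2^64 / 1e9 ≈ 1.8446744e10, i.e. UINT64_MAX nanoseconds expressed in
# seconds, consistent with a wrapped/uninitialized time counter.
@show 2.0^64 / 1e9
```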
https://juliaperf.github.io/LIKWID.jl/stable/
Fix duplicate output of `print_supported_cpus`.
Among other things, to be able to hide all printing under `] test`.
Hi there,
I'm having some problems with running LIKWID.jl on my cluster. This is what I've done so far with the native backend:
First I tried running `@perfmon`; it segfaulted because likwid v4.0.0 was installed and the LIKWID.jl bindings were generated against a newer version, so I asked the admins to update it to 5.2.2.
Tried running `@perfmon` again; this time I see a bunch of SIGABRTs, but the Julia process stays alive and the command seemingly completes successfully (though I'm not sure I trust the results). From the stacktraces (see below) I guessed that the issue is this call to fork() in likwid: https://github.com/RRZE-HPC/likwid/blob/v5.2.2/src/access_x86_rdpmc.c#L101. I figured that the SIGABRTs are coming from the child processes, which is why the main Julia process doesn't die.
$ LD_LIBRARY_PATH=/home/wrigleyj/likwid_dist/lib:$LD_LIBRARY_PATH PATH=/home/wrigleyj/likwid_dist/bin:$PATH julia
[ Info: Precompiling AbbreviatedStackTraces [ac637c84-cc71-43bf-9c33-c1b4316be3d4]
[ Info: Skipping precompilation since __precompile__(false). Importing AbbreviatedStackTraces [ac637c84-cc71-43bf-9c33-c1b4316be3d4].
_
_ _ _(_)_ | Documentation: https://docs.julialang.org
(_) | (_) (_) |
_ _ _| |_ __ _ | Type "?" for help, "]?" for Pkg help.
| | | | | | |/ _` | |
| | |_| | | | (_| | | Version 1.9.0-beta4 (2023-02-07)
_/ |\__'_|_|_|\__'_| | Official https://julialang.org/ release
|__/ |
julia> using LIKWID
julia> metrics, events = @perfmon "FLOPS_DP" sum(1:100000)
[47765] signal (6.-6): Aborted
in expression starting at REPL[2]:1
gsignal at /lib64/libc.so.6 (unknown line)
abort at /lib64/libc.so.6 (unknown line)
uv_mutex_destroy at /workspace/srcdir/libuv/src/unix/thread.c:429
uv__threadpool_cleanup at /workspace/srcdir/libuv/src/threadpool.c:185
uv_library_shutdown at /workspace/srcdir/libuv/src/uv-common.c:941
_dl_fini at /lib64/ld-linux-x86-64.so.2 (unknown line)
__run_exit_handlers at /lib64/libc.so.6 (unknown line)
exit at /lib64/libc.so.6 (unknown line)
test_rdpmc.constprop.1 at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
access_x86_rdpmc_init at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
access_client_init at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
HPMaddThread at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
perfmon_init_maps at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
perfmon_init at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
perfmon_init at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/LibLikwid.jl:838 [inlined]
init at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:35
init at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:46 [inlined]
#perfmon#9 at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:630
unknown function (ip: 0x2b11136079ac)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
perfmon at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:626
unknown function (ip: 0x2b11135fbab6)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
do_call at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:126
eval_value at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:226
eval_stmt_value at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:177 [inlined]
eval_body at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:624
jl_interpret_toplevel_thunk at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:762
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:912
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
ijl_toplevel_eval_in at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:971
eval at ./boot.jl:370 [inlined]
eval_user_input at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:153
repl_backend_loop at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:249
#start_repl_backend#46 at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:234
kwcall at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:231
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
#run_repl#59 at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:377
run_repl at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:363
jfptr_run_repl_60032.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
#1017 at ./client.jl:421
jfptr_YY.1017_52335.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
jl_f__call_latest at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/builtins.c:774
#invokelatest#2 at ./essentials.jl:816 [inlined]
invokelatest at ./essentials.jl:813 [inlined]
run_main_repl at ./client.jl:405
exec_options at ./client.jl:322
_start at ./client.jl:522
jfptr__start_55341.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
true_main at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/jlapi.c:573
jl_repl_entrypoint at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/jlapi.c:717
main at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/cli/loader_exe.c:59
__libc_start_main at /lib64/libc.so.6 (unknown line)
unknown function (ip: 0x401098)
Allocations: 8510067 (Pool: 8501372; Big: 8695); GC: 13
[47778] signal (6.-6): Aborted
in expression starting at REPL[2]:1
gsignal at /lib64/libc.so.6 (unknown line)
abort at /lib64/libc.so.6 (unknown line)
uv_mutex_destroy at /workspace/srcdir/libuv/src/unix/thread.c:429
uv__threadpool_cleanup at /workspace/srcdir/libuv/src/threadpool.c:185
uv_library_shutdown at /workspace/srcdir/libuv/src/uv-common.c:941
_dl_fini at /lib64/ld-linux-x86-64.so.2 (unknown line)
__run_exit_handlers at /lib64/libc.so.6 (unknown line)
exit at /lib64/libc.so.6 (unknown line)
segfault_sigaction_rdpmc at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
_L_unlock_13 at /lib64/libpthread.so.0 (unknown line)
test_rdpmc.constprop.1 at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
access_x86_rdpmc_init at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
access_client_init at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
HPMaddThread at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
perfmon_init_maps at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
perfmon_init at /home/wrigleyj/likwid_dist/lib/liblikwid.so (unknown line)
perfmon_init at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/LibLikwid.jl:838 [inlined]
init at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:35
init at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:46 [inlined]
#perfmon#9 at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:630
unknown function (ip: 0x2b11136079ac)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
perfmon at /home/wrigleyj/.julia/packages/LIKWID/uvEoZ/src/perfmon.jl:626
unknown function (ip: 0x2b11135fbab6)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
do_call at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:126
eval_value at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:226
eval_stmt_value at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:177 [inlined]
eval_body at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:624
jl_interpret_toplevel_thunk at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/interpreter.c:762
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:912
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
ijl_toplevel_eval_in at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:971
eval at ./boot.jl:370 [inlined]
eval_user_input at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:153
repl_backend_loop at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:249
#start_repl_backend#46 at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:234
kwcall at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:231
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
#run_repl#59 at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:377
run_repl at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:363
jfptr_run_repl_60032.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
#1017 at ./client.jl:421
jfptr_YY.1017_52335.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
jl_f__call_latest at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/builtins.c:774
#invokelatest#2 at ./essentials.jl:816 [inlined]
invokelatest at ./essentials.jl:813 [inlined]
run_main_repl at ./client.jl:405
exec_options at ./client.jl:322
_start at ./client.jl:522
jfptr__start_55341.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
true_main at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/jlapi.c:573
jl_repl_entrypoint at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/jlapi.c:717
main at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/cli/loader_exe.c:59
__libc_start_main at /lib64/libc.so.6 (unknown line)
unknown function (ip: 0x401098)
Allocations: 8510067 (Pool: 8501372; Big: 8695); GC: 13
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:912
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
jl_toplevel_eval_flex at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:856
ijl_toplevel_eval_in at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/toplevel.c:971
eval at ./boot.jl:370 [inlined]
eval_user_input at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:153
repl_backend_loop at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:249
#start_repl_backend#46 at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:234
kwcall at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:231
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
#run_repl#59 at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:377
run_repl at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/usr/share/julia/stdlib/v1.9/REPL/src/REPL.jl:363
jfptr_run_repl_60032.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
#1017 at ./client.jl:421
jfptr_YY.1017_52335.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
jl_f__call_latest at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/builtins.c:774
#invokelatest#2 at ./essentials.jl:816 [inlined]
invokelatest at ./essentials.jl:813 [inlined]
run_main_repl at ./client.jl:405
exec_options at ./client.jl:322
_start at ./client.jl:522
jfptr__start_55341.clone_1 at /home/wrigleyj/.julia/juliaup/julia-1.9.0-beta4+0.x64.linux.gnu/lib/julia/sys.so (unknown line)
_jl_invoke at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2689 [inlined]
ijl_apply_generic at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/gf.c:2871
jl_apply at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/julia.h:1874 [inlined]
true_main at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/jlapi.c:573
jl_repl_entrypoint at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/src/jlapi.c:717
main at /cache/build/default-amdci4-0/julialang/julia-release-1-dot-9/cli/loader_exe.c:59
__libc_start_main at /lib64/libc.so.6 (unknown line)
unknown function (ip: 0x401098)
Allocations: 8510067 (Pool: 8501372; Big: 8695); GC: 13
ERROR: The selected register PMC0 is in use.
Please run likwid with force option (-f, --force) to overwrite settings
Group: FLOPS_DP
┌──────────────────────────────────┬───────────┐
│ Event │ Thread 1 │
├──────────────────────────────────┼───────────┤
│ ACTUAL_CPU_CLOCK │ 1.04286e6 │
│ MAX_CPU_CLOCK │ 828072.0 │
│ RETIRED_INSTRUCTIONS │ NaN │
│ CPU_CLOCKS_UNHALTED │ 45161.0 │
│ RETIRED_SSE_AVX_FLOPS_DOUBLE_ALL │ 0.0 │
│ MERGE │ 0.0 │
└──────────────────────────────────┴───────────┘
┌──────────────────────┬─────────────┐
│ Metric │ Thread 1 │
├──────────────────────┼─────────────┤
│ Runtime (RDTSC) [s] │ 3.66003e-6 │
│ Runtime unhalted [s] │ 0.000434528 │
│ Clock [MHz] │ 3022.49 │
│ CPI │ Inf │
│ DP [MFLOP/s] │ 0.0 │
└──────────────────────┴─────────────┘
(OrderedDict("FLOPS_DP" => [OrderedDict("Runtime (RDTSC) [s]" => 3.66002942206152e-6, "Runtime unhalted [s]" => 0.00043452807639303003, "Clock [MHz]" => 3022.4925853523528, "CPI" => Inf, "DP [MFLOP/s]" => 0.0)]), OrderedDict("FLOPS_DP" => [OrderedDict("ACTUAL_CPU_CLOCK" => 1.042859e6, "MAX_CPU_CLOCK" => 828072.0, "RETIRED_INSTRUCTIONS" => 0.0, "CPU_CLOCKS_UNHALTED" => 45161.0, "RETIRED_SSE_AVX_FLOPS_DOUBLE_ALL" => 0.0, "MERGE" => 0.0)]))
That code using rdpmc instructions was added in likwid v5.2.0, so I then tried installing v5.1.1. Unfortunately, that segfaults as well, because the LIKWID.jl bindings are for a newer version; specifically, this PR changed a struct. I checked the git history of likwid, and that struct was changed after rdpmc support was added, so there's no likwid version that works with LIKWID.jl's bindings and doesn't have rdpmc support (and the earliest LIKWID.jl version that doesn't require that particular struct is v0.1.1, so I don't really want to downgrade that...).
TL;DR: @perfmon prints a bunch of SIGABRT stack traces every time I run it, because of the way likwid checks for rdpmc support (I think), and I can't downgrade anything to get around it.
Then I tried the perf_event backend, and that kinda worked, modulo some permission errors:
julia> metrics, events = @perfmon "FLOPS_DP" sum(rand(10000))
ERROR - [./src/includes/perfmon_perfevent.h:perfmon_setupCountersThread_perfevent:894] Permission denied.
Setup of event ACTUAL_CPU_CLOCK on CPU 6 failed: Permission denied
ERROR - [./src/includes/perfmon_perfevent.h:perfmon_setupCountersThread_perfevent:894] Permission denied.
Setup of event MAX_CPU_CLOCK on CPU 6 failed: Permission denied
Group: FLOPS_DP
┌──────────────────────────────────┬──────────┐
│ Event │ Thread 1 │
├──────────────────────────────────┼──────────┤
│ ACTUAL_CPU_CLOCK │ 0.0 │
│ MAX_CPU_CLOCK │ 0.0 │
│ RETIRED_INSTRUCTIONS │ 100939.0 │
│ CPU_CLOCKS_UNHALTED │ 92403.0 │
│ RETIRED_SSE_AVX_FLOPS_DOUBLE_ALL │ 1679.0 │
│ MERGE │ 0.0 │
└──────────────────────────────────┴──────────┘
┌──────────────────────┬────────────┐
│ Metric │ Thread 1 │
├──────────────────────┼────────────┤
│ Runtime (RDTSC) [s] │ 9.84015e-5 │
│ Runtime unhalted [s] │ 0.0 │
│ Clock [MHz] │ NaN │
│ CPI │ 0.915434 │
│ DP [MFLOP/s] │ 17.0627 │
└──────────────────────┴────────────┘
(OrderedDict("FLOPS_DP" => [OrderedDict("Runtime (RDTSC) [s]" => 9.840149344766617e-5, "Runtime unhalted [s]" => 0.0, "Clock [MHz]" => NaN, "CPI" => 0.9154340740447201, "DP [MFLOP/s]" => 17.062749163384993)]), OrderedDict("FLOPS_DP" => [OrderedDict("ACTUAL_CPU_CLOCK" => 0.0, "MAX_CPU_CLOCK" => 0.0, "RETIRED_INSTRUCTIONS" => 100939.0, "CPU_CLOCKS_UNHALTED" => 92403.0, "RETIRED_SSE_AVX_FLOPS_DOUBLE_ALL" => 1679.0, "MERGE" => 0.0)]))
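For what it's worth, "Permission denied" from perfmon_setupCountersThread_perfevent usually points at a restrictive kernel.perf_event_paranoid setting rather than at LIKWID.jl itself. A sketch of how one might check and relax it (assumes Linux and sudo rights; values other than the check itself are a system-wide change):

```shell
# Check the current setting; 2 (a common default) blocks most hardware
# counters for unprivileged users, while 0 or -1 allows them.
sysctl kernel.perf_event_paranoid

# Relax it for the running system (not persistent across reboots):
sudo sysctl -w kernel.perf_event_paranoid=0
```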
And finally I tried running it on Julia 1.8 and it seemed to work:
$ LD_LIBRARY_PATH=/home/wrigleyj/likwid_dist/lib:$LD_LIBRARY_PATH PATH=/home/wrigleyj/likwid_dist/bin:$PATH julia +1.8
_
_ _ _(_)_ | Documentation: https://docs.julialang.org
(_) | (_) (_) |
_ _ _| |_ __ _ | Type "?" for help, "]?" for Pkg help.
| | | | | | |/ _` | |
| | |_| | | | (_| | | Version 1.8.5 (2023-01-08)
_/ |\__'_|_|_|\__'_| | Official https://julialang.org/ release
|__/ |
julia> using LIKWID
julia> metrics, events = @perfmon "FLOPS_DP" sum(rand(10000))
Group: FLOPS_DP
┌──────────────────────────────────┬───────────┐
│ Event │ Thread 1 │
├──────────────────────────────────┼───────────┤
│ ACTUAL_CPU_CLOCK │ 1.32525e6 │
│ MAX_CPU_CLOCK │ 1.77396e6 │
│ RETIRED_INSTRUCTIONS │ 107060.0 │
│ CPU_CLOCKS_UNHALTED │ 158330.0 │
│ RETIRED_SSE_AVX_FLOPS_DOUBLE_ALL │ 20239.0 │
│ MERGE │ 0.0 │
└──────────────────────────────────┴───────────┘
┌──────────────────────┬─────────────┐
│ Metric │ Thread 1 │
├──────────────────────┼─────────────┤
│ Runtime (RDTSC) [s] │ 8.09407e-5 │
│ Runtime unhalted [s] │ 0.000552193 │
│ Clock [MHz] │ 1792.92 │
│ CPI │ 1.47889 │
│ DP [MFLOP/s] │ 250.047 │
└──────────────────────┴─────────────┘
(OrderedDict("FLOPS_DP" => [OrderedDict("Runtime (RDTSC) [s]" => 8.094069278487966e-5, "Runtime unhalted [s]" => 0.000552192642979988, "Clock [MHz]" => 1792.9238408240735, "CPI" => 1.478890341864375, "DP [MFLOP/s]" => 250.04727910830042)]), OrderedDict("FLOPS_DP" => [OrderedDict("ACTUAL_CPU_CLOCK" => 1.325251e6, "MAX_CPU_CLOCK" => 1.77396e6, "RETIRED_INSTRUCTIONS" => 107060.0, "CPU_CLOCKS_UNHALTED" => 158330.0, "RETIRED_SSE_AVX_FLOPS_DOUBLE_ALL" => 20239.0, "MERGE" => 0.0)]))
Though sometimes on 1.8 I also see this error, only in Julia and never when running likwid-perfctr:
ERROR: The selected register PMC0 is in use.
Please run likwid with force option (-f, --force) to overwrite settings
Raised by @vchuravy in #46 (comment)
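As a side note on the "register PMC0 is in use" error: LIKWID also reads the LIKWID_FORCE environment variable as the library-level counterpart of likwid-perfctr's -f/--force flag, so one possible workaround (hedged, since it overwrites whatever previously programmed the counter) is:

```shell
# Claim already-programmed performance counters before starting Julia.
export LIKWID_FORCE=1
julia --project -e 'using LIKWID; @perfmon "FLOPS_DP" sum(rand(10000))'
```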
This issue is used to trigger TagBot; feel free to unsubscribe.
If you haven't already, you should update your TagBot.yml to include issue comment triggers. Please see this post on Discourse for instructions and more details.
Great package! As far as I understand, it provides the information needed to locate an application on the roofline model, but finding that info without screwing up feels a bit scary to me. Could an example be put in a tutorial or in the library, so that I could do something like roofline(f) and it'd plot the roofline model and locate f() on it?
(this is for teaching basic performance notions)
Check GPU indexing consistency. Also, should we start counting at 0 (like CUDA)?
Once #10 is merged, we should update https://hpc.fau.de/2021/01/29/julia-interface-for-likwid/.
Hey,
The tool video looked amazing. Exactly what I need.
I tried to install and run it with the CPU version, but I got stuck at:
ERROR: LoadError: TaskFailedException
Stacktrace:
[1] wait
@ ./task.jl:334 [inlined]
[2] threading_run(func::Function)
@ Base.Threads ./threadingconstructs.jl:38
[3] macro expansion
@ ./threadingconstructs.jl:97 [inlined]
[4] get_processor_ids()
@ LIKWID ~/.julia/packages/LIKWID/EYAWk/src/affinity.jl:105
[5] perfmon(f::Function, group_or_groups::String)
@ LIKWID.PerfMon ~/.julia/packages/LIKWID/EYAWk/src/perfmon.jl:618
[6] top-level scope
@ ~/.julia/packages/LIKWID/EYAWk/src/perfmon.jl:707
nested task error: could not load library "liblikwid"
liblikwid.so: cannot open shared object file: No such file or directory
Stacktrace:
[1] likwid_getProcessorId
@ ~/.julia/packages/LIKWID/EYAWk/src/LibLikwid.jl:577 [inlined]
[2] get_processor_id
@ ~/.julia/packages/LIKWID/EYAWk/src/affinity.jl:76 [inlined]
[3] macro expansion
@ ~/.julia/packages/LIKWID/EYAWk/src/affinity.jl:106 [inlined]
[4] (::LIKWID.var"#119#threadsfor_fun#6"{Vector{Int64}, UnitRange{Int64}})(onethread::Bool)
@ LIKWID ./threadingconstructs.jl:85
[5] (::LIKWID.var"#119#threadsfor_fun#6"{Vector{Int64}, UnitRange{Int64}})()
@ LIKWID ./threadingconstructs.jl:52
in expression starting at /home/master/repos/tests/speedtests/test_cpu_likwid.jl:16
I installed LIKWID; I think I did it exactly as shown in the video:
# Path were to install likwid
PREFIX ?= /home/master/.local#NO SPACE
# Set the default mode for MSR access.
# This can usually be overriden on the commandline.
# Valid values are: direct, accessdaemon and perf_event
ACCESSMODE = accessdaemon#NO SPACE
...
...
# Build LIKWID with NVIDIA interface (CUDA, CUPTI)
# For configuring include paths, go to CUDA section
NVIDIA_INTERFACE = false#NO SPACE
added: ] add LIKWID
(Also tried with this:
PREFIX ?= /usr/local#NO SPACE)
Both arrive at the same result: I'm missing the library. Should I load it explicitly in Julia, or what am I doing wrong?
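Since liblikwid.so was installed to a non-standard prefix, the dynamic loader has to be told where to find it. A sketch assuming the PREFIX=/home/master/.local from the config.mk above (the same pattern used elsewhere in this thread with LD_LIBRARY_PATH):

```shell
# Make the custom LIKWID install visible to the dynamic loader and shell:
export LD_LIBRARY_PATH=/home/master/.local/lib:$LD_LIBRARY_PATH
export PATH=/home/master/.local/bin:$PATH
julia -e 'using LIKWID'   # liblikwid.so should now be found
```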
On LIKWID#master (RRZE-HPC/likwid@7f5d3f3)
julia> using LIKWID
julia> @perfmon_marker "FLOPS_DP" begin
@marker "test" 3+3
end
signal (11): Segmentation fault
in expression starting at REPL[2]:1
perfmon_readMarkerFile at /upb/departments/pc2/users/b/bauerc/.local/lib/liblikwid.so (unknown line)
perfmon_readMarkerFile at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/LibLikwid.jl:1071 [inlined]
read at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/markerfile.jl:11 [inlined]
_print_markerfile at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/prettyprinting.jl:41
#perfmon_marker#5 at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/marker.jl:354
unknown function (ip: 0x1554f023992e)
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
perfmon_marker at /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/marker.jl:339
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
do_call at /buildworker/worker/package_linux64/build/src/interpreter.c:126
eval_value at /buildworker/worker/package_linux64/build/src/interpreter.c:215
eval_stmt_value at /buildworker/worker/package_linux64/build/src/interpreter.c:166 [inlined]
eval_body at /buildworker/worker/package_linux64/build/src/interpreter.c:587
jl_interpret_toplevel_thunk at /buildworker/worker/package_linux64/build/src/interpreter.c:731
jl_toplevel_eval_flex at /buildworker/worker/package_linux64/build/src/toplevel.c:885
jl_toplevel_eval_flex at /buildworker/worker/package_linux64/build/src/toplevel.c:830
jl_toplevel_eval_in at /buildworker/worker/package_linux64/build/src/toplevel.c:944
eval at ./boot.jl:373 [inlined]
eval_user_input at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:150
repl_backend_loop at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:246
start_repl_backend at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:231
#run_repl#47 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:364
run_repl at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.7/REPL/src/REPL.jl:351
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
#930 at ./client.jl:394
jfptr_YY.930_45169.clone_1 at /cm/shared/apps/pc2/EB-SW/software/JuliaHPC/1.7.2-fosscuda-2020b-linux-x86_64/lib/julia/sys.so (unknown line)
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
jl_f__call_latest at /buildworker/worker/package_linux64/build/src/builtins.c:757
#invokelatest#2 at ./essentials.jl:716 [inlined]
invokelatest at ./essentials.jl:714 [inlined]
run_main_repl at ./client.jl:379
exec_options at ./client.jl:309
_start at ./client.jl:495
jfptr__start_38732.clone_1 at /cm/shared/apps/pc2/EB-SW/software/JuliaHPC/1.7.2-fosscuda-2020b-linux-x86_64/lib/julia/sys.so (unknown line)
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
true_main at /buildworker/worker/package_linux64/build/src/jlapi.c:559
jl_repl_entrypoint at /buildworker/worker/package_linux64/build/src/jlapi.c:701
main at julia (unknown line)
__libc_start_main at /lib64/libc.so.6 (unknown line)
unknown function (ip: 0x400808)
Allocations: 4599565 (Pool: 4597687; Big: 1878); GC: 6
Segmentation fault (core dumped)
Same works with LIKWID v5.2.1:
julia> using LIKWID
julia> @perfmon_marker "FLOPS_DP" begin
@marker "test" 3+3
end
Region: test, Group: FLOPS_DP
┌──────────────────────────────────────────┬──────────┐
│ Event │ Thread 1 │
├──────────────────────────────────────────┼──────────┤
│ INSTR_RETIRED_ANY │ 4058.0 │
│ CPU_CLK_UNHALTED_CORE │ 9415.0 │
│ CPU_CLK_UNHALTED_REF │ 22560.0 │
│ FP_ARITH_INST_RETIRED_128B_PACKED_DOUBLE │ 0.0 │
│ FP_ARITH_INST_RETIRED_SCALAR_DOUBLE │ 11.0 │
│ FP_ARITH_INST_RETIRED_256B_PACKED_DOUBLE │ 0.0 │
│ FP_ARITH_INST_RETIRED_512B_PACKED_DOUBLE │ 0.0 │
└──────────────────────────────────────────┴──────────┘
┌──────────────────────┬────────────┐
│ Metric │ Thread 1 │
├──────────────────────┼────────────┤
│ Runtime (RDTSC) [s] │ 1.21129e-7 │
│ Runtime unhalted [s] │ 3.93251e-6 │
│ Clock [MHz] │ 999.153 │
│ CPI │ 2.32011 │
│ DP [MFLOP/s] │ 90.8125 │
│ AVX DP [MFLOP/s] │ 0.0 │
│ AVX512 DP [MFLOP/s] │ 0.0 │
│ Packed [MUOPS/s] │ 0.0 │
│ Scalar [MUOPS/s] │ 90.8125 │
│ Vectorization ratio │ 0.0 │
└──────────────────────┴────────────┘
(I've compiled with perf_event in both cases and have otherwise left the config.mk as is. Don't think that this info matters though.)
cc @TomTheBear
Issue (MWE):
julia> using LIKWID
julia> using CUDA
julia> LIKWID.init_topology_gpu()
true
vs
julia> using LIKWID
julia> using CUDA
julia> x = CUDA.rand(Float32, 100);
julia> LIKWID.init_topology_gpu()
false
Consequences:
@nvmon / nvmon fails with
julia> metrics, events = @nvmon "FLOPS_SP" saxpy!(z, a, x, y);
ERROR: Couldn't init gpu topology.
Stacktrace:
[1] error(s::String)
@ Base ./error.jl:33
[2] init(gpus::Vector{Int32})
@ LIKWID.NvMon /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/nvmon.jl:16
[3] init
@ /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/nvmon.jl:28 [inlined]
[4] nvmon(f::var"#3#4", group_or_groups::String; gpuids::Int64, print::Bool)
@ LIKWID.NvMon /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/nvmon.jl:427
[5] nvmon(f::Function, group_or_groups::String)
@ LIKWID.NvMon /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/nvmon.jl:426
[6] top-level scope
@ /scratch/pc2-mitarbeiter/bauerc/devel/LIKWID.jl/src/nvmon.jl:477
[7] top-level scope
@ /scratch/pc2-mitarbeiter/bauerc/.julia/packages/CUDA/tTK8Y/src/initialization.jl:52
(It works if we do LIKWID.init_topology_gpu() right after using LIKWID.)
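A minimal sketch of that workaround, assuming the observation above holds (the GPU topology must be initialized before CUDA touches the device); this obviously needs a CUDA-capable machine with a GPU-enabled liblikwid to run:

```julia
using LIKWID
# Initialize the GPU topology before any CUDA call touches the device.
# Doing this after, e.g., CUDA.rand makes init_topology_gpu return false.
LIKWID.init_topology_gpu()

using CUDA
x = CUDA.rand(Float32, 100)  # safe now: topology is already initialized
```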
➜ bauerc@dgx-01 LIKWID.jl git:(cb/perfmonrev) likwid-perfctr -G 0 -W FLOPS_SP -m julia --project=. perfctr_gpu.jl
--------------------------------------------------------------------------------
CPU name: AMD EPYC 7742 64-Core Processor
CPU type: AMD K17 (Zen2) architecture
CPU clock: 2.25 GHz
--------------------------------------------------------------------------------
Error init GPU Marker API.
--------------------------------------------------------------------------------
GPU Marker API result file does not exist. This may happen if the application has not called LIKWID_GPUMARKER_CLOSE.
where the input file is
# perfctr_gpu.jl
using LIKWID
using LinearAlgebra
using CUDA
# LIKWID.init_topology_gpu() # example works if one uncomments this line
@assert CUDA.functional()
const N = 10_000
const a = 3.141f0 # Float32
# Note: CUDA defaults to Float32
const x = CUDA.rand(N)
const y = CUDA.rand(N)
const z = CUDA.zeros(N)
saxpy!(z,a,x,y) = z .= a .* x .+ y
saxpy!(z,a,x,y) # warmup
GPUMarker.init()
GPUMarker.startregion("saxpy")
saxpy!(z,a,x,y)
GPUMarker.stopregion("saxpy")
GPUMarker.close()
We have a few @test_broken. Figure out why they are broken and whether that's a bug on our side or LIKWID's.
For now I don't use the project, but maybe it would be easier to use if it were on Yggdrasil. I can contribute to this if you think it would be fine to add it.
Once JuliaLang/julia#44136 is merged (will likely be in Julia 1.8), @threads will default to @threads :dynamic instead of the previous @threads :static! We must therefore replace @threads with the more explicit @threads :static where necessary!
cc: @giordano
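A self-contained sketch (plain Base.Threads, no LIKWID) of why :static matters here: it guarantees iteration i runs on thread i, which any per-thread pinning or counter state relies on, whereas under :dynamic tasks may run on (and migrate between) arbitrary threads.

```julia
using Base.Threads

# With the :static scheduler, iteration i is guaranteed to execute on
# thread i, so thread-local state set up per iteration stays valid.
ids = zeros(Int, nthreads())
@threads :static for i in 1:nthreads()
    ids[i] = threadid()
end
@assert ids == collect(1:nthreads())
```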
On Ookami running likwid-perfctr works:
[vchuravy@fj-debug1 ~]$ likwid-perfctr -m -C 0 -g FLOPS_DP julia perfctr.jl
--------------------------------------------------------------------------------
CPU name:
CPU type: Fujitsu A64FX
CPU clock: 0.00 GHz
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
Region saxpy, Group 1: FLOPS_DP
+-------------------+------------+
| Region Info | HWThread 0 |
+-------------------+------------+
| RDTSC Runtime [s] | 0.100502 |
| call count | 1 |
+-------------------+------------+
+----------------------+---------+------------+
| Event | Counter | HWThread 0 |
+----------------------+---------+------------+
| INST_RETIRED | PMC0 | 95528990 |
| CPU_CYCLES | PMC1 | 179848400 |
| FP_DP_FIXED_OPS_SPEC | PMC3 | 20114 |
| FP_DP_SCALE_OPS_SPEC | PMC4 | 0 |
+----------------------+---------+------------+
+--------------------------+------------+
| Metric | HWThread 0 |
+--------------------------+------------+
| Runtime (RDTSC) [s] | 0.1005 |
| Clock [MHz] | 1789.5007 |
| CPI | 1.8827 |
| DP (FP) [MFLOP/s] | 0.2001 |
| DP (FP+SVE128) [MFLOP/s] | 0.2001 |
| DP (FP+SVE256) [MFLOP/s] | 0.2001 |
| DP (FP+SVE512) [MFLOP/s] | 0.2001 |
+--------------------------+------------+
But using the Julia interface doesn't:
julia> using LIKWID
julia> LIKWID.LibLikwid.likwid_pinThread(0)
1
julia> PerfMon.init(0)
true
julia> groupid = PerfMon.add_event_set("FLOPS_DP")
1
julia> PerfMon.setup_counters(groupid)
Setup of counters failed for thread 0
false
I just saw your talk from JuliaCon 2023 that also mentioned LIKWID.jl and I was eager to try it out. 🚀
Unfortunately, trying the very first tutorial, I got a segfault. After some digging, I found the culprit: the definitions of CpuTopology between LIKWID.jl and the underlying liblikwid differ. My likwid.h version 5.1 (installed via apt) defines the CpuTopology struct without the numDies field, while LIKWID.jl version 0.4.4 defines its CpuTopology struct with numDies in position 4. Consequently, the fields are shifted when transferring the data from C to Julia: Julia's numDies contains C's numCoresPerSocket, ..., and numCacheLevels contains the value of threadPool, which is a pointer. Wrapping cacheLevels (C's topologyTree) into an array of length numCacheLevels (C's threadPool) leads to the malformed memory accesses. 💥
likwid.h: https://github.com/RRZE-HPC/likwid/blob/v5.1/src/includes/likwid.h#L370-L380
typedef struct {
uint32_t numHWThreads; /*!< \brief Amount of active HW threads in the system (e.g. in cpuset) */
uint32_t activeHWThreads; /*!< \brief Amount of HW threads in the system and length of \a threadPool */
uint32_t numSockets; /*!< \brief Amount of CPU sockets/packages in the system */
uint32_t numCoresPerSocket; /*!< \brief Amount of physical cores in one CPU socket/package */
uint32_t numThreadsPerCore; /*!< \brief Amount of HW threads in one physical CPU core */
uint32_t numCacheLevels; /*!< \brief Amount of caches for each HW thread and length of \a cacheLevels */
HWThread* threadPool; /*!< \brief List of all HW thread descriptions */
CacheLevel* cacheLevels; /*!< \brief List of all caches in the hierarchy */
struct treeNode* topologyTree; /*!< \brief Anchor for a tree structure describing the system topology */
} CpuTopology;
Liblikwid.jl: https://github.com/JuliaPerf/LIKWID.jl/blob/v0.4.4/src/LibLikwid.jl#L640-L651
struct CpuTopology
numHWThreads::UInt32
activeHWThreads::UInt32
numSockets::UInt32
numDies::UInt32
numCoresPerSocket::UInt32
numThreadsPerCore::UInt32
numCacheLevels::UInt32
threadPool::Ptr{HWThread}
cacheLevels::Ptr{CacheLevel}
topologyTree::Ptr{treeNode}
end
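The field shift can be demonstrated without liblikwid by comparing field offsets of the two layouts; TopoV51 and TopoV52 below are stand-in names I made up for the v5.1 header layout and the LIKWID.jl v0.4.4 binding, respectively:

```julia
# Layout matching likwid v5.1's likwid.h (no numDies field).
struct TopoV51
    numHWThreads::UInt32
    activeHWThreads::UInt32
    numSockets::UInt32
    numCoresPerSocket::UInt32
    numThreadsPerCore::UInt32
    numCacheLevels::UInt32
    threadPool::Ptr{Cvoid}
    cacheLevels::Ptr{Cvoid}
    topologyTree::Ptr{Cvoid}
end

# Layout matching the LIKWID.jl v0.4.4 binding (numDies in position 4).
struct TopoV52
    numHWThreads::UInt32
    activeHWThreads::UInt32
    numSockets::UInt32
    numDies::UInt32
    numCoresPerSocket::UInt32
    numThreadsPerCore::UInt32
    numCacheLevels::UInt32
    threadPool::Ptr{Cvoid}
    cacheLevels::Ptr{Cvoid}
    topologyTree::Ptr{Cvoid}
end

# numCoresPerSocket sits 4 bytes later in the v5.2-style layout, so every
# subsequent field (including the pointers) is read from the wrong offset.
fieldoffset(TopoV51, 4)  # 12
fieldoffset(TopoV52, 5)  # 16
```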
The numDies field was added in RRZE-HPC/likwid@a0ac14d, which according to GitHub shipped in likwid version 5.2 and onwards. Moving the numDies field to the end of the C struct is not an option, I guess, but removing numDies from LIKWID.jl might be, as it's not used anyway. Therefore, I would recommend:
Can we enable this somehow or is it just not supported?
(Currently, I use MEM in runtests.jl.)
https://github.com/JuliaPerf/LIKWID.jl/blob/cb/perfmonrev/src/misc.jl#L31 should also set nvmon verbosity level
Test it etc.
cc: @carstenbauer