jcsda / spack-stack
License: Creative Commons Zero v1.0 Universal
The current GitHub Action runs using GCC on Ubuntu 20.04. This should be expanded to Intel compilers, macOS, etc.
Wgrib2 exists in Spack with the CMake fork, but the official release of wgrib2 should be used.
There's an issue with Intel 2021/2022 when compiling openblas with AVX-512 support.
My solution in the GitHub workflow was to change the OpenBLAS target in packages.yaml, but this isn't ideal.
I encountered the same issue on Orion, and added a +noavx512 variant (could also be a check if using Intel) that adds NO_AVX512=1.
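A sketch of what the corresponding site override could look like in packages.yaml — note that the +noavx512 variant is the addition proposed above, not an upstream spack option:

```yaml
# Sketch only: disable AVX-512 for OpenBLAS at the site level.
# +noavx512 is the proposed variant (maps to NO_AVX512=1 in the
# OpenBLAS build); it does not exist in upstream spack yet.
packages:
  openblas:
    variants: +noavx512
```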
At JCSDA we only use the boost headers. As of today, there is no option (variant) in spack for this, so I made the boost installation a headers-only one by turning off virtually everything:
boost:
version: [1.72.0]
# Make this a boost-headers package by compiling basically nothing
variants: ~atomic ~chrono ~date_time ~exception ~filesystem ~graph ~iostreams ~locale ~log ~math ~mpi ~numpy +pic ~program_options ~python ~random ~regex ~serialization ~signals ~test ~timer ~wave cxxstd=14 visibility=hidden
We need to check if there is a better way to do this, and/or use the module projections to rename the boost module to boost-headers to avoid any misconception.
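The rename could be done with a projection in modules.yaml — a sketch, assuming the lmod module set (the exact module-set name and hierarchy depend on the site config):

```yaml
# Sketch: rename the headers-only boost module via a module projection
modules:
  default:
    lmod:
      projections:
        all: '{name}/{version}'
        # present the headers-only installation under a clearer name
        boost: 'boost-headers/{version}'
```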
Beta testers using the "default" site config found that the concretize step can corrupt the spack.yaml file. Specifically, it replaces keys followed by double colons (e.g. compiler::) with invalidly quoted keys ('compiler:':). If the corresponding section that sets the default compiler is moved into site.yaml, as is the case for the pre-configured sites, this does not happen.
I would like to use this issue to revisit how the site configs are assembled. It is a bit confusing to me that the common config files are all copied as-is, but the site configs get merged into one file. How about the site config files remain separate but are prefixed with site_? For example, one would then have:
config.yaml
install
modules.yaml
packages.yaml
site_compilers.yaml
site_config.yaml
site_modules.yaml
site_packages.yaml
spack.yaml
We would also need instructions for how users of the "default" site config should configure it (move the spack external find entries from spack.yaml to the various site_*.yaml files?) before running the concretizer.
Looking for input and ideas ...
We are ready to move this repository into the ESCOMP GitHub organization, as discussed previously. Let's do this when there are no open pull requests.
Boost v1.76 and below has a conflict with GCC on macOS. Using Boost 1.77.0+ resolves this issue. I know there's an issue with newer versions of Boost on other systems (e.g. Orion), so I think this should be added to sites/macos/site.yaml.
Many of the JEDI packages are written in C++ and require support for newer C++ standards (currently C++14, soon hopefully C++17). To compile these packages with the current jedi-stack on systems where the default libstdc++ is too old, we set up toolchains as follows (e.g. cheyenne):
# C++-14 compliant compiler settings
# set / export these variables when building for Intel compiler(s)
if [[ "$JEDI_COMPILER" =~ .*"intel"* ]]; then
export CXXFLAGS="-gxx-name=/glade/u/apps/ch/opt/gnu/9.1.0/bin/g++ -Wl,-rpath,/glade/u/apps/ch/opt/gnu/9.1.0/lib64"
export LDFLAGS="-gxx-name=/glade/u/apps/ch/opt/gnu/9.1.0/bin/g++ -Wl,-rpath,/glade/u/apps/ch/opt/gnu/9.1.0/lib64"
fi
This works within the limited scope of jedi-stack, but within spack, as we are adding more software, doing the same in the compiler config breaks a lot of packages. I think I found a way to configure this in spack so that it works for all packages (currently building to verify):
- compiler:
spec: intel@2021.2
paths:
cc: /glade/u/apps/opt/intel/2021.2/compiler/latest/linux/bin/intel64/icc
cxx: /glade/u/apps/opt/intel/2021.2/compiler/latest/linux/bin/intel64/icpc
f77: /glade/u/apps/opt/intel/2021.2/compiler/latest/linux/bin/intel64/ifort
fc: /glade/u/apps/opt/intel/2021.2/compiler/latest/linux/bin/intel64/ifort
flags: {}
operating_system: sles12
target: x86_64
modules:
- intel/2021.2
environment:
prepend_path:
LD_LIBRARY_PATH: '/glade/u/apps/opt/intel/2021.2/compiler/2021.2.0/linux/compiler/lib/intel64_lin:/glade/u/apps/ch/opt/gnu/10.1.0/lib64'
CXXFLAGS: '-std=c++14 -gxx-name=/glade/u/apps/ch/opt/gnu/10.1.0/bin/g++ -Wl,-rpath,/glade/u/apps/ch/opt/gnu/10.1.0/lib64'
extra_rpaths: []
Note 1. I don't think -std=c++14 is required, but it's currently building with that flag. I will try again later without it.
Note 2. For the reason to have the additional LD_LIBRARY_PATH entry /glade/u/apps/opt/intel/2021.2/compiler/2021.2.0/linux/compiler/lib/intel64_lin, see #28.
Note 3. Right now I also have
set:
I_MPI_ROOT: '/glade/u/apps/opt/intel/2021.2/mpi/latest'
in there to avoid these nasty I_MPI_SUBSTITUTE_INSTALL_DIR errors in various places. This can hopefully be removed, because we don't want any MPI dependency in the compiler config!
PR #38 introduces tcl modules in addition to lmod/lua modules, with a substantial amount of duplication of the configuration for the modules. It may be possible to remove that duplication.
I get this message when I added GCC to extra_rpaths in compilers.yaml.
I don't think there's anything special that needs to be supported when using extra_rpaths (or just append the path to LD_LIBRARY_PATH for the same effect). The extra rpath is embedded into whatever executable/shared library is built with Spack. New executables created from the stack won't have the rpath embedded, but I don't think that matters: if you link to a library that contains the extra_rpath, the rpath is already embedded there, and everything still works.
I could use LD_LIBRARY_PATH instead. I suppose it doesn't really make a difference as long as you have the meta module loaded.
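For reference, a minimal sketch of where extra_rpaths lives in compilers.yaml — the GCC library path is the cheyenne one quoted elsewhere in this discussion; the abbreviated compiler paths are placeholders:

```yaml
# Sketch: embed an extra GCC runtime library path in everything
# built with this compiler entry (paths abbreviated for illustration)
compilers:
- compiler:
    spec: intel@2021.2
    paths:
      cc: icc
      cxx: icpc
      f77: ifort
      fc: ifort
    operating_system: sles12
    target: x86_64
    extra_rpaths:
    - /glade/u/apps/ch/opt/gnu/10.1.0/lib64
```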
hpc-stack received an update for CRTM v2.4.0; the same should be done for spack-stack.
Earlier on we removed a guard in the eckit package.py that required cmake 3.19 for building because of problems with 3.20.x. It turns out that there is indeed a problem with 3.20.x that manifests itself in dubious build errors (in the compile phase, not the cmake phase). But it works with 3.22.x.
So we should put that guard back in, but exclude only 3.20.x and possibly 3.21.x (to be tested). We should then also avoid using any existing cmake 3.20.x as an external package and choose a newer one, if possible. Otherwise the eckit build will trigger the build of a newer cmake.
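A sketch of what the restored guard could look like in eckit's package.py — the exact version bound, and whether 3.21.x must also be excluded, are assumptions pending the testing mentioned above:

```python
# Sketch for eckit's package.py: exclude only the broken cmake series
# instead of requiring cmake <= 3.19. Whether 3.21.x must also be
# excluded is still to be tested.
conflicts("^cmake@3.20",
          msg="eckit shows dubious compile-phase errors with cmake 3.20.x")
```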
For hdf5@1.12.0 +fortran +shared, identical sets of Fortran modules are installed in the shared and static subdirectories of the include directory. These are not found by cmake, and they are also not part of the default CPATH in the lua module files.
Upgrading to hdf5@1.12.1 fixes the problem.
Since there are other good reasons for using version 1.12.1 (better performance as noted recently by @edwardhartnett), and since the default version in the existing jedi-stack is 1.12.0, this upgrade is not a problem from the JEDI side.
The ufs-weather-model currently uses 1.10.6 (https://github.com/ufs-community/ufs-weather-model/blob/develop/modulefiles/ufs_common). @kgerheiser @edwardhartnett do you think we can update to 1.12.1 for the UFS?
It would be useful to have a Spack mirror automatically created from a GitHub Action, so that firewalled systems (e.g. Jet, Hera) can access these packages rather than relying on others to create their own mirrors.
Still have to work out the details, but I think this should be possible by generating the mirror and then uploading it as an artifact. Maybe have it as a manual action, or run it whenever there's a merge to develop?
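A rough sketch of such a workflow (job names, action versions, and the exact mirror command scope are assumptions to be worked out):

```yaml
# Sketch of a manually triggered mirror workflow (names are illustrative)
on: workflow_dispatch
jobs:
  mirror:
    runs-on: ubuntu-20.04
    steps:
      - uses: actions/checkout@v2
      # mirror all packages of the active environment into a directory
      - name: Create spack mirror
        run: spack mirror create -d spack-mirror --all
      # publish the mirror so firewalled systems can fetch it
      - uses: actions/upload-artifact@v2
        with:
          name: spack-mirror
          path: spack-mirror
```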
I think the Spack staging/cache directories should be set to reside within spack-stack by default, under ${SPACK_STACK_DIR}/cache. Mirrors and package sources are currently difficult to find because they are placed in some opaque temp directory.
# The build stage can be purged with `spack clean --stage` and
# `spack clean -a`, so it is important that the specified directory uniquely
# identifies Spack staging to avoid accidentally wiping out non-Spack work.
build_stage: ${SPACK_STACK_DIR}/cache/build_stage
# Directory in which to run tests and store test results.
# Tests will be stored in directories named by date/time and package
# name/hash.
test_stage: ${SPACK_STACK_DIR}/cache/test_stage
# Cache directory for already downloaded source tarballs and archived
# repositories. This can be purged with `spack clean --downloads`.
source_cache: ${SPACK_STACK_DIR}/cache/source_cache
# Cache directory for miscellaneous files, like the package index.
# This can be purged with `spack clean --misc-cache`
misc_cache: ${SPACK_STACK_DIR}/cache/misc_cache
The ufs-weather-model is switching to FMS release 2021.04 (2021.04.01 in spack versioning). We need to update the version number in the common packages.yaml file.
Trying to build the (default) py-scipy 1.7.3 on macOS with LLVM clang 13.0.0 gives the following error.
I haven't had time to dig into this, but a short-term workaround is to use py-scipy 1.5.3, which builds fine. A quick look tells me that it's got to do with the AVX512 SIMD instructions, for which I remember having seen a way to tell the Python build to target a different CPU architecture.
==> Error: ProcessError: Command exited with status 1:
'/usr/local/bin/python3.9' '-m' 'pip' '-vvv' '--no-input' '--no-cache-dir' '--disable-pip-version-check' 'install' '--no-deps' '--ignore-installed' '--no-build-isolation' '--no-warn-script-location' '--no-index' '--prefix=/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-scipy-1.7.3-czstuad' '.'
19 errors found in build log:
11 Created temporary directory: /private/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/pip-install-gyaz_4gj
12 Processing /private/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/heinzell/spack-stage/spack-stage-py-scipy-1.7.3-czstuadd6kthr7jrt55mlidz6i3w4qkq/spack-src
13 Added file:///private/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/heinzell/spack-stage/spack-stage-py-scipy-1.7.3-czstuadd6kthr7jrt55mlidz6i3w4qkq/spack-src to build tracker '/private/var/folders/gb/w8lys0xn3
c35rbw_5vfq4mkxxgj2gv/T/pip-req-tracker-z293o4lr'
14 Created temporary directory: /private/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/pip-modern-metadata-26q3xe5h
15 Preparing metadata (pyproject.toml): started
16 Running command /usr/local/opt/[email protected]/bin/python3.9 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pip-21.3.
1-bf2phg3/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py prepare_metadata_for_build_wheel /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpmm20e33j
>> 17 setup.py:491: UserWarning: Unrecognized setuptools command ('dist_info --egg-base /private/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/pip-modern-metadata-26q3xe5h'), proceeding with generating Cython sources a
nd expanding templates
18 warnings.warn("Unrecognized setuptools command ('{}'), proceeding with "
19 Running from SciPy source directory.
20 Running scipy/linalg/_generate_pyx.py
21 Running scipy/special/_generate_pyx.py
22 Running scipy/stats/_generate_pyx.py
23 Processing scipy/cluster/_vq.pyx
...
53 Processing scipy/fftpack/convolve.pyx
54 Processing scipy/interpolate/interpnd.pyx
55 Processing scipy/interpolate/_bspl.pyx
56 Processing scipy/interpolate/_ppoly.pyx
57 Processing scipy/sparse/_csparsetools.pyx.in
58 Processing scipy/sparse/csgraph/_shortest_path.pyx
>> 59 warning: _cython_special_custom.pxi:9:8: Unreachable code
>> 60 warning: _cython_special_custom.pxi:13:4: Unreachable code
>> 61 warning: _cython_special_custom.pxi:21:8: Unreachable code
>> 62 warning: _cython_special_custom.pxi:25:4: Unreachable code
>> 63 warning: _cython_special_custom.pxi:33:8: Unreachable code
>> 64 warning: _cython_special_custom.pxi:37:4: Unreachable code
>> 65 warning: _cython_special_custom.pxi:45:8: Unreachable code
>> 66 warning: _cython_special_custom.pxi:49:4: Unreachable code
67 Processing scipy/sparse/csgraph/_traversal.pyx
68 Processing scipy/sparse/csgraph/_flow.pyx
69 Processing scipy/sparse/csgraph/_tools.pyx
70 Processing scipy/sparse/csgraph/_matching.pyx
71 Processing scipy/sparse/csgraph/_reordering.pyx
72 Processing scipy/sparse/csgraph/_min_spanning_tree.pyx
...
91 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5/var
92 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5/var/folders
93 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5/var/folders/gb
94 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv
95 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T
96 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp50d1qfe5
>> 97 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/sy
stem_info.py:936: UserWarning: Specified path /usr/local/include/python3.9 is invalid.
98 return self.get_paths(self.section, key)
99 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpidvakcei/var
100 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpidvakcei/var/folders
101 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpidvakcei/var/folders/gb
102 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpidvakcei/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv
103 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpidvakcei/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T
...
3682 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej/var
3683 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej/var/folders
3684 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej/var/folders/gb
3685 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv
3686 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T
3687 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpa0pj88ej
>> 3688 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/sy
stem_info.py:936: UserWarning: Specified path /usr/local/include/python3.9 is invalid.
3689 return self.get_paths(self.section, key)
3690 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpih4n9q0_/var
3691 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpih4n9q0_/var/folders
3692 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpih4n9q0_/var/folders/gb
3693 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpih4n9q0_/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv
3694 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmpih4n9q0_/var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T
...
4149 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0
/py-numpy-1.20.3-oc7m4wc/lib/python3.9
4150 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0
/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages
4151 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0
/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy
4152 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0
/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils
4153 creating /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0
/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/checks
4154 CCompilerOpt.dist_test[576] : CCompilerOpt._dist_test_spawn[711] : Command (/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/spack/lib/spack/env/clang/clang -Wno-unused-result -W
sign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX12.sdk -c /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stac
k-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/checks/test_flags.c -o /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj
2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/n
umpy/distutils/checks/test_flags.o -MMD -MF /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-cl
ang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/checks/test_flags.o.d -mavx5124fmaps -mavx5124vnniw -mavx512vpopcntdq) failed with exit status 1 output ->
>> 4155 clang-13: error: unknown argument: '-mavx5124fmaps'
>> 4156 clang-13: error: unknown argument: '-mavx5124vnniw'
4157
4158 CCompilerOpt.cc_test_flags[1003] : testing failed
4159 CCompilerOpt.dist_test[576] : CCompilerOpt._dist_test_spawn[711] : Command (/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/spack/lib/spack/env/clang/clang -Wno-unused-result -W
sign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX12.sdk -c /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stac
k-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/checks/cpu_avx512_knm.c -o /var/folders/gb/w8lys0xn3c35rbw_5vfq4mk
xxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packag
es/numpy/distutils/checks/cpu_avx512_knm.o -MMD -MF /var/folders/gb/w8lys0xn3c35rbw_5vfq4mkxxgj2gv/T/tmp6fcnrw2_/Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos
.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/checks/cpu_avx512_knm.o.d -msse -msse2 -msse3 -mssse3 -msse4.1 -mpopcnt -msse4.2 -mavx -mf16c -mfma -mav
x2 -mavx512f -mavx512cd -mavx512er -mavx512pf -Werror) failed with exit status 1 output ->
>> 4160 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/
checks/cpu_avx512_knm.c:9:9: error: implicit declaration of function '_mm512_4fmadd_ps' is invalid in C99 [-Werror,-Wimplicit-function-declaration]
4161 b = _mm512_4fmadd_ps(b, b, b, b, b, NULL);
4162 ^
4163 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/
checks/cpu_avx512_knm.c:9:9: note: did you mean '_mm512_fmadd_ps'?
4164 /usr/local/Cellar/llvm/13.0.0_2/lib/clang/13.0.0/include/avx512fintrin.h:2716:1: note: '_mm512_fmadd_ps' declared here
4165 _mm512_fmadd_ps(__m512 __A, __m512 __B, __m512 __C)
4166 ^
>> 4167 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/ch
ecks/cpu_avx512_knm.c:9:7: error: assigning to '__m512' (vector of 16 'float' values) from incompatible type 'int'
4168 b = _mm512_4fmadd_ps(b, b, b, b, b, NULL);
4169 ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>> 4170 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/
checks/cpu_avx512_knm.c:11:9: error: implicit declaration of function '_mm512_4dpwssd_epi32' is invalid in C99 [-Werror,-Wimplicit-function-declaration]
4171 a = _mm512_4dpwssd_epi32(a, a, a, a, a, NULL);
4172 ^
4173 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/
checks/cpu_avx512_knm.c:11:9: note: did you mean '_mm512_dpwssd_epi32'?
4174 /usr/local/Cellar/llvm/13.0.0_2/lib/clang/13.0.0/include/avx512vnniintrin.h:68:1: note: '_mm512_dpwssd_epi32' declared here
4175 _mm512_dpwssd_epi32(__m512i __S, __m512i __A, __m512i __B)
4176 ^
>> 4177 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-numpy-1.20.3-oc7m4wc/lib/python3.9/site-packages/numpy/distutils/ch
ecks/cpu_avx512_knm.c:11:7: error: assigning to '__m512i' (vector of 8 'long long' values) from incompatible type 'int'
4178 a = _mm512_4dpwssd_epi32(a, a, a, a, a, NULL);
4179 ^ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
4180 4 errors generated.
4181
4182 CCompilerOpt.feature_test[1458] : testing failed
4183 CCompilerOpt.generate_dispatch_header[2245] : dispatch header dir build/src.macosx-12-x86_64-3.9/numpy/distutils/include does not exist, creating it
...
11571 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/NoneType.hpp:8:
11572 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/builtins/bool_.hpp:7:
11573 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/tuple.hpp:13:
11574 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/ndarray.hpp:10:
11575 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/builtins/ValueError.hpp:6:
11576 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/exceptions.hpp:6:
>> 11577 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-packages/pythran/pythonic
/types/str.hpp:407:12: error: no matching constructor for initialization of 'typename str::reverse_iterator' (aka 'reverse_iterator<(anonymous namespace)::pythonic::types::string_iterator>')
11578 return {data->rbegin()};
11579 ^~~~~~~~~~~~~~~~
11580 /usr/local/opt/llvm/bin/../include/c++/v1/__iterator/reverse_iterator.h:37:28: note: candidate constructor (the implicit copy constructor) not viable: no known conversion from 'reverse_iterator<std::basic_string<
char>::iterator>' to 'const reverse_iterator<(anonymous namespace)::pythonic::types::string_iterator>' for 1st argument
11581 class _LIBCPP_TEMPLATE_VIS reverse_iterator
11582 ^
11583 /usr/local/opt/llvm/bin/../include/c++/v1/__iterator/reverse_iterator.h:37:28: note: candidate constructor (the implicit move constructor) not viable: no known conversion from 'reverse_iterator<std::basic_string<
char>::iterator>' to 'reverse_iterator<(anonymous namespace)::pythonic::types::string_iterator>' for 1st argument
...
11596 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/NoneType.hpp:8:
11597 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/builtins/bool_.hpp:7:
11598 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/tuple.hpp:13:
11599 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/ndarray.hpp:10:
11600 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/builtins/ValueError.hpp:6:
11601 In file included from /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-p
ackages/pythran/pythonic/types/exceptions.hpp:6:
>> 11602 /Users/heinzell/work/jedi-stack/spack-stack-new-joint/spack-stack-develop-20220215/envs/jedi-ufs.macos.llvm-clang.gfortran/install/clang/13.0.0/py-pythran-0.9.12-ybcespe/lib/python3.9/site-packages/pythran/pythonic
/types/str.hpp:417:12: error: no matching constructor for initialization of 'typename str::reverse_iterator' (aka 'reverse_iterator<(anonymous namespace)::pythonic::types::string_iterator>')
11603 return {data->rend()};
11604 ^~~~~~~~~~~~~~
11605 /usr/local/opt/llvm/bin/../include/c++/v1/__iterator/reverse_iterator.h:37:28: note: candidate constructor (the implicit copy constructor) not viable: no known conversion from 'reverse_iterator<std::basic_string<
char>::iterator>' to 'const reverse_iterator<(anonymous namespace)::pythonic::types::string_iterator>' for 1st argument
11606 class _LIBCPP_TEMPLATE_VIS reverse_iterator
11607 ^
11608 /usr/local/opt/llvm/bin/../include/c++/v1/__iterator/reverse_iterator.h:37:28: note: candidate constructor (the implicit move constructor) not viable: no known conversion from 'reverse_iterator<std::basic_string<
char>::iterator>' to 'reverse_iterator<(anonymous namespace)::pythonic::types::string_iterator>' for 1st argument
After the merge of #73, the install directory in config.yaml is being interpreted as relative to the config.yaml in common/. Then, when I run setup_metamodules.py, it looks in $env/install and the meta modules are not generated.
I think the default install directory should be changed to $env/install (https://spack.readthedocs.io/en/latest/configuration.html#spack-specific-variables) so that it is relative to the environment prefix.
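A sketch of the proposed default in the common config.yaml — the key layout follows the spack documentation linked above; depending on the spack version in use, install_tree may be a plain string or a mapping with a root key:

```yaml
# Sketch: install relative to the environment prefix rather than
# relative to the location of the common config.yaml
config:
  install_tree: $env/install
```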
JEDI uses netcdf-cxx4, which is not yet in the spack-stack environments.
This is a complicated issue around Python packages that require installing a newer/different version of pkconfig than the native Python installation provides. The issue in the authoritative spack repo is spack/spack#29308.
In short, with the recent changes in spack to install using pip instead of python setup.py build etc., and an apparent bug in pip, it is required to have a full poetry (py-poetry) package installed.
If one tries to install py-poetry in Spack, then this has a dependency on py-crashtest, which itself requires py-poetry to be installed ... a circular dependency!
The workaround is to use a Python installation that has poetry installed. This is easy if the Python installation on the HPC provides it (e.g. gaea), or if we use a basic miniconda installation on other HPCs and add poetry (as we currently do on hera).
Orion has the same problem, i.e. we need to provide a miniconda installation in the same way as on hera.
On macOS or Linux it's easy: we can just use pip to install poetry outside of spack, and everything works fine.
Note that one doesn't have to / cannot specify the dependency on poetry; having it in the base Python installation is a workaround that does the magic behind the scenes.
Instead of using a singular site.yaml, I think we should keep the individual configs (packages.yaml, compilers.yaml, etc.) in the repository.
This makes it easier to generate configs for each system, because Spack already outputs these files, instead of having to concatenate them together manually.
create-env.py can create the site.yaml from the config files automatically.
It looks like a recent update to spack (spack/spack#28316) breaks building findutils on my macOS, see the comment here: spack/spack#28316 (comment)
The RDHPCS admins recommend moving to Intel 2022.1.2. We have Intel 2021 compilers on these platforms, but should use 2022. The 2021 compilers are vulnerable to the log4j bug (not sure that it's relevant, but this is what we're going to be targeting).
We need ecflow available for running experiments/workflows.
Some systems (hera, jet - others?) have ecflow servers provided by the sysadmins - we need to find a smart way to skip it there. While some of the other HPCs do have installations somewhere else (manual in certain cases), it would be good to provide ecflow as part of spack-stack. For generic macOS/Linux, installing it via spack will certainly be the easiest solution.
Add nlohmann-json and nlohmann-json-schema-validator to packages used with all jedi applications (they are used for YAML validations across JEDI).
I think it'd be good to have an option to build https://github.com/JCSDA-internal/soca with spack-stack.
I started trying to build soca (using jedi-fv3-bundle-env as a starting point), and here are a few things I discovered:
jedi-soca-bundle-env will need nco (some soca tests use ncks).
One of the first tests fails with:
3: ===============================================================================
3: Running test executable
3: ===============================================================================
3: dyld: Library not loaded: @rpath/libatlas_f.0.27.dylib
3: Referenced from: /Users/annash/Documents/jedi/build/all-bundle/bin/soca_gridgen.x
3: Reason: image not found
3: dyld: Library not loaded: @rpath/libatlas_f.0.27.dylib
3: Referenced from: /Users/annash/Documents/jedi/build/all-bundle/bin/soca_gridgen.x
3: Reason: image not found
Looks like the Fortran atlas library isn't found? Any ideas how to fix this?
mapl depends on the ESMF version. In hpc-stack, we capture this by appending the ESMF version to the mapl version. This is, however, somewhat inconsistent, because we don't do it for any other package that depends on another package in the stack (i.e. we don't add the netcdf version to any of the packages, even though many of them depend on it).
Should we try to be more consistent and drop the ESMF dependency information from the mapl install/module name? Simply roll out an entirely new env if a new mapl/esmf combination is required?
Or are ESMF and mapl outliers that change much more often than netcdf, such that it makes sense to add this hierarchy information? We should be able to do this with spack.
I was building the stack on Orion, but I got some strange concretization preferences.
In common/packages.yaml the preferred Python version is marked as 3.8.10. While I was building on Orion it was using the built-in [email protected] for most packages, but fckit was building Python 3.8.10 from scratch. It should prefer the system version.
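One way to make the concretizer prefer the system Python is to register it as a non-buildable external in the site packages.yaml. A minimal sketch — the version and prefix here are hypothetical placeholders and must match the actual system installation:

```yaml
packages:
  python:
    externals:
    # Hypothetical version/prefix; use the values reported by `spack external find python`
    - spec: python@3.6.8
      prefix: /usr
    buildable: false
```

With buildable: false, Spack will refuse to build Python from source and every package (including fckit) must concretize against the external.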
As @aerorahul pointed out, this repo should have a license and copyright information.
The site configs must come first in order to be able to overwrite the common configs. We need to always remember this weird ordering in spack!
This does not work as expected:
include:
- common/packages.yaml
- site/packages.yaml
Instead, it needs to be this way:
include:
- site/packages.yaml
- common/packages.yaml
Currently the bufr package in spack doesn't provide a way to install the Python bindings. We need to add an option (or make it mandatory) to have the Python bindings enabled and installed.
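As a pseudocode-style sketch of what such an option could look like in the Spack recipe — the variant name, CMake option name, and dependencies here are assumptions, not the actual bufr package code:

```python
# Hypothetical fragment of the bufr Spack recipe adding a +python variant;
# variant and CMake option names are assumptions.
class Bufr(CMakePackage):
    variant("python", default=True, description="Build and install the Python bindings")

    depends_on("python", when="+python", type=("build", "run"))
    depends_on("py-numpy", when="+python", type=("build", "run"))

    def cmake_args(self):
        # Forward the variant to the CMake build as ENABLE_PYTHON=ON/OFF
        return [self.define_from_variant("ENABLE_PYTHON", "python")]
```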
UFS uses both a release and a debug build of ESMF. I think this can be accomplished with a variant in the ESMF package.py and the suffix option in the module files.
parallelio fails to build with errors about conflicting type definitions when using apple-clang 12.0.5. This was reported by two people independently. It builds fine on my macOS with [email protected].
I will try to reproduce the error and paste the exact error message here.
I found something very interesting about the ranlib -c used in bufr and a couple of other packages on macOS. It turns out that setting:
if(APPLE)
# The linker on macOS does not include `common symbols` by default
# Passing the -c flag includes them and fixes an error with undefined symbols
set(CMAKE_Fortran_ARCHIVE_FINISH "<CMAKE_RANLIB> -c <TARGET>")
endif()
doesn't work as expected when installing libraries. ranlib is run correctly when creating the archive, but when running make install, ranlib is run again (without the -c) and the archive header is regenerated, negating its effect. This works in UPP because it's built internally as part of UFS (no make install). As far as I can tell there is no way to control the ranlib flags used during the install phase (unless you manually edit the CMake install script), but I did a little research and found something better.
The -fno-common flag can be used with both ifort and gfortran and prevents variables from being added to the common section at compile time. Interestingly, I ran into this when compiling with gfortran; I had only ever experienced it with ifort.
I was adding the MET package, which links to bufr, and even though I saw ranlib -c being used in the library build, my build would fail with missing symbols. Then, if I ran ranlib -c manually on the installed library, the build would work.
On macOS*, if a library built with the ar utility contains objects with Fortran module data but no executable functions, the symbols corresponding to the module data may not be resolved when an object referencing them is linked against the library. You can work around this by compiling with option -fno-common.
Another interesting tidbit is that GCC 10 defaults to using -fno-common, so I wouldn't have experienced this had I not been using GCC 9.
https://gcc.gnu.org/gcc-10/porting_to.html
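Given the install-phase ranlib limitation above, a more robust fix is to apply the flag at compile time instead of patching the archive step. A sketch for a CMake-based package (this is an illustration, not the actual bufr/UPP build code):

```cmake
if(APPLE)
  # Keep Fortran module data out of the common section at compile time,
  # so the installed archive links correctly even though the install-phase
  # ranlib drops the -c flag. Both gfortran and ifort accept -fno-common.
  add_compile_options($<$<COMPILE_LANGUAGE:Fortran>:-fno-common>)
endif()
```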
Not directly a spack-stack related issue, but I thought this would be a good place to document it.
Working on porting Spack build to Azure, AWS, and Google Cloud in NOAA Parallel Works.
I think we've seen this before, but I don't know why. When I attempt to build on Orion, NetCDF-Fortran fails because NetCDF-C is built statically.
When looking at the config.log for NetCDF, it says checking if libtool supports shared libraries... no.
Same error as spack/spack#29354
@climbfuji is there something we can do with these internal repos? For example, put them in their own bundle until (if/when) they become public.
Error loading libimf.so on Ubuntu. Need to set LD_LIBRARY_PATH or EXTRA_RPATHS in compilers.yaml when trying to avoid using --dirty.
Create documentation
Some ideas:
Finding compiler/modules
Finding common external packages
Spack Mirrors/Caches
Manually Configuring Compilers/versions
Need a site.yaml for various systems.
This should contain compilers, modules, and external packages.
Some thought should be put into how specific we want to be for generic systems like macOS/Linux. I think it might be better to let users run spack compiler find and spack external find rather than hardcoding system paths.
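If we do ship a generic site config anyway, an entry in compilers.yaml would look roughly like the following; the paths, version, OS string, and target are placeholders that spack compiler find would normally detect:

```yaml
compilers:
- compiler:
    # Placeholder values; `spack compiler find` fills these in per system
    spec: gcc@9.4.0
    paths:
      cc: /usr/bin/gcc
      cxx: /usr/bin/g++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    operating_system: ubuntu20.04
    target: x86_64
    modules: []
    flags: {}
```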
The last remaining restricted libraries from ECMWF are now publicly available: ectrans replaces trans, and fiat replaces the old auxiliary library.
https://github.com/ecmwf-ifs/fiat
https://github.com/ecmwf-ifs/ectrans
We need to make this change in our spack-stack repos and configuration files.
Hera (and other RDHPCS platforms) block outgoing connections by default. This means Spack can't download the Clingo binary and it has to be built from source.
It also turns out there was another bug which prevented bootstrapping from source because Spack was detecting Hera as a Cray system.
Applying the fix here allowed me to build Clingo: spack/spack#28726
This requires a GCC compiler in your compilers.yaml that supports C++14 (GCC 5 or newer).
Then I ran spack bootstrap untrust github-actions. After that, the firewall prevented me from doing more, but I think a Spack mirror would resolve that issue.
On Orion I'm getting:
[+] oah2cnu ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
And
[+] lnctc7m ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
because two versions of py-setuptools are being used (57.4.0 and 59.4.0):
[+] yi6zl45 ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
[+] dqwpoqv ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
The bufr py-cython from nceplibs-bundle is different from the one in jedi-base-env.
I'm thinking one of the Python packages in JEDI is somehow forcing a different version (a conflict somewhere?), because nceplibs has no other use of Python. It is strange, though, that the nceplibs-bundle bufr is trying to use an older version of py-setuptools. Adding version 57.4.0 to packages.yaml lets them all be built with the same version, but trying to prefer 59.4.0 doesn't work (57.4.0 is still built in one of the concretizations).
I'm going to look into it a little more to see why the concretizations are different, but preferring 57.4.0 should work.
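The workaround of pinning py-setuptools in the common packages.yaml, so that both concretizations agree on one version, looks like this:

```yaml
packages:
  py-setuptools:
    # Pin to the version both bundles can agree on (57.4.0, per the specs above)
    version: [57.4.0]
```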
I had an issue with packages being concretized more than once with different hashes, and then module generation fails.
For example:
[+] p646jz7 ^[email protected]%[email protected]~ipo+python build_type=RelWithDebInfo patches=e1df1ac218da54b57e84712dfc0b6735b84ce14a72aa18ac0c3533d7d94bb740 arch=linux-centos7-skylake_avx512
[+] 5j5zboe ^[email protected]%[email protected]+blas+lapack patches=873745d7b547857fcfec9cae90b09c133b42a4f0c23b6c2d84cf37e2dd816604 arch=linux-centos7-skylake_avx512
[+] lnctc7m ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
[+] gb2le4x ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
[+] dqwpoqv ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
[+] wxbkvbm ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
[+] p646jz7 ^[email protected]%[email protected]~ipo+python build_type=RelWithDebInfo patches=e1df1ac218da54b57e84712dfc0b6735b84ce14a72aa18ac0c3533d7d94bb740 arch=linux-centos7-skylake_avx512
[+] bm3glwv ^[email protected]%[email protected]+blas+lapack patches=873745d7b547857fcfec9cae90b09c133b42a4f0c23b6c2d84cf37e2dd816604 arch=linux-centos7-skylake_avx512
[+] qbxqeen ^[email protected]%[email protected]~bignuma~consistent_fpcsr~ilp64+locking+noavx512+pic+shared symbol_suffix=none threads=none arch=linux-centos7-skylake_avx512
[+] lmjma7n ^[email protected]%[email protected]+cpanm+shared+threads patches=0eac10ed90aeb0459ad8851f88081d439a4e41978e586ec743069e8b059370ac,3bbd7d6f9933d80b9571533867b444c6f8f5a1ba0575bfba1fba4db9d885a71a arch=linux-centos7-skylake_avx512
[+] oah2cnu ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
[+] gb2le4x ^[email protected]%[email protected] arch=linux-centos7-skylake_avx512
You can see two instances of py-numpy being built, with hashes 5j5zboe and bm3glwv.
A solution I found is to run spack install --reuse or spack concretize --reuse; then I only see one instance of py-numpy being built, with hash 5j5zboe.
The --reuse behavior is supposed to be the default in the next version of Spack.
For the jedi-fv3 environment, the stack installs HDF5 version 1.10.6, but ioda requires HDF5 1.12.0 or greater to run successfully (v1.12.1 preferred).
HDF5 1.12.0 refactored several functions related to chunking and dataspace selections, greatly improving performance when reading and writing large files. IODA also uses several of the new APIs in a few parts of the code (and should error out in ctest when these functions are not available).
https://www.hdfgroup.org/2020/03/release-of-hdf5-1-12-0-newsletter-172/
Rather than loading NetCDF-C and NetCDF-Fortran individually (they already depend on each other), it would be nice to have a netcdf module that encapsulates both. I think this might be possible using a BundlePackage of netcdf-c and netcdf-fortran.
Otherwise, they'll have to be separate or created externally.
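A pseudocode-style sketch of what such a BundlePackage recipe could look like — the package name and version here are placeholders, not an existing Spack package:

```python
# Hypothetical "netcdf" meta-package bundling the C and Fortran libraries;
# a BundlePackage installs nothing itself, it only pulls in its dependencies
# and gets its own module file.
class Netcdf(BundlePackage):
    version("4.8.1")  # placeholder version

    depends_on("netcdf-c")
    depends_on("netcdf-fortran")
```

Loading the generated netcdf module would then bring both libraries into the environment at once.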
I was using Python 3.10 and my py-numpy build failed. Maybe add a conflict to the package, update numpy, or at least document it.
We should specify the providers for blas, lapack, and fftw in the site config (or provide instructions for doing so for the default configs). intel-oneapi-mkl or intel-mkl works fine on most HPCs that have Intel compilers installed, but on macOS I had lots of trouble with intel-oneapi-mkl and building Python packages. I ended up removing mkl and using this instead:
blas:: [openblas]
lapack:: [openblas]
fftw-api:: [fftw]
As described in #40, and partially implemented in #43, we need providers for the linalg packages. Depending on the system, this can be a combination of
blas:: [openblas]
lapack:: [openblas]
fftw-api:: [fftw]
(this works for mac and other systems)
or everything can be provided by intel-mkl. As described in #40, using intel-mkl on macOS leads to errors with Python 3.9; also, installing intel-mkl for clang/gcc+gfortran users is an unnecessary burden.
However, for systems using Intel compilers, in particular when performance is important, we should use MKL.
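For Intel-based sites, the corresponding provider section might look like the following; whether intel-oneapi-mkl also serves as the fftw-api provider should be verified for the Spack version in use:

```yaml
packages:
  all:
    providers:
      blas: [intel-oneapi-mkl]
      lapack: [intel-oneapi-mkl]
      # Verify fftw-api support before relying on MKL's FFTW interface
      fftw-api: [intel-oneapi-mkl]
```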
Currently ESMF and MAPL only get built in debug mode when installing the ufs-weather-model app (or the ufs-weather-model-env). This is because a single meta package cannot rely on two different versions of the same package.
Also, the module names (tcl, lmod) for mapl are off: mapl/2.8.1-debug-esmf-8.2.0-debug-debug-esmf-8.2.0-debug. This should be mapl/2.8.1-debug-esmf-8.2.0-debug for the debug version, and mapl/2.8.1-esmf-8.2.0 for the release version.
Note that in order to have a consistent set of modules, mapl must be built in debug mode when esmf+debug is loaded; this is currently inconsistent in the ufs-weather-model / hpc-stack.
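The duplicated suffix could likely be avoided by generating the -debug part through module suffixes rather than baking it into the version string. A sketch of the modules config (the exact spec constraints would need testing):

```yaml
modules:
  tcl:
    esmf:
      suffixes:
        # Append "-debug" to the esmf module name for debug builds
        '+debug': debug
    mapl:
      suffixes:
        # Append "-debug" to mapl when it was built against esmf+debug
        '^esmf+debug': debug
```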