Comments (14)
Yes, I think I use this downstream in clifford. I'll try to find some time to diagnose this; my guess is that numba just changed their API somehow.
Sure thing -- just determining what needs changing and on which side at the moment. 😉
For completeness, current master fails with the same errors, plus a few more:
# same docker setup as above
zypper in git-core
git clone https://github.com/pydata/sparse.git
cd sparse
# remove coverage flags
sed -i /addopts/d pytest.ini
PYTHONPATH=$PWD pytest -v
958ff8a328b4:/sparse # PYTHONPATH=$PWD pytest -v
=============================================================================== test session starts ================================================================================
platform linux -- Python 3.10.10, pytest-7.3.1, pluggy-1.0.0 -- /usr/bin/python3.10
cachedir: .pytest_cache
rootdir: /sparse
configfile: pytest.ini
testpaths: sparse
collected 5523 items
...
===================================================================================== FAILURES =====================================================================================
_____________________________________________________________________________ TestBasic.test_roundtrip _____________________________________________________________________________
self = <test_coo_numba.TestBasic object at 0x7f8e7da62cb0>
def test_roundtrip(self):
c1 = sparse.COO(np.eye(3), fill_value=1)
> c2 = identity(c1)
sparse/tests/test_coo_numba.py:41:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.10/site-packages/numba/core/dispatcher.py:468: in _compile_for_args
error_rewrite(e, 'typing')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
e = TypingError('Failed in nopython mode pipeline (step: nopython frontend)\nnon-precise type pyobject\nDuring: typing of ...caused by the following argument(s):\n- argument 0: Cannot determine Numba type of <class \'sparse._coo.core.COO\'>\n')
issue_type = 'typing'
def error_rewrite(e, issue_type):
"""
Rewrite and raise Exception `e` with help supplied based on the
specified issue_type.
"""
if config.SHOW_HELP:
help_msg = errors.error_extras[issue_type]
e.patch_message('\n'.join((str(e).rstrip(), help_msg)))
if config.FULL_TRACEBACKS:
raise e
else:
> raise e.with_traceback(None)
E numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
E non-precise type pyobject
E During: typing of argument at /sparse/sparse/tests/test_coo_numba.py (8)
E
E File "sparse/tests/test_coo_numba.py", line 8:
E
E @numba.njit
E ^
E
E This error may have been caused by the following argument(s):
E - argument 0: Cannot determine Numba type of <class 'sparse._coo.core.COO'>
/usr/lib64/python3.10/site-packages/numba/core/dispatcher.py:409: TypingError
...
============================================================================= short test summary info ==============================================================================
FAILED sparse/tests/test_coo_numba.py::TestBasic::test_roundtrip - numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
FAILED sparse/tests/test_coo_numba.py::TestBasic::test_roundtrip_constant - numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
FAILED sparse/tests/test_coo_numba.py::TestBasic::test_unpack_attrs - numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
FAILED sparse/tests/test_coo_numba.py::TestBasic::test_repack_attrs - numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
FAILED sparse/tests/test_dot.py::test_tensordot[coo-gcxs-a_shape1-b_shape1-axes1] - ValueError: 'list' argument must have no negative elements
FAILED sparse/tests/test_dot.py::test_tensordot[coo-gcxs-a_shape3-b_shape3-axes3] - numpy.core._exceptions._ArrayMemoryError: Unable to allocate 685. TiB for an array with shape (94151847520437,) and data type int64
FAILED sparse/tests/test_dot.py::test_tensordot[coo-gcxs-a_shape7-b_shape7-axes7] - ValueError: array is too big; `arr.size * arr.dtype.itemsize` is larger than the maximum possible size.
FAILED sparse/tests/test_dot.py::test_tensordot[coo-gcxs-a_shape9-b_shape9-0] - ValueError: 'list' argument must have no negative elements
FAILED sparse/tests/test_dot.py::test_tensordot[gcxs-coo-a_shape1-b_shape1-axes1] - ValueError: array is too big; `arr.size * arr.dtype.itemsize` is larger than the maximum possible size.
====================================================== 9 failed, 5429 passed, 34 xfailed, 51 xpassed, 280 warnings in 55.48s =======================================================
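As a side note on the `test_tensordot` failures above: the combination of "'list' argument must have no negative elements" and an absurd 685 TiB allocation is consistent with (though not proven here to be) a flattened-index product overflowing a signed 64-bit integer. A stdlib-only toy illustration of that failure mode:

```python
# Illustrative sketch only -- the overflow diagnosis is an assumption,
# not something established in this thread.
def wrap_int64(x):
    """Emulate signed 64-bit wraparound arithmetic on Python's big ints."""
    x &= (1 << 64) - 1
    return x - (1 << 64) if x >= (1 << 63) else x

shape = (2**21, 2**21, 2**21)  # hypothetical large nd-array shape
size = 1
for dim in shape:
    size = wrap_int64(size * dim)

print(size)  # 2**63 wraps to -9223372036854775808
```

Once the product wraps negative, downstream code that treats it as an element count will either reject it outright or, after further arithmetic, request a nonsensical allocation.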
Could you perhaps pin Numba to 0.56 instead of 0.57 and check if that works?
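A pin like that could be expressed in a pip constraints file along these lines (a sketch; the file name is hypothetical, and numba 0.56 pairs with llvmlite 0.39):

```
# constraints.txt (hypothetical)
numba==0.56.*
llvmlite==0.39.*
```

Installing with `pip install -c constraints.txt sparse` would then keep the downgraded pair in place.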
I believe @eric-wieser wrote the failing code.
Eric, is this used anywhere to the best of your knowledge?
Going back to Numba 0.56 is not an option for us, specifically because of the lack of Python 3.11 support. But yes, downgrading numba via pip (which also downgrades numpy) results in passing tests.
You can view this bug report as a humble indication that sparse needs some adjustments in order to keep up with the numba and numpy APIs.
docker pull opensuse/tumbleweed
docker run -it --name sparsetest opensuse/tumbleweed
In the docker shell:
zypper ref
zypper install python310-dask-array python310-numba python310-numpy python310-pytest python310-scipy wget
wget https://files.pythonhosted.org/packages/source/s/sparse/sparse-0.14.0.tar.gz
tar xf sparse-0.14.0.tar.gz
cd sparse-0.14.0
PYTHONPATH=$PWD pytest -v
System
- OS and version: openSUSE Tumbleweed Linux
- sparse version 0.14.0
- NumPy version 1.24.2
- Numba version 0.57
Full build log from opensuse build service: osc-build-sparse.txt
I am unable to reproduce this given the instructions above. The debugging information is ambiguous. When I execute the commands you list I receive Numba 0.56.4 -- not 0.57.0 -- and so the bug doesn't manifest.
Numba in Tumbleweed is at 0.57.0 for sure. You must have forgotten to pull the latest tumbleweed docker image.
e5338e6c78b5:/sparse-0.14.0 # cat /etc/os-release
NAME="openSUSE Tumbleweed"
# VERSION="20230607"
ID="opensuse-tumbleweed"
ID_LIKE="opensuse suse"
VERSION_ID="20230607"
PRETTY_NAME="openSUSE Tumbleweed"
ANSI_COLOR="0;32"
CPE_NAME="cpe:/o:opensuse:tumbleweed:20230607"
BUG_REPORT_URL="https://bugzilla.opensuse.org"
SUPPORT_URL="https://bugs.opensuse.org"
HOME_URL="https://www.opensuse.org"
DOCUMENTATION_URL="https://en.opensuse.org/Portal:Tumbleweed"
LOGO="distributor-logo-Tumbleweed"
e5338e6c78b5:/sparse-0.14.0 # zypper info python310-numba
Loading repository data...
Reading installed packages...
Information for package python310-numba:
----------------------------------------
Repository : openSUSE-Tumbleweed-Oss
Name : python310-numba
Version : 0.57.0-1.1
Arch : x86_64
Vendor : openSUSE
Installed Size : 22.5 MiB
Installed : Yes
Status : up-to-date
Source package : python-numba-0.57.0-1.1.src
Upstream URL : https://numba.pydata.org/
Summary : NumPy-aware optimizing compiler for Python using LLVM
Description :
Numba is a NumPy-aware optimizing compiler for Python. It uses the
LLVM compiler infrastructure to compile Python syntax to
machine code.
It is aware of NumPy arrays as typed memory regions and so can speed-up
code using NumPy arrays. Other, less well-typed code will be translated
to Python C-API calls, effectively removing the "interpreter", but not removing
the dynamic indirection.
Numba is also not a tracing JIT. It *compiles* your code before it gets
run, either using run-time type information or type information you provide
in the decorator.
Numba is a mechanism for producing machine code from Python syntax and typed
data structures such as those that exist in NumPy.
e5338e6c78b5:/sparse-0.14.0 #
I'll try to reproduce this tomorrow or the day after on my local machine; however, I believe openSUSE rolled back support since the bug (and perhaps others).
zypper info python310-numba
ffdb52c0c496:/sparse-0.14.0 # cat /etc/os-release
NAME="openSUSE Tumbleweed"
# VERSION="20230604"
ID="opensuse-tumbleweed"
ID_LIKE="opensuse suse"
VERSION_ID="20230604"
PRETTY_NAME="openSUSE Tumbleweed"
ANSI_COLOR="0;32"
CPE_NAME="cpe:/o:opensuse:tumbleweed:20230604"
BUG_REPORT_URL="https://bugzilla.opensuse.org"
SUPPORT_URL="https://bugs.opensuse.org"
HOME_URL="https://www.opensuse.org"
DOCUMENTATION_URL="https://en.opensuse.org/Portal:Tumbleweed"
LOGO="distributor-logo-Tumbleweed"
ffdb52c0c496:/sparse-0.14.0 # zypper info python310-numba
Loading repository data...
Reading installed packages...
Information for package python310-numba:
----------------------------------------
Repository : openSUSE-Tumbleweed-Oss
Name : python310-numba
Version : 0.57.0-1.1
Arch : aarch64
Vendor : openSUSE
Installed Size : 23.0 MiB
Installed : Yes
Status : out-of-date (version 0.56.4-2.1 installed)
Source package : python-numba-0.57.0-1.1.src
Upstream URL : https://numba.pydata.org/
Summary : NumPy-aware optimizing compiler for Python using LLVM
Description :
Numba is a NumPy-aware optimizing compiler for Python. It uses the
LLVM compiler infrastructure to compile Python syntax to
machine code.
It is aware of NumPy arrays as typed memory regions and so can speed-up
code using NumPy arrays. Other, less well-typed code will be translated
to Python C-API calls, effectively removing the "interpreter", but not removing
the dynamic indirection.
Numba is also not a tracing JIT. It *compiles* your code before it gets
run, either using run-time type information or type information you provide
in the decorator.
Numba is a mechanism for producing machine code from Python syntax and typed
data structures such as those that exist in NumPy.
But then
ffdb52c0c496:/sparse-0.14.0 # numba -s | grep "Numba Version"
/usr/bin/python3.10: No module named pip
Numba Version : 0.56.4
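A quick way to cross-check which build the interpreter actually imports, regardless of what the rpm database advertises, is the stdlib's `importlib.metadata` (a sketch; available on Python 3.8+):

```python
from importlib import metadata

def installed_version(dist_name):
    """Return the version the interpreter will actually import, or None."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

# On the reporter's container this path reported 0.56.4 even though
# zypper's metadata advertised 0.57.0 (versions quoted from the thread).
print(installed_version("numba"))
```

This sidesteps the package-manager metadata entirely and asks the import machinery directly.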
Arch : aarch64
...
Status : out-of-date (version 0.56.4-2.1 installed)
There is your problem. Aarch64 is not on the latest rolling release.
Arch : aarch64 ... Status : out-of-date (version 0.56.4-2.1 installed)
There is your problem. Aarch64 is not on the latest rolling release.
Oh, I see, I only have this hardware available at present...
I tried updating the package manually, and now pytest segfaults for me.
ffdb52c0c496:/sparse-0.14.0 # zypper install python310-numba
Loading repository data...
Reading installed packages...
Resolving package dependencies...
The following 2 packages are going to be upgraded:
python310-llvmlite python310-numba
The following NEW package is going to be installed:
libLLVM14
2 packages to upgrade, 1 new.
Overall download size: 24.2 MiB. Already cached: 0 B. After the operation, additional 91.5 MiB will be used.
Continue? [y/n/v/...? shows all options] (y): y
Retrieving: libLLVM14-14.0.6-9.1.aarch64 (openSUSE-Tumbleweed-Oss) (1/3), 19.7 MiB
Retrieving: libLLVM14-14.0.6-9.1.aarch64.rpm ..........................................................................................................[done (717.4 KiB/s)]
Retrieving: python310-llvmlite-0.40.0-1.1.aarch64 (openSUSE-Tumbleweed-Oss) (2/3), 270.5 KiB
Retrieving: python310-llvmlite-0.40.0-1.1.aarch64.rpm .................................................................................................[done (267.4 KiB/s)]
Retrieving: python310-numba-0.57.0-1.1.aarch64 (openSUSE-Tumbleweed-Oss) (3/3), 4.2 MiB
Retrieving: python310-numba-0.57.0-1.1.aarch64.rpm ....................................................................................................[done (710.1 KiB/s)]
Checking for file conflicts: ........................................................................................................................................[done]
(1/3) Installing: libLLVM14-14.0.6-9.1.aarch64 ......................................................................................................................[done]
(2/3) Installing: python310-llvmlite-0.40.0-1.1.aarch64 .............................................................................................................[done]
warning: file pycc: remove failed: No such file or directory
(3/3) Installing: python310-numba-0.57.0-1.1.aarch64 ................................................................................................................[done]
ffdb52c0c496:/sparse-0.14.0 # PYTHONPATH=$PWD pytest -v
=========================================================================== test session starts ===========================================================================
platform linux -- Python 3.10.10, pytest-7.3.1, pluggy-1.0.0 -- /usr/bin/python3.10
cachedir: .pytest_cache
rootdir: /sparse-0.14.0
collected 5496 items
sparse/tests/test_array_function.py::test_unary[mean] Fatal Python error: Segmentation fault
Current thread 0x0000ffffb66c2020 (most recent call first):
File "/sparse-0.14.0/sparse/_coo/indexing.py", line 179 in _mask
File "/sparse-0.14.0/sparse/_coo/indexing.py", line 77 in getitem
File "/sparse-0.14.0/sparse/_sparse_array.py", line 386 in reduce
File "/sparse-0.14.0/sparse/_sparse_array.py", line 278 in _reduce
File "/sparse-0.14.0/sparse/_sparse_array.py", line 307 in __array_ufunc__
File "/sparse-0.14.0/sparse/_sparse_array.py", line 419 in sum
File "/sparse-0.14.0/sparse/_sparse_array.py", line 686 in mean
File "/sparse-0.14.0/sparse/_sparse_array.py", line 268 in __array_function__
File "<__array_function__ internals>", line 200 in mean
File "/sparse-0.14.0/sparse/tests/test_array_function.py", line 27 in test_unary
File "/usr/lib/python3.10/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call
File "/usr/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall
File "/usr/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec
File "/usr/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__
File "/usr/lib/python3.10/site-packages/_pytest/python.py", line 1799 in runtest
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call
File "/usr/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall
File "/usr/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec
File "/usr/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 262 in <lambda>
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 341 in from_call
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 261 in call_runtest_hook
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 222 in call_and_report
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 133 in runtestprotocol
File "/usr/lib/python3.10/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol
File "/usr/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall
File "/usr/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec
File "/usr/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__
File "/usr/lib/python3.10/site-packages/_pytest/main.py", line 348 in pytest_runtestloop
File "/usr/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall
File "/usr/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec
File "/usr/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__
File "/usr/lib/python3.10/site-packages/_pytest/main.py", line 323 in _main
File "/usr/lib/python3.10/site-packages/_pytest/main.py", line 269 in wrap_session
File "/usr/lib/python3.10/site-packages/_pytest/main.py", line 316 in pytest_cmdline_main
File "/usr/lib/python3.10/site-packages/pluggy/_callers.py", line 39 in _multicall
File "/usr/lib/python3.10/site-packages/pluggy/_manager.py", line 80 in _hookexec
File "/usr/lib/python3.10/site-packages/pluggy/_hooks.py", line 265 in __call__
File "/usr/lib/python3.10/site-packages/_pytest/config/__init__.py", line 166 in main
File "/usr/lib/python3.10/site-packages/_pytest/config/__init__.py", line 189 in console_main
File "/usr/bin/pytest-3.10", line 33 in <module>
Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, scipy._lib._ccallback_c, yaml._yaml, numba.core.typeconv._typeconv, numba._helperlib, numba._dynfunc, numba._dispatcher, numba.core.runtime._nrt_python, numba.np.ufunc._internal, numba.experimental.jitclass._box, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._isolve._iterative, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg._cythonized_array_utils, scipy.linalg._flinalg, scipy.linalg._solve_toeplitz, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_lapack, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, numpy.linalg.lapack_lite, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.special._ellip_harm_2, scipy.spatial.transform._rotation, scipy.ndimage._nd_image, _ni_label, scipy.ndimage._ni_label, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize.__nnls, 
scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._statlib, scipy.stats._mvn, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._rcont.rcont (total: 122)
Segmentation fault
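Incidentally, the Python-level crash trace above comes from pytest enabling the stdlib `faulthandler` by default; the same facility can be switched on in any standalone script when hunting a segfault:

```python
import faulthandler

# Dump a Python traceback on SIGSEGV/SIGFPE/SIGABRT, like the pytest output above.
faulthandler.enable()
print(faulthandler.is_enabled())  # True
```

That narrows a crash down to the Python frame in flight, though pinpointing the faulting native code still takes gdb or a core dump.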
Try doing a full zypper dup, or just use a virtual environment without openSUSE on your end. Maybe the wheels or conda packages for arm suit your needs better and still reproduce the error.
I'll try to reproduce this tomorrow or the day after on my local machine; however, I believe openSUSE rolled back support since the bug (and perhaps others).
OK, thank you. Unfortunately this means any potential fixes will not make it into 0.57.1 -- we have already delayed the release by a few days trying to fix and debug this, and we need to get on with the release. Unfortunately, everyone who looked at this came to the conclusion that this is probably not on the Numba side.
@kc611 will update the corresponding Numba issue numba/numba#8993 with details on what else was tried.
If this does turn out to be on the Numba side, we will release a 0.57.2 with a fix.
@bnavigator Would you be willing to make a PR based on numba/numba#8993 (comment)?
If not, I can do it in the coming week; that's when I get my desktop back.