
Comments (16)

Ferenc- commented on May 20, 2024

Hi @ikaakkola,

Well, that's how AutoTracing works: it checks the available Python modules and instruments them. If you have modules in your dependency tree that nobody uses, it is in general a good idea to remove them.

From your traceback (provided it was taken while you experienced the hang), the only thing I can see is that the from google.cloud import storage statement hangs, not that instana has anything to do with it. So if you would like to keep AutoTrace on with the current dependency tree, some more input would be needed from you. Perhaps you could insert from google.cloud import storage into that call path yourself and see whether it hangs anyway?

It would also be useful to know how the traceback continues after File "/usr/local/lib/python3.6/dist-packages/google/cloud/storage/batch.py", line 30, as that should not be the end of the traceback: the way I see it, the import chain continues with from google.cloud.storage._http import Connection, and that is where the google module starts checking your environment variables, which in fact involves system calls and can hang.
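If it is specifically the import that stalls, a minimal hedged way to check it in isolation (assuming google-cloud-storage is installed; the 30-second watchdog timeout is an arbitrary choice) would be:

import faulthandler

# Dump all thread stacks and abort if the import stalls for 30 seconds.
faulthandler.dump_traceback_later(30, exit=True)
from google.cloud import storage  # the import under suspicion
faulthandler.cancel_dump_traceback_later()
print('import completed without hanging')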


ikaakkola commented on May 20, 2024

Hi,

Agreed, this is how AutoTracing works. The module is installed because the service supports taking backups to Google Cloud Storage, so the relevant Python modules need to be available; the decision of 'where to back up' is a configuration option. So having modules that "nobody uses" isn't entirely accurate: the service could be configured to use the module.

I have not been able to get from google.cloud import storage to hang (e.g. via some test.py that does the import). It does, however, hang consistently when called via https://github.com/instana/python-sensor/blob/master/instana/instrumentation/google/cloud/storage.py - only not when strace or gdb is attached to the process doing the import.

It would indeed be very useful to know how the traceback continues, but this is all the Python traceback gave me with PYTHONFAULTHANDLER=true and kill -SIGABRT. I am not a Python developer, so I do not know whether there is some way to get a longer traceback (is there some property that defines the maximum length of traceback to return?).


Ferenc- commented on May 20, 2024

Perhaps you could try pip3 install pyrasite, then attach to the hanging process with pyrasite-shell ${PID_OF_YOUR_HANGING_PROCESS},
and once you get that shell, copy and execute this loop:

import sys, traceback

for thread_id, frame in sys._current_frames().items():
  print('Stack for thread {}'.format(thread_id))
  traceback.print_stack(frame)
  print('')
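
If pyrasite cannot attach, a hedged alternative (Unix only) is to have the process register a faulthandler signal handler up front, so the stacks of all threads can be dumped on demand without killing the process. Note that faulthandler truncates very deep stacks (the limit is 100 frames in CPython, and it is hardcoded):

import faulthandler
import signal

# Dump the stacks of all threads to stderr whenever SIGUSR1 arrives.
faulthandler.register(signal.SIGUSR1, all_threads=True, chain=False)

Then, from another shell: kill -SIGUSR1 ${PID_OF_YOUR_HANGING_PROCESS}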


ikaakkola commented on May 20, 2024

I tried to manually get the instana instrumentation code to hang, but could not (what I tried was to pip3 install instana, import instana inside the patroni process, and then restart the patroni service).

Based on "Once the Instana host agent is installed on your host, it will periodically scan for running Python processes and automatically apply the Instana Python instrumentation transparently." (quoted from the documentation at https://www.ibm.com/docs/en/obi/current?topic=package-python-configuration-configuring-instana#autotrace), it seems that this automatic application of the instrumentation to the running process is what causes it to hang.

I will reconfigure the environment so that instana is again using AutoTrace and will try the above debugging process once I get Patroni to hang again.


ikaakkola commented on May 20, 2024

pyrasite-shell hangs when launched against the pid of the hanging process.

When com.instana.plugin.python -> autotrace -> enabled: true is set, all the existing 'patroni' processes in all the running service containers hung after a moment, so this seems to happen 100% of the time when AutoTrace is used against spilo+patroni PostgreSQL pods.
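
For reference, a hedged sketch of that toggle as it would sit in the host agent's configuration.yaml, assuming the usual nesting of the key path quoted above:

com.instana.plugin.python:
  autotrace:
    enabled: true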


Ferenc- commented on May 20, 2024

And did you get the traceback from the shell?


ikaakkola commented on May 20, 2024

I did not, because the shell does not open into any usable state; it is just stuck there.


ikaakkola commented on May 20, 2024

When I run the service with INSTANA_DEBUG=true, I get the following on STDERR before the process hangs:

2022-07-04 10:14:04,724: 831 INFO instana: Stan is on the scene.  Starting Instana instrumentation version: 1.37.3
2022-07-04 10:14:04,724: 831 DEBUG instana: Loading Host Collector
2022-07-04 10:14:04,724: 831 DEBUG instana: Initializing host agent state machine
2022-07-04 10:14:04,724 INFO: Stan is on the scene.  Starting Instana instrumentation version: 1.37.3
2022-07-04 10:14:04,724 DEBUG: Loading Host Collector
2022-07-04 10:14:04,724 DEBUG: Initializing host agent state machine
2022-07-04 10:14:04,788: 831 DEBUG instana: Instrumenting asyncio
2022-07-04 10:14:04,788 DEBUG: Instrumenting asyncio

(this is exactly as written to a debug file with 2>/tmp/patroni.stderr)

It looks like "Stan is on the scene" appears twice - could this cause some weird deadlock issue?

There is a zombie 'ldconfig.real' process as a child of the Patroni python process, which appears at the same moment the patroni process hangs (which would be when AutoTrace starts applying the instrumentation).


ikaakkola commented on May 20, 2024

I added

import sys, traceback

for thread_id, frame in sys._current_frames().items():
  print('Stack for thread {}'.format(thread_id))
  traceback.print_stack(frame)
  print('')

into /usr/local/lib/python3.6/dist-packages/google/cloud/storage/batch.py just before line 28

which gave me the following (so this is from just before the process hangs):

  File "/usr/lib/python3.6/threading.py", line 884, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 1180, in run
    self.finished.wait(self.interval)
  File "/usr/lib/python3.6/threading.py", line 551, in wait
    signaled = self._cond.wait(timeout)
  File "/usr/lib/python3.6/threading.py", line 299, in wait
    gotit = waiter.acquire(True, timeout)
  File "/usr/lib/python3.6/threading.py", line 884, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/log.py", line 174, in run
    record = self._queue_handler.queue.get(True)
  File "/usr/lib/python3.6/queue.py", line 164, in get
    self.not_empty.wait()
  File "/usr/lib/python3.6/threading.py", line 295, in wait
    waiter.acquire()
  File "/usr/lib/python3.6/threading.py", line 884, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.6/socketserver.py", line 236, in serve_forever
    ready = selector.select(poll_interval)
  File "/usr/lib/python3.6/selectors.py", line 376, in select
    fd_event_list = self._poll.poll(timeout)
  File "/usr/lib/python3.6/threading.py", line 884, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/postgresql/callback_executor.py", line 29, in run
    self._condition.wait()
  File "/usr/lib/python3.6/threading.py", line 295, in wait
    waiter.acquire()
  File "/usr/lib/python3.6/threading.py", line 884, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 654, in run
    self._build_cache()
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 625, in _build_cache
    self._do_watch(objects.metadata.resource_version)
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 611, in _do_watch
    for event in iter_response_objects(response):
  File "/usr/local/lib/python3.6/dist-packages/patroni/utils.py", line 383, in iter_response_objects
    for chunk in response.read_chunked(decode_content=False):
  File "/usr/lib/python3/dist-packages/urllib3/response.py", line 598, in read_chunked
    self._update_chunk_length()
  File "/usr/lib/python3/dist-packages/urllib3/response.py", line 540, in _update_chunk_length
    line = self._fp.fp.readline()
  File "/usr/lib/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.6/ssl.py", line 1012, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.6/ssl.py", line 874, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib/python3.6/ssl.py", line 631, in read
    v = self._sslobj.read(len, buffer)
  File "/usr/lib/python3.6/threading.py", line 884, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 654, in run
    self._build_cache()
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 625, in _build_cache
    self._do_watch(objects.metadata.resource_version)
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 611, in _do_watch
    for event in iter_response_objects(response):
  File "/usr/local/lib/python3.6/dist-packages/patroni/utils.py", line 383, in iter_response_objects
    for chunk in response.read_chunked(decode_content=False):
  File "/usr/lib/python3/dist-packages/urllib3/response.py", line 598, in read_chunked
    self._update_chunk_length()
  File "/usr/lib/python3/dist-packages/urllib3/response.py", line 540, in _update_chunk_length
    line = self._fp.fp.readline()
  File "/usr/lib/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.6/ssl.py", line 1012, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.6/ssl.py", line 874, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib/python3.6/ssl.py", line 631, in read
    v = self._sslobj.read(len, buffer)
  File "/usr/local/bin/patroni", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.6/dist-packages/patroni/__main__.py", line 143, in main
    return patroni_main()
  File "/usr/local/lib/python3.6/dist-packages/patroni/__main__.py", line 135, in patroni_main
    abstract_main(Patroni, schema)
  File "/usr/local/lib/python3.6/dist-packages/patroni/daemon.py", line 100, in abstract_main
    controller.run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/__main__.py", line 105, in run
    super(Patroni, self).run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/daemon.py", line 59, in run
    self._run_cycle()
  File "/usr/local/lib/python3.6/dist-packages/patroni/__main__.py", line 117, in _run_cycle
    self.schedule_next_run()
  File "/usr/local/lib/python3.6/dist-packages/patroni/__main__.py", line 99, in schedule_next_run
    elif self.ha.watch(nap_time):
  File "/usr/local/lib/python3.6/dist-packages/patroni/ha.py", line 1570, in watch
    return self.dcs.watch(leader_index, timeout)
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 1137, in watch
    return super(Kubernetes, self).watch(None, timeout + 0.5)
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/__init__.py", line 939, in watch
    self.event.wait(timeout)
  File "/usr/lib/python3.6/threading.py", line 551, in wait
    signaled = self._cond.wait(timeout)
  File "/usr/lib/python3.6/threading.py", line 299, in wait
    gotit = waiter.acquire(True, timeout)
  File "<string>", line 1, in <module>
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/tmp/.instana/python/instana/__init__.py", line 206, in <module>
    boot_agent()
  File "/tmp/.instana/python/instana/__init__.py", line 155, in boot_agent
    from .instrumentation.google.cloud import storage
  File "<frozen importlib._bootstrap>", line 1023, in _handle_fromlist
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/tmp/.instana/python/instana/instrumentation/google/cloud/storage.py", line 14, in <module>
    from google.cloud import storage
  File "<frozen importlib._bootstrap>", line 1023, in _handle_fromlist
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/usr/local/lib/python3.6/dist-packages/google/cloud/storage/__init__.py", line 35, in <module>
    from google.cloud.storage.batch import Batch
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked


ikaakkola commented on May 20, 2024

When I add a random 1-10 second sleep into /usr/local/lib/python3.6/dist-packages/google/cloud/storage/batch.py, the process no longer hangs. So the problem is very likely that "Stan is on the scene" twice, and the Google Cloud storage library hits a Python (or underlying operating system) bug when it ends up calling re.py twice at the same instant:

Current thread 0x00007f8f746a7740 (most recent call first):
  File "/usr/lib/python3.6/re.py", line 182 in search
  File "/usr/lib/python3.6/ctypes/util.py", line 283 in _findSoname_ldconfig
  File "/usr/lib/python3.6/ctypes/util.py", line 313 in find_library
  File "/usr/lib/python3/dist-packages/asn1crypto/_perf/_big_num_ctypes.py", line 35 in <module>

@Ferenc- would you happen to have any idea why AutoTrace appears to be executed twice, at the same time, for these Python processes, or shall I turn to our Instana support for that instead?

In any case, I would currently say that this isn't really a bug in Instana python-sensor directly.


Ferenc- commented on May 20, 2024

  File "/usr/local/lib/python3.6/dist-packages/patroni/__main__.py", line 99, in schedule_next_run
    elif self.ha.watch(nap_time):
  File "/usr/local/lib/python3.6/dist-packages/patroni/ha.py", line 1570, in watch
    return self.dcs.watch(leader_index, timeout)
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/kubernetes.py", line 1137, in watch
    return super(Kubernetes, self).watch(None, timeout + 0.5)
  File "/usr/local/lib/python3.6/dist-packages/patroni/dcs/__init__.py", line 939, in watch
    self.event.wait(timeout)
  File "/usr/lib/python3.6/threading.py", line 551, in wait
    signaled = self._cond.wait(timeout)
  File "/usr/lib/python3.6/threading.py", line 299, in wait
    gotit = waiter.acquire(True, timeout)

From this, the only thing that appears to be "hanging" is patroni itself scheduling its next run, and its watchdog waiting for loop_wait time. That loop_wait time appears to be configurable, so I would recommend trying it out with double and half values to confirm that this is the value your system is waiting for.
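
For reference, a hedged sketch of where that knob lives in a Patroni configuration file (patroni.yml); 10 seconds is the documented default:

loop_wait: 10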

would you happen to have any idea why AutoTrace appears to be executed twice, at the same time, for these Python processes, or shall I turn to our Instana support for that instead?

That is initiated by the agent, which is proprietary, and should not be discussed here, only through different channels. For that, one would head over to, for example, https://support.instana.com/hc/en-us - and since so far I am confident it has nothing to do with your issue, I see no reason for that.


ikaakkola commented on May 20, 2024

That is initiated by the agent, which is proprietary, and should not be discussed here, only through different channels. For that, one would head over to, for example, https://support.instana.com/hc/en-us - and since so far I am confident it has nothing to do with your issue, I see no reason for that.

Yes, so I will go via Instana Support for that, which I believe is the cause of this.

My guess is (but as you said, this is proprietary) that AutoTrace somehow manages to import modules twice, which I think should not happen in Python (there is reload() to, well, reload modules).
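
As a hedged aside on that point: a second import of an already-loaded module is a sys.modules cache hit and does not re-execute the module body; only importlib.reload() re-runs it.

import importlib
import json

first = json
import json  # cache hit: served from sys.modules, module body not re-run
assert json is first

importlib.reload(json)  # explicitly re-executes the module body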


Ferenc- commented on May 20, 2024

As there is no sign of python-sensor causing any delay, and there is evidence of patroni waiting, I am closing this for now. It can be reopened if there is some evidence of a python-sensor bug.


ikaakkola commented on May 20, 2024

I'm fine with this being closed, as I believe it to be an AutoTrace issue and not directly related to 'python-sensor'. It is not a Patroni bug, though; here is a minimal reproduction of a similar problem where the process hangs (it assumes that the Kubernetes environment has instana-agent running and Python AutoTrace is enabled). This process hangs earlier in the Instana python code, when importing profiler at line 10 in /tmp/.instana/python/instana/singletons.py:

Start a new Kubernetes pod with image ubuntu:18.04

kubectl run -n <namespace> -i --tty --rm python-debug-pod  --image="ubuntu:18.04" -- bash

Install python3 and required utils

apt-get update && apt-get install -y python3 runit

Create a test python file

#!/usr/bin/python3

# -*- coding: utf-8 -*-
import logging
from time import sleep
from threading import Thread

logging.basicConfig(
    format='%(asctime)s %(levelname)-8s %(message)s',
    level=logging.INFO,
    datefmt='%Y-%m-%d %H:%M:%S')
logger = logging.getLogger("test-app")

class Example(Thread):

    def run(self):
        while True:
            logger.info("Sleeping..")
            sleep(1)
            logger.info("Continuing..")

Example().start()

Start the test python process:

unset TERM && PYTHONFAULTHANDLER=true INSTANA_DEBUG=true chpst -u daemon python3 test.py

Wait a moment for AutoTrace to attempt to instrument this process, and it will hang

Note that the process hangs before INSTANA_DEBUG would print anything, so there will be no "Stan is on the scene" output

Get the running processes

ps -ef
daemon    6272     1  0 07:55 pts/0    00:00:00 python3 test.py
daemon    6373  6272  0 07:56 pts/0    00:00:00 [ldconfig.real] <defunct>

Note that the python3 process has a zombie 'ldconfig.real' child process

Get a traceback from the process (via PYTHONFAULTHANDLER)

kill -SIGABRT 6272

resulting traceback from SIGABRT:

2022-07-05 07:56:19 INFO     Continuing..
2022-07-05 07:56:19 INFO     Sleeping..
Fatal Python error: Aborted

Thread 0x00007f7a699af700 (most recent call first):
  File "test.py", line 19 in run
  File "/usr/lib/python3.6/threading.py", line 916 in _bootstrap_inner
  File "/usr/lib/python3.6/threading.py", line 884 in _bootstrap

Current thread 0x00007f7a6af12740 (most recent call first):
  File "/usr/lib/python3.6/ctypes/util.py", line 283 in _findSoname_ldconfig
  File "/usr/lib/python3.6/ctypes/util.py", line 313 in find_library
  File "/usr/lib/python3.6/uuid.py", line 490 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 678 in exec_module
  File "<frozen importlib._bootstrap>", line 665 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 955 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 971 in _find_and_load
  File "/tmp/.instana/python/instana/autoprofile/profile.py", line 6 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 678 in exec_module
  File "<frozen importlib._bootstrap>", line 665 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 955 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 971 in _find_and_load
  File "/tmp/.instana/python/instana/autoprofile/sampler_scheduler.py", line 8 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 678 in exec_module
  File "<frozen importlib._bootstrap>", line 665 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 955 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 971 in _find_and_load
  File "/tmp/.instana/python/instana/autoprofile/profiler.py", line 13 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 678 in exec_module
  File "<frozen importlib._bootstrap>", line 665 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 955 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 971 in _find_and_load
  File "/tmp/.instana/python/instana/singletons.py", line 10 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 678 in exec_module
  File "<frozen importlib._bootstrap>", line 665 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 955 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 971 in _find_and_load
  File "/tmp/.instana/python/instana/__init__.py", line 125 in boot_agent
  File "/tmp/.instana/python/instana/__init__.py", line 206 in <module>
  File "<frozen importlib._bootstrap>", line 219 in _call_with_frames_removed
  File "<frozen importlib._bootstrap_external>", line 678 in exec_module
  File "<frozen importlib._bootstrap>", line 665 in _load_unlocked
  File "<frozen importlib._bootstrap>", line 955 in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 971 in _find_and_load
  File "<string>", line 1 in <module>
  File "/usr/lib/python3.6/threading.py", line 1072 in _wait_for_tstate_lock
  File "/usr/lib/python3.6/threading.py", line 1056 in join
  File "/usr/lib/python3.6/threading.py", line 1294 in _shutdown
Aborted

Note: the process might need to be started a few times for the hang on the main loop to happen (with Patroni it seems to happen every time). When it does, the log output of the process stops. And even when the main test.py keeps logging, the Instana instrumentation code is stuck in the traceback above whenever ps reports that the main python process has a defunct ldconfig.real child.


Ferenc- commented on May 20, 2024

I can confirm the issue up to Python 3.8.13, but on 3.9.13 and 3.10.13 I can't reproduce it anymore.
Could you check whether you can reproduce it on 3.9.13 or 3.10.13?


ikaakkola commented on May 20, 2024

Unfortunately I don't have an easy way to get newer Python versions into the container environments where we have Instana available.

I think this is a combination of the way AutoTrace works (attaching to a live process with ptrace etc.) and a race-condition deadlock in Python's call out to ldconfig; a hedged sketch of the suspected race follows below.
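
A minimal hedged sketch of that suspected hazard (schematic only, not the exact Instana/Patroni code path): if one thread holds a lock while the process fork()s - as happens when ctypes spawns ldconfig - the child inherits a locked lock that no surviving thread will ever release.

import os
import threading
import time

lock = threading.Lock()

def holder():
    with lock:  # a thread in the parent holds the lock...
        time.sleep(5)

threading.Thread(target=holder).start()
time.sleep(0.1)

# ...while the process forks (Unix only); only the forking thread
# survives in the child, so the child's copy of the lock stays held.
pid = os.fork()
if pid == 0:
    # A blocking acquire would deadlock forever here; the timeout makes
    # the failure observable instead.
    print('child acquired lock:', lock.acquire(timeout=2))  # prints False
    os._exit(0)
os.waitpid(pid, 0)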

Thanks for the help you provided here; at least I learned some Python debugging 👍

