
city-scrapers-pitt's People

Contributors

0x1f602, ben-nathanson, biniona, bonfirefan, cheog, danwarren, dkori, lyons7, maxachis, mishugana, nathanderon, pjsier, richiebful, synapticarbors, will-snavely, woodyeagle, wsnavely


city-scrapers-pitt's Issues

Pennsylvania Liquor Board Crashes and Yields No Events

🤔

Checking the Wayback Machine, it doesn't look like the website changed much. Any thoughts on why this happened and what we could change?

(city-scrapers-pitt) $ scrapy crawl pa_liquorboard
2020-01-27 22:31:25 [scrapy.utils.log] INFO: Scrapy 1.8.0 started (bot: city_scrapers)
2020-01-27 22:31:26 [scrapy.utils.log] INFO: Versions: lxml 4.4.2.0, libxml2 2.9.10, cssselect 1.1.0, parsel 1.5.2, w3lib 1.21.0, Twisted 19.10.0, Python 3.7.5 (default, Nov 13 2019, 13:25:19) - [Clang 11.0.0 (clang-1100.0.33.12)], pyOpenSSL 19.1.0 (OpenSSL 1.1.1d  10 Sep 2019), cryptography 2.8, Platform Darwin-19.2.0-x86_64-i386-64bit
2020-01-27 22:31:26 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'city_scrapers', 'CLOSESPIDER_ERRORCOUNT': 5, 'COMMANDS_MODULE': 'city_scrapers_core.commands', 'NEWSPIDER_MODULE': 'city_scrapers.spiders', 'SPIDER_MODULES': ['city_scrapers.spiders'], 'USER_AGENT': 'City Scrapers [development mode]. Learn more and say hello at https://www.citybureau.org/city-scrapers/'}
2020-01-27 22:31:26 [scrapy.extensions.telnet] INFO: Telnet Password: nice try
2020-01-27 22:31:26 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats']
2020-01-27 22:31:26 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-01-27 22:31:26 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-01-27 22:31:26 [scrapy.middleware] INFO: Enabled item pipelines:
['city_scrapers_core.pipelines.DefaultValuesPipeline',
 'city_scrapers_core.pipelines.MeetingPipeline']
2020-01-27 22:31:26 [scrapy.core.engine] INFO: Spider opened
2020-01-27 22:31:26 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-01-27 22:31:26 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2020-01-27 22:31:27 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.lcb.pa.gov/About-Us/Board/Pages/Public-Meetings.aspx> (referer: None)
2020-01-27 22:31:27 [scrapy.core.scraper] ERROR: Spider error processing <GET https://www.lcb.pa.gov/About-Us/Board/Pages/Public-Meetings.aspx> (referer: None)
Traceback (most recent call last):
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/utils/defer.py", line 102, in iter_errback
    yield next(it)
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/core/spidermw.py", line 84, in evaluate_iterable
    for r in iterable:
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/spidermiddlewares/offsite.py", line 29, in process_spider_output
    for x in result:
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/core/spidermw.py", line 84, in evaluate_iterable
    for r in iterable:
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/spidermiddlewares/referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/core/spidermw.py", line 84, in evaluate_iterable
    for r in iterable:
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/spidermiddlewares/urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/core/spidermw.py", line 84, in evaluate_iterable
    for r in iterable:
  File "/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-YjN_7ZsS/lib/python3.7/site-packages/scrapy/spidermiddlewares/depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "/Users/ben/Desktop/throw/city-scrapers-pitt/city_scrapers/spiders/pa_liquorboard.py", line 38, in parse
    start=self._parse_start(item),
  File "/Users/ben/Desktop/throw/city-scrapers-pitt/city_scrapers/spiders/pa_liquorboard.py", line 66, in _parse_start
    date_object = datetime.date(datetime.strptime(" ".join(item.split()[-3:]), '%B %d, %Y'))
  File "/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/_strptime.py", line 577, in _strptime_datetime
    tt, fraction, gmtoff_fraction = _strptime(data_string, format)
  File "/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/_strptime.py", line 359, in _strptime
    (data_string, format))
ValueError: time data '-' does not match format '%B %d, %Y'
2020-01-27 22:31:27 [scrapy.core.engine] INFO: Closing spider (finished)
2020-01-27 22:31:27 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 323,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 41906,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 1.45097,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2020, 1, 28, 3, 31, 27, 573543),
 'log_count/DEBUG': 1,
 'log_count/ERROR': 1,
 'log_count/INFO': 10,
 'memusage/max': 59457536,
 'memusage/startup': 59457536,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'spider_exceptions/ValueError': 1,
 'start_time': datetime.datetime(2020, 1, 28, 3, 31, 26, 122573)}
2020-01-27 22:31:27 [scrapy.core.engine] INFO: Spider closed (finished)
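The traceback shows the site now renders a placeholder ("-") where a date used to be, so strptime has nothing to match. One possible hardening, sketched below, is to skip rows that don't parse instead of raising. The helper name and format string mirror the traceback, but this is an illustrative sketch, not the project's actual fix:

```python
from datetime import datetime

def parse_start(item):
    """Parse 'Month DD, YYYY' from the tail of an item string.

    Returns None for placeholder rows (e.g. '-') instead of raising,
    so one bad row doesn't kill the whole crawl.
    """
    text = " ".join(item.split()[-3:])
    try:
        return datetime.strptime(text, "%B %d, %Y").date()
    except ValueError:
        return None  # placeholder or unexpected text; caller can skip this row

print(parse_start("Regular Meeting January 29, 2020"))  # 2020-01-29
print(parse_start("-"))                                 # None
```

The spider's parse method could then drop items whose start date comes back as None rather than crashing mid-page.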

Pytest Failing on Fresh Clone

I am trying to run pytest on a fresh clone of the main repo. Following the instructions in the main documentation, I cloned a new copy of the repository, created a pipenv environment, synced it, and ran a sample spider. Running pytest in bash gives the following traceback.

Traceback (most recent call last):
  File "/home/kavan/.local/share/virtualenvs/city-scrapers-pitt-vm2XhXX5/bin/pytest", line 5, in <module>
    from pytest import main
  File "/home/kavan/.local/share/virtualenvs/city-scrapers-pitt-vm2XhXX5/lib/python3.5/site-packages/pytest/__init__.py", line 6, in <module>
    from _pytest.assertion import register_assert_rewrite
  File "/home/kavan/.local/share/virtualenvs/city-scrapers-pitt-vm2XhXX5/lib/python3.5/site-packages/_pytest/assertion/__init__.py", line 7, in <module>
    from _pytest.assertion import rewrite
  File "/home/kavan/.local/share/virtualenvs/city-scrapers-pitt-vm2XhXX5/lib/python3.5/site-packages/_pytest/assertion/rewrite.py", line 29, in <module>
    from _pytest.pathlib import fnmatch_ex
  File "/home/kavan/.local/share/virtualenvs/city-scrapers-pitt-vm2XhXX5/lib/python3.5/site-packages/_pytest/pathlib.py", line 26, in <module>
    from pathlib2 import Path, PurePath
ImportError: No module named 'pathlib2'

Specs:
Linux Mint 18.3 (Virtual Machine)
Python: 3.5.2
Pip (pipenv): 19.3.1
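The interpreter version in the specs is likely the culprit: pytest's assertion rewriter imports the pathlib2 backport only on Python older than 3.6, so on 3.6+ this ImportError cannot occur. A quick check, as a sketch:

```python
import sys

# pytest falls back to the pathlib2 backport on interpreters older
# than 3.6; on 3.6+ the stdlib pathlib is used instead.
needs_pathlib2 = sys.version_info < (3, 6)
print(needs_pathlib2)
```

If it prints True, either install the backport into the virtualenv (`pipenv install pathlib2`) or recreate the environment with a newer interpreter (`pipenv --python 3.7`).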

SSL: CERTIFICATE_VERIFY_FAILED error

Got the following output

============================================================================================================ test session starts =============================================================================================================
platform darwin -- Python 3.6.6, pytest-5.2.1, py-1.8.0, pluggy-0.13.0
rootdir: /Users/matthew/dev/city-scrapers, inifile: setup.cfg
collected 123 items / 1 errors / 122 selected                                                                                                                                                                                                

=================================================================================================================== ERRORS ===================================================================================================================
______________________________________________________________________________________________ ERROR collecting tests/test_pitt_housing_opp.py _______________________________________________________________________________________________
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:1318: in do_open
    encode_chunked=req.has_header('Transfer-encoding'))
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py:1239: in request
    self._send_request(method, url, body, headers, encode_chunked)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py:1285: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py:1234: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py:1026: in _send_output
    self.send(msg)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py:964: in send
    self.connect()
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py:1400: in connect
    server_hostname=server_hostname)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py:407: in wrap_socket
    _context=self, _session=session)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py:814: in __init__
    self.do_handshake()
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py:1068: in do_handshake
    self._sslobj.do_handshake()
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/ssl.py:689: in do_handshake
    self._sslobj.do_handshake()
E   ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:841)

During handling of the above exception, another exception occurred:
tests/test_pitt_housing_opp.py:8: in <module>
    from city_scrapers.spiders.pitt_housing_opp import PittHousingOppSpider
city_scrapers/spiders/pitt_housing_opp.py:13: in <module>
    url = urllib.request.urlopen(json_url)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:223: in urlopen
    return opener.open(url, data, timeout)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:532: in open
    response = meth(req, response)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:642: in http_response
    'http', request, response, code, msg, hdrs)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:564: in error
    result = self._call_chain(*args)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:504: in _call_chain
    result = func(*args)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:756: in http_error_302
    return self.parent.open(new, timeout=req.timeout)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:526: in open
    response = self._open(req, data)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:544: in _open
    '_open', req)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:504: in _call_chain
    result = func(*args)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:1361: in https_open
    context=self._context, check_hostname=self._check_hostname)
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/urllib/request.py:1320: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:841)>
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================================================================== 1 error in 1.89s ==============================================================================================================
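Beyond the certificate problem itself, the traceback points at a structural issue: pitt_housing_opp.py opens a URL at module import time (line 13), so merely collecting the test makes a network request. A sketch of the usual fix, moving the fetch into a function so imports stay side-effect-free (the function name and URL below are illustrative stand-ins, not the project's real code):

```python
import json
from urllib.request import urlopen

JSON_URL = "https://example.com/events.json"  # placeholder, not the real feed

def fetch_events(url=JSON_URL):
    """Fetch the feed at call time, not import time, so importing the
    spider module (e.g. during pytest collection) hits no network."""
    with urlopen(url) as resp:
        return json.load(resp)
```

Separately, the certificate failure on macOS python.org builds of Python is commonly resolved by running the bundled "Install Certificates.command" script that ships with the installer.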

pipenv install errors

$ pipenv install
Installing dependencies from Pipfile.lock (32f095)…
An error occurred while installing scrapy==1.6.0 --hash=sha256:4ec03552c9aed1e82a376488150e904e0213bdee3ca140225105e9b03d0de204 --hash=sha256:558dfd10ac53cb324ecd7eefd3eac412161c7507c082b01b0bcd2c6e2e9f0766! Will try again.
An error occurred while installing scrapy-sentry==0.9.0 --hash=sha256:4c78e0fb5a5940639f11ae2a1fb28d4ea1a41180d88289efc6d8e39e1e14227f! Will try again.

An error occurred while installing twisted==19.2.0 --hash=sha256:1708e1928ae84ec9d3ebab0d427e20e1e38ff721b15bbced476d047d4a43abbe --hash=sha256:3716eca1c80ef88729a85e9a70f09acce8df2adf7c58590d54c94f383945bc47! Will try again.
Installing initially failed dependencies…

Add new Meeting Statuses to documentation and project

In order to better track meetings, we will add RESCHEDULED and UPDATED as meeting statuses. This comes up in Issue #16 and in the general last-minute rescheduling of meetings.

RESCHEDULED means the meeting was moved to a new date.
UPDATED marks the entry carrying the meeting's new time.

Statuses are now:

  • CANCELLED
  • TENTATIVE
  • CONFIRMED
  • PASSED
  • RESCHEDULED
  • UPDATED

We'll need to add Pittsburgh-specific documentation reflecting this.
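One possible representation of the expanded status set, as a sketch (the project may well store these as plain strings; the enum below is illustrative only):

```python
from enum import Enum

class MeetingStatus(Enum):
    CANCELLED = "cancelled"
    TENTATIVE = "tentative"
    CONFIRMED = "confirmed"
    PASSED = "passed"
    RESCHEDULED = "rescheduled"  # meeting moved to a new date
    UPDATED = "updated"          # entry carrying the new meeting time

print([s.name for s in MeetingStatus])
```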

Pytest: ABCs from Collections Deprecated

When I run 'pytest' I get the following warnings:

==================================================================== warnings summary =====================================================================
/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-bd9KzvWh/lib/python3.7/site-packages/scrapy/utils/datatypes.py:11
/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-bd9KzvWh/lib/python3.7/site-packages/scrapy/utils/datatypes.py:11: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
from collections import OrderedDict, Mapping

/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-bd9KzvWh/lib/python3.7/site-packages/scrapy/item.py:8
/Users/ben/.local/share/virtualenvs/city-scrapers-pitt-bd9KzvWh/lib/python3.7/site-packages/scrapy/item.py:8: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
from collections import MutableMapping

-- Docs: https://docs.pytest.org/en/latest/warnings.html
========================================================== 96 passed, 2 warnings in 1.70 seconds ==========================================================

Is this me? Is this scrapy? Let me know if I can clear up these warnings. Happy to provide more logs if that would be helpful.

Thanks!
Ben

Twisted Error using 'pipenv install twisted'

Upon trying to run scrapy, I got an error. Reading through it, I discovered it required twisted. I installed it using "sudo pipenv install twisted" and got an error in the terminal. I am doing this on a Chromebook using the native Debian VM.

City Scrapers Events Website Returns a 404

Our website is not building as of January 26th. This means that the page returns a 404 and events are unavailable.

Here is the "Page Build Warning" I received about this issue:

The page build completed successfully, but returned the following warning for the master branch:
The CNAME bonfirefan.github.io/city-scrapers-pitt/ is not properly formatted. See https://help.github.com/articles/troubleshooting-custom-domains/#github-repository-setup-errors for more information.
For information on troubleshooting Jekyll see:
https://help.github.com/articles/troubleshooting-jekyll-builds
If you have any questions you can contact us by replying to this email.

"scrapy list" gives an error

scrapy list

Traceback (most recent call last):
  File "/sda9/cityscrapers-sandbox/bin/scrapy", line 10, in <module>
    sys.exit(execute())
  File "/sda9/cityscrapers-sandbox/lib/python3.7/site-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/sda9/cityscrapers-sandbox/lib/python3.7/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/sda9/cityscrapers-sandbox/lib/python3.7/site-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/sda9/cityscrapers-sandbox/lib/python3.7/site-packages/city_scrapers_core/commands/list.py", line 8, in run
    print("{0: <6} | {1}".format(s, getattr(cls, "agency", cls.agency_name)))
AttributeError: type object 'AlleCountySpider' has no attribute 'agency_name'
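Per the traceback, the list command reads cls.agency and only falls back to cls.agency_name, so a spider defining neither raises exactly this AttributeError. A minimal illustration of the lookup (the class here is a bare stand-in, and the agency string is an assumption for illustration):

```python
class AlleCountySpider:
    name = "alle_county"
    agency = "Allegheny County"  # the attribute city_scrapers_core's list command expects

# A defensive variant of the lookup from the traceback: falls back to
# agency_name, then None, instead of raising when both are missing.
label = getattr(AlleCountySpider, "agency",
                getattr(AlleCountySpider, "agency_name", None))
print(label)
```

Defining `agency` on the spider class is enough to make `scrapy list` succeed.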

Scrapy Shell doesn't run after copying spider from legacy city-scrapers repo

Copied a spider I was working on in the legacy city-scrapers repo into my spiders branch in the city-scrapers-pitt repo.

This prevented Scrapy shell from running in my Mac terminal; the error said 'No module named "city_scrapers.spider"'.

The solution was to update the import statements at the beginning of the spider, changing lines that import from city_scrapers.spiders to import from city_scrapers_core.spiders.

Twisted Error

Trying to open up Scrapy shell and I run into this error.

  File "c:\users\deron\.virtualenvs\city-scrapers-pitt-gerzyw5n\lib\site-packages\scrapy\_monkeypatches.py", line 20, in <module>
    import twisted.persisted.styles  # NOQA
ModuleNotFoundError: No module named 'twisted'

Trying to install twisted with pipenv install twisted also led to the following:

Installing twisted…
Adding twisted to Pipfile's [packages]…
Installation Succeeded
Pipfile.lock (ab942f) out of date, updating to (fe2388)…
Locking [dev-packages] dependencies…
Success!
Locking [packages] dependencies…
Locking Failed!
...
ERROR: ERROR: Could not find a version that matches wheels

Trying to install wheels gives a similar error.

Add to documentation for commit instructions: YAPF

Currently we occasionally hit YAPF issues in Travis. We might want to add documentation to help developers run YAPF, and fix any formatting problems, before they check in.

To show YAPF discrepancies we can run:
yapf --diff -r .

After doing this, developers can either apply the changes manually, or automatically apply them with:
yapf -r -i .

Spider: City of Pittsburgh Planning Commission

Spider Name:

pitt_city_planning

Website:

http://pittsburghpa.gov/dcp/notices

Scraping Notes:

This is a plain page with event names in p tags and details in ul (unordered list) elements.

According to the Pittsburgh Planning Commission, meetings are every other Tuesday around 1pm, downtown on the 1st floor of the Civic Building at 200 Ross Street.

Notices of hearing appear to be posted at http://pittsburghpa.gov/dcp/notices

City Planning Commission meetings are held every other Tuesday beginning in the early afternoon. They are located downtown on the 1st floor of the Civic Building at 200 Ross Street. Planning Commission agendas, applicant presentations, and minutes are posted online at pittsburghpa.gov/dcp/. From the City Planning home page, click on "Planning Commission" on the right-hand menu. To receive the agenda by email, please contact Dolores Hanna at [email protected] or 412-255-2473.

Briefing: At approximately 1 PM, the Commission hears project briefings off of the official record. The public is welcome to observe, but no public comments are taken. The briefing portion of the meeting provides an initial presentation to the Commission; projects then return in two weeks on the Hearing and Action agenda.

Hearing and Action: Hearing and Action begins no earlier than 2 PM. Applicants make a presentation again, including any revisions or additional information requested by the Commission during briefing. Public comment is then accepted, limited to 3 minutes per person, per project. The Commission usually votes on projects the same day they are presented for Hearing and Action.
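Given the biweekly cadence, upcoming meeting dates can be enumerated from a known anchor meeting; a sketch follows (the anchor date is an assumption for illustration, not taken from the agency's calendar):

```python
from datetime import date, timedelta

def biweekly_meetings(anchor, count):
    """Return `count` dates spaced two weeks apart, starting at `anchor`."""
    return [anchor + timedelta(weeks=2 * i) for i in range(count)]

# 2020-02-04 was a Tuesday; used here only as an example anchor.
for d in biweekly_meetings(date(2020, 2, 4), 3):
    print(d)
```

A real spider would still scrape the posted notices rather than generate dates, since meetings get rescheduled; a generator like this is only useful as a sanity check against the scraped results.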

test html file does not include \r

I was finding some inconsistency between running my tests and running scrapy from the command line.

For some reason, when I ran the spider using the scrapy command, the text from the website always began with \r, but when running the tests, the HTML file did not include these \r.

It was easy to fix this problem - I just used lstrip('\r') on the text and now it is all consistent. But it is a little strange that this difference exists.
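The workaround looks like the following; normalizing line endings in the parser keeps live crawls and cached test fixtures consistent (the strings here are illustrative, not from the actual page):

```python
live_text = "\rPublic Hearing - Planning Commission"   # as seen in a live crawl
fixture_text = "Public Hearing - Planning Commission"  # as seen in the saved HTML fixture

# lstrip("\r") removes only leading carriage returns, leaving the rest intact.
normalized = live_text.lstrip("\r")
print(normalized == fixture_text)  # True
```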

Master Documentation Issues

  • Twisted Error using 'pipenv install twisted' #36
  • Updates from "upstream" not integrating in local environment - Solved #68
  • SSL: CERTIFICATE_VERIFY_FAILED error #64
  • city-scrapers-core module not found #60
