
parallel's Introduction

Parallel


Run any code in parallel Processes (use all CPUs), Threads (speed up blocking operations), or Ractors (use all CPUs).
Best suited for map-reduce or e.g. parallel downloads/uploads.

Install

gem install parallel

Usage

# 2 CPUs -> work in 2 processes (a,b + c)
results = Parallel.map(['a','b','c']) do |one_letter|
  SomeClass.expensive_calculation(one_letter)
end

# 3 Processes -> finished after 1 run
results = Parallel.map(['a','b','c'], in_processes: 3) { |one_letter| SomeClass.expensive_calculation(one_letter) }

# 3 Threads -> finished after 1 run
results = Parallel.map(['a','b','c'], in_threads: 3) { |one_letter| SomeClass.expensive_calculation(one_letter) }

# 3 Ractors -> finished after 1 run
results = Parallel.map(['a','b','c'], in_ractors: 3, ractor: [SomeClass, :expensive_calculation])

The same can be done with each:

Parallel.each(['a','b','c']) { |one_letter| ... }

or each_with_index, map_with_index, flat_map

Produce one item at a time with a lambda (anything that responds to .call) or a Queue.

items = [1,2,3]
Parallel.each( -> { items.pop || Parallel::Stop }) { |number| ... }
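
A Queue can serve as the producer in the same way. A minimal sketch, assuming the queue is filled and closed up front; here the documented lambda form wraps the queue and returns Parallel::Stop once it is drained:

require 'parallel'

queue = Queue.new
3.times { |i| queue << i }
queue.close

# The lambda hands out one item per call and signals the end of input.
Parallel.each(-> { queue.empty? ? Parallel::Stop : queue.pop }) do |number|
  puts "processing #{number}"
end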

Also supports any? and all?

Parallel.any?([1,2,3,4,5,6,7]) { |number| number == 4 }
# => true

Parallel.all?([1,2,nil,4,5]) { |number| number != nil }
# => false

Processes/Threads are workers; each grabs the next piece of work as soon as it finishes its current one.

Processes

  • Speedup through multiple CPUs
  • Speedup for blocking operations
  • Variables are protected from change
  • Extra memory used
  • Child processes are killed when your main process is killed through Ctrl+c or kill -2

Threads

  • Speedup for blocking operations
  • Variables can be shared/modified
  • No extra memory used

Ractors

  • Ruby 3.0+ only
  • Speedup for blocking operations
  • No extra memory used
  • Very fast to spawn
  • Experimental and unstable
  • start and finish hooks are called on the main thread
  • Variables must be passed in Parallel.map([1,2,3].map { |i| [i, ARGV, local_var] }, ...
  • Use Ractor.make_shareable to pass in global objects (see the sketch after this list)
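
A minimal sketch of the Ractor mode combining the points above; Worker and CONFIG are hypothetical names, the ractor: option names the class method each input item is passed to, and global data is made shareable first:

require 'parallel'

CONFIG = Ractor.make_shareable({ factor: 2 }) # global objects must be shareable

class Worker
  def self.process(args)
    item, offset = args # local data travels inside the input items
    item * CONFIG[:factor] + offset
  end
end

offset = 10
inputs = [1, 2, 3].map { |i| [i, offset] }
results = Parallel.map(inputs, in_ractors: 2, ractor: [Worker, :process])
p results # => [12, 14, 16]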

ActiveRecord

Connection Lost

  • Multithreading needs connection pooling, forks need reconnects
  • Adjust connection pool size in config/database.yml when multithreading
# reproducibly fixes things (spec/cases/map_with_ar.rb)
Parallel.each(User.all, in_processes: 8) do |user|
  user.update_attribute(:some_attribute, some_value)
end
User.connection.reconnect!

# maybe helps: explicitly use connection pool
Parallel.each(User.all, in_threads: 8) do |user|
  ActiveRecord::Base.connection_pool.with_connection do
    user.update_attribute(:some_attribute, some_value)
  end
end

# maybe helps: reconnect once inside every fork
Parallel.each(User.all, in_processes: 8) do |user|
  @reconnected ||= User.connection.reconnect! || true
  user.update_attribute(:some_attribute, some_value)
end

NameError: uninitialized constant

A race happens when ActiveRecord models are autoloaded inside parallel threads in environments that lazy-load, like development, test, or migrations.

To fix this, load autoloaded classes before the parallel block with either require '<modelname>' or ModelName.class.
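
For example, a sketch assuming a hypothetical autoloaded User model and a user_ids array:

User.class # or: require 'user'; referencing the constant in the main thread loads it before any worker starts

Parallel.each(user_ids, in_threads: 4) do |id|
  User.find(id) # no autoload race here, the constant is already loaded
end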

Break

Parallel.map([1, 2, 3]) do |i|
  raise Parallel::Break # -> stops after all current items are finished
end
Parallel.map([1, 2, 3]) { |i| raise Parallel::Break, i if i == 2 } == 2

Kill

Only use this if whatever is executing in the sub-processes is safe to kill at any point.

Parallel.map([1,2,3]) do |x|
  raise Parallel::Kill if x == 1 # -> stop all sub-processes, killing them instantly
  sleep 100 # Do stuff
end

Progress / ETA

# gem install ruby-progressbar

Parallel.map(1..50, progress: "Doing stuff") { sleep 1 }

# Doing stuff | ETA: 00:00:02 | ====================               | Time: 00:00:10

Use the :finish or :start hook to get progress information.

  • :start has item and index
  • :finish has item, index, and result

They are called on the main process and protected with a mutex. (To just get the index, use the more performant Parallel.each_with_index)

Parallel.map(1..100, finish: -> (item, i, result) { ... do something ... }) { sleep 1 }

Set finish_in_order: true to call the :finish hook in the order of the input (will take longer to see initial output).

Parallel.map(1..9, finish: -> (item, i, result) { puts "#{item} ok" }, finish_in_order: true) { sleep rand }

Worker number

Use Parallel.worker_number to determine the worker slot in which your task is running.

Parallel.each(1..5, in_processes: 2) { |i| puts "Item: #{i}, Worker: #{Parallel.worker_number}" }
Item: 1, Worker: 1
Item: 2, Worker: 0
Item: 3, Worker: 1
Item: 4, Worker: 0
Item: 5, Worker: 1

Dynamically generating jobs

Example: wait for work to arrive or sleep

queue = []
Thread.new { loop { queue << rand(100); sleep 2 } } # job producer
Parallel.map(Proc.new { queue.pop }, in_processes: 3) { |f| f ? puts("#{f} received") : sleep(1) }

Tips

  • [Benchmark/Test] Disable threading/forking with in_threads: 0 or in_processes: 0 to run the same code with different setups (see the sketch after this list)
  • [Isolation] Do not reuse previous worker processes: isolation: true
  • [Stop all processes with an alternate interrupt signal] 'INT' (from ctrl+c) is caught by default. Catch 'TERM' (from kill) with interrupt_signal: 'TERM'
  • [Process count via ENV] PARALLEL_PROCESSOR_COUNT=16 will use 16 instead of the number of processors detected. This is used to reconfigure a tool using parallel without inserting custom logic.
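
For example, the first tip lets the same block run serially or in parallel just by switching the worker count (a sketch; timings are illustrative):

require 'benchmark'
require 'parallel'

[0, 4].each do |workers| # 0 disables forking, 4 uses four worker processes
  seconds = Benchmark.realtime do
    Parallel.map(1..20, in_processes: workers) { |i| sleep 0.1; i * i }
  end
  puts "in_processes: #{workers} -> #{seconds.round(2)}s"
end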

TODO

  • Replace Signal trapping with simple rescue Interrupt handler

Authors

Michael Grosser
[email protected]
License: MIT


parallel's Issues

Parallel & Serial

Can you add a feature which allows new processes to start immediately when some processes have finished, rather than waiting for all of the processes in the current batch to complete?

e.g. I have 10 processes and I have set the batch size to 4. From the first 4 processes, if 2 are finished, Parallel should start the 5th one immediately rather than waiting for all 4 processes to finish.

Something like this --
http://parallelforkmgr.rubyforge.org/

But Parallel::ForkManager is ugly and confusing to use.

Potential race condition with in_threads

It looks like the block passed to Thread.new ends up writing its output to a shared 'out' array that is a local variable. I think there is a potential race condition here since multiple threads could be concurrently writing their results to the same array, which is not a thread-safe data structure. This is probably more visible in JRuby where more than a single thread can be executing at a given time.

CTRL-C does not stop threads

Hi!

I'm not sure if it's a bug or a feature, but if I try this and press CTRL-C, it still waits till the end:

$ ruby -e 'require "parallel"; Parallel.each([1,2,3], :in_threads => 3) {|n| sleep 60; puts n }'
^C
2
3
1

I would like the threads to stop on SIGINT.

Thanks!

Parallel and ActiveRecord (3.0.3)/mysql - probably my misuse...

Hi,

Many thanks for the cool lib - finding it really handy.

I have been using Parallel (0.5.1) with sqlite for a background job largely successfully, but now I am moving to mysql and seem to be getting issues with parallel db access.

I have tried the mysql, mysql2 and mysqlplus adaptors.

I am using Rails3/ActiveRecord on top of the adaptor - 3.0.3

With mysql, in the console, I do this:

Parallel.each(QuoteResponse.all, :in_processes => 5) do |r|
  puts r.quote_site.site_name
end

This gives the following error (site_name is a method on a related table, so it seems like it's not loading the related table; the non-parallel version of the above works fine :( ):

NoMethodError: undefined method `site_name' for nil:NilClass
  from /Users/kimptoc/.rvm/gems/ruby-1.8.7-p330@p-ecom1-rails3/gems/activesupport-3.0.3/lib/active_support/whiny_nil.rb:48:in `method_missing'
  from (irb):2
  from /Users/kimptoc/.rvm/gems/ruby-1.8.7-p330@p-ecom1-rails3/gems/parallel-0.5.1/lib/parallel.rb:250:in `call'
  from /Users/kimptoc/.rvm/gems/ruby-1.8.7-p330@p-ecom1-rails3/gems/parallel-0.5.1/lib/parallel.rb:250:in `call_with_index'
  from /Users/kimptoc/.rvm/gems/ruby-1.8.7-p330@p-ecom1-rails3/gems/parallel-0.5.1/lib/parallel.rb:186:in `process_incoming_jobs'
  from /Users/kimptoc/.rvm/gems/ruby-1.8.7-p330@p-ecom1-rails3/gems/parallel-0.5.1/lib/parallel.rb:169:in `worker'
  from /Users/kimptoc/.rvm/gems/ruby-1.8.7-p330@p-ecom1-rails3/gems/parallel-0.5.1/lib/parallel.rb:164:in `fork'

DeadWorker in ops with Phusion Passenger

On my ops server I'm getting a 500 internal server error with only the below output in the server log when trying to run the action calling Parallel.

On my dev environment using plain old "rails server" it works fine though. I'm not really sure what is causing this though.

Completed 500 Internal Server Error in 1332ms

Parallel::DeadWorker (Parallel::DeadWorker):
  parallel (0.6.4) lib/parallel.rb:51:in `rescue in work'
  parallel (0.6.4) lib/parallel.rb:48:in `work'
  parallel (0.6.4) lib/parallel.rb:210:in `block (3 levels) in work_in_processes'
  parallel (0.6.4) lib/parallel.rb:331:in `with_instrumentation'
  parallel (0.6.4) lib/parallel.rb:209:in `block (2 levels) in work_in_processes'
  parallel (0.6.4) lib/parallel.rb:204:in `loop'
  parallel (0.6.4) lib/parallel.rb:204:in `block in work_in_processes'
  parallel (0.6.4) lib/parallel.rb:65:in `block (2 levels) in in_threads'

Calling it with the following code inside an object's model:

results = Parallel.map((calc_start..calc_stop)) do |start|
  # ... do some calculations
  result = [array]
end

::ActiveRecord::Base.establish_connection

There is no access to any of the object's parameters inside the loop, but I threw in the establish_connection call after the loop because during development I noticed I was still losing access to the database when running later commands.

Are there any other logs or things I should look at?

Make it work on Enumerable instead of Array?

Hi,

Great little gem but I've got a feature request. Could you make it work on Enumerable instead of Array?

I wanted to use it to split up the work of processing objects from a database. In this case it's from Mongo.

I tried to write something like

Parallel.each(mongo_collection.find) do |obj|
  # do something with obj
end

but realized that you don't call .each internally but instead use the [index] operator for Array.

Regards,

  • Daniel

Crash with rake task

Hi, I'm using parallel(0.5.8) to process a regular amount of data with a rake task and parallel is crashing on production box, here is the log: https://gist.github.com/1204831

Ruby version: ruby 1.8.7 (2010-04-19 patchlevel 253) [x86_64-linux], MBARI 0x6770, Ruby Enterprise Edition 2010.02
O.S: CentOS release 5.5 (Final)

On my development box the same rake task works fine:
Ruby version: ruby 1.8.7 (2010-08-16 patchlevel 302) [i686-darwin10.5.0]
O.S: OS X

Base64

Why are you using base64 in the encode and decode methods? Is this needed?

parallel postgresql error

I get the following error when I try to access my DB after Parallel finishes a job:

server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.

my code is as such:
def first_function
  results = Parallel.map(some_array, :in_processes => 3) do |element|
    ActiveRecord::Base.connection.reconnect!
    do_something
  end
  return results
end

so far it works well and I get the results I expect

def second_function
  do_something_with_results(results) # works well
  ActiveRecord::Base.connection.execute(some_query) # this fails on every query I try
end

I know this problem starts with Parallel because when I change the line:
results = Parallel.map(some_array, :in_processes=>3) do |element|

to the line:
results = some_array.collect do |element|

I get the same value in "results" but the queries after don't fail.

please help.

Timeout option

Hey,

It would be great if a timeout option could be added to Parallel. Sometimes I want to return after X seconds rather than waiting for the slowest thread to come back.

Processes in batches?

Hi,

Quick question. Does Parallel process in batches? For example, if I have an array of 100 objects that I want to iterate through and I set :in_threads => 10 or :in_processes => 10, would it start processing the first 10 jobs (1..10), then wait for all 10 to finish, and then start on the second batch (jobs 11..20)? Or does it immediately begin processing the next job when one of the threads finishes its job? For example, if you're using 10 threads and thread #3 finishes its job first, would it go ahead and pick up job #11 from the array while the other 9 are still processing their current jobs, or would thread #3 have to wait for all other threads to finish the batch before starting a new job?

Looking at the source it would seem both threads and processes will be processing in batches. Could you confirm whether this is true or not? Thanks! :)

Is this a bug? Or am I wrong somewhere

require "parallel"

class Test1
    attr_accessor :count

    def initialize
        @count = "First"
    end

    def inc
        @count = "Second"
    end
end

arr = []
arr.push Test1.new

puts "** Before #{arr[0].count}"

Parallel.map(arr, :in_processes => 8) { |test|
    test.inc
}

puts "** After #{arr[0].count}"

I am expecting the second puts to print

** After Second

but it doesn't.

Am I wrong somewhere?

How does mapping of problems to processes work

Not really a code bug, but I'm curious how this gem is slicing up the problem set. Maybe updating the readme with this information will be helpful to others later.

I.e. if I say
Parallel.map((1..100), :in_processes => 20) do |item|
  # ...
end

  • Does this create a worker pool of 20 processes where, as each finishes an iteration, it grabs the next item and runs the loop's contents for that item?
  • Does it map a range of values to each process (process 1 gets items 1-5, process 2 gets items 6-10, ...)?
  • Some other way

In my code I'm dealing with a shifting-window problem where I also have to change the size of the window, so I'm dealing with nested loops. The last iteration of my top loop has the fewest possible window sizes so it runs the fastest, but the first iteration has the most possible window sizes so it runs the slowest. If you're using the first way of splitting up the problem set then it works fine for my code, but the second way would mean I'm not getting quite as much performance.

There could be other problems though where the second method would be faster because of the lower overhead.

DeadWorker when parallel processing

Hi,

I am getting the following errors when parallel processing on an Ubuntu 12.04 LTS, using rbenv and ruby 1.9.3-p448.

ruby: pthread_mutex_lock.c:84: __pthread_mutex_lock: Assertion `mutex->__data.__owner == 0' failed.

Parallel::DeadWorker

Tasks: TOP => upgrade
(See full trace by running task with --trace)
/foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:319:in `write': Broken pipe (Errno::EPIPE)
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:319:in `dump'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:319:in `process_incoming_jobs'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:298:in `block in worker'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:291:in `fork'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:291:in `worker'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:279:in `block in create_workers'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:278:in `each'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:278:in `create_workers'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:242:in `work_in_processes'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:114:in `map'
  from /foo/bar/.rbenv/versions/1.9.3-p448/lib/ruby/gems/1.9.1/gems/parallel-0.9.0/lib/parallel.rb:81:in `each'

I googled around and found that

ruby: pthread_mutex_lock.c:84: __pthread_mutex_lock: Assertion `mutex->__data.__owner == 0' failed.

is caused by the fact that a mutex should only be unlocked by the thread that locked it. See http://stackoverflow.com/questions/1105745/pthread-mutex-assertion-error.

I am running in 4 processes. My Ubuntu VM has 2 vCPUs.

Any ideas what to do about this?

Multiple parallel requests to a SOAP API are collectively taking longer than expected

I'm trying to send three parallel requests to a SOAP API. Normally each request takes around 13 seconds to complete, so three requests take around 39 seconds. After using the parallel gem and sending three parallel requests using 3 threads, it takes around 23 seconds to complete all three requests, which is really nice, but I'm not able to figure out why it's not completing them in 14-15 seconds. It's not an issue actually, but since you know a lot about parallel processing I thought maybe you could point me in the right direction.

Docs

some docs that at least say what functions are defined would be nice ;)

Parallel and writing data to a file

require 'parallel'

f = File.new('test.dat', 'w')
f.write "#{$$}: Start processing\n"
puts "#{$$}: Start processing"

Parallel.map([1, 2, 3, 4], :in_processes => 2){ |x| x }

f.write "#{$$}: Stop processing\n"
puts "#{$$}: Stop processing"
f.close

STDOUT contains strings as I expected:

32023: Start processing
32023: Stop processing

But the file contains 3 strings "Start processing" with the same PID:

32023: Start processing
32023: Start processing
32023: Start processing
32023: Stop processing

Is this normal behavior?

No _dump_data is defined for class Proc

Trying to run Parallel in rake like this:

  desc "Parallel Continuous integration"
  task :parallel_check do
    require 'parallel'
    results = Parallel.map(FileList.new('??')+FileList.new('??-??')) do |lang|
      Rake::Task["ci:" +lang+"_check"].execute
    end
    fail if results.any { |result| result!=0}
  end

I get the following backtrace at the end of each subprocess:

        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:279:in `process_incoming_jobs'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:257:in `block in worker'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:250:in `fork'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:250:in `worker'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:238:in `block in create_workers'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:237:in `each'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:237:in `create_workers'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:201:in `work_in_processes'
        from /home/jnavila/.gems/gems/parallel-0.8.0/lib/parallel.rb:106:in `map'
 ...

Is there a way to get rid of these exceptions?

Errno::ESRCH when sending interrupt signal

Recently I have not been able to get the parallel gem working (in the current version) for the INT signal (e.g. Ctrl+C). For a little background, the machine in question is 32-bit Ubuntu 12.04 with Ruby 1.9.3, and 4 processors detected (4 cores, 1 CPU). The funny thing is that for two items in parallel, the code works just as expected. However, for three items or more, I get Errno::ESRCH (process cannot be found) when doing Ctrl+C.

$ ruby -v
ruby 1.9.3p0 (2011-10-30 revision 33570) [i686-linux]

$ ruby -e 'require "parallel"; Parallel.each([1,2]) {|n| sleep 60; puts n }'
^CParallel execution interrupted, exiting ...

$ ruby -e 'require "parallel"; Parallel.each([1,2,3]) {|n| sleep 60; puts n }'
^CParallel execution interrupted, exiting ...
/var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:267:in `kill': No such process (Errno::ESRCH)
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:267:in `block (2 levels) in kill_on_ctrl_c'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:267:in `each'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:267:in `block in kill_on_ctrl_c'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:237:in `call'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:237:in `join'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:237:in `block in wait_for_threads'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:235:in `each'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:235:in `wait_for_threads'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:18:in `in_threads'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:149:in `work_in_processes'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:55:in `map'
       from /var/lib/gems/1.9.1/gems/parallel-0.5.18/lib/parallel.rb:30:in `each'
       from -e:1:in `<main>'

I changed the code around to something like....

  pids.each do |pid|
    if pid
      begin
        Process.kill(:KILL, pid)
      rescue Errno::ESRCH
        # Child PID already dead
      end
    end
  end

That seemed to fix the problem. Not sure why children would already be dead, though...?

Include a license

It says "MIT" in a few places (gemspec and readme), but that's a bit thin. There are multiple versions of the MIT license out there (see Wikipedia).

Assuming you mean the original MIT license, could you please include the following in a file LICENSE.txt in the repository?

This is currently blocking the Wikimedia Foundation (the non-profit behind Wikipedia.org) from using a Ruby application that depends on the parallel gem.

Copyright (C) 2013 Michael Grosser <[email protected]>

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

no _dump_data is defined for class Mutex (TypeError)

I'm getting this error when using the Mechanize gem:

def get_page(n, agent)
  puts "Getting page #{n}..."
  return agent.get(url_for_page(n))
end

pages = Parallel.map(page_num_array) do |n|
  get_page(n, agent)
end

This is on Mac OS X, using RVM, in both ruby 1.9.3 and 2.0.0.

Physical processor count not working in FreeBSD

It appears that "hw.physicalcpu" is missing in my FreeBSD 9.1 VM, so the physical processor count for Parallel is not working properly. I don't know if this is something related to it being a VM, or simply an OID that is not used anymore...?

$ ruby -e 'require "parallel"; puts Parallel.physical_processor_count'
sysctl: unknown oid 'hw.physicalcpu'
0

However, "hw.ncpu" is still available.

`LoadError (Expected ... to define Classname)` while running parallelized code

My development environment is choking on load errors while running parallel'd code, saying something about having trouble reloading certain classes in my /lib directory.

The classes change from time to time, and I have to reload my server to make it work. It works the first time I run a request that triggers the parallel code, and fails after that.

From my development.rb:

# Explicitly tells ActiveSupport to always unload these constants after
# every request. Allows lib files to look like gems, but still use Rails autoloading.
# http://wondible.com/2011/12/23/give-rails-autoloading-a-boot-to-the-head/
# http://mechanicalbee.com/2011/autoreload-gems-rails-3.html
ActiveSupport::Dependencies.explicitly_unloadable_constants += %w[ Buyads ]

I didn't post app-specific code because I wasn't sure what was relevant.

Am I misconfiguring something?

Starting with more than one thread

Thanks for the amazing library!

Part of some code I'm writing features a web spider that crawls pages, with each hit updating an array that functions essentially as a queue. Semi-pseudocode:

@queue = []
@visited = []

def hit(url)
    find_links.each { |link|
        next if @visited.include? link
        @queue << link
    }

    @visited << url
end

def crawl
    Parallel.map(@queue, :in_threads => 4) { |url|
        hit(url)
    }
end

Using in_threads it all works perfectly; though the array begins with one element, the first run-through queues up some more and these are gradually processed by the worker until all the unique URLs are hit.

However, there's a slight issue. Because of this line in map:

size = [array.size, size].min

…since my array starts with only one element, only one worker thread is started, and so my code never uses more than one thread.

I can get around this by queuing the initial URL x times, where x is the number of threads I want to use; that fetches the initial URL x times and everything else approximately once, but that seems awfully hacky and generally suboptimal.

Is there a way to tell Parallel to use more than one worker thread even if the initial workload doesn't have more than one element?

Queue support

I have a number of files which I need to download and process. To speed up my application I would like to download files during processing of previously downloaded files.
Can you please provide any suggestion how to do it? My only idea is to use a queue.
I read this pull request but didn't find the queue implementation in the current source code. Is queue supported or not? Or is there any solution how to achieve what I need?

Mysql::Error: Lost connection to MySQL server during query

Hi.
Parallel.map(orders, :in_processes => 6) do |order|
  ActiveRecord::Base.establish_connection
  # some huge calculation
end

And I get this error:
Mysql::Error: Lost connection to MySQL server during query

or this one:
Mysql::Error: MySQL server has gone away

What did I do wrong?

timeout() inside Parallel.map()

Parallel.map(accounts, :in_threads => accounts.count) do |account|
  start_time = Time.now
  begin
    timeout(3.5) do
      client.get(xxx)
    end
  rescue Timeout::Error
    elapsed = "%.1f" % (Time.now - start_time)
    logger.info "Timed out after #{elapsed}s"
    next
  end
end

  1. request timeouts are always failing
  2. failing too late Timed out after 7.0s (8.0s/9.0s) etc.

Ubuntu 11.04
Apache passenger 3.0.4
Ruby 1.9.2-p180

Parallel is really slow for many small jobs

Since parallel forks for each job, as opposed to forking once per worker process, Parallel takes a really long time when you have many small jobs and the forking overhead dominates.

Here is my benchmark script with results:

http://gist.github.com/624566

I just created a library playing with using the same fork per process, and using bi-directional pipes to pass jobs back and forth between the master and worker fork.

http://github.com/ngauthier/forque

Then someone pointed out you guys were already doing a similar project.

Think we should merge forque's double-pipe fork conservation into parallel for the in_processes method?

(note: the system used in forque is the core of the Hydra project http://github.com/ngauthier/hydra )

Put license back in gem

Although I am a fan of cleaning things up, can you please put the license back in the gem?

[Rails] Connection Pool Timeout

I was having an issue where I was trying to perform too many long-running DB connections in parallel. Rails suggested I increase my connection pool size, but I was afraid of having too many connections to MySQL per web worker process.

I realized my error was that I was simply using too many of the available connections, so I wanted to make sure I never exceeded the available pool. My solution was to use something like

:in_threads => [User.all.length, ActiveRecord::Base.connection_pool.instance_eval { @size }].min

This will use the number of connections equal to the number of records, OR the connection pool size, whichever is smaller.

Option not to force enumerable into array

This gem would be severely more useful if it were capable of enumerating over an Enumerator without first converting it to an array. I'm trying to use this to speed up a project that works with Minecraft's .mca format. I have a method that returns an enumerator over all the map chunks - which are read only as they are needed from the file, and doing that at the beginning of the process takes a bit of overhead. I can imagine other scenarios where the enumerator should not be converted to an array before enumeration as well.

Stop all parallel threads

On Windows (jruby 1.7.3 and MRI Ruby 1.9.3 and 2.0.0) I tried this:

Parallel.each((1..8), :in_threads => 4) { |i|
  p i
  if i == 4
    break
  end
}

And it results in an exception:
LocalJumpError: break from proc-closure

Is this a bug? Is there (another?) way to stop all parallel threads?

Parallel doesn't loop through all elements

I recently noticed that parallel doesn't loop through all elements of hashes or arrays, at least in 0.8.4. Here's a simple example:

#!/usr/bin/env ruby

require 'parallel'

array = []

2000.times do
  array << rand
end

STDERR.puts "Array Length: #{array.length}"

Parallel.each(array) do |item|
  puts "Item: #{item}"
end

Outputs as follows:

$  ./bug.rb > test.log
Array Length: 2000

$  grep Item test.log | wc -l
1798

Parallel doesn't recognize OpenBSD

Programs that use this parallel gem don't behave as expected running in OpenBSD.

Things were going wrong for no apparent reason, until I noticed the following message:
Unknown architecture ( openbsd5.1 ) assuming one processor.

Exceptions in parallel.map on database write

I'm getting exceptions in parallel.map threads when I save a record to postgresql. Most of the time it works ok, but sometimes I get an exception (not always the same one) such as below. Not sure if it's an issue with parallels, ruby, postgresql or Passenger/REE.
I use Rails 3.1.1rc2, ruby 1.8.7 on Passenger/REE.

    uninitialized constant ActiveRecord::Associations::JoinDependency::JoinBase

    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/associations/join_dependency.rb:13:in `initialize'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/relation/finder_methods.rb:219:in `new'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/relation/finder_methods.rb:219:in `construct_join_dependency_for_association_find'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/relation/finder_methods.rb:192:in `exists?'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations/uniqueness.rb:33:in `validate_each'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validator.rb:153:in `validate'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validator.rb:150:in `each'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validator.rb:150:in `validate'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:302:in `send'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:302:in `_callback_before_375'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:449:in `_run_validate_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `send'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `run_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations.rb:212:in `run_validations!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations/callbacks.rb:53:in `run_validations!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:390:in `_run_validation_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `send'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `run_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations/callbacks.rb:53:in `run_validations!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations.rb:179:in `valid?'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations.rb:69:in `valid?'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations/associated.rb:5:in `validate_each'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations/associated.rb:5:in `collect'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations/associated.rb:5:in `validate_each'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validator.rb:153:in `validate'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validator.rb:150:in `each'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validator.rb:150:in `validate'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:302:in `send'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:302:in `_callback_before_411'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:449:in `_run_validate_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `send'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `run_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations.rb:212:in `run_validations!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations/callbacks.rb:53:in `run_validations!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:390:in `_run_validation_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `send'
    /home/aaa/shared/bundle/ruby/1.8/gems/activesupport-3.1.1.rc2/lib/active_support/callbacks.rb:81:in `run_callbacks'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations/callbacks.rb:53:in `run_validations!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activemodel-3.1.1.rc2/lib/active_model/validations.rb:179:in `valid?'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations.rb:69:in `valid?'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations.rb:77:in `perform_validations'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations.rb:56:in `save!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/attribute_methods/dirty.rb:33:in `save!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/transactions.rb:246:in `save!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/transactions.rb:295:in `with_transaction_returning_status'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/connection_adapters/abstract/database_statements.rb:192:in `transaction'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/transactions.rb:208:in `transaction'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/transactions.rb:293:in `with_transaction_returning_status'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/transactions.rb:246:in `save!'
    /home/aaa/shared/bundle/ruby/1.8/gems/activerecord-3.1.1.rc2/lib/active_record/validations.rb:41:in `create!'
    /home/aaa.rb:165:in `save' ...
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:260:in `call'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:260:in `call_with_index'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:114:in `work_in_threads'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:107:in `loop'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:107:in `work_in_threads'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:16:in `in_threads'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:15:in `initialize'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:15:in `new'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:15:in `in_threads'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:14:in `times'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:14:in `in_threads'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:105:in `work_in_threads'
    /home/aaa/shared/bundle/ruby/1.8/gems/parallel-0.5.9/lib/parallel.rb:55:in `map'

Killing sleeping processes

Hi there,

First and foremost, great job on the easy-to-use API!
I have a little question: Is there a way to kill sleeping child processes once the last iteration is completed? If I run something like this:
Parallel.each(1..1000, :in_processes => 4){|lol| puts lol}
The program goes through all the numbers just fine but it refuses to exit properly and kill all the processes. Even if I ctrl+c, the processes persist.

Cheers, Alejandro.

Permission denied reading VERSION file, when not using RVM

When installing the gem NOT using rvm, via:
$ sudo gem install parallel

Then the following file has read permissions only for root user:

  • /usr/local/lib/ruby/gems/1.9.1/gems/parallel-0.5.15/lib/../VERSION

As a result 'require "parallel"' fails.
The complete stack trace is:

/usr/local/lib/ruby/gems/1.9.1/gems/parallel-0.5.15/lib/parallel.rb:5:in `read': Permission denied - /usr/local/lib/ruby/gems/1.9.1/gems/parallel-0.5.15/lib/../VERSION (Errno::EACCES)
  from /usr/local/lib/ruby/gems/1.9.1/gems/parallel-0.5.15/lib/parallel.rb:5:in `<class:Parallel>'
  from /usr/local/lib/ruby/gems/1.9.1/gems/parallel-0.5.15/lib/parallel.rb:4:in `<top (required)>'
  from /usr/local/lib/ruby/gems/1.9.1/gems/bundler-1.0.21/lib/bundler/runtime.rb:68:in `require'
  from /usr/local/lib/ruby/gems/1.9.1/gems/bundler-1.0.21/lib/bundler/runtime.rb:68:in `block (2 levels) in require'
  from /usr/local/lib/ruby/gems/1.9.1/gems/bundler-1.0.21/lib/bundler/runtime.rb:66:in `each'
  from /usr/local/lib/ruby/gems/1.9.1/gems/bundler-1.0.21/lib/bundler/runtime.rb:66:in `block in require'
  from /usr/local/lib/ruby/gems/1.9.1/gems/bundler-1.0.21/lib/bundler/runtime.rb:55:in `each'

The permission of the VERSION file in this case is:

$ ls -la /usr/local/lib/ruby/gems/1.9.1/gems/parallel-0.5.15/lib/../VERSION

-rw------- 1 root root 7 Mar 4 14:06 VERSION

I'd appreciate your assistance in fixing this and bumping the gem version.
Thanks,

Windows support ?

I'm trying to use the parallel gem via autotest / parallel_test, but am getting an unimplemented error for the fork in parallel.rb. Is it possible that you might add Windows support?

Detach process

Hello ,

Is it possible to detach processes?

when I do that
Parallel ( :processes >= X) do
bla bla
end
p "bla"

all processes must be finished before executing the next code

thnx

ActiveRecord::StatementInvalid: Mysql2::Error: MySQL server has gone away

I have the following code in my rails app

results = Parallel.map(user_ids, :in_processes => 4) do |user_id|
  # some calculations
  [user_id, result]
end

This works properly in development environment. But when I try to test the method which contains this code, test fails with ActiveRecord::StatementInvalid: Mysql2::Error: MySQL server has gone away

Any help is much appreciated!

PG Error: PG::UnableToSend

workmaster2n/parallel@1d5eae3

When I run

bundle exec ruby spec/cases/map_with_ar_postges.rb 

I get back:

making user
making user
making user
making user
making user
making user
making user
making user
/Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:512:in `exec': server closed the connection unexpectedly (PG::UnableToSend)
    This probably means the server terminated abnormally
    before or while processing the request.
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:512:in `dealloc'
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:495:in `block in clear'
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:494:in `each_value'
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:494:in `clear'
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:557:in `clear_cache!'
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/abstract_adapter.rb:322:in `reconnect!'
    from /Users/tyler/.rvm/gems/ruby-1.9.3-p392@parallel/gems/activerecord-4.0.0/lib/active_record/connection_adapters/postgresql_adapter.rb:569:in `reconnect!'
    from spec/cases/map_with_ar_postges.rb:39:in `block in <main>'
    from /Users/tyler/.rvm/rubies/ruby-1.9.3-p392/lib/ruby/1.9.1/tempfile.rb:320:in `open'
    from spec/cases/map_with_ar_postges.rb:4:in `<main>'

It appears that something is going awry with postgres.

jruby failure

process_count method fails. This version seems to work OK though, at least for JRuby on OS X.
In general I think one could come up with a more robust way of finding it by using the os gem:

if OS.mac?
  case RUBY_PLATFORM
  ...

:)

Cheers!
-roger-

def self.processor_count
  case RUBY_PLATFORM
  when /darwin9/
    `hwprefs cpu_count`.to_i
  when /darwin10/
    (hwprefs_available? ? `hwprefs thread_count` : `sysctl -n hw.ncpu`).to_i
  when /linux/
    `cat /proc/cpuinfo | grep processor | wc -l`.to_i
  when /freebsd/
    `sysctl -n hw.ncpu`.to_i
  else
    require 'rbconfig'
    if RbConfig::CONFIG['host_os'] =~ /darwin/
      (hwprefs_available? ? `hwprefs thread_count` : `sysctl -n hw.ncpu`).to_i
    else
      raise 'unable to determine processor count'
    end
  end
end
