Dask is a flexible parallel computing library for analytics. See documentation for more information.
New BSD. See License File.
Start a cluster in EC2 for dask.distributed
`dask-ec2 destroy` terminates the instances: is there an option to stop them instead, and later restart from them? That could save money and time :).
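For context, stop/start (as opposed to terminate) is something boto3 already exposes for EBS-backed instances, which keep their disks across a stop. A minimal sketch of what a hypothetical `dask-ec2 stop` / `dask-ec2 start` could call; the cluster-description shape below is illustrative, not dask-ec2's actual format:

```python
# Sketch of stop/start support via boto3; `dask-ec2 stop` does not exist yet.

def instance_ids_from_cluster(cluster):
    """Pull instance ids out of a cluster description.

    `cluster` is assumed to be a list of dicts like {"id": "i-...", ...};
    this shape is hypothetical, not dask-ec2's actual cluster file format.
    """
    return [node["id"] for node in cluster]

# Usage (requires AWS credentials; not run here):
# import boto3
# ec2 = boto3.client("ec2")
# ids = instance_ids_from_cluster(cluster)
# ec2.stop_instances(InstanceIds=ids)    # compute billing stops, EBS volumes kept
# ec2.start_instances(InstanceIds=ids)   # resume later with the same disks
```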
In dask/distributed#326 we renamed the dask.distributed executables from `dfoo` to `dask-foo`. We should probably rename `dec2` to `dask-ec2` accordingly.
I have dec2 starting and provisioning instances just fine using the default settings.
I am now trying to use it in a different AWS region. The instances are starting but not being provisioned: the log looks like it is going through the motions, but nothing is actually installed.
I'm using `dec2 up` with `--region-name ap-southeast-2 --ami ami-69631053`.
I chose this AMI because it looked to me to be the most similar to the default one dec2 uses for the us-east region.
Thanks
I successfully brought up a dask cluster using the following command:
dask-ec2 up --keyname xx --keypair xx --count 3 --nprocs 1 --type t2.medium --region-name eu-central-1 --ami ami-accff2b1
However, I cannot reach the web interface, which should be available at:
Addresses
---------
Web Interface: http://52.59.221.151:8787/status
When I `dask-ec2 ssh` into the head node and run `dask-scheduler` there, I can, however, access the web interface at the above address.
Shouldn't the scheduler be started automatically? At least that's what the instructions suggest.
I am using the github version of dask-ec2 (commit 7be342b).
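One way to tell whether this is a networking problem (e.g. security-group rules) rather than the scheduler not running is to probe the ports from your local machine. A minimal sketch, assuming the default ports (8786 for the scheduler, 8787 for the web UI):

```python
import socket

def is_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage against the head node from the log above (not run here):
# for port in (8786, 8787):
#     print(port, is_port_open("52.59.221.151", port))
```

If both ports are closed from the outside but the services respond after `dask-ec2 ssh`, the security group is the likely culprit rather than the scheduler itself.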
Is there (or will there be) spot instance support? This is low priority.
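Under the hood, spot support would presumably mean swapping the on-demand launch for a spot request. A hypothetical sketch of the boto3 call dask-ec2 could make; all names and values here are illustrative, not dask-ec2's actual API:

```python
# Sketch: how dask-ec2 might request spot capacity via boto3.

def build_spot_launch_spec(ami, instance_type, keyname, security_group_ids):
    """Build the LaunchSpecification dict for request_spot_instances."""
    return {
        "ImageId": ami,
        "InstanceType": instance_type,
        "KeyName": keyname,
        "SecurityGroupIds": list(security_group_ids),
    }

# Usage (requires AWS credentials; not run here):
# import boto3
# ec2 = boto3.client("ec2", region_name="us-east-1")
# ec2.request_spot_instances(
#     SpotPrice="0.10",          # illustrative bid
#     InstanceCount=4,
#     LaunchSpecification=build_spot_launch_spec(
#         "ami-12345678", "m3.2xlarge", "mykey", ["sg-12345678"]),
# )
```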
I sometimes want to manage the workers and scheduler myself rather than have salt do it. In these cases I use the nicely provided `--no-dask` flag. However, I then need to install Anaconda by hand. It would be nice if Anaconda were installed regardless.
I'm fine for now; it's relatively straightforward to install Anaconda from the command line with cssh.
Dask.Distributed Installation succeeded
Scheduler Address: 52.23.171.227:8786
Starting python shell
Python 3.4.4 |Anaconda 2.3.0 (64-bit)| (default, Jan 11 2016, 13:54:01)
Type "copyright", "credits" or "license" for more information.
IPython 4.0.0 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
distributed.utils - ERROR - Could not connect to 52.23.171.227:8786
Traceback (most recent call last):
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/distributed-1.8.1-py3.4.egg/distributed/executor.py", line 308, in _start
    ident = yield r.identity()
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1008, in run
    value = future.result()
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/concurrent.py", line 232, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1014, in run
    yielded = self.gen.throw(*exc_info)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/distributed-1.8.1-py3.4.egg/distributed/core.py", line 384, in send_recv_from_rpc
    result = yield send_recv(stream=stream, op=key, **kwargs)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1008, in run
    value = future.result()
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/concurrent.py", line 232, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1014, in run
    yielded = self.gen.throw(*exc_info)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/distributed-1.8.1-py3.4.egg/distributed/core.py", line 281, in send_recv
    response = yield read(stream)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1008, in run
    value = future.result()
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/concurrent.py", line 232, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1014, in run
    yielded = self.gen.throw(*exc_info)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/distributed-1.8.1-py3.4.egg/distributed/core.py", line 204, in read
    msg = yield stream.read_until(sentinel)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1008, in run
    value = future.result()
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/concurrent.py", line 232, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
tornado.iostream.StreamClosedError: Stream is closed

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/distributed-1.8.1-py3.4.egg/distributed/utils.py", line 96, in f
    result[0] = yield gen.maybe_future(func(*args, **kwargs))
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1008, in run
    value = future.result()
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/concurrent.py", line 232, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 3, in raise_exc_info
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/tornado/gen.py", line 1014, in run
    yielded = self.gen.throw(*exc_info)
  File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/distributed-1.8.1-py3.4.egg/distributed/executor.py", line 310, in _start
    raise IOError("Could not connect to %s:%d" % (ip, port))
OSError: Could not connect to 52.23.171.227:8786
But if I ssh into the remote machine, things are indeed set up. Does this depend on my security settings? Would ssh-ing in be a more reasonable default?
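If security settings are the cause, the security group's ingress rules would show it. A small sketch of how to check whether a group (as returned by boto3's `describe_security_groups`) allows a given TCP port; the group id in the usage comment is illustrative:

```python
# Sketch: does a security group's ingress rule list cover TCP `port`?

def port_allowed(ip_permissions, port):
    """True if any ingress rule in ip_permissions covers TCP `port`.

    `ip_permissions` is the "IpPermissions" list from a boto3
    describe_security_groups response: dicts with IpProtocol and,
    for tcp/udp rules, FromPort/ToPort.
    """
    for rule in ip_permissions:
        if rule.get("IpProtocol") in ("-1", "tcp"):  # "-1" means all traffic
            lo = rule.get("FromPort", 0)
            hi = rule.get("ToPort", 65535)
            if lo <= port <= hi:
                return True
    return False

# Usage (requires AWS credentials; not run here):
# import boto3
# ec2 = boto3.client("ec2")
# sg = ec2.describe_security_groups(GroupIds=["sg-12345678"])["SecurityGroups"][0]
# print(port_allowed(sg["IpPermissions"], 8786))
```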
mrocklin@carbon:~/workspace/dask-ec2$ dask-ec2 up --keyname mrocklin --keypair ~/.ssh/anaconda-cluster.pem --name dask
DEBUG: Searching for default VPC
DEBUG: Default VPC found - Using VPC ID: vpc-3d7dd359
DEBUG: Searching for default subnet in VPC vpc-3d7dd359
DEBUG: Default subnet found - Using Subnet ID: subnet-5edb0063
Launching nodes
DEBUG: Checking that keyname 'mrocklin' exists on EC2
DEBUG: Checking that security group 'dask-ec2-default' exists on EC2
DEBUG: Getting all security groups and filtering by VPC ID vpc-3d7dd359 and name dask-ec2-default
DEBUG: Found Security groups: [ec2.SecurityGroup(id='sg-acd2bed7')]
DEBUG: Getting all security groups and filtering by VPC ID vpc-3d7dd359 and name dask-ec2-default
DEBUG: Found Security groups: [ec2.SecurityGroup(id='sg-acd2bed7')]
DEBUG: Creating 4 instances on EC2
DEBUG: Tagging instance 'i-03987315a257975c2'
DEBUG: Tagging instance 'i-05df877b682eb0abe'
DEBUG: Tagging instance 'i-0b69e0fdbcc9da4a0'
DEBUG: Tagging instance 'i-0b9f8d3146a2583c5'
Checking SSH connection to nodes
DEBUG: Checking ssh connection for 54.158.80.221
DEBUG: Attempt 1/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 54.158.80.221
DEBUG: Running command bash -c 'ls' on '54.158.80.221'
DEBUG: Checking ssh connection for 54.86.24.142
DEBUG: Running command bash -c 'ls' on '54.86.24.142'
DEBUG: Checking ssh connection for 54.173.4.143
DEBUG: Running command bash -c 'ls' on '54.173.4.143'
DEBUG: Checking ssh connection for 54.162.35.44
DEBUG: Running command bash -c 'ls' on '54.162.35.44'
+------------------+-----------+
| Node IP | SSH check |
+==================+===========+
| 54.162.35.44:22 | True |
| 54.86.24.142:22 | True |
| 54.173.4.143:22 | True |
| 54.158.80.221:22 | True |
+------------------+-----------+
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'apt-get install -y python-pip' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'pip install PyOpenSSL' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'salt-call --local tls.create_self_signed_cert' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/rest_cherrypy.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/rest_cherrypy.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/external_auth.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/external_auth.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'id -u saltdev &>/dev/null || useradd -p $(openssl passwd -1 saltdev) saltdev' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'service salt-master restart' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'service salt-api restart' on '54.158.80.221'
Bootstraping salt minions
DEBUG: Installing salt-minion on all the nodes
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-2 stable' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-1 stable' on '54.86.24.142'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-3 stable' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-0 stable' on '54.158.80.221'
DEBUG: Configuring salt-mine on the salt minions
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.86.24.142'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.86.24.142'
DEBUG: Restarting the salt-minion service
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.86.24.142'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.162.35.44'
Uploading salt formulas
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt to /srv/salt
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/salt' on '54.158.80.221'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Creating directory /tmp/.__tmp_copy/supervisor mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/supervisor to /tmp/.__tmp_copy/supervisor
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/supervisor' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/supervisor/settings.sls to /tmp/.__tmp_copy/supervisor/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/supervisor/init.sls to /tmp/.__tmp_copy/supervisor/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java to /tmp/.__tmp_copy/java
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/debug.sls to /tmp/.__tmp_copy/java/debug.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/settings.sls to /tmp/.__tmp_copy/java/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/init.sls to /tmp/.__tmp_copy/java/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk to /tmp/.__tmp_copy/java/openjdk
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/settings.sls to /tmp/.__tmp_copy/java/openjdk/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/init.sls to /tmp/.__tmp_copy/java/openjdk/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/templates to /tmp/.__tmp_copy/java/openjdk/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/templates/java.sh to /tmp/.__tmp_copy/java/openjdk/templates/java.sh
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/env.sls to /tmp/.__tmp_copy/java/openjdk/env.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask to /tmp/.__tmp_copy/dask
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask' on '54.158.80.221'
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed to /tmp/.__tmp_copy/dask/distributed
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/debug.sls to /tmp/.__tmp_copy/dask/distributed/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/scheduler mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/scheduler to /tmp/.__tmp_copy/dask/distributed/scheduler
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/scheduler' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/scheduler/init.sls to /tmp/.__tmp_copy/dask/distributed/scheduler/init.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/settings.sls to /tmp/.__tmp_copy/dask/distributed/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/init.sls to /tmp/.__tmp_copy/dask/distributed/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/templates to /tmp/.__tmp_copy/dask/distributed/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/templates/dask-scheduler.conf to /tmp/.__tmp_copy/dask/distributed/templates/dask-scheduler.conf
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/templates/dask-worker.conf to /tmp/.__tmp_copy/dask/distributed/templates/dask-worker.conf
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/worker mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/worker to /tmp/.__tmp_copy/dask/distributed/worker
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/worker' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/worker/init.sls to /tmp/.__tmp_copy/dask/distributed/worker/init.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/init2.sls to /tmp/.__tmp_copy/dask/distributed/init2.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter to /tmp/.__tmp_copy/jupyter
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/settings.sls to /tmp/.__tmp_copy/jupyter/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/init.sls to /tmp/.__tmp_copy/jupyter/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/templates to /tmp/.__tmp_copy/jupyter/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/templates/jupyter-notebook.conf to /tmp/.__tmp_copy/jupyter/templates/jupyter-notebook.conf
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/templates/jupyter_notebook_config_py to /tmp/.__tmp_copy/jupyter/templates/jupyter_notebook_config_py
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/notebook mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/notebook to /tmp/.__tmp_copy/jupyter/notebook
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/notebook' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/notebook/settings.sls to /tmp/.__tmp_copy/jupyter/notebook/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/notebook/init.sls to /tmp/.__tmp_copy/jupyter/notebook/init.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system to /tmp/.__tmp_copy/system
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/base.sls to /tmp/.__tmp_copy/system/base.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/templates to /tmp/.__tmp_copy/system/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/templates/limits.conf to /tmp/.__tmp_copy/system/templates/limits.conf
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/macros.sls to /tmp/.__tmp_copy/system/macros.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda to /tmp/.__tmp_copy/conda
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/debug.sls to /tmp/.__tmp_copy/conda/debug.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/settings.sls to /tmp/.__tmp_copy/conda/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/init.sls to /tmp/.__tmp_copy/conda/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/templates to /tmp/.__tmp_copy/conda/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/templates/conda.sh to /tmp/.__tmp_copy/conda/templates/conda.sh
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/salt' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/pillar to /srv/pillar
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/pillar' on '54.158.80.221'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/pillar/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/pillar' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
Uploading conda settings
DEBUG: Uploading file /tmp/tmpw34kocly to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
^C
Aborted!
mrocklin@carbon:~/workspace/dask-ec2$ dask-ec2 provision
Checking SSH connection to nodes
DEBUG: Checking ssh connection for 54.158.80.221
DEBUG: Running command bash -c 'ls' on '54.158.80.221'
DEBUG: Checking ssh connection for 54.86.24.142
DEBUG: Running command bash -c 'ls' on '54.86.24.142'
DEBUG: Checking ssh connection for 54.173.4.143
DEBUG: Running command bash -c 'ls' on '54.173.4.143'
DEBUG: Checking ssh connection for 54.162.35.44
DEBUG: Running command bash -c 'ls' on '54.162.35.44'
+------------------+-----------+
| Node IP | SSH check |
+==================+===========+
| 54.158.80.221:22 | True |
| 54.86.24.142:22 | True |
| 54.162.35.44:22 | True |
| 54.173.4.143:22 | True |
+------------------+-----------+
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'apt-get install -y python-pip' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'pip install PyOpenSSL' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'salt-call --local tls.create_self_signed_cert' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/rest_cherrypy.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/rest_cherrypy.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/external_auth.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/external_auth.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'id -u saltdev &>/dev/null || useradd -p $(openssl passwd -1 saltdev) saltdev' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'service salt-master restart' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'service salt-api restart' on '54.158.80.221'
Bootstraping salt minions
DEBUG: Installing salt-minion on all the nodes
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-2 stable' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-1 stable' on '54.86.24.142'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-3 stable' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.158.80.221 -i node-0 stable' on '54.158.80.221'
DEBUG: Configuring salt-mine on the salt minions
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.86.24.142'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.86.24.142'
DEBUG: Restarting the salt-minion service
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.173.4.143'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.86.24.142'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.162.35.44'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.158.80.221'
Uploading salt formulas
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt to /srv/salt
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/salt' on '54.158.80.221'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Creating directory /tmp/.__tmp_copy/supervisor mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/supervisor to /tmp/.__tmp_copy/supervisor
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/supervisor' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/supervisor/settings.sls to /tmp/.__tmp_copy/supervisor/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/supervisor/init.sls to /tmp/.__tmp_copy/supervisor/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java to /tmp/.__tmp_copy/java
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/debug.sls to /tmp/.__tmp_copy/java/debug.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/settings.sls to /tmp/.__tmp_copy/java/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/init.sls to /tmp/.__tmp_copy/java/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk to /tmp/.__tmp_copy/java/openjdk
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/settings.sls to /tmp/.__tmp_copy/java/openjdk/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/init.sls to /tmp/.__tmp_copy/java/openjdk/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/templates to /tmp/.__tmp_copy/java/openjdk/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/templates/java.sh to /tmp/.__tmp_copy/java/openjdk/templates/java.sh
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/java/openjdk/env.sls to /tmp/.__tmp_copy/java/openjdk/env.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask to /tmp/.__tmp_copy/dask
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask' on '54.158.80.221'
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed to /tmp/.__tmp_copy/dask/distributed
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/debug.sls to /tmp/.__tmp_copy/dask/distributed/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/scheduler mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/scheduler to /tmp/.__tmp_copy/dask/distributed/scheduler
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/scheduler' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/scheduler/init.sls to /tmp/.__tmp_copy/dask/distributed/scheduler/init.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/settings.sls to /tmp/.__tmp_copy/dask/distributed/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/init.sls to /tmp/.__tmp_copy/dask/distributed/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/templates to /tmp/.__tmp_copy/dask/distributed/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/templates/dask-scheduler.conf to /tmp/.__tmp_copy/dask/distributed/templates/dask-scheduler.conf
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/templates/dask-worker.conf to /tmp/.__tmp_copy/dask/distributed/templates/dask-worker.conf
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/worker mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/worker to /tmp/.__tmp_copy/dask/distributed/worker
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/worker' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/worker/init.sls to /tmp/.__tmp_copy/dask/distributed/worker/init.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/dask/distributed/init2.sls to /tmp/.__tmp_copy/dask/distributed/init2.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter to /tmp/.__tmp_copy/jupyter
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/settings.sls to /tmp/.__tmp_copy/jupyter/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/init.sls to /tmp/.__tmp_copy/jupyter/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/templates to /tmp/.__tmp_copy/jupyter/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/templates/jupyter-notebook.conf to /tmp/.__tmp_copy/jupyter/templates/jupyter-notebook.conf
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/templates/jupyter_notebook_config_py to /tmp/.__tmp_copy/jupyter/templates/jupyter_notebook_config_py
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/notebook mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/notebook to /tmp/.__tmp_copy/jupyter/notebook
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/notebook' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/notebook/settings.sls to /tmp/.__tmp_copy/jupyter/notebook/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/jupyter/notebook/init.sls to /tmp/.__tmp_copy/jupyter/notebook/init.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system to /tmp/.__tmp_copy/system
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/base.sls to /tmp/.__tmp_copy/system/base.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/templates to /tmp/.__tmp_copy/system/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/templates/limits.conf to /tmp/.__tmp_copy/system/templates/limits.conf
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/system/macros.sls to /tmp/.__tmp_copy/system/macros.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda to /tmp/.__tmp_copy/conda
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/debug.sls to /tmp/.__tmp_copy/conda/debug.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/settings.sls to /tmp/.__tmp_copy/conda/settings.sls
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/init.sls to /tmp/.__tmp_copy/conda/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda/templates mode=511
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/templates to /tmp/.__tmp_copy/conda/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda/templates' on '54.158.80.221'
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/salt/conda/templates/conda.sh to /tmp/.__tmp_copy/conda/templates/conda.sh
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/salt' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
DEBUG: Uploading directory /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/pillar to /srv/pillar
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/pillar' on '54.158.80.221'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Uploading file /home/mrocklin/workspace/dask-ec2/dask_ec2/formulas/pillar/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/pillar' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
Uploading conda settings
DEBUG: Uploading file /tmp/tmpqbii5w12 to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
+---------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+=========+======================+=================+
| node-2 | 6 | 0 |
| node-1 | 6 | 0 |
| node-0 | 6 | 0 |
| node-3 | 6 | 0 |
+---------+----------------------+-----------------+
DEBUG: Uploading file /tmp/tmphkt15c6z to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/dask.sls' on '54.158.80.221'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.158.80.221'
Installing scheduler
Is this an option given the dependencies?
I was recently testing a fresh GitHub clone of dask-ec2. I could create a cluster with the default configuration without problems, but could not destroy it again with dask-ec2 destroy
(apparently the instances were not found, see the message below), so I had to terminate the instances in the AWS console. Is this a known issue?
deeplook:test-dask-ec2 dinu$ dask-ec2 up --region-name eu-central-1 --keyname my_aws_key --keypair ~/.ssh/my_aws_key.pem --ami ami-ac1524b1
[...]
deeplook:test-dask-ec2 dinu$ dask-ec2 destroy
Are you sure you want to destroy the cluster? [y/N]: y
DEBUG: Searching for default VPC
DEBUG: Default VPC found - Using VPC ID: vpc-d63fe9b3
DEBUG: Searching for default subnet in VPC vpc-d63fe9b3
DEBUG: Default subnet found - Using Subnet ID: subnet-5fa49277
Terminating instances
DEBUG: Terminating instances: ['i-b7f06f0a', 'i-adf06f10', 'i-acf06f11', 'i-aff06f12']
Unexpected EC2 error: An error occurred (InvalidInstanceID.NotFound) when calling the DescribeInstances operation: The instance IDs 'i-acf06f11, i-adf06f10, i-aff06f12, i-b7f06f0a' do not exist
deeplook:test-dask-ec2 dinu$ dask-ec2 --version
dask-ec2, version v0.3.0+8.gc9da588
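One client-side workaround would be for destroy to treat already-terminated instances as success rather than aborting. A minimal sketch under that assumption (the helper name and the error-message parsing are mine, not dask-ec2 API):

```python
import re

def missing_instance_ids(error_message):
    """Extract the instance IDs named in an EC2 InvalidInstanceID.NotFound
    error message, so a destroy step can skip them instead of aborting."""
    if "InvalidInstanceID.NotFound" not in error_message:
        return []
    # IDs appear quoted and comma-separated, e.g. 'i-acf06f11, i-adf06f10'
    return re.findall(r"i-[0-9a-f]+", error_message)

msg = ("An error occurred (InvalidInstanceID.NotFound) when calling the "
       "DescribeInstances operation: The instance IDs "
       "'i-acf06f11, i-adf06f10' do not exist")
print(missing_instance_ids(msg))  # → ['i-acf06f11', 'i-adf06f10']
```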
Is it possible that while dask-ec2 destroy
terminates the instances, it leaves their EBS volumes behind? This might result in unexpected costs to users. If so, is there a robust way to delete the volumes as well?
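If volumes are indeed left behind, a cleanup pass could look for volumes in the "available" state (i.e. not attached to any instance). A hedged sketch: the filter below is a pure function over records shaped like a boto3 describe_volumes() response, so it can be dry-run tested; the actual deletion call is left commented:

```python
def orphaned_volumes(volumes):
    """Given records shaped like boto3 describe_volumes()['Volumes'],
    return the IDs of volumes no longer attached to any instance."""
    return [v["VolumeId"] for v in volumes
            if v["State"] == "available" and not v.get("Attachments")]

sample = [
    {"VolumeId": "vol-1", "State": "available", "Attachments": []},
    {"VolumeId": "vol-2", "State": "in-use",
     "Attachments": [{"InstanceId": "i-b7f06f0a"}]},
]
print(orphaned_volumes(sample))  # → ['vol-1']

# With boto3 (untested sketch; double-check before deleting anything):
# import boto3
# ec2 = boto3.client("ec2")
# for vid in orphaned_volumes(ec2.describe_volumes()["Volumes"]):
#     ec2.delete_volume(VolumeId=vid)
```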
Q: Should we deploy dask.distributed by default on creation?
When I run dec2 dask-distributed
I don't immediately see the next step:
(27)mrocklin@workstation:~$ dec2 dask-distributed
Installing scheduler
+---------+--------------+----------+
| Node ID | # Successful | # Failed |
+=========+==============+==========+
| node-0 | 10 | 0 |
+---------+--------------+----------+
Installing workers
+---------+--------------+----------+
| Node ID | # Successful | # Failed |
+=========+==============+==========+
| node-1 | 10 | 0 |
| node-3 | 10 | 0 |
| node-2 | 10 | 0 |
+---------+--------------+----------+
This might be friendlier if it printed the address of the scheduler on the head node, e.g. 192.168.0.1:8786
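For instance, a small formatting helper (hypothetical, not current dec2 code) could turn the head node's IP into the addresses a user needs next:

```python
def connection_info(head_ip, scheduler_port=8786, web_port=8787):
    """Format the addresses a user needs once dask-distributed finishes.
    Port defaults follow dask.distributed conventions."""
    return ("Scheduler: {ip}:{sport}\n"
            "Web UI:    http://{ip}:{wport}/status").format(
                ip=head_ip, sport=scheduler_port, wport=web_port)

print(connection_info("192.168.0.1"))
```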
Given that it is quite easy to forget to destroy an existing cluster (see #35), I would consider it important to pick a default EC2 instance type that is much cheaper than m3.2xlarge
. Otherwise one can see unexpectedly large increases in one's AWS bill from experimenting with dask-ec2
, as happened to me. In my case this was ca. 96 CPU hours and $60 per day! :-(
This (and other such values) should also be listed in the generated cluster.yml
file, so there is some record of what is running without needing to visit the AWS console.
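A quick sanity check of the cost difference (prices are illustrative historical on-demand figures, not current AWS pricing):

```python
# Illustrative on-demand prices in USD/hour -- check current AWS pricing.
PRICES = {"m3.2xlarge": 0.532, "t2.medium": 0.052}

def daily_cost(instance_type, count):
    """Cost of running `count` on-demand instances for 24 hours."""
    return PRICES[instance_type] * count * 24

print(round(daily_cost("m3.2xlarge", 4), 2))  # → 51.07 (default, 4 nodes)
print(round(daily_cost("t2.medium", 4), 2))   # → 4.99
```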
It would be very convenient if, on top of installing Anaconda, we also conda installed a few key packages. Notably, the latest pandas release has some very important bug fixes for parallel computing.
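This could be an extra salt state run after the conda formula; the state name, conda prefix, and package list below are assumptions for illustration:

```yaml
# hypothetical addition to the conda salt formula
dask-extra-packages:
  cmd.run:
    - name: /opt/anaconda/bin/conda install -y pandas dask distributed
    - require:
      - sls: conda
```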
Hi,
I managed to run the dask-ec2 up command and to ssh into the head node.
I start IPython and execute the command "from distributed import Client, s3, progress", and get a cannot import name 's3' error. Here is the full traceback:
ImportError Traceback (most recent call last)
in ()
----> 1 from distributed import Executor, s3, progress
ImportError: cannot import name 's3'
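For context, later releases of distributed appear to have removed the s3 module in favor of the s3fs library, with S3 reading handled through dask itself. A non-authoritative sketch of the replacement pattern (scheduler address and bucket path are placeholders):

```python
# The s3 helpers are gone from distributed in newer releases.
# Old (now fails):  from distributed import Executor, s3, progress
# New pattern, assuming dask, distributed, and s3fs are installed:
#
#   from distributed import Client, progress
#   import dask.dataframe as dd
#
#   client = Client("SCHEDULER_ADDRESS:8786")   # placeholder address
#   df = dd.read_csv("s3://my-bucket/*.csv")    # placeholder bucket
#   result = df.describe().compute()
```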
I would like to start a cluster on EC2 where my workers use processes, rather than threads.
When doing things manually I do the following:
$ dscheduler
$ dworker ADDRESS --nprocs 8
$ dworker ADDRESS --nprocs 8
$ dworker ADDRESS --nprocs 8
Is it possible to pass this keyword (and possibly others as they arise) through the dec2
command?
$ dec2 ... --nprocs 8
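Internally this could just thread the option into the worker launch command. A sketch with a hypothetical helper (not the actual dec2 salt template):

```python
def worker_command(scheduler, nprocs=1, extra_args=()):
    """Build the dworker invocation for one node, passing through
    keyword options such as --nprocs."""
    cmd = ["dworker", scheduler, "--nprocs", str(nprocs)]
    cmd.extend(extra_args)
    return " ".join(cmd)

print(worker_command("192.168.0.1:8786", nprocs=8))
# → dworker 192.168.0.1:8786 --nprocs 8
```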
The last commit to master doesn't pass CI; however, the failure doesn't seem to be related to anything that commit changed.
Running dscheduler
now also starts up a Bokeh web application to visualize the state of the cluster if Bokeh is installed. In order to make this web server visible to the outside world we need to tell it the address by which users will refer to it, likely the publicly accessible IP address of the head node. We can do this by passing the address under the --host ADDRESS
keyword to dscheduler like so:
dscheduler --host EXTERNALLY_VISIBLE_ADDRESS
Is this address easily available within the salt configuration file that specifies how to launch the scheduler?
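If the public address is available as a grain on the head node, the scheduler template could interpolate it directly. A hedged Jinja sketch (the grain key for the public IP is an assumption and may differ by provider):

```jinja
{# hypothetical fragment for templates/dask-scheduler.conf;
   'ec2:public-ipv4' is an assumed grain key, not verified #}
[program:dask-scheduler]
command=dscheduler --host {{ salt['grains.get']('ec2:public-ipv4', grains['fqdn_ip4'][0]) }}
```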
Install Anaconda
Firefox: https://www.continuum.io/downloads
Firefox: Click "PYTHON 2.7" "LINUX 64-BIT"
$: bash ~/Downloads/Anaconda2-4.1.1-Linux-x86_64.sh
$: conda update conda
$: conda update anaconda
$: conda config --add channels conda-forge
Create Conda Environment
$: conda create --name CT python=2
$: source activate CT
Install dask-ec2
$: conda install dask-ec2
Use dask-ec2
$: dask-ec2 up --keyname DaskDistributed --keypair ~/data/Safe/DaskDistributed.pem
Leads to the already documented issue:
CherryPy Error; JSON Parsing Error from Pepper #25
Install latest dask-ec2 version from GitHub (9/15/2016)
$: cd ~/sw
$: git clone https://github.com/dask/dask-ec2.git
$: cd dask-ec2
$: python setup.py install
$: cd ~
List conda packages
$: conda list
boto3 1.3.1 py27_0
botocore 1.4.49 py27_0 conda-forge
ca-certificates 2016.8.31 0 conda-forge
certifi 2016.8.31 py27_0 conda-forge
click 6.6 py27_0 conda-forge
docutils 0.12 py27_0 conda-forge
futures 3.0.5 py27_0 conda-forge
jmespath 0.9.0 py27_0 conda-forge
ncurses 5.9 9 conda-forge
openssl 1.0.2h 2 conda-forge
paramiko 1.17.2 py27_0 conda-forge
pip 8.1.2 py27_0 conda-forge
pycrypto 2.6.1 py27_0 conda-forge
python 2.7.12 1 conda-forge
python-dateutil 2.5.3 py27_0 conda-forge
pyyaml 3.11 py27_0 conda-forge
readline 6.2 0 conda-forge
setuptools 26.1.1 py27_0 conda-forge
six 1.10.0 py27_0 conda-forge
sqlite 3.13.0 1 conda-forge
tk 8.5.19 0 conda-forge
wheel 0.29.0 py27_0 conda-forge
yaml 0.1.6 0 conda-forge
zlib 1.2.8 3 conda-forge
dask-ec2 0.3.0+6.g95eb53a <pip>
ecdsa 0.13 py27_0
Use dask-ec2
$: dask-ec2 up --keyname DaskDistributed --keypair ~/data/Safe/DaskDistributed.pem
Leads to a PepperException failure:
Uploading conda settings
DEBUG: Uploading file /tmp/tmp_8AnN_ to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '52.3.243.195'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.3.243.195'
Traceback (most recent call last):
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/cli/main.py", line 23, in start
cli(obj={})
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 716, in __call__
return self.main(*args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 1060, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 889, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/cli/main.py", line 165, in up
ctx.invoke(provision, filepath=filepath, anaconda_=anaconda_, dask=dask, notebook=notebook, nprocs=nprocs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/cli/main.py", line 306, in provision
ctx.invoke(anaconda, filepath=filepath)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/cli/main.py", line 326, in anaconda
output = cluster.salt_call("*", "state.sls", ["conda"])
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/cluster.py", line 65, in salt_call
return self.pepper.local(target, module, args)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/libpepper.py", line 226, in local
return self.low([low], path='/')
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/libpepper.py", line 201, in low
return self.req(path, lowstate)
File "/home/drh/sw/anaconda2/envs/CT/lib/python2.7/site-packages/dask_ec2-0.3.0+6.g95eb53a-py2.7.egg/dask_ec2/libpepper.py", line 145, in req
raise PepperException('Server error.')
PepperException: Server error.
mrocklin@notebook:~$ dask-ec2 up --keyname mrocklin --keypair ~/.ssh/anaconda-cluster.pem --name dask --type m4.2xlarge --count 10 --no-dask --anaconda
Launching nodes
DEBUG: Checking that keyname 'mrocklin' exists on EC2
DEBUG: Checking that security group 'dask-ec2-default' exists on EC2
DEBUG: Creating 10 instances on EC2
DEBUG: Tagging instance 'i-e6383860'
DEBUG: Tagging instance 'i-e7383861'
DEBUG: Tagging instance 'i-e4383862'
DEBUG: Tagging instance 'i-e5383863'
DEBUG: Tagging instance 'i-e0383866'
DEBUG: Tagging instance 'i-ef383869'
DEBUG: Tagging instance 'i-ea38386c'
DEBUG: Tagging instance 'i-eb38386d'
DEBUG: Tagging instance 'i-e838386e'
DEBUG: Tagging instance 'i-e938386f'
Checking SSH connection to nodes
DEBUG: Checking ssh connection for 54.198.166.234
DEBUG: Attempt 1/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 54.198.166.234
DEBUG: Running command bash -c 'ls' on '54.198.166.234'
DEBUG: Checking ssh connection for 54.174.132.155
DEBUG: Running command bash -c 'ls' on '54.174.132.155'
DEBUG: Checking ssh connection for 54.237.196.140
DEBUG: Running command bash -c 'ls' on '54.237.196.140'
DEBUG: Checking ssh connection for 54.237.202.33
DEBUG: Running command bash -c 'ls' on '54.237.202.33'
DEBUG: Checking ssh connection for 54.198.219.79
DEBUG: Running command bash -c 'ls' on '54.198.219.79'
DEBUG: Checking ssh connection for 54.237.196.167
DEBUG: Running command bash -c 'ls' on '54.237.196.167'
DEBUG: Checking ssh connection for 54.235.231.162
DEBUG: Running command bash -c 'ls' on '54.235.231.162'
DEBUG: Checking ssh connection for 54.224.65.71
DEBUG: Running command bash -c 'ls' on '54.224.65.71'
DEBUG: Checking ssh connection for 54.237.196.172
DEBUG: Running command bash -c 'ls' on '54.237.196.172'
DEBUG: Checking ssh connection for 54.237.200.199
DEBUG: Running command bash -c 'ls' on '54.237.200.199'
+-------------------+-----------+
| Node IP | SSH check |
+===================+===========+
| 54.237.196.167:22 | True |
| 54.237.202.33:22 | True |
| 54.237.200.199:22 | True |
| 54.237.196.140:22 | True |
| 54.237.196.172:22 | True |
| 54.198.219.79:22 | True |
| 54.224.65.71:22 | True |
| 54.198.166.234:22 | True |
| 54.235.231.162:22 | True |
| 54.174.132.155:22 | True |
+-------------------+-----------+
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'pip install PyOpenSSL' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'salt-call --local tls.create_self_signed_cert' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/rest_cherrypy.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/rest_cherrypy.conf' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/external_auth.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/external_auth.conf' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'id -u saltdev &>/dev/null || useradd -p $(openssl passwd -1 saltdev) saltdev' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'service salt-master restart' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'service salt-api restart' on '54.198.166.234'
Bootstraping salt minions
DEBUG: Installing salt-minion on all the nodes
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-3 stable' on '54.237.202.33'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-4 stable' on '54.198.219.79'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-2 stable' on '54.237.196.140'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-1 stable' on '54.174.132.155'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-0 stable' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-7 stable' on '54.224.65.71'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-9 stable' on '54.237.200.199'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-5 stable' on '54.237.196.167'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-8 stable' on '54.237.196.172'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.198.166.234 -i node-6 stable' on '54.235.231.162'
DEBUG: Configuring salt-mine on the salt minions
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.237.196.167'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.237.196.172'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.224.65.71'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.174.132.155'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.237.202.33'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.198.219.79'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.237.196.140'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.237.200.199'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.235.231.162'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.237.196.167'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.237.196.172'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.224.65.71'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.174.132.155'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.219.79'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.237.202.33'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.237.196.140'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.237.200.199'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.235.231.162'
DEBUG: Restarting the salt-minion service
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.198.219.79'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.237.200.199'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.237.196.167'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.237.202.33'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.174.132.155'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.237.196.140'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.224.65.71'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.235.231.162'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.237.196.172'
Uploading salt formulas
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt to /srv/salt
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/salt' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Creating directory /tmp/.__tmp_copy/supervisor mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/supervisor to /tmp/.__tmp_copy/supervisor
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/supervisor' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/supervisor/settings.sls to /tmp/.__tmp_copy/supervisor/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/supervisor/init.sls to /tmp/.__tmp_copy/supervisor/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/system to /tmp/.__tmp_copy/system
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy/system/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/system/templates to /tmp/.__tmp_copy/system/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system/templates' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/system/templates/limits.conf to /tmp/.__tmp_copy/system/templates/limits.conf
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/system/macros.sls to /tmp/.__tmp_copy/system/macros.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/system/base.sls to /tmp/.__tmp_copy/system/base.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java to /tmp/.__tmp_copy/java
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/openjdk to /tmp/.__tmp_copy/java/openjdk
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/openjdk/env.sls to /tmp/.__tmp_copy/java/openjdk/env.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/openjdk/templates to /tmp/.__tmp_copy/java/openjdk/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk/templates' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/openjdk/templates/java.sh to /tmp/.__tmp_copy/java/openjdk/templates/java.sh
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/openjdk/settings.sls to /tmp/.__tmp_copy/java/openjdk/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/openjdk/init.sls to /tmp/.__tmp_copy/java/openjdk/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/settings.sls to /tmp/.__tmp_copy/java/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/debug.sls to /tmp/.__tmp_copy/java/debug.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/java/init.sls to /tmp/.__tmp_copy/java/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter to /tmp/.__tmp_copy/jupyter
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter/templates to /tmp/.__tmp_copy/jupyter/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/templates' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter/templates/jupyter-notebook.conf to /tmp/.__tmp_copy/jupyter/templates/jupyter-notebook.conf
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter/settings.sls to /tmp/.__tmp_copy/jupyter/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter/init.sls to /tmp/.__tmp_copy/jupyter/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/notebook mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter/notebook to /tmp/.__tmp_copy/jupyter/notebook
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/notebook' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/jupyter/notebook/init.sls to /tmp/.__tmp_copy/jupyter/notebook/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/conda to /tmp/.__tmp_copy/conda
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy/conda/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/conda/templates to /tmp/.__tmp_copy/conda/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda/templates' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/conda/templates/conda.sh to /tmp/.__tmp_copy/conda/templates/conda.sh
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/conda/settings.sls to /tmp/.__tmp_copy/conda/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/conda/debug.sls to /tmp/.__tmp_copy/conda/debug.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/conda/init.sls to /tmp/.__tmp_copy/conda/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask to /tmp/.__tmp_copy/dask
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed to /tmp/.__tmp_copy/dask/distributed
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/worker mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/worker to /tmp/.__tmp_copy/dask/distributed/worker
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/worker' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/worker/init.sls to /tmp/.__tmp_copy/dask/distributed/worker/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/templates to /tmp/.__tmp_copy/dask/distributed/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/templates' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/templates/dworker.conf to /tmp/.__tmp_copy/dask/distributed/templates/dworker.conf
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/templates/dscheduler.conf to /tmp/.__tmp_copy/dask/distributed/templates/dscheduler.conf
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/scheduler mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/scheduler to /tmp/.__tmp_copy/dask/distributed/scheduler
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/scheduler' on '54.198.166.234'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/scheduler/init.sls to /tmp/.__tmp_copy/dask/distributed/scheduler/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/settings.sls to /tmp/.__tmp_copy/dask/distributed/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/debug.sls to /tmp/.__tmp_copy/dask/distributed/debug.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/salt/dask/distributed/init.sls to /tmp/.__tmp_copy/dask/distributed/init.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/salt' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/pillar to /srv/pillar
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/pillar' on '54.198.166.234'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.5/site-packages/dask_ec2-0.2.0+6.g85c14de-py3.5.egg/dask_ec2/formulas/pillar/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/pillar' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
Uploading conda settings
DEBUG: Uploading file /tmp/tmpn7x_nwn_ to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '54.198.166.234'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.198.166.234'
+------------------------------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+==============================+======================+=================+
| node-8 | 1 | 5 |
| node-6 | 1 | 5 |
| node-2 | 1 | 5 |
| node-4 | 1 | 5 |
| node-7 | 1 | 5 |
| node-0 | 1 | 5 |
| ip-172-31-3-211.ec2.internal | 1 | 5 |
| ip-172-31-3-205.ec2.internal | 1 | 5 |
| node-1 | 1 | 5 |
| node-5 | 1 | 5 |
+------------------------------+----------------------+-----------------+
Failed states for 'node-8'
cmd | miniconda-pip | /opt/anaconda//bin/conda install pip -y -q | run: One or more requisite failed: conda.miniconda-install
cmd | miniconda-install | bash /tmp/miniconda.sh -b -p /opt/anaconda/ | run: One or more requisite failed: conda.miniconda-download
file | /etc/profile.d/conda.sh | /etc/profile.d/conda.sh | managed: One or more requisite failed: conda.miniconda-install
cmd | remove-anconda | /opt/anaconda//bin/conda remove anaconda || true | run: One or more requisite failed: conda.miniconda-install
cmd | miniconda-download | curl https://repo.continuum.io/archive/Anaconda3-2.5.0-Linux-x86_64.sh > /tmp/miniconda.sh | run: Command "curl https://repo.continuum.io/archive/Anaconda3-2.5.0-Linux-x86_64.sh > /tmp/miniconda.sh" run
[identical failed states repeated verbatim for 'node-6', 'node-2', 'node-4', 'node-7', 'node-0', 'ip-172-31-3-211.ec2.internal', 'ip-172-31-3-205.ec2.internal', 'node-1', and 'node-5']
Installing Jupyter notebook on the head node
+------------------------------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+==============================+======================+=================+
| node-8 | 5 | 8 |
| node-6 | 5 | 8 |
| node-2 | 5 | 8 |
| node-4 | 5 | 8 |
| node-7 | 5 | 8 |
| node-0 | 5 | 8 |
| ip-172-31-3-211.ec2.internal | 5 | 8 |
| ip-172-31-3-205.ec2.internal | 5 | 8 |
| node-1 | 5 | 8 |
| node-5 | 5 | 8 |
+------------------------------+----------------------+-----------------+
Failed states for 'node-8'
supervisord | notebook-running | jupyter-notebook | running: jupyter-notebook: ERROR (no such file)
cmd | jupyter-install | /opt/anaconda//bin/conda install jupyter -y -q | run: One or more requisite failed: conda.remove-anconda, conda./etc/profile.d/conda.sh, conda.miniconda-install, conda.miniconda-pip, conda.miniconda-download
cmd | miniconda-download | curl https://repo.continuum.io/archive/Anaconda3-2.5.0-Linux-x86_64.sh > /tmp/miniconda.sh | run: Command "curl https://repo.continuum.io/archive/Anaconda3-2.5.0-Linux-x86_64.sh > /tmp/miniconda.sh" run
cmd | miniconda-pip | /opt/anaconda//bin/conda install pip -y -q | run: One or more requisite failed: conda.miniconda-install
cmd | miniconda-install | bash /tmp/miniconda.sh -b -p /opt/anaconda/ | run: One or more requisite failed: conda.miniconda-download
file | /etc/profile.d/conda.sh | /etc/profile.d/conda.sh | managed: One or more requisite failed: conda.miniconda-install
cmd | remove-anconda | /opt/anaconda//bin/conda remove anaconda || true | run: One or more requisite failed: conda.miniconda-install
cmd | notebook-restart-if-change | /usr/bin/supervisorctl -c /etc/supervisor/supervisord.conf restart jupyter-notebook | wait: One or more requisite failed: jupyter.jupyter-install
[identical failed states repeated verbatim for 'node-6', 'node-2', 'node-4', 'node-7', 'node-0', 'ip-172-31-3-211.ec2.internal', 'ip-172-31-3-205.ec2.internal', 'node-1', and 'node-5']
Jupyter notebook available at http://54.198.166.234:8888/
mrocklin@notebook:~$
When running `dask-ec2 up`, the `tags` option is effectively required, since tag verification and parsing does not handle the `None` default value.
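The failure mode can be sketched as follows. This is a hypothetical minimal reproduction, not the actual dask-ec2 code: the `parse_tags` function name and the `key:value` tag format are assumptions made for illustration.

```python
# Hypothetical sketch of the bug: tag parsing that assumes an iterable
# of "key:value" strings fails when the option defaults to None.
def parse_tags(tags):
    # Raises TypeError ('NoneType' object is not iterable) when tags is None
    return dict(t.split(":") for t in tags)

# A None-tolerant variant that treats a missing --tags option as "no tags":
def parse_tags_safe(tags):
    return dict(t.split(":") for t in tags) if tags else {}

print(parse_tags_safe(None))          # {}
print(parse_tags_safe(["env:dev"]))   # {'env': 'dev'}
```

Guarding the parse (or defaulting the option to an empty sequence rather than `None`) would let `dask-ec2 up` run without tags.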
Hello! I've tried a few times to launch a small cluster with dask-ec2 and I receive some errors near the tail end of the debug logging. I run the following:
dask-ec2 up --keyname MyFirstKey --keypair ~/.ssh/MyFirstKey.pem --type t2.micro --volume-size 10
And towards the very end of the logging output I see the following:
Uploading conda settings
DEBUG: Uploading file /var/folders/m5/nrknxtfx3s5dz2b3ztwzkbg00000gn/T/tmpggu9pm2u to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '54.88.149.107'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.88.149.107'
+---------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+=========+======================+=================+
| node-0 | 6 | 0 |
| node-2 | 6 | 0 |
| node-1 | 6 | 0 |
| node-3 | 6 | 0 |
+---------+----------------------+-----------------+
DEBUG: Uploading file /var/folders/m5/nrknxtfx3s5dz2b3ztwzkbg00000gn/T/tmpvlm6hbzb to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/dask.sls' on '54.88.149.107'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.88.149.107'
Installing scheduler
+---------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+=========+======================+=================+
| node-0 | 16 | 0 |
+---------+----------------------+-----------------+
Installing workers
+---------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+=========+======================+=================+
| node-2 | 10 | 6 |
| node-1 | 10 | 6 |
| node-3 | 10 | 6 |
+---------+----------------------+-----------------+
Failed states for 'node-2'
supervisord | dworker-running | dworker | running: One or more requisite failed: dask.distributed.dask-install, dask.distributed.distributed-install, dask.distributed.worker.dworker-update-supervisor, dask.distributed.update-pandas, dask.distributed.worker.dworker.conf
pip | dask-install | dask | installed: An importable pip module is required but could not be found on your system. This usually means that the system's pip package is not installed properly.
cmd | update-pandas | /opt/anaconda//bin/conda update pandas | run: One or more requisite failed: dask.distributed.distributed-install
file | dworker.conf | /etc/supervisor/conf.d//dworker.conf | managed: One or more requisite failed: dask.distributed.dask-install, dask.distributed.distributed-install, dask.distributed.update-pandas
cmd | dworker-update-supervisor | /usr/bin/supervisorctl -c /etc/supervisor/supervisord.conf update && sleep 2 | wait: One or more requisite failed: dask.distributed.worker.dworker.conf
pip | distributed-install | distributed | installed: An importable pip module is required but could not be found on your system. This usually means that the system's pip package is not installed properly.
Failed states for 'node-1'
supervisord | dworker-running | dworker | running: One or more requisite failed: dask.distributed.dask-install, dask.distributed.distributed-install, dask.distributed.worker.dworker-update-supervisor, dask.distributed.update-pandas, dask.distributed.worker.dworker.conf
pip | dask-install | dask | installed: An importable pip module is required but could not be found on your system. This usually means that the system's pip package is not installed properly.
cmd | update-pandas | /opt/anaconda//bin/conda update pandas | run: One or more requisite failed: dask.distributed.distributed-install
file | dworker.conf | /etc/supervisor/conf.d//dworker.conf | managed: One or more requisite failed: dask.distributed.dask-install, dask.distributed.distributed-install, dask.distributed.update-pandas
cmd | dworker-update-supervisor | /usr/bin/supervisorctl -c /etc/supervisor/supervisord.conf update && sleep 2 | wait: One or more requisite failed: dask.distributed.worker.dworker.conf
pip | distributed-install | distributed | installed: An importable pip module is required but could not be found on your system. This usually means that the system's pip package is not installed properly.
Failed states for 'node-3'
supervisord | dworker-running | dworker | running: One or more requisite failed: dask.distributed.dask-install, dask.distributed.distributed-install, dask.distributed.worker.dworker-update-supervisor, dask.distributed.update-pandas, dask.distributed.worker.dworker.conf
pip | dask-install | dask | installed: An importable pip module is required but could not be found on your system. This usually means that the system's pip package is not installed properly.
cmd | update-pandas | /opt/anaconda//bin/conda update pandas | run: One or more requisite failed: dask.distributed.distributed-install
file | dworker.conf | /etc/supervisor/conf.d//dworker.conf | managed: One or more requisite failed: dask.distributed.dask-install, dask.distributed.distributed-install, dask.distributed.update-pandas
cmd | dworker-update-supervisor | /usr/bin/supervisorctl -c /etc/supervisor/supervisord.conf update && sleep 2 | wait: One or more requisite failed: dask.distributed.worker.dworker.conf
pip | distributed-install | distributed | installed: An importable pip module is required but could not be found on your system. This usually means that the system's pip package is not installed properly.
When I ssh into the head node (scheduler?) I can submit jobs, but futures seem to stay in 'pending' status:
In [1]: from distributed import Executor, s3, progress
In [2]: e = Executor('127.0.0.1:8786')
In [7]: def inc(x):
...: return x + 1
...: rez1 = e.submit(inc, 1)
...: rez2 = e.submit(inc, 2)
...:
In [8]: rez1
Out[8]: <Future: status: pending, key: inc-034e352530d02eccdc76c6ce4a799c5c>
In [9]: rez2
Out[9]: <Future: status: pending, key: inc-49e469fc400d788ae4cb493758329b9a>
I launched the web UI and it looks as if the jobs were received, but nothing is happening to them.
I'm running python 3.5.2 with the standard anaconda distro, dask v0.11.0, distributed v1.12.2, dask-ec2 v0.3.1.
I would be happy to add the full logging output if helpful. Thanks!
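The worker failures above all trace back to Salt's pip states reporting that no importable pip module could be found, which then cascades into every dependent state. A quick way to reproduce that precondition check by hand on a node (a sketch, not what Salt actually runs internally):

```python
import importlib.util

def has_importable_pip():
    # Salt's pip states need an importable `pip` module in the Python they
    # target; this mirrors that precondition without importing pip itself.
    return importlib.util.find_spec("pip") is not None

print(has_importable_pip())
```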
mrocklin@workstation:~/workspace/dask-ec2$ dask-ec2 provision
Checking SSH connection to nodes
DEBUG: Checking ssh connection for 54.173.16.97
DEBUG: Running command bash -c 'ls' on '54.173.16.97'
DEBUG: Checking ssh connection for 54.173.87.57
DEBUG: Running command bash -c 'ls' on '54.173.87.57'
DEBUG: Checking ssh connection for 54.167.183.231
DEBUG: Running command bash -c 'ls' on '54.167.183.231'
DEBUG: Checking ssh connection for 52.91.178.109
DEBUG: Running command bash -c 'ls' on '52.91.178.109'
DEBUG: Checking ssh connection for 54.173.112.197
DEBUG: Running command bash -c 'ls' on '54.173.112.197'
DEBUG: Checking ssh connection for 54.167.185.99
DEBUG: Running command bash -c 'ls' on '54.167.185.99'
DEBUG: Checking ssh connection for 54.165.69.199
DEBUG: Running command bash -c 'ls' on '54.165.69.199'
DEBUG: Checking ssh connection for 54.82.177.20
DEBUG: Running command bash -c 'ls' on '54.82.177.20'
+-------------------+-----------+
| Node IP | SSH check |
+===================+===========+
| 54.173.87.57:22 | True |
| 54.167.183.231:22 | True |
| 52.91.178.109:22 | True |
| 54.173.112.197:22 | True |
| 54.82.177.20:22 | True |
| 54.165.69.199:22 | True |
| 54.173.16.97:22 | True |
| 54.167.185.99:22 | True |
+-------------------+-----------+
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'apt-get install -y python-pip' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'pip install PyOpenSSL' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'salt-call --local tls.create_self_signed_cert' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/rest_cherrypy.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/rest_cherrypy.conf' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/external_auth.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/external_auth.conf' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'id -u saltdev &>/dev/null || useradd -p $(openssl passwd -1 saltdev) saltdev' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'service salt-master restart' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'service salt-api restart' on '54.173.16.97'
Bootstraping salt minions
DEBUG: Installing salt-minion on all the nodes
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-2 stable' on '54.167.183.231'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-7 stable' on '54.82.177.20'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-0 stable' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-1 stable' on '54.173.87.57'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-3 stable' on '52.91.178.109'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-6 stable' on '54.165.69.199'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-4 stable' on '54.173.112.197'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 54.173.16.97 -i node-5 stable' on '54.167.185.99'
DEBUG: Configuring salt-mine on the salt minions
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.167.183.231'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.173.112.197'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.173.87.57'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.167.185.99'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.82.177.20'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '54.165.69.199'
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '52.91.178.109'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.167.183.231'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.82.177.20'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.112.197'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.87.57'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.167.185.99'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.165.69.199'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.91.178.109'
DEBUG: Restarting the salt-minion service
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.167.183.231'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '52.91.178.109'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.173.87.57'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.165.69.199'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.82.177.20'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.173.112.197'
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '54.167.185.99'
Uploading salt formulas
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt to /srv/salt
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/salt' on '54.173.16.97'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Creating directory /tmp/.__tmp_copy/system mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/system to /tmp/.__tmp_copy/system
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/system/base.sls to /tmp/.__tmp_copy/system/base.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/system/macros.sls to /tmp/.__tmp_copy/system/macros.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/system/templates to /tmp/.__tmp_copy/system/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system/templates' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/system/templates/limits.conf to /tmp/.__tmp_copy/system/templates/limits.conf
DEBUG: Creating directory /tmp/.__tmp_copy/conda mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/conda to /tmp/.__tmp_copy/conda
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/conda/init.sls to /tmp/.__tmp_copy/conda/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/conda/debug.sls to /tmp/.__tmp_copy/conda/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/conda/templates to /tmp/.__tmp_copy/conda/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda/templates' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/conda/templates/conda.sh to /tmp/.__tmp_copy/conda/templates/conda.sh
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/conda/settings.sls to /tmp/.__tmp_copy/conda/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java to /tmp/.__tmp_copy/java
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/init.sls to /tmp/.__tmp_copy/java/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/debug.sls to /tmp/.__tmp_copy/java/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/openjdk to /tmp/.__tmp_copy/java/openjdk
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/openjdk/env.sls to /tmp/.__tmp_copy/java/openjdk/env.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/openjdk/init.sls to /tmp/.__tmp_copy/java/openjdk/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/openjdk/templates to /tmp/.__tmp_copy/java/openjdk/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk/templates' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/openjdk/templates/java.sh to /tmp/.__tmp_copy/java/openjdk/templates/java.sh
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/openjdk/settings.sls to /tmp/.__tmp_copy/java/openjdk/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/java/settings.sls to /tmp/.__tmp_copy/java/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/supervisor mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/supervisor to /tmp/.__tmp_copy/supervisor
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/supervisor' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/supervisor/init.sls to /tmp/.__tmp_copy/supervisor/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/supervisor/settings.sls to /tmp/.__tmp_copy/supervisor/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter to /tmp/.__tmp_copy/jupyter
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter/init.sls to /tmp/.__tmp_copy/jupyter/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter/templates to /tmp/.__tmp_copy/jupyter/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/templates' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter/templates/jupyter-notebook.conf to /tmp/.__tmp_copy/jupyter/templates/jupyter-notebook.conf
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/notebook mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter/notebook to /tmp/.__tmp_copy/jupyter/notebook
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/notebook' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter/notebook/init.sls to /tmp/.__tmp_copy/jupyter/notebook/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/jupyter/settings.sls to /tmp/.__tmp_copy/jupyter/settings.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask to /tmp/.__tmp_copy/dask
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask' on '54.173.16.97'
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed to /tmp/.__tmp_copy/dask/distributed
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/init.sls to /tmp/.__tmp_copy/dask/distributed/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/debug.sls to /tmp/.__tmp_copy/dask/distributed/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/templates mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/templates to /tmp/.__tmp_copy/dask/distributed/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/templates' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/templates/dscheduler.conf to /tmp/.__tmp_copy/dask/distributed/templates/dscheduler.conf
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/templates/dworker.conf to /tmp/.__tmp_copy/dask/distributed/templates/dworker.conf
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/worker mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/worker to /tmp/.__tmp_copy/dask/distributed/worker
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/worker' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/worker/init.sls to /tmp/.__tmp_copy/dask/distributed/worker/init.sls
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/settings.sls to /tmp/.__tmp_copy/dask/distributed/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/scheduler mode=511
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/scheduler to /tmp/.__tmp_copy/dask/distributed/scheduler
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/scheduler' on '54.173.16.97'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/salt/dask/distributed/scheduler/init.sls to /tmp/.__tmp_copy/dask/distributed/scheduler/init.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/salt' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
DEBUG: Uploading directory /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/pillar to /srv/pillar
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/pillar' on '54.173.16.97'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/formulas/pillar/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/pillar' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
Uploading conda settings
DEBUG: Uploading file /tmp/tmpeais13mn to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '54.173.16.97'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.173.16.97'
Traceback (most recent call last):
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/libpepper.py", line 133, in req
f = urlopen(req, context=con)
File "/home/mrocklin/Software/anaconda/lib/python3.4/urllib/request.py", line 161, in urlopen
return opener.open(url, data, timeout)
File "/home/mrocklin/Software/anaconda/lib/python3.4/urllib/request.py", line 470, in open
response = meth(req, response)
File "/home/mrocklin/Software/anaconda/lib/python3.4/urllib/request.py", line 580, in http_response
'http', request, response, code, msg, hdrs)
File "/home/mrocklin/Software/anaconda/lib/python3.4/urllib/request.py", line 508, in error
return self._call_chain(*args)
File "/home/mrocklin/Software/anaconda/lib/python3.4/urllib/request.py", line 442, in _call_chain
result = func(*args)
File "/home/mrocklin/Software/anaconda/lib/python3.4/urllib/request.py", line 588, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 500: Internal Server Error
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/cli/main.py", line 23, in start
cli(obj={})
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/core.py", line 716, in __call__
return self.main(*args, **kwargs)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/core.py", line 1060, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/core.py", line 889, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/cli/main.py", line 306, in provision
ctx.invoke(anaconda, filepath=filepath)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/cli/main.py", line 326, in anaconda
output = cluster.salt_call("*", "state.sls", ["conda"])
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/cluster.py", line 65, in salt_call
return self.pepper.local(target, module, args)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/libpepper.py", line 226, in local
return self.low([low], path='/')
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/libpepper.py", line 201, in low
return self.req(path, lowstate)
File "/home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+6.g95eb53a-py3.4.egg/dask_ec2/libpepper.py", line 145, in req
raise PepperException('Server error.')
dask_ec2.libpepper.PepperException: Server error.
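The HTTP 500 here can happen when the first pepper request races the salt-api service restart issued a moment earlier. A generic retry helper (purely illustrative; dask-ec2 has no such option) would paper over that kind of transient failure:

```python
import time

def retry(fn, attempts=3, delay=2.0):
    # Retry a flaky call (e.g. a request to a just-restarted salt-api) a few
    # times, re-raising the last error if every attempt fails.
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)
```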
I'm not sure how to destroy the cluster that I've just launched. As a user this causes some anxiety about costs.
(27)mrocklin@workstation:~$ dec2
Usage: dec2 [OPTIONS] COMMAND [ARGS]...
Options:
--version Show the version and exit.
-h, --help Show this message and exit.
Commands:
cloudera-manager Start a Cloudera manager cluster
dask-distributed Start a dask.distributed cluster
provision Provision salt instances
ssh SSH to one of the node. 0-index
up Launch instances
Right now, using Python 3 on the client will create a Python 2 distributed cluster, which doesn't work because of pickle incompatibilities.
We need to change the conda formula to take a setting for this and upload the pillars accordingly.
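The incompatibility shows up at the pickle-protocol level alone: Python 3 defaulted to protocol 3 at the time, which Python 2 cannot read (a minimal stdlib illustration; the real cluster failures also involve cloudpickle, and matching interpreter versions is the actual fix):

```python
import pickle

# Protocol 2 round-trips on both Python 2 and 3; protocol 3 and above are
# Python 3 only, so a Python 2 worker cannot load what a Python 3 client sends.
data = {"x": 1}
compatible = pickle.dumps(data, protocol=2)
assert pickle.loads(compatible) == data
```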
Both of them are ignored in EC2.init(), with an error like the following:
DEBUG: Searching for default subnet in VPC vpc-d82ebebc
ERROR: There is no default subnet on VPC vpc-d82ebebc, please pass a subne
cc @mrocklin
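When the VPC has no default subnet, picking one explicitly and passing it on the command line avoids the error. A small helper over describe_subnets()-style records (hypothetical; the field names follow the EC2 API, and how the chosen subnet is passed to dask-ec2 is an assumption):

```python
def pick_subnet(subnets, vpc_id):
    # Prefer the VPC's default subnet; otherwise fall back to any subnet in
    # the VPC, whose SubnetId can then be supplied explicitly.
    in_vpc = [s for s in subnets if s["VpcId"] == vpc_id]
    default = [s for s in in_vpc if s.get("DefaultForAz")]
    return (default or in_vpc or [None])[0]
```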
On a vanilla dask-ec2 cluster I can't read data from S3 using the standard dd.read_csv syntax. It appears that this is because s3fs is not installed.
ubuntu@ip-172-31-29-138:~$ ipython
Python 2.7.12 |Continuum Analytics, Inc.| (default, Jul 2 2016, 17:42:40)
Type "copyright", "credits" or "license" for more information.
IPython 4.0.3 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
In [1]: import s3fs
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-1-e035bbfffb37> in <module>()
----> 1 import s3fs
ImportError: No module named s3fs
In [2]: quit()
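Until s3fs ships with the image, a guard that fails with an actionable message is handy (a sketch; the install hint assumes the cluster's Anaconda lives at /opt/anaconda as in the provisioning logs above):

```python
def require_s3fs():
    # dd.read_csv("s3://...") needs s3fs on every node; fail early with a
    # clear hint instead of an ImportError deep inside dask.
    try:
        import s3fs  # noqa: F401
    except ImportError:
        raise RuntimeError(
            "s3fs is missing; run '/opt/anaconda/bin/conda install -y s3fs' "
            "on each node"
        )
```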
Hi,
While dask-ec2 was trying to install the scheduler, I ran into this error message:
Installing scheduler
Traceback (most recent call last):
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/cli/main.py", line 23, in start
cli(obj={})
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 716, in __call__
return self.main(*args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 1060, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 889, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/cli/main.py", line 165, in up
ctx.invoke(provision, filepath=filepath, anaconda=anaconda_, dask=dask, notebook=notebook, nprocs=nprocs)
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/cli/main.py", line 309, in provision
ctx.invoke(dask_install, filepath=filepath, nprocs=nprocs)
File "/home/adas/.local/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/cli/daskd.py", line 59, in dask_install
cluster.pepper.local("node-0", "grains.append", ["roles", "dask.distributed.scheduler"])
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/cluster.py", line 54, in get_pepper_client
self._pepper.login('saltdev', 'saltdev', 'pam')
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/libpepper.py", line 284, in login
'eauth': eauth
File "/home/adas/.local/lib/python2.7/site-packages/dask_ec2/libpepper.py", line 133, in req
f = urlopen(req, context=con)
File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib/python2.7/urllib2.py", line 429, in open
response = self._open(req, data)
File "/usr/lib/python2.7/urllib2.py", line 447, in _open
'_open', req)
File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
result = func(*args)
File "/usr/lib/python2.7/urllib2.py", line 1241, in https_open
context=self._context)
File "/usr/lib/python2.7/urllib2.py", line 1201, in do_open
r = h.getresponse(buffering=True)
File "/usr/lib/python2.7/httplib.py", line 1136, in getresponse
response.begin()
File "/usr/lib/python2.7/httplib.py", line 453, in begin
version, status, reason = self._read_status()
File "/usr/lib/python2.7/httplib.py", line 409, in _read_status
line = self.fp.readline(_MAXLINE + 1)
File "/usr/lib/python2.7/socket.py", line 480, in readline
data = self._sock.recv(self._rbufsize)
File "/usr/lib/python2.7/ssl.py", line 756, in recv
return self.read(buflen)
File "/usr/lib/python2.7/ssl.py", line 643, in read
v = self._sslobj.read(len)
SSLError: [SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1754)
For the CLI, the current default log level has dask_ec2 set to DEBUG and paramiko set to WARNING. While keeping the default as is, the addition of the following logging options would be helpful:
--quiet, -q: both dask_ec2 and paramiko get a log level of WARNING
--verbose, -v: both dask_ec2 and paramiko get a log level of DEBUG
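The proposed flags boil down to setting two logger levels; a sketch using the standard library (the logger names come from the report, the function itself is illustrative):

```python
import logging

def set_verbosity(quiet=False, verbose=False):
    if quiet:        # --quiet / -q
        dask_level = paramiko_level = logging.WARNING
    elif verbose:    # --verbose / -v
        dask_level = paramiko_level = logging.DEBUG
    else:            # current default
        dask_level, paramiko_level = logging.DEBUG, logging.WARNING
    logging.getLogger("dask_ec2").setLevel(dask_level)
    logging.getLogger("paramiko").setLevel(paramiko_level)

set_verbosity(quiet=True)
assert logging.getLogger("paramiko").level == logging.WARNING
```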
Because of a known (already fixed but not yet released) bug in paramiko (paramiko/paramiko#615) under Python 3, I am catching all TypeErrors in Instance.ssh_check (302f68c). This should be removed once a new version of paramiko is released and we require that version for dec2.
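A minimal sketch of the workaround described above (the function shape is illustrative, not the actual Instance.ssh_check signature): a TypeError raised inside the buggy paramiko code path is treated as a failed connection check.

```python
def ssh_check(connect):
    try:
        connect()
        return True
    except TypeError:  # workaround for paramiko/paramiko#615 on Python 3
        return False

def buggy_connect():
    raise TypeError("simulated paramiko#615 failure")

assert ssh_check(buggy_connect) is False
assert ssh_check(lambda: None) is True
```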
Hi,
I would like to know the "correct" way of using distributed dask on Amazon instances with tensorflow or scikit-learn. Do I have to install tensorflow on all the Amazon instances? Can I submit my task from my local machine, or do I have to ssh into the head node and run it from there? In my Python file, how do I run dask distributed functions? Is the following the right way?
import tensorflow as tf
from distributed import Client

c = Client()
# define some tensorflow variables and functions etc...
c.submit(some_tensorflow_function)
It seems that c = Client('127.0.0.1:8786') (connecting to the scheduler running on the head node) does not work; I get this error: OSError: Could not connect to 127.0.0.1:8786
However, if I execute c = Client() and then inspect c, it seems to work and shows the following:
<Client: scheduler="127.0.0.1:8786" processes=8 cores=8>
Am I missing something?
Hi,
When I ssh into my head node, I start IPython and do the following:
from distributed import Client
c = Client()
I got the following error:
bokeh.command.util - CRITICAL - Cannot start Bokeh server, port 8787 is already in use
It did not appear in the past. Did I do something wrong?
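One way to diagnose this conflict is to check whether something is already listening on the port before starting another Bokeh server; a stdlib-only sketch (nothing here is distributed's or bokeh's API, and 8787 from the report stands in for any port):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    # connect_ex returns 0 when something accepts the connection
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

# Demonstrate against a throwaway listener instead of the real port 8787.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
busy_port = server.getsockname()[1]
assert port_in_use(busy_port) is True
server.close()
```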
I found a Windows incompatibility in salt.py's upload_pillar.
The issue is best described here:
http://stackoverflow.com/questions/15169101/how-to-create-a-temporary-file-that-can-be-read-by-a-subprocess
Below is my modified version of the function that works on Windows.
def upload_pillar(cluster, name, data):
    import os
    import tempfile

    import yaml

    master = cluster.instances[0].ssh_client
    f = tempfile.NamedTemporaryFile('w', delete=False)
    try:
        yaml.safe_dump(data, f, default_flow_style=False)
        f.close()
        local = f.name
        remote = "/srv/pillar/{}".format(name)
        master.put(local, remote, sudo=True)
    finally:
        os.remove(f.name)
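The heart of the fix is NamedTemporaryFile(delete=False): on Windows a temporary file opened with delete=True cannot be reopened by name while it is still open, so the patch closes it first and removes it by hand. A stdlib-only illustration of the same pattern (json stands in for yaml to stay self-contained):

```python
import json
import os
import tempfile

f = tempfile.NamedTemporaryFile("w", suffix=".sls", delete=False)
try:
    json.dump({"conda": {"install_anaconda": True}}, f)
    f.close()                     # must be closed before reopening on Windows
    with open(f.name) as reader:  # stands in for master.put() reading the file
        data = json.load(reader)
finally:
    os.remove(f.name)

assert data == {"conda": {"install_anaconda": True}}
assert not os.path.exists(f.name)
```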
So that you could spin up a cluster on digital ocean, vultr or linode?
New versions of jupyter now require this by default.
Hi, I encountered this error:
Installing scheduler
+---------+----------------------+-----------------+
| Node ID | # Successful actions | # Failed action |
+=========+======================+=================+
| node-0 | 12 | 5 |
+---------+----------------------+-----------------+
Failed states for 'node-0'
file | dscheduler.conf | /etc/supervisor/conf.d//dscheduler.conf | managed: One or more requisite failed: dask.distributed.bokeh-install, dask.distributed.update-pandas
supervisord | dscheduler-running | dscheduler | running: One or more requisite failed: dask.distributed.scheduler.dscheduler.conf, dask.distributed.bokeh-install, dask.distributed.update-pandas, dask.distributed.scheduler.dscheduler-update-supervisor
cmd | bokeh-install | /opt/anaconda//bin/conda install bokeh -y -q | run: Command "/opt/anaconda//bin/conda install bokeh -y -q" run
cmd | update-pandas | /opt/anaconda//bin/conda update pandas | run: Command "/opt/anaconda//bin/conda update pandas" run
cmd | dscheduler-update-supervisor | /usr/bin/supervisorctl -c /etc/supervisor/supervisord.conf update && sleep 2 | wait: One or more requisite failed: dask.distributed.scheduler.dscheduler.conf
Hello,
In reading up on Dask.distributed I found out about dec2 as an extremely simple way to spin up a cluster on ec2. However, I'm constrained by the fact that my cluster must reside in a VPC. As far as I can tell it's not possible to do this with the script as-is.
It seems like it shouldn't be too hard to modify it myself to support a VPC though. Any advice much appreciated!
Thanks,
Dan
When trying to install the salt master I get this problem:
The directory '/home/ubuntu/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
The directory '/home/ubuntu/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
/usr/local/lib/python2.7/dist-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Command "/usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-HkHZWW/cryptography/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-mWN456-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-HkHZWW/cryptography/
/usr/local/lib/python2.7/dist-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Couldn't install PyOpenSSL. Error is above (maybe try again)
Seems to be fixed if you run the following before you upgrade pyopenssl (it seems someone has removed an old prebuilt version from PyPI):
sudo apt-get install libssl-dev libffi-dev python-dev -y
I'm getting an error about installing CherryPy because pip is not installed when running dask-ec2 with default arguments:
--snip--
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '54.209.163.215'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '54.209.163.215'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '54.209.163.215'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '54.209.163.215'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.209.163.215'
DEBUG: Attempt 1/3 of function '__install_salt_rest_api' failed
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.209.163.215'
DEBUG: Attempt 2/3 of function '__install_salt_rest_api' failed
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '54.209.163.215'
DEBUG: Attempt 3/3 of function '__install_salt_rest_api' failed
ERROR: bash: pip: command not found
Couldn't install CherryPy. Error is above (maybe try again)
--snip--
After fixing this with fabric and rerunning, there is an error parsing a server response in JSON:
--snip--
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'pip install PyOpenSSL' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'salt-call --local tls.create_self_signed_cert' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/templates/rest_cherrypy.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/rest_cherrypy.conf' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/templates/external_auth.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/external_auth.conf' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'id -u saltdev &>/dev/null || useradd -p $(openssl passwd -1 saltdev) saltdev' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'service salt-master restart' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'service salt-api restart' on '52.90.3.73'
Bootstraping salt minions
DEBUG: Installing salt-minion on all the nodes
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -P -L -A 52.90.3.73 -i node-0 stable' on '52.90.3.73'
DEBUG: Configuring salt-mine on the salt minions
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/templates/mine_functions.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/minion.d/mine.conf' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
DEBUG: Restarting the salt-minion service
DEBUG: Running command sudo -S bash -c 'service salt-minion restart' on '52.90.3.73'
Uploading salt formulas
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt to /srv/salt
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/salt' on '52.90.3.73'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Creating directory /tmp/.__tmp_copy/system mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/system to /tmp/.__tmp_copy/system
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/system/macros.sls to /tmp/.__tmp_copy/system/macros.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/system/base.sls to /tmp/.__tmp_copy/system/base.sls
DEBUG: Creating directory /tmp/.__tmp_copy/system/templates mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/system/templates to /tmp/.__tmp_copy/system/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/system/templates' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/system/templates/limits.conf to /tmp/.__tmp_copy/system/templates/limits.conf
DEBUG: Creating directory /tmp/.__tmp_copy/conda mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/conda to /tmp/.__tmp_copy/conda
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/conda/init.sls to /tmp/.__tmp_copy/conda/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/conda/settings.sls to /tmp/.__tmp_copy/conda/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/conda/templates mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/conda/templates to /tmp/.__tmp_copy/conda/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/conda/templates' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/conda/templates/conda.sh to /tmp/.__tmp_copy/conda/templates/conda.sh
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/conda/debug.sls to /tmp/.__tmp_copy/conda/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask to /tmp/.__tmp_copy/dask
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask' on '52.90.3.73'
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed to /tmp/.__tmp_copy/dask/distributed
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/init.sls to /tmp/.__tmp_copy/dask/distributed/init.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/scheduler mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/scheduler to /tmp/.__tmp_copy/dask/distributed/scheduler
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/scheduler' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/scheduler/init.sls to /tmp/.__tmp_copy/dask/distributed/scheduler/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/settings.sls to /tmp/.__tmp_copy/dask/distributed/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/templates mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/templates to /tmp/.__tmp_copy/dask/distributed/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/templates' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/templates/dworker.conf to /tmp/.__tmp_copy/dask/distributed/templates/dworker.conf
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/templates/dscheduler.conf to /tmp/.__tmp_copy/dask/distributed/templates/dscheduler.conf
DEBUG: Creating directory /tmp/.__tmp_copy/dask/distributed/worker mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/worker to /tmp/.__tmp_copy/dask/distributed/worker
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/dask/distributed/worker' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/worker/init.sls to /tmp/.__tmp_copy/dask/distributed/worker/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/dask/distributed/debug.sls to /tmp/.__tmp_copy/dask/distributed/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter to /tmp/.__tmp_copy/jupyter
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter/init.sls to /tmp/.__tmp_copy/jupyter/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter/settings.sls to /tmp/.__tmp_copy/jupyter/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/templates mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter/templates to /tmp/.__tmp_copy/jupyter/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/templates' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter/templates/jupyter-notebook.conf to /tmp/.__tmp_copy/jupyter/templates/jupyter-notebook.conf
DEBUG: Creating directory /tmp/.__tmp_copy/jupyter/notebook mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter/notebook to /tmp/.__tmp_copy/jupyter/notebook
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/jupyter/notebook' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/jupyter/notebook/init.sls to /tmp/.__tmp_copy/jupyter/notebook/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java to /tmp/.__tmp_copy/java
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/init.sls to /tmp/.__tmp_copy/java/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/settings.sls to /tmp/.__tmp_copy/java/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/openjdk to /tmp/.__tmp_copy/java/openjdk
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/openjdk/init.sls to /tmp/.__tmp_copy/java/openjdk/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/openjdk/settings.sls to /tmp/.__tmp_copy/java/openjdk/settings.sls
DEBUG: Creating directory /tmp/.__tmp_copy/java/openjdk/templates mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/openjdk/templates to /tmp/.__tmp_copy/java/openjdk/templates
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/java/openjdk/templates' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/openjdk/templates/java.sh to /tmp/.__tmp_copy/java/openjdk/templates/java.sh
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/openjdk/env.sls to /tmp/.__tmp_copy/java/openjdk/env.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/java/debug.sls to /tmp/.__tmp_copy/java/debug.sls
DEBUG: Creating directory /tmp/.__tmp_copy/supervisor mode=511
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/supervisor to /tmp/.__tmp_copy/supervisor
DEBUG: Running command sudo -S bash -c 'mkdir -p /tmp/.__tmp_copy/supervisor' on '52.90.3.73'
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/supervisor/init.sls to /tmp/.__tmp_copy/supervisor/init.sls
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/salt/supervisor/settings.sls to /tmp/.__tmp_copy/supervisor/settings.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/salt' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
DEBUG: Uploading directory /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/pillar to /srv/pillar
DEBUG: Running command sudo -S bash -c 'mkdir -p /srv/pillar' on '52.90.3.73'
DEBUG: Creating directory /tmp/.__tmp_copy mode=511
DEBUG: Uploading file /home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/formulas/pillar/top.sls to /tmp/.__tmp_copy/top.sls
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy/* /srv/pillar' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
Uploading conda settings
DEBUG: Uploading file /tmp/tmpejteBK to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /srv/pillar/conda.sls' on '52.90.3.73'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.90.3.73'
Traceback (most recent call last):
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/cli/main.py", line 23, in start
cli(obj={})
File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 716, in __call__
return self.main(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 1060, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 889, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/cli/main.py", line 299, in provision
ctx.invoke(anaconda, filepath=filepath)
File "/usr/local/lib/python2.7/dist-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/cli/main.py", line 319, in anaconda
output = cluster.salt_call("*", "state.sls", ["conda"])
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/cluster.py", line 65, in salt_call
return self.pepper.local(target, module, args)
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/cluster.py", line 54, in get_pepper_client
self._pepper.login('saltdev', 'saltdev', 'pam')
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/libpepper.py", line 284, in login
'eauth': eauth
File "/home/FOOBAR/.local/lib/python2.7/site-packages/dask_ec2/libpepper.py", line 151, in req
raise PepperException('Unable to parse the server response.')
PepperException: Unable to parse the server response.
--snip--
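The first failure above could be surfaced earlier with a pre-flight check on the master before running pip install cherrypy; a minimal sketch assuming a POSIX shell on the node (dask-ec2 does not ship this):

```shell
# Report whether pip is on PATH before attempting 'pip install cherrypy'.
if command -v pip >/dev/null 2>&1; then
    pip_status="found"
else
    pip_status="missing"
fi
echo "pip is $pip_status"
```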
When running dec2 from Python 3.4 I get a cluster running Python 3.5. The default behavior of connecting on startup causes a protocol mismatch with cloudpickle. Three solutions come to mind:
When a Salt command returns a dictionary like {u'return': [{}]}, the print_state method converts this to a None response, causing the following error:
Installing scheduler
Traceback (most recent call last):
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/dask_ec2-0.4.0+12.g8492341.dirty-py2.7.egg/dask_ec2/cli/main.py", line 26, in start
cli(obj={})
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/core.py", line 722, in __call__
return self.main(*args, **kwargs)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/core.py", line 697, in main
rv = self.invoke(ctx)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/core.py", line 1066, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/core.py", line 895, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/core.py", line 535, in invoke
return callback(*args, **kwargs)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/dask_ec2-0.4.0+12.g8492341.dirty-py2.7.egg/dask_ec2/cli/main.py", line 368, in provision
ctx.invoke(dask_install, filepath=filepath, nprocs=nprocs, source=source)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/core.py", line 535, in invoke
return callback(*args, **kwargs)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/click/decorators.py", line 17, in new_func
return f(get_current_context(), *args, **kwargs)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/dask_ec2-0.4.0+12.g8492341.dirty-py2.7.egg/dask_ec2/cli/daskd.py", line 74, in dask_install
response = print_state(output)
File "/Users/jcrail/anaconda2/envs/ec2/lib/python2.7/site-packages/dask_ec2-0.4.0+12.g8492341.dirty-py2.7.egg/dask_ec2/cli/main.py", line 399, in print_state
data.extend(response.aggregated_to_table(agg=len))
AttributeError: 'NoneType' object has no attribute 'aggregated_to_table'
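A defensive guard on the Salt reply would avoid the AttributeError; a hedged sketch (the dictionary shape comes from this report, the helper name is invented and is not dask_ec2's print_state):

```python
def aggregate_states(response):
    # A reply like {u'return': [{}]} means no states ran on any node;
    # return an empty table instead of None so callers can iterate safely.
    returns = response.get("return") or []
    if not any(returns):
        return []
    return [sorted(node_states.items()) for node_states in returns]

assert aggregate_states({u"return": [{}]}) == []
assert aggregate_states({}) == []
assert aggregate_states({"return": [{"pkg": "ok"}]}) == [[("pkg", "ok")]]
```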
It would be nice to give users a few introductory notebooks to play with when they first arrive.
One convenient way to do this would be to copy a directory of files from this repository to the head node.
When I start up a dask-ec2 cluster with this command:
dask-ec2 up \
--keyname aws_norcal_key \
--keypair ~/.ssh/aws_norcal_key.pem \
--region-name us-west-1 \
I get this error:
DEBUG: Searching for default VPC
DEBUG: Default VPC found - Using VPC ID: vpc-0e0c2d6b
DEBUG: Searching for default subnet in VPC vpc-0e0c2d6b
DEBUG: Default subnet found - Using Subnet ID: subnet-ef18fe8b
Launching nodes
DEBUG: Checking that keyname 'aws_norcal_key' exists on EC2
Unexpected EC2 error: An error occurred (InvalidAMIID.NotFound) when calling the DescribeImages operation: The image id '[ami-d05e75b8]' does not exist
I can fix it if I run:
dask-ec2 up \
--keyname aws_norcal_key \
--keypair ~/.ssh/aws_norcal_key.pem \
--region-name us-west-1 \
--ami ami-48db9d28
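The root cause is that the default AMI ID is only valid in us-east-1; AMI IDs are region-specific. A sketch of a per-region lookup (the us-west-1 ID comes from this report; both IDs will go stale as AMIs are updated):

```python
DEFAULT_AMIS = {
    "us-east-1": "ami-d05e75b8",  # the current hard-coded default
    "us-west-1": "ami-48db9d28",  # value that worked in this report
}

def default_ami(region_name):
    try:
        return DEFAULT_AMIS[region_name]
    except KeyError:
        raise ValueError(
            "no known default AMI for %s; pass --ami explicitly" % region_name)

assert default_ami("us-west-1") == "ami-48db9d28"
```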
Hi, I tried to do the following:
dask-ec2 up --keyname raymond --keypair raymond.pem --name raymond_dask --region-name eu-central-1 --ami ami-f9619996 --username ubuntu --type t2.micro
But I keep getting ssh connection errors:
Checking SSH connection to nodes
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 1/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 2/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 3/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 4/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 5/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 6/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 7/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 8/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 9/10 of function 'check_ssh' failed
DEBUG: Checking ssh connection for 35.156.103.53
DEBUG: Attempt 10/10 of function 'check_ssh' failed
ERROR: Retries limit exceeded
I find that I need to follow these instructions when using dec2 on larger clusters to increase the open-file limit for the ubuntu user:
http://www.cyberciti.biz/faq/linux-increase-the-maximum-number-of-open-files/
Notably the section titled "User Level FD Limits".
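For reference, the "User Level FD Limits" approach amounts to adding lines like these to /etc/security/limits.conf (the numbers are illustrative; pick values appropriate for your cluster size):

```
# /etc/security/limits.conf
ubuntu soft nofile 4096
ubuntu hard nofile 10240
```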
Bootstraping salt master
DEBUG: Running command sudo -S bash -c 'curl -sS -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N stable' on '52.87.254.124'
DEBUG: Running command sudo -S bash -c 'curl -L https://bootstrap.saltstack.com | sh -s -- -d -X -M -N -P -L -p salt-api stable' on '52.87.254.124'
DEBUG: Uploading file /home/mrocklin/Software/anaconda/lib/python3.4/site-packages/dask_ec2-0.3.0+5.g54cb5fd-py3.4.egg/dask_ec2/templates/auto_accept.conf to /tmp/.__tmp_copy
DEBUG: Running command sudo -S bash -c 'cp -rf /tmp/.__tmp_copy /etc/salt/master.d/auto_accept.conf' on '52.87.254.124'
DEBUG: Running command sudo -S bash -c 'rm -rf /tmp/.__tmp_copy' on '52.87.254.124'
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '52.87.254.124'
DEBUG: Attempt 1/3 of function '__install_salt_rest_api' failed
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '52.87.254.124'
DEBUG: Attempt 2/3 of function '__install_salt_rest_api' failed
DEBUG: Running command sudo -S bash -c 'pip install cherrypy' on '52.87.254.124'
DEBUG: Attempt 3/3 of function '__install_salt_rest_api' failed
ERROR: 'str' object has no attribute 'decode'
Couldn't install CherryPy. Error is above (maybe try again)
I've tried again. Things are still unhappy. Maybe a change upstream?
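The `'str' object has no attribute 'decode'` error is the classic symptom of Python-2-era code calling `.decode()` on command output that, under Python 3, is already a `str`. A minimal sketch of a defensive fix (the helper name is hypothetical, not dec2's actual code):

```python
def ensure_text(data, encoding="utf-8"):
    """Return str whether given bytes or str.

    On Python 2, remote command output usually arrived as bytes and was
    decoded unconditionally; on Python 3 it may already be str, and
    calling .decode() on it raises AttributeError. Checking the type
    first works on both.
    """
    if isinstance(data, bytes):
        return data.decode(encoding)
    return data

print(ensure_text(b"salt-api installed"))  # bytes input is decoded
print(ensure_text("salt-api installed"))   # str passes through unchanged
```

If the traceback points into dec2's SSH helper, a guard like this around the command output would make the retry loop stop failing for the wrong reason.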
mrocklin@workstation:~/workspace/dec2$ dec2 up --keyname mrocklin --keypair ~/.ssh/anaconda-cluster.pem --nprocs 4
Launching nodes
DEBUG: Checking that keyname 'mrocklin' exists on EC2
DEBUG: Checking that security group 'dec2-default' exists on EC2
DEBUG: Creating 4 instances on EC2
DEBUG: Tagging instance 'i-84a82f00'
DEBUG: Tagging instance 'i-85a82f01'
DEBUG: Tagging instance 'i-86a82f02'
DEBUG: Tagging instance 'i-87a82f03'
Checking SSH connection to nodes
DEBUG: Checking ssh connection for 52.23.171.227
ERROR: Error connecting to host '52.23.171.227:22'
timed out
Things seem to work better if, after this, I call dec2 provision. Does this mean that the timeouts are too optimistic?
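One way to make the initial check more forgiving is to back off between attempts rather than polling at a fixed cadence, since freshly launched EC2 instances can take a minute or more before sshd accepts connections. A rough sketch, not dec2's actual implementation:

```python
import socket
import time

def wait_for_ssh(host, port=22, attempts=10, base_delay=2.0):
    """Poll until the SSH port accepts TCP connections.

    Waits exponentially longer between attempts (2s, 4s, 8s, ...),
    which tolerates slow instance boot without hammering the host.
    Raises TimeoutError if every attempt fails.
    """
    for i in range(attempts):
        try:
            with socket.create_connection((host, port), timeout=5):
                return True
        except OSError:
            if i < attempts - 1:
                time.sleep(base_delay * 2 ** i)  # back off before retrying
    raise TimeoutError(
        "%s:%d not reachable after %d attempts" % (host, port, attempts)
    )
```

With ten attempts this waits up to roughly seventeen minutes in total, versus well under a minute for ten fixed short timeouts, which would cover the gap that currently forces a manual dec2 provision.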