pysph's People

Contributors

abhinavmuta, adityapb, amalss18, ananyo-work, anshumankumar, arkopaldutt, arpitragarwal, avalentino, deeptavker, dileepmenon, dineshadepu, joeweaver, kunal-puri, miloniatal, mpcsdspa, nauaneed, nellev, pawansnegi, prabhuramachandran, quang-ha, rahulgovind, sankasuraj, saru44, thisisrohan, tkoyama010, vishnusivadasan, yaaanshk

pysph's Issues

Cannot install PySPH on mac with pip no matter what I do

I have been trying to install the package using pip on my mac (OS version 10.13.5) and get an error every time. I have installed gcc and exported it as CC, and still get an error. I exported clang as CC, and again got an error. What am I doing wrong? I am clueless!

$ pip install PySPH

14 warnings and 1 error generated.
error: command '/usr/bin/clang' failed with exit status 1

----------------------------------------

Command "/usr/bin/python -u -c "import setuptools, tokenize;file='/private/var/folders/gy/bjjhbttd565_gbkystqdkm6c0000gn/T/pip-install-gqjbYq/PySPH/setup.py';f=getattr(tokenize, 'open', open)(file);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, file, 'exec'))" install --record /private/var/folders/gy/bjjhbttd565_gbkystqdkm6c0000gn/T/pip-record-v4gXlV/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /private/var/folders/gy/bjjhbttd565_gbkystqdkm6c0000gn/T/pip-install-gqjbYq/PySPH/

PySPH Install

Hello!

I am installing from the git repo and have had several problems. The instructions in the documentation no longer work cleanly, especially with regard to dependencies. After several tries and combinations of things I got it to pass, but some files in the repo seem to be missing. For example, I get the following error. Were these files removed from the distribution?

IOError: /home/virtualpyenv/local/lib/python2.7/site-packages/pyzoltan/core/tests/mesh.txt not found.

Thank you!

Keku
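
A quick check, assuming pyzoltan.core is importable (the path below comes from the error message), to see whether the test data file was simply not installed alongside the package:

import os
import pyzoltan.core as core

# List whatever made it into the installed tests directory; mesh.txt should be here.
print(os.listdir(os.path.join(os.path.dirname(core.__file__), 'tests')))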

Python Version?

Hi! I have tried various older releases as well as the newest 'master' branch with all kinds of installation options. I get several errors of one type or another in the pysph -v test (not counting skipped tests), even when all local and global dependencies are satisfied. The problem occurs even with the older releases.

Are there any recommendations as to which version of PySPH and which version of Python would be the best choice for starting a new project?

Having a good pip freeze -> requirements.txt with exact pins (package==xx.xx.xx, not >=) would help a lot, both for master and whenever a release is made. Thanks a lot... K
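
A minimal sketch of what such an exactly pinned requirements file amounts to, assuming a Python 3.8+ environment (the package names are just the usual PySPH dependencies, not an authoritative list):

from importlib import metadata

# Record the versions actually installed, as exact pins (package==x.y.z).
for name in ('numpy', 'Cython', 'mako', 'cyarray', 'compyle', 'pysph'):
    try:
        print('{}=={}'.format(name, metadata.version(name)))
    except metadata.PackageNotFoundError:
        pass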

Question: setting up Inlet-Outlet with WCSPH

I am trying to set up the following problem.

import numpy as np

from pysph.base.kernels import CubicSpline, QuinticSpline
from pysph.solver.solver import Solver
from pysph.base.utils import get_particle_array, get_particle_array_wcsph
from pysph.solver.application import Application
from pysph.sph.simple_inlet_outlet import SimpleInlet, SimpleOutlet
from pysph.sph.integrator import EPECIntegrator
from pysph.sph.integrator_step import InletOutletStep, WCSPHStep
from pysph.sph.equation import Group
from pysph.sph.wc.basic import TaitEOS, MomentumEquation
from pysph.sph.basic_equations import XSPHCorrection, ContinuityEquation

class WindTunnelApp(Application):

    def initialize(self):
        self.c0 = 1400.0
        self.rho = 1.0
        self.hdx = 1.3
        self.dx = 0.1
        self.gamma = 7.


    def add_user_options(self, group):
        group.add_argument("--speed", action="store",
                          type=float,
                          dest="speed",
                          default=14.,
                          help="Speed of inlet particles.")


    def create_particles(self):
        # Note that you need to create the inlet and outlet arrays in this method.

        # Initially fluid has no particles -- these are generated by the inlet.
        fluid = get_particle_array_wcsph(name='fluid')

        outlet = get_particle_array(name='outlet')

        # Setup the inlet particle array with just the particles we need at the
        # exit plane which is replicated by the inlet.
        dx = self.dx
        dy = self.dx
        dz = self.dx
        y = np.linspace(0, 2.85, 2.85//dy)
        z = np.linspace(0, 2.85, 2.85//dz)
        y, z = np.meshgrid(y, z)
        y = y.flatten()
        z = z.flatten()
        x = np.zeros_like(y)

        h = np.ones_like(x)*max(dx, dy, dz)*self.hdx
        rho = np.ones_like(x) * self.rho
        m = rho*dx*dy*dz

        # Remember to set u otherwise the inlet particles won't move.  Here we
        # use the options which may be set by the user from the command line.
        u = np.ones_like(x)*self.options.speed

        inlet = get_particle_array(name='inlet', x=x, y=y, z=z, m=m, h=h, u=u, rho=rho)

        box_dx = self.dx
        box_w = 0.15
        box_h = 0.003
        box_l = 1.2
        box_x = np.linspace(1., 1.+box_w, (1.+box_w)//box_dx)
        box_y = np.linspace(1., 1.+box_h, (1.+box_h)//box_dx)
        box_z = np.linspace(0., 0.+box_l, (0.+box_l)//box_dx)
        box_x, box_y, box_z = np.meshgrid(box_x, box_y, box_z)
        box_x = box_x.flatten()
        box_y = box_y.flatten()
        box_z = box_z.flatten()
        h = np.ones_like(box_x)*box_dx*self.hdx
        rho = np.ones_like(box_x)*self.rho
        m = rho*box_dx**3
        box = get_particle_array_wcsph(name='box', x=box_x, y=box_y, z=box_z,
                m=m, h=h, rho=rho)

        return [inlet, fluid, outlet, box]


    def create_inlet_outlet(self, particle_arrays):
        # particle_arrays is a dict {name: particle_array}
        fluid_pa = particle_arrays['fluid']
        inlet_pa = particle_arrays['inlet']
        outlet_pa = particle_arrays['outlet']

        # Create the inlet and outlets as described in the documentation.
        n = 3
        inlet = SimpleInlet(
            inlet_pa, fluid_pa, spacing=self.dx, n=3, axis='x',
            xmin=-self.dx*n, xmax=0,
            ymin=0.0, ymax=2.85,
            zmin=0.0, zmax=2.85)
        outlet = SimpleOutlet(
            outlet_pa, fluid_pa,
            xmin=2., xmax=(2 + self.dx*n),
            ymin=0.0, ymax=2.85,
            zmin=0.0, zmax=2.85)
        return [inlet, outlet]


    def create_solver(self):
        kernel = CubicSpline(dim=3)
        integrator = EPECIntegrator(inlet=InletOutletStep(), fluid=WCSPHStep(),
                outlet=InletOutletStep())
        dt = 0.01
        tf = 6
        solver = Solver(kernel=kernel, dim=3, integrator=integrator,
                dt=dt, tf=tf, adaptive_timestep=True,
                cfl=0.3, n_damp=50, output_at_times=np.linspace(0., tf, 20))
        return solver


    def create_equations(self):
        equations = [

            # Equation of state
            Group(equations=[

                    TaitEOS(dest='fluid', sources=None, rho0=self.rho,
                        c0=self.c0, gamma=self.gamma),
                    TaitEOS(dest='box', sources=None, rho0=self.rho,
                        c0=self.c0, gamma=self.gamma),

                    ], real=False
                 ),

            Group(equations=[

                    # Continuity equation
                    ContinuityEquation(dest='fluid', sources=['fluid', 'box']),
                    ContinuityEquation(dest='box', sources=['fluid']),

                    # Momentum equation
                    MomentumEquation(dest='fluid', sources=['fluid', 'box'],
                        c0=self.c0),

                    # Position step with XSPH
                    XSPHCorrection(dest='fluid', sources=['fluid'])
                    ], real=False
                ),
            ]
        return equations


    def create_scheme(self):
        return None


if __name__ == '__main__':
    app = WindTunnelApp()
    app.run()

It runs but gives the following error

Generating output in /home/saullo/learning_pysph/wind_tunnel_output
Precompiled code from: /home/saullo/.pysph/source/py2.7-linux-x86_64/m_ab4770118d636f75c744a3036b04d157.pyx
Setup took: 0.09210 secs
0%Traceback (most recent call last):
  File "wind_tunnel.py", line 158, in <module>
    app.run()
  File "/usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/solver/application.py", line 1315, in run
    self.solver.solve(not self.options.quiet)
  File "/usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/solver/solver.py", line 437, in solve
    self.dt = self._get_timestep()
  File "/usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/solver/solver.py", line 733, in _get_timestep
    dt = self._compute_timestep()
  File "/usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/solver/solver.py", line 621, in _compute_timestep
    dt = self.integrator.compute_time_step(undamped_dt, self.cfl)
  File "/usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/sph/integrator.py", line 99, in compute_time_step
    dt_cfl_fac, dt_force_fac, dt_visc_fac = self._get_dt_adapt_factors()
  File "/usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/sph/integrator.py", line 60, in _get_dt_adapt_factors
    max_val = np.max(pa.get(name))
  File "/home/saullo/.local/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 2334, in amax
    initial=initial)
  File "/home/saullo/.local/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 83, in _wrapreduction
    return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
ValueError: zero-size array to reduction operation maximum which has no identity

Do you know what it could be? It seems that "fluid" must have some initial particles in it; however, that would lead to an incorrect inlet/outlet injection of particles.
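
One possible workaround, based on a guess at the cause rather than an official fix: the adaptive timestep takes np.max over per-particle arrays of the (still empty) fluid array, which is exactly the zero-size reduction that fails. Turning the adaptive timestep off in create_solver sidesteps that until the inlet has generated fluid particles:

# Sketch of a modified create_solver call (same names as in the script above).
solver = Solver(kernel=kernel, dim=3, integrator=integrator,
                dt=dt, tf=tf, adaptive_timestep=False,
                n_damp=50, output_at_times=np.linspace(0., tf, 20))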

Facing error while running dam_break_3d

if __name__ == '__main__':
    app = DamBreak3D()
    app.run()
    app.post_process(app.info_filename)

zsh: parse error near `sd,'

Whenever I try to run the code I see this error. I don't know how to solve it.
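
For what it is worth (a sketch, assuming the standard example module): the snippet above is Python, not shell, so typing it directly at a zsh prompt produces exactly this kind of "parse error". Saving it as a script (the file name here is hypothetical) and running it with the Python interpreter avoids that:

# file: run_dam_break_3d.py
from pysph.examples.dam_break_3d import DamBreak3D

if __name__ == '__main__':
    app = DamBreak3D()
    app.run()
    app.post_process(app.info_filename)

and then invoke it from the shell as "python run_dam_break_3d.py".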

Cannot convert str to pysph.base.nnps_base.DomainManager

I construct a solver in a class and call run in a function; however, I run into an exception like this:

Exception has occurred: TypeError
Cannot convert str to pysph.base.nnps_base.DomainManager
File "D:\Projects\Codes\PySpheral\src\pthreads.py", line 17, in run
self._solver.run()

Please, can anybody help with that?
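
A guess based only on the error type, not a confirmed diagnosis: when an Application is driven from another program, PySPH parses that program's sys.argv, and a stray string can end up where a DomainManager is expected. Passing an explicit (empty) argument list keeps the host program's command line out of the solver's option parsing:

# Hypothetical change in pthreads.py: run with an explicit argument list.
self._solver.run(argv=[])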

Confusing time-performance of SpatialHashNNPS (or LinkedListNNPS) when I run in serial and in parallel

Hello,

I am using pysph for my project, which mostly reduces to using LinkedListNNPS or SpatialHashNNPS to find the points of a spherical grid that are neighbours of a protein's atoms.

What I have observed so far can be summarized by the following test performed on an HPC cluster with SLURM scheduler.
If I run my script serially on 1 proc, a neighbour search takes 0.00285557 sec. This number is an average over 96 repetitions; that is, in my script the first and only process loops over the same search 96 times.
If I run my script in parallel on 96 procs (I use mpi4py), a neighbour search takes 0.212207 sec. This time, every process executes the neighbour search only once; in total I have 96 such numbers, which average to ~0.2.

I tried to find a reason for this 100-fold difference, but I cannot really understand it. Why does a single call of SpatialHashNNPS require ~0.002 sec when I run serially, but ~0.2 sec when I run in parallel? Shouldn't the time for a single call be mostly independent of the number of processors I use?
The only thing that might explain this is that SpatialHashNNPS is itself parallelized and therefore dependent on the number of processors. Is that so?
What I would like is a performance of ~0.002 sec, regardless of the number of processors I use.

I can provide a script to test/reproduce my results, if necessary.

Thank you,
Fabio
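
One plausible explanation, stated as an assumption rather than a confirmed diagnosis: the NNPS bin/query loops are OpenMP-parallel, so each of the 96 MPI ranks may spawn as many threads as the node has cores and oversubscribe it. Pinning every rank to a single OpenMP thread before pysph is imported should keep the per-call cost close to the serial run:

import os

# Must be set before pysph/compyle are imported, otherwise it has no effect.
os.environ['OMP_NUM_THREADS'] = '1'

from pysph.base.nnps import SpatialHashNNPS  # noqa: E402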

26 errors when testing PySPH by <pysph test>

I'm new to PySPH. I installed all the dependencies first, then installed the Microsoft build tools for VS2019, and installed PySPH with
pip install https://github.com/pypr/pysph/zipball/master

Environment: Win 10
Python: 3.8.8
Following dependencies are installed:
NumPy
Cython
Mako
cyarray
compyle
pytest
OpenMP
PyOpenCL
Mayavi

I installed most of them with the Anaconda package manager in a separate environment, but when I tested PySPH with pysph test according to the instructions, I got 26 errors as shown below:

================================================= test session starts =================================================
platform win32 -- Python 3.8.8, pytest-6.1.1, py-1.9.0, pluggy-0.13.1
rootdir: C:\Windows\system32
collected 266 items / 26 errors / 240 selected

======================================================= ERRORS ========================================================
__________________________________ ERROR collecting base/tests/test_device_helper.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_device_helper.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_device_helper.py:4: in <module>
    from pysph.base.utils import get_particle_array  # noqa: E402
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_________________________________ ERROR collecting base/tests/test_domain_manager.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_domain_manager.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_domain_manager.py:8: in <module>
    from pysph.base.nnps import DomainManager, BoxSortNNPS, LinkedListNNPS
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\nnps.py:1: in <module>
    from pysph.base.nnps_base import get_number_of_threads, py_flatten, \
pysph\base\nnps_base.pyx:1: in init pysph.base.nnps_base
    ???
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_________________________________ ERROR collecting base/tests/test_neighbor_cache.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_neighbor_cache.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_neighbor_cache.py:7: in <module>
    from pysph.base.nnps import NeighborCache, LinkedListNNPS
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\nnps.py:1: in <module>
    from pysph.base.nnps_base import get_number_of_threads, py_flatten, \
pysph\base\nnps_base.pyx:1: in init pysph.base.nnps_base
    ???
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
______________________________________ ERROR collecting base/tests/test_nnps.py _______________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_nnps.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_nnps.py:12: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_____________________________________ ERROR collecting base/tests/test_octree.py ______________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_octree.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_octree.py:10: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_________________________________ ERROR collecting base/tests/test_particle_array.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_particle_array.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_particle_array.py:10: in <module>
    from pysph.base import particle_array
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
__________________________________ ERROR collecting base/tests/test_periodic_nnps.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_periodic_nnps.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_periodic_nnps.py:7: in <module>
    from pysph.base.nnps import DomainManager, BoxSortNNPS, LinkedListNNPS, \
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\nnps.py:1: in <module>
    from pysph.base.nnps_base import get_number_of_threads, py_flatten, \
pysph\base\nnps_base.pyx:1: in init pysph.base.nnps_base
    ???
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
______________________________________ ERROR collecting base/tests/test_utils.py ______________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_utils.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_utils.py:3: in <module>
    from ..utils import is_overloaded_method
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
___________________________________ ERROR collecting parallel/tests/test_openmp.py ____________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_openmp.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_openmp.py:10: in <module>
    from .example_test_case import ExampleTestCase, get_example_script
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\example_test_case.py:9: in <module>
    from pysph.solver.utils import load, get_files
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\utils.py:16: in <module>
    from pysph.solver.output import load, dump, output_formats  # noqa: 401
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\output.py:9: in <module>
    from pysph.base.particle_array import ParticleArray
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
________________________________ ERROR collecting parallel/tests/test_parallel_run.py _________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel_run.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel_run.py:12: in <module>
    from pysph.parallel.tests.example_test_case import ExampleTestCase, get_example_script
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\example_test_case.py:9: in <module>
    from pysph.solver.utils import load, get_files
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\utils.py:16: in <module>
    from pysph.solver.output import load, dump, output_formats  # noqa: 401
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\output.py:9: in <module>
    from pysph.base.particle_array import ParticleArray
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
__________________________________ ERROR collecting solver/tests/test_application.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\tests\test_application.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\tests\test_application.py:20: in <module>
    from pysph.solver.application import Application
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\application.py:4: in <module>
    from compyle.utils import ArgumentParser
E   ModuleNotFoundError: No module named 'compyle.utils'
____________________________________ ERROR collecting solver/tests/test_solver.py _____________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\tests\test_solver.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\tests\test_solver.py:15: in <module>
    from pysph.solver.solver import Solver
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\solver.py:7: in <module>
    from compyle.profile import profile, profile_ctx
E   ModuleNotFoundError: No module named 'compyle.profile'
_________________________________ ERROR collecting solver/tests/test_solver_utils.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\tests\test_solver_utils.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\solver\tests\test_solver_utils.py:15: in <module>
    from pysph.base.utils import get_particle_array, get_particle_array_wcsph
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
______________________________ ERROR collecting sph/bc/tests/test_simple_inlet_outlet.py ______________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\bc\tests\test_simple_inlet_outlet.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\bc\tests\test_simple_inlet_outlet.py:13: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
________________________________ ERROR collecting sph/tests/test_acceleration_eval.py _________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_acceleration_eval.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_acceleration_eval.py:9: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_________________________ ERROR collecting sph/tests/test_acceleration_eval_cython_helper.py __________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_acceleration_eval_cython_helper.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_acceleration_eval_cython_helper.py:10: in <module>
    from pysph.base.particle_array import ParticleArray
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
____________________________________ ERROR collecting sph/tests/test_integrator.py ____________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_integrator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_integrator.py:9: in <module>
    from pysph.base.utils import get_particle_array, get_particle_array_wcsph
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_____________________________ ERROR collecting sph/tests/test_integrator_cython_helper.py _____________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_integrator_cython_helper.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_integrator_cython_helper.py:5: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_________________________________ ERROR collecting sph/tests/test_integrator_step.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_integrator_step.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_integrator_step.py:6: in <module>
    from pysph.base.utils import get_particle_array as gpa
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
________________________________ ERROR collecting sph/tests/test_kernel_corrections.py ________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_kernel_corrections.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_kernel_corrections.py:5: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
______________________________ ERROR collecting sph/tests/test_multi_group_integrator.py ______________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_multi_group_integrator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_multi_group_integrator.py:10: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
______________________________________ ERROR collecting sph/tests/test_scheme.py ______________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_scheme.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\tests\test_scheme.py:4: in <module>
    from pysph.sph.wc.edac import EDACScheme
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\sph\wc\edac.py:22: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
____________________________________ ERROR collecting tools/tests/test_geometry.py ____________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_geometry.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_geometry.py:5: in <module>
    import pysph.tools.geometry as G
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\geometry.py:4: in <module>
    from pysph.base.nnps import LinkedListNNPS
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\nnps.py:1: in <module>
    from pysph.base.nnps_base import get_number_of_threads, py_flatten, \
pysph\base\nnps_base.pyx:1: in init pysph.base.nnps_base
    ???
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
__________________________________ ERROR collecting tools/tests/test_interpolator.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_interpolator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_interpolator.py:16: in <module>
    from pysph.tools.interpolator import get_nx_ny_nz, Interpolator
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\interpolator.py:8: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
___________________________________ ERROR collecting tools/tests/test_mesh_tools.py ___________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_mesh_tools.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_mesh_tools.py:4: in <module>
    from pysph.base.particle_array import ParticleArray
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
_________________________________ ERROR collecting tools/tests/test_sph_evaluator.py __________________________________
ImportError while importing test module 'c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_sph_evaluator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
c:\installers\anaconda\envs\pysph\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\tools\tests\test_sph_evaluator.py:5: in <module>
    from pysph.base.utils import get_particle_array
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\utils.py:7: in <module>
    from .particle_array import ParticleArray, \
pysph\base\particle_array.pyx:26: in init pysph.base.particle_array
    ???
c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\device_helper.py:12: in <module>
    from compyle.template import Template
E   ModuleNotFoundError: No module named 'compyle.template'
================================================== warnings summary ===================================================
..\..\installers\anaconda\envs\pysph\lib\site-packages\compyle\types.py:159
  c:\installers\anaconda\envs\pysph\lib\site-packages\compyle\types.py:159: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    np.dtype(np.bool): 'char',

..\..\installers\anaconda\envs\pysph\lib\site-packages\compyle\types.py:168
  c:\installers\anaconda\envs\pysph\lib\site-packages\compyle\types.py:168: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    'bool': np.bool,

..\..\installers\anaconda\envs\pysph\lib\site-packages\compyle\ext_module.py:7
  c:\installers\anaconda\envs\pysph\lib\site-packages\compyle\ext_module.py:7: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    import imp

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_nnps.py:11
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\base\tests\test_nnps.py:11: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    from pysph.base.point import IntPoint, Point

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\examples\tests\test_examples.py:84
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\examples\tests\test_examples.py:84: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.slow

..\..\installers\anaconda\envs\pysph\lib\site-packages\traits\traits.py:330
  c:\installers\anaconda\envs\pysph\lib\site-packages\traits\traits.py:330: DeprecationWarning: 'TraitMap' trait handler has been deprecated. Use Map instead.
    other.append(TraitMap(map))

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:51
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:51: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:55
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:68
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:68: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.slow

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:69
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:88
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:88: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:94
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:94: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

..\..\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:108
  c:\installers\anaconda\envs\pysph\lib\site-packages\pysph\parallel\tests\test_parallel.py:108: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

-- Docs: https://docs.pytest.org/en/stable/warnings.html
=============================================== short test summary info ===============================================
ERROR base/tests/test_device_helper.py
ERROR base/tests/test_domain_manager.py
ERROR base/tests/test_neighbor_cache.py
ERROR base/tests/test_nnps.py
ERROR base/tests/test_octree.py
ERROR base/tests/test_particle_array.py
ERROR base/tests/test_periodic_nnps.py
ERROR base/tests/test_utils.py
ERROR parallel/tests/test_openmp.py
ERROR parallel/tests/test_parallel_run.py
ERROR solver/tests/test_application.py
ERROR solver/tests/test_solver.py
ERROR solver/tests/test_solver_utils.py
ERROR sph/bc/tests/test_simple_inlet_outlet.py
ERROR sph/tests/test_acceleration_eval.py
ERROR sph/tests/test_acceleration_eval_cython_helper.py
ERROR sph/tests/test_integrator.py
ERROR sph/tests/test_integrator_cython_helper.py
ERROR sph/tests/test_integrator_step.py
ERROR sph/tests/test_kernel_corrections.py
ERROR sph/tests/test_multi_group_integrator.py
ERROR sph/tests/test_scheme.py
ERROR tools/tests/test_geometry.py
ERROR tools/tests/test_interpolator.py
ERROR tools/tests/test_mesh_tools.py
ERROR tools/tests/test_sph_evaluator.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 26 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
=========================================== 13 warnings, 26 errors in 2.78s ===========================================

I would appreciate it if you could let me know how I can solve this issue.
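
A reading of the tracebacks above, offered as an assumption: every failure is a missing compyle submodule (compyle.template, compyle.utils, compyle.profile), which suggests the installed compyle is older than what this PySPH build expects. A quick check of what is actually installed, followed by upgrading compyle in the same environment, is the obvious next step:

import compyle
print(compyle.__version__)

# With a sufficiently recent compyle, all three of these import cleanly.
import compyle.template
import compyle.utils
import compyle.profile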

ModuleNotFoundError: No module named 'pysph.base.gpu_nnps_base'

Hi Guys,

When I run the dam_break_2d example (https://github.com/pypr/pysph/blob/master/pysph/examples/dam_break_2d.py) using OpenCL, I get the following error

python dam_break_2d.py --opencl

Error

Traceback (most recent call last):
  File "dam_break_2d.py", line 303, in <module>
    app.run()
  File "C:\Users\CKesanapalli\Anaconda3\lib\site-packages\pysph\solver\application.py", line 1569, in run  
    self._configure_solver()
  File "C:\Users\CKesanapalli\Anaconda3\lib\site-packages\pysph\solver\application.py", line 1025, in _configure_solver
    from pysph.base.gpu_nnps import ZOrderGPUNNPS
  File "C:\Users\CKesanapalli\Anaconda3\lib\site-packages\pysph\base\gpu_nnps.py", line 1, in <module>     
    from pysph.base.gpu_nnps_base import GPUNeighborCache, GPUNNPS, BruteForceNNPS
ModuleNotFoundError: No module named 'pysph.base.gpu_nnps_base'

The error points to this line.

from pysph.base.gpu_nnps_base import GPUNeighborCache, GPUNNPS, BruteForceNNPS

Can you tell me how to resolve this issue?

Thank you
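
A likely cause, offered as an assumption: the GPU extension modules are only compiled when PyOpenCL (or CUDA) is available at the time PySPH is built, so an installation done without it will not contain pysph.base.gpu_nnps_base at all. This quick check shows whether PyOpenCL itself is usable and whether the compiled module is present in the installed package:

import importlib.util
import pyopencl

# A working OpenCL setup should list at least one platform.
print(pyopencl.get_platforms())
# None here means the GPU NNPS extension was never built/installed.
print(importlib.util.find_spec('pysph.base.gpu_nnps_base'))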

"core dumped" error in very simple example

I started removing complexity from my previous attempts to figure out what I was doing wrong. The following code is already very simple, and I still get a "core dumped" error.

import numpy as np

from pysph.base.utils import get_particle_array_wcsph
from pysph.solver.application import Application
from pysph.sph.scheme import WCSPHScheme

class WaterBoxApp(Application):

    def initialize(self):
        self.c0 = 1498.0 # m/s
        self.rho = 1000. # 1000 kg/m3
        self.hdx = 1.3
        self.dx = 0.1
        self.gamma = 7.


    def create_particles(self):
        # Note that you need to create the inlet and outlet arrays in this method.

        # Setup the inlet particle array with just the particles we need at the
        # exit plane which is replicated by the inlet.
        dx = self.dx

        x = np.linspace(3*dx/2, 2.5-2*dx/2, (2.5-2*dx)//dx)
        y = np.linspace(3*dx/2, 2.85-2*dx/2, (2.85-2*dx)//dx)
        z = np.linspace(3*dx/2, 2.85-2*dx/2, (2.85-2*dx)//dx)
        x, y, z = np.meshgrid(x, y, z)
        x = x.flatten()
        y = y.flatten()
        z = z.flatten()
        h = np.ones_like(x)*dx*self.hdx
        rho = np.ones_like(x) * self.rho
        m = rho*dx*dx*dx
        fluid = get_particle_array_wcsph(x=x, y=y, z=z, m=m, h=h, rho=rho, name='fluid')


        x = np.linspace(-5*dx/2, 4.+5*dx/2, (4.+5*dx)//dx)
        y = np.linspace(-5*dx/2, 2.85+5*dx/2, (2.85+5*dx)//dx)
        z = np.linspace(-5*dx/2, 2.85+5*dx/2, (2.85+5*dx)//dx)
        x, y, z = np.meshgrid(x, y, z)
        x = x.flatten()
        y = y.flatten()
        z = z.flatten()
        h = np.ones_like(x)*dx*self.hdx
        rho = np.ones_like(x) * self.rho
        m = rho*dx*dx*dx
        walls = get_particle_array_wcsph(name='walls', x=x, y=y, z=z, m=m, h=h, rho=rho)
        indices = np.where(
                (x >= dx/2) & (x <= 4) &
                (y >= dx/2) & (y <= (2.85-dx/2)) &
                (z >= dx/2) & (z <= (2.85-dx/2)))[0]
        walls.remove_particles(indices)

        box_w = 0.15
        box_h = 0.003
        box_l = 1.2
        box_x = np.linspace(3., 3.+box_w, (3.+box_w)//dx)
        box_y = np.linspace(1.4, 1.4+box_h, (1.4+box_h)//dx)
        box_z = np.linspace(0., 0.+box_l, (0.+box_l)//dx)
        box_x, box_y, box_z = np.meshgrid(box_x, box_y, box_z)
        box_x = box_x.flatten()
        box_y = box_y.flatten()
        box_z = box_z.flatten()
        h = np.ones_like(box_x)*dx*self.hdx
        rho = np.ones_like(box_x)*self.rho
        m = rho*dx*dx*dx
        box = get_particle_array_wcsph(name='box', x=box_x, y=box_y, z=box_z,
                m=m, h=h, rho=rho)

        return [fluid, box, walls]


    def create_scheme(self):
        s = WCSPHScheme(
            ['fluid'], ['box', 'walls'], dim=3, rho0=self.rho, c0=self.c0,
            gz=-9.81,
            tensile_correction=True, hg_correction=True,
            update_h=True, delta_sph=True, summation_density=True,
            nu=8.9e-4,
            h0=self.dx*self.hdx, hdx=self.hdx, gamma=self.gamma, alpha=0.1, beta=0.0)
        dt = 0.0001
        tf = 2
        s.configure_solver(dt=dt, tf=tf)
        return s


if __name__ == '__main__':
    app = WaterBoxApp()
    app.run()

The following error message appears when running with or without "--openmp". With "--opencl" the run proceeds, but it generates "NaN" results; probably the GPU code does not check for division by zero or similar issues.

Generating output in /home/saullo/learning_pysph/water_box_simple_output
Compiling code at: /home/saullo/.pysph/source/py2.7-linux-x86_64/m_2eab9c05435eee494a0fdc475e792d3e.pyx
Setup took: 11.44254 secs
0%ERROR: LinkedListNNPS requires too many cells (-1418367197).
[saullo-Aspire-VN7-592G:28132] *** Process received signal ***
[saullo-Aspire-VN7-592G:28132] Signal: Segmentation fault (11)
[saullo-Aspire-VN7-592G:28132] Signal code: Address not mapped (1)
[saullo-Aspire-VN7-592G:28132] Failing at address: 0x7fdf76e625c8
[saullo-Aspire-VN7-592G:28132] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x11390)[0x7fe162b58390]
[saullo-Aspire-VN7-592G:28132] [ 1] /usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/base/linked_list_nnps.so(+0x9035)[0x7fe13e73b035]
[saullo-Aspire-VN7-592G:28132] [ 2] /usr/local/lib/python2.7/dist-packages/PySPH-1.0b1.dev0-py2.7-linux-x86_64.egg/pysph/base/nnps_base.so(+0x1c2f8)[0x7fe13f3542f8]
[saullo-Aspire-VN7-592G:28132] [ 3] /home/saullo/.pysph/source/py2.7-linux-x86_64/m_2eab9c05435eee494a0fdc475e792d3e.so(+0x35609)[0x7fe1371cd609]
[saullo-Aspire-VN7-592G:28132] [ 4] /home/saullo/.pysph/source/py2.7-linux-x86_64/m_2eab9c05435eee494a0fdc475e792d3e.so(+0x1c52d)[0x7fe1371b452d]
[saullo-Aspire-VN7-592G:28132] [ 5] /home/saullo/.pysph/source/py2.7-linux-x86_64/m_2eab9c05435eee494a0fdc475e792d3e.so(+0x2f670)[0x7fe1371c7670]
[saullo-Aspire-VN7-592G:28132] [ 6] /home/saullo/.pysph/source/py2.7-linux-x86_64/m_2eab9c05435eee494a0fdc475e792d3e.so(+0x2fcea)[0x7fe1371c7cea]
[saullo-Aspire-VN7-592G:28132] [ 7] python(PyEval_EvalFrameEx+0x5ca)[0x4bc3fa]
[saullo-Aspire-VN7-592G:28132] [ 8] python(PyEval_EvalFrameEx+0x553f)[0x4c136f]
[saullo-Aspire-VN7-592G:28132] [ 9] python(PyEval_EvalCodeEx+0x306)[0x4b9ab6]
[saullo-Aspire-VN7-592G:28132] [10] python(PyEval_EvalFrameEx+0x58b7)[0x4c16e7]
[saullo-Aspire-VN7-592G:28132] [11] python(PyEval_EvalCodeEx+0x306)[0x4b9ab6]
[saullo-Aspire-VN7-592G:28132] [12] python(PyEval_EvalFrameEx+0x58b7)[0x4c16e7]
[saullo-Aspire-VN7-592G:28132] [13] python(PyEval_EvalCodeEx+0x306)[0x4b9ab6]
[saullo-Aspire-VN7-592G:28132] [14] python[0x4eb30f]
[saullo-Aspire-VN7-592G:28132] [15] python(PyRun_FileExFlags+0x82)[0x4e5422]
[saullo-Aspire-VN7-592G:28132] [16] python(PyRun_SimpleFileExFlags+0x186)[0x4e3cd6]
[saullo-Aspire-VN7-592G:28132] [17] python(Py_Main+0x612)[0x493ae2]
[saullo-Aspire-VN7-592G:28132] [18] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf0)[0x7fe16279d830]
[saullo-Aspire-VN7-592G:28132] [19] python(_start+0x29)[0x4933e9]
[saullo-Aspire-VN7-592G:28132] *** End of error message ***
Segmentation fault (core dumped)
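
A small diagnostic, based on an assumed cause rather than a confirmed one: a negative cell count usually indicates an integer overflow, i.e. the particle extent divided by the cell size (which follows from the smallest h) is enormous, typically because of a stray coordinate or a near-zero smoothing length. Printing the extents and smallest h of each array just before returning from create_particles makes that easy to spot:

# Add just before "return [fluid, box, walls]" in create_particles.
for pa in (fluid, box, walls):
    print(pa.name,
          pa.x.min(), pa.x.max(),
          pa.y.min(), pa.y.max(),
          pa.z.min(), pa.z.max(),
          pa.h.min())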

Question about viewing s00, s01

Hi,
I am confused by the values of s00, s01, etc. in the viewer when using VonMisesPlasticity2D: the value of s00 or s01 is greater than the flow stress (Yo).

Best,
Binghui

where to find ami.sh or starcluster directory

Hi

I am having trouble locating ami.sh and/or the starcluster directory when I set up an EC2 instance from the AMI ID ami-01fdc27a as suggested here: https://pysph.readthedocs.io/en/latest/starcluster/overview.html

I could not find the PySPH repository or a StarCluster directory in my Ubuntu instance.

I then did a
pip install PySPH

and then tried to look around for the StarCluster directory but could not find it either.

Am I missing something here?

Thanks in advance,
Anand

Error while installing PySPH

Hi,

I am new to PySPH and I would like to install it on my Ubuntu machine. But when I do "pip install PySPH" I get:

 ERROR: Command errored out with exit status 1:
     command: /usr/bin/python3.7 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-cffooqhe/PySPH/setup.py'"'"'; __file__='"'"'/tmp/pip-install-cffooqhe/PySPH/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-install-cffooqhe/PySPH/pip-egg-info
         cwd: /tmp/pip-install-cffooqhe/PySPH/
    Complete output (93 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 763, in <module>
        setup_package()
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 693, in setup_package
        ext_modules = get_basic_extensions() + get_parallel_extensions()
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 258, in get_basic_extensions
        openmp_compile_args, openmp_link_args, openmp_env = get_openmp_flags()
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 121, in get_openmp_flags
        pyxbuild.pyx_to_dll(fname, extension, pyxbuild_dir=tmp_dir)
      File "/usr/lib/python3/dist-packages/pyximport/pyxbuild.py", line 102, in pyx_to_dll
        dist.run_commands()
      File "/usr/lib/python3.7/distutils/dist.py", line 966, in run_commands
        self.run_command(cmd)
      File "/usr/lib/python3.7/distutils/dist.py", line 985, in run_command
        cmd_obj.run()
      File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 185, in run
        _build_ext.build_ext.run(self)
      File "/usr/lib/python3.7/distutils/command/build_ext.py", line 340, in run
        self.build_extensions()
      File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 192, in build_extensions
        ext.sources = self.cython_sources(ext.sources, ext)
      File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 345, in cython_sources
        full_module_name=module_name)
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Main.py", line 693, in compile
        return compile_single(source, options, full_module_name)
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Main.py", line 643, in compile_single
        return run_pipeline(source, options, full_module_name)
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Main.py", line 455, in run_pipeline
        from . import Pipeline
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Pipeline.py", line 9, in <module>
        from .Visitor import CythonTransform
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Visitor.py", line 15, in <module>
        from . import ExprNodes
      File "/usr/lib/python3/dist-packages/Cython/Compiler/ExprNodes.py", line 2844
        await = None
              ^
    SyntaxError: invalid syntax
    Error in sys.excepthook:
    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/apport_python_hook.py", line 63, in apport_excepthook
        from apport.fileutils import likely_packaged, get_recent_crashes
      File "/usr/lib/python3/dist-packages/apport/__init__.py", line 5, in <module>
        from apport.report import Report
      File "/usr/lib/python3/dist-packages/apport/report.py", line 30, in <module>
        import apport.fileutils
      File "/usr/lib/python3/dist-packages/apport/fileutils.py", line 23, in <module>
        from apport.packaging_impl import impl as packaging
      File "/usr/lib/python3/dist-packages/apport/packaging_impl.py", line 24, in <module>
        import apt
      File "/usr/lib/python3/dist-packages/apt/__init__.py", line 23, in <module>
        import apt_pkg
    ModuleNotFoundError: No module named 'apt_pkg'
    
    Original exception was:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 763, in <module>
        setup_package()
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 693, in setup_package
        ext_modules = get_basic_extensions() + get_parallel_extensions()
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 258, in get_basic_extensions
        openmp_compile_args, openmp_link_args, openmp_env = get_openmp_flags()
      File "/tmp/pip-install-cffooqhe/PySPH/setup.py", line 121, in get_openmp_flags
        pyxbuild.pyx_to_dll(fname, extension, pyxbuild_dir=tmp_dir)
      File "/usr/lib/python3/dist-packages/pyximport/pyxbuild.py", line 102, in pyx_to_dll
        dist.run_commands()
      File "/usr/lib/python3.7/distutils/dist.py", line 966, in run_commands
        self.run_command(cmd)
      File "/usr/lib/python3.7/distutils/dist.py", line 985, in run_command
        cmd_obj.run()
      File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 185, in run
        _build_ext.build_ext.run(self)
      File "/usr/lib/python3.7/distutils/command/build_ext.py", line 340, in run
        self.build_extensions()
      File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 192, in build_extensions
        ext.sources = self.cython_sources(ext.sources, ext)
      File "/usr/lib/python3/dist-packages/Cython/Distutils/old_build_ext.py", line 345, in cython_sources
        full_module_name=module_name)
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Main.py", line 693, in compile
        return compile_single(source, options, full_module_name)
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Main.py", line 643, in compile_single
        return run_pipeline(source, options, full_module_name)
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Main.py", line 455, in run_pipeline
        from . import Pipeline
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Pipeline.py", line 9, in <module>
        from .Visitor import CythonTransform
      File "/usr/lib/python3/dist-packages/Cython/Compiler/Visitor.py", line 15, in <module>
        from . import ExprNodes
      File "/usr/lib/python3/dist-packages/Cython/Compiler/ExprNodes.py", line 2844
        await = None
              ^
    SyntaxError: invalid syntax
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

Can someone help me?

Regards,
Al
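An editorial note on the traceback above, offered as a likely diagnosis rather than a confirmed one: the SyntaxError is raised while importing Cython itself (await = None in Cython/Compiler/ExprNodes.py). await became a reserved keyword in Python 3.7, so a Cython release old enough to still use it as an identifier cannot run under 3.7 at all; here the offending copy is the system one in /usr/lib/python3/dist-packages. A quick sketch to confirm which Python and Cython are actually picked up, before retrying the PySPH build (typically after upgrading Cython, e.g. inside a fresh virtualenv):

import sys
import Cython  # importing the package works even when old; only compiling fails

print('Python :', sys.version.split()[0])
print('Cython :', Cython.__version__)
print('Loaded from:', Cython.__file__)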

Unit test failures on amd64

Unit tests seem to fail on the amd64 platform with Python 3.7 for PySPH v1.0a6 (Debian sid).

The tests have been run using the following command:

$ python3.7 -m pytest 

Test results are reported below:

============================= test session starts ==============================
platform linux -- Python 3.7.4+, pytest-4.6.5, py-1.8.0, pluggy-0.12.0
rootdir: /home/antonio/debian/git/pysph, inifile: tox.ini
collected 1034 items / 58 deselected / 976 selected

pysph/base/tests/test_carray.py ........................                 [  2%]
pysph/base/tests/test_device_helper.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss [  5%]
.ss.ss.ss.ss.ss.ss                                                       [  7%]
pysph/base/tests/test_domain_manager.py .........                        [  8%]
pysph/base/tests/test_kernel.py ........................................ [ 12%]
.................................................................        [ 19%]
pysph/base/tests/test_linalg3.py .......                                 [ 20%]
pysph/base/tests/test_neighbor_cache.py ...                              [ 20%]
pysph/base/tests/test_nnps.py .......ssssssssss......................... [ 24%]
....................................................ssssssssssssssssssss [ 32%]
sssssssssssssssssssssssssssssssssssssssssss............................. [ 39%]
...................                                                      [ 41%]
pysph/base/tests/test_octree.py .....s...s...s...s...s...s...s...s.      [ 44%]
pysph/base/tests/test_particle_array.py ..............................ss [ 48%]
ssssssssssssssssssssssssssssssssssssssssssssssssss                       [ 53%]
pysph/base/tests/test_periodic_nnps.py ...........                       [ 54%]
pysph/base/tests/test_reduce_array.py ......                             [ 55%]
pysph/cpy/tests/test_array.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss       [ 58%]
pysph/cpy/tests/test_ast_utils.py .......                                [ 59%]
pysph/cpy/tests/test_capture_stream.py .....                             [ 60%]
pysph/cpy/tests/test_config.py ............                              [ 61%]
pysph/cpy/tests/test_cython_generator.py ...............                 [ 62%]
pysph/cpy/tests/test_ext_module.py .......                               [ 63%]
pysph/cpy/tests/test_jit.py .................                            [ 65%]
pysph/cpy/tests/test_low_level.py ssss..                                 [ 65%]
pysph/cpy/tests/test_parallel.py s.ss.ss.ss.ss.ss.ss..ss..ss..ss..ss.ss. [ 69%]
ss.ss.ss.ss.ss..ss..ss..ss..s                                            [ 72%]
pysph/cpy/tests/test_translator.py ..................................... [ 76%]
............                                                             [ 77%]
pysph/cpy/tests/test_transpiler.py ..                                    [ 78%]
pysph/cpy/tests/test_types.py .......                                    [ 78%]
pysph/examples/tests/test_riemann_solver.py s........                    [ 79%]
pysph/parallel/tests/test_openmp.py .                                    [ 79%]
pysph/parallel/tests/test_parallel.py ssssss                             [ 80%]
pysph/parallel/tests/test_parallel_run.py s                              [ 80%]
pysph/solver/tests/test_application.py ..                                [ 80%]
pysph/solver/tests/test_solver.py ..                                     [ 80%]
pysph/solver/tests/test_solver_utils.py ......sssss.                     [ 82%]
pysph/sph/tests/test_acceleration_eval.py ....................ssssssssss [ 85%]
ssssssssssssssssss                                                       [ 87%]
pysph/sph/tests/test_acceleration_eval_cython_helper.py ...              [ 87%]
pysph/sph/tests/test_equations.py ...............                        [ 88%]
pysph/sph/tests/test_integrator.py .........sss                          [ 90%]
pysph/sph/tests/test_integrator_cython_helper.py .                       [ 90%]
pysph/sph/tests/test_integrator_step.py ..                               [ 90%]
pysph/sph/tests/test_kernel_corrections.py ................              [ 92%]
pysph/sph/tests/test_linalg.py ...........                               [ 93%]
pysph/sph/tests/test_multi_group_integrator.py .s                        [ 93%]
pysph/sph/tests/test_riemann_solver.py .............                     [ 94%]
pysph/sph/tests/test_scheme.py .                                         [ 94%]
pysph/sph/tests/test_simple_inlet_outlet.py .......                      [ 95%]
pysph/tools/tests/test_geometry.py .............s...                     [ 97%]
pysph/tools/tests/test_geometry_stl.py ........                          [ 98%]
pysph/tools/tests/test_interpolator.py ...FFFFFFFFFF                     [ 99%]
pysph/tools/tests/test_sph_evaluator.py FFF                              [ 99%]
pyzoltan/core/tests/test_zoltan.py ss                                    [100%]

=================================== FAILURES ===================================
________ TestInterpolator.test_should_be_able_to_update_particle_arrays ________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_be_able_to_update_particle_arrays>

    def test_should_be_able_to_update_particle_arrays(self):
        # Given
        pa = self._make_2d_grid()
        pa_new = self._make_2d_grid()
        pa_new.p[:] = 10.0
    
>       ip = Interpolator([pa], num_points=1000)

pysph/tools/tests/test_interpolator.py:167: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228e116550>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
_____________ TestInterpolator.test_should_correctly_update_domain _____________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_correctly_update_domain>

    def test_should_correctly_update_domain(self):
        # Given
        pa = self._make_2d_grid()
>       ip = Interpolator([pa], num_points=1000)

pysph/tools/tests/test_interpolator.py:181: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228f397650>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
_________________ TestInterpolator.test_should_work_on_2d_data _________________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_on_2d_data>

    def test_should_work_on_2d_data(self):
        # Given
        pa = self._make_2d_grid()
    
        # When.
>       ip = Interpolator([pa], num_points=1000)

pysph/tools/tests/test_interpolator.py:100: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228efac810>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
______ TestInterpolator.test_should_work_when_arrays_have_different_props ______

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_when_arrays_have_different_props>

    def test_should_work_when_arrays_have_different_props(self):
        # Given
        pa1 = self._make_2d_grid()
        pa1.add_property('junk', default=2.0)
        pa2 = self._make_2d_grid('solid')
    
        # When.
>       ip = Interpolator([pa1, pa2], num_points=1000)

pysph/tools/tests/test_interpolator.py:199: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228fc83310>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4084 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.cpp
building 'm_65f5bdc760e2a27615a73da5488fbd36' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_65f5bdc760e2a27615a73da5488fbd36.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
_____________ TestInterpolator.test_should_work_with_changed_data ______________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_with_changed_data>

    def test_should_work_with_changed_data(self):
        # Given
        pa = self._make_2d_grid()
>       ip = Interpolator([pa], num_points=1000)

pysph/tools/tests/test_interpolator.py:150: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f2290331f90>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
____ TestInterpolator.test_should_work_with_explicit_points_in_constructor _____

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_with_explicit_points_in_constructor>

    def test_should_work_with_explicit_points_in_constructor(self):
        # Given
        pa = self._make_2d_grid()
        x, y = np.random.random((2, 5, 5))
        z = np.zeros_like(x)
    
        # When.
>       ip = Interpolator([pa], x=x, y=y, z=z)

pysph/tools/tests/test_interpolator.py:213: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:131: in __init__
    self.set_interpolation_points(x=x, y=y, z=z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228edd6250>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
_______ TestInterpolator.test_should_work_with_explicit_points_without_z _______

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_with_explicit_points_without_z>

    def test_should_work_with_explicit_points_without_z(self):
        # Given
        pa = self._make_2d_grid()
        x, y = np.random.random((2, 5, 5))
    
        # When.
>       ip = Interpolator([pa], x=x, y=y)

pysph/tools/tests/test_interpolator.py:227: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:131: in __init__
    self.set_interpolation_points(x=x, y=y, z=z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f22905966d0>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
____________ TestInterpolator.test_should_work_with_ghost_particles ____________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_with_ghost_particles>

    def test_should_work_with_ghost_particles(self):
        # Given
        pa = self._make_2d_grid()
        # Make half the particles ghosts.
        n = pa.get_number_of_particles()
        pa.tag[int(n//2):] = 1
        pa.align_particles()
    
        # When.
>       ip = Interpolator([pa], num_points=1000)

pysph/tools/tests/test_interpolator.py:137: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f2290791050>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
____________ TestInterpolator.test_should_work_with_multiple_arrays ____________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_should_work_with_multiple_arrays>

    def test_should_work_with_multiple_arrays(self):
        # Given
        pa1 = self._make_2d_grid()
        pa2 = self._make_2d_grid('solid')
        pa2.p[:] = 4.0
        pa2.u[:] = 0.2
    
        # When.
>       ip = Interpolator([pa1, pa2], num_points=1000)

pysph/tools/tests/test_interpolator.py:118: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228f346c10>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4080 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.cpp
building 'm_201933e7c61a3d2eee2db11937988b59' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_201933e7c61a3d2eee2db11937988b59.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
__________ TestInterpolator.test_that_set_interpolation_points_works ___________

self = <pysph.tools.tests.test_interpolator.TestInterpolator testMethod=test_that_set_interpolation_points_works>

    def test_that_set_interpolation_points_works(self):
        # Given
        pa = self._make_2d_grid()
>       ip = Interpolator([pa], num_points=1000)

pysph/tools/tests/test_interpolator.py:238: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/interpolator.py:129: in __init__
    self.set_domain(bounds, shape)
pysph/tools/interpolator.py:187: in set_domain
    self.set_interpolation_points(x, y, z)
pysph/tools/interpolator.py:169: in set_interpolation_points
    self._compile_acceleration_eval(arrays)
pysph/tools/interpolator.py:276: in _compile_acceleration_eval
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228fb4ff10>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4076 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp
building 'm_63eb3305b979b7a1b9a17a7d96aac627' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_63eb3305b979b7a1b9a17a7d96aac627.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
_______________________ TestSPHEvaluator.test_evaluation _______________________

self = <pysph.tools.tests.test_sph_evaluator.TestSPHEvaluator testMethod=test_evaluation>

    def test_evaluation(self):
        # Given
        xd = [0.5]
        hd = self.src.h[:1]
        dest = get_particle_array(name='dest', x=xd, h=hd)
        sph_eval = SPHEvaluator(
>           arrays=[dest, self.src], equations=self.equations, dim=1
        )

pysph/tools/tests/test_sph_evaluator.py:27: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/sph_evaluator.py:46: in __init__
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228e05ead0>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4049 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpp
building 'm_1473f070e1b4d1e184f4ef7b8ae43e27' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
_____________ TestSPHEvaluator.test_evaluation_with_domain_manager _____________

self = <pysph.tools.tests.test_sph_evaluator.TestSPHEvaluator testMethod=test_evaluation_with_domain_manager>

    def test_evaluation_with_domain_manager(self):
        # Given
        xd = [0.0]
        hd = self.src.h[:1]
        dest = get_particle_array(name='dest', x=xd, h=hd)
        dx = self.dx
        dm = DomainManager(xmin=-dx/2, xmax=1.0+dx/2, periodic_in_x=True)
        sph_eval = SPHEvaluator(
            arrays=[dest, self.src], equations=self.equations, dim=1,
>           domain_manager=dm
        )

pysph/tools/tests/test_sph_evaluator.py:45: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/sph_evaluator.py:46: in __init__
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f22905f2a90>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4049 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpp
building 'm_1473f070e1b4d1e184f4ef7b8ae43e27' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
________________ TestSPHEvaluator.test_updating_particle_arrays ________________

self = <pysph.tools.tests.test_sph_evaluator.TestSPHEvaluator testMethod=test_updating_particle_arrays>

    def test_updating_particle_arrays(self):
        # Given
        xd = [0.5]
        hd = self.src.h[:1]
        dest = get_particle_array(name='dest', x=xd, h=hd)
        sph_eval = SPHEvaluator(
>           [dest, self.src], equations=self.equations, dim=1
        )

pysph/tools/tests/test_sph_evaluator.py:60: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/tools/sph_evaluator.py:46: in __init__
    compiler.compile()
pysph/sph/sph_compiler.py:38: in compile
    mod = helper0.compile(code0)
pysph/sph/acceleration_eval_cython_helper.py:172: in compile
    self._module = self._ext_mod.load()
pysph/cpy/ext_module.py:266: in load
    self.build()
pysph/cpy/ext_module.py:244: in build
    setup_args={'script_args': script_args}
pysph/cpy/capture_stream.py:82: in __exit__
    capture.__exit__(type, value, tb)
pysph/cpy/capture_stream.py:47: in __exit__
    self._cache_output()
pysph/cpy/capture_stream.py:59: in _cache_output
    result = tmp_stream.read()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <encodings.ascii.IncrementalDecoder object at 0x7f228f187b10>
input = b'In file included from /usr/lib/python3/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1822,\n                ...: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here\n 4049 |   double __pyx_v_fac;\n      |          ^~~~~~~~~~~\n'
final = True

    def decode(self, input, final=False):
>       return codecs.ascii_decode(input, self.errors)[0]
E       UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 875: ordinal not in range(128)

/usr/lib/python3.7/encodings/ascii.py:26: UnicodeDecodeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx
running build_ext
cythoning /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx to /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpp
building 'm_1473f070e1b4d1e184f4ef7b8ae43e27' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base -I/usr/include/python3.7m -c /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpp -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.o
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/home/antonio/debian/git/pysph=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/temp.linux-amd64-3.7/home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.o -o /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/build/lib.linux-amd64-3.7/m_1473f070e1b4d1e184f4ef7b8ae43e27.cpython-37m-x86_64-linux-gnu.so
----------------------------- Captured stderr call -----------------------------
warning: pysph/base/nnps_base.pxd:25:13: 'INT_MAX' redeclared 
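The three TestSPHEvaluator failures above share one root cause: the compiler output captured while building the generated Cython extension contains UTF-8 curly quotes (gcc writes ‘__pyx_v_fac’), but the capture helper reads the temporary stream back with the default ASCII codec, presumably because the build runs under a non-UTF-8 locale, so tmp_stream.read() raises UnicodeDecodeError before the build result is ever inspected. A minimal sketch of the failure and a possible workaround follows; the UTF-8 decode shown is an assumption for illustration, not the actual fix in pysph/cpy/capture_stream.py.

# Reproduces the decode failure: gcc emits UTF-8 curly quotes
# (b'\xe2\x80\x98 ... \xe2\x80\x99'), which the ASCII codec rejects.
data = b"warning: \xe2\x80\x98__pyx_v_fac\xe2\x80\x99 was declared here"

try:
    data.decode("ascii")
except UnicodeDecodeError as err:
    print("same error as in the traceback:", err)

# Possible workaround (an assumption, not the upstream fix): decode the
# captured compiler output as UTF-8 and replace undecodable bytes.
print(data.decode("utf-8", errors="replace"))

Running the test suite under a UTF-8 locale (for example LC_ALL=C.UTF-8) may also sidestep the ASCII default, though that has not been verified here.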
=============================== warnings summary ===============================
/usr/lib/python3/dist-packages/_pytest/mark/structures.py:242
/usr/lib/python3/dist-packages/_pytest/mark/structures.py:242
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:242: PytestCollectionWarning: cannot collect 'test_all_backends' because it is not a function.
    def __call__(self, *args, **kwargs):

/usr/lib/python3/dist-packages/_pytest/mark/structures.py:334
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:334: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

/usr/lib/python3/dist-packages/_pytest/mark/structures.py:334
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:334: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_get_number_of_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_91e0575b640469ac68a80641e2db24b0.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_remove_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_9195542b3857fa6f3b64c7d14cb67f62.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_remove_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_2250e3389bc8cd7df0c03bbb346401c7.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_remove_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_5cf03441ed5cac9d4491b85306d9754c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_extract_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_d34793424e1f2e13e0fecd1771d2a7c0.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_levels_in_tree
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_parent_for_node
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_plot_root
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_sum_of_indices_lengths_equals_total_number_of_particles
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py:192: DeprecationWarning: object of type <class 'float'> cannot be safely interpreted as an integer.
    self.x = np.linspace(0, 1, num=N)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_levels_in_tree
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_parent_for_node
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_plot_root
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_sum_of_indices_lengths_equals_total_number_of_particles
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py:238: DeprecationWarning: object of type <class 'float'> cannot be safely interpreted as an integer.
    self.x = np.linspace(0, 1, num=N)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_array.py::test_remove[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_54f9f75e0f45f43dc44f58b838c92f06.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_array.py::test_remove[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_70edfcc8cb29a9c84af9920f87d1b951.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_ext_module.py::TestExtModule::test_load_module
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /tmp/tmprrfhrw1s/m_59a539f0f96ecda080392b137f7ab38e.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_ext_module.py::TestExtModule::test_rebuild_when_dependencies_change
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /tmp/tmpuel4wdz6/m_59a539f0f96ecda080392b137f7ab38e.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_jit.py::TestAnnotationHelper::test_non_jit_call_as_call_arg
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/cpy/jit.py:125: UserWarning: 
  In code in line 5:
  
      return g(sin(a))
               ^
  
  
  Function called is not marked by the annotate decorator. Argument
  type defaulting to 'double'. If the type is not 'double', store
  the value in a variable of appropriate type and use the variable
  
    warnings.warn(msg)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_jit.py::TestAnnotationHelper::test_non_jit_call_in_return
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/cpy/jit.py:125: UserWarning: 
  In code in line 5:
  
      return sin(a)
             ^
  
  
  Function called is not marked by the annotate decorator. Argument
  type defaulting to 'double'. If the type is not 'double', store
  the value in a variable of appropriate type and use the variable
  
    warnings.warn(msg)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_low_level.py::TestCython::test_cython_code_with_return_and_nested_call
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_74d45969c7f812097ca91e32d292c9b5.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_low_level.py::TestCython::test_cython_with_externs
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_d99220a885a6ea16f2900812b6af8d9c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_elementwise_works_with_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_7e50ca36a38cc6e5bdb25fdd039133ca.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_neutral_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_9a2c7ebf648842a29b3befe6f44f4a0e.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_3913705ba17110bad123d7fec52d1fa6.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_with_map_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_ea8309077f21b49c2e55c4fff007c57d.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_without_map_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_159c0a504e13152c82e3025f68c52a97.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_last_item_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_9c51f8a3ca12ef2fff590c265f372100.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_17aa11b78980ae118499ef3b67528262.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_21c9822c18457511d7d6c48f60636280.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_303d8bc784c0fe041df30586ae859245.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_with_external_func_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_116d33965d4a335e1a3a722a1aa511aa.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_segmented_scan_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_678b9adc1c242c435c53cf38d505d0ea.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_segmented_scan_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_9824ea52a3de157665153f82c134736f.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_unique_scan_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_10cce52efb3dc468ae85f83fe5c9ea40.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_unique_scan_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_c5d750f49c73249649bcbfb5023e25d4.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_reduction_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_5662c6866ef7f5d806bb92c47362ed37.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_reduction_works_with_map_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_5c107ae13a11a41f9f03e8f69b3ce2c5.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_scan_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_9c2467add4a88d72ce0a1f3a67fb447c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_scan_works_with_external_func_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-amd64/m_3ce863393ce97030d92dd85a8bb3e38d.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_call_initialize_pair
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1ca2f6f3c87a0aced1147a42323c9e0b.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_call_pre_post_functions_in_group
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_380d33049e60eeba93e7e250e099d136.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_call_py_initialize
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_cc7e6f6747bcab4616b25771b58a0003.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_handle_repeated_helper_functions
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_605c1c7cab1f607d6ddbe4596f54170c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_iterate_iterated_group
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_e6c3637f476bcecde9ab79fac8a493bc.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_iterate_nested_groups
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_2c18826a975a61c409256394bf7c334c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_not_iterate_normal_group
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_b8d4d327d2a1ea45d3cd9acaaeb31eae.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_run_reduce
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_866e76262ba81f9c5715e8ded2fe3a59.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_support_constants
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_a39281cc879a9b6273edcdbed96961e2.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_support_loop_all_and_loop
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_6602b7587dbde3a4a1b3e89c9c525b5d.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_support_loop_all_and_loop
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_e689702faaab40fa4ea1f65ea53a7bef.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_work_with_non_double_arrays
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_eb65dab7322c93aca79da89ad850753b.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_integrator_calls_only_py_when_no_stage
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_0f8ccecffa92800f1e1c60caf7d61af8.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_integrator_calls_py_stage1
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_128d15c2d9e0ca930a88ca79cf433a70.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_integrator_calls_py_stage1_stage2
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_bb0974af10f5bc3963019f23b881db5f.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_leapfrog
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_9eceb868e252ed6a0ece6395929a8bcb.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestPEFRLIntegrator::test_pefrl
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_172dfa9e1631ea7d9482824bc06d14ec.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_crksph
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_f24545a9be7b831ac0425b878177cb20.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_crksph_symmetric
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_437e8cc88a6fd6e3c098b821810973a7.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_gradient_correction
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_6606a5783a6859b7a65db9ef38136431.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_mixed_gradient_correction
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_25d6e5546a9f5073e1f4f91086e35c77.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:63: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:64: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /usr/lib/python3/dist-packages/numpy/matrixlib/defmatrix.py:71: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    return matrix(data, dtype=dtype, copy=False)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:74: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:75: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:52: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:53: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:97: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:98: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:109: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:110: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:86: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:87: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_multi_group_integrator.py::TestMultiGroupIntegrator::test_different_accels_per_integrator
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_b02855c53973b41cd416728f4f3d5a63.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_multi_group_integrator.py::TestMultiGroupIntegrator::test_different_accels_per_integrator
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_646dd531e33008b11fdce652ba19a184.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/tools/geometry.py:176: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    rotation_matrix = matrix_exp(np.matrix(matrix))

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py:76: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    rotation_matrix = G.matrix_exp(np.matrix(mat))

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_be_able_to_update_particle_arrays
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_correctly_update_domain
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_on_2d_data
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_with_changed_data
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_with_explicit_points_in_constructor
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_with_explicit_points_without_z
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_with_ghost_particles
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_that_set_interpolation_points_works
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_63eb3305b979b7a1b9a17a7d96aac627.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_when_arrays_have_different_props
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_65f5bdc760e2a27615a73da5488fbd36.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_interpolator.py::TestInterpolator::test_should_work_with_multiple_arrays
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_201933e7c61a3d2eee2db11937988b59.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_sph_evaluator.py::TestSPHEvaluator::test_evaluation
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_sph_evaluator.py::TestSPHEvaluator::test_evaluation_with_domain_manager
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_sph_evaluator.py::TestSPHEvaluator::test_updating_particle_arrays
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /home/antonio/debian/git/pysph/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-amd64/m_1473f070e1b4d1e184f4ef7b8ae43e27.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

-- Docs: https://docs.pytest.org/en/latest/warnings.html
= 13 failed, 680 passed, 283 skipped, 58 deselected, 101 warnings in 454.20 seconds =
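Aside from the failures, most of the warnings above are the same Cython FutureWarning repeated for every generated module: the .pyx files are compiled without an explicit language_level, so Cython defaults to 2 and warns. A minimal sketch of how a Cython build pins the language level is shown below; it is generic Cython usage with a hypothetical module name, not a change to PySPH's generated code.

# Illustration only: pinning the Cython language level silences the
# repeated FutureWarning seen in the warnings summary.
from Cython.Build import cythonize

# A tiny hypothetical module written next to this script.
with open("example_mod.pyx", "w") as f:
    f.write("# cython: language_level=3\n")      # per-file directive ...
    f.write("def hello():\n    return 'hello'\n")

# ... or pass the directive globally when cythonizing.
cythonize(["example_mod.pyx"], compiler_directives={"language_level": 3})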

Unit test failure on i386

Unit tests seem to fail on i386 with Python 3.7 for PySPH v1.0a6.

The tests have been run using the following command:

$ python3.7 -m pytest -k 'not test_sph_evaluator and not TestInterpolator'

Test results are reported below:

============================= test session starts ==============================
platform linux -- Python 3.7.4+, pytest-4.6.5, py-1.8.0, pluggy-0.12.0
rootdir: /<<PKGBUILDDIR>>, inifile: tox.ini
collected 1034 items / 71 deselected / 963 selected

pysph/base/tests/test_carray.py ........................                 [  2%]
pysph/base/tests/test_device_helper.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss [  5%]
FssFss.ss.ss.ssFss                                                       [  7%]
pysph/base/tests/test_domain_manager.py .........                        [  8%]
pysph/base/tests/test_kernel.py ........................................ [ 12%]
.................................................................        [ 19%]
pysph/base/tests/test_linalg3.py .......                                 [ 20%]
pysph/base/tests/test_neighbor_cache.py ...                              [ 20%]
pysph/base/tests/test_nnps.py .......ssssssssss......................... [ 25%]
....................................................ssssssssssssssssssss [ 32%]
sssssssssssssssssssssssssssssssssssssssssss............................. [ 39%]
...................                                                      [ 41%]
pysph/base/tests/test_octree.py .....s...s...s...s...s...s...s...s.      [ 45%]
pysph/base/tests/test_particle_array.py ..............................ss [ 48%]
ssssssssssssssssssssssssssssssssssssssssssssssssss                       [ 54%]
pysph/base/tests/test_periodic_nnps.py ...........                       [ 55%]
pysph/base/tests/test_reduce_array.py ......                             [ 55%]
pysph/cpy/tests/test_array.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss       [ 59%]
pysph/cpy/tests/test_ast_utils.py .......                                [ 60%]
pysph/cpy/tests/test_capture_stream.py .....                             [ 60%]
pysph/cpy/tests/test_config.py ............                              [ 62%]
pysph/cpy/tests/test_cython_generator.py ...............                 [ 63%]
pysph/cpy/tests/test_ext_module.py .......                               [ 64%]
pysph/cpy/tests/test_jit.py .................                            [ 66%]
pysph/cpy/tests/test_low_level.py ssss..                                 [ 66%]
pysph/cpy/tests/test_parallel.py s.ss.ss.ss.ss.ss.ss..ss..ss..ss..ss.ss. [ 70%]
ss.ss.ss.ss.ss..ss..ss..ss..s                                            [ 73%]
pysph/cpy/tests/test_translator.py ..................................... [ 77%]
............                                                             [ 78%]
pysph/cpy/tests/test_transpiler.py ..                                    [ 79%]
pysph/cpy/tests/test_types.py .......                                    [ 79%]
pysph/examples/tests/test_riemann_solver.py s........                    [ 80%]
pysph/parallel/tests/test_openmp.py .                                    [ 80%]
pysph/parallel/tests/test_parallel.py ssssss                             [ 81%]
pysph/parallel/tests/test_parallel_run.py s                              [ 81%]
pysph/solver/tests/test_application.py ..                                [ 81%]
pysph/solver/tests/test_solver.py ..                                     [ 82%]
pysph/solver/tests/test_solver_utils.py ......sssss.                     [ 83%]
pysph/sph/tests/test_acceleration_eval.py ....................ssssssssss [ 86%]
ssssssssssssssssss                                                       [ 88%]
pysph/sph/tests/test_acceleration_eval_cython_helper.py ...              [ 88%]
pysph/sph/tests/test_equations.py ...............                        [ 90%]
pysph/sph/tests/test_integrator.py .........sss                          [ 91%]
pysph/sph/tests/test_integrator_cython_helper.py .                       [ 91%]
pysph/sph/tests/test_integrator_step.py ..                               [ 91%]
pysph/sph/tests/test_kernel_corrections.py ................              [ 93%]
pysph/sph/tests/test_linalg.py ...........                               [ 94%]
pysph/sph/tests/test_multi_group_integrator.py .s                        [ 94%]
pysph/sph/tests/test_riemann_solver.py .............                     [ 96%]
pysph/sph/tests/test_scheme.py .                                         [ 96%]
pysph/sph/tests/test_simple_inlet_outlet.py .......                      [ 96%]
pysph/tools/tests/test_geometry.py .............s...                     [ 98%]
pysph/tools/tests/test_geometry_stl.py .....F..                          [ 99%]
pysph/tools/tests/test_interpolator.py ...                               [ 99%]
pyzoltan/core/tests/test_zoltan.py ss                                    [100%]

=================================== FAILURES ===================================
________________ TestDeviceHelper.test_remove_particles[cython] ________________

obj = array([         0,          0,   -2097152, 1106247679], dtype=int32)
method = 'take', args = (array([0, 3], dtype=uint32),)
kwds = {'axis': None, 'mode': 'raise', 'out': None}

    def _wrapfunc(obj, method, *args, **kwds):
        try:
>           return getattr(obj, method)(*args, **kwds)
E           TypeError: Cannot cast array data from dtype('uint32') to dtype('int32') according to the rule 'safe'

/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:56: TypeError

During handling of the above exception, another exception occurred:

self = <pysph.base.tests.test_device_helper.TestDeviceHelper object at 0xf38632ac>
backend = 'cython'

    @test_all_backends
    def test_remove_particles(self, backend):
        check_import(backend)
        self.setup()
        # Given
        pa = self.pa
        h = DeviceHelper(pa, backend=backend)
    
        # When
        pa.set_device_helper(h)
        h.resize(4)
        h.x.set(np.array([2.0, 3.0, 4.0, 5.0], h.x.dtype))
    
        indices = np.array([1, 2], dtype=np.uint32)
        indices = array.to_device(indices, backend=backend)
    
>       h.remove_particles(indices)

pysph/base/tests/test_device_helper.py:289: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/base/device_helper.py:400: in remove_particles
    self._remove_particles_bool(if_remove)
pysph/base/device_helper.py:343: in _remove_particles_bool
    self._data[prop].align(s_index)
pysph/cpy/array.py:440: in align
    backend=self.backend))
pysph/cpy/array.py:220: in take
    out = np.take(ary.dev, indices.dev)
/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:189: in take
    return _wrapfunc(a, 'take', indices, axis=axis, out=out, mode=mode)
/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:66: in _wrapfunc
    return _wrapit(obj, method, *args, **kwds)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

obj = array([         0,          0,   -2097152, 1106247679], dtype=int32)
method = 'take', args = (array([0, 3], dtype=uint32),)
kwds = {'axis': None, 'mode': 'raise', 'out': None}
wrap = <built-in method __array_wrap__ of numpy.ndarray object at 0xf20d5278>

    def _wrapit(obj, method, *args, **kwds):
        try:
            wrap = obj.__array_wrap__
        except AttributeError:
            wrap = None
>       result = getattr(asarray(obj), method)(*args, **kwds)
E       TypeError: Cannot cast array data from dtype('uint32') to dtype('int32') according to the rule 'safe'

/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:46: TypeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.pyx
running build_ext
cythoning /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.pyx to /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.cpp
building 'm_9195542b3857fa6f3b64c7d14cb67f62' extension
i686-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/usr/include/python3.7m -c /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.cpp -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.o -fopenmp
i686-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.o -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/lib.linux-i386-3.7/m_9195542b3857fa6f3b64c7d14cb67f62.cpython-37m-i386-linux-gnu.so -fopenmp
Compiling code at: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.pyx
running build_ext
cythoning /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.pyx to /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.cpp
building 'm_2250e3389bc8cd7df0c03bbb346401c7' extension
i686-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/usr/include/python3.7m -c /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.cpp -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.o -fopenmp
i686-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.o -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/lib.linux-i386-3.7/m_2250e3389bc8cd7df0c03bbb346401c7.cpython-37m-i386-linux-gnu.so -fopenmp
Compiling code at: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.pyx
running build_ext
cythoning /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.pyx to /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.cpp
building 'm_5cf03441ed5cac9d4491b85306d9754c' extension
i686-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/usr/include/python3.7m -c /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.cpp -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.o -fopenmp
i686-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.o -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/lib.linux-i386-3.7/m_5cf03441ed5cac9d4491b85306d9754c.cpython-37m-i386-linux-gnu.so -fopenmp
____________ TestDeviceHelper.test_remove_tagged_particles[cython] _____________

obj = array([0, 0, 1, 0, 1], dtype=int32), method = 'take'
args = (array([0, 1, 3], dtype=uint32),)
kwds = {'axis': None, 'mode': 'raise', 'out': None}

    def _wrapfunc(obj, method, *args, **kwds):
        try:
>           return getattr(obj, method)(*args, **kwds)
E           TypeError: Cannot cast array data from dtype('uint32') to dtype('int32') according to the rule 'safe'

/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:56: TypeError

During handling of the above exception, another exception occurred:

self = <pysph.base.tests.test_device_helper.TestDeviceHelper object at 0xf2fa328c>
backend = 'cython'

    @test_all_backends
    def test_remove_tagged_particles(self, backend):
        check_import(backend)
        self.setup()
        # Given
        pa = self.pa
        h = DeviceHelper(pa, backend=backend)
    
        # When
        pa.set_device_helper(h)
        h.resize(5)
        h.x.set(np.array([2.0, 3.0, 4.0, 5.0, 6.0], h.x.dtype))
        h.tag.set(np.array([0, 0, 1, 0, 1], h.tag.dtype))
    
>       h.remove_tagged_particles(1)

pysph/base/tests/test_device_helper.py:308: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/base/device_helper.py:415: in remove_tagged_particles
    self._remove_particles_bool(if_remove)
pysph/base/device_helper.py:343: in _remove_particles_bool
    self._data[prop].align(s_index)
pysph/cpy/array.py:440: in align
    backend=self.backend))
pysph/cpy/array.py:220: in take
    out = np.take(ary.dev, indices.dev)
/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:189: in take
    return _wrapfunc(a, 'take', indices, axis=axis, out=out, mode=mode)
/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:66: in _wrapfunc
    return _wrapit(obj, method, *args, **kwds)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

obj = array([0, 0, 1, 0, 1], dtype=int32), method = 'take'
args = (array([0, 1, 3], dtype=uint32),)
kwds = {'axis': None, 'mode': 'raise', 'out': None}
wrap = <built-in method __array_wrap__ of numpy.ndarray object at 0xf27ff3b8>

    def _wrapit(obj, method, *args, **kwds):
        try:
            wrap = obj.__array_wrap__
        except AttributeError:
            wrap = None
>       result = getattr(asarray(obj), method)(*args, **kwds)
E       TypeError: Cannot cast array data from dtype('uint32') to dtype('int32') according to the rule 'safe'

/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:46: TypeError
----------------------------- Captured stdout call -----------------------------
Precompiled code from: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.pyx
Precompiled code from: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.pyx
_______________ TestDeviceHelper.test_extract_particles[cython] ________________

obj = array([0, 0, 0, 0], dtype=int32), method = 'take'
args = (array([1, 2], dtype=uint32),)
kwds = {'axis': None, 'mode': 'raise', 'out': None}

    def _wrapfunc(obj, method, *args, **kwds):
        try:
>           return getattr(obj, method)(*args, **kwds)
E           TypeError: Cannot cast array data from dtype('uint32') to dtype('int32') according to the rule 'safe'

/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:56: TypeError

During handling of the above exception, another exception occurred:

self = <pysph.base.tests.test_device_helper.TestDeviceHelper object at 0xf25e82ac>
backend = 'cython'

    @test_all_backends
    def test_extract_particles(self, backend):
        check_import(backend)
        self.setup()
        # Given
        pa = get_particle_array(name='f', x=[0.0, 1.0, 2.0, 3.0],
                                m=1.0, rho=2.0)
        h = DeviceHelper(pa, backend=backend)
        pa.set_device_helper(h)
    
        # When
        indices = np.array([1, 2], dtype=np.uint32)
        indices = array.to_device(indices, backend=backend)
    
>       result_pa = h.extract_particles(indices)

pysph/base/tests/test_device_helper.py:376: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pysph/base/device_helper.py:584: in extract_particles
    backend=self.backend))
pysph/cpy/array.py:220: in take
    out = np.take(ary.dev, indices.dev)
/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:189: in take
    return _wrapfunc(a, 'take', indices, axis=axis, out=out, mode=mode)
/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:66: in _wrapfunc
    return _wrapit(obj, method, *args, **kwds)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

obj = array([0, 0, 0, 0], dtype=int32), method = 'take'
args = (array([1, 2], dtype=uint32),)
kwds = {'axis': None, 'mode': 'raise', 'out': None}
wrap = <built-in method __array_wrap__ of numpy.ndarray object at 0xf2c50570>

    def _wrapit(obj, method, *args, **kwds):
        try:
            wrap = obj.__array_wrap__
        except AttributeError:
            wrap = None
>       result = getattr(asarray(obj), method)(*args, **kwds)
E       TypeError: Cannot cast array data from dtype('uint32') to dtype('int32') according to the rule 'safe'

/usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:46: TypeError
----------------------------- Captured stdout call -----------------------------
Compiling code at: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.pyx
running build_ext
cythoning /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.pyx to /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.cpp
building 'm_d34793424e1f2e13e0fecd1771d2a7c0' extension
i686-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/lib/python3/dist-packages/numpy/core/include -I/usr/include/python3.7m -c /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.cpp -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.o -fopenmp
i686-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-z,relro -Wl,-z,now -g -O2 -fdebug-prefix-map=/<<PKGBUILDDIR>>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/temp.linux-i386-3.7/<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.o -o /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/build/lib.linux-i386-3.7/m_d34793424e1f2e13e0fecd1771d2a7c0.cpython-37m-i386-linux-gnu.so -fopenmp
______________________ TestGeometry.test_get_stl_surface _______________________

self = <pysph.tools.tests.test_geometry_stl.TestGeometry testMethod=test_get_stl_surface>

    def test_get_stl_surface(self):
        """Check if stl surface is generated correctly for unit cube"""
        cube_fname = self._generate_cube_stl()
        h = 0.1
        x, y, z = G.get_stl_surface(cube_fname, h, h, 1)
>       self._cube_assert(x, y, z, h)

pysph/tools/tests/test_geometry_stl.py:176: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pysph.tools.tests.test_geometry_stl.TestGeometry testMethod=test_get_stl_surface>
x = array([0., 0., 0., ..., 1., 1., 1.])
y = array([-0.1, -0.1, -0.1, ...,  1. ,  1. ,  1. ])
z = array([0. , 0.1, 0.2, ..., 0.8, 0.9, 1. ]), h = 0.1

    def _cube_assert(self, x, y, z, h):
        """Check if x,y,z lie within surface of thickness `h` of a unit cube"""
        def surface1(x, y, z): return min(abs(x), abs(1 - x)) < h and \
            y > -h and y < 1 + h and z > -h and z < 1 + h
    
        def on_surface(x, y, z): return surface1(x, y, z) or \
            surface1(y, x, z) or surface1(z, x, y)
    
        for i in range(x.shape[0]):
>           assert on_surface(x[i], y[i], z[i])
E           AssertionError: assert False
E            +  where False = <function TestGeometry._cube_assert.<locals>.on_surface at 0xf20b2fa4>(0.0, -0.1, 0.0)

pysph/tools/tests/test_geometry_stl.py:162: AssertionError
=============================== warnings summary ===============================
/usr/lib/python3/dist-packages/_pytest/mark/structures.py:242
/usr/lib/python3/dist-packages/_pytest/mark/structures.py:242
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:242: PytestCollectionWarning: cannot collect 'test_all_backends' because it is not a function.
    def __call__(self, *args, **kwargs):

/usr/lib/python3/dist-packages/_pytest/mark/structures.py:334
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:334: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

/usr/lib/python3/dist-packages/_pytest/mark/structures.py:334
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:334: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_get_number_of_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_91e0575b640469ac68a80641e2db24b0.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_remove_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9195542b3857fa6f3b64c7d14cb67f62.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_remove_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_2250e3389bc8cd7df0c03bbb346401c7.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_remove_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5cf03441ed5cac9d4491b85306d9754c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_device_helper.py::TestDeviceHelper::test_extract_particles[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d34793424e1f2e13e0fecd1771d2a7c0.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_levels_in_tree
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_parent_for_node
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_plot_root
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestOctreeFor1DDataset::test_sum_of_indices_lengths_equals_total_number_of_particles
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py:192: DeprecationWarning: object of type <class 'float'> cannot be safely interpreted as an integer.
    self.x = np.linspace(0, 1, num=N)

.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_levels_in_tree
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_parent_for_node
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_plot_root
.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py::TestCompressedOctreeFor1DDataset::test_sum_of_indices_lengths_equals_total_number_of_particles
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/base/tests/test_octree.py:238: DeprecationWarning: object of type <class 'float'> cannot be safely interpreted as an integer.
    self.x = np.linspace(0, 1, num=N)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_array.py::test_remove[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_54f9f75e0f45f43dc44f58b838c92f06.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_array.py::test_remove[cython]
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_70edfcc8cb29a9c84af9920f87d1b951.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_ext_module.py::TestExtModule::test_load_module
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /tmp/tmp8655ndf6/m_59a539f0f96ecda080392b137f7ab38e.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_ext_module.py::TestExtModule::test_rebuild_when_dependencies_change
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /tmp/tmp0sb88tdv/m_59a539f0f96ecda080392b137f7ab38e.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_jit.py::TestAnnotationHelper::test_non_jit_call_as_call_arg
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/cpy/jit.py:125: UserWarning: 
  In code in line 5:
  
      return g(sin(a))
               ^
  
  
  Function called is not marked by the annotate decorator. Argument
  type defaulting to 'double'. If the type is not 'double', store
  the value in a variable of appropriate type and use the variable
  
    warnings.warn(msg)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_jit.py::TestAnnotationHelper::test_non_jit_call_in_return
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/cpy/jit.py:125: UserWarning: 
  In code in line 5:
  
      return sin(a)
             ^
  
  
  Function called is not marked by the annotate decorator. Argument
  type defaulting to 'double'. If the type is not 'double', store
  the value in a variable of appropriate type and use the variable
  
    warnings.warn(msg)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_low_level.py::TestCython::test_cython_code_with_return_and_nested_call
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_74d45969c7f812097ca91e32d292c9b5.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_low_level.py::TestCython::test_cython_with_externs
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_d99220a885a6ea16f2900812b6af8d9c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_elementwise_works_with_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_7e50ca36a38cc6e5bdb25fdd039133ca.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_neutral_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9a2c7ebf648842a29b3befe6f44f4a0e.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_3913705ba17110bad123d7fec52d1fa6.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_with_map_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_ea8309077f21b49c2e55c4fff007c57d.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_reduction_works_without_map_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_159c0a504e13152c82e3025f68c52a97.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_last_item_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9c51f8a3ca12ef2fff590c265f372100.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_17aa11b78980ae118499ef3b67528262.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_21c9822c18457511d7d6c48f60636280.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_303d8bc784c0fe041df30586ae859245.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_scan_works_with_external_func_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_116d33965d4a335e1a3a722a1aa511aa.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_segmented_scan_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_678b9adc1c242c435c53cf38d505d0ea.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_segmented_scan_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9824ea52a3de157665153f82c134736f.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_unique_scan_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_10cce52efb3dc468ae85f83fe5c9ea40.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtils::test_unique_scan_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_c5d750f49c73249649bcbfb5023e25d4.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_reduction_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5662c6866ef7f5d806bb92c47362ed37.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_reduction_works_with_map_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_5c107ae13a11a41f9f03e8f69b3ce2c5.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_scan_works_with_external_func_cython
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_9c2467add4a88d72ce0a1f3a67fb447c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/cpy/tests/test_parallel.py::TestParallelUtilsJIT::test_scan_works_with_external_func_cython_parallel
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.cpy/source/py3.7-linux-i386/m_3ce863393ce97030d92dd85a8bb3e38d.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_call_initialize_pair
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_1ca2f6f3c87a0aced1147a42323c9e0b.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_call_pre_post_functions_in_group
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_380d33049e60eeba93e7e250e099d136.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_call_py_initialize
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_cc7e6f6747bcab4616b25771b58a0003.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_handle_repeated_helper_functions
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_605c1c7cab1f607d6ddbe4596f54170c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_iterate_iterated_group
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_e6c3637f476bcecde9ab79fac8a493bc.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_iterate_nested_groups
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_2c18826a975a61c409256394bf7c334c.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_not_iterate_normal_group
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_b8d4d327d2a1ea45d3cd9acaaeb31eae.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_run_reduce
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_866e76262ba81f9c5715e8ded2fe3a59.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_support_constants
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_a39281cc879a9b6273edcdbed96961e2.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_support_loop_all_and_loop
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_6602b7587dbde3a4a1b3e89c9c525b5d.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_support_loop_all_and_loop
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_e689702faaab40fa4ea1f65ea53a7bef.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_acceleration_eval.py::TestAccelerationEval1D::test_should_work_with_non_double_arrays
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_eb65dab7322c93aca79da89ad850753b.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_integrator_calls_only_py_when_no_stage
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_0f8ccecffa92800f1e1c60caf7d61af8.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_integrator_calls_py_stage1
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_128d15c2d9e0ca930a88ca79cf433a70.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_integrator_calls_py_stage1_stage2
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_bb0974af10f5bc3963019f23b881db5f.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestLeapFrogIntegrator::test_leapfrog
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_9eceb868e252ed6a0ece6395929a8bcb.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_integrator.py::TestPEFRLIntegrator::test_pefrl
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_172dfa9e1631ea7d9482824bc06d14ec.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_crksph
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_f24545a9be7b831ac0425b878177cb20.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_crksph_symmetric
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_437e8cc88a6fd6e3c098b821810973a7.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_gradient_correction
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_6606a5783a6859b7a65db9ef38136431.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_kernel_corrections.py::TestKernelCorrection2D::test_mixed_gradient_correction
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_25d6e5546a9f5073e1f4f91086e35c77.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:63: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:64: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /usr/lib/python3/dist-packages/numpy/matrixlib/defmatrix.py:71: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    return matrix(data, dtype=dtype, copy=False)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:74: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:75: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:52: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:53: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:97: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:98: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:109: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:110: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:86: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/sph/tests/test_linalg.py:87: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_multi_group_integrator.py::TestMultiGroupIntegrator::test_different_accels_per_integrator
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_b02855c53973b41cd416728f4f3d5a63.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/sph/tests/test_multi_group_integrator.py::TestMultiGroupIntegrator::test_different_accels_per_integrator
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:367: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/.pysph/source/py3.7-linux-i386/m_646dd531e33008b11fdce652ba19a184.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/tools/geometry.py:176: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    rotation_matrix = matrix_exp(np.matrix(matrix))

.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.7/build/pysph/tools/tests/test_geometry.py:76: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    rotation_matrix = G.matrix_exp(np.matrix(mat))

-- Docs: https://docs.pytest.org/en/latest/warnings.html

= 4 failed, 676 passed, 283 skipped, 71 deselected, 88 warnings in 626.70 seconds =

More details are available in [1].

[1] https://buildd.debian.org/status/fetch.php?pkg=pysph&arch=i386&ver=1.0%7Ea6-3&stamp=1567841029&raw=0
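The three DeviceHelper failures above appear to share one root cause: on this 32-bit (i386) build NumPy's native index type (np.intp) is int32, so np.take refuses to cast the uint32 index arrays under its 'safe' casting rule, whereas on 64-bit builds the same indices cast safely to int64 and no error is raised. A minimal sketch of the cast problem outside PySPH (the array values below are made up purely for illustration):

import numpy as np

# On a platform where np.intp is int32 (e.g. 32-bit Linux) this raises the
# same TypeError seen in the tracebacks above; on 64-bit it works fine.
data = np.array([0, 0, 1, 0, 1], dtype=np.int32)
indices = np.array([0, 1, 3], dtype=np.uint32)

try:
    print(np.take(data, indices))
except TypeError as err:
    print(err)  # Cannot cast array data from dtype('uint32') to dtype('int32') ...

# Casting the indices to the platform's native index type avoids the error.
print(np.take(data, indices.astype(np.intp)))

If that is indeed the cause, converting the index arrays to np.intp (or a signed 64-bit type) before the np.take call in pysph/cpy/array.py would presumably fix these three failures on i386; the remaining test_get_stl_surface failure looks unrelated.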

Error when running pysph test

Hi, I am new to PySPH and installed it on my Ubuntu 18.04 machine.
When I run "pysph test", I get the following error:

`====================================== test session starts =======================================
platform linux -- Python 3.6.9, pytest-5.4.1, py-1.8.1, pluggy-0.13.1
rootdir: /home/ehsan/gpusph
collected 242 items / 26 errors / 216 selected

============================================= ERRORS =============================================
_______________________ ERROR collecting base/tests/test_device_helper.py ________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/base/tests/test_device_helper.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/base/tests/test_device_helper.py:4: in
from pysph.base.utils import get_particle_array # noqa: E402
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting base/tests/test_domain_manager.py _______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/base/tests/test_domain_manager.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/base/tests/test_domain_manager.py:8: in
from pysph.base.nnps import DomainManager, BoxSortNNPS, LinkedListNNPS
../.local/lib/python3.6/site-packages/pysph/base/nnps.py:1: in
from pysph.base.nnps_base import get_number_of_threads, py_flatten,
pysph/base/particle_array.pxd:33: in init pysph.base.nnps_base
???
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting base/tests/test_neighbor_cache.py _______________________
../.local/lib/python3.6/site-packages/pysph/base/tests/test_neighbor_cache.py:7: in
from pysph.base.nnps import NeighborCache, LinkedListNNPS
../.local/lib/python3.6/site-packages/pysph/base/nnps.py:1: in
from pysph.base.nnps_base import get_number_of_threads, py_flatten,
pysph/base/nnps_base.pyx:218: in init pysph.base.nnps_base
???
E AttributeError: type object 'pysph.base.nnps_base.NNPSParticleArrayWrapper' has no attribute 'reduce_cython'
____________________________ ERROR collecting base/tests/test_nnps.py ____________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/base/tests/test_nnps.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/base/tests/test_nnps.py:12: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
___________________________ ERROR collecting base/tests/test_octree.py ___________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/base/tests/test_octree.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/base/tests/test_octree.py:10: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting base/tests/test_particle_array.py _______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/base/tests/test_particle_array.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/base/tests/test_particle_array.py:10: in
from pysph.base import particle_array
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting base/tests/test_periodic_nnps.py ________________________
../.local/lib/python3.6/site-packages/pysph/base/tests/test_periodic_nnps.py:7: in
from pysph.base.nnps import DomainManager, BoxSortNNPS, LinkedListNNPS,
../.local/lib/python3.6/site-packages/pysph/base/nnps.py:1: in
from pysph.base.nnps_base import get_number_of_threads, py_flatten,
pysph/base/nnps_base.pyx:218: in init pysph.base.nnps_base
???
E AttributeError: type object 'pysph.base.nnps_base.NNPSParticleArrayWrapper' has no attribute 'reduce_cython'
___________________________ ERROR collecting base/tests/test_utils.py ____________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/base/tests/test_utils.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/base/tests/test_utils.py:3: in
from ..utils import is_overloaded_method
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_________________________ ERROR collecting parallel/tests/test_openmp.py _________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_openmp.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/parallel/tests/test_openmp.py:10: in
from .example_test_case import ExampleTestCase, get_example_script
../.local/lib/python3.6/site-packages/pysph/parallel/tests/example_test_case.py:9: in
from pysph.solver.utils import load, get_files
../.local/lib/python3.6/site-packages/pysph/solver/utils.py:13: in
from pysph.solver.output import load, dump, output_formats # noqa: 401
../.local/lib/python3.6/site-packages/pysph/solver/output.py:9: in
from pysph.base.particle_array import ParticleArray
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
______________________ ERROR collecting parallel/tests/test_parallel_run.py ______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel_run.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel_run.py:12: in
from pysph.parallel.tests.example_test_case import ExampleTestCase, get_example_script
../.local/lib/python3.6/site-packages/pysph/parallel/tests/example_test_case.py:9: in
from pysph.solver.utils import load, get_files
../.local/lib/python3.6/site-packages/pysph/solver/utils.py:13: in
from pysph.solver.output import load, dump, output_formats # noqa: 401
../.local/lib/python3.6/site-packages/pysph/solver/output.py:9: in
from pysph.base.particle_array import ParticleArray
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting solver/tests/test_application.py ________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/solver/tests/test_application.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/solver/tests/test_application.py:19: in
from pysph.solver.application import Application
../.local/lib/python3.6/site-packages/pysph/solver/application.py:18: in
from pysph.base import utils
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
__________________________ ERROR collecting solver/tests/test_solver.py __________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/solver/tests/test_solver.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/solver/tests/test_solver.py:15: in
from pysph.solver.solver import Solver
../.local/lib/python3.6/site-packages/pysph/solver/solver.py:12: in
from pysph.solver.utils import ProgressBar, load, dump
../.local/lib/python3.6/site-packages/pysph/solver/utils.py:13: in
from pysph.solver.output import load, dump, output_formats # noqa: 401
../.local/lib/python3.6/site-packages/pysph/solver/output.py:9: in
from pysph.base.particle_array import ParticleArray
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting solver/tests/test_solver_utils.py _______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/solver/tests/test_solver_utils.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/solver/tests/test_solver_utils.py:14: in
from pysph.base.utils import get_particle_array, get_particle_array_wcsph
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
___________________ ERROR collecting sph/bc/tests/test_simple_inlet_outlet.py ____________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/bc/tests/test_simple_inlet_outlet.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/bc/tests/test_simple_inlet_outlet.py:13: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
______________________ ERROR collecting sph/tests/test_acceleration_eval.py ______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_acceleration_eval.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_acceleration_eval.py:13: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________ ERROR collecting sph/tests/test_acceleration_eval_cython_helper.py _______________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_acceleration_eval_cython_helper.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_acceleration_eval_cython_helper.py:8: in
from pysph.base.particle_array import ParticleArray
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_________________________ ERROR collecting sph/tests/test_integrator.py __________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_integrator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_integrator.py:9: in
from pysph.base.utils import get_particle_array, get_particle_array_wcsph
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
__________________ ERROR collecting sph/tests/test_integrator_cython_helper.py ___________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_integrator_cython_helper.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_integrator_cython_helper.py:5: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting sph/tests/test_integrator_step.py _______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_integrator_step.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_integrator_step.py:6: in
from pysph.base.utils import get_particle_array as gpa
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_____________________ ERROR collecting sph/tests/test_kernel_corrections.py ______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_kernel_corrections.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_kernel_corrections.py:5: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
___________________ ERROR collecting sph/tests/test_multi_group_integrator.py ____________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_multi_group_integrator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_multi_group_integrator.py:10: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
___________________________ ERROR collecting sph/tests/test_scheme.py ____________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/sph/tests/test_scheme.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/sph/tests/test_scheme.py:4: in
from pysph.sph.wc.edac import EDACScheme
../.local/lib/python3.6/site-packages/pysph/sph/wc/edac.py:22: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_________________________ ERROR collecting tools/tests/test_geometry.py __________________________
../.local/lib/python3.6/site-packages/pysph/tools/tests/test_geometry.py:5: in
import pysph.tools.geometry as G
../.local/lib/python3.6/site-packages/pysph/tools/geometry.py:4: in
from pysph.base.nnps import LinkedListNNPS
../.local/lib/python3.6/site-packages/pysph/base/nnps.py:1: in
from pysph.base.nnps_base import get_number_of_threads, py_flatten,
pysph/base/nnps_base.pyx:218: in init pysph.base.nnps_base
???
E AttributeError: type object 'pysph.base.nnps_base.NNPSParticleArrayWrapper' has no attribute 'reduce_cython'
_______________________ ERROR collecting tools/tests/test_geometry_stl.py ________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/tools/tests/test_geometry_stl.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/tools/tests/test_geometry_stl.py:5: in
from pysph.base.particle_array import ParticleArray
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting tools/tests/test_interpolator.py ________________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/tools/tests/test_interpolator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/tools/tests/test_interpolator.py:16: in
from pysph.tools.interpolator import get_nx_ny_nz, Interpolator
../.local/lib/python3.6/site-packages/pysph/tools/interpolator.py:8: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
_______________________ ERROR collecting tools/tests/test_sph_evaluator.py _______________________
ImportError while importing test module '/home/ehsan/.local/lib/python3.6/site-packages/pysph/tools/tests/test_sph_evaluator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../.local/lib/python3.6/site-packages/pysph/tools/tests/test_sph_evaluator.py:5: in
from pysph.base.utils import get_particle_array
../.local/lib/python3.6/site-packages/pysph/base/utils.py:7: in
from .particle_array import ParticleArray,
pysph/base/particle_array.pyx:25: in init pysph.base.particle_array
???
../.local/lib/python3.6/site-packages/pysph/base/device_helper.py:12: in
from compyle.template import Template
E ModuleNotFoundError: No module named 'compyle.template'
======================================== warnings summary ========================================
/home/ehsan/.local/lib/python3.6/site-packages/compyle/ext_module.py:7
/home/ehsan/.local/lib/python3.6/site-packages/compyle/ext_module.py:7: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp

/home/ehsan/.local/lib/python3.6/site-packages/pysph/examples/tests/test_examples.py:84
/home/ehsan/.local/lib/python3.6/site-packages/pysph/examples/tests/test_examples.py:84: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.slow

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:51
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:51: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.parallel

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:55
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.parallel

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:68
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:68: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.slow

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:69
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:69: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.parallel

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:88
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:88: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.parallel

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:94
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:94: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.parallel

/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:108
/home/ehsan/.local/lib/python3.6/site-packages/pysph/parallel/tests/test_parallel.py:108: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
@mark.parallel

-- Docs: https://docs.pytest.org/en/latest/warnings.html
==================================== short test summary info =====================================
ERROR base/tests/test_device_helper.py
ERROR base/tests/test_domain_manager.py
ERROR base/tests/test_neighbor_cache.py - AttributeError: type object 'pysph.base.nnps_base.NNP...
ERROR base/tests/test_nnps.py
ERROR base/tests/test_octree.py
ERROR base/tests/test_particle_array.py
ERROR base/tests/test_periodic_nnps.py - AttributeError: type object 'pysph.base.nnps_base.NNPS...
ERROR base/tests/test_utils.py
ERROR parallel/tests/test_openmp.py
ERROR parallel/tests/test_parallel_run.py
ERROR solver/tests/test_application.py
ERROR solver/tests/test_solver.py
ERROR solver/tests/test_solver_utils.py
ERROR sph/bc/tests/test_simple_inlet_outlet.py
ERROR sph/tests/test_acceleration_eval.py
ERROR sph/tests/test_acceleration_eval_cython_helper.py
ERROR sph/tests/test_integrator.py
ERROR sph/tests/test_integrator_cython_helper.py
ERROR sph/tests/test_integrator_step.py
ERROR sph/tests/test_kernel_corrections.py
ERROR sph/tests/test_multi_group_integrator.py
ERROR sph/tests/test_scheme.py
ERROR tools/tests/test_geometry.py - AttributeError: type object 'pysph.base.nnps_base.NNPSPart...
ERROR tools/tests/test_geometry_stl.py
ERROR tools/tests/test_interpolator.py
ERROR tools/tests/test_sph_evaluator.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 26 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!
================================= 9 warnings, 26 errors in 1.45s =================================
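All 26 collection errors above reduce to two messages: `ModuleNotFoundError: No module named 'compyle.template'` and `AttributeError: ... has no attribute 'reduce_cython'`. Both usually point at a mismatched environment rather than a problem in the test suite: `compyle.template` only exists in recent compyle releases, and a missing `reduce_cython` attribute is typical of a Cython extension that was built with a different Cython version than the one now installed. The PytestUnknownMarkWarning entries about the `slow` and `parallel` marks are harmless and can be silenced by registering those markers in a pytest configuration file. A minimal diagnostic sketch follows; only the import checks are implied by the traceback, and the suggested remedy (upgrade compyle, reinstall PySPH) is an assumption, not an official fix:

```python
# Minimal sketch: report which compyle/Cython/PySPH the interpreter actually imports.
# If 'compyle.template' fails to import here, the installed compyle is too old
# (or a stale copy earlier on sys.path is shadowing it).
import importlib

for name in ("compyle", "compyle.template", "cython", "pysph"):
    try:
        mod = importlib.import_module(name)
        print(name, "->", getattr(mod, "__version__", "?"), getattr(mod, "__file__", ""))
    except Exception as exc:
        print(name, "-> FAILED:", exc)
```

If `compyle.template` is missing, upgrading compyle (for example with `pip install --upgrade compyle`) and then reinstalling PySPH so its Cython extensions are rebuilt against the current Cython usually clears both kinds of error.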

Error when running pysph test -v

Hi, I am new to PySPH and installed it on my Windows 10 machine.
My environment is configured as follows:
Python version: 3.8.5
Cython version: 0.29.21
mayavi: 4.7.1
mako: 1.1.0
None of these Python files were open in another program, but when I run "pysph test -v" I get the following errors:
======================================================= ERRORS ========================================================
_______________________ ERROR at teardown of TestOutputNumpy.test_dump_and_load_with_constants ________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_with_constants>

def tearDown(self):
  shutil.rmtree(self.root)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:56:

d:\miniconda3\lib\shutil.py:516: in rmtree
return _rmtree_unsafe(path, onerror)
d:\miniconda3\lib\shutil.py:400: in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\Users\wsh\AppData\Local\Temp\tmpuxvqkv4s'

E PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\wsh\AppData\Local\Temp\tmpuxvqkv4s\simple.npz'

d:\miniconda3\lib\shutil.py:398: PermissionError
___________________ ERROR at teardown of TestOutputNumpy.test_dump_and_load_with_partial_data_dump ____________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_with_partial_data_dump>

def tearDown(self):
  shutil.rmtree(self.root)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:56:

d:\miniconda3\lib\shutil.py:516: in rmtree
return _rmtree_unsafe(path, onerror)
d:\miniconda3\lib\shutil.py:400: in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\Users\wsh\AppData\Local\Temp\tmppaaog6vf'

E PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\wsh\AppData\Local\Temp\tmppaaog6vf\simple.npz'

d:\miniconda3\lib\shutil.py:398: PermissionError
______________________ ERROR at teardown of TestOutputNumpy.test_dump_and_load_works_by_default _______________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_works_by_default>

def tearDown(self):
  shutil.rmtree(self.root)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:56:

d:\miniconda3\lib\shutil.py:516: in rmtree
return _rmtree_unsafe(path, onerror)
d:\miniconda3\lib\shutil.py:400: in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\Users\wsh\AppData\Local\Temp\tmp7zfuh7qv'

E PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\wsh\AppData\Local\Temp\tmp7zfuh7qv\simple.npz'

d:\miniconda3\lib\shutil.py:398: PermissionError
_____________________ ERROR at teardown of TestOutputNumpy.test_dump_and_load_works_with_compress _____________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_works_with_compress>

def tearDown(self):
  shutil.rmtree(self.root)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:56:

d:\miniconda3\lib\shutil.py:516: in rmtree
return _rmtree_unsafe(path, onerror)
d:\miniconda3\lib\shutil.py:400: in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\Users\wsh\AppData\Local\Temp\tmpbd4xv9li'

E PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\wsh\AppData\Local\Temp\tmpbd4xv9li\simplez.npz'

d:\miniconda3\lib\shutil.py:398: PermissionError
__________________ ERROR at teardown of TestOutputNumpy.test_that_output_array_information_is_saved ___________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_that_output_array_information_is_saved>

def tearDown(self):
  shutil.rmtree(self.root)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:56:

d:\miniconda3\lib\shutil.py:516: in rmtree
return _rmtree_unsafe(path, onerror)
d:\miniconda3\lib\shutil.py:400: in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\Users\wsh\AppData\Local\Temp\tmplwpanhdu'

E PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\wsh\AppData\Local\Temp\tmplwpanhdu\simple.npz'

d:\miniconda3\lib\shutil.py:398: PermissionError
______________________ ERROR at teardown of TestOutputNumpyV1.test_load_works_with_dump_version1 ______________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpyV1 testMethod=test_load_works_with_dump_version1>

def tearDown(self):
  shutil.rmtree(self.root)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:170:

d:\miniconda3\lib\shutil.py:516: in rmtree
return _rmtree_unsafe(path, onerror)
d:\miniconda3\lib\shutil.py:400: in _rmtree_unsafe
onerror(os.unlink, fullname, sys.exc_info())

path = 'C:\Users\wsh\AppData\Local\Temp\tmpeoneh1mo'

E PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\Users\wsh\AppData\Local\Temp\tmpeoneh1mo\simple.npz'

d:\miniconda3\lib\shutil.py:398: PermissionError
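The six teardown errors above are all the same WinError 32: shutil.rmtree cannot delete the freshly written .npz file because another handle is still open on it. On Windows this is commonly the NpzFile object returned by numpy.load, which keeps the archive open until it is closed or garbage collected; the errors do not indicate a problem in the solver itself. Below is a minimal, PySPH-independent sketch of a retry-style workaround; the helper names retry_rmtree and _retry_delete are mine, not part of PySPH:

```python
# Sketch only: retry deletions that fail with WinError 32 while Windows still
# holds a handle on a file, instead of failing the teardown immediately.
import os
import shutil
import stat
import time


def _retry_delete(func, path, exc_info):
    # Called by shutil.rmtree for every path it failed to remove.
    for _ in range(5):
        try:
            os.chmod(path, stat.S_IWRITE)  # clear a read-only bit if present
            func(path)
            return
        except PermissionError:
            time.sleep(0.2)  # give the other handle a moment to be released
    raise exc_info[1]  # still locked after retrying: re-raise the original error


def retry_rmtree(path):
    shutil.rmtree(path, onerror=_retry_delete)
```

Explicitly closing any NpzFile returned by numpy.load before the temporary directory is removed avoids the lock in the first place.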
====================================================== FAILURES =======================================================
_____________________________________ TestOctreeFor1DDataset.test_levels_in_tree ______________________________________

self = <pysph.base.tests.test_octree.TestOctreeFor1DDataset testMethod=test_levels_in_tree>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:192:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
_____________________________________ TestOctreeFor1DDataset.test_parent_for_node _____________________________________

self = <pysph.base.tests.test_octree.TestOctreeFor1DDataset testMethod=test_parent_for_node>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:192:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
________________________________________ TestOctreeFor1DDataset.test_plot_root ________________________________________

self = <pysph.base.tests.test_octree.TestOctreeFor1DDataset testMethod=test_plot_root>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:192:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
_________________ TestOctreeFor1DDataset.test_sum_of_indices_lengths_equals_total_number_of_particles _________________

self = <pysph.base.tests.test_octree.TestOctreeFor1DDataset testMethod=test_sum_of_indices_lengths_equals_total_number_of_particles>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:192:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
________________________________ TestCompressedOctreeFor1DDataset.test_levels_in_tree _________________________________

self = <pysph.base.tests.test_octree.TestCompressedOctreeFor1DDataset testMethod=test_levels_in_tree>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:238:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
________________________________ TestCompressedOctreeFor1DDataset.test_parent_for_node ________________________________

self = <pysph.base.tests.test_octree.TestCompressedOctreeFor1DDataset testMethod=test_parent_for_node>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:238:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
___________________________________ TestCompressedOctreeFor1DDataset.test_plot_root ___________________________________

self = <pysph.base.tests.test_octree.TestCompressedOctreeFor1DDataset testMethod=test_plot_root>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:238:

<__array_function__ internals>:6: in linspace

start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
____________ TestCompressedOctreeFor1DDataset.test_sum_of_indices_lengths_equals_total_number_of_particles ____________

self = <pysph.base.tests.test_octree.TestCompressedOctreeFor1DDataset testMethod=test_sum_of_indices_lengths_equals_total_number_of_particles>

def setUp(self):
    N = 1e5
  self.x = np.linspace(0, 1, num=N)

d:\miniconda3\lib\site-packages\pysph\base\tests\test_octree.py:238:


<__array_function__ internals>:6: in linspace
???


start = 0, stop = 1, num = 100000.0, endpoint = True, retstep = False, dtype = None, axis = 0

@array_function_dispatch(_linspace_dispatcher)
def linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None,
             axis=0):
    """
    Return evenly spaced numbers over a specified interval.
    ...
    """
  num = operator.index(num)

E TypeError: 'float' object cannot be interpreted as an integer

d:\miniconda3\lib\site-packages\numpy\core\function_base.py:113: TypeError
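
All of the octree failures above trace back to the same line: setUp() passes N = 1e5, a float, as num= to np.linspace, and newer NumPy versions reject a non-integer num. A minimal sketch of the one-line change the traceback suggests (not an official patch):

import numpy as np

# pysph/base/tests/test_octree.py, setUp(): cast N to int before handing it
# to np.linspace; newer NumPy raises TypeError when num is a float.
N = int(1e5)                  # was: N = 1e5, a float
x = np.linspace(0, 1, num=N)  # self.x in the test
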
______________________________________ TestOpenMPExamples.test_ldcavity_example _______________________________________

self = <pysph.parallel.tests.test_openmp.TestOpenMPExamples testMethod=test_ldcavity_example>

def test_ldcavity_example(self):
    dt=1e-4; tf=200*dt
    serial_kwargs = dict(timestep=dt, tf=tf, pfreq=500)
    extra_parallel_kwargs = dict(openmp=None)
    # Note that we set nprocs=1 here since we do not want
    # to run this with mpirun.
    self.run_example(
        'cavity.py', nprocs=1, atol=1e-14,
        serial_kwargs=serial_kwargs,
      extra_parallel_kwargs=extra_parallel_kwargs
    )

d:\miniconda3\lib\site-packages\pysph\parallel\tests\test_openmp.py:55:


d:\miniconda3\lib\site-packages\pysph\parallel\tests\example_test_case.py:128: in run_example
serial = load(file)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r'>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError
------------------------------------------------ Captured stdout call -------------------------------------------------
running test: ['d:\miniconda3\python.exe', 'd:\miniconda3\lib\site-packages\pysph\parallel\tests\cavity.py', '--timestep=0.0001', '--tf=0.02', '--pfreq=500', '--fname=cavity', '--directory=C:\Users\wsh\AppData\Local\Temp\tmpfyiugtco']
running test: ['d:\miniconda3\python.exe', 'd:\miniconda3\lib\site-packages\pysph\parallel\tests\cavity.py', '--timestep=0.0001', '--tf=0.02', '--pfreq=500', '--openmp', '--fname=cavity', '--directory=C:\Users\wsh\AppData\Local\Temp\tmp9zml4pvp']
__________________________________ TestOutputNumpy.test_dump_and_load_with_constants __________________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_with_constants>

def test_dump_and_load_with_constants(self):
    x = np.linspace(0, 1.0, 10)
    y = x*2.0
    pa = get_particle_array_wcsph(name='fluid', x=x, y=y,
                                  constants={'c1': 1.0, 'c2': [2.0, 3.0]})
    pa.add_property('A', data=2.0, stride=2)
    pa.set_output_arrays(['x', 'y', 'A'])
    fname = self._get_filename('simple')
    dump(fname, [pa], solver_data={})
  data = load(fname)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:124:


d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r'>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError
______________________________ TestOutputNumpy.test_dump_and_load_with_partial_data_dump ______________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_with_partial_data_dump>

def test_dump_and_load_with_partial_data_dump(self):
    x = np.linspace(0, 1.0, 10)
    y = x*2.0
    pa = get_particle_array_wcsph(name='fluid', x=x, y=y)
    pa.set_output_arrays(['x', 'y'])
    fname = self._get_filename('simple')
    dump(fname, [pa], solver_data={})
  data = load(fname)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:107:


d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r'>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError
_________________________________ TestOutputNumpy.test_dump_and_load_works_by_default _________________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_works_by_default>

def test_dump_and_load_works_by_default(self):
    x = np.linspace(0, 1.0, 10)
    y = x*2.0
    dt = 1.0
    pa = get_particle_array(name='fluid', x=x, y=y)
    fname = self._get_filename('simple')
    dump(fname, [pa], solver_data={'dt': dt})
  data = load(fname)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:68:


d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r'>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError
_______________________________ TestOutputNumpy.test_dump_and_load_works_with_compress ________________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_dump_and_load_works_with_compress>

def test_dump_and_load_works_with_compress(self):
    x = np.linspace(0, 1.0, 10)
    y = x*2.0
    dt = 1.0
    pa = get_particle_array(name='fluid', x=x, y=y)
    fname = self._get_filename('simple')
    dump(fname, [pa], solver_data={'dt': dt})
    fnamez = self._get_filename('simplez')
    dump(fnamez, [pa], solver_data={'dt': dt}, compress=True)
    # Check that the file size is indeed smaller
    self.assertTrue(os.stat(fnamez).st_size < os.stat(fname).st_size)
  data = load(fnamez)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:90:


d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r' compress_type=deflate>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError
_____________________________ TestOutputNumpy.test_that_output_array_information_is_saved _____________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpy testMethod=test_that_output_array_information_is_saved>

def test_that_output_array_information_is_saved(self):
    # Given
    x = np.linspace(0, 1.0, 10)
    y = x*2.0
    pa = get_particle_array(name='fluid', x=x, y=y, u=3*x)

    # When
    output_arrays = ['x', 'y', 'u']
    pa.set_output_arrays(output_arrays)
    fname = self._get_filename('simple')
    dump(fname, [pa], solver_data={})
  data = load(fname)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:148:


d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r'>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError
________________________________ TestOutputNumpyV1.test_load_works_with_dump_version1 _________________________________

self = <pysph.solver.tests.test_solver_utils.TestOutputNumpyV1 testMethod=test_load_works_with_dump_version1>

def test_load_works_with_dump_version1(self):
    x = np.linspace(0, 1.0, 10)
    y = x*2.0
    pa = get_particle_array(name='fluid', x=x, y=y)
    fname = self._get_filename('simple')
    dump_v1(fname, [pa], solver_data={})
  data = load(fname)

d:\miniconda3\lib\site-packages\pysph\solver\tests\test_solver_utils.py:181:


d:\miniconda3\lib\site-packages\pysph\solver\output.py:298: in load
return output.load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:79: in load
return self._load(fname)
d:\miniconda3\lib\site-packages\pysph\solver\output.py:135: in _load
solver_data = _get_dict_from_arrays(data["solver_data"])
d:\miniconda3\lib\site-packages\numpy\lib\npyio.py:255: in __getitem__
pickle_kwargs=self.pickle_kwargs)


fp = <zipfile.ZipExtFile name='solver_data.npy' mode='r'>, allow_pickle = False
pickle_kwargs = {'encoding': 'bytes', 'fix_imports': True}

def read_array(fp, allow_pickle=False, pickle_kwargs=None):
    """
    Read an array from an NPY file.
    ...
    """
    version = read_magic(fp)
    _check_version(version)
    shape, fortran_order, dtype = _read_array_header(fp, version)
    if len(shape) == 0:
        count = 1
    else:
        count = numpy.multiply.reduce(shape, dtype=numpy.int64)

    # Now read the actual data.
    if dtype.hasobject:
        # The array contained Python objects. We need to unpickle the data.
        if not allow_pickle:
          raise ValueError("Object arrays cannot be loaded when "
                             "allow_pickle=False")

E ValueError: Object arrays cannot be loaded when allow_pickle=False

d:\miniconda3\lib\site-packages\numpy\lib\format.py:727: ValueError

Probably a bug in sisph

def initialize(self, d_idx, d_au, d_av, d_aw):
    d_au[d_idx] = 0.0
    d_av[d_idx] = 0.0
    d_aw[d_idx] = 0.0

The MomentumEquationArtificialViscosity equation is evaluated first, and the accelerations it calculates are then thrown away by this initialize() in MomentumEquationBodyForce, or am I missing something?
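
For context, whether anything is actually lost depends on how the two equations are grouped: as far as I understand PySPH's Group semantics, within a single Group every equation's initialize() runs over all particles before any loop() or post_loop() is called, so zeroing in both initialize() methods is harmless there; only when the equations sit in separate Groups does the later initialize() wipe what the earlier loop() accumulated. A minimal sketch of that distinction, using made-up equation names rather than the actual sisph equations:

from pysph.sph.equation import Equation, Group


class ZeroAndAddViscosity(Equation):
    """Stand-in for an equation that zeroes and then accumulates d_au."""
    def initialize(self, d_idx, d_au, d_av, d_aw):
        d_au[d_idx] = 0.0
        d_av[d_idx] = 0.0
        d_aw[d_idx] = 0.0

    def loop(self, d_idx, s_idx, d_au, DWIJ):
        d_au[d_idx] += DWIJ[0]  # placeholder for the viscous contribution


class ZeroAndAddBodyForce(Equation):
    """Stand-in for an equation whose initialize() also zeroes the accelerations."""
    def initialize(self, d_idx, d_au, d_av, d_aw):
        d_au[d_idx] = 0.0
        d_av[d_idx] = 0.0
        d_aw[d_idx] = 0.0

    def post_loop(self, d_idx, d_av):
        d_av[d_idx] += -9.81  # body force applied once per particle


# Same Group: both initialize() calls run before either loop()/post_loop(),
# so nothing computed by the first equation is thrown away.
same_group = [Group(equations=[
    ZeroAndAddViscosity(dest='fluid', sources=['fluid']),
    ZeroAndAddBodyForce(dest='fluid', sources=None),
])]

# Separate Groups: the second Group's initialize() wipes the accelerations
# accumulated by the first Group, which is the situation described above.
separate_groups = [
    Group(equations=[ZeroAndAddViscosity(dest='fluid', sources=['fluid'])]),
    Group(equations=[ZeroAndAddBodyForce(dest='fluid', sources=None)]),
]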

Process on GPU killed after long run and/or restart

Dear developers,

I was using PySPH on my old GPU (GeForce GTX 680) for a while and I recently started using it on a newer GPU (NVIDIA TITAN V) as well. Unfortunately, there are some strange issues showing up, so hopefully someone can help me out here:

  1. When I run a simulation for a longer while, e.g. 8 hours or so, it suddenly gets killed. The resulting error log can be found in the attachment.
  2. When I try to restart the simulation from any of the previous restart files it gets killed again right away while showing the same error messages.
  3. The issue also shows up when I want to restart a simulation that has run for only a short time, say 30 minutes, and which finished correctly.

This problem doesn't appear on my GTX 680, but on that machine I am using an older version of PyOpenCL. I tried using the same older version of PyOpenCL on the TITAN V, but it didn't solve the issue.

Best,

Stephan

gpu_err.log

pysph view does not view cube output simulation

I have

pysph 1.0a6
traits 6.1.1
traitsui 7.0.1
vtk 8.1.2
mayavi 4.7.1

I installed pysph through pip; compiling from source was giving too many errors.

When I tried to run pysph it complained about missing packages (which pip did not install along with the pysph package). The extra package installations had their own problems: I had to downgrade to vtk 8.1.2, otherwise mayavi would not install.

I could run simulations and generate output fine. But when I try to view the output, even after the pysph command line stopped complaining about packages, I get the error

$ pysph view cube_output/

Exception occurred in traits notification handler for object: <pysph.tools.mayavi_viewer.MayaviViewer object at 0x7f52bc32cd58>, trait: files, old value: [], new value: ['cube_output/cube_0.npz', 'cube_output/cube_5.npz']
Traceback (most recent call last):
  File "/home/user/Documents/env3/lib/python3.6/site-packages/traits/trait_notifiers.py", line 340, in __call__
    self.handler(*args)
  File "/home/user/Documents/env3/lib/python3.6/site-packages/pysph/tools/mayavi_viewer.py", line 893, in _files_changed
    self._file_count_changed(fc)
  File "/home/user/Documents/env3/lib/python3.6/site-packages/pysph/tools/mayavi_viewer.py", line 911, in _file_count_changed
    data = load(fname)
  File "/home/user/Documents/env3/lib/python3.6/site-packages/pysph/solver/output.py", line 298, in load
    return output.load(fname)
  File "/home/user/Documents/env3/lib/python3.6/site-packages/pysph/solver/output.py", line 79, in load
    return self._load(fname)
  File "/home/user/Documents/env3/lib/python3.6/site-packages/pysph/solver/output.py", line 135, in _load
    solver_data = _get_dict_from_arrays(data["solver_data"])
  File "/home/user/Documents/env3/lib/python3.6/site-packages/numpy/lib/npyio.py", line 255, in __getitem__
    pickle_kwargs=self.pickle_kwargs)
  File "/home/user/Documents/env3/lib/python3.6/site-packages/numpy/lib/format.py", line 727, in read_array
    raise ValueError("Object arrays cannot be loaded when "
ValueError: Object arrays cannot be loaded when allow_pickle=False

Any ideas?
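
For what it's worth, the last few frames point at a NumPy change rather than at mayavi or vtk: since NumPy 1.16.3, np.load() defaults to allow_pickle=False (CVE-2019-6446), while the solver_data entry in PySPH's .npz output is stored as a pickled object array. A quick, hand-rolled check against one of the files (only the 'solver_data' key is taken from the traceback; nothing else about the file layout is assumed):

import numpy as np

# Open one of the PySPH output files with pickled object arrays explicitly allowed.
data = np.load('cube_output/cube_0.npz', allow_pickle=True)
print(data.files)           # names of the stored entries, e.g. 'solver_data'
print(data['solver_data'])  # the object array that trips allow_pickle=False

The proper fix is presumably for pysph/solver/output.py to pass allow_pickle=True when it calls NumPy's loader (or to stop storing object arrays); the snippet above at least confirms that the output files themselves are intact.
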

Intermittent error in pysph.parallel.tests.test_parallel.MPIReduceArrayTestCase.test_parallel_reduce

The test pysph.parallel.tests.test_parallel.MPIReduceArrayTestCase.test_parallel_reduce fails from time to time in the debian-ci infrastructure.
The error does not happen at every run, so it is quite hard to track down.

Please find below the test output:

============================= test session starts ==============================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.0
rootdir: /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd, configfile: tox.ini
collected 889 items / 90 deselected / 799 selected

pysph/base/tests/test_device_helper.py .ss.ss.ss.ss.ss.ss.ss.ss.ss.ss.ss [  4%]
.ss.ss.ss.ss.ss.ss.sss                                                   [  6%]
pysph/base/tests/test_domain_manager.py sss.........                     [  8%]
pysph/base/tests/test_kernel.py ........................................ [ 13%]
.................................................................        [ 21%]
pysph/base/tests/test_linalg3.py .......                                 [ 22%]
pysph/base/tests/test_neighbor_cache.py ....                             [ 22%]
pysph/base/tests/test_nnps.py .......ssssssssss......................... [ 28%]
....................................................ssssssssssssssssssss [ 37%]
ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss........ [ 46%]
.........................................                                [ 51%]
pysph/base/tests/test_octree.py .....s...s...s...s...s...s...s...s.      [ 55%]
pysph/base/tests/test_particle_array.py ................................ [ 59%]
...ssssssssssssssssssssssssssssssssssssssssssssssssssssssssssssss        [ 67%]
pysph/base/tests/test_periodic_nnps.py ............                      [ 69%]
pysph/base/tests/test_reduce_array.py ......                             [ 70%]
pysph/base/tests/test_utils.py ..                                        [ 70%]
pysph/examples/tests/test_riemann_solver.py s........                    [ 71%]
pysph/parallel/tests/test_openmp.py .                                    [ 71%]
pysph/parallel/tests/test_parallel.py .ss.F.                             [ 72%]
pysph/parallel/tests/test_parallel_run.py .                              [ 72%]
pysph/solver/tests/test_application.py ...                               [ 72%]
pysph/solver/tests/test_solver.py ..                                     [ 73%]
pysph/solver/tests/test_solver_utils.py ............                     [ 74%]
pysph/sph/bc/tests/test_simple_inlet_outlet.py .......                   [ 75%]
pysph/sph/tests/test_acceleration_eval.py ....................ssssssssss [ 79%]
ssssssssssssssssssssssssssssssssssssssssssssss                           [ 84%]
pysph/sph/tests/test_acceleration_eval_cython_helper.py ...              [ 85%]
pysph/sph/tests/test_equations.py ...............                        [ 87%]
pysph/sph/tests/test_integrator.py ...............ssssssss               [ 90%]
pysph/sph/tests/test_integrator_cython_helper.py .                       [ 90%]
pysph/sph/tests/test_integrator_step.py ..                               [ 90%]
pysph/sph/tests/test_kernel_corrections.py ................              [ 92%]
pysph/sph/tests/test_linalg.py .............                             [ 94%]
pysph/sph/tests/test_multi_group_integrator.py .s                        [ 94%]
pysph/sph/tests/test_riemann_solver.py .............                     [ 95%]
pysph/sph/tests/test_scheme.py .                                         [ 96%]
pysph/tools/tests/test_geometry.py .............s...                     [ 98%]
pysph/tools/tests/test_geometry_stl.py ...........                       [ 99%]
pysph/tools/tests/test_interpolator.py ...                               [100%]

=================================== FAILURES ===================================
_________________ MPIReduceArrayTestCase.test_parallel_reduce __________________

self = <pysph.parallel.tests.test_parallel.MPIReduceArrayTestCase testMethod=test_parallel_reduce>

    @mark.parallel
    def test_parallel_reduce(self):
        args = ['--directory=%s' % self.root]
>       run_parallel_script.run(
            filename='simple_reduction.py', args=args, nprocs=nprocs, path=path
        )

pysph/parallel/tests/test_parallel.py:101: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

filename = 'simple_reduction.py', args = ['--directory=/tmp/tmpag842xi0']
nprocs = 2, timeout = 30.0
path = '/build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/parallel/tests'

    def run(filename, args=None, nprocs=2, timeout=30.0, path=None):
        """Run a python script with MPI or in serial (if nprocs=1).  Kill process
        if it takes longer than the specified timeout.
    
        Parameters:
        -----------
        filename - filename of python script to run under mpi.
        args - List of arguments to pass to script.
        nprocs - number of processes to run (1 => serial non-mpi run).
        timeout - time in seconds to wait for the script to finish running,
            else raise a RuntimeError exception.
        path - the path under which the script is located
            Defaults to the location of this file (__file__), not curdir.
    
        """
        if args is None:
            args = []
        file_path = abspath(join(path, filename))
        cmd = [sys.executable, file_path] + args
        if nprocs > 1:
            cmd = ['mpiexec', '-n', str(nprocs)] + cmd
    
        print('running test:', cmd)
    
        process = Popen(cmd, stdout=PIPE, stderr=PIPE)
        timer = Timer(timeout, kill_process, [process])
        timer.start()
        out, err = process.communicate()
        timer.cancel()
        retcode = process.returncode
        if retcode:
            msg = 'test ' + filename + ' failed with returncode ' + str(retcode)
            print(out.decode('utf-8'))
            print(err.decode('utf-8'))
            print('#'*80)
            print(msg)
            print('#'*80)
>           raise RuntimeError(msg)
E           RuntimeError: test simple_reduction.py failed with returncode -9

pysph/tools/run_parallel_script.py:54: RuntimeError
----------------------------- Captured stdout call -----------------------------
running test: ['mpiexec', '-n', '2', '/usr/bin/python3.9', '/build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/parallel/tests/simple_reduction.py', '--directory=/tmp/tmpag842xi0']
KILLING PROCESS ON TIMEOUT
Rank 0: Generating output in /tmp/tmpag842xi0
Rank 1: Generating output in /tmp/tmpag842xi0

ValueError: Dimension 1 invalid for PyZoltan!
Exception ignored in: 'pyzoltan.core.zoltan.get_geometry_list'
Traceback (most recent call last):
  File "/build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/solver/application.py", line 1353, in _setup_parallel_manager_and_initial_load_balance
    pm.update()
ValueError: Dimension 1 invalid for PyZoltan!

################################################################################
test simple_reduction.py failed with returncode -9
################################################################################
=============================== warnings summary ===============================
../../../../../../../usr/lib/python3/dist-packages/compyle/types.py:164
  /usr/lib/python3/dist-packages/compyle/types.py:164: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    np.dtype(np.bool): 'char',

../../../../../../../usr/lib/python3/dist-packages/compyle/types.py:173
  /usr/lib/python3/dist-packages/compyle/types.py:173: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    'bool': np.bool,

../../../../../../../usr/lib/python3/dist-packages/pyximport/pyximport.py:51
  /usr/lib/python3/dist-packages/pyximport/pyximport.py:51: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    import imp

../../../../../../../usr/lib/python3/dist-packages/_pytest/mark/structures.py:337
  /usr/lib/python3/dist-packages/_pytest/mark/structures.py:337: PytestCollectionWarning: cannot collect 'test_all_backends' because it is not a function.
    def __call__(self, *args: object, **kwargs: object):

pysph/base/nnps.py:1
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/base/nnps.py:1: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    from pysph.base.nnps_base import get_number_of_threads, py_flatten, \

pysph/examples/tests/test_examples.py:84
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/examples/tests/test_examples.py:84: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.slow

pysph/parallel/tests/test_parallel.py:55
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/parallel/tests/test_parallel.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.parallel - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @mark.parallel

  (the same PytestUnknownMarkWarning is repeated for every other @mark.slow and @mark.parallel use in test_openmp.py, test_parallel.py and test_parallel_run.py)

<frozen importlib._bootstrap>:228
  <frozen importlib._bootstrap>:228: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.

pysph/tools/tests/test_geometry_stl.py:7
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry_stl.py:7: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
  Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
    import pysph.tools.geometry_stl as G

.pybuild/cpython3_3.9/build/pysph/sph/bc/tests/test_simple_inlet_outlet.py::TestSimpleInlet1D::test_inlet_calls_callback
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:369: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/.pysph/source/py3.9-linux-x86_64/m_d4d2f4620a7b2524086275ccd7e1980a.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

  (the same Cython 'language_level' FutureWarning is repeated once per generated extension module for the remaining test_simple_inlet_outlet, test_acceleration_eval, test_integrator and test_kernel_corrections tests)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:92: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:93: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_band_matrix
.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /usr/lib/python3/dist-packages/numpy/matrixlib/defmatrix.py:69: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    return matrix(data, dtype=dtype, copy=False)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:103: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_dense_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:104: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:81: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_general_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:82: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:126: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:127: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:138: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_symmetric_positivedefinite_Matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:139: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:115: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    mat = np.matrix(mat)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py::TestLinalg::test_tridiagonal_matrix
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/sph/tests/test_linalg.py:116: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    new_b = mat * np.transpose(np.matrix(result))

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_multi_group_integrator.py::TestMultiGroupIntegrator::test_different_accels_per_integrator
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:369: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/.pysph/source/py3.9-linux-x86_64/m_03d2f3c8840eaa38e2d13edb3a0927a1.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.9/build/pysph/sph/tests/test_multi_group_integrator.py::TestMultiGroupIntegrator::test_different_accels_per_integrator
  /usr/lib/python3/dist-packages/Cython/Compiler/Main.py:369: FutureWarning: Cython directive 'language_level' not set, using 2 for now (Py2). This will change in a later release! File: /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/.pysph/source/py3.9-linux-x86_64/m_f16a981c222b02ffbc9a137fd99ac24b.pyx
    tree = Parsing.p_module(s, pxd, full_module_name)

.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/tools/geometry.py:176: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    rotation_matrix = matrix_exp(np.matrix(matrix))

.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry.py::TestGeometry::test_rotate
  /build/2/pysph-1.0~b0~20191115.gite3d5e10/2nd/.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry.py:76: PendingDeprecationWarning: the matrix subclass is not the recommended way to represent matrices or deal with linear algebra (see https://docs.scipy.org/doc/numpy/user/numpy-for-matlab-users.html). Please adjust your code to use regular ndarray.
    rotation_matrix = G.matrix_exp(np.matrix(mat))

.pybuild/cpython3_3.9/build/pysph/tools/tests/test_geometry_stl.py::TestGeometry::test_get_points_from_mgrid
  /usr/lib/python3/dist-packages/numpy/core/fromnumeric.py:1970: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
    result = asarray(a).shape

-- Docs: https://docs.pytest.org/en/stable/warnings.html
=========================== short test summary info ============================
FAILED pysph/parallel/tests/test_parallel.py::MPIReduceArrayTestCase::test_parallel_reduce
= 1 failed, 525 passed, 273 skipped, 90 deselected, 77 warnings in 2228.24s (0:37:08) =

AttributeError: 'pysph.base.nnps_base.NeighborCache' object has no attribute 'get_neighbors_gpu'

Hi Guys,

I am simulating a sloshing tank in pysph and it is working on CPU and with OpenMP, but then I tried to run it using OpenCL:
python .\sloshing_tank.py --opencl

I then encounter the following error:

Using h = 0.002600
SISPH dt = 0.000100
Choose platform:
[0] <pyopencl.Platform 'NVIDIA CUDA' at 0x17cfbcd2d10>
[1] <pyopencl.Platform 'Intel(R) OpenCL' at 0x17cfbcd10e0>
Choice [0]:0
Set the environment variable PYOPENCL_CTX='0' to avoid being asked again.
OpenCL code written to C:\Users\kccho\.pysph\source\py3.7-win-amd64\generated.cl
Traceback (most recent call last):
  File ".\sloshing_tank.py", line 434, in <module>
    app.run()
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\solver\application.py", line 1535, in run        
    self._create_particles(self.create_particles)
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\solver\application.py", line 903, in _create_particles
    self.particles = particle_factory(*args, **kw)
  File ".\sloshing_tank.py", line 374, in create_particles
    self.scheme.setup_properties([fluid, boundary])
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\sph\scheme.py", line 202, in setup_properties    
    self.scheme.setup_properties(particles, clean)
  File "D:\Research\SPH\PySPH\HPRC_PySPH\para_tolerance_13_w_0_2\tolerance_0_010\sisph.py", line 1459, in setup_properties
    self._get_normals(pa)
  File "D:\Research\SPH\PySPH\HPRC_PySPH\para_tolerance_13_w_0_2\tolerance_0_010\sisph.py", line 1113, in _get_normals
    seval.evaluate()
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\tools\sph_evaluator.py", line 53, in evaluate    
    self.func_eval.compute(t, dt)
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\sph\acceleration_eval.py", line 232, in compute  
    self.c_acceleration_eval.compute(t, dt)
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\sph\acceleration_eval_gpu_helper.py", line 286, in compute
    self._call_kernel(info, extra_args)
  File "C:\Users\kccho\anaconda3\envs\pysph\lib\site-packages\pysph\sph\acceleration_eval_gpu_helper.py", line 234, in _call_kernel
    cache.get_neighbors_gpu()
AttributeError: 'pysph.base.nnps_base.NeighborCache' object has no attribute 'get_neighbors_gpu'

As mentioned earlier, one should install pyopencl first and then install pysph, so I did exactly that:

pip uninstall pysph
pip install pyopencl
pip install -r requirements.txt
python setup.py develop

But I am still encountering the issue. Can anyone tell me how to resolve it?

Trying to use the slider of the interactive plot in 2D removes the widget

Hello,

I am trying to visualize the output of an elliptical_drop run within a Jupyter notebook with the following code (taken from the docs):

%matplotlib ipympl
from pysph.tools.ipy_viewer import Viewer2D
viewer = Viewer2D('elliptical_drop_output')
viewer.interactive_plot()

This works fine, as you can see in the first screenshot.
[screenshot: interactive plot with its widgets]

But when I try to use the widget, e.g. to select a time frame, the widget is gone and all that is left is the plot. See the second screenshot.
[screenshot: plot only, widgets missing]

Any ideas?
Regards, Manos

I get "OSError: [WinError 6] The handle is invalid" when trying to run pysph examples

How to solve this?

@prabhuramachandran

Running example pysph.examples.cube.

Information for example: pysph.examples.cube
A very simple example to help benchmark PySPH. (2 minutes)

The example creates a cube shaped block of water falling in free-space under
the influence of gravity while solving the incompressible, inviscid flow
equations. Only 5 time steps are solved but with a million particles. It is
easy to change the number of particles by simply passing the command line
argument --np to a desired number::

$ pysph run cube --np 2e6

To check the performance of PySPH using OpenMP one could try the following::

$ pysph run cube --disable-output

$ pysph run cube --disable-output --openmp

Number of particles: 103823
Generating output in C:\Users---\cube_output
Compiling code at: C:\Users---.pysph\source\py3.7-win-amd64\m_30bb3da0068142637bd63ff68cf26310.pyx
Traceback (most recent call last):
File "c:\users---\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "c:\users---\anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\Users---\Anaconda3\Scripts\pysph.exe\__main__.py", line 9, in <module>
File "c:\users---\anaconda3\lib\site-packages\pysph\tools\cli.py", line 69, in main
args.func(extra)
File "c:\users---\anaconda3\lib\site-packages\pysph\tools\cli.py", line 16, in run_examples
main(args)
File "c:\users---\anaconda3\lib\site-packages\pysph\examples\run.py", line 203, in main
run_command(module, args.split())
File "c:\users---\anaconda3\lib\site-packages\pysph\examples\run.py", line 147, in run_command
_exec_file(filename)
File "c:\users---\anaconda3\lib\site-packages\pysph\examples\run.py", line 21, in _exec_file
exec(co, ns)
File "c:\users---\anaconda3\lib\site-packages\pysph\examples\cube.py", line 88, in <module>
app.run()
File "c:\users---\anaconda3\lib\site-packages\pysph\solver\application.py", line 1402, in run
self._configure()
File "c:\users---\anaconda3\lib\site-packages\pysph\solver\application.py", line 1126, in _configure
fixed_h=fixed_h)
File "c:\users---\anaconda3\lib\site-packages\pysph\solver\solver.py", line 203, in setup
sph_compiler.compile()
File "c:\users---\anaconda3\lib\site-packages\pysph\sph\sph_compiler.py", line 38, in compile
mod = helper0.compile(code0)
File "c:\users---\anaconda3\lib\site-packages\pysph\sph\acceleration_eval_cython_helper.py", line 172, in compile
self._module = self._ext_mod.load()
File "c:\users---\anaconda3\lib\site-packages\pysph\cpy\ext_module.py", line 266, in load
self.build()
File "c:\users---\anaconda3\lib\site-packages\pysph\cpy\ext_module.py", line 244, in build
setup_args={'script_args': script_args}
File "c:\users---\anaconda3\lib\site-packages\pyximport\pyxbuild.py", line 102, in pyx_to_dll
dist.run_commands()
File "c:\users---\anaconda3\lib\distutils\dist.py", line 966, in run_commands
self.run_command(cmd)
File "c:\users---\anaconda3\lib\distutils\dist.py", line 982, in run_command
log.info("running %s", command)
File "c:\users---\anaconda3\lib\distutils\log.py", line 46, in info
self._log(INFO, msg, args)
File "c:\users---\anaconda3\lib\distutils\log.py", line 31, in _log
stream.write('%s\n' % msg)
OSError: [WinError 6] The handle is invalid

ValueError when running elliptical_drop example

Hi, bit of a beginner here, but I have just done a fresh install following the docs on an Ubuntu 18 machine, and when running pysph run elliptical_drop it seems to run fine, but when it reaches 100% it errors and I have no idea what to do to fix it.

Cheers

thomas@ubuntu ~/Documents/DP4_2 $ pysph run elliptical_drop
Running example pysph.examples.elliptical_drop.

Information for example: pysph.examples.elliptical_drop
Evolution of a circular patch of incompressible fluid. (60 seconds)

See J. J. Monaghan "Simulating Free Surface Flows with SPH", JCP, 1994, 100, pp
399 - 406

An initially circular patch of fluid is subjected to a velocity profile that
causes it to deform into an ellipse. Incompressibility causes the initially
circular patch to deform into an ellipse such that the area is conserved. An
analytical solution for the locus of the patch is available (exact_solution)

This is a standard test for the formulations for the incompressible SPH
equations.
Elliptical drop :: 5025 particles
Effective viscosity: rho*alpha*h*c/8 = 0.5687500000000001
Generating output in /home/thomas/Documents/DP4_2/elliptical_drop_output
Precompiled code from: /home/thomas/.pysph/source/py3.8-linux-x86_64/m_b41d0ba62d49506702fe93e0c4092ae9.pyx
No of particles:
----------------------------------------------------------------------
  fluid: 5025
----------------------------------------------------------------------
Setup took: 0.12276 secs
100%
Run took: 31.01062 secs
Traceback (most recent call last):
  File "/home/thomas/.local/bin/pysph", line 8, in <module>
    sys.exit(main())
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/tools/cli.py", line 69, in main
    args.func(extra)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/tools/cli.py", line 16, in run_examples
    main(args)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/examples/run.py", line 187, in main
    run_command(module, argv[1:])
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/examples/run.py", line 147, in run_command
    _exec_file(filename)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/examples/run.py", line 21, in _exec_file
    exec(co, ns)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/examples/elliptical_drop.py", line 226, in <module>
    app.post_process(app.info_filename)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/examples/elliptical_drop.py", line 219, in post_process
    self._compute_results()
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/examples/elliptical_drop.py", line 196, in _compute_results
    for sd, array in iter_output(self.output_files, 'fluid'):
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/solver/utils.py", line 317, in iter_output
    data = load(file)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/solver/output.py", line 298, in load
    return output.load(fname)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/solver/output.py", line 79, in load
    return self._load(fname)
  File "/home/thomas/.local/lib/python3.8/site-packages/pysph/solver/output.py", line 135, in _load
    solver_data = _get_dict_from_arrays(data["solver_data"])
  File "/home/thomas/.local/lib/python3.8/site-packages/numpy/lib/npyio.py", line 243, in __getitem__
    return format.read_array(bytes,
  File "/home/thomas/.local/lib/python3.8/site-packages/numpy/lib/format.py", line 742, in read_array
    raise ValueError("Object arrays cannot be loaded when "
ValueError: Object arrays cannot be loaded when allow_pickle=False
thomas@ubuntu ~/Documents/DP4_2 $ 

add_particles adds particles with inconsistent array lengths when a property has a stride greater than one.

The following code demonstrates what is happening:

from pysph.base.utils import get_particle_array_wcsph as gpaw
import numpy as np

x = np.arange(0, 1, 0.1)
y = np.copy(x)
fluid = gpaw(x=x, y=y)
# 'newprop' has stride 3, i.e. three values are stored per particle.
fluid.add_property('newprop', stride=3)
pa_add = {}
for prop, array in fluid.get_property_arrays().items():
    # Take the first five entries of every property, ignoring the stride;
    # for 'newprop', 5 particles actually need 5 * 3 = 15 values.
    pa_add[prop] = np.array(array[0:5])
fluid.add_particles(**pa_add)
fluid.get_property_arrays()

print(len(fluid.x), len(fluid.newprop), fluid.stride)

Here add_particles should raise an error because the property array lengths are inconsistent, instead of silently adding the particles.
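
For reference, a rough sketch (not PySPH's actual implementation; the helper name and argument layout are made up) of the kind of length check add_particles could perform before appending data, where a property with a stride must supply num_new * stride values:

import numpy as np

def check_added_property_lengths(props, strides, num_new):
    # props: dict mapping property name -> values being added
    # strides: dict mapping property name -> stride (defaults to 1)
    # num_new: number of particles being added
    for name, values in props.items():
        stride = strides.get(name, 1)
        expected = num_new * stride
        n = len(np.asarray(values))
        if n != expected:
            raise ValueError(
                "Property %r has %d values, expected %d "
                "(%d particles x stride %d)." % (name, n, expected,
                                                 num_new, stride))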

Guide on readthedocs provide errors with Python 3

In the section http://pysph.readthedocs.io/en/latest/tutorial/circular_patch_simple.html#interpolating-properties, the following code will fail in Python 3:

interp = Interpolator(parrays.values(), num_points=10000)
p = interp.interpolate('p')

This is simply because in Python 3, dict.values() returns only a view of the dict. If PySPH is installed with Python 3.x, a fix is:

interp = Interpolator(list(parrays.values()), num_points=10000)
p = interp.interpolate('p')

This makes me wonder whether it would be better to support only one Python version (at the time of writing, Python 3 is the future, so arguably it would be better to stick with it).

Question: Inlet / Outlet with multiple fluids

Hi,

First of all, thank you for your amazing work here.
I want to simulate multi-fluid systems in closed channels. An incoming fluid should displace an already present fluid. The incoming fluid should enter the simulation with a constant velocity.
I tried to use inlets and outlets for that; however, I ran into some problems. How can the inlet produce fluid_1 while the outlet "erases" fluid_2 and eventually fluid_1?

Thank you for your help,

Florian

OSErrors on examples

I encountered this same problem running the taylor_green and sphere_in_vessel_akinci.py examples on Windows 10 amd64 with Python 3.9.1.

D:\>pysph run taylor_green
Running example pysph.examples.taylor_green.

Information for example: pysph.examples.taylor_green
Taylor Green vortex flow (5 minutes).
D:\Python39\lib\site-packages\setuptools\distutils_patch.py:25: UserWarning: Distutils was imported before Setuptools. This usage is discouraged and may exhibit undesirable behaviors or errors. Please use Setuptools' objects directly or at least import Setuptools first.
  warnings.warn(
Taylor green vortex problem :: nfluid = 2500, dt = 0.000454545
Generating output in D:\taylor_green_output
Compiling code at: C:\Users\ylan\.pysph\source\py3.9-win-amd64\m_0f0514a0447e13b6694879ff005ee980.pyx
running build_ext
Traceback (most recent call last):
  File "D:\Python39\Scripts\pysph-script.py", line 33, in <module>
    sys.exit(load_entry_point('PySPH==1.0a6', 'console_scripts', 'pysph')())
  File "D:\Python39\lib\site-packages\pysph\tools\cli.py", line 69, in main
    args.func(extra)
  File "D:\Python39\lib\site-packages\pysph\tools\cli.py", line 16, in run_examples
    main(args)
  File "D:\Python39\lib\site-packages\pysph\examples\run.py", line 187, in main
    run_command(module, argv[1:])
  File "D:\Python39\lib\site-packages\pysph\examples\run.py", line 147, in run_command
    _exec_file(filename)
  File "D:\Python39\lib\site-packages\pysph\examples\run.py", line 21, in _exec_file
    exec(co, ns)
  File "D:\Python39\lib\site-packages\pysph\examples\taylor_green.py", line 449, in <module>
    app.run()
  File "D:\Python39\lib\site-packages\pysph\solver\application.py", line 1402, in run
    self._configure()
  File "D:\Python39\lib\site-packages\pysph\solver\application.py", line 1121, in _configure
    solver.setup(
  File "D:\Python39\lib\site-packages\pysph\solver\solver.py", line 203, in setup
    sph_compiler.compile()
  File "D:\Python39\lib\site-packages\pysph\sph\sph_compiler.py", line 38, in compile
    mod = helper0.compile(code0)
  File "D:\Python39\lib\site-packages\pysph\sph\acceleration_eval_cython_helper.py", line 172, in compile
    self._module = self._ext_mod.load()
  File "D:\Python39\lib\site-packages\pysph\cpy\ext_module.py", line 266, in load
    self.build()
  File "D:\Python39\lib\site-packages\pysph\cpy\ext_module.py", line 240, in build
    mod = pyxbuild.pyx_to_dll(
  File "D:\Python39\lib\site-packages\pyximport\pyxbuild.py", line 102, in pyx_to_dll
    dist.run_commands()
  File "D:\Python39\lib\distutils\dist.py", line 966, in run_commands
    self.run_command(cmd)
  File "D:\Python39\lib\distutils\dist.py", line 982, in run_command
    log.info("running %s", command)
  File "D:\Python39\lib\distutils\log.py", line 46, in info
    self._log(INFO, msg, args)
  File "D:\Python39\lib\distutils\log.py", line 31, in _log
    stream.write('%s\n' % msg)
OSError: [WinError 6] invalid handle

Problem while using loop_all in openmp

While using loop_all with OpenMP for the ShepardFilter equation, the values of d_rho are changed inside loop_all within a single equation. Previously, temporary values were stored first and the new rho values were computed from them in a different equation, so rho was never modified during the loop. Now rho is modified inside loop_all by setting d_rho[d_idx] = 0 and then accumulating into it. Because of the parallelisation, the updated density of one particle is read by other particles before the loop has finished, which results in wrong rho values.

The problem only occurs with OpenMP; without OpenMP it works perfectly.
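
For reference, a minimal sketch of the race-free two-pass pattern described above, assuming a hypothetical temporary property tmp_rho (the class name is made up as well): the filter equation, run in an earlier group, accumulates the Shepard-filtered density into d_tmp_rho inside loop_all, and a second equation then commits it to d_rho only after every particle's filtered value has been computed.

from pysph.sph.equation import Equation


class CopyTmpRho(Equation):
    # Second pass: the filter wrote its result into the temporary
    # 'tmp_rho' property; copy it into 'rho' for every particle, so that
    # no thread reads a density another thread has already overwritten.
    def initialize(self, d_idx, d_rho, d_tmp_rho):
        d_rho[d_idx] = d_tmp_rho[d_idx]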

Build fails: error: no matching function for call to '__sort'

Fails on FreeBSD 11.2 amd64, clang-60, python-3.6, version 1.0a6:

In file included from /usr/include/c++/v1/__string:56:
/usr/include/c++/v1/algorithm:4201:5: error: no matching function for call to '__sort'
    __sort<_Comp_ref>(__first, __last, __comp);
    ^~~~~~~~~~~~~~~~~
pysph/base/nnps_base.cpp:22642:10: note: in instantiation of function template specialization 'std::__1::sort<std::__1::__wrap_iter<std::__1::pair<unsigned int, unsigned
      int> *>, int (std::__1::pair<unsigned int, unsigned int>, std::__1::pair<unsigned int, unsigned int>)>' requested here
    std::sort<std::vector<__pyx_t_5pysph_4base_9nnps_base_id_gid_pair_t> ::iterator,int (__pyx_t_5pysph_4base_9nnps_base_id_gid_pair_t, __pyx_t_5pysph_4base_9nnps_ba...
         ^
/usr/include/c++/v1/algorithm:3998:1: note: candidate function not viable: no known conversion from 'int (*)(std::__1::pair<unsigned int, unsigned int>,
      std::__1::pair<unsigned int, unsigned int>)' to 'int (&)(std::__1::pair<unsigned int, unsigned int>, std::__1::pair<unsigned int, unsigned int>)' for 3rd argument;
      dereference the argument with *
__sort(_RandomAccessIterator __first, _RandomAccessIterator __last, _Compare __comp)
^
1 warning and 1 error generated.
error: command 'cc' failed with exit status 1

Issue with using pysph view

I'm a beginner to pysph. I managed to install the package along with all the dependencies. Using the command "pysph run dam_break_2d" works for me and even prints the output data in the appropriate location. However, when I use pysph view dam_break_2d_output/, it throws up the following error:

It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

setting topology failed
--> Returned value Error (-1) instead of ORTE_SUCCESS


It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during orte_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

orte_ess_init failed
--> Returned value Error (-1) instead of ORTE_SUCCESS


It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

ompi_mpi_init: ompi_rte_init failed
--> Returned "Error" (-1) instead of "Success" (0)

*** An error occurred in MPI_Init_thread
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[phani-B85M-DS3H:7967] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!

FYI, I'm using mpirun (Open MPI) 2.1.1.
Any direction to debug this issue will be greatly appreciated :)

Issue with compyle

Hello,

While installing PySPH on Ubuntu 18.04 inside Anaconda 3 using the command "pip install PySPH", I am getting an error:

ValueError: 'compyle/thrust/sort.pyx' doesn't match any files

May I know how to fix this one?

It further shows the following next:

Command "/home/rahul/anaconda3/bin/python /home/rahul/anaconda3/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /tmp/tmpwsp6k4pi" failed with error code 1 in /tmp/pip-install-6ou1mlem/compyle

Rahul

Scan arguments in CPy

Currently CPy only allows input arguments to be passed to the scan kernel when calling it. This should be changed so that output and segment arguments can also be passed. IMO these should be passed as keyword arguments. For example, if input_args is int* ary and output_args is int* out, then the kernel would be called as

scan_knl(ary=ary, out=out)
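
For illustration, a hedged sketch of how this could look, written against the later compyle-style Scan API (the exact CPy interface may differ, so the imports, type annotations and the separate out array are assumptions); the last line shows the proposed keyword-argument call:

import numpy as np
from compyle.api import annotate, Scan, wrap

@annotate(i='int', ary='intp', return_='int')
def input_expr(i, ary):
    # value fed into the scan for index i
    return ary[i]

@annotate(int='i, item', intp='ary, out')
def output_expr(i, item, ary, out):
    # write the running (inclusive) sum into a separate output array
    out[i] = item

backend = 'cython'
ary = wrap(np.arange(10, dtype=np.int32), backend=backend)
out = wrap(np.zeros(10, dtype=np.int32), backend=backend)

scan_knl = Scan(input_expr, output_expr, 'a+b', dtype=np.int32,
                backend=backend)
# proposed behaviour: the output array is passed as a keyword too
scan_knl(ary=ary, out=out)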

too many sockets open when running ``app.run`` inside a loop

When calling app.run inside an optimization loop, python3 ends with an error:

  OSError: Unable to create file (unable to open file: name = '*.hdf5',
  errno = 24, error message = 'Too many open files', flags = 13, o_flags = 242)

When no output files are written, using the flag --disable-output, the same error still occurs, but now when intending to write another file.

The modified example script bouncing_cube-modified.py reproduces the error. Here app.run is called from within a simple loop. On my system it runs until the system soft limit of 512 open files is hit. The script also prints information about which files are open while it runs.

I cannot find the root cause of this error; help is needed here.

The error can be bypassed using two Python scripts: one with the loop, calling the other one using a subprocess; see loop.py running together with bouncing_cube2.py (a rough sketch is given after the file list below).

The files can be found here; rename the .txt files to .py:

bouncing_cube2.txt
bouncing_cube-modified.txt
loop.txt
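
As a rough sketch of that by-pass (the script name and command-line flag are just illustrative), the outer loop.py can launch the simulation script in a child process on every iteration, so that all the file handles it opened are released when the child exits:

import subprocess
import sys

for i in range(100):
    # Each app.run() happens in a fresh interpreter; its open files are
    # closed when the child process exits, so the soft limit is never hit.
    subprocess.run(
        [sys.executable, "bouncing_cube2.py", "--disable-output"],
        check=True,
    )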
