
libpysal's Introduction

Python Spatial Analysis Library


PySAL, the Python spatial analysis library, is an open-source, cross-platform library for geospatial data science written in Python, with an emphasis on geospatial vector data. It supports the development of high-level applications for spatial analysis, such as:

  • detection of spatial clusters, hot-spots, and outliers
  • construction of graphs from spatial data
  • spatial regression and statistical modeling on geographically embedded networks
  • spatial econometrics
  • exploratory spatio-temporal data analysis

PySAL Components

PySAL is a family of packages for spatial data science and is divided into four major components:

Lib

The lib layer solves a wide variety of computational geometry problems: graph construction from polygonal lattices, lines, and points; construction and interactive editing of spatial weights matrices and graphs; computation of alpha shapes, spatial indices, and spatial-topological relationships; and reading and writing of sparse graph data, as well as pure-Python readers of spatial vector data. Unlike the other PySAL layers, these functions are exposed together as a single package.

  • libpysal : libpysal provides foundational algorithms and data structures that support the rest of the library. This currently includes the following modules: input/output (io), which provides readers and writers for common geospatial file formats; weights (weights), which provides the main class to store spatial weights matrices, as well as several utilities to manipulate and operate on them; computational geometry (cg), with several algorithms, such as Voronoi tessellations or alpha shapes that efficiently process geometric shapes; and an additional module with example data sets (examples).
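
A minimal sketch of how these modules fit together, assuming geopandas and the bundled columbus example data are available:

import geopandas
from libpysal import examples, weights

# locate one of the bundled example datasets and read it with geopandas
columbus = geopandas.read_file(examples.get_path("columbus.shp"))

# build a Queen contiguity graph over the polygons
w = weights.Queen.from_dataframe(columbus)
print(w.n, w.pct_nonzero)  # number of observations and density of the graph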

Explore

The explore layer includes modules to conduct exploratory analysis of spatial and spatio-temporal data. At a high level, packages in explore are focused on enabling the user to better understand patterns in the data and suggest new interesting questions rather than answer existing ones. They include methods to characterize the structure of spatial distributions (either on networks, in continuous space, or on polygonal lattices). In addition, this domain offers methods to examine the dynamics of these distributions, such as how their composition or spatial extent changes over time.

  • esda : esda implements methods for the analysis of both global (map-wide) and local (focal) spatial autocorrelation, for both continuous and binary data. In addition, the package increasingly offers cutting-edge statistics for boundary strength and measures of aggregation error in statistical analyses (see the sketch after this list).

  • giddy : giddy is an extension of esda to spatio-temporal data. The package hosts state-of-the-art methods that explicitly consider the role of space in the dynamics of distributions over time.

  • inequality : inequality provides indices for measuring inequality over space and time. These comprise classic measures such as the Theil T information index and the Gini index in mean deviation form; but also spatially-explicit measures that incorporate the location and spatial configuration of observations in the calculation of inequality measures.

  • momepy : momepy is a library for the quantitative analysis of urban form, or urban morphometrics. It aims to provide a wide range of tools for systematic and exhaustive analysis of urban form, and it can work with a wide range of elements, though it focuses on building footprints and street networks. momepy stands for Morphological Measuring in Python.

  • pointpats : pointpats supports the statistical analysis of point data, including methods to characterize the spatial structure of an observed point pattern: a collection of locations where some phenomena of interest have been recorded. This includes measures of centrography which provide overall geometric summaries of the point pattern, including central tendency, dispersion, intensity, and extent.

  • segregation : the segregation package calculates over 40 different segregation indices and provides a suite of additional features for measurement, visualization, and hypothesis testing that together represent the state of the art in quantitative segregation analysis.

  • spaghetti : spaghetti supports the spatial analysis of graphs, networks, topology, and inference. It includes functionality for the statistical testing of clusters on networks, a robust all-to-all Dijkstra shortest-path algorithm with multiprocessing support, high-performance geometric and spatial computations using geopandas that are necessary for high-resolution interpolation along networks, and the ability to connect near-network observations onto the network.
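
As a brief illustration of the explore layer, here is a sketch computing global Moran's I with esda on the bundled columbus data (the variable choice is illustrative):

import geopandas
from libpysal import examples, weights
from esda.moran import Moran

columbus = geopandas.read_file(examples.get_path("columbus.shp"))
w = weights.Queen.from_dataframe(columbus)
w.transform = "r"  # row-standardize the weights

# global spatial autocorrelation of housing values
mi = Moran(columbus["HOVAL"], w)
print(mi.I, mi.p_sim)  # statistic and permutation-based pseudo p-value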

Model

In contrast to explore, the model layer focuses on confirmatory analysis. In particular, its packages focus on the estimation of spatial relationships in data with a variety of linear, generalized-linear, generalized-additive, nonlinear, multi-level, and local regression models.

  • mgwr : mgwr provides scalable algorithms for estimation, inference, and prediction using single- and multi-scale geographically weighted regression models in a variety of generalized linear model frameworks, as well as model diagnostic tools.

  • spglm : spglm implements a set of generalized linear regression techniques, including Gaussian, Poisson, and logistic regression, that allow for sparse matrix operations in their computation and estimation to lower memory overhead and decrease computation time.

  • spint : spint provides a collection of tools to study spatial interaction processes and analyze spatial interaction data. It includes functionality to facilitate the calibration and interpretation of a family of gravity-type spatial interaction models, including those with production constraints, attraction constraints, or a combination of the two.

  • spreg : spreg supports the estimation of classic and spatial econometric models. Currently it contains methods for estimating standard Ordinary Least Squares (OLS), Two Stage Least Squares (2SLS), and Seemingly Unrelated Regressions (SUR), in addition to various tests of homoskedasticity, normality, spatial randomness, and different types of spatial autocorrelation. It also includes a suite of tests for spatial dependence in models with binary dependent variables (see the sketch after this list).

  • spvcm : spvcm provides a general framework for estimating spatially-correlated variance components models. This class of models allows for spatial dependence in the variance components, so that nearby groups may affect one another. It also provides a general-purpose framework for estimating models using Gibbs sampling in Python, accelerated by the numba package.

    ⚠️ Warning: spvcm has been archived and is planned for deprecation and removal in pysal 25.01.

  • tobler : tobler provides functionality for areal interpolation and dasymetric mapping. Its name is an homage to the legendary geographer Waldo Tobler, a pioneer of dozens of spatial analytical methods. tobler includes functionality for interpolating data using area-weighted approaches, regression-based approaches that leverage remotely sensed raster data as auxiliary information, and hybrid approaches.

  • access : access aims to make it easy for analysts to calculate measures of spatial accessibility. This work has traditionally had two challenges: [1] calculating accurate travel time matrices at scale, and [2] deriving measures of access using those travel times together with supply and demand locations. access implements classic spatial access models, allowing easy comparison of methodologies and assumptions.

  • spopt: spopt is an open-source Python library for solving optimization problems with spatial data. Originating from the original region module in PySAL, it is under active development for the inclusion of newly proposed models and methods for regionalization, facility location, and transportation-oriented solutions.
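
As a brief illustration of the model layer, here is a sketch estimating a classic OLS model with spatial diagnostics in spreg (column choices follow the bundled columbus data):

import geopandas
from libpysal import examples, weights
from spreg import OLS

columbus = geopandas.read_file(examples.get_path("columbus.shp"))
w = weights.Queen.from_dataframe(columbus)
w.transform = "r"

y = columbus[["CRIME"]].values         # dependent variable
X = columbus[["INC", "HOVAL"]].values  # explanatory variables

# OLS with Lagrange multiplier diagnostics for spatial dependence
ols = OLS(y, X, w=w, spat_diag=True, name_y="CRIME", name_x=["INC", "HOVAL"])
print(ols.summary)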

Viz

The viz layer provides functionality to support the creation of geovisualisations and visual representations of outputs from a variety of spatial analyses. Visualization plays a central role in modern spatial/geographic data science. Current packages provide classification methods for choropleth mapping and a common API for linking PySAL outputs to visualization toolkits in the Python ecosystem.

  • legendgram : legendgram is a small package that provides "legendgrams": legends that visualize the distribution of observations by color in a given map. These distributional visualizations for map classification schemes assist in analytical cartography and spatial data visualization.

  • mapclassify : mapclassify provides functionality for choropleth map classification. Currently, fifteen different classification schemes are available, including a highly optimized implementation of Fisher-Jenks optimal classification. Each scheme inherits a common structure that ensures computations are scalable and supports applications in streaming contexts (see the sketch after this list).

  • splot : splot provides statistical visualizations for spatial analysis. It offers methods for visualizing global and local spatial autocorrelation (through Moran scatterplots and cluster maps), temporal analysis of cluster dynamics (through heatmaps and rose diagrams), and multivariate choropleth mapping (through value-by-alpha maps). A high-level API supports the creation of publication-ready visualizations.
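
For example, a sketch classifying a variable for a choropleth with mapclassify:

import geopandas
import mapclassify
from libpysal import examples

columbus = geopandas.read_file(examples.get_path("columbus.shp"))

# Fisher-Jenks optimal classification into five classes
classifier = mapclassify.FisherJenks(columbus["HOVAL"], k=5)
print(classifier.bins)     # upper bound of each class
print(classifier.yb[:10])  # class labels for the first ten observations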

Installation

PySAL is available through Anaconda (in the defaults or conda-forge channel). We recommend installing PySAL from conda-forge:

conda config --add channels conda-forge
conda install pysal

PySAL can also be installed using pip:

pip install pysal

As of version 2.0.0, PySAL supports Python 3 only.

Users who need an older stable version of PySAL that is Python 2 compatible can install version 1.14.3 through pip or conda:

conda install pysal==1.14.3

Documentation

For help on using PySAL, check out the project's documentation and tutorials.

Development

As of version 2.0.0, PySAL is now a collection of affiliated geographic data science packages. Changes to the code for any of the subpackages should be directed at the respective upstream repositories, and not made here. Infrastructural changes for the meta-package, like those for tooling, building the package, and code standards, will be considered.

Development is hosted on github.

Discussions of development, as well as help for users, occur on the developer list and in PySAL's Discord channel.

Getting Involved

If you are interested in contributing to PySAL please see our development guidelines.

Bug reports

To search for or report bugs, please see PySAL's issues.

Build Instructions

To build the meta-package pysal, see tools/README.md.

License information

See the file "LICENSE.txt" for information on the history of this software, terms & conditions for usage, and a DISCLAIMER OF ALL WARRANTIES.

libpysal's People

Contributors

andrewwinslow, conceptron, darribas, dependabot[bot], dfolch, fbdtemme, fgregg, jgaboardi, jlaura, jo-tham, knaaptime, kryndlea, lanselin, ljwolf, makosak, martinfleis, mgeeeek, mhwang4, nmalizia, pastephens, pedrovma, pre-commit-ci[bot], qulogic, schmidtc, sebastic, shaohu, sjsrey, slumnitz, tayloroshan, weikang9009


libpysal's Issues

to_networkx argument name changed

networkx changed the name of ebunch to ebunch_to_add in their Graph.add_weighted_edges_from method.

Now, this means that adding weighted edges to the graph representing the W object fails, since our code still passes the argument by the name ebunch.

Request a new release of libpysal

libpysal relies on shapely as a dependency, but its installation procedure does not automatically install shapely.

libpysal's api.py contains a collection of functions and classes from across libpysal, including both those that depend on shapely and those that do not. For users who only want the non-shapely-dependent functions and do not have shapely installed on their machine, import libpysal.api is problematic in the currently released version of libpysal. This is also a vital issue for giddy, which uses only non-shapely-dependent functions via import libpysal.api.

Fortunately, this commit (only import the shapely extension if shapely is available) resolves the issue, so it would be very helpful to release a new version of libpysal with this commit integrated.

core.util.WKTParser.fromWKT does not correctly determine holes

AFAICT, the WKTParser in core.util.wkt never constructs polygons with holes, since it never passes a holes keyword argument.

This came up because I've gotten a MultiPolygon parser working (I think) on the examples at the bottom of that file. In testing it, I was correctly parsing the Multipolygon, but the Polygon parser was never correctly determining what was the exterior ring and what was the interior ring.

Page 2-8 of the OGC Simple Features specification says Polygons should be:

1 exterior ring and 0 or more interior boundaries.

And, as far as I can tell, the exterior ring is always first, with an arbitrary number of holes listed afterwards, though I can't find that stated in the spec.

ENH: shared perimeter contiguity weighting.

When we're coming from geodataframes, we can probably implement a shared perimeter weighting scheme pretty easily. This means that, instead of contiguity being a binary relation, we'd assign the weight to the adjacency graph according to how much of the polygon's perimeter is shared along their shared edge.

For Queen weights, this would necessarily reduce them down to Rook weights, so we would implement this only for Rook.

  1. do a first pass for binary contiguity
  2. for neighbors:
    1. poly_focal.intersection(poly_neighbor).length / poly_focal.boundary.length would give the asymmetric perimeter weight
    2. poly_focal.intersection(poly_neighbor).length / (poly_focal.boundary.length + poly_neighbor.boundary.length) would give a symmetrized perimeter weight

This would be an interesting addition to the library. It could be implemented as a function in util, and then applied at the end of initialisation for Rook.from_shapefile or from_dataframe if perimeter=True.
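
A rough sketch of this proposal using geopandas geometries and the existing Rook constructor (the helper name is hypothetical, and a default 0..n-1 index is assumed):

import geopandas
from libpysal import weights

def perimeter_weights(gdf, symmetric=False):
    """Hypothetical helper: weight each rook adjacency by shared boundary length."""
    w = weights.Rook.from_dataframe(gdf)  # first pass: binary contiguity
    new_weights = {}
    for focal, neighbors in w.neighbors.items():
        poly_focal = gdf.geometry.iloc[focal]
        row = []
        for neighbor in neighbors:
            poly_neighbor = gdf.geometry.iloc[neighbor]
            shared = poly_focal.intersection(poly_neighbor).length
            denom = poly_focal.boundary.length
            if symmetric:
                denom += poly_neighbor.boundary.length
            row.append(shared / denom)
        new_weights[focal] = row
    return weights.W(w.neighbors, new_weights)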

weights.plot does not handle named observations

Right now, if you have weights constructed using an idVariable or ids list, the plot method fails.

This is because it is using the iloc method to do lookups based on iteration indices.

It should be possible to rework this to use names and only names.

import not working after local install

After installing with

python setup.py install

imports are failing

Python 2.7.13 |Continuum Analytics, Inc.| (default, Dec 20 2016, 23:09:15)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
>>> import libpysal
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named libpysal

Nightli.es build permissions

For some reason I have permission to set up Nightli.es builds for both spint and spglm, but not for the actual package I maintain, spaghetti. Is there a quick solution for this? I haven't found one. The screenshot below was taken from pysal's Travis profile.

[screenshot: pysal's Travis profile]

redirect pysal/#934 to libpysal

pysal's #934 involves a fix for unsupported filetypes that removes a class of errors we didn't know we were making in core.

We need to port that commit here and also unit test with TestCase.assertRaises to ensure that this gets raised for unsupported filetypes.

Edit island neighbor values in a spatial W object in PySAL

I am working on clustering a map with some islands in it.
These islands have no neighbors in the W object, so I can't run the max-p clustering function; I get the message "No initial solution found".
What I tried to do is to calculate neighbors for the islands using distance-based spatial weights, then substitute those values into the contiguity spatial weights.
The new W object still causes the same problem I used to get before, I am not sure where the problem comes from.
Here is the piece of code

import numpy as np
import pysal as ps
from sklearn.preprocessing import MinMaxScaler

# read the attribute, coerce to float, and zero-fill missing values
shape = ps.open('file.dbf')
shape = shape.by_col_array('Solar')
shape = shape.astype(float)
shape[np.isnan(shape)] = 0
scaler = MinMaxScaler()
shape1 = scaler.fit_transform(shape)

# contiguity weights (with islands) plus KNN weights as a fallback
w = ps.weights.Queen.from_shapefile('file.shp')
knn5 = ps.knnW_from_shapefile('file.shp')
aa = w.islands
mtrx, idx = w.full()

# graft each island's nearest neighbor into the contiguity structure
for indx, val in enumerate(aa):
    w.neighbors[aa[indx]] = [knn5.neighbors[aa[indx]][0]]
    w.neighbors[knn5.neighbors[aa[indx]][0]] = w.neighbors[knn5.neighbors[aa[indx]][0]] + [aa[indx]]

w1 = ps.weights.weights.W(w.neighbors, id_order=w.id_order, ids=w.id_order)
thr = 0.1 * sum(shape1[:, 0])
np.random.seed(1234)
r = ps.region.maxp.Maxp(w1, shape1, floor=thr, floor_variable=shape1[:, 0], initial=4000)

Then I get the same message, "No initial solution found".
Any help please?

Rook & KNN take different id lists from the dataframe

KNN correctly takes the index of the dataframe as the index of the weights, but rook does not.

from libpysal import weights
from libpysal import examples
import geopandas

columbus = geopandas.read_file(examples.get_path('columbus.shp'))
columbus_sub = columbus.sample(frac=1)

print(columbus_sub.index[0:5]) # should not be (0,1,2,3,4,) but instead random

Wr = weights.Rook.from_dataframe(columbus_sub)
Wknn1 = weights.KNN.from_dataframe(columbus_sub)

print(Wr.id_order) # is (0,1,2,3...)
print(Wknn1.id_order) # matches columbus_sub.index

These should behave the same everywhere, and I think they should behave like KNN, taking the indices from the dataframe directly.

Kernel docstring does not mention unique Gaussian kernel behavior

I keep getting bit by this.

Our Gaussian kernel only computes for observations within the bandwidth distance.

But, in theory, this isn't necessary, since observations are still connected in the Gaussian kernel past this bandwidth.

Thus, in quite a few cases, this distance can result in truncations at pretty high w_{ij}; I get truncation at around .25 with an adaptive bandwidth on Berlin neighborhoods data from geopython...

Since this isn't going to be fixed (I recall @TaylorOshan running into this when trying to build GWR on top of existing PySAL stuff), we need to disclaim that we force all kernels to be truncated at the bandwidth.
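
A small sketch of the behavior worth documenting (the coordinates are arbitrary):

import numpy as np
from libpysal.weights import Kernel

points = np.array([(0, 0), (1, 0), (2, 0), (5, 0)])

# Gaussian kernel with a fixed bandwidth of 2.5: pairs farther apart than
# the bandwidth receive no weight at all, even though a Gaussian kernel
# is formally nonzero at any distance
kw = Kernel(points, function="gaussian", fixed=True, bandwidth=2.5)
print(kw.neighbors)  # (0, 0) and (5, 0) are not connected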

libpysal/libpysal/cg/__init__.py not importing `rtree`

  • Platform information:
    posix darwin
  • Python version:
    3.6.6
  • SciPy version:
    1.1.0
  • NumPy version:
    1.15.0

rtree is currently not being imported during __init__ (though it is being deleted), causing failures in spaghetti.utils.py.

from .shapes import *
from .standalone import *
from .locators import *
from .kdtree import *
from .sphere import *
from .voronoi import *
from .alpha_shapes import alpha_shape, alpha_shape_auto

del rtree
del kdtree
del locators
del voronoi
del standalone
del alpha_shapes
del shapes

attach_islands assumes transform='b' & assumes k=1

It'd be nice to be able to use more than 1 nearest neighbor for the islands, use different kinds of supplemental weights, or use non-binary transforms.

I think this conceptually involves:

  1. Ensuring the "supplement" weights conform to the transform of the target weights (probably usually binary, since this seems to be used mostly for rook/queen with islands).
  2. Merging the neighbors and weights of each island in the target with its corresponding neighbors and weights in the supplement.

I believe this mostly looks the same as what's implemented:

import copy
from libpysal.weights import W

def attach_islands(target, supplement):
    # the supplement graph must share the target's id order and transform
    assert supplement.id_order == target.id_order
    supplement.transform = target.transform
    neighbors, weights = copy.deepcopy(target.neighbors), copy.deepcopy(target.weights)
    for island_ix in target.islands:
        neighbors[island_ix] = supplement.neighbors[island_ix]
        weights[island_ix] = supplement.weights[island_ix]
    return W(neighbors, weights, id_order=target.id_order)

weights.Voronoi is a function, not a class.

This thread discusses this rough edge.

Whether we should document the 2.0 release as-is and then change this (mandating another subsequent change to the documentation), or change it first (attempting to avoid spilling out into the rest of the library) and then document the 2.0 release, is up for grabs.

inconsistency in api?

from . import util

I wonder why not

from .util import *

The old api.py has it like this:

from .weights.util import lat2W, block_weights, comb, order, higher_order, shimbel, remap_ids, full2W, full, WSP2W, insert_diagonal, get_ids, get_points_array_from_shapefile, min_threshold_distance, lat2SW, w_local_cluster, higher_order_sp, hexLat2W, regime_weights, attach_islands, nonplanar_neighbors, fuzzy_contiguity

Weights for circle, spheres and other connected on borders

Hi,
I wanted to know if there is a way of generating weights for circles, spheres, and the like, similar to:

w = pysal.lat2W(3, 1)

This generates

w.full()
Out[62]: (array([[ 0.,  1.,  0.],
[ 1.,  0.,  1.],
[ 0.,  1.,  0.]]), [0, 1, 2])

But on a circle it should be

(array([[ 0.,  1.,  1.],
[ 1.,  0.,  1.],
[ 1.,  1.,  0.]]), [0, 1, 2])

Thanks!
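
As far as I know there is no built-in constructor for this, but a W with wraparound can be assembled directly from a neighbor dictionary; a sketch (the helper name is hypothetical):

from libpysal.weights import W

def ring_lattice(n):
    """Hypothetical helper: binary weights for n sites arranged on a circle."""
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
    return W(neighbors)

w = ring_lattice(3)
print(w.full()[0])
# [[0. 1. 1.]
#  [1. 0. 1.]
#  [1. 1. 0.]]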

deprecate or test shapely_ext

the shapely extension was and is a very handy piece of code, wrapping correct interfaces between PySAL and shapely.

It's also used by geotable, my shapely-less mimic of geopandas, and by the testing suite for the io.wkb module.

Currently, tests are not run for this code. Further, we usually teach geopandas directly for this functionality. There is also some loss of information moving from shapely to pysal, since we do not maintain the order in which holes are nested inside exterior rings (pysal/#820, pysal/#852).

So, we should either:

  1. deprecate the shapely extension. This would not affect geotable, since that dependency is soft. It would affect tests for wkb, which should be simple to switch using shapely.geometry.shape.
  2. write tests for (and commit to maintain) this namespace.

on-the-fly W

This is originally a question, but if not implemented, it could be a nice additional user method to have. Is it possible to build a W from a subset of polygons from a shapefile?

Test-case example: calculate W for counties in only one state using a shapefile of all counties in the US. What I have in mind is something like ps.queen_from_shapefile but, instead of taking a path to a shapefile, it takes an iterable of polygons.

This would be useful for my example case, but also in many more contexts. Take the hypothetical case in which a user extracts a subset of polygons from a large database (e.g. spatialite) into memory and needs to build a W matrix from there (assuming the user has a way of converting the polygons from spatialite to PySAL geometries).
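
With the dataframe-based constructors, one workable sketch is to filter the polygons in memory before building the graph (the file path and column name below are hypothetical):

import geopandas
from libpysal import weights

counties = geopandas.read_file("us_counties.shp")  # hypothetical path
one_state = counties[counties["STATEFP"] == "06"]  # hypothetical column

# contiguity weights built only for the in-memory subset of polygons
w = weights.Queen.from_dataframe(one_state)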

nonplanar_neighbors fails when sindex is not constructed.

nonplanar_neighbors uses the sindex attribute to avoid unnecessarily fuzzing some observations. We assume sindex exists and, if it does not, the computation fails with an AttributeError.

Right now, if geopandas is installed using pip, it does not bring with it libspatialindex, which is a C library. If you install geopandas with conda, you do get libspatialindex by default. On Travis, we use pip, so that's failing.

So, we need to

  1. bail on nonplanar_neighbors when geodataframe.sindex is None.
  2. install geopandas on travis from github.

I'm doing 2. on #58
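
A minimal sketch of the guard in step 1, to be placed inside nonplanar_neighbors (the message text is illustrative):

# bail early if no spatial index is available
if gdf.sindex is None:
    raise ValueError(
        "nonplanar_neighbors requires a spatial index; install rtree/"
        "libspatialindex so that geodataframe.sindex is available"
    )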

alphashapes & n<4

In theory, the "delaunay" triangulation for cases where the number of points is less than 4 is still "known": n=3 is the triangle, n=2 is the line, and n=1 is the point.

As it stands, Qhull will error out when passed a collection with fewer than four points. It'd be nice if we checked for this and returned something sane in these cases, rather than bailing out from Qhull.

Pulling example datasets from Carto

Opening this ticket to explore an idea that @ljwolf and I had briefly chatted about. For the example datasets that are used in pysal, could these be maintained externally and just pulled by the library when required and cached locally? It's really easy to pull a Carto table directly into a pandas dataframe using our SQL API, so it might be a natural fit to store some of those data sources in Carto?

This would be similar to the approach scikit takes with grabbing example datasets.

only warn on disconnected components if there are no islands

When a graph has an island, we warn the user that both the graph is disconnected and that there is an island.

We should only warn about disconnected components when there is no island, since islands always result in disconnected components.
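
In outline, the proposed check could look like this (a sketch against the W attributes, not the actual patch):

import warnings
from libpysal.weights import W

def warn_connectivity(w: W):
    """Sketch: islands already imply disconnectedness, so only warn
    about components when there are no islands."""
    if len(w.islands) > 0:
        warnings.warn(f"The weights matrix contains {len(w.islands)} island(s).")
    elif w.n_components > 1:
        warnings.warn(f"The weights matrix is disconnected: {w.n_components} components.")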

two modules "Wsets.py" and "util.py" depend on each other

The function WSP2W is defined in util.py, which imports the module Wsets.py. However, in Wsets.py, WSP2W is called in the function w_clip without prior declaration or import, so a NameError is raised if outSP is set to False.


Since util.py already depends on Wsets.py, we cannot simply import WSP2W from util.py at the top of Wsets.py without creating a circular import. Any idea how to resolve this?
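
One conventional resolution is a deferred, function-local import, so that neither module needs the other at import time; a sketch (the _clip_sparse helper is hypothetical):

# Wsets.py (sketch): defer the import into the function body so that
# util.py can keep importing Wsets at module level without a cycle
def w_clip(w1, w2, outSP=True):
    wc = _clip_sparse(w1, w2)  # hypothetical helper returning a WSP
    if not outSP:
        from .util import WSP2W  # imported lazily, only when needed
        wc = WSP2W(wc)
    return wc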

BUG: test_weights_IO.py is using pysal and hard-coded paths

This test file has two issues that will raise failures if the tests are run:

  1. It is using pysal not libpysal
  2. Directory paths are hard coded to `C:1st

I’m not sure why Travis didn’t flag these when the merge was made?

When I run the tests locally I get:

ERROR: Failure: ModuleNotFoundError (No module named 'pysal')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/serge/anaconda3/lib/python3.6/site-packages/nose/failure.py", line 39, in runTest
    raise self.exc_val.with_traceback(self.tb)
  File "/home/serge/anaconda3/lib/python3.6/site-packages/nose/loader.py", line 417, in loadTestsFromName
    addr.filename, addr.module)
  File "/home/serge/anaconda3/lib/python3.6/site-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/home/serge/anaconda3/lib/python3.6/site-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/home/serge/anaconda3/lib/python3.6/imp.py", line 234, in load_module
    return load_source(name, filename, file)
  File "/home/serge/anaconda3/lib/python3.6/imp.py", line 172, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 675, in _load
  File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
  File "/home/serge/Documents/p/pysal/src/libpysal/libpysal/weights/tests/test_weights_IO.py", line 2, in <module>
    import pysal
ModuleNotFoundError: No module named 'pysal'

----------------------------------------------------------------------
Ran 165 tests in 11.996s

FAILED (SKIP=9, errors=1)

`contains_point` cannot handle nested exterior rings.

I hit this bug working on the WKT serialization/deserialization for the labelled array gsoc.

Take the following example with two polygons. The first multipolygon is a double-torus defined by a square ring between 1 and .75, and another square ring between .5 and .25. The other multipolygon is a single torus with an island, defined by a square between 1 and .75, and an inner square at .5.

Our contains_point implementation bails when any hole contains the point, which is accurate for the double-torus, but incorrect for the single-torus-with-island:

>>> import pysal as ps
>>> double_torus = ps.cg.Polygon(parts=[[(-1, -1), (-1, 1), (1, 1), (1, -1)],           # exterior square at 1
                                        [(-.5, -.5), (-.5, .5), (.5, .5), (.5, -.5)]],  # exterior square at .5
                                 holes=[[(-.75, -.75), (-.75, .75), (.75, .75), (.75, -.75)],   # hole square at .75
                                        [(-.25, -.25), (-.25, .25), (.25, .25), (.25, -.25)]])  # hole square at .25
>>> single_island_in_torus = ps.cg.Polygon(parts=[[(-1, -1), (-1, 1), (1, 1), (1, -1)],           # exterior square at 1
                                                  [(-.5, -.5), (-.5, .5), (.5, .5), (.5, -.5)]],  # exterior square at .5
                                           holes=[[(-.75, -.75), (-.75, .75), (.75, .75), (.75, -.75)]])  # hole square at .75; inner square is solid
>>> double_torus.contains_point((0, 0))  # is hollow, so the origin is not contained
False
>>> single_island_in_torus.contains_point((0, 0))  # should contain the origin, since the inner square is solid
False

Again, if any hole contains the point, the algorithm bails, meaning any concentric multi-polygon will have incorrect results for point-in-polygon searches.

In fact, I don't think we can do a correct point in polygon search on multipolygons without establishing a ring-hole nesting.

Like, take a point P related to a multipolygon M composed of 2 exterior rings, A,B and one hole, H. Let P be contained in B. Stating the rings in OGC style, if M := ((AH), B), then P in H is not sufficient to exclude P from M, since P in (B intersection H) implies P in M.

Also, no clear even-odd rule exists for rings with no topological sorting, since checking ring/hole membership of P in either M:= ((AH),B) or M:= (A, (BH)) is indeterminate; "P is contained in two exterior rings and one hole" is ambiguous about P and M, since we don't know whether H contains B.

Solution

Sort the rings of a polygon topologically and record them in OGC style. Then, conduct a level-set membership test, where a naive point-in-polygon ring test can be applied walking down the topological sorting. The "top" ring governs membership; if the "top" ring is a hole, the point is not inside. Otherwise, the point is inside.
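
A sketch of that level-set test, where ring_contains and ring_area stand in for a naive crossing-number predicate and a ring-area helper (both hypothetical):

def contains_point(rings, point):
    """rings: list of (vertices, is_hole) pairs for one multipolygon.

    Among all rings enclosing the point, the innermost ("top") ring
    decides membership: a hole means outside, an exterior means inside.
    """
    enclosing = [(ring, is_hole) for ring, is_hole in rings
                 if ring_contains(ring, point)]  # hypothetical predicate
    if not enclosing:
        return False
    # with properly nested rings, the innermost enclosing ring is the
    # one with the smallest area
    top_ring, top_is_hole = min(enclosing, key=lambda pair: ring_area(pair[0]))
    return not top_is_hole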

MGWR_Georgia_example.ipynb fails due to different sample data shapes

  • Platform information:
    nt win32

  • Python version:
    3.6.6 |Anaconda, Inc.| (default, Jun 28 2018, 11:27:44) [MSC v.1900 64 bit (AMD64)]

  • SciPy version:
    1.1.0

  • NumPy version:
    1.14.2

This step fails using the Georgia sample data:

# Add GWR parameters to GeoDataframe
georgia_shp['gwr_intercept'] = gwr_results.params[:, 0]

ValueError: Length of values does not match length of index

Initial DataFrame information:
GData_utm.csv shape is (172, 18)
G_utm.shp shape is (159, 13)

Support for missing data and pandas dataframes

Original author: [email protected] (January 31, 2013 20:49:46)

It would be very useful for pysal to recognize and handle NaN values in NumPy arrays and/or pandas dataframes. Sometimes, it is not desirable to simply drop all observations with missing data, as these observations can be important when calculating spatial lags.

Related, it would also be helpful to use pandas indexing to align the spatial weights matrix or matrices with the variables. Again, this is primarily an issue because of missing data.

Thanks!

Original issue: http://code.google.com/p/pysal/issues/detail?id=239
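
Until such support exists, a workaround sketch with today's libpysal is to mask missing values before lagging (the zero-fill policy shown is one choice among several, and it biases row-standardized lags toward zero):

import numpy as np
from libpysal import weights

y = np.array([1.0, np.nan, 3.0, 4.0])
w = weights.lat2W(2, 2)  # 2x2 rook lattice
w.transform = "r"        # row-standardize

# zero-fill: missing observations contribute nothing to their neighbors' lags
y_filled = np.where(np.isnan(y), 0.0, y)
lag = weights.lag_spatial(w, y_filled)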

quadtree files

It looks like there is some additional material that is not integrated into the rest of the repo structure.

There's unique/distinct data in libpysal/cg/tests/data and some images in libpysal/cg/tests/img. The data should live in examples, and we shouldn't ship the images at all. Also, those are used in an ipynb file inside of the test directory which should also not be shipped in libpysal.

Import libpysal failed on python 3.5 and 3.6

In both python 3.5 and 3.6 environments (on a mac), an attempt to import libpysal after pip install libpysal will incur the following error:

In [1]: import libpysal
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-1-3a2ef2a3cd6f> in <module>()
----> 1 import libpysal

/Users/weikang/anaconda/envs/py3/lib/python3.6/site-packages/libpysal/__init__.py in <module>()
     21     Tools for creating and manipulating weights
     22 """
---> 23 import cg
     24 import io
     25 import weights

ModuleNotFoundError: No module named 'cg'

Import is fine in a python 2.7 environment.

error importing v3.0.7

In [1]: import libpysal
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-1-b416346c1a00> in <module>()
----> 1 import libpysal

~/libpysal/libpysal/__init__.py in <module>()
     26 """
     27 from . import cg
---> 28 from . import io
     29 from . import weights
     30 from . import examples

~/libpysal/libpysal/io/__init__.py in <module>()
      1 from . import fileio
      2 from .tables import *
----> 3 from .iohandlers import *
      4 from .util import *
      5 open = fileio.FileIO

ModuleNotFoundError: No module named 'libpysal.io.iohandlers'
