
oq-hazardlib's People

Contributors

acerisara, angri, cbworden, danciul, daniviga, francescoingv, g-weatherill, griffij, gvallarelli, julgp, klunk386, larsbutler, luisera, matley, micheles, mmpagani, monellid, nackerley, nastasi-oq, nickhorspool, ptormene, raoanirudh, rcgee, silviacanessa, vpoggi, vup1120

oq-hazardlib's Issues

Zhao updates

The Zhao et al. (2006) GMPEs have been updated, and I'm opening this issue to request new GSIM classes for the Zhao et al. (2016) GMPEs (subduction interface, subduction slab, and shallow crustal/upper mantle).

Atkinson (2008) GMPE

I'm creating this issue to request the Atkinson (2008) GMPE. As with some of the other Atkinson and Boore GMPEs currently implemented in OpenQuake, it was updated by Atkinson and Boore (2011). I'm requesting this GMPE because it was included in the 2014 US National Seismic Hazard Maps for the CEUS.

GMPE subclasses without required depth parameters

At least one GMPE that uses a depth parameter does not require it in the SitesContext (e.g., BooreEtAl2014), while others do (e.g., AbrahamsonEtAl2014, CampbellBozorgnia2014, ChiouYoungs2014). I would therefore like to request subclasses of the latter GMPEs that do not require the depth parameters. This should be a fairly trivial change; for AbrahamsonEtAl2014, I think it is as simple as setting z1pt0 = z1pt0ref. Currently we provide a SitesContext that achieves the same result, but that requires needless bookkeeping when working with multiple GMPEs that use different equations for z1pt0. The change should also be generally useful, since the depth parameters are often unavailable.
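A minimal sketch of what such a subclass could look like, assuming the SitesContext can be copied and assigned a z1pt0 attribute (the class name is made up; the reference-depth formula is the California relation of Abrahamson et al., 2014, with z1pt0 in metres, matching hazardlib's convention):

    import copy
    import numpy

    from openquake.hazardlib.gsim.abrahamson_2014 import AbrahamsonEtAl2014


    class AbrahamsonEtAl2014NoZ1pt0(AbrahamsonEtAl2014):  # hypothetical name
        """ASK14 with z1pt0 fixed to its vs30-based reference value,
        so the SitesContext does not need to carry z1pt0."""

        # drop z1pt0 from the required site parameters
        REQUIRES_SITES_PARAMETERS = (
            AbrahamsonEtAl2014.REQUIRES_SITES_PARAMETERS - set(['z1pt0']))

        def get_mean_and_stddevs(self, sites, rup, dists, imt, stddev_types):
            sites = copy.copy(sites)
            # reference depth (m) to the 1.0 km/s horizon, California
            # relation of Abrahamson et al. (2014)
            sites.z1pt0 = numpy.exp(-7.67 / 4. * numpy.log(
                (sites.vs30 ** 4 + 610. ** 4) / (1360. ** 4 + 610. ** 4)))
            return super(AbrahamsonEtAl2014NoZ1pt0, self).get_mean_and_stddevs(
                sites, rup, dists, imt, stddev_types)

The same pattern would work for CampbellBozorgnia2014 and ChiouYoungs2014 with their respective z2pt5 and z1pt0 reference equations.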

Building .deb on Debian Wheezy fails due to missing python2.6-dev dependency

Building the oq-hazardlib (commit dc15640) deb fails for Debian Wheezy with the following error:

gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/lib/pymodules/python2.6/numpy/core/include -I/usr/include/python2.6 -c speedups/geodeticmodule.c -o build/temp.linux-x86_64-2.6/speedups/geodeticmodule.o -Wall -O2
speedups/geodeticmodule.c:19:20: fatal error: Python.h: No such file or directory
compilation terminated.
Traceback (most recent call last):
  File "setup.py", line 98, in <module>
    zip_safe=False,
  File "/usr/lib/python2.6/distutils/core.py", line 152, in setup
    dist.run_commands()
  File "/usr/lib/python2.6/distutils/dist.py", line 975, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.6/distutils/dist.py", line 995, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.6/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/usr/lib/python2.6/distutils/cmd.py", line 333, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.6/distutils/dist.py", line 995, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.6/dist-packages/setuptools/command/build_ext.py", line 46, in run
    _build_ext.run(self)
  File "/usr/lib/python2.6/distutils/command/build_ext.py", line 340, in run
    self.build_extensions()
  File "/usr/lib/python2.6/distutils/command/build_ext.py", line 449, in build_extensions
    self.build_extension(ext)
  File "/usr/lib/python2.6/dist-packages/setuptools/command/build_ext.py", line 182, in build_extension
    _build_ext.build_extension(self,ext)
  File "/usr/lib/python2.6/distutils/command/build_ext.py", line 499, in build_extension
    depends=ext.depends)
  File "/usr/lib/python2.6/distutils/ccompiler.py", line 621, in compile
    self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
  File "/usr/lib/python2.6/distutils/unixccompiler.py", line 180, in _compile
    raise CompileError, msg
distutils.errors.CompileError: command 'gcc' failed with exit status 1
dh_auto_build: python2.6 setup.py build --force returned exit code 1
make: *** [build] Error 1

This is because the Build-Depends line in debian/control is missing python2.6-dev, while python2.6 itself is pulled in by python-nose.

Adding the dependency allows the build to succeed, but I don't know whether building against Python 2.6 is the correct behaviour.

Rx distance calculator provides wrong results for certain fault geometries

The Rx distance calculator (https://github.com/gem/oq-hazardlib/blob/master/openquake/hazardlib/geo/surface/base.py#L228) provides wrong results for certain fault geometries.

Consider a simple fault surface defined as follows:

    from openquake.hazardlib.geo import Line, Point
    from openquake.hazardlib.geo.surface import SimpleFaultSurface

    fault_trace = Line([Point(0., 0.), Point(0.3, 0.8), Point(1., 1.)])
    surf = SimpleFaultSurface.from_fault_data(
        fault_trace=fault_trace,
        upper_seismogenic_depth=0.,
        lower_seismogenic_depth=10.,
        dip=50.,
        mesh_spacing=2.)

the Rx distance pattern is correct:

[figure: Rx distance map for the first fault trace]

However, if I consider the following fault trace:

    fault_trace = Line([Point(0., 0.), Point(0.8, 0.3), Point(1., 1.)])

then the Rx distance pattern is wrong:

[figure: Rx distance map for the second fault trace]
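For reference, a minimal sketch of how the pattern can be inspected (the grid extent and spacing are arbitrary choices for illustration):

    import numpy
    from openquake.hazardlib.geo import Mesh

    # regular grid of sites around the fault trace
    lons, lats = numpy.meshgrid(numpy.linspace(-0.5, 1.5, 101),
                                numpy.linspace(-0.5, 1.5, 101))
    sites = Mesh(lons.flatten(), lats.flatten(), None)
    # Rx is measured perpendicular to the strike and is negative on the
    # footwall side, so the sign pattern should follow the fault trace
    rx = surf.get_rx_distance(sites)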

quantile_curve(curves, quantile, weights) is broken when weights is None

By looking at https://github.com/gem/oq-hazardlib/blob/engine-2.2/openquake/hazardlib/stats.py#L35 one can see that there is a special case when the weights are None; here is the code which has been used for 5+ years:

    if weights is None:
        # this implementation is an alternative to
        # numpy.array(mstats.mquantiles(curves, prob=quantile, axis=0))[0]
        # more or less copied from the scipy mquantiles function, just special
        # cased for what we need (and a lot faster)
        arr = numpy.array(curves).reshape(len(curves), -1)
        p = numpy.array(quantile)
        m = 0.4 + p * 0.2
        n = len(arr)
        aleph = n * p + m
        k = numpy.floor(aleph.clip(1, n - 1)).astype(int)
        gamma = (aleph - k).clip(0, 1)
        data = numpy.sort(arr, axis=0).transpose()
        qcurve = (1.0 - gamma) * data[:, k - 1] + gamma * data[:, k]
        return qcurve

This code is used in the case of sampling. I submit that it is broken: for weights=None one would expect the same result as assigning identical weights to all realizations, consistent with how mean_curve works, but that is not the case. The problem is that quantile_curve(curves, quantile, weights) uses a different algorithm when the weights are not None, so the numbers differ. For instance:

In [1]: import numpy

In [2]: from openquake.hazardlib.stats import quantile_curve

In [3]: quantile = 0.75

In [4]: curves = numpy.array([
            [.98161, .97837, .95579],
            [.97309, .96857, .93853],
        ])

In [5]: quantile_curve(curves, quantile, None)
Out[5]: array([ 0.98161,  0.97837,  0.95579])

In [6]: quantile_curve(curves, quantile, [.5, .5])
Out[6]: array([ 0.97735,  0.97347,  0.94716])

Given that we do not have a performance problem (the postprocessing is fast compared to the real computation), I would use the same algorithm in all cases, i.e. the interpolation-based algorithm used in the full enumeration case with nontrivial weights, along the lines of the sketch below.
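A sketch of the proposed unification, treating weights=None as equal weights and always using the interpolation-based algorithm (an illustration of the idea, not the final fix; it returns the values of Out[6] above for both calls):

    import numpy

    def quantile_curve(curves, quantile, weights=None):
        # weights=None now means equal weights for all realizations
        curves = numpy.array(curves).reshape(len(curves), -1)
        if weights is None:
            weights = numpy.ones(len(curves)) / len(curves)
        else:
            weights = numpy.array(weights)
        result = numpy.zeros(curves.shape[1])
        for i in range(curves.shape[1]):
            data = curves[:, i]
            sortidx = numpy.argsort(data)
            cum_weights = numpy.cumsum(weights[sortidx])
            # interpolate the quantile along the cumulative weight axis
            result[i] = numpy.interp(quantile, cum_weights, data[sortidx])
        return result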

PS: the code for weights=None was built to be compatible with scipy.mstats.mquantiles(curves, prob=quantile); however, the tests have this comment:

        # TODO(LB): Check with our hazard experts to see if this is reasonable
        # tolerance. Better yet, get a fresh set of test data. (This test data
        # was just copied verbatim from from some old tests in
        # `tests/hazard_test.py`.

MultiMFD with homogeneous parameters

It will be very common for a MultiMFD to have homogeneous parameters (i.e. equal for all sites). We should special-case that in the XML representation to save space.
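A hedged sketch of what the compact form could look like (the element names are hypothetical, not an implemented schema):

    <multiMFD kind="incrementalMFD" size="3">
        <!-- scalar parameters are written once and apply to all sites,
             instead of being repeated size times -->
        <bin_width>0.1</bin_width>
        <min_mag>4.5</min_mag>
    </multiMFD>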

Inefficiencies in the disaggregation calculator

In the current disaggregation calculator we repeat the same (expensive) calculation twice. First we compute the Rjb distance between each site and the rupture (i.e. a mesh of points) (see https://github.com/gem/oq-hazardlib/blob/master/openquake/hazardlib/calc/disagg.py#L171); then we compute Rjb again to find the lon and lat of the mesh points closest to each site (see https://github.com/gem/oq-hazardlib/blob/master/openquake/hazardlib/calc/disagg.py#L173). During the first call we could also obtain the lons and lats and return them when needed; this would halve the time spent on rupture-site distance calculation, as sketched below.
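A minimal sketch of the idea, with a made-up helper name and a flat-earth distance standing in for hazardlib's geodetic calculation:

    import numpy

    def min_distance_with_coords(mesh_lons, mesh_lats, site_lon, site_lat):
        # one pass over the rupture mesh returns both the minimum distance
        # and the lon/lat of the closest point, so a second distance
        # calculation is unnecessary
        dlon = (mesh_lons - site_lon) * numpy.cos(numpy.radians(site_lat))
        dlat = mesh_lats - site_lat
        dist = numpy.sqrt(dlon ** 2 + dlat ** 2) * 111.32  # degrees -> km
        i = dist.argmin()
        return dist.flat[i], mesh_lons.flat[i], mesh_lats.flat[i]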

Added a reader and a writer for GriddedSurface

This is needed for the Japan model. A GriddedSurface is defined by a list of 3D points. The NRML representation could be:

<surface>
  <griddedSurface>
       <gml:posList>
         -124.704 40.363 5.49326 -124.977 41.214 4.98856 -125.14 42.096 4.89734
       </gml:posList>
  </griddedSurface>
</surface>
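For the reading side, a minimal sketch of turning the posList content into hazardlib Points (the helper name is made up):

    from openquake.hazardlib.geo import Point

    def parse_poslist(text):
        # the posList is a flat sequence of (lon, lat, depth) triples
        values = list(map(float, text.split()))
        assert len(values) % 3 == 0, 'expected (lon, lat, depth) triples'
        return [Point(*values[i:i + 3]) for i in range(0, len(values), 3)]

Applied to the posList above, this returns the three Points defining the surface.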

Do we still need downsample_trace?

Now both ComplexFaultSurfaces and SimpleFaultSurfaces store the original edges, so we could use those instead of downsampling. If the original edges are not available, should that be considered an error?

PR #492 breaks tests on Windows

Pull #492 breaks the "mesh" tests on Windows (independently of the numpy/scipy/shapely versions). The errors look like this:

test_get_closest_points_mesh1D (openquake.hazardlib.tests.geo.surface.multi_test.DistancesTestCase) ... ERROR

======================================================================
ERROR: test_get_closest_points_mesh1D (openquake.hazardlib.tests.geo.surface.multi_test.DistancesTestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\OQM\lib\openquake\hazardlib\tests\geo\surface\multi_test.py", line 151, in test_get_closest_points_mesh1D
    surf = MultiSurface(self.surfaces_mesh1D)
  File "C:\OQM\lib\openquake\hazardlib\geo\surface\multi.py", line 105, in __init__
    self.edge_set = self._get_edge_set(tol)
  File "C:\OQM\lib\openquake\hazardlib\geo\surface\multi.py", line 137, in _get_edge_set
    raise ValueError("Surface %s not recognised" % str(surface))
ValueError: Surface <openquake.hazardlib.tests.geo.surface.multi_test.FakeSurface object at 0x0FC2FF70> not recognised

----------------------------------------------------------------------

How to reproduce:

cd lib\openquake\hazardlib
python -m nose -v -a "!slow"

At a minimum, we should check how/if this affects the demos; if not, the blocker label can be removed.

Align version with Engine

The current version of hazardlib is 0.24; in my opinion:

  • It is stable enough to have a major version >= 1.
  • Its version should be aligned with the Engine's: technically it is already (2.4 vs 0.24), and almost any change to hazardlib must be reflected in the engine (and vice versa), so it would be good to have a single version.

This would greatly reduce confusion.
