
MLMCPy - Multi-Level Monte Carlo with Python

General

MLMCPy is an open source Python implementation of the Multi-Level Monte Carlo (MLMC) method for uncertainty propagation. Once a user defines their computational model and specifies the uncertainty in the model input parameters, MLMCPy can be used to estimate the expected value of a quantity of interest to within a specified precision. Support is available to perform the required model evaluations in parallel (if mpi4py is installed) and extensions of the MLMC method are provided to calculate more advanced statistics (e.g., covariance, CDFs).

Dependencies

MLMCPy is intended for use with Python 2.7 and relies on the following packages:

  • numpy
  • scipy
  • mpi4py (optional for running in parallel)
  • pytest (optional for running unit tests)

A requirements.txt file is included for easy installation of dependencies with pip:

pip install -r requirements.txt

Example Usage

'''
Simple example of propagating uncertainty through a spring-mass model using MLMC.
Estimates the expected value of the maximum displacement of the system when the
spring stiffness is a random variable. See examples/spring_mass/from_model/ for
more details.
'''

import numpy as np
import sys

from MLMCPy.input import RandomInput
from MLMCPy.mlmc import MLMCSimulator

# Add the directory containing the example SpringMassModel to the system path.
sys.path.append('./examples/spring_mass/from_model/spring_mass')
from SpringMassModel import SpringMassModel

# Step 1 - Define random variable for spring stiffness:
# A sampleable function must be provided to create a RandomInput instance in MLMCPy.
def beta_distribution(shift, scale, alpha, beta, size):
    return shift + scale*np.random.beta(alpha, beta, size)

stiffness_distribution = RandomInput(distribution_function=beta_distribution,
                                     shift=1.0, scale=2.5, alpha=3., beta=2.)

# Step 2 - Initialize spring-mass models for three MLMC levels, defined by
# successively finer time steps.
model_level1 = SpringMassModel(mass=1.5, time_step=1.0)
model_level2 = SpringMassModel(mass=1.5, time_step=0.1)
model_level3 = SpringMassModel(mass=1.5, time_step=0.01)
models = [model_level1, model_level2, model_level3]

# Step 3 - Initialize MLMC & predict max displacement to specified error (0.1).
mlmc_simulator = MLMCSimulator(stiffness_distribution, models)
estimates, sample_sizes, variances = mlmc_simulator.simulate(epsilon=1e-1)
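
If mpi4py is installed, the model evaluations above can be distributed across processes by launching the same script with MPI, for example (the script filename here is illustrative):

mpiexec -n 4 python run_spring_mass_mlmc.py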

Getting Started

MLMCPy can be installed via pip from PyPI:

pip install mlmcpy

Alternatively, the source code can be obtained by cloning the repository:

git clone https://github.com/nasa/MLMCPy.git

The best way to get started with MLMCPy is to take a look at the scripts in the examples/ directory. A simple example of propagating uncertainty through a spring-mass system can be found in the examples/spring_mass/from_model/ directory. A second example, in the examples/spring_mass/from_data/ directory, demonstrates the case where a user has access to input-output data from multiple levels of models (rather than a model they can evaluate directly). For more information, see the source code documentation in docs/MLMCPy_documentation.pdf (a work in progress).

Tests

To verify a proper installation, the tests can be run with "py.test" from the tests/ directory, for example:
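
cd tests/
py.test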

Developers

UQ Center of Excellence
NASA Langley Research Center
Hampton, Virginia

This software was funded by and developed under the High Performance Computing Incubator (HPCI) at NASA Langley Research Center.

Contributors: James Warner ([email protected]), Luke Morrill, Juan Barrientos

License

Copyright 2018 United States Government as represented by the Administrator of the National Aeronautics and Space Administration. No copyright is claimed in the United States under Title 17, U.S. Code. All Other Rights Reserved.

Disclaimers

No Warranty: THE SUBJECT SOFTWARE IS PROVIDED "AS IS" WITHOUT ANY WARRANTY OF ANY KIND, EITHER EXPRESSED, IMPLIED, OR STATUTORY, INCLUDING, BUT NOT LIMITED TO, ANY WARRANTY THAT THE SUBJECT SOFTWARE WILL CONFORM TO SPECIFICATIONS, ANY IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR FREEDOM FROM INFRINGEMENT, ANY WARRANTY THAT THE SUBJECT SOFTWARE WILL BE ERROR FREE, OR ANY WARRANTY THAT DOCUMENTATION, IF PROVIDED, WILL CONFORM TO THE SUBJECT SOFTWARE. THIS AGREEMENT DOES NOT, IN ANY MANNER, CONSTITUTE AN ENDORSEMENT BY GOVERNMENT AGENCY OR ANY PRIOR RECIPIENT OF ANY RESULTS, RESULTING DESIGNS, HARDWARE, SOFTWARE PRODUCTS OR ANY OTHER APPLICATIONS RESULTING FROM USE OF THE SUBJECT SOFTWARE. FURTHER, GOVERNMENT AGENCY DISCLAIMS ALL WARRANTIES AND LIABILITIES REGARDING THIRD-PARTY SOFTWARE, IF PRESENT IN THE ORIGINAL SOFTWARE, AND DISTRIBUTES IT "AS IS."

Waiver and Indemnity: RECIPIENT AGREES TO WAIVE ANY AND ALL CLAIMS AGAINST THE UNITED STATES GOVERNMENT, ITS CONTRACTORS AND SUBCONTRACTORS, AS WELL AS ANY PRIOR RECIPIENT. IF RECIPIENT'S USE OF THE SUBJECT SOFTWARE RESULTS IN ANY LIABILITIES, DEMANDS, DAMAGES, EXPENSES OR LOSSES ARISING FROM SUCH USE, INCLUDING ANY DAMAGES FROM PRODUCTS BASED ON, OR RESULTING FROM, RECIPIENT'S USE OF THE SUBJECT SOFTWARE, RECIPIENT SHALL INDEMNIFY AND HOLD HARMLESS THE UNITED STATES GOVERNMENT, ITS CONTRACTORS AND SUBCONTRACTORS, AS WELL AS ANY PRIOR RECIPIENT, TO THE EXTENT PERMITTED BY LAW. RECIPIENT'S SOLE REMEDY FOR ANY SUCH MATTER SHALL BE THE IMMEDIATE, UNILATERAL TERMINATION OF THIS AGREEMENT.

mlmcpy's Issues

Fixed failing test with new pytest version

Error:

tests/input/test_RandomInput.py:93: in <module>
@pytest.mark.parametrize('random_input', [uniform_distribution_input(),
E RemovedInPytest4Warning: Fixture "uniform_distribution_input" called directly. Fixtures are not meant to be called directly, but are created automatically when test functions request them as parameters. See https://docs.pytest.org/en/latest/fixture.html for more information.

Defaults to 1 processor if mpi4py doesn't exist AND if it exists but is not configured correctly

Not exactly sure how we want to handle this yet.

Current Behavior
The code checks whether the mpi4py module exists on the path using the MLMCSimulator.__detect_parallelization() method. If the module does not exist, an ImportError is raised, which is then caught in a try/except block. The code then defaults to 1 processor and proceeds with no warning. This is fine when the module doesn't exist and the user intends to run on a single processor. However, it is not OK if (1) the module doesn't exist and the user is trying to use multiple MPI processes, or (2) the module exists and the user is trying to use multiple MPI processes, but mpi4py is not configured properly.

Recommendations

  • Case 1: Print a warning stating that mpi4py isn't installed; the current approach doesn't leave many options, since the requested number of processes can't be checked without mpi4py.
  • Case 2: Create a separate check for the ImportError raised on from mpi4py import MPI. We still can't raise an error, or else the code won't work when called without mpiexec/mpirun (i.e., when the user wants a single processor), so a more specific warning that the mpi4py import failed would be needed. A sketch of both cases follows this list.
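
A minimal sketch of the handling recommended above. The function name echoes the private __detect_parallelization() method mentioned in this issue, but the structure and warning text are illustrative assumptions, not the actual implementation:

import warnings

def detect_parallelization():
    """Return (number of processes, rank), falling back to serial with a warning."""
    try:
        import mpi4py  # only checking that the package is installed
    except ImportError:
        # Case 1: mpi4py is not installed at all. The requested number of
        # processes cannot be queried, so warn and fall back to one processor.
        warnings.warn("mpi4py is not installed; defaulting to 1 processor.")
        return 1, 0
    try:
        from mpi4py import MPI
    except ImportError:
        # Case 2: mpi4py is installed but not configured properly, so the MPI
        # import itself fails. Warn specifically rather than raising, since
        # raising would also break single-processor runs launched without
        # mpiexec/mpirun.
        warnings.warn("mpi4py is installed but 'from mpi4py import MPI' failed; "
                      "defaulting to 1 processor.")
        return 1, 0
    comm = MPI.COMM_WORLD
    return comm.size, comm.rank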

Cached input/outputs from initialization are used for fastest model

We need to restructure the cache implementation so that the stored initialization inputs/outputs are used for the most expensive, high-fidelity model on the highest level. Currently, the lowest-level model is evaluated first in the simulation loop, so it generally uses all of the cached inputs/outputs before MLMC gets to the higher levels.

Error when initial_sample_size < # processors

MLMCPy/MLMCPy/mlmc/MLMCSimulator.py:237: RuntimeWarning: divide by zero encountered in divide

The cost being calculated there is for a particular processor; the average cost across all processors is then communicated/calculated for determining the optimal sample sizes later. The proper fix will be to rearrange how the average cost is calculated by communicating/summing the total time across all processors and then dividing by the total number of processors.
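
One possible reading of that fix, sketched with mpi4py. The variable names and placeholder values are illustrative, and here the aggregated time is divided by the total number of samples run across all processors (which stays nonzero even when some processors run no samples); the actual bookkeeping in MLMCSimulator may differ:

from mpi4py import MPI

comm = MPI.COMM_WORLD

# Per-processor bookkeeping (placeholder values; in practice these come from
# timing this processor's share of the initial model evaluations).
local_time = 1.2          # wall time spent on this processor's samples
local_sample_count = 3    # number of initial samples run on this processor

# Aggregate across all processors before averaging, instead of computing a
# per-processor average (which divides by zero when a processor is assigned
# zero samples because initial_sample_size < number of processors).
total_time = comm.allreduce(local_time, op=MPI.SUM)
total_samples = comm.allreduce(local_sample_count, op=MPI.SUM)
average_cost = total_time / total_samples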

Generalize initial_sample_size to be an array or a scalar

Let users optionally specify an initial sample size for each level, for cases where fewer evaluations of the expensive models are desired.

Users could specify either an array with size equal to the number of levels, or a scalar, in which case the same sample size is used on every level (the current implementation).
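
A sketch of how the argument could be normalized internally; the helper name and signature below are illustrative, not the existing API:

import numpy as np

def normalize_initial_sample_size(initial_sample_size, num_levels):
    """Expand a scalar to one initial sample size per level, or validate an array."""
    sizes = np.atleast_1d(np.asarray(initial_sample_size, dtype=int))
    if sizes.size == 1:
        # Scalar: reuse the same initial sample size on every level
        # (the current behavior).
        return np.full(num_levels, sizes[0], dtype=int)
    if sizes.size != num_levels:
        raise ValueError("initial_sample_size must be a scalar or have one "
                         "entry per level.")
    return sizes

# Examples:
# normalize_initial_sample_size(100, 3)           -> array([100, 100, 100])
# normalize_initial_sample_size([200, 50, 10], 3) -> array([200,  50,  10])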

Should cache be generated with the same inputs for each level?

Currently, different inputs are used for each level of the cache. This means more model evaluations are done up front (for example, with levels 0-2, level 1 is evaluated for two different sets of inputs in order to compute the differences with levels 2 and 0), but more outputs can be reused at simulation time (?). Is this the right strategy?
