
Marabou


Deep neural networks are proliferating in many applications. Consequently, there is a pressing need for tools and techniques for network analysis and certification. To help address that need, we present Marabou, a framework for verifying deep neural networks.

Marabou is an SMT-based tool that can answer queries over networks. A typical DNN verification query consists of two parts: (i) a neural network, and (ii) a property to be checked; and its result is either a formal guarantee that the network satisfies the property, or a concrete input for which the property is violated (a counter-example).
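As a toy illustration of the "network + property" formulation (plain Python, not Marabou code — Marabou reasons symbolically and exhaustively, whereas this sketch merely samples a grid), a query either holds over the whole input range or yields a concrete counter-example:

```python
# Toy illustration of a DNN verification query (not Marabou code).
# The "network" is a tiny stand-in function, and the property is checked
# over a discretized input range, returning either "holds" or a
# concrete counter-example.

def network(x):
    # stand-in for a neural network: f(x) = 2x + 1
    return 2 * x + 1

def check_property(lo, hi, bound, steps=1000):
    """Check 'network(x) <= bound for all x in [lo, hi]' on a grid.
    Returns (True, None) if it holds, else (False, counterexample)."""
    for i in range(steps + 1):
        x = lo + (hi - lo) * i / steps
        if network(x) > bound:
            return False, x  # concrete input violating the property
    return True, None

holds, cex = check_property(0.0, 1.0, bound=4.0)    # 2x+1 <= 4 on [0,1]: holds
holds2, cex2 = check_property(0.0, 1.0, bound=2.0)  # violated for x > 0.5
```

Unlike this grid search, a verifier like Marabou covers the entire (uncountable) input region, so "holds" is a formal guarantee rather than an empirical observation.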

Marabou uses ONNX as its main network format. It supports feed-forward neural networks with a wide range of activation functions. It also supports the .nnet and TensorFlow network formats.

Properties can be specified using the Python interface. In addition, Marabou also supports the VNNLIB property format.

For more details about the design and features of version 2.0 of Marabou, check out the latest tool paper. The initial version of Marabou is described in a previous tool paper.

Research

Marabou is a research product. More information about publications involving Marabou can be found here.

Installation

Installing via Pip

The recommended way to install Marabou is via pip using the command

pip install maraboupy

which will install both the Marabou executable on your path and the Python bindings. The Python interface currently supports Python 3.8, 3.9, 3.10 and 3.11.

Building from source

Build dependencies

The build process uses CMake version 3.16 (or later). You can get CMake here.

Marabou depends on the Boost and OpenBLAS libraries, which are automatically downloaded and built when you run make. The CxxTest library comes included in the repository.

The current version of Marabou can be built for Linux or MacOS machines using CMake:

git clone https://github.com/NeuralNetworkVerification/Marabou/
cd Marabou/
mkdir build 
cd build
cmake ../
cmake --build ./

To enable a multi-process build, change the last command to:

cmake --build . -j PROC_NUM

After building Marabou, the compiled binary is located at build/Marabou, and the shared library for the Python API is located in maraboupy/. Building the Python interface requires pybind11 (which is automatically downloaded).

Export the maraboupy folder to your Python and Jupyter paths:

export PYTHONPATH="$PYTHONPATH:/path/to/marabou/folder"
export JUPYTER_PATH="$JUPYTER_PATH:/path/to/marabou/folder"

and the built maraboupy is ready to be used from a Python or a Jupyter script.

By default we build a 64-bit Marabou and, consequently, a 64-bit Python interface; the maraboupy/build_python_x86.sh script builds a 32-bit version.

Compile Marabou with the Gurobi optimizer (optional)

Marabou can be configured to use the Gurobi optimizer, which can replace the in-house LP solver and enable a few additional solving modes.

Gurobi requires a license (a free academic license is available). After obtaining one, the software can be downloaded here, and installation steps are described here; note that there is a compatibility issue that should be addressed. A quick installation reference:

export INSTALL_DIR=/opt
sudo tar xvfz gurobi9.5.1_linux64.tar.gz -C $INSTALL_DIR
cd $INSTALL_DIR/gurobi951/linux64/src/build
sudo make
sudo cp libgurobi_c++.a ../../lib/

Next, set the following environment variables (e.g., by adding the following lines to your .bashrc and invoking source ~/.bashrc):

export GUROBI_HOME="/opt/gurobi951/linux64"
export PATH="${PATH}:${GUROBI_HOME}/bin"
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:${GUROBI_HOME}/lib"
export GRB_LICENSE_FILE=/path/to/license.lic

After Gurobi is set up, follow the same build instructions as above, except run the following command instead of cmake ../:

cmake ../ -DENABLE_GUROBI=ON

Other CMake options

The cmake ../ command can take other options, for example:

  • Compile without the Python binding:
cmake ../ -DBUILD_PYTHON=OFF
  • Compile in debug mode (default is release):
cmake ../ -DCMAKE_BUILD_TYPE=Debug

Testing

To run tests we use ctest. The tests are labeled by level (unit/system/regress0/regress1/...) and by the code they exercise (engine, common, etc.).
For example, to run all unit tests, execute the following in the build directory:

ctest -L unit

On every build we run the unit tests, and on every pull request we run unit, system, regress0 and regress1.

Another option to build and run all of the tests is:

cd path/to/marabou/repo/folder
mkdir build 
cd build
cmake ..
make check -j PROC_NUM

Build Instructions for Windows using Visual Studio

We no longer provide Windows support. Instructions to build an old version of Marabou for Windows can be found here.

Running Marabou

Run from the Command line

The Marabou executable can be called directly from the command line. It takes as arguments the network to be verified and the property. Running Marabou --help prints a list of available options.

The repository contains sample networks and properties in the resources folder. To run Marabou on such an example, you can execute the following command from the repo directory:

Marabou resources/nnet/acasxu/ACASXU_experimental_v2a_2_7.nnet resources/properties/acas_property_3.txt

(If built from source, replace Marabou with the path to the built executable, i.e., build/Marabou.)
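The .txt property files use Marabou's simple textual format: one linear constraint per line over the input variables (x0, x1, ...) and output variables (y0, y1, ...). The snippet below is a hypothetical example in that style with made-up values, not the contents of acas_property_3.txt:

```text
x0 >= 0.6
x0 <= 0.68
y0 <= 3.99
```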

Run using Python

The maraboupy/examples folder contains a number of examples. Please also see our documentation for the Python interface, which contains examples, API documentation, and a developer's guide.

Advanced configuration

Choice of solver configurations

Currently the default configuration of Marabou is single-threaded; it uses the DeepPoly analysis for bound tightening and the DeepSoI procedure during the complete search. For optimal runtime performance, build Marabou with Gurobi enabled (see the Gurobi sub-section above), so that LPs are solved by Gurobi instead of the open-source native simplex engine.

You can also leverage parallelism by setting the num-workers option to N. This spawns N threads, each solving the original verification query using the single-threaded configuration with a different random seed. This is the preferred parallelization strategy for low levels of parallelism (e.g., N < 30). For example, to solve a query in this mode with 4 threads:

./resources/runMarabou.py resources/nnet/mnist/mnist10x10.nnet resources/properties/mnist/image3_target6_epsilon0.05.txt --num-workers=4

If you have access to a large number of threads, you could also consider the Split-and-Conquer mode (see below).

Using the Split and Conquer (SNC) mode

In the SNC mode, activated by --snc, Marabou decomposes the problem into 2^n0 sub-problems, where n0 is specified by --initial-divides=n0. Each sub-problem is solved with an initial timeout of t0, supplied by --initial-timeout=t0. Every sub-problem that times out is divided into 2^n additional sub-problems (--num-online-divides=n), and its timeout is multiplied by a factor of f (--timeout-factor=f). The number of parallel threads t is specified by --num-workers=t.

For example, to solve a problem in SNC mode with 4 initial divides, an initial timeout of 5s, 4 online divides on each timeout, and a timeout factor of 1.5:

build/Marabou resources/nnet/acasxu/ACASXU_experimental_v2a_2_7.nnet resources/properties/acas_property_3.txt --snc --initial-divides=4 --initial-timeout=5 --num-online-divides=4 --timeout-factor=1.5 --num-workers=4
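The splitting arithmetic behind these options can be sketched in a few lines of plain Python (illustrative only, not Marabou code):

```python
# Sketch of the Split-and-Conquer bookkeeping (illustrative, not Marabou
# code). With --initial-divides=n0, the query splits into 2**n0
# sub-problems, each with timeout t0 (--initial-timeout). A timed-out
# sub-problem splits into 2**n more (--num-online-divides), and its
# timeout grows by the factor f (--timeout-factor).

def initial_subproblems(n0):
    return 2 ** n0

def split_on_timeout(timeout, n, f):
    """Return (number of new sub-problems, their new timeout)."""
    return 2 ** n, timeout * f

subs = initial_subproblems(4)                      # --initial-divides=4
new_subs, new_timeout = split_on_timeout(5.0, n=4, f=1.5)
```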

A guide to Split and Conquer is available as a Jupyter Notebook in resources/SplitAndConquerGuide.ipynb.

Developing Marabou

Setting up your development environment

  1. Install pre-commit which manages the pre-commit hooks and use it to install the hooks, e.g.
pip install pre-commit
pre-commit install

This guarantees automatic formatting of your C++ code whenever you commit.

Tests

We have three types of tests:

  1. Unit tests - test small, specific components. They are located alongside the code in a tests folder (for example, src/engine/tests). To add a new set of tests, add a file named Test_FILENAME (where FILENAME is what you want to test) and register it in the corresponding CMakeLists.txt file (for example, src/engine/CMakeLists.txt).

  2. System tests - test an end-to-end use case while still having access to internal functionality. These tests are located in src/system_tests. To add a new set of tests, create a file named Test_FILENAME and add it to src/system_tests/CMakeLists.txt.

  3. Regression tests - test end-to-end functionality through the API. Each test is defined by:

  • network_file - a description of the neural network; nnet and mps formats are supported (the file extension determines the format)
  • property_file - optional; constraints on the input and output variables
  • expected_result - sat/unsat

    The regression tests are divided into levels 0-5 to allow variability in running time. To add a new regression test:
  • add the description of the network and the property to the resources sub-folder
  • add the test to regress/regressLEVEL/CMakeLists.txt (where LEVEL is between 0 and 5)

  4. Python tests - test the maraboupy package.

In each build we run the unit and system tests, and on every pull request we also run the Python tests and levels 0 and 1 of the regression tests. In the future we will run the other regression levels weekly / monthly.

Acknowledgments

The Marabou project acknowledges support from the Binational Science Foundation (BSF) (2017662, 2021769, 2020250), the Defense Advanced Research Projects Agency (DARPA) (FA8750-18-C-0099), the European Union (ERC, VeriDeL, 101112713), the Federal Aviation Administration (FAA), Ford Motor Company (Alliance award 199909), General Electric (GE) Global Research, Intel Corporation, International Business Machines (IBM), the Israel Science Foundation (ISF) (683/18, 3420/21), the National Science Foundation (NSF) (1814369, 2211505, DGE-1656518), the Semiconductor Research Corporation (SRC) (2019-AU-2898), Siemens Corporation, the Stanford Center for AI Safety, the Stanford CURIS program, and the Stanford Institute for Human-Centered Artificial Intelligence (HAI).

Marabou is used in a number of flagship projects at Stanford's AISafety center.

People

Authors and contributors to the Marabou project can be found in AUTHORS and THANKS files, respectively.


Issues

Build error "use of undeclared identifier 'CLOCK_MONOTONIC'" on mac

I am getting the following build error after running cmake --build . while following the "Build Instructions for Linux or MacOS" in the README.

build_error

I'm new to the codebase and haven't attempted any changes to the build system. I'd appreciate any suggestions on next steps or fixes!

Thank you!
Chris

libboost causing compile issues on mac

This is the error message I get when trying to make from the top level Marabou directory:

ld: library not found for -l:libboost_program_options.a
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [marabou.elf] Error 1
make[1]: *** [engine.all] Error 2
make: *** [src.all] Error 2

build problems on mac

Tried the new CMake build and got this error even though pybind11 was already installed in my conda env. Also, why are you trying to run wget? Not everyone's host is running Linux:

downloading pybind
/Users/Chelsea/Dropbox/AAHAA/src/Marabou/tools/download_pybind11.sh: line 10: wget: command not found
unzipping pybind
tar: Error opening archive: Failed to open 'pybind11_2_3_0.tar.gz'
CMake Error at CMakeLists.txt:201 (add_subdirectory):
  add_subdirectory given source
  "/Users/Chelsea/Dropbox/AAHAA/src/Marabou/tools/pybind11-2.3.0" which is
  not an existing directory.


CMake Error at CMakeLists.txt:202 (pybind11_add_module):
  Unknown CMake command "pybind11_add_module".


-- Configuring incomplete, errors occurred!
See also "/Users/Chelsea/Dropbox/AAHAA/src/Marabou/CMakeFiles/CMakeOutput.log".

Seg Fault for Simple Network

I pulled the main branch with latest commit: 54ae6c2
Using maraboupy
The network I am testing is just input = [[x]], W = [[1, 1, 1, 1]], output = [[x, x, x, x]]
I input a .pb and the equation list that is generated is:

>>> [e.addendList for e in network.equList]
[[(1.0, 0), (-1, 1)], [(1.0, 0), (-1, 2)], [(1.0, 0), (-1, 3)], [(1.0, 0), (-1, 4)]]
>>> [e.scalar for e in network.equList]
[0.0, 0.0, 0.0, 0.0]

So as you can see, very simple.

I set the input to be between 0 and 1 and test if the output is less than 4.

network.setLowerBound(0, 0.) 
network.setUpperBound(0, 1.)
for i in range(4):
	network.setUpperBound(output_vars[0][i], 4.)

After which:

>>> network.lowerBounds
{0: 0.0}
>>> network.upperBounds
{0: 1.0, 1: 4.0, 2: 4.0, 3: 4.0, 4: 4.0}

Which should return SAT. However, instead I get a segfault. The solver output is attached.

marabou_issue_2_14.txt

DnCWorker Unit Tests fails on Windows if tests are not collapsed into 1 function

The unit test for DnCWorker tests the response of the DnCWorker on different outcomes of solving a sub-problem. The outcomes could be SAT/UNSAT/TIMEOUT/ERROR and so on. If we test each case in a separate unit test function like in here, the unit tests pass on MacOS and Linux, but fail on Windows.
The unit tests only pass on windows if we collapse all the unit tests into one function like this.

Get CI working again

Travis is not running any more. Let's get travis or some other CI working again.

The result printing of DnC is not consistent with the single-solver printing

When running in DnC mode, the result is not printed in the same way as in single-solver mode.

Single solver mode

...
	--- Basis Factorization statistics ---
	Number of basis refactorizations: 1340
	--- Projected Steepest Edge Statistics ---
	Number of iterations: 72560.
	Number of resets to reference space: 754. Avg. iterations per reset: 96
	--- SBT ---
	Number of tightened bounds: 29115
UNSAT

DnC solver mode

...
Worker 1: Query 6-15 UNSAT, 3 tasks remaining
Worker 0: Query 6-16 UNSAT, 2 tasks remaining
Worker 3: Query 6-8 UNSAT, 1 tasks remaining
DnCManager::solve UNSAT query
Total Time: 19

transA and transB support for GEMM

ONNX's GEMM operation supports the optional parameters transA and transB, which respectively transpose the input and weight tensors. Unfortunately, maraboupy assumes that both of these parameters are false. This causes an assertion error, since if a tensor is transposed, the dimensions will be reversed:

assert shape1[-1] == shape2[0]

Since PyTorch converts nn.Linear modules into GEMM operations with transB=1, many exported PyTorch models cannot be opened by Marabou.
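The dimension issue is easy to see with plain shape arithmetic. This sketch (pure Python, independent of Marabou) shows why the naive check fails when transB is set, and how reversing the stored shape first fixes it:

```python
# Illustration of the GEMM shape check described above (pure Python).
# A GEMM computes A @ B (+ C); with transB=1 the weight matrix B is
# stored transposed, so its shape must be reversed before checking
# that the inner dimensions match.

def gemm_shapes_compatible(shape_a, shape_b, trans_b=False):
    if trans_b:
        shape_b = tuple(reversed(shape_b))  # undo the stored transpose
    return shape_a[-1] == shape_b[0]

# nn.Linear(in_features=3, out_features=2) exports a weight of shape
# (2, 3) with transB=1: the naive check fails, the aware check passes.
naive_ok = gemm_shapes_compatible((1, 3), (2, 3))                # False
aware_ok = gemm_shapes_compatible((1, 3), (2, 3), trans_b=True)  # True
```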

Install PythonAPI on make install

Add a make install target that will install the Python API.
For testing / development, add the build directory to the beginning of the PYTHONPATH.

Definition of Done: the Python shared object no longer needs to be copied into the maraboupy directory, and users and developers can use the Python API without copying the shared object.

Query cycles indefinitely

With the latest version of Marabou (including the small fix to the iterator in Engine.cpp, #54 ), Marabou enters a cycle after running for ~2 hours. The number of splits grows until 11461, and then stays fixed at 11461 splits. Turning on tableau logging shows that the pivots cycle between five variables and the values are not changing. This cycle continues for at least 2.5 hours after entering the cycle, and it appears to continue indefinitely.

I've attached a minimal python example to reproduce the behavior (I can convert to C++ if needed). The network used is also included. I turned the PREPROCESSOR_PL_CONSTRAINTS_ADD_AUX_EQUATIONS flag in GlobalConfiguration.cpp off because that still produces Malformed Basis errors. I've also attached a picture of the terminal output with tableau logging on to show the cycle.

Marabou_Cycle.zip

Example output values for ACAS XU networks

I'm importing the neural networks for ACAS XU to try to verify the same properties as in Appendix 6 of the Reluplex paper, and want to make sure that I'm getting conventions (e.g. row vs. column) correct.

I've followed the code in the main Reluplex repository https://github.com/guykatzz/ReluplexCav2017/blob/master/nnet/nnet.cpp but just want to make sure that I'm not making any silly errors. In particular, I'd like to make sure that I'm getting the logic around normalization done correctly --- it seems like neither the output nor input is to be normalized, but I just want to make sure --- and having values to work with would be extremely helpful.

A single set of input / output values fed through any of these networks should be enough for a sanity check - is this available somewhere?

Test Fail -- Bug in setting upper bound of auxiliary variables

Test Description:

model.nnet: 
	The model has two inputs (x0, x1), two outputs(y0, y1) and 1 hidden layer(RELU) of size 4 (ws_1_0, ws_1_1, ws_1_2, ws_1_3).  
	One of the valid assignments to these variables is (x0 = 2, x1 = 1.5) & (y0 = 1.5, y1 = 3.0) & (ws_1_0 = 1.5, ws_1_1 = 0.5,  ws_1_2 = 3, ws_1_3 = -6).  
	This can be corroborated by calling the solver on prop_1.txt and printing result of a forward pass on the network.  
	But the solver returns UNSAT on prop_2.txt. The only additional constraint in this file is ws_1_1 >= 0.5, which is consistent with the previous satisfying assignment.  
		
Command used to run: ./{PATH_TO_MARABOU_MASTER} bug-report/model.nnet bug-report/prop_1.txt

Potential Bug:

	src/engine/ReluConstraint.cpp:727 (in function addAuxiliaryEquations). 
             inputQuery.setUpperBound( _aux, -_lowerBounds[_b] );

Explanation:

Let me refer to the weighted sum variables (ws_1_0, ws_1_1, ws_1_2, ws_1_3) as (2, 3, 4, 5) and the corresponding RELU outputs be (6, 7, 8, 9).  
Let lb_2 be the lower bound of 2. 
The Preprocessor::preprocess method calls addPlAuxiliaryEquations. The purpose being to add the piecewise linear constraints (6 >= 2, 7 >= 3 ...) to the solver. Since we want to specify these constraints using equalities, we introduce auxiliary variables (12, 13, 14, 15) and add constraints like: 6 - 2 - 12 = 0.  
This is accompanied by setting lower and upper bounds for 12. The lower bound for 12 is indeed 0 since 6 >= 2. But the upper bound is set to be -(lb_2), which I suspect is not correct. 
6 - 2 - 12 = 0 and 6>=2 and 12 >=0 does not imply 12 <= -lb_2. Counter-example : (6 = 2, 12= 0.0, lb_2 = 0.5). 
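The bound reasoning in this report can be replayed with a few lines of arithmetic (illustrative Python, not Marabou code; variable names follow the report's notation):

```python
# Replaying the bound reasoning from the report (illustrative Python).
# With f = relu(b) encoded via f - b - aux = 0 (so aux = f - b, aux >= 0),
# the cited code sets ub(aux) = -lb(b). If lb(b) > 0, that upper bound
# is negative, which wrongly excludes the feasible point aux = 0.

def aux_upper_bound(lb_b):
    return -lb_b  # the bound assigned in the report's code snippet

lb_b = 0.5           # report's counter-example: b's lower bound is positive
b, f = 2.0, 2.0      # b >= 0, so relu(b) = b and the constraint gives aux = 0
aux = f - b

bad_ub = aux_upper_bound(lb_b)         # -0.5: negative upper bound
excluded = not (0.0 <= aux <= bad_ub)  # feasible aux = 0 is ruled out
```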

bug-report.zip

Confusing Comment

This comment in Equation.h:
"For now, all
equations are interpreted as equalities, i.e. the sum of all
addends equals the scalar "

is very confusing given the following code:

class Equation
{
public:
    enum EquationType {
        EQ = 0,
        GE = 1,
        LE = 2
    };
...

My understanding is that you can represent inequalities using the Equation type, just that in the marabou core, it will turn them into equalities and add slack variables. If my interpretation is correct, I would get rid of this misleading comment.
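The interpretation described above (inequalities turned into equalities by adding slack variables) is the standard LP transformation; a minimal sketch, not Marabou's actual internal code:

```python
# Sketch of turning an inequality into an equality with a slack variable
# (the standard LP transformation the comment alludes to; illustrative
# only).  "sum(coeff_i * x_i) <= scalar" becomes
# "sum(coeff_i * x_i) + s = scalar" with a fresh slack s >= 0.

def to_equality(addends, scalar, eq_type):
    """addends: list of (coeff, var) pairs; eq_type in {'EQ', 'LE', 'GE'}.
    Returns (addends including slack, scalar, slack sign constraint)."""
    if eq_type == "EQ":
        return addends, scalar, None
    coeff = 1.0 if eq_type == "LE" else -1.0
    # LE: lhs + s = scalar, s >= 0;  GE: lhs - s = scalar, s >= 0
    return addends + [(coeff, "s")], scalar, "s >= 0"

# x0 - x1 <= 0  becomes  x0 - x1 + s = 0 with s >= 0
eq, sc, cons = to_equality([(1.0, "x0"), (-1.0, "x1")], 0.0, "LE")
```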

Symbolic Bound Tightening using Python API

Hello,

I am using Marabou to test some neural networks for targeted adversarial robustness,
and I noticed that, when using the Python API, Marabou apparently does not use
Symbolic Bound Tightening. I also noticed that the tool appears to behave differently
when used on the same problem from the command line versus from the Python API:
in particular, when called from the API it does not print the statistics updates
when it should, and it always prints "Number of tightened bounds: 0".

Even so, the final solutions appear to be coherent when applied to the same problem.

Could it be a bug in the printing of the statistics update? Or am I doing something wrong?

The code to reproduce the problem and the transcription of the results (in command_line_test.txt and python_api_test.txt) is in the following .zip.

sbt_test.zip

Thank you for your time!

High degradation error when using convolutions

I am trying to run an example with convolutions in Marabou, but most of the times I run into high degradation error. I have tried to recreate the error on a smaller model.

Model:
Input(10 * 10 * 1) -> Convolution(Filter size(3 * 3 * 1 * 2)) -> Output(8 * 8 * 2) -> Reshape(128)

While running marabou on this small model, I run into very high degradation values, even immediately after preprocessing. I looked into this and I think the initial assignment of values to basic variables seems a bit off. I have attached output which contains the initial assigned value to each variable and some of these values go in the order of 10^10. The degradation value with the small number of variables is also 64,000.

I verified the network using forward pass and the difference in values from Marabou and tensorflow is negligible, so I think the equations are set up correctly. I also checked them out on a small example. I have attached the testing script, the example protobuf and the output which contains the initial assignment to all the variables.

git_issue.zip

Several places for defining the same configuration variables

Hey,

There are several places where configuration variables can be defined - GlobalConfiguration and Options - which invites non-uniform definitions and is bug-prone. For example, the default value of PREPROCESSOR_PL_CONSTRAINTS_ADD_AUX_EQUATIONS is defined as true in GlobalConfiguration but as false in Options.

Returning the SAT point which violate the setup conditions

While creating a demo for today's presentation, I came across a point where Marabou terminates with a SAT point, but the SAT point does not satisfy the constraints of the problem.

Problem setup:
We are working on MNIST images and trying to find adversarial examples for an image within a given delta. The input here is classified as 8. We are trying to find a minimal delta such that a perturbation < delta leads to the image being misclassified as 3.

MarabouUtils.addInequality(net, [net.outputVars[0][pred], net.outputVars[0][ae_target]], [1, -1], 0)
This inequality is the constraint that we are trying to solve. pred is the initial prediction(8 for us). ae_target is the adversarial label we are trying to get image misclassified as.

Our expectation is that if we get a SAT point, then the output value for pred <= ae_target. In the given example, while they are close, it is still the case that pred > ae_target. The values we get are:
[[-6.25324207 -7.91382846 -1.01699024 0.30267924 -3.63097583 -1.75347436
-7.64562634 -2.79832181 0.30273349 -2.61049726]].

I have attached the Python file and the example input. The example takes a while to run on the master branch, but runs in around 10 seconds using our PR with the smart-fix flag turned on.
files.zip

maraboupy "cannot import name 'MarabouCore'"

On MacOS, when trying to run Example Notebook.ipynb in the maraboupy examples:

from maraboupy import Marabou

ImportError Traceback (most recent call last)
in ()
----> 1 from maraboupy import Marabou

~/Documents/Stanford/Marabou/maraboupy/Marabou.py in ()
18
19 #Marabou File
---> 20 from .MarabouNetworkNNet import *
21 from .MarabouNetworkTF import *
22

~/Documents/Stanford/Marabou/maraboupy/MarabouNetworkNNet.py in ()
17 '''
18
---> 19 from .MarabouUtils import *
20 from maraboupy import MarabouNetwork
21 import numpy as np

~/Documents/Stanford/Marabou/maraboupy/MarabouUtils.py in ()
17 '''
18
---> 19 from maraboupy import MarabouCore
20
21 class Equation:

ImportError: cannot import name 'MarabouCore'

This is after running make so the .so file (MarabouCore.cpython-36m-darwin.so) exists. I'm not sure how to debug this further, so help is appreciated!

Marabou doesn't compile with gcc version 9

Gcc 9 introduced a new warning about copy-construct (-Wdeprecated-copy). And since we compile using -Werror flag, we get compilation errors due to the deprecated-copy warnings.

Unable to perform pivot when merging columns

@dan9lee @adampah @justinarose are performing some experiments on one of the acasxu tiny networks to find adversarial examples.

When trying to apply a split of the form x1=x2 , we want to merge the two columns. We try to pivot the variables to make them nonbasic first. In this case, they are getting ReluplexError( ReluplexError::ENGINE_APPLY_SPLIT_FAILED, "Could not find a variable to pivot with" ) when trying to pivot out one of the involved basic variables.

Perhaps we can fail more gracefully and proceed to the "general case" of splits if merging is not possible.

To reproduce:
place de_solve.py in the Marabou root dir, have maraboupy compiled and added to pythonpath, and run python3 de_solve.py.

de_solve.zip

Problem with extremely sparse networks

Hello,

I am using Marabou to test constraints on very sparse networks (i.e., in which many of the weights are 0 or near to 0, for example, 10^-8).

When I test some property which the network should verify (e.g., the corresponding adversarial example was found using a statistical method) Marabou instantly (in less than a second) gives that the problem is UNSAT.

May it be possible that Marabou is subject to some kind of problem in the building of the representation of the network?

In attachment, you can find a .zip in which I have set up some scripts
in order to reproduce the issue.

Thank you for your time!

marabou_test.zip

Create regression test framework

We want multiple levels of regressions that can be run. It should be easy to add new regressions to a level. CVC4 may be a good source of inspiration.

Python API cannot load MarabouCore

With the latest changes to Marabou, the Python API no longer runs correctly. After re-making MarabouCore shared object, the line "from maraboupy import Marabou" returns the error:

ImportError: dynamic module does not define module export function (PyInit_MarabouCore)

This error can be reproduced using any maraboupy example.

Problem with MarabouNetwork.loadQuery()

Hello,

I have tried to use MarabouNetwork.loadQuery() in order to load my
verification query but it gives me the following error:

~/Marabou-master/maraboupy/MarabouNetwork.py in loadQuery(self, filename, verbose, timeout)
220 """
221 #ipq = self.getMarabouQuery()
--> 222 ipq = MarabouCore.loadQuery(filename)
223 vals, stats = MarabouCore.solve(ipq, filename, timeout=0)
224 if verbose:

AttributeError: module 'maraboupy.MarabouCore' has no attribute 'loadQuery'

Do you have any suggestions?
Thank you!

Loading Pytorch Models

Hello,

I am trying to find a way to verify a set of models created and
learned using pytorch but I am finding some difficulties.

Converting the models to the ONNX format and then to tensorflow protobuffer
(using onnx-tf) seems to modify the model with unnecessary operations which
are currently not supported by Marabou (e.g., GatherV2).

My models are convolutional and fully connected networks which only use Conv2d, Linear and MaxPool2d layers and the torch reshape function in forward of the model.

Do you have any advice on how to solve this problem?

Thank you!

Let Marabou's preprocessor eliminate unused variables automatically

Hey,

Currently, when we declare a set of variables using the inputQuery.setNumberOfVariables method, Marabou's preprocessor expects each declared variable to have a valid bound.
For input variables, the bound is set by the user, as it should be, and for the other nodes in the network, Marabou can and will propagate the bounds to them.

However, for my purposes, I sometimes need to declare variables that end up unused (i.e., not connected to any other variable). In those cases, Marabou's preprocessor cannot automatically detect that these variables are unused, and it demands that a valid bound be set on each of them.

It would help if Marabou's preprocessor could automatically detect and remove those variables.

Thanks

MalformedBasis crashing

Running the acas_example on the 1_1_tiny4 network (query just asks for output var. 0 to be >= 0.5) reaches 9530 splits and 2029868 pivots before dying with a MalformedBasisException.

Clarify the semantics for Property

Currently, the property is actually a negation of the invariant we want to prove.

We should clarify the semantics we are following.
Moreover, we should refactor the code such that it directly corresponds to the semantics used by formal verification tools (like model checkers).

High degradation crashing on larger conv nets

After sparse matrix updates, the larger convolutional networks don't run out of memory anymore, but always run into degradation issues that they can't recover from.

To reproduce the issue:

  1. Clone Arvind's fork at https://github.com/arvindvs/Marabou/ and checkout branch "dev_marabou_ops_density." Do make in the maraboupy directory.
  2. Clone my fork of the marabou_tests repository here https://github.com/ibeling/marabou_tests/ and checkout branch "num_repro." (I invited you @guykatzz)
  3. Run the script low_latency_numerical_repro.py making sure to set PYTHONPATH to the directory you cloned into in (1).
  4. After two and a half minutes, observe that Marabou crashes because precision can't be restored.

The branch "dev_marabou_ops_density" is based on the "density" branch with these differences:

  1. the ADD_AUX_EQUATIONS flag is turned off
  2. Arvind implemented the additional TensorFlow operations needed to run the net in Marabou
  3. I had to make some minor changes to get it to compile

maraboupy generates query with infinite bounds

Two problems:

(1) If one creates a simple TF network

input -> identity -> output,

then Marabou can't solve the query output by maraboupy on this network. The reason is that maraboupy always creates separate output variables, but for some operations doesn't have to create any equations, since they would just correspond to permutations of the variables used for other operations. But in this case, there are no other operations, so you get a system of two "decoupled" parts input vars + output vars where the output vars have infinite bounds, and Marabou complains that variables have infinite bounds.

(2) In some cases, even where there is a link between input and output, one gets infinite bounds. This seems to be a preprocessor issue. Turning on the "magic flag" fixes the issues here. Whether or not this happens depends on the actual bounds for the input variables.

I don't have a file for (1), but attached is a file for (2). Run PYTHONPATH= python test_example_marabou.py. The first query (doing inference only, with Marabou) fails due to infinite bounds, while the robustness check runs.
marabou_tests-convolutional-tests.zip

There seem to be two ways to fix this:

  1. Support infinite bounds in Marabou. This would fix both (1) and (2).
  2. Always link outputs explicitly to inputs in maraboupy. This would fix only (1). Supplement with more complete preprocessing to fix (2).
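
Until one of those fixes lands, a stopgap is to clamp infinite bounds to large finite values before handing the query to Marabou. A minimal sketch (pure Python; the bounds dictionary is a hypothetical stand-in, not maraboupy's actual data structure):

```python
import math

def clamp_infinite_bounds(bounds, big=1e9):
    """Replace +/-inf bounds with large finite values.

    `bounds` maps a variable index to a (lower, upper) pair; this is a
    hypothetical representation, not maraboupy's actual API.
    """
    clamped = {}
    for var, (lo, hi) in bounds.items():
        lo = -big if math.isinf(lo) else lo
        hi = big if math.isinf(hi) else hi
        clamped[var] = (lo, hi)
    return clamped

# Output variables of the identity network start with infinite bounds:
bounds = {0: (-1.0, 1.0), 1: (-math.inf, math.inf)}
print(clamp_infinite_bounds(bounds))
# {0: (-1.0, 1.0), 1: (-1000000000.0, 1000000000.0)}
```

The choice of `big` matters: it must be wide enough not to cut off real behavior, but tight enough that the preprocessor can still derive useful bounds.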

MalformedBasisException crash

On 938a9f5, verifying tiny4 with the acas_example defaults (checking that output[0] >= 0.5) crashes after 147 splits with:

MalformedBasisException caught!
Error! Status[346] was BELOW_LB, but when recomputed got BETWEEN
Variable: x500, index: 346. Value: -0.218929949457020, range: [-179769313486231570814527423731704356798070567525844996598917476803157260780028538760589558632766878171540458953514382464234321326889464182768467546703537516986049910576551282076245490090389328944075868508455133942304583236903222948165808559332123348274797826204144723168738177180919299881250404026184124858368.000000000000000, 179769313486231570814527423731704356798070567525844996598917476803157260780028538760589558632766878171540458953514382464234321326889464182768467546703537516986049910576551282076245490090389328944075868508455133942304583236903222948165808559332123348274797826204144723168738177180919299881250404026184124858368.000000000000000]
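
The range in that message is [-DBL_MAX, DBL_MAX], i.e. the variable is effectively unbounded. A small sketch (pure Python, not Marabou code) illustrating how one might flag such bounds when digging through a log like this:

```python
import sys

# DBL_MAX, the largest finite IEEE-754 double; its full decimal
# expansion is the 309-digit number printed in the error above.
DBL_MAX = sys.float_info.max  # 1.7976931348623157e308

def is_effectively_unbounded(lower, upper, threshold=1e300):
    """Heuristic: treat bounds at or beyond +/-threshold as 'unbounded'."""
    return lower <= -threshold or upper >= threshold

# The range from the error message corresponds to [-DBL_MAX, DBL_MAX]:
print(is_effectively_unbounded(-DBL_MAX, DBL_MAX))  # True
```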

Changing equation ordering strongly affects time required to solve

@clazarus and I have been investigating the difference in runtime between the tensorflow and nnet versions of the same network, and concluded that it comes down to a difference in the ordering of the equations and variables.

To test this, we ran the same nnet query with the equations added to the input query in different orders, using random.shuffle to randomly permute the equations.

The resulting query runs in ~10 seconds for some orderings, and for others does not terminate at all (we waited ~15 minutes). When the query fails to terminate, the log shows "High degradation found" warnings.
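
The experiment can be scripted roughly as follows (a sketch; the string entries stand in for the MarabouUtils.Equation objects built by the nnet parser, which would be added to the InputQuery in each order before a timed solve):

```python
import random

def shuffled_orders(equations, seeds):
    """Return one permutation of `equations` per seed, for repeatable trials."""
    orders = []
    for seed in seeds:
        rng = random.Random(seed)  # fixed seed so each trial is reproducible
        order = list(equations)
        rng.shuffle(order)
        orders.append(order)
    return orders

# Placeholder equations; in the real experiment each entry would be an
# equation added to the input query in this order, followed by solve().
equations = [f"eq{i}" for i in range(10)]
for order in shuffled_orders(equations, seeds=range(3)):
    print(order)  # same equations each time, in a seed-dependent order
```

Using an explicit `random.Random(seed)` per trial (rather than the global `random.shuffle`) makes each fast/slow ordering easy to reproduce later.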

Testing framework for Maraboupy

We don't have a formal testing framework yet for maraboupy and the input parsers. I have a set of test networks I've been using to validate the tensorflow and onnx parsers, but incorporating these networks into automatic tests would tell us what parts of the code are still untested and ensure that future code changes don't introduce new bugs.

I can work on setting up the framework, but I'm not sure of the best way to do it. Does anyone have recommendations, or is someone else already working on this?
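
One common shape for such tests (a sketch with placeholder names, not an existing maraboupy suite): evaluate each test network through an independent reference forward pass and compare against the parser-built Marabou network's output.

```python
# Sketch of an automatic parser test. In the real suite, the expected
# values would come from Marabou.read_tf(path).evaluate(...) rather
# than from a literal.

def forward_reference(weights, biases, x):
    """Tiny reference forward pass for a fully-connected ReLU network."""
    for W, b in zip(weights, biases):
        x = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x

def test_identity_layer():
    # A 2x2 identity layer should reproduce its (non-negative) input.
    weights = [[[1.0, 0.0], [0.0, 1.0]]]
    biases = [[0.0, 0.0]]
    assert forward_reference(weights, biases, [0.5, 2.0]) == [0.5, 2.0]

test_identity_layer()
```

Under pytest, functions named `test_*` like the one above are collected automatically, so a directory of such files per parser would run in CI without extra wiring.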

Strange failures on macOS and gcc 7.3.0

  • I have a fix for this coming.

On macOS, and with gcc 7.3.0 (the default in the new Ubuntu released today), some unit tests for the Tableau fail non-deterministically.

The failures I see are:

  • test_store_and_restore: ./Test_Tableau.h:1084: Error: Expected (tableau->getValue( 2u ) != 1.0), found (1.0000), followed by the equation +1.00x4 +1.00x2 +3.00x0 +2.00x1 +2.00x3 = 225.00.
  • test_get_basis_equations: ./Test_Tableau.h:1400: Error: Expected (tableau->getLeavingVariable() == 5u), found (6 != 5).

Also, if you comment out line 263 of Engine.cpp so that the simplex loop always recomputes its cost function, Marabou crashes frequently on macOS, but not on Linux. I think this modification should not cause any problems.

I am running off the last totally working commit, cf85db8.

This behavior might be indicative of some unnoticed bug causing big issues when running on macOS.

Returns a SAT point outside of input bounds

While running Marabou, I've found cases where the result is SAT even though the input values lie outside the specified bounds. I have also found examples where expanding the input bounds changes the result from a valid SAT point to UNSAT.

I've attached the files necessary to reproduce the issue using the C++ AcasParser. Included are three different main files that show different behaviors when used in place of the acas_example/main.cpp. Running "main.cpp.outOfBounds" shows a SAT point that is outside of the input bounds. Running "main.cpp.foundSAT" shows that Marabou can find a valid SAT point within a certain radius of a test point, but when that radius is expanded in "main.cpp.UNSAT", Marabou returns UNSAT. The neural network used is also attached and needs to be placed in the acas_example folder.

Attached files:
MarabouIssue.zip
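
A sanity check of this kind could be run on every SAT assignment before reporting it. A sketch (pure Python; `assignment` and `input_bounds` are hypothetical stand-ins for what the engine reports, not Marabou's actual structures):

```python
def violates_input_bounds(assignment, input_bounds, eps=1e-6):
    """Return the input variables whose SAT value falls outside the
    declared [lower, upper] bounds, with a small numeric tolerance."""
    bad = []
    for var, (lo, hi) in input_bounds.items():
        v = assignment[var]
        if v < lo - eps or v > hi + eps:
            bad.append(var)
    return bad

# Example: variable 1's value 3.2 lies outside its bounds [0, 1].
assignment = {0: 0.25, 1: 3.2}
input_bounds = {0: (0.0, 1.0), 1: (0.0, 1.0)}
print(violates_input_bounds(assignment, input_bounds))  # [1]
```

An empty result means the assignment at least respects the input box; a non-empty one would catch the out-of-bounds SAT points described above before they reach the user.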

New flags do not work on macOS

These two recently added lines, meant to suppress warnings, produce the following error when compiling on macOS:

error: unknown warning option '-Wno-misleading-indentation'; did you mean '-Wno-missing-declarations'? [-Werror,-Wunknown-warning-option]
error: unknown warning option '-Wno-terminate'; did you mean '-Wno-format'? [-Werror,-Wunknown-warning-option]

I was unable to find a solution for this, and I am not sure whether the flags should be kept as-is or removed to keep cross-platform builds working out of the box. I removed them from my local copy of Rules.mk and everything works as before.

https://github.com/guykatzz/Marabou/blob/aa2b0becf93859605b5432642d376d41a2e6fe79/Rules.mk#L75-L76

Error reading from TensorFlow SavedModel .pb file

I tried exporting the tiniest possible SavedModel from TensorFlow:

import tensorflow as tf

sess = tf.Session()
node1 = tf.constant(1.0, tf.float32)
node2 = tf.constant(2.0, tf.float32)
node3 = tf.add(node1, node2)
print("sess.run(node3): ", sess.run(node3))

# Write out summary data to a file so it can be visualized with TensorBoard
logs_path = '/tmp/tensorflow_logs/tf-example-1/'
summary_write = tf.summary.FileWriter(logs_path,                                    
                                      graph=tf.get_default_graph())

export_dir = '/tmp/tensorflow_exports/tf-example-1/'
builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
with tf.Session(graph=tf.Graph()) as sess:
  builder.add_meta_graph_and_variables(sess,
                                       ["EXAMPLE1"] # required, but can be anything
  ) 
builder.save()

and tried importing the resulting SavedModel with maraboupy:

import numpy as np
from maraboupy import Marabou

proto_file_name = "/tmp/tensorflow_exports/tf-example-1/saved_model.pb"
net = Marabou.read_tf(proto_file_name)

Here's the result:

---------------------------------------------------------------------------
DecodeError                               Traceback (most recent call last)
<ipython-input-26-7bb497047e09> in <module>()
----> 1 net = Marabou.read_tf(proto_file_name)

~/repos/Marabou/maraboupy/Marabou.py in read_tf(filename, inputName, outputName)
     26         marabouNetworkTF: (MarabouNetworkTF) representing network
     27     """
---> 28     return MarabouNetworkTF(filename)

~/repos/Marabou/maraboupy/MarabouNetworkTF.py in __init__(self, filename, inputName, outputName)
     20         """
     21         super().__init__()
---> 22         self.readFromPb(filename, inputName, outputName)
     23 
     24     def clear(self):

~/repos/Marabou/maraboupy/MarabouNetworkTF.py in readFromPb(self, filename, inputName, outputName)
     45         with tf.gfile.GFile(filename, "rb") as f:
     46             graph_def = tf.GraphDef()
---> 47             graph_def.ParseFromString(f.read())
     48         with tf.Graph().as_default() as graph:
     49             tf.import_graph_def(graph_def, name="")

DecodeError: Error parsing message

It's kind of silly to even call this a "saved model", since it does nothing but add two constants; a better word for it would be "saved program". But we probably ought to be able to import it. Note that a SavedModel's saved_model.pb contains a SavedModel protobuf (which wraps one or more MetaGraphDefs), not a bare GraphDef, which is why graph_def.ParseFromString raises DecodeError: readFromPb currently expects a frozen-graph .pb.
