
eth2.0-dafny's Introduction


Overview

This project was started by ConsenSys R&D and was also supported by the Ethereum Foundation under grant FY20-285, Q4-2020.

Objectives

The objective of this project is to write a formal specification of Eth2.0 in the verification-aware programming language Dafny.

More specifically, our goals in this project are three-fold:

  1. Write a formal (unambiguous and functional) specification of Eth2.0. This specification is written with pre/post-conditions in the Hoare logic style.
  2. Write an implementation of each function to demonstrate that the specification can be implemented; in other words, that it is not inconsistent.
  3. Formally prove that our implementation satisfies the specification. The formal proof is provided in the form of mathematical proofs of lemmas written in Dafny.

To achieve this, we use the capabilities of the verification-aware programming language Dafny to write the specification, the implementation, and the proofs.

How to use the verified code?

Depending on your inclination you may use the verified code in different ways:

  • you can find function specifications (usually in .s.dfy files). They describe state changes as functions mapping a state (and other parameters, such as blocks) to a new state. These specifications can help you write your own client in your preferred language (including functional programming languages). We have provided an example implementation of each function, adapted from the Eth2.0 reference spec.

  • you may contribute new code and proofs, either by adding functions we have not implemented yet or by testing that new ideas and optimisations do not break the proofs;

  • you may use Dafny to generate target code (e.g. Go, Java, C#, JavaScript) and see how the automatically generated code can replace or complement your code base;

  • you may use the specifications to help unit test your own code. The specifications contain unambiguous pre- and post-conditions that clearly specify the effect of a function/method. They can be used to write property-driven tests.
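As an illustration, a post-condition from the specifications can be turned directly into a property-driven test. The sketch below is hypothetical Python (the function is adapted from the Eth2.0 reference spec; the checked properties are our reading of the Dafny post-conditions, not a verbatim copy):

```python
import random

# SLOTS_PER_EPOCH is the constant from the Eth2.0 spec.
SLOTS_PER_EPOCH = 32
MAX_UINT64 = 2**64 - 1

def compute_start_slot_at_epoch(epoch: int) -> int:
    """Implementation under test, adapted from the Eth2.0 reference spec."""
    return epoch * SLOTS_PER_EPOCH

# Property-driven test: for every input satisfying the pre-condition,
# the result must satisfy the post-conditions.
for _ in range(1000):
    epoch = random.randrange(0, MAX_UINT64 // SLOTS_PER_EPOCH + 1)
    slot = compute_start_slot_at_epoch(epoch)
    assert slot % SLOTS_PER_EPOCH == 0       # post: slot starts an epoch
    assert slot // SLOTS_PER_EPOCH == epoch  # post: inverse relation
    assert slot <= MAX_UINT64                # post: fits in a uint64
```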

Methodology

Dafny provides extensive support for automated reasoning, leveraging the power of state-of-the-art automated reasoning engines (SMT solvers). As a result, Dafny can assist in proving the lemmas that specify correctness. Moreover, as the lemmas are written as Dafny programs, they provide an unambiguous mathematical proof that the code is correct with respect to a specification. All the proofs can be mechanically verified using theorem provers.

Results

This repository contains two main branches:

  • master with a substantial part of the verified Eth2.0 specs. The verified Dafny code guarantees the absence of runtime errors like array-out-of-bounds accesses and division-by-zero. It also provides some functional correctness proofs captured by the invariants in the top-level ForkChoice.dfy file.
  • goal1 with some proofs related to justification and finalisation. This branch has diverged from master and may not be easily merged into it.

Statistics

Some statistics about the project can be found in

Some call graphs are also available (DOT-Graphviz format) alongside the corresponding Dafny files.

A bird's-eye view of the (more than 200) functions we have implemented is given by the top-level call graph.

Notes

An introduction (WIP) to the different components of Phase 0 is available in the Wiki section of this repo:

Here is a recent YouTube video with a presentation:

EEG Meet-up Dafny

Why we should formally verify the Eth2.0 specs

Background & context

The Eth2.0 specifications are subtle and sometimes complex. As a consequence, bugs, glitches or inconsistencies can creep into the specification and the implementation code.

Testing and code peer reviews can help keep the bug count low. However, testing can find some bugs but in general cannot guarantee the absence of bugs (Edsger W. Dijkstra).

These bugs remain undiscovered ... until they manifest, resulting in crashes. Worse, they can be exploited as security vulnerabilities. An example of a critical vulnerability is the OutOfBounds exception, where a non-existent index in an array is accessed. This is one of the most common sources of zero-day attacks, and it can occur even in heavily tested code bases, e.g. in the web browser Chromium.

You can read more about the specific case of the Beacon Chain in our Wiki section.

We have also put together a video series to introduce this project. The videos include discussions on the state transition, fork choice and process operations components of the project.

Related work

Runtime Verification Inc. have reported some work on:

Our work aims to complement the previous work by providing a thorough formal verification of the Eth2.0 phase 0 specifications.

Useful resources

How to check the proofs?

We have checked the proofs with Dafny 3.0.0 and Dafny 3.2.0.

The bash script verifyAll.sh can be used to verify the files in a given directory (e.g. using the Docker container, see below).

For example, checking the attestations package can be done as follows:

/home/user1/eth2.0-dafny $ time ./verifyAll.sh src/dafny/beacon/attestations
-------------------------------------------------------
Processing src/dafny/beacon/attestations/AttestationsHelpers.dfy with config /dafnyVerify:1 /compile:0  /noCheating:1
/home/user1/eth2.0-dafny/src/dafny/beacon/attestations/../../utils/SetHelpers.dfy(38,17): Warning: /!\ No terms found to trigger on.
/home/user1/eth2.0-dafny/src/dafny/beacon/attestations/../../utils/SetHelpers.dfy(60,22): Warning: /!\ No terms found to trigger on.
/home/user1/eth2.0-dafny/src/dafny/beacon/attestations/../gasper/GasperEBBs.dfy(91,16): Warning: /!\ No terms found to trigger on.
/home/user1/eth2.0-dafny/src/dafny/beacon/attestations/../gasper/GasperEBBs.dfy(159,16): Warning: /!\ No terms found to trigger on.

Dafny program verifier finished with 24 verified, 0 errors
No errors in src/dafny/beacon/attestations/AttestationsHelpers.dfy
-------------------------------------------------------
Processing src/dafny/beacon/attestations/AttestationsTypes.dfy with config /dafnyVerify:1 /compile:0  /noCheating:1
/home/user1/eth2.0-dafny/src/dafny/beacon/attestations/../../utils/SetHelpers.dfy(38,17): Warning: /!\ No terms found to trigger on.
/home/user1/eth2.0-dafny/src/dafny/beacon/attestations/../../utils/SetHelpers.dfy(60,22): Warning: /!\ No terms found to trigger on.

Dafny program verifier finished with 12 verified, 0 errors
No errors in src/dafny/beacon/attestations/AttestationsTypes.dfy
-------------------------------------------------------
[OK] src/dafny/beacon/attestations/AttestationsHelpers.dfy
[OK] src/dafny/beacon/attestations/AttestationsTypes.dfy
Summary: 2 files processed - No errors occured! Great job.
./verifyAll.sh src/dafny/beacon/attestations  29.27s user 0.54s system 102% cpu 29.138 total

Using a Docker container

Pre-requisites:

  1. a working installation of Docker,
  2. a fork or clone of this repository.

A simple way to check the proofs is to use a pre-configured installation of Dafny on a Docker container.

On a Unix-based system, cd to the root directory of your working copy of this repository.

/home/user1/ $ git clone git@github.com:PegaSysEng/eth2.0-dafny.git
/home/user1/ $ cd eth2.0-dafny
/home/user1/eth2.0-dafny $ 

The next commands will start a Docker container with Dafny pre-installed, and mount your local working directory as a volume in the Docker machine (this way you can access it from the running Docker machine):

/home/user1/eth2.0-dafny $ docker run -v /home/user1/eth2.0-dafny:/eth2.0-dafny -it franck44/linux-dafny /bin/bash
root@749938cb155d:/# cd eth2.0-dafny/
root@749938cb155d:/eth2.0-dafny# dafny /dafnyVerify:1 /compile:0 src/dafny/utils/MathHelpers.dfy 
Dafny 2.3.0.10506

Dafny program verifier finished with 13 verified, 0 errors
root@749938cb155d:/eth2.0-dafny# 

Install Dafny on your computer

Pre-requisites:

Assuming you have a running Dafny compiler, you may use the following command line to check a *.dfy file:

> dafny /dafnyVerify:1 /compile:0  /timeLimit:60 src/dafny/utils/MathHelpers.dfy
Dafny 2.3.0.10506

Dafny program verifier finished with 13 verified, 0 errors

The test folder contains some tests. The purpose of this directory is to demonstrate that we can extract an implementation and run it (indeed, once we have proved the code, there is no need to test it). To run the tests, you can issue the following command from the root directory (it collects all the files in the test folder, compiles them and runs them):

> ./scripts/runAllTests.sh

For an even better experience you may install VSCode and the Dafny plugin; see our Dafny wiki.

How to compile to C#, Go

To compile to Go:

dafny /compileTarget:go /spillTargetCode:1 src/dafny/ssz/BitListSeDes.dfy

C# can be targeted by changing compileTarget to cs.

eth2.0-dafny's People

Contributors

booleanfunction, ethers, franck44, saltiniroberto, unixpi


eth2.0-dafny's Issues

Move away from using wellTyped

We can access the type of an object x using x?.
This may enable us to simplify some definitions and functions using wellTyped.

Warning on pre-conditions equivalent to false

The function foo below has a pre-condition (Pre) that is logically equivalent to false.
As a consequence, any code in the body (Body) and any post-condition (Post) will satisfy the pre/post constraints (the function cannot be run for any input, so it is trivially correct).
Indeed, Dafny verifies foo, as the verification condition is: is Pre && Body && !Post satisfiable? The answer is no, since Pre == false.
There is no logical flaw or bug in the SMT solver or the Dafny verification engine, but it may be helpful to issue a warning when a pre-condition is equivalent to false.

function method foo(x: nat) : nat
    requires x < 2 && x > 2
    ensures foo(x) >= 10
{
    x + 1
}
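The vacuity can also be observed outside Dafny. The following minimal Python check (an illustration, not part of the repository) confirms that the pre-condition has no model over a sample of the naturals:

```python
# The pre-condition of `foo` above, x < 2 && x > 2, has no model:
# no natural number is both below and above 2, so `foo` can never be
# called and any body/post-condition is vacuously accepted.
def pre(x: int) -> bool:
    return x < 2 and x > 2

# sample exhaustive check over an initial segment of the naturals
assert not any(pre(x) for x in range(10_000))
```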

Reported to Dafny repo issue 620

Simplify SSZ

In view of the discussion we had with Protolambda, it seems that the SSZ module could be simplified.

We may focus on:

  1. provide serialise/deserialise functions for each type (without a super type such as Serialisable/RawSerialisable)
  2. prove the idempotence (round-trip property) for each type separately.

The assumption in using SSZ is that clients agree on a channel for each type and should communicate only data of this type.
This may help us to simplify the current type system, which may be over-complicated.
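A per-type serialise/deserialise pair without a common super type could look as follows. This is a hypothetical Python analogue for uint16 only, sketching the round-trip property from point 2:

```python
def serialise_uint16(n: int) -> bytes:
    # per-type serialiser: exactly 2 bytes, little-endian (SSZ convention)
    assert 0 <= n < 2**16
    return n.to_bytes(2, "little")

def deserialise_uint16(xs: bytes) -> int:
    # per-type deserialiser: only defined on exactly 2-byte inputs
    assert len(xs) == 2
    return int.from_bytes(xs, "little")

# the round-trip property, provable per type, with no common super type
for n in (0, 1, 255, 256, 2**16 - 1):
    assert deserialise_uint16(serialise_uint16(n)) == n
```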

Update README

Expected results and current state may be updated at that stage.

Investigate specification of `updateFinalisedCheckpoint`

The eth2.0 specs perform a sequence of if statements to update the finalised_checkpoint of a given state.
In the specification of updateFinalisedCheckpoint those statements are supposed to be performed in a given order (with priority).
This differs from the eth2.0 specs, where the conditions of the ifs are checked in sequence and a subsequent if's side-effect can override a previous one.

At the moment the code pertaining to the last update, all(bits[0..2]) && s.current_justified_checkpoint.epoch == current_epoch - 1, is ignored.

To do:

  1. investigate whether the ifs should be exclusive
  2. fix the spec (updateFinalisedCheckpoint) accordingly and the implementation (process_justification_and_finalization).

Bug in get_next_power_of_2

There is a bug in get_next_power_of_2 in the eth2.0 spec.
For 0 it returns 0.
Same bug exists for get_previous_power_of_2.
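For reference, the recursive definition below is our recollection of the spec's helper (treat it as an assumption, not a verbatim quote); it reproduces the reported behaviour:

```python
def get_next_power_of_two(x: int) -> int:
    # Recursive definition adapted from memory of the Eth2.0 spec helper;
    # treat as an assumption, not a verbatim quote.
    if x <= 2:
        return x
    return 2 * get_next_power_of_two((x + 1) // 2)

assert get_next_power_of_two(5) == 8   # expected behaviour
assert get_next_power_of_two(0) == 0   # the bug: 0 is not a power of 2
```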

Bug report filed issue 1695.

assertion might not hold error in deserialise function

With version v2.3.0 of the extension in VS Code, there is an error saying that the assertion may not hold for the Uint32 case.


The original code snippet:

/** Deserialise. 
     *  
     *  @param  xs  A sequence of bytes.
     *  @param  s   A target type for the deserialised object.
     *  @returns    Either a Success if `xs` could be deserialised
     *              in an object of type s or a Failure otherwise.
     *  
     *  @note       It would probably be good to return the suffix of `xs`
     *              that has not been used in the deserialisation as well.
     */
     function method deserialise(xs : seq<byte>, s : Tipe) : Try<Serialisable>
        requires !(s.Container_? || s.List_? || s.Vector_?)
        ensures match deserialise(xs, s) 
            case Success(r) => wellTyped(r)
            case Failure => true 
    {
        match s
            case Bool_ => if |xs| == 1 && 0 <= xs[0] <= 1 then
                                var r : Serialisable := Bool(byteToBool(xs));
                                Success(r)
                            else 
                                Failure
                            
            case Uint8_ => if |xs| == 1 then
                                var r : Serialisable := Uint8(uintDes(xs));
                                Success(r)
                             else 
                                Failure

            //  The following cases must check that the result is wellTyped.
            //  If wellTyped and RawSerialisable, the result is a Serialisable.
            case Uint16_ => if |xs| == 2 then
                                //  Verify wellTyped before casting to Serialisable
                                assert(wellTyped(Uint16(uintDes(xs))));
                                //  If wellTyped and RawSerialisable, result is a Serialisable
                                var r : Serialisable := Uint16(uintDes(xs));
                                Success(r)                               
                            else 
                                Failure
            
            case Uint32_ => if |xs| == 4 then
                                assert(wellTyped(Uint32(uintDes(xs))));
                                var r : Serialisable := Uint32(uintDes(xs));
                                Success(r)                               
                            else 
                                Failure

            case Uint64_ => if |xs| == 8 then
                                constAsPowersOfTwo();
                                assert(wellTyped(Uint64(uintDes(xs))));
                                var r : Serialisable := Uint64(uintDes(xs));
                                Success(r)  
                             else 
                                Failure

            case Uint128_ => if |xs| == 16 then
                                constAsPowersOfTwo();
                                assert(wellTyped(Uint128(uintDes(xs))));
                                var r : Serialisable := Uint128(uintDes(xs));
                                Success(r)  
                             else 
                                Failure

            case Uint256_ => if |xs| == 32 then
                                constAsPowersOfTwo();
                                assert(wellTyped(Uint256(uintDes(xs))));
                                var r : Serialisable := Uint256(uintDes(xs));
                                Success(r)                              
                            else 
                                Failure
                                
            case Bitlist_(limit) => if (|xs| >= 1 && xs[|xs| - 1] >= 1) then
                                        var desBl := fromBytesToBitList(xs);
                                        //  Check that the decoded bitlist can fit within limit.
                                        if |desBl| <= limit then
                                            var r : Serialisable := Bitlist(desBl,limit);
                                            Success(r)
                                        else
                                            Failure
                                    else
                                        Failure

            case Bitvector_(len) => if isValidBitVectorEncoding(xs, len) then            
                                        var r : Serialisable := Bitvector(fromBytesToBitVector(xs,len));
                                        Success(r)
                                    else
                                        Failure

            case Bytes_(len) => if 0 < |xs| == len then
                                  var r : Serialisable := Bytes(xs);
                                  Success(r)
                                else Failure
    }

The path to that line: /eth2.0-dafny/src/dafny/ssz/Serialise.dfy(201,39): Error: assertion might not hold

Is there any solution for this, or does it not matter? Many thanks in advance!

Avoid creating "computation" overflows when not necessary

This issue is reported as issue 2136 eth2.0-specs.

Avoid creating "computation" overflows when not necessary

Overview of the Issue

In some parts of the code of the Eth2.0 specs, computations are performed on bounded integers (e.g. uint64) and may result in underflows/overflows.
For instance, if a is a uint64, computing 3 * a may result in an overflow, as the result may not fit in 64 bits.

Overflows can contribute to reject blocks:

State transitions that cause a uint64 overflow or underflow are also considered invalid. [beacon chain transition function].

As a result a block can be rejected even if it is actually well-formed and not responsible for the overflows.
(A related question is whether the previous statement applies only to uint64 or any underflow/overflow.)

Avoiding spurious under/overflows

However, it is possible to rewrite some parts of the specs in a way that avoids creating spurious overflows.
Spurious overflows can appear because the computation of a property of a given value is implemented using unsafe operations,
even though the property could be checked without these unsafe operations.

Example: assume a and b are uint64.
The following code may result in an overflow when a >= MAX_UINT64 - 1:

if (b >= 2) 
  if (a + 2 <= b)  // possible overflow for a + 2 that may not be a `uint64`
  ....

This code snippet can be rewritten in a safer equivalent version:

if (b >= 2) 
  if (a <= b - 2)  // b - 2 >= 0 and does not underflow nor overflow.
  ....
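The equivalence of the two forms over mathematical integers can be checked mechanically. A small Python sketch (illustrative, with hypothetical function names):

```python
MAX_UINT64 = 2**64 - 1

def unsafe_check(a: int, b: int) -> bool:
    # direct transcription: in a uint64 setting, a + 2 can exceed MAX_UINT64
    return b >= 2 and a + 2 <= b

def safe_check(a: int, b: int) -> bool:
    # rewritten form: since b >= 2, b - 2 never underflows and no
    # intermediate value can exceed MAX_UINT64
    return b >= 2 and a <= b - 2

# over mathematical integers the two checks agree; the rewrite only
# removes the overflow-prone intermediate computation a + 2
for a in (0, 1, MAX_UINT64 - 2, MAX_UINT64 - 1, MAX_UINT64):
    for b in (0, 1, 2, 3, MAX_UINT64):
        assert unsafe_check(a, b) == safe_check(a, b)
```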

Occurrences of this issue in part of the Eth2.0 specs

The code for process_justification_and_finalization may be made safer avoiding spurious overflows.

The following piece of code may be rewritten in a safer non-overflow-prone version:

# Process finalizations
if get_current_epoch(state) <= GENESIS_EPOCH + 1:
       return

// after this location, get_current_epoch(state) > GENESIS_EPOCH + 1 >= 2 and we can use this fact to
// evaluate the following conditions without any overflows. 
bits = state.justification_bits
# The 2nd/3rd/4th most recent epochs are justified, the 2nd using the 4th as source
if all(bits[1:4]) and old_previous_justified_checkpoint.epoch + 3 == current_epoch:
     state.finalized_checkpoint = old_previous_justified_checkpoint
# The 2nd/3rd most recent epochs are justified, the 2nd using the 3rd as source
if all(bits[1:3]) and old_previous_justified_checkpoint.epoch + 2 == current_epoch:
       state.finalized_checkpoint = old_previous_justified_checkpoint
# The 1st/2nd/3rd most recent epochs are justified, the 1st using the 3rd as source
if all(bits[0:3]) and old_current_justified_checkpoint.epoch + 2 == current_epoch:
       state.finalized_checkpoint = old_current_justified_checkpoint
# The 1st/2nd most recent epochs are justified, the 1st using the 2nd as source
if all(bits[0:2]) and old_current_justified_checkpoint.epoch + 1 == current_epoch:
       state.finalized_checkpoint = old_current_justified_checkpoint

The last three if statements can be rewritten as old_previous_justified_checkpoint.epoch == current_epoch - k, k = 1, 2 (using a subtraction rather than an addition to avoid any overflow).
The first one, old_previous_justified_checkpoint.epoch + 3 == current_epoch, falls into two cases:

  • current_epoch == 2: in that case the condition cannot be true, as old_previous_justified_checkpoint.epoch is a uint64 and thus non-negative;
  • current_epoch > 2: in this case we can replace the test using a minus operator.

As a result it is equivalent to:

 if all(bits[1:4]) and current_epoch >= 3 && old_previous_justified_checkpoint.epoch == current_epoch - 3:
        state.finalized_checkpoint = old_previous_justified_checkpoint

Another occurrence of possible overflows is in the computation of supermajority in the same function:

if get_attesting_balance(state, matching_target_attestations) * 3 >= get_total_active_balance(state) * 2:
...

The values get_attesting_balance(state, matching_target_attestations) and get_total_active_balance(state) are uint64, and multiplying them by 3 and 2 may generate an overflow exception.
This issue can be mitigated by converting the values to uint128 and performing the comparison with this type.
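A sketch of the widened comparison, using Python's unbounded integers to stand in for uint128 (the function name is hypothetical):

```python
MAX_UINT64 = 2**64 - 1

def is_supermajority(attesting_balance: int, total_balance: int) -> bool:
    # hypothetical name; compares 3 * attesting >= 2 * total using
    # Python's unbounded integers as a stand-in for uint128
    assert 0 <= attesting_balance <= MAX_UINT64
    assert 0 <= total_balance <= MAX_UINT64
    return 3 * attesting_balance >= 2 * total_balance

# the widened products always fit in a uint128, so no overflow can occur
assert 3 * MAX_UINT64 < 2**128
```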

Formal verification that previous proposed patches are correct

The formal proof that the previous changes mitigate the overflow issues described above can be found in the Dafny verified version of process_justification_and_finalization.

Move to using block instead of blockHeaders

state_transition and on_block are currently using blockheaders instead of blocks.

Before adding process_operations, it is necessary to move to using blocks as blockheaders do not have the body of a block (that contains the operations).

Security vulnerabilities (possible overflows) in helper functions of Beacon Chain

Some overflows may happen in several helper functions of the Beacon Chain.

Possible overflow in assert statement in get_block_root_at_slot

The following code snippet contains an assert statement that may result in an overflow:

def get_block_root_at_slot(state: BeaconState, slot: Slot) -> Root:
    """
    Return the block root at a recent ``slot``.
    """
    assert slot < state.slot <= slot + SLOTS_PER_HISTORICAL_ROOT  
    return state.block_roots[slot % SLOTS_PER_HISTORICAL_ROOT]

note: the above assert is not a pre-condition to guarantee the absence of array-out-of-bounds error when dereferencing state.block_roots (this never happens because block_roots has size SLOTS_PER_HISTORICAL_ROOT) but a functional property of the system (requesting the block root for a slot that is too far in the past should not be permitted).

The pre-condition of the previous function should be strengthened to prevent slot + SLOTS_PER_HISTORICAL_ROOT from overflowing:

function method get_block_root_at_slot(state: BeaconState, slot: Slot) : Root
     requires slot as int + SLOTS_PER_HISTORICAL_ROOT as int < 0x10000000000000000 //  Max uint64 + 1
     requires slot < state.slot <= slot + SLOTS_PER_HISTORICAL_ROOT
{
        state.block_roots[slot % SLOTS_PER_HISTORICAL_ROOT]
}

Possible overflow in compute_start_slot_at_epoch

Because Epoch ranges over uint64, the multiplication can overflow in the following code snippet:

def compute_start_slot_at_epoch(epoch: Epoch) -> Slot:
    """
    Return the start slot of ``epoch``.
    """
    return Slot(epoch * SLOTS_PER_EPOCH)

The pre-condition should be strengthened to:

function method compute_start_slot_at_epoch(epoch: Epoch) : Slot
    requires epoch as int * SLOTS_PER_EPOCH as int < 0x10000000000000000    // max uint64 + 1
{
    epoch * SLOTS_PER_EPOCH as Slot
}

Other functions impacted by the previous strengthening of pre-conditions

As a result, functions calling compute_start_slot_at_epoch or get_block_root_at_slot should have their pre-conditions strengthened.
This includes get_block_root:

def get_block_root(state: BeaconState, epoch: Epoch) -> Root:
    """
    Return the block root at the start of a recent ``epoch``.
    """
    return get_block_root_at_slot(state, compute_start_slot_at_epoch(epoch))

the pre-condition of which should be:

function method get_block_root(state: BeaconState, epoch: Epoch) : Root    
     requires state.slot as int + SLOTS_PER_HISTORICAL_ROOT as int < 0x10000000000000000 // Max uint64 + 1
     requires epoch as int * SLOTS_PER_EPOCH as int < 0x10000000000000000   
     requires compute_start_slot_at_epoch(epoch) < state.slot <= compute_start_slot_at_epoch(epoch) + SLOTS_PER_HISTORICAL_ROOT  
{ 
        var e := compute_start_slot_at_epoch(epoch);
        get_block_root_at_slot(state, e)
}

Define constant for Max number of validators

In the Gasper helpers and proofs, change the MAX_VALIDATORS_PER_COMMITTEE to another (new) constant which is the size of the set of validators and not necessarily the MAX number of validators per committee.

Moreover, this can change over time, so at some point, it should be replaced by the current size of the set of validators.

Investigate: new info Auto-accumulator tail-recursive using Dafny 3

Using Dafny 3.0.0, a (new?) info message has shown up:
"Auto-accumulator tail-recursive".
This is not an error or a warning, but it may deserve some attention: how do we suppress this info from showing up, like the induction info?

An example is the tail-recursive function timeSeq in utils/Helpers.dfy:

 function method {:tailrecursion true} timeSeq<T>(t : T, k : nat) : seq<T> 
        ensures |timeSeq(t,k)| == k
        ensures forall i :: 0 <= i < k ==> timeSeq(t,k)[i] == t
        decreases k
    {
        if k == 0 then []
        else [t] + timeSeq(t, k - 1)
    }

Even with the {:tailrecursion true} attribute, the info appears using Dafny 3.

Definition of const in imported module breaks some proofs

Surprisingly, the definition of constants as

const SEQ_EMPTY_32_BYTES := timeSeq<Byte>(0,32)

or

const EMPTY_BYTES32 := Bytes32(SEQ_EMPTY_32_BYTES)

in Eth2Types.dfy seems to have an impact on seemingly unrelated datatypes in other files.
For instance, when they are defined as above, the proofs of the lemmas bitlistDecodeEncodeIsIdentity and bitlistEncodeDecodeIsIdentity are broken (the base case in the inductive proofs).

If I comment out the const and use

timeSeq(0 as Byte, 32) // for const SEQ_EMPTY_32_BYTES := timeSeq<Byte>(0,32)

and

Bytes32(timeSeq(0 as Byte, 32)) // for EMPTY_BYTES32

everything seems to be fine and the proofs can be checked by Dafny.
The first proof can be fixed by asserting that timeSeq(false, 8) is equal to FALSE_BYTE.
For the second I have no idea.

Anyhow, it seems very surprising that definitions of constants that are not even used in the module BitlistSeDes can break proofs.

For now I will revert to the use of timeSeq in Eth2Types.dfy but this may need further investigation.

Fix proof of bitvectorEncodeDecodeIsIdentity

Dafny does not seem to be able to verify the proof of lemma bitvectorEncodeDecodeIsIdentity.
The failing case is for |xb| == 1 and we may add a few hints in the calculations to fix that.

Enable per file verification configuration

Some files need custom verification options to be checked efficiently.
It would be nice if the batch processing (script verifyAll.sh) could run the verification of each file with a default/custom verification configuration.

SSZ spec: Merkleization - Typo in wording

In simple-serialize.md, within the Merkleization section it says:

We now define Merkleization hash_tree_root(value) of an object value recursively:

merkleize(pack(value)) if value is a basic object or a vector of basic objects.

This part should exclude bitvectors as they are dealt with separately.

A suggested wording to correct this typo could be:

merkleize(pack(value)) if value is a basic object or a vector of basic objects, excluding bitvectors.

Bug report filed, issue #1892

Conflicting definitions of BeaconBlock

See ethereum/consensus-specs#1788

It seems that at least two definitions of BeaconBlock co-exist:

class BeaconBlock(Container):
    slot: Slot
    proposer_index: ValidatorIndex
    parent_root: Root
    state_root: Root
    body: BeaconBlockBody

and another one

class BeaconBlock(Container):
    slot: Slot
    parent_root: Root
    state_root: Root
    body: BeaconBlockBody

Add licensing information to repo.

It seems that Apache is the preferred license at PegaSys for this kind of work.
After approval, we may add the license to the repo, and a header (copyright) to the files.

Re-organise folders/files.

Reduce size and complexity of files.

To do on the beacon folder first:

  • create files with types only, e.g. AttestationsTypes.dfy
  • create files with functions (if possible no lemmas), e.g Attestations.s.dfy
  • create files with proofs, e.g. Attestations.p.dfy
  • try to split functions and group them meaningfully (e.g. in Helpers, split epoch and other stuff)
  • create subfolders of beacon (e.g. epoch, generalProofs)

Add Serialisation for Containers and prove related (de)serialisation lemmas.

Commit a0a4a69 adds the Container constructor to Serialisable, but it does not add the related serialisation and deserialisation functions.
Also, in order to avoid breaking the verification of the existing serialisation functions and lemmas, it adds a requires clause to

  • serialise
  • deserialise
  • wellTypedDoesNotFail
  • seDesInvolutive
  • serialiseIsInjective

constraining the type of input parameter to be in the set {Bool_,Uint8_,Bitlist_,Bytes32_}

Make the specification of int_to_bytes and bytes_to_int more general

The following are more general specifications for the functions int_to_bytes and bytes_to_int than those currently in the Dafny specification:

    function int_to_bytes(n: uint64, length: uint64) : seq<uint8>
    requires n as nat < power(256, length as nat)
    ensures |int_to_bytes(n,length)| == length as int
    ensures forall i | 8 <= i < |int_to_bytes(n,length)| :: int_to_bytes(n,length)[i] == 0

    function bytes_to_int(s: seq<uint8>) : uint64
    requires forall i | 8 <= i < |s| :: s[i] == 0
    ensures bytes_to_int(s) as nat < power(256,8)
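A Python transliteration of these specifications (illustrative only; the pre/post-conditions become runtime assertions, and little-endian byte order is assumed as in SSZ):

```python
def int_to_bytes(n: int, length: int) -> bytes:
    assert n < 256 ** length            # pre-condition from the spec above
    return n.to_bytes(length, "little")

def bytes_to_int(s: bytes) -> int:
    assert all(b == 0 for b in s[8:])   # pre: bytes beyond index 8 are zero
    r = int.from_bytes(s, "little")
    assert r < 256 ** 8                 # post: the result fits in a uint64
    return r

# round trip for a uint64 serialised over more than 8 bytes
assert bytes_to_int(int_to_bytes(300, 16)) == 300
```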

Collect stats

Design scripts to collect statistics about Dafny files.

  • number of lines of code (started in count_lines.py)
  • call graph (call_graph.py)
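A simplified sketch of what a line-counting script like count_lines.py might do (this is not the repository's actual script):

```python
import os

def count_dfy_lines(root: str = ".") -> int:
    """Count the lines of every Dafny (.dfy) file under `root`.

    A simplified sketch of what a script like count_lines.py might do;
    it is not the repository's actual script.
    """
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".dfy"):
                with open(os.path.join(dirpath, name)) as fh:
                    total += sum(1 for _ in fh)
    return total
```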

Variable names clash in Boogie file

While moving to using const for empty blocks, genesis state etc. in the ForkChoice module, a Dafny/Boogie error showed up.
After some trimming of the code where the error originated, I managed to obtain a simple version exhibiting the same problem.
There is a workaround, and the problem has been reported to the dafny-lang developers as issue 904.

Re-organise repo

Archive current snapshot as a branch and clean up master branch.

Encode Vectors only for specific cases

As per the K-spec, vectors seem to be used in a small number of cases.

This is taken from the K-spec, types.k file:

    block_roots: Vector[Hash, SLOTS_PER_HISTORICAL_ROOT]
    state_roots: Vector[Hash, SLOTS_PER_HISTORICAL_ROOT]
    proof: Vector[Hash, DEPOSIT_CONTRACT_TREE_DEPTH + 1]  # Merkle path to deposit data list root
  //Converts K-Map representation of a python List/Vector to a K-ValueList. Only implemented for used types.

On top of that, it seems that Bytes32, which is a Vector of 32 uint8, is also used.

We may only consider the used types, as per the K-spec, in our verification of Phase 0.

on_block(b) and (store) progress

It seems that on_block(b) can be called many times with the same block, or with any block that is already in the store.
This results in the store not being modified, and is a potential source of a DoS attack.

This issue has been reported as issue 1891 in the Eth2.0 specs.
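One possible guard (a hypothetical Python sketch, not the reference implementation) is to make on_block an explicit no-op when the block root is already in the store; sha256 stands in for the spec's hash_tree_root:

```python
import hashlib

def block_root(block: bytes) -> bytes:
    # stand-in for hash_tree_root in the real spec
    return hashlib.sha256(block).digest()

def on_block(store: dict, block: bytes) -> bool:
    """Insert `block` into `store`; return False (no change) if already present."""
    root = block_root(block)
    if root in store:
        return False  # repeated calls leave the store unmodified
    store[root] = block
    return True
```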

Merge the two versions of fromBitlistToBytes

The two functions, in ssz
https://github.com/PegaSysEng/eth2.0-dafny/blob/7174b8d01b5c667af4a78cbf6a65c36e96ad2c32/src/dafny/ssz/BitListSeDes.dfy#L93
and in merkle
https://github.com/PegaSysEng/eth2.0-dafny/blob/7174b8d01b5c667af4a78cbf6a65c36e96ad2c32/src/dafny/merkle/Merkleise.dfy#L241

are similar.
The first one appends a trailing `1` bit to the argument list `l` before performing the same computation as the second one.
We may consider refactoring this code to use only one instance.
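The relationship between the two can be illustrated in Python (a sketch of SSZ-style bit packing, little-endian within each byte; the names are hypothetical): the bitlist variant is the plain packing applied after appending the sentinel `1` bit.

```python
def bits_to_bytes(bits):
    # pack bits little-endian within each byte; the last byte is zero-padded
    n = (len(bits) + 7) // 8
    out = bytearray(n)
    for i, b in enumerate(bits):
        if b:
            out[i // 8] |= 1 << (i % 8)
    return bytes(out)

def bitlist_to_bytes(bits):
    # append the sentinel 1 bit, then reuse the same packing
    return bits_to_bytes(bits + [1])
```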

Missing pre-conditions

The Eth2.0 specs omit key membership checks in some cases and explicitly state them (using asserts) at other times.
This seems to be a general inconsistency in the specs.
This inconsistency has been reported in issue 1930.
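As an illustration of the point, an explicit membership precondition (hypothetical Python sketch) turns an implicit KeyError into a stated assumption, which is what a Dafny `requires` clause would demand:

```python
def get_validator_balance(balances: dict, index: int) -> int:
    # explicit precondition, mirroring an `assert` in the specs;
    # without it, a missing key fails with an undocumented KeyError
    assert index in balances, "precondition: index must be a registered validator"
    return balances[index]
```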

Introduce IndexedAttestation

In the current version and for the sake of simplicity, we will assume that a ValidatorIndex is within the bounds
of the MAX_VALIDATOR_PER_COMMITTEE.

This is not the case in practice: a validator index is within REGISTRY_LIMIT, and a list of indices (the validator indices chosen for a given slot) encodes the set of selected validators.

As we do not deal with the random selection of validators, it is reasonable to make the following simplification:
a ValidatorIndex is within the MAX_VALIDATOR_PER_COMMITTEE range.

Later, we will have to introduce IndexedAttestation and map each item in the list of aggregation_bits to the corresponding attesting_indices.
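The mapping in question, as in the Eth2.0 reference spec's get_attesting_indices, selects the committee members whose aggregation bit is set (Python sketch; `committee` is the list of validator indices assigned to the slot):

```python
def attesting_indices(aggregation_bits, committee):
    """Return the validator indices whose corresponding aggregation bit is set."""
    assert len(aggregation_bits) == len(committee)
    return sorted(index for bit, index in zip(aggregation_bits, committee) if bit)
```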

Fuel setting on a function used in a constrained type declaration erroneously affects the witness verification

Bug report on the Dafny GitHub repo: dafny-lang/dafny#703

Copied here for reference


The following piece of code verifies correctly.

module M {
    predicate method p1(x:nat)
    {
        if x < 6 then 
            true
        else
            p1(x-1)
    }

    predicate method p2(x:nat)
    {
        if x < 6 then 
            true
        else
            p1(x-1)
    }    

    type T1 = x:nat | p1(x) witness 5
    type T2 = x:T1 | p2(x) witness 5

    method m(x:nat) returns (x':nat)
    {
        x' := 0;
    }
}

However, the verification of the following piece of code,

module M {
    predicate method p1(x:nat)
    {
        if x < 6 then 
            true
        else
            p1(x-1)
    }

    predicate method p2(x:nat)
    {
        if x < 6 then 
            true
        else
            p1(x-1)
    }    

    type T1 = x:nat | p1(x) witness 5
    type T2 = x:T1 | p2(x) witness 5

    method {:fuel p1,1,2} m(x:nat) returns (x':nat)
    {
        x' := 0;
    }
}

where the annotation {:fuel p1,1,2} has been added to method m, fails with error

test.dfy(19,35): Error: result of operation might violate subset type constraint for 'T1'
Execution trace:
    (0,0): anon0
    (0,0): anon4_Else

Dafny program verifier finished with 4 verified, 1 error

The same happens if the annotation {:fuel p1,1,2} is added to the function p1 itself, as follows.

module M {
    predicate method {:fuel p1,1,2} p1(x:nat)
    {
        if x < 6 then 
            true
        else
            p1(x-1)
    }

    predicate method p2(x:nat)
    {
        if x < 6 then 
            true
        else
            p1(x-1)
    }    

    type T1 = x:nat | p1(x) witness 5
    type T2 = x:T1 | p2(x) witness 5

    method m(x:nat) returns (x':nat)
    {
        x' := 0;
    }
}
