
blockscout-rs's Introduction

Blockscout

Blockchain Explorer for inspecting and analyzing EVM Chains.


Blockscout provides a comprehensive, easy-to-use interface for users to view, confirm, and inspect transactions on EVM (Ethereum Virtual Machine) blockchains. This includes Ethereum Mainnet, Ethereum Classic, Optimism, Gnosis Chain and many other Ethereum testnets, private networks, L2s and sidechains.

See our project documentation for detailed information and setup instructions.

For questions, comments, and feature requests, see the discussions section or reach out on Discord.

About Blockscout

Blockscout allows users to search transactions, view accounts and balances, verify and interact with smart contracts, and view and interact with applications on the Ethereum network as well as its many forks, sidechains, L2s, and testnets.

Blockscout is an open-source alternative to centralized, closed-source block explorers such as Etherscan, Etherchain, and others. As Ethereum sidechains and L2s continue to proliferate in both private and public settings, transparent, open-source tools are needed to analyze and validate all transactions.

Supported Projects

Blockscout currently supports several hundred chains and rollups throughout the greater blockchain ecosystem. Ethereum, Cosmos, Polkadot, Avalanche, Near and many others include Blockscout integrations. A comprehensive list is available here. If your project is not listed, please submit a PR or contact the team in Discord.

Getting Started

See the project documentation for setup and configuration instructions.

Acknowledgements

We would like to thank the EthPrize foundation for their funding support.

Contributing

See CONTRIBUTING.md for contribution and pull request protocol. We expect contributors to follow our code of conduct when submitting code or comments.

License

License: GPL v3.0

This project is licensed under the GNU General Public License v3.0. See the LICENSE file for details.


blockscout-rs's Issues

Update verification algorithm

The current verification approach assumes that only one metadata part is present in the bytecode being verified, and that it can be extracted as the last bytes of the deployed bytecode. That is not always true: for example, issue 5636 presents a case where the creation transaction input contains the creation code of another contract with its own metadata part. We cannot verify such contracts right now.

It would be nice to update the metadata extraction mechanism to support verification of bytecodes with multiple metadata parts.
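As a point of reference, a minimal sketch of the current single-part assumption (assuming the standard Solidity layout, where the last two bytes encode the length of the CBOR metadata that precedes them) could look like the function below; supporting multiple parts would mean applying this kind of split to embedded creation codes as well, not only to the tail of the deployed bytecode.

fn split_trailing_metadata(bytecode: &[u8]) -> Option<(&[u8], &[u8])> {
    // The last two bytes are a big-endian length of the CBOR-encoded metadata.
    let (rest, len_bytes) = bytecode.split_at(bytecode.len().checked_sub(2)?);
    let cbor_len = u16::from_be_bytes([len_bytes[0], len_bytes[1]]) as usize;
    let code_len = rest.len().checked_sub(cbor_len)?;
    // Returns (code without metadata, metadata part including the two length bytes).
    Some((&rest[..code_len], &bytecode[code_len..]))
}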

Make solc compilation async

Currently we use the Solc::compile function to compile Solidity contracts, but that function compiles contracts in a synchronous, blocking way. We should probably use Solc::async_compile instead.
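A hedged sketch of the switch, assuming the ethers-solc API with its async feature enabled (the exact signatures should be double-checked against the version we pin):

use ethers_solc::{CompilerInput, CompilerOutput, Solc};

async fn compile(
    solc: &Solc,
    input: &CompilerInput,
) -> Result<CompilerOutput, Box<dyn std::error::Error>> {
    // Previously: `solc.compile(input)?`, which blocks the executor thread
    // for the whole duration of the compilation.
    let output = solc.async_compile(input).await?;
    Ok(output)
}

Alternatively, the existing blocking call could be wrapped in tokio::task::spawn_blocking to avoid stalling the async executor.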

Fix standard json requests without optimizer

If we send the following standard JSON request to the smart-contract verifier, it fails with the error "content is not valid standard json: missing field 'optimizer' at line 10 column 3":

{
  "language": "Solidity",
  "sources": {
    "A.sol": {
      "content": "pragma solidity ^0.6.8; contract A {}"
    }
  },
  "settings": {
    "remappings": []
  }
}

The problem is the settings.optimizer field, which is not present in the input but is required by our current implementation. According to the Solidity documentation (https://docs.soliditylang.org/en/v0.8.15/using-the-compiler.html#input-description), that field should be optional. Thus, in general, we should accept such input and treat it the same way as if we had received the following (a serde-based sketch of the fix follows the example):

{
  "language": "Solidity",
  "sources": {
    "A.sol": {
      "content": "pragma solidity ^0.6.8; contract A {}"
    }
  },
  "settings": {
    "remappings": [],
    "optimizer": {}
  }
}
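A minimal sketch of how the settings type could tolerate the missing field via serde defaults (type and field names here are illustrative, not necessarily the verifier's actual ones):

use serde::Deserialize;

#[derive(Debug, Default, Deserialize)]
pub struct Optimizer {
    pub enabled: Option<bool>,
    pub runs: Option<u32>,
}

#[derive(Debug, Deserialize)]
pub struct Settings {
    #[serde(default)]
    pub remappings: Vec<String>,
    // A missing `optimizer` now deserializes to `Optimizer::default()`,
    // i.e. exactly the same as receiving `"optimizer": {}`.
    #[serde(default)]
    pub optimizer: Optimizer,
}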

Integrate smart-contracts verifier into blockscout start-up

We would like to build and run a local instance of the smart-contract verifier on Blockscout startup by default, so that users who run Blockscout locally have their own verifier service running.

We still have to research the best way to implement that. Currently we have two ideas: 1) update the docker-compose file that runs the main Blockscout instance; 2) update the Blockscout instance to run the service on start-up from Elixir code directly. We will have to discuss the advantages and disadvantages of each option.

Extract verified contracts from Blockscout instances

We need to extract all currently verified contracts from Blockscout instances and probably keep track of their updates. Regardless of how we obtain those contracts, we have to re-verify them via the eth-bytecode-db service, as it should parse the parts of the bytecodes obtained from the provided sources.

Token Contract Statistics Dashboard

Provide token contract statistics on the address details page:

  • Total token amount
  • Percentage of senders/receivers of token
  • Total transfer count
  • Unique token senders
  • Unique token receivers
  • Top token transfers by the amount
  • Top token receivers by the amount
  • Graph of token transfers by month
  • Graph of token receivers, senders, transfers by month

(screenshot attached to the original issue)

Add more descriptive errors and logs

In case of verification errors, the smart-contract-verifier returns "No contract could be verified with provided data" in almost all cases. We could probably make those errors more informative, at least for some cases (e.g. when the compiler_version in the metadata of the contract to verify does not correspond to the result of local compilation).

Also, it may be useful to analyze the current logs we get from the running service and see how we can improve them as well.

Verification via deployed bytecode with constructor arguments in Abi

Verification fails for deployed bytecode in case there is a constructor with arguments in the contract Abi.

The problem occurs because constructor arguments have always been extracted when verifying via creation transaction input. Deployed bytecode has no indication of constructor arguments, and when verification via deployed bytecode was added in #131, parsing of constructor arguments was not considered.

That may prevent contracts with constructor arguments from being verified via deployed bytecode, which is important for networks that do not index creation transaction inputs. E.g., issue #6275 presents a contract that would not have been verified even if the Kava network had used a verification microservice.

Update smart-contract-verifier interface

Epic: #200


We need to update the smart-contract verifier so that it is possible to obtain the verification-specific data required by the eth-bytecode-db service.

  • Add compiler settings to verification response
    #179
  • Add creation tx input and deployed bytecode parts (main/meta) of the source to verification response
    #205
  • Grpc server
    • Protobuf file and server initialization
      #222
    • Solidity service implementation
      #224
    • Vyper service implementation
      #225
    • Sourcify service implementation
      #226
  • Add verification via metadata

Searching for similar bytecodes

Implement searching for similar bytecodes. The algorithm is almost the same for creation transaction inputs and deployed bytecodes; the only difference is that deployed bytecodes should not check constructor arguments correspondence.

Update code structure

Could we extract the business logic from the layers that handle requests (e.g. the HTTP server) into a separate library crate? E.g. for the smart-contract verifier we could make the functions responsible for verification (e.g. multi_part::verify, standard_json::verify) independent from HTTP handlers (currently they accept web::Data and Json<VerificationRequest>). By extracting them into a separate lib we could easily integrate any source those requests come from (e.g. HTTP, gRPC, native calls from Elixir).
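A rough sketch of the split under discussion (the surrounding types and crate names are placeholders; only the shape matters): the library crate exposes a plain async function, and the HTTP handler shrinks to an adapter around it.

use actix_web::{web, HttpResponse};

// verifier-lib: pure verification logic, no actix-web types in the signature.
pub async fn verify(
    compilers: &Compilers,
    request: VerificationRequest,
) -> Result<VerificationSuccess, VerificationError> {
    // compilation and bytecode comparison live here
    todo!()
}

// verifier-http: a thin handler that only adapts transport types.
async fn verify_handler(
    compilers: web::Data<Compilers>,
    request: web::Json<VerificationRequest>,
) -> HttpResponse {
    match verifier_lib::verify(&compilers, request.into_inner()).await {
        Ok(success) => HttpResponse::Ok().json(success),
        Err(err) => HttpResponse::BadRequest().body(err.to_string()),
    }
}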

Extend contract verification errors

Currently, if the provided source contracts were successfully compiled but none of them corresponded to the actual creation_tx_input and deployed_bytecode, we return a verification error with the following message: "No contract could be verified with provided data".

We need to analyze what more specific errors we can return for different cases. For example, if any of the resulting errors is CompilerVersionMismatch, the compiler version provided by the user differs from the compiler version the on-chain contract was compiled with. In that case we may return "Invalid compiler version (expected ...)" instead of "No contract could be ...".
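One possible shape for the richer errors (variant names are hypothetical), using thiserror:

use thiserror::Error;

#[derive(Debug, Error)]
pub enum VerificationError {
    #[error("Invalid compiler version (expected {expected}, found {found})")]
    CompilerVersionMismatch { expected: String, found: String },
    #[error("Invalid constructor arguments")]
    InvalidConstructorArguments,
    #[error("No contract could be verified with provided data")]
    NoMatchingContract,
}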

Add checks for downloaded compilers

We may run --version on downloaded solc compilers to ensure that they work and return the correct compiler version. That adds some confidence that the downloaded compiler is correct and working.
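A minimal sketch of such a check, assuming we only want to confirm that the reported version string contains the version we requested:

use std::{path::Path, process::Command};

fn check_compiler(path: &Path, expected_version: &str) -> Result<(), String> {
    let output = Command::new(path)
        .arg("--version")
        .output()
        .map_err(|e| format!("failed to run compiler: {e}"))?;
    if !output.status.success() {
        return Err(format!("compiler exited with status {}", output.status));
    }
    // `solc --version` prints something like "Version: 0.8.15+commit.e14f2714...".
    let stdout = String::from_utf8_lossy(&output.stdout);
    if stdout.contains(expected_version) {
        Ok(())
    } else {
        Err(format!("unexpected --version output: {stdout}"))
    }
}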

Add `evm_version`s bruteforcing

If a contract could not be verified with the provided settings, the Elixir implementation brute-forced the last three evm_versions in the hope that the user just made a mistake in that field (https://github.com/blockscout/blockscout/blob/bd2aec623ac1b15d8b90015765a4ffb5a518962d/apps/explorer/lib/explorer/smart_contract/solidity/verifier.ex#L64).

It would probably be a good idea to implement the same mechanism in our implementation and analyze how it affects performance.
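A hedged Rust sketch of the same idea (all types and field names are placeholders; the real request and verify signatures would differ):

async fn verify_with_evm_fallback(
    request: VerificationRequest,
    fallback_versions: &[EvmVersion], // e.g. the last three released evm versions
) -> Result<VerificationSuccess, VerificationError> {
    let first_error = match verify(request.clone()).await {
        Ok(success) => return Ok(success),
        Err(err) => err,
    };
    for version in fallback_versions {
        let mut retry = request.clone();
        retry.evm_version = Some(version.clone());
        if let Ok(success) = verify(retry).await {
            return Ok(success);
        }
    }
    // Report the error from the user's original settings, not from the retries.
    Err(first_error)
}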

Smart Contract Method Calls List

Provide a table of method calls and counts for every smart contract. This table should be located under a Statistics tab on the address details page for a smart contract.

(screenshot attached to the original issue)

Signatures aggregation service

Currently there are several solutions that implement Ethereum signature databases and provide an API to search for functions and events by their selectors (https://sig.eth.samczsun.com/, https://www.4byte.directory/). Those databases may contain very similar but not identical information. We would like to implement a service that a) calls those databases to search for corresponding methods, aggregates the responses, and returns the aggregated data to the caller; and b) submits new ABIs to those databases. A sketch of the aggregation step follows the task list below.

  • Process functions
  • Process events
    #211
  • Update smart-contract verifier to send ABIs to sig-provider service
    #212
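An illustrative sketch of that aggregation (the trait and types are hypothetical; real sources would be thin HTTP clients for the databases listed above):

use std::collections::BTreeSet;

use async_trait::async_trait;

#[async_trait]
pub trait SignatureSource: Send + Sync {
    /// Returns candidate text signatures (e.g. "transfer(address,uint256)") for a selector.
    async fn lookup_function(&self, selector: [u8; 4]) -> Vec<String>;
}

pub async fn aggregate(sources: &[Box<dyn SignatureSource>], selector: [u8; 4]) -> Vec<String> {
    // Query every configured database concurrently and merge the results.
    let responses = futures::future::join_all(
        sources.iter().map(|source| source.lookup_function(selector)),
    )
    .await;
    // De-duplicate while keeping the first occurrence of each signature.
    let mut seen = BTreeSet::new();
    responses
        .into_iter()
        .flatten()
        .filter(|signature| seen.insert(signature.clone()))
        .collect()
}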

Contract/Tokens Transactions per month graph

Provide a graph displaying the number of transactions to a particular contract/token per month. This provides insight into whether a particular Dapp or token is increasing in popularity.

This graph should be displayed under a statistics tab on the address details page for a smart contract.

An example screenshot is attached to the original issue.

PANIC: thread 'main' panicked at 'couldn't initialize the app

dir: No such file or directory (os error 2)
thread 'main' panicked at 'couldn't initialize the app: fetching list json returned error: error sending request for url (https://raw.githubusercontent.com/blockscout/solc-bin/main/vyper.list.json): error trying to connect: unexpected EOF', /data/blockscout-rs/smart-contract-verifier-http/src/run.rs:21:14
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Smart Contract Statistics Dashboard

Provide a tab on the contract details page with statistics for that smart contract:

  • Smart contract calls amount
  • Unique address callers amount
  • Graph - Contract calls and unique callers counts by day (30 days)
  • Graph - Function calls distribution (call counts, smart contract callers)
  • Graph - Event counts (by events)
  • Table/Graph - Top Smart Contract Function Calls/ Unique Callers Count
  • Table - Smart Contract Events
  • Table - Top Contract Callers

(screenshots attached to the original issue)

Middleware trait for smart-contract-verifier

There are situations where we may want to do some additional work with verified contracts. Currently we have several examples of where that could be useful:

  1. #133 describes how we can update the smart-contract-verifier to send a request to a new service responsible for broadcasting method ABIs to third-party Ethereum signature database solutions (option 2)
  2. It is highly probable that we decide to import verified contracts into the Sourcify service as part of our possible collaboration

Besides that, there could be other extensions users may wish to add to result processing (e.g. inserting results into some custom database).
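One possible shape for such an extension point (names are hypothetical, using async_trait); every registered middleware would be invoked after a successful verification:

use async_trait::async_trait;

#[async_trait]
pub trait Middleware: Send + Sync {
    /// Called once for every successfully verified contract.
    async fn call(&self, contract: &VerifiedContract);
}

pub struct Verifier {
    middlewares: Vec<Box<dyn Middleware>>,
}

impl Verifier {
    async fn notify(&self, contract: &VerifiedContract) {
        for middleware in &self.middlewares {
            // E.g. broadcast ABIs to sig-provider, push sources to Sourcify,
            // or insert the result into a custom database.
            middleware.call(contract).await;
        }
    }
}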

Vyper fetch compilers

Currently the smart contract verification service written in Rust does not support Vyper verification. This issue represents one of the sub-tasks required to implement that functionality.

Research where to get Vyper compilers from and implement a fetcher backed by a download cache, as is currently done for Solidity compilers. As a starting point we can probably look at the Vyper downloader in the blockscout repository (https://github.com/blockscout/blockscout/blob/master/apps/explorer/lib/explorer/smart_contract/vyper_downloader.ex)

`DeployedBytecode` and `Bytecode` structures in `Verifier`

In #8 we discussed the possibility of making DeployedBytecode and Bytecode include all the fields that the actual deployed bytecode and bytecode have:

I suggest adding a struct, which will declaratively describe the bytecode format, i.e.

struct Bytecode {
    // bytecode starts with this
    compiler: Compiler,
    // 4 zeroes specifying the next value
    guard: Guard,
    // something something size of 16 bytes
    foo: Foo,
    // last 2 bytes are the length
    length: u16,
}
so you could just look at it and understand what format the bytecode is, instead of looking at the parse function and trying to guess it from there (like reading here that the last two bytes are the length, etc.),
and then implement parse on it (maybe DeployedBytecode already is this struct, but then I'd like to see all the values)

In this issue we may discuss those alternatives and decide whether we want to implement the structures that way or leave the layout that only allows effectively implementing the functions the structures expose.

sig-provider: look into variable length types in events

If an event argument has a variable length or is more than 32 bytes and is indexed, then instead of the value, the event topic will contain the keccak hash of that value. We need to look into this and determine the best way to decode (or not decode) such types.
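A quick illustration of that behavior, assuming the ethers-core and hex crates: for event E(string indexed s) emitted with "hello", the topic holds keccak256("hello"), so the original string cannot be recovered from the log alone.

use ethers_core::utils::keccak256;

fn main() {
    // The topic for an indexed dynamic argument is the hash of its value,
    // not the ABI-encoded value itself.
    let topic = keccak256("hello".as_bytes());
    println!("0x{}", hex::encode(topic));
}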

Add verification of contracts without `creation_tx_input`

Current Behaviour

Currently, only the creation_tx_input value is compared with the result of local compilation during the verification process. deployed_bytecode, which is also received as part of the request, is only checked to be a valid non-empty hex string and has no other impact on the verification process.

The reason behind that decision is the following: sometimes the bytecode deployed on the chain as a result of the creation transaction differs from the evm.deployedBytecode returned by the compiler. As a simple example we may consider a simple library:

pragma solidity 0.5.11;

library Foo {
    function foo() external pure returns (string memory) {
        return "foo";
    }
}

The evm.deployedBytecode obtained from the compiler is:

0x730000000000000000000000000000000000000000301460806040526004361060335760003560e01c8063c2985578146038575b600080fd5b603e60b6565b6040518080602001828103825283818151815260200191508051906020019080838360005b83811015607c5780820151818401526020810190506063565b50505050905090810190601f16801560a85780820380516001836020036101000a031916815260200191505b509250505060405180910390f35b60606040518060400160405280600381526020017f666f6f000000000000000000000000000000000000000000000000000000000081525090509056fea265627a7a723158208553df9ac3dbfc1efd85b95c497d79906e9cdc5e99be26820389f2cf06bd0ee664736f6c634300050b0032

But when deployed on the chain, the zeros at the beginning are replaced with the address the library has been deployed at:

0x73cd13606ccf9c73cecab6eed0b46a2d4b8aee781b301460806040526004361060335760003560e01c8063c2985578146038575b600080fd5b603e60b6565b6040518080602001828103825283818151815260200191508051906020019080838360005b83811015607c5780820151818401526020810190506063565b50505050905090810190601f16801560a85780820380516001836020036101000a031916815260200191505b509250505060405180910390f35b60606040518060400160405280600381526020017f666f6f000000000000000000000000000000000000000000000000000000000081525090509056fea265627a7a723158208553df9ac3dbfc1efd85b95c497d79906e9cdc5e99be26820389f2cf06bd0ee664736f6c634300050b0032

Comparing creation_tx_input with evm.bytecode allows us to overcome such issues and verify contracts whose deployed bytecode is modified during initialization process.

Problem

Problems start when a contract has no creation transaction. That may happen if, for example, the contract was deployed during chain initialization via the genesis file (#5675). In that case, there is nothing to compare the result of local compilation with, and thus verification fails (actually it fails even earlier, as we return 400 - BadInputError on an empty creation transaction input).

Solution

We may partially solve that problem by allowing clients not to specify creation_tx_input in the verification request. In such a case, we may try to verify the contract just by comparing the deployed bytecode with the local evm.deployedBytecode. If that succeeds, the contract should be considered verified. This approach does not allow us to verify contracts whose deployed bytecode is modified as a result of creation transaction execution, but at least we will be able to verify contracts whose deployed bytecode remains the same.
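A hedged sketch of the proposed fallback (types, field names, and helpers are placeholders):

fn compare(request: &VerificationRequest, compiled: &CompiledContract) -> Result<(), VerificationError> {
    match &request.creation_tx_input {
        // Existing path: compare against evm.bytecode (constructor-argument and
        // metadata handling omitted here; helper not shown).
        Some(creation_tx_input) => compare_creation_input(creation_tx_input, &compiled.creation_code),
        // New path: no creation input supplied, so compare the on-chain deployed
        // bytecode with the local evm.deployedBytecode. Contracts whose deployed
        // bytecode is modified during construction will still fail this check.
        None => {
            if request.deployed_bytecode == compiled.deployed_code {
                Ok(())
            } else {
                Err(VerificationError::NoMatchingContract)
            }
        }
    }
}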

Generate grpc+http service stub

We want to research an easy way to generate a gRPC and HTTP server along with OpenAPI documentation.
Right now we're looking at this pipeline:

  • .proto -> openapi (using protoc-gen-openapiv2)
  • .proto -> rust (using prost)
  • .proto -> grpc server (using tonic)
  • .proto -> http server (using a self-written crate)

Standard JSON handle

We need a handler that accepts standard JSON as input and verifies it against the bytecode from the blockchain.

Verification of Vyper contracts

Currently we support only verification of Solidity smart contracts, while Blockscout also supports verification of Vyper contracts.
This issue keeps track of the current status of Vyper contract verification in the smart-contract-verifier service.

  • Implement download of Vyper compilers
  • Compilation of Vyper contracts
  • Verification of Vyper contracts
  • Add HTTP endpoints to handle list_versions and verify requests

Add generic compiler fetcher

Add a generic compiler fetcher, which will get a single JSON file with a mapping from version to download link and download files as necessary.
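A minimal sketch of that idea (the list format, names, and error handling are assumptions): load the version-to-URL map once, then download binaries lazily into a cache directory.

use std::{collections::HashMap, path::PathBuf};

pub struct ListFetcher {
    releases: HashMap<String, String>, // compiler version -> download URL
    cache_dir: PathBuf,
}

impl ListFetcher {
    pub async fn new(list_url: &str, cache_dir: PathBuf) -> Result<Self, reqwest::Error> {
        // The list is assumed to be a single JSON object mapping versions to links.
        let releases: HashMap<String, String> = reqwest::get(list_url).await?.json().await?;
        Ok(Self { releases, cache_dir })
    }

    pub async fn fetch(&self, version: &str) -> Result<PathBuf, Box<dyn std::error::Error>> {
        let path = self.cache_dir.join(version);
        if !path.exists() {
            let url = self.releases.get(version).ok_or("unknown compiler version")?;
            let bytes = reqwest::get(url.as_str()).await?.bytes().await?;
            tokio::fs::write(&path, &bytes).await?;
            // A real implementation would also set executable permissions and
            // verify a checksum before using the binary.
        }
        Ok(path)
    }
}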

Add verification response format

We have currently come up with the following verification response structure:

Success

{
  "message": "OK",
  "result": {
    "ABI": "[{\n\"type\":\"event\",\n\"inputs\": [{\"name\":\"a\",\"type\":\"uint256\",\"indexed\":true},{\"name\":\"b\",\"type\":\"bytes32\",\"indexed\":false}],\n\"name\":\"Event\"\n}, {\n\"type\":\"event\",\n\"inputs\": [{\"name\":\"a\",\"type\":\"uint256\",\"indexed\":true},{\"name\":\"b\",\"type\":\"bytes32\",\"indexed\":false}],\n\"name\":\"Event2\"\n}, {\n\"type\":\"function\",\n\"inputs\": [{\"name\":\"a\",\"type\":\"uint256\"}],\n\"name\":\"foo\",\n\"outputs\": []\n}]\n",
    "CompilerVersion": "v0.2.1-2016-01-30-91a6b35",
    "ContractName": "Test",
    "ImplementationAddress": "0x000000000000000000000000000000000000000e",
    "IsProxy": "true",
    "OptimizationUsed": "1",
    "SourceCode": "pragma solidity >0.4.24;\n\ncontract Test {\nconstructor() public { b = hex\"12345678901234567890123456789012\"; }\nevent Event(uint indexed a, bytes32 b);\nevent Event2(uint indexed a, bytes32 b);\nfunction foo(uint a) public { emit Event(a, b); }\nbytes32 b;\n}\n"
  },
  "status": "0"
}

In case of success, a status of "0" is returned with all contract data as the result value. The returned data partially duplicates the data received with the request. That is done purposefully, as such redundancy simplifies the requester's logic by allowing it not to keep that data internally.

Verification Error

{
  "message": "Verification error: bytecode does not match compilation output",
  "result": nil,
  "status": "1"
}

If verification fails, a non-zero status is returned with the reason provided as the message value. result in that case is nil.

Notice that both the success and verification error responses have a 200 HTTP status code, and the status field is used as a discriminant. The field is a string in order to ease the introduction of new verification error statuses.

Other errors

The server may also return 400 (Bad Request) if deployedBytecode or creationTxInput is invalid (e.g. an invalid hex string, or the two have different metadata hash values). As that data is extracted from the chain by the requester, it should already be valid, and ensuring so is the requester's responsibility. Thus, these errors are treated differently from verification errors.

Version handles aren't updated

Version handles use Compilers when they should use XXXClient, or we should provide the compilers as app data.
Also, no tests caught this bug, so we should add some tests and check that the current implementation fails on them.
