firehose-antelope's Introduction

Firehose on Antelope

This is the Antelope chain-specific implementation built on firehose-core; it enables both Firehose and Substreams on Antelope chains with native blocks.

For Developers

To get started with Firehose and Substreams, sign up at https://app.pinax.network to get an API key. You'll also find quickstarts there, along with all of our available endpoints (we currently provide both Firehose and Substreams endpoints for EOS, WAX and Telos, as well as several testnets).

For connecting to Firehose endpoints, you'll need the Protobufs which are published on buf.build. Some Golang example code on how to set up a Firehose client can be found here.

To consume Antelope Substreams, please have a look at the documentation. You can also find Substreams to deploy in our Substreams repository here and on substreams.dev.

To develop Antelope Substreams, have a look at the documentation here and at the Pinax SDK for Antelope Substreams which can be found here.

A collection of resources around Substreams can also be found on Awesome Substreams.

Subgraphs

Although The Graph does not officially support any Antelope chains yet, it's possible to write Subgraphs based on Substreams and set up your own Graph node. You can find an example here on how to achieve this.

For Operators

Please have a look at the documentation here on how to set up your own Firehose & Substreams stack. Note that indexing larger Antelope chains such as EOS or WAX requires parallel processing of the chain and a lot of resources to have the indexing done in a reasonable time frame.

EOS EVM

This implementation provides native Antelope blocks, including all Antelope-specific block data. If you are looking to operate Firehose & Substreams for EOS EVM, please have a look at the firehose-ethereum repository; it provides a generic EVM poller to poll EVM blocks from an RPC node.

Support

In case of any questions around the Pinax endpoints or technology, feel free to reach out on our Discord server. For more general questions around Substreams, you might also find the StreamingFast Discord server useful.

License

Apache 2.0

firehose-antelope's People

Contributors

chamorin, deniscarriere, dependabot[bot], emiliocramer, fschoell, jubeless, maoueh, sduchesneau, xjonathanlei, yaroshkvorets

Forkers

ultraio

firehose-antelope's Issues

firehose-antelope really slow when replaying kylin and telos blocks

Start from a snapshot from a few days ago, then watch the amount of CPU used by Firehose and how slowly the chain syncs forward (about 3 minutes of blocks for every 10 seconds of wall time). Without Firehose, the chain syncs roughly 10x faster (30 minutes of blocks in 10 seconds). Are there any optimizations to be made?

Note: this is reader+merger running together.

Cleanup

  • remove old commented-out code
  • remove any dfuse-io references in logging etc.
  • remove any eosio references

Relayer channel at max capacity

Jul 05 15:24:29 eos-sfrl206 sf[555542]: 2023-07-05T15:24:29.023Z ERRO (dgrpc) finished streaming call with code Unknown {"grpc.start_time": "2023-07-05T15:22:42Z", "system": "grpc", "span.kind": "server", "grpc.service": "sf.bstream.v1.BlockStream", "grpc.method": "Blocks", "peer.address": "[2602:fd23:102:1:8888:8888:4:53]:54798", "error": "subscription channel at max capacity", "grpc.code": "Unknown", "grpc.time_ms": 106495.2421875}

firehose starts at block 0 instead of block 2

On startup, merger complains:

processing object list: expecting to start at block 0 but got block 2 (and we have no previous blockID to align with..). First streamable block is configured to be: 0

kylin merger stuck at block 0001246500

On Kylin testnet, LIB doesn't move between these 2 blocks and merger can't seem to merge blocks over that big a range:

0001255017-607c061c1f753301-86aed2e4e203c735-1246625-default.dbin.zst
...
0001362153-8f17feba9159dbfa-c8d01c276dd21554-1246625-default.dbin.zst

Last log entry from merger:

INFO (merger) merged and uploaded {"filename": "0001246500", "merge_time": "47.127411ms"}

Starting merger without `common-forked-blocks-store-url` generates error

Don't specify any storage locations; they should default to the data dir specified with -d (/var/lib/dfuse in our case). Instead, we get a weird error:

Error: unable to launch: failed to init destination archive store: parse "file://{sf-data-dir}/storage/forked-blocks": invalid character "{" in host name
2022-11-26T17:03:44.884Z ERRO (derr) acme-blockchain {"error": "unable to launch: failed to init destination archive store: parse \"file://{sf-data-dir}/storage/forked-blocks\": invalid character \"{\" in host name"}

Example:

$ /usr/bin/fireantelope -d /var/lib/dfuse -c /etc/dfuse/config.yml start

config.yml:

start:
  args:
  - reader-node-stdin
  - merger
  flags:
    verbose: 0
    reader-node-grpc-listen-addr: :9000
    common-first-streamable-block: 1

"invalid string value: data is not UTF-8 encoded"

There is an issue with block 167764311 on EOS.

grpcurl -H 'Authorization: xxx' -d '{"start_block_num": 167764311, "stop_block_num": 167764312}' eos.firehose.pinax.network:443 sf.firehose.v2.Stream.Blocks
ERROR:
  Code: Unknown
  Message: unexpected HTTP status code received from server: 204 (No Content)

Running the same range through Substreams yields a more descriptive error:

substreams run https://github.com/pinax-network/substreams/releases/download/common-v0.7.0/common-v0.7.0.spkg map_action_traces -e eos.substreams.pinax.network:443 -s 167764311 -t +1

Error: rpc error: code = InvalidArgument desc = step new irr: handler step new: execute modules: applying executor results "map_action_traces": execute: maps wasm call: block 167764311: module "map_action_traces": wasm execution failed deterministically: panic in the wasm: "called `Result::unwrap()` on an `Err` value: DecodeError { description: \"invalid string value: data is not UTF-8 encoded\", stack: [(\"Action\", \"json_data\"), (\"ActionTrace\", \"action\"), (\"TransactionTrace\", \"action_traces\"), (\"Block\", \"unfiltered_transaction_traces\")] }" at common/src/maps.rs:46:1

`bool` encoding

While decoding the db_op.new_data_json JSON, we ran into an issue where boolean values are encoded as 0/1 instead of false/true.

Example: atomicassets::setcoldata action updating collections table: https://eos.eosq.eosnation.io/tx/026601c512c54841184553eaa6d50daa70e927386ea4530dc18e51efee6e1035

ABI:

{
    "name": "collections_s",
    "base": "",
    "fields": [
        { "name": "collection_name", "type": "name" },
        { "name": "author", "type": "name" },
        { "name": "allow_notify", "type": "bool" },
        { "name": "authorized_accounts", "type": "name[]" },
        { "name": "notify_accounts", "type": "name[]" },
        { "name": "market_fee", "type": "float64" },
        { "name": "serialized_data", "type": "uint8[]" }
    ]
}

What we receive from Firehose (note allow_notify):

{
    "allow_notify": 1,
    "author": "steph",
    "authorized_accounts": ["steph", "atomicdropsx", "unpack.gems"],
    "collection_name": "eosmemewars1",
    "market_fee": "0.10000000000000001",
    "notify_accounts": [],
    "serialized_data": [4, 18, 77, 101, 109, <skip> , 34, 58, 34, 34, 125]
}

Any reason why it's not boolean?

waxtest block replay crash

Oct 23 03:21:46 waxtest-sfdm148 nodeos[518434]: info  2023-10-23T03:21:46.768 nodeos    producer_plugin.cpp:531       on_incoming_block    ] Received block c922f275582d56ea... #154706000 @ 2022-05-22T13:37:33.500 signed by liquidgaming [trxs: 0, lib: 154705668, confirmed: 0, net: 0, cpu: 100, elapsed: 77, time: 240, latency: 44804653268 ms]
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: panic: runtime error: index out of range [8] with length 7
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: goroutine 138 [running]:
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: github.com/eoscanada/eos-go.(*ABI).decode(0xc004b47950, 0xff?, {0xc000e5b680, 0xef})
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: #011/tmp/build.1697080930.936/build/pkg/mod/github.com/eoscanada/[email protected]/abidecoder.go:88 +0xaa6
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: github.com/eoscanada/eos-go.(*ABI).Decode(0xc00e181680?, 0x2a221fb?, {0xc000e5b680?, 0xc00dcfbf40?})
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: #011/tmp/build.1697080930.936/build/pkg/mod/github.com/eoscanada/[email protected]/abidecoder.go:71 +0x18
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: github.com/pinax-network/firehose-antelope/codec.(*ABIDecoder).decodeDbOp(0xc0007e6c30, 0xc00da99500, 0xffffffffffffffff, {0xc00e0780c0, 0x40}, 0x9389e76, 0x0?)
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: #011/tmp/build.1697080930.936/firehose-antelope/codec/abi_decoder.go:667 +0x969
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: github.com/pinax-network/firehose-antelope/codec.(*ABIDecoder).executeDecodingJob(0xc0007e6c30, {0x2773ae0?, 0xc00e06c660?})
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: #011/tmp/build.1697080930.936/firehose-antelope/codec/abi_decoder.go:461 +0x3a5
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: github.com/lytics/ordpool.(*OrderedPool).poolWorkerMain(0xc0007f0000, 0x7)
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: #011/tmp/build.1697080930.936/build/pkg/mod/github.com/lytics/[email protected]/ordered_pool.go:166 +0xda
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: created by github.com/lytics/ordpool.(*OrderedPool).Start in goroutine 41
Oct 23 03:21:46 waxtest-sfdm148 nodeos[518435]: #011/tmp/build.1697080930.936/build/pkg/mod/github.com/lytics/[email protected]/ordered_pool.go:57 +0x2d
Oct 23 03:21:47 waxtest-sfdm148 nodeos[518434]: DMLOG FPRINTF_FAILED failed written=0 remaining=6024 1 Broken pipe
Oct 23 03:21:47 waxtest-sfdm148 nodeos[518434]: DMLOG FPRINTF_FAILURE_TERMINATED
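The stack trace shows a single malformed ABI taking down the whole reader via an unrecovered panic in the decode worker. One defensive option, sketched here as an assumption about how the worker could be hardened (the decode closure stands in for eos-go's (*ABI).Decode):

```go
package main

import "fmt"

// safeDecode wraps an ABI decode call with recover() so a malformed ABI
// (like the index-out-of-range panic in the log above) degrades to an
// error on that single decode job instead of killing the reader process.
func safeDecode(decode func() ([]byte, error)) (out []byte, err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("abi decoding panicked: %v", r)
		}
	}()
	return decode()
}

func main() {
	_, err := safeDecode(func() ([]byte, error) {
		var s []int
		_ = s[8] // triggers an index-out-of-range panic, as in the crash log
		return nil, nil
	})
	fmt.Println(err)
}
```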

error message says acme

I saw this message:

Error: config file "/etc/dfuse/config.yml" not found. Did you 'fireacme init'?

The fact that the config file is missing is my problem, but the error shouldn't say fireacme.

Can't stream using negative start block

Question

When using --start-block with a negative value, streaming still only starts from the initial block, not from the "end of the current blocks" (near the chain head).

How do you stream only the latest blocks using a negative value?

I also tried using --stop-block, but it throws an error since negative values aren't allowed.

Bug ๐Ÿ› or feature ๐Ÿ”จ ?

CLI

-s, --start-block int Start block to stream from. Defaults to -1, which means the initialBlock of the first module you are streaming (default -1)
-t, --stop-block string Stop block to end stream at, inclusively. (default "0")

โŒ Can't stream using negative start block beyond initial block

modules:
  - name: map_transfers
    kind: map
    inputs:
      - source: sf.antelope.type.v2.Block
    output:
      type: proto:antelope.eosio.token.v1.TransferEvents

https://github.com/EOS-Nation/substreams-antelope/blob/690be1d05eb8c4438102369497cc1f014255846c/substreams/eosio.token/substreams.yaml#L22-L28

$ substreams run -e eos.firehose.eosnation.io:9001 https://eos.mypinata.cloud/ipfs/QmbttxBK9FbV8E8g8g8jp8rpYDvK8QzEwSx4bQmafngXpJ map_transfers -s -1000
Connected - Progress messages received: 0 (0/sec)
Backprocessing history up to requested target block 18446744073709550616:
(hit 'm' to switch mode)

Error: rpc error: code = InvalidArgument desc = validate request: negative start block -1000 is not accepted

โŒ Stop block must be positive integer

$ substreams run -e eos.firehose.eosnation.io:9001 https://eos.mypinata.cloud/ipfs/QmbttxBK9FbV8E8g8g8jp8rpYDvK8QzEwSx4bQmafngXpJ map_transfers -t -1000       
Error: stop block: end block is invalid: strconv.ParseUint: parsing "-1000": invalid syntax

reader crash on EOS block 144532479

2023-10-08T01:52:05.533Z ERRO (reader-node-stdin) reading from console logs {"error": "unable to marshal to binary form: string field contains invalid UTF-8"}

Rename acme-blockchain to antelope throughout the code

See error message:

2022-11-26T16:36:02.826Z ERRO (derr) acme-blockchain {"error": "unable to launch: unable to create app \"merger\": failed to create directory \"{sf-data-dir}/storage/forked-blocks\": mkdir {sf-data-dir}: permission denied"}

Testing after update to v1.1.1

Running latest substreams cli v1.1.1:

substreams run -e waxtest.firehose.eosnation.io:9001 https://github.com/pinax-network/substreams/releases/download/eosio.token-v0.9.0/eosio-token-v0.9.0.spkg map_transfers -s -1
Connected (trace ID 4dd1c5a09c1b5ede3a81a51d281422d5)
Progress messages received: 0 (0/sec)
Backprocessing history up to requested target block 18446744073709551615:
(hit 'm' to switch mode)
Error: rpc error: code = Internal desc = step new irr: handler step new: execute modules: running executor "map_transfers": execute module: execute: maps wasm call: input data for "sf.antelope.type.v1.Block": inputs module value not found

Different error if I run another module:

substreams run -e waxtest.firehose.eosnation.io:9001 https://github.com/pinax-network/substreams/releases/download/eosio.token-v0.9.0/eosio-token-v0.9.0.spkg kv_out -s -1
Connected (trace ID f0c7458ec0494164b746bfe74c550d1b)
Progress messages received: 0 (0/sec)
Backprocessing history up to requested target block 18446744073709551615:
(hit 'm' to switch mode)

Error: rpc error: code = InvalidArgument desc = validate request: input source "type:\"sf.antelope.type.v1.Block\"" not supported, only "sf.ethereum.type.v2.Block" and 'sf.substreams.v1.Clock' are valid

update `sf.antelope.type.v1` => `v2`

Using sf.antelope.type.v1.Block native source does not work with Substreams

However, the protobuf file is using sf.antelope.type.v1

  • change any protobuf definitions to v2
  • update the folder to v2
  • replace streamingfast references with pinax-network

https://github.com/pinax-network/firehose-antelope/blob/feature/decode_db_ops/proto/sf/antelope/type/v1/type.proto#L3-L5

Substreams

modules:
  - name: map_active_schedule
    kind: map
    inputs:
      - source: sf.antelope.type.v1.Block
    output:
      type: proto:sf.antelope.type.v2.ProducerAuthoritySchedule

Error โŒ

Error: rpc error: code = Internal desc = step new irr: handler step new: execute modules: running executor "map_active_schedule": execute module: execute: execute: maps wasm call: input data for "sf.antelope.type.v1.Block": inputs module value not found

Improve error message when streaming blocks that are too old

Problem

When streaming blocks that are before the common-first-streamable-block: 265580000, no explicit error message indicates the issue. Instead, the gRPC channel freezes for a while and eventually shuts down with Stream removed errors.


Do you think it's possible to detect from the request whether the block range or block number is too far back, and return a more explicit error message?
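The check suggested here can be done up front, before the stream is opened. A minimal sketch (function name, error wording, and the plain-error return instead of a gRPC status are all assumptions):

```go
package main

import "fmt"

// validateStartBlock rejects requests below the configured first
// streamable block immediately, instead of letting the gRPC channel
// freeze and eventually die with "Stream removed".
func validateStartBlock(requested, firstStreamable uint64) error {
	if requested < firstStreamable {
		return fmt.Errorf(
			"start block %d is below the first streamable block %d; this endpoint only serves blocks from %d onwards",
			requested, firstStreamable, firstStreamable)
	}
	return nil
}

func main() {
	// Mirrors the issue: requesting a block far before
	// common-first-streamable-block: 265580000.
	fmt.Println(validateStartBlock(100, 265580000))
	fmt.Println(validateStartBlock(265580001, 265580000))
}
```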

init is missing

It seems the init command is missing from firehose-antelope; it should interactively create the firehose.yaml file (and the default data directory structure).

merger doesn't start without setting start block

Nov 26 17:11:49 eos-sfdm64a nodeos[24163]: Fatal error in app merger:
Nov 26 17:11:49 eos-sfdm64a nodeos[24163]: expecting to start at block 0 but got block 2 (and we have no previous blockID to align with..). First streamable block is configured to be: 1

The genesis JSON should probably be stored in block 1? To discuss with the SF team.
