
polkascan-pre-harvester's Introduction

Polkascan Open-Source

Polkascan Open-Source Application

Quick deployment (Use hosted Polkascan API endpoints)

Step 1: Clone repository:

git clone https://github.com/polkascan/polkascan-os.git

Step 2: Change directory:

cd polkascan-os

Step 3: Check available releases:

git tag

Step 4: Check out the latest release:

git checkout v0.x.x

Step 5: Make sure to also clone submodules within the cloned directory:

git submodule update --init --recursive

Step 6: Build and start the docker containers:

docker-compose -p kusama -f docker-compose.kusama-quick.yml up --build

Use public Substrate RPC endpoints

Step 1: Clone repository:

git clone https://github.com/polkascan/polkascan-os.git

Step 2: Change directory:

cd polkascan-os

Step 3: Check available releases:

git tag

Step 4: Check out the latest release:

git checkout v0.x.x

Step 5: Make sure to also clone submodules within the cloned directory:

git submodule update --init --recursive

Step 6: During the first run, let MySQL initialize first (wait about a minute):

docker-compose -p kusama -f docker-compose.kusama-public.yml up -d mysql

Step 7: Then build and start the other docker containers:

docker-compose -p kusama -f docker-compose.kusama-public.yml up --build

Full deployment

The following steps will run a full Polkascan-stack that harvests blocks from a new local network.

Step 1: Clone repository:

git clone https://github.com/polkascan/polkascan-os.git

Step 2: Change directory:

cd polkascan-os

Step 3: Check available releases:

git tag

Step 4: Check out the latest release:

git checkout v0.x.x

Step 5: Make sure to also clone submodules within the cloned directory:

git submodule update --init --recursive

Step 6: During the first run, let MySQL initialize first (wait about a minute):

docker-compose -p kusama -f docker-compose.kusama-full.yml up -d mysql

Step 7: Then build and start the other docker containers:

docker-compose -p kusama -f docker-compose.kusama-full.yml up --build

Links to applications

Other networks

Add custom types for Substrate Node Template

Cleanup Docker

Use the following commands with caution to clean up your Docker environment.

Prune unused Docker data

docker system prune

Prune unused data, including all unused images

docker system prune -a

Prune volumes

docker volume prune

API specification

The Polkascan API implements the https://jsonapi.org/ specification. An overview of available endpoints can be found here: https://github.com/polkascan/polkascan-pre-explorer-api/blob/master/app/main.py#L60
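As a reminder of what JSON:API responses look like, the sketch below parses a document shaped as the specification requires: resources live under a top-level "data" key, each with "type", "id", and "attributes". The attribute names here are illustrative only, not taken from the actual Polkascan API:

```python
import json

# A hypothetical JSON:API document; the attribute names are invented
# for illustration and do not mirror Polkascan's real endpoints.
body = """
{
  "data": [
    {"type": "block", "id": "3899710",
     "attributes": {"hash": "0xabc", "count_extrinsics": 4}}
  ]
}
"""

doc = json.loads(body)
# Index resources by their "id" for convenient lookup.
blocks = {item["id"]: item["attributes"] for item in doc["data"]}
print(blocks["3899710"]["count_extrinsics"])  # → 4
```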

Troubleshooting

When certain blocks (or no blocks at all) are not being processed, the most likely cause is a missing or invalid type definition in the type registry.

To pinpoint which types are failing to decode, you can dive into Python:

import json
from scalecodec.type_registry import load_type_registry_file
from substrateinterface import SubstrateInterface

substrate = SubstrateInterface(
    url='ws://127.0.0.1:9944',
    type_registry_preset='substrate-node-template',
    type_registry=load_type_registry_file('harvester/app/type_registry/custom_types.json'),
)

block_hash = substrate.get_block_hash(block_id=3899710)

extrinsics = substrate.get_block_extrinsics(block_hash=block_hash)

print('Extrinsincs:', json.dumps([e.value for e in extrinsics], indent=4))

events = substrate.get_events(block_hash)

print("Events:", json.dumps([e.value for e in events], indent=4))

polkascan-pre-harvester's People

Contributors

arjanz, emielsebastiaan


polkascan-pre-harvester's Issues

NameError: name 'DemocracyVoteAudit' is not defined

Hi,

I'm running this harvester to process data in Kusama.

[2020-03-24 03:39:34,261: ERROR/ForkPoolWorker-8] Task app.tasks.accumulate_block_recursive[856497f7-efda-4434-a967-5601a25ac7a6] raised unexpected: HarvesterCouldNotAddBlock('0x652741b2ad907604ad5f47e3a6032d25e6baabe674ddd91547ba04cfdc5c709e',)
Traceback (most recent call last):
  File "/usr/src/app/app/tasks.py", line 112, in accumulate_block_recursive
    block = harvester.add_block(block_hash)
  File "/usr/src/app/app/processors/converters.py", line 564, in add_block
    extrinsic_processor.accumulation_hook(self.db_session)
  File "/usr/src/app/app/processors/extrinsic.py", line 78, in accumulation_hook
    vote_audit = DemocracyVoteAudit(
NameError: name 'DemocracyVoteAudit' is not defined

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 375, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 72, in __call__
    return super().__call__(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 632, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 139, in accumulate_block_recursive
    raise HarvesterCouldNotAddBlock(block_hash) from exc

It seems that the DemocracyVoteAudit definition is missing.

Here are my environment variables

PYTHONPATH=/usr/src/app
SUBSTRATE_RPC_URL=wss://kusama-rpc.polkadot.io/
SUBSTRATE_ADDRESS_TYPE=2    # Kusama
FINALIZATION_ONLY=1
SUBSTRATE_METADATA_VERSION=10
TYPE_REGISTRY=kusama

How can I solve the problem? Thanks

Block sync failure on a custom blockchain

We are trying to deploy the Polkascan open-source version. However, we encountered some problems after connecting to our node.

Traceback (most recent call last):
  File "/usr/src/app/app/tasks.py", line 115, in accumulate_block_recursive
    block = harvester.add_block(block_hash)
  File "/usr/src/app/app/processors/converters.py", line 469, in add_block
    self.process_metadata(parent_spec_version, parent_hash)
  File "/usr/src/app/app/processors/converters.py", line 210, in process_metadata
    runtime = Runtime.query(self.db_session).get(spec_version)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 959, in get
    return self._get_impl(ident, loading.load_on_pk_identity)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 1069, in _get_impl
    return db_load_fn(self, primary_key_identity)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/loading.py", line 282, in load_on_pk_identity
    return q.one()
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3292, in one
    ret = self.one_or_none()
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3261, in one_or_none
    ret = list(self)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3334, in __iter__
    return self._execute_and_instances(context)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3356, in _execute_and_instances
    querycontext, self._connection_from_session, close_with_result=True
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3371, in _get_bind_args
    mapper=self._bind_mapper(), clause=querycontext.statement, **kw
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3349, in _connection_from_session
    conn = self.session.connection(**kw)
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1124, in connection
    execution_options=execution_options,
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1130, in _connection_for_bind
    engine, execution_options
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 408, in _connection_for_bind
    self._assert_active()
  File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 299, in _assert_active
    "This session is in 'inactive' state, due to the "
sqlalchemy.exc.InvalidRequestError: This session is in 'inactive' state, due to the SQL transaction being rolled back; no further SQL can be emitted within this transaction.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 375, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 70, in __call__
    return super().__call__(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 632, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 142, in accumulate_block_recursive
    raise HarvesterCouldNotAddBlock(block_hash) from exc
app.processors.converters.HarvesterCouldNotAddBlock: 0xbf62f20192a6d35c59a846d10831ee06275fdcccb179b279832405a804fa5500

If we check the Flower console, not every app.tasks.accumulate_block_recursive task fails.

It looks related to the type registry. We found types for Kusama, Polkadot, and substrate-node-template; however, our chain is based on substrate/bin/node, which is not that similar to any of the above. We didn't find any docs on generating the type definitions.

So far we have tried:

  • Sync to kusama: works
  • Sync to our blockchain with kusama config: shows above error
  • Sync to our blockchain with node-template config: shows above error
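For reference, the harvester's type registry files (harvester/app/type_registry/) follow py-scale-codec's JSON format. A minimal custom file might look like the fragment below; the type names are placeholders, not real chain types:

```json
{
  "types": {
    "MyCustomId": "[u8; 32]",
    "MyStruct": {
      "type": "struct",
      "type_mapping": [
        ["account", "AccountId"],
        ["amount", "Balance"]
      ]
    }
  }
}
```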

New Account is not getting added & issue in updating existing account balance

I have made some changes to the blockchain so that instead of storing AccountStore in the System module, I store it in a custom module; I implemented all the required traits in that pallet, and everything works fine. But to use Polkascan with this customised blockchain, I updated the Polkascan code to replace the System module for balances with the custom module. The data_account_info_snapshot table is being updated correctly, but new account entries are not being created in data_account. Any suggestion or help would be highly appreciated.

To summarise: I have a custom balance pallet which binds balances to Did ([u8; 32]) instead of AccountId, and the AccountStore for this custom balance pallet is another custom pallet instead of the System module. Account balances are not being updated for genesis accounts, and new (endowed) accounts are not being created.

Running start.sh migrates don't work

Running the start shell script gives an error saying app is not a module. I have run pip install, but the migrations still fail due to naming within the folder.

Kusama: websockets.exceptions.PayloadTooBig

I got this error when syncing Kusama data

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/websockets/protocol.py", line 827, in transfer_data
    message = await self.read_message()
  File "/usr/local/lib/python3.6/site-packages/websockets/protocol.py", line 950, in read_message
    frame = await self.read_data_frame(max_size=max_size)
  File "/usr/local/lib/python3.6/site-packages/websockets/protocol.py", line 971, in read_data_frame
    frame = await self.read_frame(max_size)
  File "/usr/local/lib/python3.6/site-packages/websockets/protocol.py", line 1051, in read_frame
    extensions=self.extensions,
  File "/usr/local/lib/python3.6/site-packages/websockets/framing.py", line 127, in read
    f"payload length exceeds size limit ({length} > {max_size} bytes)"
websockets.exceptions.PayloadTooBig: payload length exceeds size limit (65535 > 16 bytes)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/src/app/app/tasks.py", line 114, in accumulate_block_recursive
    block = harvester.add_block(block_hash)
  File "/usr/src/app/app/processors/converters.py", line 366, in add_block
    json_block = self.substrate.get_chain_block(block_hash)
  File "/usr/local/lib/python3.6/site-packages/substrateinterface/__init__.py", line 211, in get_chain_block
    response = self.rpc_request("chain_getBlock", [block_hash]).get('result')
  File "/usr/local/lib/python3.6/site-packages/substrateinterface/__init__.py", line 134, in rpc_request
    asyncio.get_event_loop().run_until_complete(self.ws_request(payload))
  File "/usr/local/lib/python3.6/asyncio/base_events.py", line 467, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.6/site-packages/substrateinterface/__init__.py", line 108, in ws_request
    self._ws_result = json.loads(await websocket.recv())
  File "/usr/local/lib/python3.6/site-packages/websockets/protocol.py", line 509, in recv
    await self.ensure_open()
  File "/usr/local/lib/python3.6/site-packages/websockets/protocol.py", line 812, in ensure_open
    raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedError: code = 1006 (connection closed abnormally [internal]), no reason

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 375, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 74, in __call__
    return super().__call__(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/celery/app/trace.py", line 632, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 141, in accumulate_block_recursive
    raise HarvesterCouldNotAddBlock(block_hash) from exc
app.processors.converters.HarvesterCouldNotAddBlock: 0x2e1c614587197fded1d5d79c7bc712189f7a74f62d6782c07127604301536fb5

I can simply reproduce it by

>>> from substrateinterface import SubstrateInterface
>>> substrate = SubstrateInterface('wss://kusama-rpc.polkadot.io', type_registry_preset='kusama')
>>> block = substrate.get_chain_block('0x2e1c614587197fded1d5d79c7bc712189f7a74f62d6782c07127604301536fb5')

The block 0x2e1c614587197fded1d5d79c7bc712189f7a74f62d6782c07127604301536fb5 has a really big extrinsic (4th).

I think we should enlarge max_size when using the websocket client:
https://github.com/polkascan/py-substrate-interface/blob/1d15aafb581c606b717517bdab14bfd8648d1888/substrateinterface/__init__.py#L104
https://websockets.readthedocs.io/en/stable/api.html#module-websockets.client

Just wondering if anyone else has run into this problem as well?
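The workaround above can be sketched as follows. This assumes the third-party websockets package; max_size is a documented parameter of websockets.connect (the default limit is 1 MiB), while get_chain_block and MAX_PAYLOAD are names invented here, not the harvester's API:

```python
import json

# 64 MiB instead of the websockets library's 1 MiB default frame limit;
# the size is an arbitrary choice, pick one large enough for your chain.
MAX_PAYLOAD = 2 ** 26

def make_rpc_payload(method: str, params: list, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request string, e.g. for chain_getBlock."""
    return json.dumps({"jsonrpc": "2.0", "id": request_id,
                       "method": method, "params": params})

async def get_chain_block(url: str, block_hash: str) -> dict:
    # Third-party dependency, imported lazily: pip install websockets
    import websockets
    # Raising max_size lets oversized block payloads through.
    async with websockets.connect(url, max_size=MAX_PAYLOAD) as ws:
        await ws.send(make_rpc_payload("chain_getBlock", [block_hash]))
        return json.loads(await ws.recv())

# Example usage (requires a reachable node):
# import asyncio
# asyncio.run(get_chain_block('wss://kusama-rpc.polkadot.io',
#     '0x2e1c614587197fded1d5d79c7bc712189f7a74f62d6782c07127604301536fb5'))
```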

Question: Does this expose the data to calculate total unlocked tokens at any given block?

I'm looking for a data feed that can give me the total unlocked tokens at any given block. Essentially this would be the total issuance which can be queried from the chain state minus tokens locked for governance, tokens locked for vesting, tokens staked, and eventually tokens locked in parachain slots. The last four items would need to be queried on each account individually, which could be quite a long loading time. Does Polkascan track or expose any of this data?

question: Steps to Sync our Own DB?

Hello @arjanz,

I am wondering if the repo is in a state where one could clone it, alter some endpoints in the docker-compose file, and begin syncing our own MySQL DB on Kusama?

If not, what other work would be needed, just off the top of your head?

Thanks!

SQL error with block #2688181 of Kusama

Got errors when processing Kusama block #2688181

sqlalchemy.exc.DataError: (mysql.connector.errors.DataError) 1406 (22001): Data too long for column 'signature' at row 1
[SQL: INSERT INTO data_extrinsic (block_id, extrinsic_idx, extrinsic_hash, extrinsic_length, extrinsic_version, signed, `unsigned`, signedby_address, signedby_index, address_length, address, account_index, account_idx, signature, nonce, era, `call`, module_id, call_id, params, success, error, spec_version_id, codec_error) VALUES (%(block_id)s, %(extrinsic_idx)s, %(extrinsic_hash)s, %(extrinsic_length)s, %(extrinsic_version)s, %(signed)s, %(unsigned)s, %(signedby_address)s, %(signedby_index)s, %(address_length)s, %(address)s, %(account_index)s, %(account_idx)s, %(signature)s, %(nonce)s, %(era)s, %(call)s, %(module_id)s, %(call_id)s, %(params)s, %(success)s, %(error)s, %(spec_version_id)s, %(codec_error)s)]
[parameters: {'block_id': 2688181, 'extrinsic_idx': 3, 'extrinsic_hash': '080c430e02dae3a765c7b737ce8602dcdcdc991ca38121ec3c599406afd96ac5', 'extrinsic_length': 143, 'extrinsic_version': '84', 'signed': True, 'unsigned': False, 'signedby_address': True, 'signedby_index': False, 'address_length': 'ff', 'address': '7bd437292fcdcfeba45c83aec26656f399954d9e38048ebd07a1922e31fc2270', 'account_index': None, 'account_idx': None, 'signature': 'c42b82d02bce3202f6a05d4b06d1ad46963d3be36fd0528bbe90e7f7a4e5fcd38d14234b1c9fcee920d76cfcf43b4ed5dd718e357c2bc1aae3a642975207e67f01', 'nonce': 0, 'era': '1503', 'call': '0400', 'module_id': 'balances', 'call_id': 'transfer', 'params': '[{"name": "dest", "type": "Address", "value": "0xf6b21d624832094b03aa672e016462a020e217cc67b1434785b99114a2b4fa5a", "valueRaw": ""}, {"name": "value", "type": "Compact<Balance>", "value": 10000000000, "valueRaw": "0700e40b5402"}]', 'success': 1, 'error': 0, 'spec_version_id': 2005, 'codec_error': 0}]
(Background on this error at: http://sqlalche.me/e/9h9h)

It seems that the length of the signature is 130 characters, while the field is only varchar(128) (https://github.com/polkascan/polkascan-pre-harvester/blob/master/app/db/versions/38a7f29de7c7_initial_db_layout.py#L284)
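The arithmetic behind this: a 64-byte sr25519/ed25519 signature is 128 hex characters and fits, but if a one-byte signature-type tag is stored along with it (as the 130-character value above suggests), the hex string grows to 130 characters and overflows varchar(128). A quick sanity check; the 0x01 prefix byte below is illustrative:

```python
raw_sig = bytes(64)                  # a 64-byte sr25519/ed25519 signature
assert len(raw_sig.hex()) == 128     # fits in varchar(128)

prefixed = bytes([0x01]) + raw_sig   # add a 1-byte signature-type tag
print(len(prefixed.hex()))           # → 130, two hex chars too many
```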

a sqlalchemy exception

Polkascan runs well when I start the chain with --chain dev, but when I run a chain connected to my testnet, which has 4 peers and more than 25,000 blocks, it throws an exception that looks concurrency-related.

polkascan: master (latest)
substrate: master (Dec 9, 2019)

[2019-12-12 17:54:56,710: INFO/MainProcess] Received task: app.tasks.accumulate_block_recursive[641a3faf-d996-4690-b6f4-57878b6c9480]
[2019-12-12 17:54:56,711: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x1107ebf28> (args:('app.tasks.accumulate_block_recursive', '641a3faf-d996-4690-b6f4-57878b6c9480', {'lang': 'py', 'task': 'app.tasks.accumulate_block_recursive', 'id': '641a3faf-d996-4690-b6f4-57878b6c9480', 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'ba499d6d-f592-469c-98f8-8fe0b454520e', 'parent_id': '531ab9ec-b32a-4a9d-bc12-e6b488092dae', 'argsrepr': "('0xd4243cf2b940b36120243e787ac2bdef72f6ef772ec2a737a09d66befc220416', None)", 'kwargsrepr': '{}', 'origin': 'gen26565@haoMBP', 'reply_to': '8dd2b96d-9d6b-3c30-8b16-d55f1310abd8', 'correlation_id': '641a3faf-d996-4690-b6f4-57878b6c9480', 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': True}}, '[["0xd4243cf2b940b36120243e787ac2bdef72f6ef772ec2a737a09d66befc220416", null], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]', 'application/json', 'utf-8') kwargs:{})
[2019-12-12 17:54:56,711: INFO/MainProcess] Received task: app.tasks.accumulate_block_recursive[2c274382-d4e0-4ae6-83e1-b857c531c16b]
...skipping...
[2019-12-12 17:54:57,013: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,013: DEBUG/ForkPoolWorker-1] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,014: DEBUG/ForkPoolWorker-3] http://localhost:9933 "POST / HTTP/1.1" 200 597
[2019-12-12 17:54:57,018: DEBUG/ForkPoolWorker-3] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,021: DEBUG/ForkPoolWorker-4] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,021: DEBUG/ForkPoolWorker-2] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:57,022: WARNING/ForkPoolWorker-2] Metadata: CACHE MISS
[2019-12-12 17:54:57,022: WARNING/ForkPoolWorker-2] 198
[2019-12-12 17:54:57,024: DEBUG/ForkPoolWorker-1] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:57,024: WARNING/ForkPoolWorker-1] Metadata: CACHE MISS
[2019-12-12 17:54:57,025: WARNING/ForkPoolWorker-1] 198
[2019-12-12 17:54:57,027: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,028: DEBUG/ForkPoolWorker-3] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:57,028: DEBUG/ForkPoolWorker-4] http://localhost:9933 "POST / HTTP/1.1" 200 992
[2019-12-12 17:54:57,029: WARNING/ForkPoolWorker-3] Metadata: CACHE MISS
[2019-12-12 17:54:57,029: WARNING/ForkPoolWorker-3] 198
[2019-12-12 17:54:57,031: DEBUG/ForkPoolWorker-1] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,032: DEBUG/ForkPoolWorker-4] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,035: DEBUG/ForkPoolWorker-3] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,043: DEBUG/ForkPoolWorker-2] http://localhost:9933 "POST / HTTP/1.1" 200 117345
[2019-12-12 17:54:57,059: DEBUG/ForkPoolWorker-1] http://localhost:9933 "POST / HTTP/1.1" 200 117345
[2019-12-12 17:54:57,064: DEBUG/ForkPoolWorker-4] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:57,065: WARNING/ForkPoolWorker-4] Metadata: CACHE MISS
[2019-12-12 17:54:57,065: WARNING/ForkPoolWorker-4] 198
[2019-12-12 17:54:57,071: DEBUG/ForkPoolWorker-4] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:57,084: DEBUG/ForkPoolWorker-3] http://localhost:9933 "POST / HTTP/1.1" 200 117345
[2019-12-12 17:54:57,110: DEBUG/ForkPoolWorker-4] http://localhost:9933 "POST / HTTP/1.1" 200 117345
[2019-12-12 17:54:57,237: WARNING/ForkPoolWorker-2] store version to db
[2019-12-12 17:54:57,237: WARNING/ForkPoolWorker-2] MetadataV9Decoder
[2019-12-12 17:54:58,371: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,378: DEBUG/ForkPoolWorker-2] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:58,383: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,385: DEBUG/ForkPoolWorker-2] http://localhost:9933 "POST / HTTP/1.1" 200 69
[2019-12-12 17:54:58,386: WARNING/ForkPoolWorker-2] 1
[2019-12-12 17:54:58,386: WARNING/ForkPoolWorker-2] {'phase': 0, 'extrinsic_idx': 0, 'type': '0000', 'module_id': 'System', 'event_id': 'ExtrinsicSuccess', 'params': [{'type': 'DispatchInfo', 'value': {'weight': 10000, 'class': 'Operational', 'paysFee': True}, 'valueRaw': ''}], 'topics': []}
[2019-12-12 17:54:58,396: WARNING/ForkPoolWorker-2] + Added 0xd4243cf2b940b36120243e787ac2bdef72f6ef772ec2a737a09d66befc220416
[2019-12-12 17:54:58,400: DEBUG/ForkPoolWorker-1] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,400: DEBUG/ForkPoolWorker-3] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,400: DEBUG/ForkPoolWorker-4] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,405: DEBUG/ForkPoolWorker-4] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:58,405: WARNING/ForkPoolWorker-4] Metadata: CACHE MISS
[2019-12-12 17:54:58,406: WARNING/ForkPoolWorker-4] 198
[2019-12-12 17:54:58,407: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,407: WARNING/ForkPoolWorker-4] ! ERROR adding 0xf6b75ad9cd5d9aa3a643bb40ff9ec681401f347269ee5a00dcf9eae524f8d91e
[2019-12-12 17:54:58,408: DEBUG/ForkPoolWorker-1] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:58,409: WARNING/ForkPoolWorker-1] Metadata: CACHE MISS
[2019-12-12 17:54:58,410: WARNING/ForkPoolWorker-1] 198
[2019-12-12 17:54:58,411: WARNING/ForkPoolWorker-1] ! ERROR adding 0x5a024887e2cd8a8d650fb47b3e7b117c94c5c10a60440e668f9b90db36e649ab
[2019-12-12 17:54:58,413: DEBUG/ForkPoolWorker-3] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:58,413: DEBUG/ForkPoolWorker-2] http://localhost:9933 "POST / HTTP/1.1" 200 595
[2019-12-12 17:54:58,414: WARNING/ForkPoolWorker-3] Metadata: CACHE MISS
[2019-12-12 17:54:58,414: WARNING/ForkPoolWorker-3] 198
[2019-12-12 17:54:58,415: WARNING/ForkPoolWorker-3] ! ERROR adding 0x9f7985dd6019395cbabee3b42deeefb07d2ab3ba30801af117f7a4ff801c03d4
[2019-12-12 17:54:58,416: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,417: ERROR/ForkPoolWorker-1] Task app.tasks.accumulate_block_recursive[4248d115-48a5-4bdc-98e5-3b787ca91166] raised unexpected: HarvesterCouldNotAddBlock('0x5a024887e2cd8a8d650fb47b3e7b117c94c5c10a60440e668f9b90db36e649ab')
Traceback (most recent call last):
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/tasks.py", line 106, in accumulate_block_recursive
block = harvester.add_block(block_hash)
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/processors/converters.py", line 539, in add_block
self.process_metadata(json_parent_runtime_version, parent_hash)
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/processors/converters.py", line 201, in process_metadata
runtime = Runtime.query(self.db_session).get(spec_version)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 959, in get
return self._get_impl(ident, loading.load_on_pk_identity)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 1069, in _get_impl
return db_load_fn(self, primary_key_identity)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/loading.py", line 282, in load_on_pk_identity
return q.one()
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3292, in one
ret = self.one_or_none()
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3261, in one_or_none
ret = list(self)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3334, in __iter__
return self._execute_and_instances(context)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3356, in _execute_and_instances
querycontext, self._connection_from_session, close_with_result=True
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3371, in _get_bind_args
mapper=self._bind_mapper(), clause=querycontext.statement, **kw
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3349, in _connection_from_session
conn = self.session.connection(**kw)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1124, in connection
execution_options=execution_options,
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1130, in _connection_for_bind
engine, execution_options
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 408, in _connection_for_bind
self._assert_active()
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 299, in _assert_active
"This session is in 'inactive' state, due to the "
sqlalchemy.exc.InvalidRequestError: This session is in 'inactive' state, due to the SQL transaction being rolled back; no further SQL can be emitted within this transaction.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/celery/app/trace.py", line 375, in trace_task
R = retval = fun(*args, **kwargs)
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/tasks.py", line 66, in __call__
return super().__call__(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/celery/app/trace.py", line 632, in __protected_call__
return self.run(*args, **kwargs)
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/tasks.py", line 133, in accumulate_block_recursive
raise HarvesterCouldNotAddBlock(block_hash) from exc
app.processors.converters.HarvesterCouldNotAddBlock: 0x5a024887e2cd8a8d650fb47b3e7b117c94c5c10a60440e668f9b90db36e649ab
[2019-12-12 17:54:58,417: ERROR/ForkPoolWorker-4] Task app.tasks.accumulate_block_recursive[0fe935ed-7475-46f3-8c4f-5954af00f64d] raised unexpected: HarvesterCouldNotAddBlock('0xf6b75ad9cd5d9aa3a643bb40ff9ec681401f347269ee5a00dcf9eae524f8d91e')
Traceback (most recent call last):
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/tasks.py", line 106, in accumulate_block_recursive
block = harvester.add_block(block_hash)
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/processors/converters.py", line 539, in add_block
self.process_metadata(json_parent_runtime_version, parent_hash)
File "/Users/haoming/workspace/proChain/polkascan-pre-harvester/app/processors/converters.py", line 201, in process_metadata
runtime = Runtime.query(self.db_session).get(spec_version)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 959, in get
return self._get_impl(ident, loading.load_on_pk_identity)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 1069, in _get_impl
...skipping...
[2019-12-12 17:54:58,458: WARNING/ForkPoolWorker-1] Metadata: CACHE MISS
[2019-12-12 17:54:58,458: WARNING/ForkPoolWorker-1] 198
[2019-12-12 17:54:58,462: DEBUG/ForkPoolWorker-4] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:58,463: WARNING/ForkPoolWorker-4] Metadata: CACHE MISS
[2019-12-12 17:54:58,463: WARNING/ForkPoolWorker-4] 198
[2019-12-12 17:54:58,464: DEBUG/ForkPoolWorker-2] Starting new HTTP connection (1): localhost:9933
[2019-12-12 17:54:58,468: DEBUG/ForkPoolWorker-3] http://localhost:9933 "POST / HTTP/1.1" 200 448
[2019-12-12 17:54:58,468: DEBUG/ForkPoolWorker-2] http://localhost:9933 "POST / HTTP/1.1" 200 595

My harvester can't process all blocks

My harvester monitor:
http://149.28.147.147:5555/
http://149.28.147.147:8080/oak-testnet/harvester/admin

It has 1,479 failed tasks.
data_account_audit is filled, but data_account is empty

The error is about MetadataV7ModuleStorageEntry:

http://149.28.147.147:5555/task/ba1603a3-a349-47f7-8661-f63b635f0bc9

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/celery/app/trace.py", line 385, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 70, in __call__
    return super().__call__(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/celery/app/trace.py", line 648, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/src/app/app/tasks.py", line 392, in update_balances_in_block
    harvester.create_full_balance_snaphot(block_id)
  File "/usr/src/app/app/processors/converters.py", line 1030, in create_full_balance_snaphot
    if storage_method.get("type_hasher_key1") == "Blake2_128Concat":
AttributeError: 'MetadataV7ModuleStorageEntry' object has no attribute 'get'

My docker-compose yml file:
https://github.com/OAK-Foundation/polkascan-os/blob/oak-testnet-host/docker-compose.oak-testnet.yml

My custom types:
https://github.com/OAK-Foundation/polkascan-pre-harvester/blob/9a5bf632eed712a09c33cea32831e24bc42083fa/app/type_registry/oak_testnet.json

Please help me to solve it.

Thanks!

How can I retry accumulate_block_recursive?

A block can be skipped if the accumulate_block_recursive task throws an error while executing; how can I retry it? The harvester's last block is #314625+, but the finalized block is still #287295 because #287296 is missing. I want to retry block #287296.

Block processing stuck on Polkadot CC1

After completing several items in the block processing queue, the process seems to be stuck at #288892:

$ curl localhost:8000/queue
{"status": "success", "data": {"harvester_head": 502094, "block_process_queue": [{"from": 241917.0, "to": 288892}]}}

New blocks are properly inserted. We are using these settings:

- TYPE_REGISTRY=polkadot
- SUBSTRATE_ADDRESS_TYPE=0
- SUBSTRATE_METADATA_VERSION=9
- NEW_SESSION_EVENT_HANDLER=True
- SUBSTRATE_STORAGE_BALANCE=Account
- SUBSTRATE_STORAGE_INDICES=Accounts
- FINALIZATION_ONLY=1

We have tried repeatedly to send a start request to the API endpoint:

$ curl -X POST localhost:8000/start
{"status": "success", "data": {"task_id": "f424b868-c9bb-482b-9e72-f75ee5e42ed1"}}

This worked previously when it got stuck on other blocks, but now it doesn't seem to work anymore. It also looks like there aren't any errors in the worker logs:

[2020-06-30 14:11:21,916: INFO/MainProcess] Received task: app.tasks.accumulate_block_recursive[3a6a7471-7bc8-4e08-aec9-84fabd4cfe0c]  
[2020-06-30 14:11:22,557: WARNING/ForkPoolWorker-2] + Added 0xb2fc2acab3d2120c044119ed55cb65407420a1c23bdc0da074604c4f237f65ac
[2020-06-30 14:11:22,848: WARNING/ForkPoolWorker-2] + Added 0x501a80347741100185549650430625bd0b8ebcf4d7b290ee46fa7dbb9833f83c
[2020-06-30 14:11:22,948: WARNING/ForkPoolWorker-2] . Skipped 0x8aaea2cf4f738f2e0b80d6c694c251a916c9110ffdb939b89726a06f17e5f29b
[2020-06-30 14:11:22,952: INFO/ForkPoolWorker-2] Task app.tasks.accumulate_block_recursive[3a6a7471-7bc8-4e08-aec9-84fabd4cfe0c] succeeded in 1.033748185960576s: {'result': '2 blocks added', 'lastAddedBlockHash': '0x8aaea2cf4f738f2e0b80d6c694c251a916c9110ffdb939b89726a06f17e5f29b', 'sequencerStartedFrom': False}
[2020-06-30 14:11:31,884: INFO/MainProcess] Received task: app.tasks.start_harvester[30d69e38-dbc5-4959-8afd-2b354892b413]  
[2020-06-30 14:11:31,916: INFO/ForkPoolWorker-2] Task app.tasks.start_harvester[30d69e38-dbc5-4959-8afd-2b354892b413] succeeded in 0.030910639907233417s: {'result': 'Harvester job started', 'block_sets': [{'start_block_hash': '0xa98f834cc962f64a27f18047f321f95b07ef7c281742bad925ae71f9f1f053ab', 'end_block_hash': None}], 'sequencer_task_id': '04163218-d3ab-472b-9be0-d82345acfa8d'}
[2020-06-30 14:11:51,901: INFO/MainProcess] Received task: app.tasks.start_sequencer[96ba0455-62ca-49a3-9ec2-5936812b8684]  
[2020-06-30 14:11:55,535: INFO/ForkPoolWorker-2] Task app.tasks.start_sequencer[96ba0455-62ca-49a3-9ec2-5936812b8684] succeeded in 3.6320296940393746s: {'result': 'Block #241917 is missing.. stopping check '}
[2020-06-30 14:12:02,203: INFO/MainProcess] Received task: app.tasks.accumulate_block_recursive[6c2777bf-0dbc-4a22-8143-7d532b5c95aa]  
[2020-06-30 14:12:02,840: WARNING/ForkPoolWorker-2] + Added 0xd6f575ab9f30604ee800fe075ef8ab564264444796a18eac69ac4de10876e4f1
[2020-06-30 14:12:03,145: WARNING/ForkPoolWorker-2] + Added 0x3c6a7ef5aa1477311ed5ca2b05fbf1ffef024ffc21a738e001cce8234718a186
[2020-06-30 14:12:03,168: WARNING/ForkPoolWorker-2] . Skipped 0xe3fd6d576d4bbbc9af3610cad145b756db2ae1a83231439242418edbe81bd4a9
[2020-06-30 14:12:03,171: INFO/ForkPoolWorker-2] Task app.tasks.accumulate_block_recursive[6c2777bf-0dbc-4a22-8143-7d532b5c95aa] succeeded in 0.9668770999414846s: {'result': '2 blocks added', 'lastAddedBlockHash': '0xe3fd6d576d4bbbc9af3610cad145b756db2ae1a83231439242418edbe81bd4a9', 'sequencerStartedFrom': False}
[2020-06-30 14:12:11,885: INFO/MainProcess] Received task: app.tasks.start_harvester[f5ede8a6-8735-4278-b89e-b8a5817977fb]  
[2020-06-30 14:12:11,912: INFO/ForkPoolWorker-2] Task app.tasks.start_harvester[f5ede8a6-8735-4278-b89e-b8a5817977fb] succeeded in 0.025522317038848996s: {'result': 'Harvester job started', 'block_sets': [{'start_block_hash': '0x09eba9988a5ce0ff7bdce6f8cda3b911ed6fb2bc952a08100b1af611ef633630', 'end_block_hash': None}], 'sequencer_task_id': 'ea5ae35e-4851-4a2e-b25f-8eab754085a1'}
[2020-06-30 14:12:31,885: INFO/MainProcess] Received task: app.tasks.start_harvester[3317c00d-4532-4bb2-bb18-ed20f7b30b46]  
[2020-06-30 14:12:31,913: INFO/ForkPoolWorker-2] Task app.tasks.start_harvester[3317c00d-4532-4bb2-bb18-ed20f7b30b46] succeeded in 0.02677196601871401s: {'result': 'Harvester job started', 'block_sets': [{'start_block_hash': '0x5eb837ed5767b37ccf79a68773c02d522aa0474dcaebb63baa259ecd04556694', 'end_block_hash': None}], 'sequencer_task_id': 'ab2c44f0-003f-490b-9fe8-232e7f87e04e'}
[2020-06-30 14:12:51,901: INFO/MainProcess] Received task: app.tasks.start_sequencer[cda8ce54-5606-451c-a762-a2fc170cc35c]  
[2020-06-30 14:12:55,284: INFO/ForkPoolWorker-2] Task app.tasks.start_sequencer[cda8ce54-5606-451c-a762-a2fc170cc35c] succeeded in 3.382033488014713s: {'result': 'Block #241917 is missing.. stopping check '}
[2020-06-30 14:13:11,905: INFO/MainProcess] Received task: app.tasks.start_sequencer[11045c99-fd07-46ee-9744-c36c806cdd3b]  
[2020-06-30 14:13:15,523: INFO/ForkPoolWorker-2] Task app.tasks.start_sequencer[11045c99-fd07-46ee-9744-c36c806cdd3b] succeeded in 3.6170766670256853s: {'result': 'Block #241917 is missing.. stopping check '}
[2020-06-30 14:13:51,903: INFO/MainProcess] Received task: app.tasks.start_sequencer[e3f7138a-612e-43ff-bd31-c88085723a0c]  
[2020-06-30 14:13:55,439: INFO/ForkPoolWorker-2] Task app.tasks.start_sequencer[e3f7138a-612e-43ff-bd31-c88085723a0c] succeeded in 3.5346222911030054s: {'result': 'Block #241917 is missing.. stopping check '}
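The logs above show the sequencer repeatedly stopping at block #241917, which matches the `"from": 241917` entry in the queue output: the sequencer halts at the first gap in stored blocks. A minimal sketch of that gap check, here over a plain list of block ids (the real harvester reads these from its MySQL block table; this table name and the function are assumptions for illustration):

```python
def find_gaps(block_ids):
    """Return (start, end) ranges of block ids missing between the
    lowest and highest stored ids. The sequencer stops at the first
    such gap, so these are the blocks to re-harvest."""
    present = set(block_ids)
    gaps, gap_start = [], None
    for n in range(min(present), max(present) + 1):
        if n not in present:
            if gap_start is None:
                gap_start = n  # entering a gap
        elif gap_start is not None:
            gaps.append((gap_start, n - 1))  # gap just closed
            gap_start = None
    return gaps
```

Running this over the stored block ids would report a gap starting at #241917; re-adding those blocks (via another `/start`, or by harvesting the range directly) lets the sequencer continue past #288892.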
