casper-ecosystem / developer-rewards
A place where developers can get rewarded for their contributions to the Casper Ecosystem and Docs
License: Apache License 2.0
500 USD
SDK
We're inviting you to explore the possibilities of using the Casper Client Python SDK in real-world scenarios such as data analytics.
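For instance, a minimal sketch of querying a node over JSON-RPC using only the standard library. The node URL, port 7777, and the `chain_get_block` response fields (`deploy_hashes`/`transfer_hashes`) reflect the 1.x node API as I understand it; treat them as assumptions to verify against your node (or use the Python SDK's own client instead):

```python
import json
from urllib.request import Request, urlopen

def rpc_request(method, params=None):
    """Build a JSON-RPC 2.0 request body for a Casper node."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": method,
        "params": params or {},
    }).encode()

def fetch(node_url, method, params=None):
    """POST the request to the node's /rpc endpoint and return the 'result' field."""
    req = Request(node_url, data=rpc_request(method, params),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.loads(resp.read())["result"]

def deploy_count(block_result):
    """Tiny analytics example: deploys + transfers in one block."""
    body = block_result["block"]["body"]
    return len(body.get("deploy_hashes", [])) + len(body.get("transfer_hashes", []))

# Live usage (NODE_IP is a placeholder for any reachable RPC node):
# latest = fetch("http://NODE_IP:7777/rpc", "chain_get_block")
# print(deploy_count(latest))
```

From here, iterating over block heights and aggregating `deploy_count` per day is one simple data-analytics starting point.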
Recommended approach:
700 USD
SDK
Create an NPM package for connecting to different wallets in the Casper ecosystem. A standardized wallet adapter for Casper will enable dApp developers to start building quickly without reinventing the wheel. It will be developed using React 18. The project structure will be kept as simple as possible so that new contributors can easily understand the project. The goal is a lightweight and performant package. Lastly, a good README.md file will show how it is used.
Currently I have found three different wallets (excluding Casper Signer, as it is going to be deprecated):
1000 USD
SDK
Here is the current state of the SDK; it is 1.4.15-compliant:
https://github.com/abahmanem/casper-scala-sdk
Here are all the methods to update or add for compliance with the latest 1.5.2 release.
We will be updating the following methods to v1.5.2:
"account_put_deploy",
"info_get_deploy",
"state_get_account_info",
"state_get_dictionary_item",
"query_global_state"
Tasks:
Estimated workload: 8 hours
Hourly rate: 125 USD
1- Methods:
"account_put_deploy",
"info_get_deploy",
"state_get_account_info",
"state_get_dictionary_item",
"query_global_state"
updated to v1.5.2 of the Casper node
2- Unit tests updated and running successfully for the above methods
3- GitHub documentation updated
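As a rough way to verify coverage against the node itself, `rpc.discover` returns an OpenRPC description of the API; a small helper can diff the advertised methods against the list above. The nesting of the discover result under a `schema` key is an assumption to check against a live node:

```python
REQUIRED = [
    "account_put_deploy",
    "info_get_deploy",
    "state_get_account_info",
    "state_get_dictionary_item",
    "query_global_state",
]

def missing_methods(discover_result, required):
    """Diff a required method list against what the node advertises via rpc.discover.

    discover_result: the JSON-RPC 'result' of rpc.discover, assumed to nest an
    OpenRPC document (with a 'methods' array of {'name': ...}) under 'schema'.
    """
    advertised = {m["name"] for m in discover_result.get("schema", {}).get("methods", [])}
    return [m for m in required if m not in advertised]
```

An empty return value means every method the SDK targets is advertised by the node version under test.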
1000 USD
DApps
It seems that a CL dev built something similar already and published the GitHub Repo:
https://github.com/gRoussac/casper-deployer-hackathon-oct-2022
It is also available online:
https://casper.onrender.com/
Maybe this can be extended or used as a reference/starting point on how to interact with the Casper network via a website.
— Additional Features? —
Create a website which offers a UI to create deploys, test smart contracts and interact with the Casper blockchain.
Using casper-client can be very complicated: it is a command-line utility that is very powerful and flexible, but that is also its weak point.
A lot of things can go wrong, and it is easy to make mistakes, causing frustration when developing or testing.
Create a UI version of casper-client, similar to the one already mentioned at the beginning, as a website that can also send deploys to the network directly.
This is much more complex, but it offers an even better user experience if it can store user data, sessions, and custom templates, and does not share potentially sensitive data, etc.
A few possible features:
Make the network selectable (MainNet, TestNet, custom)
Payment amount/Gas fees
Create collections of session-args incl. data types
Creating, editing, exporting offline deploys
RPC node (default to CA maintained ones, localhost?) incl. connection test
Templates for CEP-78.
Custom templates for own contracts which can be saved, edited, loaded
Advanced features:
Query existing contracts and provide a UI to interact with it
When delivering the solution, describe the main functions in the code and what was done, and send all relevant information to [email protected]
Prepare a general concept for integrating an AI support system with the Casper Developer Portal
350 USD
Documentation
The main objective of this DevReward is to develop a comprehensive concept for integrating GPT-4 or a comparable AI model as a support tool for the Casper Developer Portal (Casper.network) by utilizing all available internal data.
The expected concept includes the following steps, which should be considered in terms of budgeting and planning:
Gather internal data: Collect all relevant internal data from various sources such as FAQs, knowledge base articles, and historical customer interactions available on the Casper Developer Portal.
Preprocess and format the data: Clean and format the internal data to ensure its suitability for training GPT-4. This may involve removing irrelevant information, standardizing the format, and organizing it in a structured manner.
Train AI Model: Utilize the preprocessed internal data to train an AI model using an appropriate training pipeline. This step involves feeding the data into the model and allowing it to learn the patterns, context, and nuances specific to the Casper Developer Portal.
Develop a user interface: Create a user interface that enables seamless interaction with the AI model. This can take the form of a chatbot or a search function embedded within the website.
Implement real-time data updates: Set up a system to regularly update the AI model with the latest internal data. This ensures that the model remains up to date and can provide accurate responses based on the most recent information available.
Test and refine: Conduct thorough testing by simulating user interactions and evaluating the responses provided by the AI model. Identify areas for improvement or refinement and iterate on the system accordingly.
Deploy and monitor: Once the integration has been thoroughly tested and refined, deploy the AI model support tool on the Casper Developer Portal. Continuously monitor its performance, gather user feedback, and make necessary adjustments to enhance its effectiveness and usability.
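For the preprocessing step above, a minimal sketch of paragraph-based chunking, which is a common way to prepare docs for a retrieval index backing such an assistant. The chunk size and splitting strategy are illustrative choices, not requirements from this brief:

```python
import re

def chunk_document(text, max_words=120):
    """Split a docs page into paragraph-aligned chunks of bounded size,
    suitable for building a retrieval index for the support assistant."""
    paragraphs = [p.strip() for p in re.split(r"\n\s*\n", text) if p.strip()]
    chunks, current, count = [], [], 0
    for p in paragraphs:
        words = len(p.split())
        # Start a new chunk when adding this paragraph would exceed the budget.
        if current and count + words > max_words:
            chunks.append("\n\n".join(current))
            current, count = [], 0
        current.append(p)
        count += words
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

Each chunk would then be embedded and stored together with its source URL, so the chatbot can cite where an answer came from.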
The acceptance criteria for the AI integration in the Casper Developer Portal are as follows:
Accuracy of Responses: The chatbot should provide accurate and relevant responses to user queries, which will be evaluated by internal Developer Advocates.
Understanding User Intent: The chatbot should accurately interpret user intent and respond accordingly. It should comprehend the nuances and context of user queries to provide appropriate and helpful responses.
Response Time: The chatbot should respond to user queries within an acceptable timeframe, as determined by Developer Advocates.
Language Comprehension: The chatbot should have a wide vocabulary and understanding of various language nuances to facilitate effective communication.
Handling of User Errors: The chatbot should be capable of effectively handling user errors and misunderstandings. It should provide clear prompts or suggestions when a user query is ambiguous or invalid, guiding them towards the correct information.
Personalization and Context Persistence: The chatbot should be able to maintain context throughout the conversation and provide personalized responses based on user preferences or previous interactions.
Integration with Existing Systems: The integration should successfully connect with all relevant data sources, such as the Developer portal and CasperEcosystem.io, ensuring seamless data exchange.
Scalability and Performance: The chatbot should be able to handle a high volume of concurrent user interactions without a significant decrease in performance. Performance benchmarks may be defined as part of the acceptance criteria to ensure effective scalability.
Continuous Improvement: The acceptance criteria should include provisions for ongoing monitoring and improvement of the chatbot's performance based on user feedback and analytics data.
To develop a comprehensive concept for the desired outcome, the following tasks should be included:
500 USD
SDK
We're inviting you to participate by reviewing and testing our cutting-edge Casper Client Golang Software Development Kit (SDK).
Recommended approach:
1000 USD
SDK
We have developed a Swift SDK for Casper node version 1.4 for the following methods.
Because the maximum value of a proposal is $1,000, I will split this into 3 proposals, each worth $1,000 and covering 4 methods. This is part 1:
chain_get_state_root_hash
info_get_peers
info_get_deploy
info_get_status
Deliverables:
The above API methods fully updated to the 1.5 Casper node
Unit tests
1000 USD
SDK
We have developed a Swift SDK for Casper node version 1.4 for the following methods: https://github.com/hienbui9999/CasperSDKInSwift
To update to node version 1.5, it takes a lot of time to review the whole thing, work out the testing, edit, and then test everything again.
A close estimate: about 50-70 hours for the whole effort (12 methods).
Cost: $2,900.
Because the maximum value of a proposal is $1,000, I will split this into 3 proposals, each worth $1,000 and covering 4 methods.
300 USD
Documentation
Vietnam is one of the biggest blockchain hubs in Southeast Asia in terms of community and builders. That is why we created this proposal: to produce documentation and a video guide about the DevReward program that can attract more Vietnamese builders.
500 USD
DApps
Currently, it is quite difficult to obtain a report for a specific period detailing all the CSPR delegation rewards of a particular address for a year, along with the corresponding price at the time when the rewards were generated.
The solution to this issue should be a tool that is free to use and open-source. This tool will enable users to generate a list of all the rewards for a specific public key (e.g., wallet address) within a designated time period (e.g., a year). Furthermore, it should provide an option to download this information directly as a document (e.g., a .csv file), without requiring any additional user data (such as an email address).
This tool should be accessible and user-friendly for everyone, featuring a simple GUI, or should be written in cross-platform code like Python (Python SDKs are already available in the DevPortal Library).
Proposal DevReward Tax Tool.pdf
For a clearer presentation, please refer to the attached proposal.
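A minimal sketch of the export step in Python, as the brief suggests: turning already-fetched reward records into the downloadable .csv. Gathering the records themselves could use `chain_get_era_info_by_switch_block` (era summaries include delegator reward allocations), and the per-timestamp price would come from an external price API; the field names below are assumptions for illustration:

```python
import csv
import io

MOTES_PER_CSPR = 1_000_000_000

def rewards_to_csv(rewards):
    """rewards: iterable of dicts with 'era_id', 'timestamp', 'amount_motes'
    and optional 'price_usd' (the CSPR price when the reward was generated).
    Returns the .csv text the user would download."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["era_id", "timestamp", "amount_cspr", "price_usd", "value_usd"])
    for r in rewards:
        cspr = r["amount_motes"] / MOTES_PER_CSPR
        price = r.get("price_usd")
        value = round(cspr * price, 2) if price is not None else ""
        writer.writerow([r["era_id"], r["timestamp"], cspr,
                         price if price is not None else "", value])
    return buf.getvalue()
```

Keeping the export as plain CSV text means the tool needs no user accounts or stored data, matching the no-email requirement above.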
Implementing a Faucet for ERC20/CEP-18 Tokens
200 USD
DeFi
Prepare a simple web UI (preferably React) with a button people can press to receive X amount of a token, redeemable only once every 24 hours. A similar solution already exists for the native $CSPR token on testnet. We want an easier way of distributing tokens.
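The 24-hour redemption rule can be sketched as follows (in-memory only; a real faucet backend would persist claims and key them by account hash):

```python
import time

COOLDOWN_SECONDS = 24 * 60 * 60
_last_claim = {}  # account hash -> unix time of last successful claim (in-memory)

def try_claim(account, now=None):
    """Return True and record the claim if the account's 24h cooldown has
    elapsed; otherwise return False and leave the previous claim in place."""
    now = time.time() if now is None else now
    last = _last_claim.get(account)
    if last is not None and now - last < COOLDOWN_SECONDS:
        return False
    _last_claim[account] = now
    return True
```

On a successful `try_claim`, the backend would then submit the actual CEP-18 token transfer to the network.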
300 USD
DApps
Description: The Casper Mainnet Blockchain is open source, and various blockchain explorers have made some of its data accessible. This DevReward project aims to create a data tool that provides a comprehensive overview of the blockchain's daily activity concerning addresses on explorers. The primary objective is to gain a better understanding of active wallets and wallets in general.
• The required data must be sourced from the Mainnet of the Casper Blockchain.
• The newly generated daily datasets should be added to the main datasets in a timely manner and should be available as .csv files within 48 hours.
• The available categories should include the following datasets:
o Daily active addresses (addresses that have performed at least 1 deploy)
o Daily total of newly generated addresses
o Daily total of all available addresses on the blockchain
o Daily total of all empty addresses that do not contain any tokens
o Daily total of all available addresses that contain 1 to 9 CSPR
o Daily total of all available addresses that contain 10 to 99 CSPR
o Daily total of all available addresses that contain 100 to 999 CSPR
o Daily total of all available addresses that contain 1,000 to 9,999 CSPR
o Daily total of all available addresses that contain 10,000 to 99,999 CSPR
o Daily total of all available addresses that contain 100,000 to 999,999 CSPR
o Daily total of all available addresses that contain 1,000,000 to 4,999,999 CSPR
o Daily total of all available addresses that contain more than 5,000,000 CSPR
• The dataset should provide daily data points.
• Data must be available for download in .csv format and easily accessible. It should be available for at least the year 2023 and, preferably, for the entire blockchain history up to the Genesis block if applicable.
• The downloadable .csv datasets need to have at least the following filters to choose from:
o Last month
o Last year
o All data (until Genesis or at least 2023)
o Select time frame (with a selectable range that goes in the format "from dd.mm.yyyy to dd.mm.yyyy")
• Consider refining the categorization and suggest more granular datasets if applicable.
• Provide a detailed description of how the different sections were categorized, the calculations behind them, the data processing methods employed, and make the source code accessible.
• Optionally, include a graphical representation of the data over time on the explorer's webpage.
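A sketch of the bucketing behind these datasets. The 100-999 bucket keeps the ranges contiguous, and the treatment of balances under 1 CSPR is my own assumption, since the brief does not specify either:

```python
from collections import Counter

def bucket_label(cspr):
    """Assign a balance (in CSPR) to one of the report's size buckets."""
    if cspr == 0:
        return "empty"
    if cspr < 1:
        return "<1"  # sub-1 CSPR balances: not specified in the brief
    for lo, hi in [(1, 9), (10, 99), (100, 999), (1_000, 9_999),
                   (10_000, 99_999), (100_000, 999_999), (1_000_000, 4_999_999)]:
        if cspr <= hi:
            return f"{lo}-{hi}"
    return ">4999999"

def daily_bucket_counts(balances):
    """One day's balance snapshot -> counts per bucket (one .csv row)."""
    return Counter(bucket_label(b) for b in balances)
```

Running `daily_bucket_counts` over each day's snapshot and appending the result as a CSV row yields exactly the daily data points the brief asks for.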
150 USD
DApps
Description: The Casper Testnet Blockchain is open source, but currently, not many blockchain explorers have made its data accessible. This DevReward project aims to create a data tool that provides a comprehensive overview of the blockchain's daily activity concerning addresses on explorers. The primary objective is to gain a better understanding of active wallets and wallets in general.
• The required data must be sourced from the testnet of the Casper Blockchain.
• The newly generated daily datasets should be added to the main datasets in a timely manner and should be available as .csv files within 48 hours.
• The available categories should include the following datasets:
o Daily active addresses (addresses that have performed at least 1 deploy)
o Daily total of newly generated addresses
o Daily total of all available addresses on the blockchain
• The dataset should provide daily data points.
• Data must be available for download in .csv format and easily accessible. It should be available for at least the year 2023 and, preferably, for the entire blockchain history up to the Genesis block if applicable.
• The downloadable .csv datasets need to have at least the following filters to choose from:
o Last month
o Last year
o All data (until Genesis or at least 2023)
o Select time frame (with a selectable range that goes in the format "from dd.mm.yyyy to dd.mm.yyyy")
• Consider refining the categorization and suggest more granular datasets if applicable.
• Provide a detailed description of how the different sections were categorized, the calculations behind them, the data processing methods employed, and make the source code accessible.
• Optionally, include a graphical representation of the data over time on the explorer's webpage.
1000 USD
SDK
We have developed a Swift SDK for Casper node version 1.4 for the following methods.
Because the maximum value of a proposal is $1,000, I will split this into 3 proposals, each worth $1,000 and covering 4 methods. This is part 3:
state_get_dictionary_item
state_get_balance
state_get_auction_info
account_put_deploy
Deliverables:
The above API methods fully updated to the 1.5 Casper node
Unit tests
600 USD
Other
GM readers and devs!
I have been in crypto since 2017 and am missing one feature for interacting with smart contracts. Most people do not know how to read code.
For transparency's sake, I have always wished that there were a message when you interact with a smart contract telling you what that contract is able to do if you accept. This would make it a lot harder for scammers, because before you interact with a contract it would tell you that it can drain your wallet, or whatever. On the other hand, it could say that the contract can only read your wallet and that signing it is safe.
For enterprise adoption it would be a huge win if they did not have to worry about getting scammed in this way.
I heard about the Fire extension, which pops up when you sign a contract and tells you what the contract is able to do if you sign.
My public address is 0197565a21eda48501efcc16bc20b5013ef6ce80f3b027e4f95ddef57474a6c557
You can also reach out to me via email: [email protected]
1000 USD
SDK
Here is the current state of the SDK; it is 1.4.15-compliant:
https://github.com/abahmanem/casper-scala-sdk
Here are all the methods to update or add for compliance with the latest 1.5.2 release.
We will be updating the following methods to v1.5.2:
"query_balance",
"info_get_peers",
"info_get_status",
"chain_get_block",
"chain_get_era_info_by_switch_block",
Tasks:
Estimated workload: 8 hours
Hourly rate: 125 USD
1- Methods:
"query_balance",
"info_get_peers",
"info_get_status",
"chain_get_block",
"chain_get_era_info_by_switch_block"
updated to v1.5.2 of the Casper node
2- Unit tests updated and running successfully for the above methods
3- GitHub documentation updated
500 USD
DApps
The DeFi landscape is currently evolving, and TVL (Total Value Locked) is a valuable indicator of the level of activity occurring on the blockchain within the context of the DeFi ecosystem. The Casper Mainnet Blockchain is open source, and various blockchain explorers have made some of its data accessible. This DevReward project aims to create a data tool that provides a comprehensive overview of the blockchain's daily activity related to TVL and DEX (Decentralized Exchange) activity. The primary objective is to gain a better understanding of TVL on the blockchain, active DEXs, and the corresponding liquidity pools.
• The required data must be sourced from the Mainnet of the Casper Blockchain.
• This Developer Reward is focused on teams with a Casper blockchain explorer, which, in a second step, would be able to include the concept directly into their explorer if all acceptance criteria are met.
• Prepare a robust concept and mock-up for indicating TVL, active DEXs and bridges, and the included liquidity pools. This mock-up should have the following required MVPs that can be assessed:
o A comprehensive overview of all DEXs and bridges using the Casper Blockchain.
o A comprehensive overview of all available liquidity pools that are included.
o A total overview of the entire existing TVL on the Casper blockchain and the daily changes in these volumes.
o The data should be available as downloadable .csv files.
o Make suggestions for 5-10 KPIs that should be available and explain the added value they would bring.
o Data must be available for download in .csv format and easily accessible. It should be available for at least the year 2023 and, preferably, for the entire blockchain history up to the Genesis block if applicable.
• Provide a detailed description of how the different sections were categorized, the calculations behind them, the data processing methods employed, and make the source code accessible.
• As soon as this data and MVP are ready, schedule a meeting with the corresponding deployer of the concept, and we will request a demo of the MVP.
• If all acceptance criteria are met, please provide a schedule for implementing the solution.
• Based on these points, the concept will then be transferred to a DevReward that covers the implementation of this concept or further development until the solution is live on the corresponding explorer.
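A sketch of the TVL roll-up such a mock-up would need: valuing each pool's reserves in USD and aggregating per DEX. The pool/reserve field names and the price source are assumptions for illustration; a real tool would read reserves from the pool contracts and prices from a market-data API:

```python
def total_tvl_usd(pools):
    """pools: iterable of dicts per liquidity pool with 'dex' and
    'reserves' = list of (token, amount, price_usd) tuples.
    Returns (total TVL, TVL per DEX) in USD."""
    per_dex = {}
    for p in pools:
        value = sum(amount * price for _token, amount, price in p["reserves"])
        per_dex[p["dex"]] = per_dex.get(p["dex"], 0.0) + value
    return sum(per_dex.values()), per_dex
```

Snapshotting this once per day and diffing consecutive totals gives the daily TVL changes the MVP asks for.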
300 USD
Other
This DevReward is focused on vulnerabilities in our webpage, casper.network. A bug search found that several vulnerabilities exist, and they should be assessed by the Association. The involved developer has provided a status report and self-reported it to the Association's technical team.
300 USD
DApps
The Casper Mainnet Blockchain is open source, and various blockchain explorers have made some of its data accessible. This DevReward project aims to create a data tool that provides a comprehensive overview of the blockchain's daily activity concerning deploys on Explorers. The primary objective is to categorize and subdivide daily deploys into the following categories:
• Total daily deploys
• Daily native CSPR Transactions (regular CSPR token transfers from A to B)
• Daily staking deploys
• Daily unstaking deploys
• Daily NFT mintings
• Daily NFT transfers
• Daily NFT burns
• Daily smart contract deploys
• Daily other/unknown transactions
• The required data must be sourced from the mainnet of the Casper Blockchain.
• The newly generated daily datasets should be added to the main datasets in a timely manner and need to be available as .csv files (within 48 hours).
• The available categories should include the following datasets:
o Daily Total deploys
o Daily native CSPR Transactions
o Daily staking activities
o Daily unstaking deploys
o Daily NFT mintings
o Daily NFT transfers
o Daily NFT burns
o Daily smart contract deploys
o Daily other/unknown transactions
• The dataset should provide daily data points.
• Data must be available for download in .csv format and easily accessible. It should be available for at least the year 2023 and, preferably, for the entire blockchain history up to the Genesis block.
• The downloadable .csv datasets need to have at least the following filters to choose from:
o Last month
o Last year
o All data (until Genesis or at least 2023)
o Select time frame (with a selectable range that goes in the format "from dd.mm.yyyy to dd.mm.yyyy")
• Consider refining the categorization and suggest more granular datasets. Explore the possibility of further subdividing the "other/unknown" category, similar to the smart contract deploys, and clarify the types of transactions it encompasses.
• Provide a detailed description of how the different sections were categorized, the data processing methods employed, and make the source code accessible.
• Optionally, include a graphical representation of the data over time on the explorer's webpage.
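A heuristic sketch of this categorization, assuming deploy JSON as returned by `info_get_deploy`. Entry-point names such as `delegate`, `mint`, and `burn` follow common auction/CEP-47/CEP-78 conventions and are not guaranteed for every contract; in particular, a `transfer` entry point could equally be a fungible-token transfer, so a real tool would also check the target contract hash:

```python
def classify_deploy(deploy):
    """Heuristic mapping of a deploy (JSON as returned by info_get_deploy)
    to the report's categories. Entry-point names are conventions only."""
    session = deploy.get("session", {})
    if "Transfer" in session:
        return "native_transfer"
    entry = ""
    for variant in ("StoredContractByHash", "StoredContractByName",
                    "StoredVersionedContractByHash", "StoredVersionedContractByName"):
        if variant in session:
            entry = session[variant].get("entry_point", "").lower()
            break
    if entry == "delegate":
        return "staking"
    if entry == "undelegate":
        return "unstaking"
    if entry == "mint":
        return "nft_mint"
    if entry in ("transfer", "transfer_from"):
        return "nft_transfer"
    if entry == "burn":
        return "nft_burn"
    if "ModuleBytes" in session:
        return "smart_contract_deploy"
    return "other_unknown"
```

Counting the labels over one day's deploys produces the daily rows; whatever falls into `other_unknown` is exactly the residue the brief asks to investigate further.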
1000 USD
SDK
Here is the current state of the SDK; it is 1.4.15-compliant:
https://github.com/abahmanem/casper-scala-sdk
Here are all the methods to update or add for compliance with the latest 1.5.2 release.
We will be updating the following method to v1.5.2:
"chain_get_state_root_hash"
and adding the following two methods:
"info_get_validator_changes",
"info_get_chainspec"
Tasks:
Estimated workload: 8 hours
Hourly rate: 125 USD
1- Method: "chain_get_state_root_hash" updated to v1.5.2 of the Casper node
2- Methods: "info_get_validator_changes" and "info_get_chainspec" implemented according to v1.5.2 of the Casper node
3- Unit tests added, updated to v1.5.2, and running successfully for the above methods
4- GitHub documentation updated
5- New version of the SDK available in the Maven and sbt repositories
300 USD
DApps
The Casper Testnet Blockchain is open source, but currently, not many blockchain explorers have made its data accessible. This DevReward project aims to create a data tool that provides a comprehensive overview of the testnet's daily activity concerning deploys on explorers. The primary objective is to categorize and subdivide daily deploys into the following categories:
• Total daily deploys
• Daily native CSPR Transactions (regular CSPR token transfers from A to B)
• Daily staking activities
• Daily unstaking deploys
• Daily NFT mintings
• Daily NFT transfers
• Daily NFT burns
• Daily smart contract deploys
• Daily other/unknown transactions
• The required data must be sourced from the Testnet of the Casper Blockchain.
• The newly generated daily datasets should be added to the main datasets in a timely manner and need to be available as .csv files (within 48 hours).
• The available categories should include the following datasets:
o Daily Total deploys
o Daily native CSPR Transactions
o Daily staking activities
o Daily unstaking deploys
o Daily NFT mintings
o Daily NFT transfers
o Daily NFT burns
o Daily smart contract deploys
o Daily other/unknown transactions
• The dataset should provide daily data points.
• Data must be available for download in .csv format and easily accessible. It should be available for at least the year 2023 and, preferably, for the entire blockchain history up to the Genesis block if applicable.
• The downloadable .csv datasets need to have at least the following filters to choose from:
o Last month
o Last year
o All data (until Genesis or at least 2023)
o Select time frame (with a selectable range that goes in the format "from dd.mm.yyyy to dd.mm.yyyy")
• Consider refining the categorization and suggest more granular datasets. Explore the possibility of further subdividing the "other/unknown" category, similar to the smart contract deploys, and clarify the types of transactions it encompasses.
• Provide a detailed description of how the different sections were categorized, the data processing methods employed, and make the source code accessible.
• Optionally, include a graphical representation of the data over time on the explorer's webpage.
500 USD
Documentation
Today, there are no tools available to assist validators in visualizing and analyzing the statistics of their nodes over time. The purpose of this DevReward is to develop a comprehensive dataset that enables validators to access various time series data, such as the daily cumulative staked amount, by utilizing raw data including the number of stakers, total CSPR staked, rewards, delegation, and undelegation.
Granularity: The data should be captured at the end of each block or era, providing a detailed and accurate representation of the node's performance.
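The per-era records described above can be rolled up into the daily time series as follows (field names are illustrative; amounts are in motes):

```python
from collections import defaultdict

def daily_total_staked(era_records):
    """era_records: iterable of dicts with 'date' (YYYY-MM-DD of the era's end),
    'delegated_motes' and 'undelegated_motes'. Returns {date: cumulative net
    staked amount in motes}, carried forward day by day."""
    per_day = defaultdict(int)
    for r in era_records:
        per_day[r["date"]] += r["delegated_motes"] - r["undelegated_motes"]
    total, series = 0, {}
    for day in sorted(per_day):
        total += per_day[day]
        series[day] = total
    return series
```

The same fold works for the other raw inputs (stakers, rewards), each producing one column of the validator's dataset.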
1000 USD
SDK
We have developed a Swift SDK for Casper node version 1.4 for the following methods.
Because the maximum value of a proposal is $1,000, I will split this into 3 proposals, each worth $1,000 and covering 4 methods. This is part 2:
chain_get_block_transfers
chain_get_block
chain_get_era_info_by_switch_block
state_get_item
Deliverables:
The above API methods fully updated to the 1.5 Casper node
Unit tests
500 USD
Documentation
Today, there is a lack of tools available to assist validators in visualizing and analyzing the statistics of their nodes over time. The objective of this DevReward is to create new graphs on websites that display relevant Casper Blockchain data. These graphs will provide a convenient way for node owners and the community to track validator statistics over time. The data displayed will include information such as the number of stakers, total CSPR staked, rewards, delegation, and undelegation. By providing this visual representation of the data, validators and the community will be able to easily monitor and analyze the performance and trends of the validators.
In addition to the existing features of the Casper-Blockchain statistics website, a new section will be introduced on the individual pages of all validators. This section will include graphs that provide valuable insights into various aspects of the validator's performance over time. The following graphs will be included:
Rewards Earned Over Time: This graph will visualize the rewards earned by the validator over a specific period. It will allow validators and the community to track the growth of rewards and identify any patterns or trends.
Total CSPR Staked Over Time: This graph will display the total amount of CSPR staked with the validator over time. It will provide an overview of the validator's staking activity and showcase the growth of the stake.
Delegate Action Over Time: This graph will illustrate the delegation activity associated with the validator over a given timeframe. It will demonstrate the inflow of delegation and highlight periods of increased or decreased delegation.
Undelegate Action Over Time: This graph will showcase the undelegation activity related to the validator. It will help validators and the community understand the periods when undelegations occur and the impact on the validator's stake.
Delegators Over Time: This graph will present the number of delegators associated with the validator over time. It will indicate the growth or decline in the validator's delegator base and provide insights into its popularity within the community.
As a mandatory requirement, the code developed for the Casper-Blockchain statistics website and these additional features will be open source.
1000 USD
SDK
Here is the current state of the SDK; it is 1.4.15-compliant:
https://github.com/abahmanem/casper-scala-sdk
Here are all the methods to update or add for compliance with the latest 1.5.2 release.
We will be updating the following methods to v1.5.2:
"state_get_auction_info",
"chain_get_era_summary",
"rpc.discover"
"state_get_balance",
"chain_get_block_transfers",
Tasks:
Estimated workload: 8 hours
Hourly rate: 125 USD
1- Methods:
"state_get_auction_info",
"chain_get_era_summary",
"rpc.discover",
"state_get_balance",
"chain_get_block_transfers"
updated to v1.5.2 of the Casper node
2- Unit tests updated and running successfully for the above methods
3- GitHub documentation updated
700 USD
NFT
Currently there is no Casper Discord bot that supports Discord administrators/mods in managing their communities. We would like to create this proposal to implement a Casper Discord bot with these features:
400 USD
Other
We are all observing strange behavior from the node uptime estimation tool. About 100 nodes that had excellent performance and uptime (i.e., they hit 700 points in every or most weeks of the year, except for the well-known 3-7 days when the whole network was lagging) suddenly started to lag out of the blue in August (since week #6), so it was no longer possible for them to reach 700, and they were therefore no longer paid at all, or paid very sparingly.
Some nodes look simply surprising: they went through fire and water and did not even sneeze on common laggy days, yet suddenly stumbled every day in August.
The network had common and unavoidable lag days due to events such as:
Jan 28 to Feb 2 - attack event (mention in the group: https://t.me/CasperTestNet/22552);
Feb 21 - 1.4.13 upgrade;
Feb 28 - March 1 - many people lost their LP during an attack on port 8888 (source IPs: 185.234.210.155, 82.1.51.142 and others)
April 12 - /doesn't seem to depend on the host/
May 4 - 1.4.15 Upgrade
June 21 - mass lag of Hetzner servers located in Germany (plus one in Finland)
New events that took place during the period examined:
July 6 - firewall update, whitelisting 3.91.157.200 for scoring tool
July 17 (Q3,week#3) - upgrade 1.5.2
Note: week #5 has more paid nodes, as it includes a grace period (from Jul 31 to Aug 2)
August 1 - firewall update, whitelisting 3.80.27.246 for scoring tool
August 7 - a date with an unknown event, after which many nodes became heavily laggy
On the day of the 1.5.2 update (July 17), many nodes lost longevity points (LP) and some even went offline; some required up to 2 consecutive days for the update, so July 17-18 are not considered lags. This is more a matter of operator negligence and is marked as 'missed upgr'.
The usual causes of lags are a weak server configuration, server/node misconfiguration or oversight, or network problems in different parts of the test network. It is obvious that in August none of these conditions could suddenly arise to such a severe degree that a hundred nodes would change their indicators dramatically.
So in this study, I am going to assume that the "node uptime" metrics produced by the node scoring tool survey do not reflect actual node uptime.
* In any case, the attachment contains a list of all problematic nodes in the network, with comprehensive data for evaluating network performance for any further research. If you need logs from those nodes, we can post the list of needed public keys directly in the testnet Telegram group and explicitly ask their operators to upload them.
The only means of registering those lags is the "Casper Testnet Participant Scores" spreadsheet. But this spreadsheet is not published right at the end of each week, so I (like everyone else in the network) did not have the opportunity to spot abnormalities in time; this study may therefore not contain fully up-to-date data.
In addition, note that I evaluated Validator/KeepUp status as of September 8 and September 12. After I made my suggestion in the testnet Telegram group, many people may have activated their bids in the validator auction, so this data may already be out of date (and indeed there is such movement on the network; you can see the signs of racing here: https://testnet.cspr.live/validators).
Given the long experience of the test network, one lag on a date common to all participants can be considered normal behavior.
So even the July 17 lag will not be taken into account when evaluating a node's performance level (nor will the lags of February 21, May 4, June 21, etc.).
'Performance level' means 'Good' or 'Bad'.
Sources used: Casper Testnet Participant Scores - 2023 Q1, Q2, and Q3 (as of Sept. 6) spreadsheets, plus CNM (https://cnm.casperlabs.io/network/casper-test/detail) as of September 8-12.
How many nodes of each kind were in the network at the moment of the study (~Sept. 10)?
264 - total
214 - 'Good', 'Average', 'bad-good', and 'good-short' nodes (123 Validators; 83 KeepUp or ReadOnly; 8 inactive bid)
192 - Good (scored 700 points in most weeks)
2 - Average (still profitable; counted as 'Good')
10 - bad-good (apparently improved their hardware)
10 - good-short (good, but active less than 3 months - started June-August)
50 - unuseful nodes (lags every week or too-frequent stops; no reliable info)
194 - Good + Average: useful, reliable nodes for evaluating network issues.
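The classification above can be sketched in code. The 700-point threshold comes from the Participant Scores spreadsheet; the ratio cutoffs, the 13-week boundary for 'short', and the function/field names are my own illustrative assumptions, not the exact rubric used in the study:

```python
# Sketch: classify a node by its weekly uptime scores, assuming each node
# is represented as a list of weekly scores (700 = full marks in the
# Participant Scores sheet). Thresholds below are illustrative only.

FULL_SCORE = 700

def classify(weekly_scores, weeks_active):
    """Return a rough performance class for one node."""
    if weeks_active < 13:               # started June-August: under ~3 months
        return "good-short"
    good_weeks = sum(1 for s in weekly_scores if s >= FULL_SCORE)
    ratio = good_weeks / len(weekly_scores)
    if ratio >= 0.9:
        return "Good"
    if ratio >= 0.75:
        return "Average"
    return "unuseful"

# Example: full marks in 10 of 11 recorded weeks, active for 20 weeks
print(classify([700] * 10 + [650], 20))  # prints "Good"
```

Under these assumptions, a node scoring 700 in most weeks lands in 'Good', mirroring the spreadsheet's color coding.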
How many of the reliable 'Good' nodes (i.e., excluding 'short' and 'bad-good') lagged in the different periods?
How many lag events were there?
What are their states?
Note: 'missed upgr' cases will not be counted.
lags in July 6-16
  Validators:   8 nodes,   8 lag events  | deducting '1 lag' cases: 1 node, 1 event
  KeepUp:       2 nodes,   3 lag events  | 1 node, 2 events
  Inactive bid: 1 node,    2 lag events  | 1 node, 2 events
lags in July 17-31
  Validators:  36 nodes, 115 lag events  | deducting '1 lag' cases: 26 nodes, 105 events
  KeepUp:      51 nodes, 167 lag events  | 43 nodes, 159 events
  Inactive bid: 2 nodes,   8 lag events  | 1 node, 7 events
lags in August 1-6
  Validators:   7 nodes,   8 lag events  | deducting '1 lag' cases: 1 node, 2 events
  KeepUp:      10 nodes,  10 lag events  | 0 nodes, 0 events
  Inactive bid: 0 nodes,   0 lag events  | 0 nodes, 0 events
lags after August 7
  Validators:  35 nodes, 343 lag events  | deducting '1 lag' cases: 28 nodes, 335 events
  KeepUp:      52 nodes, 505 lag events  | 49 nodes, 499 events
  Inactive bid: 0 nodes,   0 lag events  | 0 nodes, 0 events
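The tallies above, including the '1 lag' deduction, can be reproduced with a small script. The record format (node key, role, period, lag count) is a hypothetical sketch of the spreadsheet data, not the scoring tool's actual schema:

```python
# Sketch: tally lag events per (role, period) and separately deduct nodes
# with exactly one lag, which the study treats as normal behavior.
from collections import defaultdict

def tally(records):
    """Return {(role, period): (nodes, events, nodes_ex1, events_ex1)}."""
    per_node = defaultdict(int)
    for key, role, period, lags in records:
        per_node[(key, role, period)] += lags
    out = defaultdict(lambda: [0, 0, 0, 0])
    for (key, role, period), lags in per_node.items():
        row = out[(role, period)]
        row[0] += 1           # nodes with any lag
        row[1] += lags        # total lag events
        if lags > 1:          # deduct '1 lag' cases
            row[2] += 1
            row[3] += lags
    return {k: tuple(v) for k, v in out.items()}

data = [
    ("n1", "Validator", "Jul 6-16", 1),
    ("n2", "Validator", "Jul 6-16", 1),
    ("n3", "KeepUp", "Jul 6-16", 2),
]
print(tally(data))
# {('Validator', 'Jul 6-16'): (2, 2, 0, 0), ('KeepUp', 'Jul 6-16'): (1, 2, 1, 2)}
```

The last two numbers in each tuple correspond to the "deducting '1 lag' cases" columns in the tables above.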
39 Validator nodes not classed as bad-performance suffered 443 lag events in July-August ('1 lag' cases excluded).
57 ReadOnly nodes not classed as bad-performance suffered 660 lag events in July-August ('1 lag' cases excluded).
And 2 inactive-bid nodes suffered 9 lag events.
As we can see, whether or not '1 lag' cases are counted, KeepUp nodes have more lags.
How many nodes had no changes at all in July or August? How many of them are Validators? Is there a correlation with hoster or region?
Note: 'missed upgrade' cases are again not taken into account, as they are not due to network faults/events/issues.
65 Validator nodes had absolutely no lags in the whole July-August period (colored green in the spreadsheet)
(and even 2 more Validator nodes are among the 'bad-good' class)
16 ReadOnly nodes had absolutely no lags in the whole July-August period (colored green)
How many of the nodes that lost LP on July 17 lagged / did not lag in the following period? (They are marked 'missed upgr.' in my spreadsheet.)
- 17 Validator nodes lost LP at the 1.5.2 upgrade and had no further lags
- 9 ReadOnly nodes lost LP at the 1.5.2 upgrade and had no further lags
How many of the nodes that kept their LP on July 17 lagged / did not lag in the following period?
- 46 Validator nodes (+ 2 Validators from the 'bad-good' performance class)
- 10 ReadOnly nodes
Does the hoster influence performance?
Problematic nodes are hosted at: Hetzner Germany, Hetzner Finland, Alabanza Finland, and Innowacyjne Rozwiazania Informatyczne (Poland).
In summary, Hetzner Finland nodes show worse results than Hetzner Germany nodes.
AWS - mostly better performance, but the sample is too small.
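A per-hoster comparison like this boils down to a simple group-by over lag counts. The sketch below uses made-up lag numbers for the providers named above, only to show the aggregation:

```python
# Sketch: group per-node lag counts by hosting provider.
# Lag counts here are illustrative, not from the actual spreadsheet.
from collections import defaultdict

nodes = [
    ("Hetzner Finland", 14),
    ("Hetzner Germany", 6),
    ("AWS", 1),
    ("Hetzner Finland", 9),
]

by_hoster = defaultdict(list)
for hoster, lags in nodes:
    by_hoster[hoster].append(lags)

for hoster, lags in sorted(by_hoster.items()):
    print(f"{hoster}: {sum(lags)} lag events across {len(lags)} nodes")
```

With real data per node, the same loop would show whether the Finland-vs-Germany gap holds up.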
Whether or not '1 lag' cases are counted, KeepUp nodes had more lags in the July-August period. This could be related to the scoring tool's IP change, to 1.5.2 itself, or to something else. We can keep monitoring.
* Of course, it is fair to say that Validator operators are generally more responsible and more invested in the testnet, and most likely run better servers (there is more load on a validator), and therefore have fewer lags. But such correlations still hold for the majority.
Are nodes polled while they are finalizing a block?
If so, the node may experience additional load during that time and return a bad result, even though it runs on excellent servers.
And one of the burning questions (one that has no normal explanation in the period under study) is the number of nodes which, despite impeccable performance, lost longevity out of nowhere on July 17 (but it seems you have already fixed this).
And finally, we are currently experiencing a kind of attack: certain IPs (apparently the same ones across the whole network) are spamming some nodes, but not all of them. You are aware of the case, but there is no solution yet.
Thank you for evaluating my work.
If it turns out that the problem is not due to errors in the uptime tool, then the traffic spamming the nodes needs to be analyzed and a more accurate firewall-based defense developed against such an attack.
Create documentation on the principles and details of the node uptime estimation tool, because people wonder what is going on inside the black box.
This attack or bug, whatever it is, is causing me severe financial problems, so I would like some transparency.
With this node analysis you have more free time, and we can request logs from all problematic nodes.