The recent bandwidth issues inspired me to start my own debugging. I'm an OPS guy, and when I'm facing an issue I need to understand it and find the root cause.
The Steem blockchain provides a bunch of raw data, and we can easily get details for any block we want, but we're humans: we can't read and process millions of lines of text on the fly... but... we're better at reading charts ;-)
I've built a system which gathers data from the Steem blockchain and generates dynamic charts with transaction and bandwidth details. The system is built on Open Source software such as Grafana and Graphite, plus a few Python scripts I wrote for parsing and collecting data.
The Steem blockchain's available bandwidth depends on three parameters and is calculated with this formula:
max_bandwidth = number_of_blocks_per_week * maximum_block_size * current_reserve_ratio
- number_of_blocks_per_week is constant: Steem generates 20 blocks per minute * 60 minutes * 24 hours * 7 days = 201600 blocks
- maximum_block_size is set by the top 20 witnesses and is currently 65536 bytes
- current_reserve_ratio is a kind of anti-spam parameter whose value decreases when the average size of new blocks is greater than 25% of maximum_block_size (65536 * 0.25 = 16384 bytes).
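The formula above is easy to play with in a few lines of Python. This is just a sketch of the arithmetic; the reserve ratio value passed in is an example input, not live chain data.

```python
# Sketch of the max_bandwidth formula from the post; the reserve ratio
# argument is an example value, not a number read from the chain.
BLOCKS_PER_WEEK = 20 * 60 * 24 * 7   # 20 blocks/min, for one week = 201600

def max_bandwidth(maximum_block_size, current_reserve_ratio):
    """max_bandwidth = number_of_blocks_per_week * maximum_block_size * current_reserve_ratio"""
    return BLOCKS_PER_WEEK * maximum_block_size * current_reserve_ratio

print(BLOCKS_PER_WEEK)                 # 201600
print(max_bandwidth(65536, 20000))     # bandwidth for a hypothetical ratio of 20000
```

With the 65536-byte block size, everything except current_reserve_ratio is fixed, which is why that one parameter drives the bandwidth swings shown on the charts below.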
Every transaction generates data:
- more transactions = more data,
- more data = bigger blocks,
- bigger blocks = lower current_reserve_ratio,
- lower current_reserve_ratio = lower bandwidth,
- lower bandwidth = MOAR! issues...
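The feedback loop above can be sketched as a toy simulation. The starting ratio and the 1% adjustment step below are invented for illustration only; the real chain uses its own update rule.

```python
# Toy model of the feedback loop: bigger blocks -> lower reserve ratio.
# The 1% shrink step and the recovery step are made-up illustrative values,
# NOT the actual Steem adjustment algorithm.
MAX_BLOCK_SIZE = 65536
TARGET = MAX_BLOCK_SIZE // 4   # the 25% threshold = 16384 bytes

def adjust_reserve_ratio(ratio, average_block_size):
    if average_block_size > TARGET:
        # Blocks above the target: punish by shrinking the ratio by 1%.
        return max(1, int(ratio * 0.99))
    # Otherwise let the ratio slowly recover.
    return ratio + 1

ratio = 20000
for avg_block in (8000, 20000, 30000):   # quiet, busy, very busy
    ratio = adjust_reserve_ratio(ratio, avg_block)
    print(avg_block, ratio)
```

The point of the sketch: as long as the average block size stays above the 25% target, the ratio keeps shrinking every round, and with it everybody's bandwidth.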
Let's take a look at the charts.
This is a snapshot of data I took today. The charts show how many transactions are processed, the current_reserve_ratio, the average block size, and how the Steem blockchain automatically reduces the bandwidth when average_block_size hits the limit.
We can see that the most popular transactions are:
- custom_json (blue) - when we follow/unfollow somebody
- vote (red) - when we upvote a post or comment
- comment (orange) - when we add a comment
The average_block_size hits the limit (red horizontal line set at 25% of max_block_size = 16 kB), after which the current_reserve_ratio value is decreased.
And finally the bandwidth is reduced as well. The lower the value of current_reserve_ratio, the lower the available blockchain bandwidth.
From time to time we see spikes caused by operations like vote, custom_json, and delegate.
More than 100 custom_json operations in a single block (18946277).
Ok... it seems like the click.view user bot is looking for new friends...
117 votes in a single block (18945967).
Dozens of minions upvoting the same post at the same time... probably voting bots.
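Spotting blocks like these is just a matter of tallying operation types. A minimal sketch below counts operations in a block with Python's Counter; the sample block is made up, but it follows the JSON shape a Steem node returns (each transaction holds a list of [operation_name, payload] pairs).

```python
from collections import Counter

# A made-up block in the Steem JSON shape; real data would come from a
# node's get_block RPC call. The transactions here are purely illustrative.
sample_block = {
    "transactions": [
        {"operations": [["vote", {"voter": "alice"}]]},
        {"operations": [["vote", {"voter": "bob"}]]},
        {"operations": [["custom_json", {"id": "follow"}]]},
        {"operations": [["comment", {"author": "carol"}]]},
    ]
}

def count_ops(block):
    """Tally operation types in one block."""
    return Counter(op_name
                   for tx in block["transactions"]
                   for op_name, _payload in tx["operations"])

print(count_ops(sample_block))
# Counter({'vote': 2, 'custom_json': 1, 'comment': 1})
```

Run this over every new block and feed the counters into Graphite, and you get exactly the kind of per-operation charts shown above.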
In the next few weeks/months something will need to be improved to make enough capacity for all the new Steemians; it could be the algorithm itself or the max_block_size parameter set by witnesses. We're hitting a limit which is set at just 25% of the max block size... :)
If you want to track the current Steem blockchain transactions in real time, I've made a public page with the charts for you. Enjoy! ;-)
Please vote for me as witness.
If you think I'd be a good witness, please:
- go to https://steemit.com/~witnesses
- scroll to the bottom of the page
- enter my name jamzed next to the @ symbol and click the VOTE button