Why Bitcoin is not sustainable and a better alternative to the proof-of-work algorithm. Part I
In a decentralised system like Bitcoin, the proof-of-work algorithm is at the core of the solution.
The whole point of "Satoshi Nakamoto's" paper was to explain a way to prevent the double-spending problem.
In order for a miner to approve a new block of transactions and claim the reward, the proof-of-work has to be solved: the miner must provide proof of having found the answer to a computationally hard puzzle.
To illustrate what a proof of work means, let me show an example:
Find a <proof-of-work> value such that the first 10 digits of the resulting hash are 0:
HashFunction(<Block's Hash> <proof-of-work>) = 0000000000YYYYY…
The only way to find the proof-of-work is to try combination after combination until the miner finds the right one.
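To make the brute-force search concrete, here is a minimal sketch in Python. This is not Bitcoin's actual mining code: the hash function (plain SHA-256 over a string), the input format, and the difficulty measured in leading zero hex digits are all simplified assumptions for illustration.

```python
import hashlib

def find_proof_of_work(block_hash: str, difficulty: int) -> int:
    """Brute-force a nonce so that SHA-256(block_hash + nonce)
    starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_hash}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # this nonce is the proof-of-work
        nonce += 1

# Low difficulty so the demo finishes quickly; Bitcoin's real target is vastly harder.
nonce = find_proof_of_work("deadbeef", 4)
print(nonce)
```

Each additional leading zero multiplies the expected number of attempts by 16, which is exactly why real mining burns so much energy: there is no shortcut, only trial and error.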
Today, this takes a lot of power. And over time the proof-of-work gets tougher and tougher to find, as the network adjusts the difficulty upward.
A LOT of power is used to run the Bitcoin network: by 2020, Bitcoin could use as much energy as Denmark.
We are talking about energy that is being used to calculate cryptographic hashes, energy that is in some way misused. And because of the heat the processors produce, fire can be an issue for many miners...
Unfortunately, the most popular cryptocurrencies out there also use the same proof-of-work algorithm (yes, even Dogecoin).
So, let's forget for a second the other issues that Bitcoin has: the blockchain's increasing size, which induces delays in transaction approval; mining pools reaching more than 50% of the mining power; etc.
What if we could imagine a crypto-currency that is not based on the proof-of-work algorithm?
There are many alternatives that, from my point of view, are not yet fully exploited: proof-of-stake, proof-of-activity, proof-of-burn…
I will not explore these notions in this article. Instead, I would like to propose a new solution to the double-spending problem by asking two questions:
What if we could know exactly when each transaction occurred?
What if we could accurately and securely TimeStamp every transaction in the Network?
Then the double spending problem would be solved. No need for the proof-of-work, no need for mining, no need to waste tons of energy.
Two transaction events cannot take place at exactly the same time: if a node tries to double-spend a Bitcoin, the Network would only validate the first transaction and refuse the second.
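A minimal sketch of that first-timestamp-wins rule, assuming (and this assumption is the whole open problem) that every transaction already carries an accurate, tamper-proof timestamp. The `Tx` structure and its field names are hypothetical, invented for this illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tx:
    coin_id: str      # the coin being spent
    recipient: str
    timestamp: float  # assumed globally accurate and tamper-proof

def validate(txs: list[Tx]) -> tuple[list[Tx], list[Tx]]:
    """First-timestamp-wins: only the earliest spend of each coin is accepted."""
    accepted, rejected, spent = [], [], set()
    for tx in sorted(txs, key=lambda t: t.timestamp):
        if tx.coin_id in spent:
            rejected.append(tx)   # later spend of the same coin: double spend
        else:
            spent.add(tx.coin_id)
            accepted.append(tx)
    return accepted, rejected

ok, bad = validate([
    Tx("coin-1", "alice", 100.0),
    Tx("coin-1", "bob",   100.5),  # attempted double spend, half a second later
])
print(len(ok), len(bad))  # 1 1
```

Note that the entire security of the scheme collapses into the trustworthiness of the `timestamp` field, which is exactly the question the rest of the post raises.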
Let's call it UTCoin (Universal Time Coin)
Indeed, in order for this to work, "time" has to be somewhat absolute.
UTCoin could be the answer to all environmental problems linked with the proof-of-work algorithm.
Of course, UTCoin has many limitations and issues that need to be addressed before even considering creating this type of coin.
For instance, the biggest problem is: how do you establish absolute and accurate TimeStamps for every transaction? How do you prevent TimeStamp fraud?
I have been working on this for several months, and I have several ideas in mind to solve the main issues.
I have been loving Steem since I discovered it and I decided it will be the first place I will post my humble research.
Since Steem is all about the power of the community, let's try to figure this out together.
How can we TimeStamp every transaction on the Network without a central entity?
In a week, I will write a part II explaining my ideas and include some ideas from the comment section.
Thanks again for being part of this community, let's try to figure this out together!
Upvoted you
Nice article! What if some sort of trust value could be applied to the nodes in order to make the TimeStamps more trustworthy?
This is possible; the concept has already been explored, and it might work. However, I think that in order for this to work you need to establish personal accounts... If not, a hacker could create multiple accounts and exploit the trust algorithm (a Sybil attack).
The conversion of fuel into electricity into PoW hashes acts like a 'battery charger' for putting intrinsic value into each unit of a cryptocurrency.
I contend that complaining about PoW algorithms 'wasting' energy is roughly the equivalent of complaining about the first combustion engines wasting fuel. It's a subjective argument that ignores the value added to the broader economic and financial ecosystems.
Good topic to debate!
I like the anonymous aspect of Bitcoin; I would never want everything to be traceable, time-stamped, etc. Definitely a post to debate about. Happy Steeming!
I agree with you, but the fact that all transactions are timed does not mean that everything will be more traceable than in Bitcoin's algorithm.
We are just talking about adding a TimeStamp to every transaction.
Right, in some way, you need energy to run things. I like the 'battery charger' analogy.
There is also a lot to improve in the traditional economic system, but the direction seems to be blocked....
The crypto-currency trend is just getting started, and we still have the chance to do better than bitcoin before it generalizes.
In theory, every miner could validate a block within milliseconds; the PoW is only there to prevent fraud and to randomize which node validates each block.
For example, look at what Google did with CAPTCHAs:
"....characters from Street View images are appearing in CAPTCHAs to improve Google Maps with useful information like business addresses and locations. Based on the data and results of these reCaptcha tests, we'll determine if using imagery might also be an effective way to further refine our tools for fighting machine and bot-related abuse online."
CAPTCHAS are not intrinsically very useful, but there is always room to improve.
As for any blockchain endeavor, there is much to cover at an early stage. In this case, identification based on timestamps is the single most crucial factor, so I will only comment on that, and I will provide a rather theoretical view.
Identifiability
Mathematically speaking, there is a foundation for this. Say our stamps take values in (a convex subset of) the reals; then two distinct events can always be given distinct stamps, because the reals are dense: between any two distinct reals there is always another real.
Uniqueness
That property also leads to a problem. A real number's decimal representation can have an infinite sequence of digits to the right of the decimal point, and two distinct reals can be arbitrarily close to each other (see Cauchy sequences). Our computational systems, however, have clear limits on the number of digits they can express, so at some point two distinct real timestamps become identical once rounded to machine precision.
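This rounding effect is easy to see with standard 64-bit floats: near the magnitude of a Unix timestamp, differences below roughly 2^-22 seconds (about 0.24 microseconds) are simply not representable. The specific timestamp value below is an arbitrary choice for illustration.

```python
t1 = 1_600_000_000.0   # a Unix timestamp, in seconds
t2 = t1 + 1e-9         # one nanosecond later: a genuinely distinct instant

print(t1 == t2)        # True: the nanosecond is lost to rounding
print(t1 + 1e-6 == t1) # False: a microsecond still fits at this magnitude
```

So a float-of-seconds representation already collapses nanosecond-distinct events into one value; any timestamp-based scheme would need a wider fixed-point or integer encoding, and even then some finite resolution.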
Further acknowledgement
Our timestamps cannot actually take values in the reals. In basically every well-known theory, spacetime (as a generalization of time) is treated as continuous and linear, such that between the coordinates of any two events further coordinates may exist as well. The question of how far we can go in splitting and stretching the resolution, i.e., how many digits these coordinates can have, leads into deep water. Many will agree that time intervals are ultimately restricted by Planck time, the smallest meaningful unit of time (about 10^-44 seconds). At that resolution, the laws of physics probably don't add up, so even for this purely theoretical exercise it is not at all clear whether two events can in fact be uniquely identified based on time. Luckily, we will probably never achieve an observation level at which we have to concern ourselves with that. So as for any realistic bottlenecks, we will have to find a resolution of time units at which the (strictly positive) probability of two events coinciding at an identical timestamp is still acceptable, while the storage of this high-precision data remains manageable. If that defines a theoretical bound within which to operate, I believe there is surely leeway. However, I would say that a set of predefined rules on how to proceed when such an event happens has to be in place, possibly together with features that split any transaction into a sequence of small transactions, such that two coinciding transactions would have very small economic significance.
According to Prof. Einstein and co., there is no universal moment of 'now', and no two events separated by even a tiny amount of space can be considered to have occurred simultaneously.
Seriously, this idea would be far too dependent on very accurately synchronised clocks, and unworkable because, even if we expressed the exact time down to every one of the nine-billion-odd oscillations of the caesium-133 atom that occur each second, there is still a risk that two transactions would happen at the same apparent moment.
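That risk can be put in rough numbers with the birthday-problem approximation. The throughput figure (7 transactions per second, roughly Bitcoin's) and the assumption that arrival times are independent and uniform within each second are mine, for illustration only:

```python
import math

def collision_probability(n_events: int, n_slots: int) -> float:
    """Birthday approximation: P(at least two of n_events share one of n_slots)."""
    return 1.0 - math.exp(-n_events * (n_events - 1) / (2.0 * n_slots))

ticks = 9_192_631_770     # caesium-133 oscillations per second (the SI second)
txs = 7                   # assumed network throughput, transactions per second

p_second = collision_probability(txs, ticks)
p_year = 1.0 - (1.0 - p_second) ** (365 * 24 * 3600)
print(f"{p_second:.2e} per second, {p_year:.1%} per year")
```

Under these assumptions the per-second chance is tiny, but compounded over a year of continuous operation it grows to a few percent, so a tie-breaking rule would be needed regardless of clock resolution.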