Smart contract security
In light of all the attacks on smart contracts this year, you can't help but wonder how many of them could have been avoided. Despite the heavyweight cryptography at the centre of this technological revolution, the 'fatal' flaws are always human-made. Given the general state of information security in the world, it shouldn't really be a surprise that there are problems within the cryptocurrency community too.
Better processes and better tooling can help, but it is a mistake to think that all security issues could have been avoided with better planning. Everything is easy in hindsight, which is why we should use the lessons learned from each problem and attempt to make improvements for the future.
The security problems so far
Vitalik wrote a blog in June last year which covered the larger and better-known smart contract problems at the time. Since then we've seen a few more: most notably the multi-signature Parity wallet attack, and also an attack on one of Parity's libraries which resulted in funds being frozen (including the funds from the Polkadot fundraiser). Vitalik did cover a few of the solutions in his post, but it is clear that there is still a lot of improvement needed.
This article mainly covers smart contracts but the methodologies and ideas pertain to software security in general which ultimately includes all blockchain projects.
What can the community do to improve?
Two obvious places to improve are:
- Better processes
- Better tools
Improving processes within the crypto community would involve adhering to an already established software development methodology which bakes security in from the beginning.
Improving the tool set involves creating good tools to assist with code-writing practices. Once these tools exist, some effort is then required to promote them.
Security Development Lifecycle
Part of the problem of security is developing good habits, such that you create anything with security in mind from the beginning and then continue to keep security in mind until end-of-life. Knowledge of "best practices" helps, along with a mature development methodology. A lot can be said on this topic, but I shall leave the following links as further reading for those who may find them useful. For most people I suspect it will make for dull reading!
- Smart Contract Best Practices (ConsenSys) - A set of good practices for writing smart contracts.
- Best Coding Practices (Wikipedia) - Good general advice on coding, but with less emphasis on security.
- Microsoft's Security Development Lifecycle (MS SDL) - When Microsoft decided to get serious about security, they came up with the SDL.
Code scanners
A useful tool in a security engineer's toolbox is a code scanner (see: automated code review). Such tools perform static code analysis and try to find known bad patterns in code.
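To give a flavour of the idea, here is a minimal Python sketch of pattern-based static analysis. The patterns and warning texts are illustrative examples I've chosen, not the rule set of any real scanner; tools like Securify work on much richer representations, including EVM bytecode.

```python
import re

# Illustrative "known bad" Solidity patterns (not a real tool's rule set).
BAD_PATTERNS = {
    r"\btx\.origin\b": "tx.origin used for authorisation (phishable)",
    r"\.call\.value\(": "low-level call with value (re-entrancy risk)",
    r"\bselfdestruct\b|\bsuicide\b": "contract can be destroyed; check access control",
}

def scan(source: str):
    """Return (line_number, warning) pairs for each pattern match."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, warning in BAD_PATTERNS.items():
            if re.search(pattern, line):
                findings.append((lineno, warning))
    return findings

# A contrived vulnerable snippet to scan.
snippet = """
function withdraw() public {
    require(tx.origin == owner);
    msg.sender.call.value(balance)("");
}
"""
for lineno, warning in scan(snippet):
    print(f"line {lineno}: {warning}")
```

The obvious limitation is that line-by-line pattern matching has no understanding of control flow or state, which is why serious scanners analyse the compiled bytecode or an abstract representation of the program instead.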
Securify
Securify is a free online tool that anyone can use in their web browser. It performs analysis on Solidity code as well as EVM bytecode.
The website is here: http://securify.ch/
Solidity Function Profiler
In addition, Eric Rafaloff at Gotham Digital Science wrote an interesting article on the recent Parity hack and believes that function profiling would have spotted the vulnerability: [Reviewing Ethereum Smart Contracts](https://blog.gdssecurity.com/labs/2017/9/27/reviewing-ethereum-smart-contracts.html). While I did say that security is always easy in hindsight, Eric does make a number of good points in his conclusion:
We have seen that applying a simple code review technique of profiling an application would have likely caught this vulnerability early on. Knowledge of the Solidity language and the EVM is required, but these can be picked up by consulting documentation, known pitfalls, and open source code bases. The underlying code review methodology stays largely the same.
GDS are a highly respected name in the information security field, so I would treat this as solid professional advice. Eric has also provided his tool for free on GitHub: Solidity Function Profiler.
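The core idea of function profiling is simple enough to sketch: list every function in a contract alongside its visibility and modifiers, so that an unprotected, publicly callable function stands out. The following Python sketch is my own toy illustration of that idea (not Eric's tool), using a regex over Solidity source; the `initWallet` header below is shaped like the vulnerable Parity function, which had no visibility keyword and no access-control modifier.

```python
import re

# Matches a Solidity function header: name, argument list, and the
# trailing words before the body (visibility, mutability, modifiers).
FUNC_RE = re.compile(r"function\s+(\w+)\s*\(([^)]*)\)\s*([^{;]*)")

def profile(source: str):
    """Return one row per function: name, visibility and other modifiers."""
    rows = []
    for name, _args, tail in FUNC_RE.findall(source):
        words = tail.split()
        visibility = next(
            (w for w in words if w in ("public", "external", "internal", "private")),
            "public (default)",  # pre-0.5.0 Solidity defaults to public
        )
        rows.append({
            "name": name,
            "visibility": visibility,
            "modifiers": [w for w in words if w != visibility],
        })
    return rows

# Contrived contract: initWallet has no visibility and no modifier,
# mirroring the shape of the Parity bug.
contract = """
function initWallet(address[] _owners, uint _required) {
    initMultiowned(_owners, _required);
}
function kill(address _to) onlymanyowners(sha3(msg.data)) external {
    suicide(_to);
}
"""
for row in profile(contract):
    print(row)
```

Reading the resulting table, a reviewer would immediately ask why an initialisation function is publicly callable with no access-control modifier, which is exactly the question that would have caught the Parity flaw early.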
Mythril - an Ethereum disassembler / blockchain exploration & analysis tool
Bernhard Mueller, a software security professional, has also created a tool for analysing Solidity code. He introduced the tool in his first article on Hacker Noon, but also wrote an accompanying blog piece after the most recent Parity hack: [What caused the latest $100 million Ethereum smart contract bug](https://hackernoon.com/what-caused-the-latest-100-million-ethereum-bug-and-a-detection-tool-for-similar-bugs-7b80f8ab7279).
This is a reflection on the recent multi-sig flaw in the Parity contract and has some comments about how fuzzing could have perhaps helped to find the flaw. From the article:
As it happens, just a couple of weeks ago I wrote about using symbolic analysis to detect unprotected SUICIDE instructions.
Symbolic analysis is just one of the techniques mentioned by Vitalik in the blog I linked above. Progress is being made, but these tools need greater visibility among blockchain development teams.
The tool Bernhard created is available for free on his GitHub repo.
Fuzzing
I actually thought Mythril did fuzzing, but upon second reading I think I may have been mistaken. Fuzzing throws unexpected inputs at a piece of software to see if it breaks in unexpected ways. Where there is an error, there is often a security problem. It is a blunt approach, but one which can yield interesting results. For now it appears that no such tool exists, unless I've misread the capabilities of the tools above.
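The mechanics of fuzzing are easy to demonstrate in miniature. The Python sketch below fuzzes a deliberately buggy parsing function I made up for illustration; a real smart contract fuzzer would instead generate transactions against a deployed contract, but the principle of feeding in generated inputs and recording crashes is the same.

```python
import random
import string

def parse_amount(text: str) -> int:
    """Toy target: parses a decimal amount. Deliberately buggy - it
    assumes the input always contains exactly one '.' separator."""
    whole, frac = text.split(".")  # crashes on 0 or 2+ separators
    return int(whole) * 100 + int(frac)

def fuzz(target, runs=1000, seed=42):
    """Throw random strings at `target`; collect inputs that crash it."""
    rng = random.Random(seed)
    alphabet = string.digits + "."
    crashes = []
    for _ in range(runs):
        candidate = "".join(
            rng.choice(alphabet) for _ in range(rng.randint(0, 8))
        )
        try:
            target(candidate)
        except Exception as exc:
            crashes.append((candidate, type(exc).__name__))
    return crashes

crashes = fuzz(parse_amount)
print(f"{len(crashes)} crashing inputs found, e.g. {crashes[:3]}")
```

Even this blunt random generation finds the bug almost immediately; smarter fuzzers add coverage feedback and input mutation to reach deeper code paths.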
Nick Johnson, a software engineer for Ethereum, commented on Reddit that it would be great to see such a tool: comment on /r/ethereum.
From what I can see, the QuantStamp team are working on a number of security auditing tools, one of which will be a fuzzer. They have tools on their GitHub repo, but I didn't see a fuzzer there yet.
Security is easy in hindsight, but it is also possible to learn from past mistakes and build better processes to avoid such errors in the future. Part of building a better and more secure future requires teams to enhance their development processes, upskill team members, and make the best use of available tools to automate code auditing. I'm hopeful that we will see such improvements in the future.