As a researcher who grew up amid punch cards, when computer time was fought over the way gourmets fight for a table at the French Laundry, I have watched the evolution of blockchain technology with awe and admiration. The shift from scarce computing resources to today's abundance is nothing short of remarkable.


The rapid advancement in computer speed, coupled with improvements in the mathematics behind cryptographic proofs, is gradually reshaping networks such as Ethereum. These developments will not only improve scalability but also enable truly decentralized blockchain systems and more powerful smart contracts.

Blockchains today consume a significant amount of computational power, yet they are more centralized and fragile than many people realize. Sophisticated protocols rely heavily on large servers, most of them hosted by a handful of dominant cloud platforms. And we are still in the early stages of building truly advanced smart contracts.

Currently, Ethereum smart contracts are capped at about 24 kilobytes, and many DeFi ecosystems rely on networks of multiple interconnected contracts. It's not unreasonable to envision a future where smart contracts grow to megabytes in size, incorporating advanced features such as embedded machine-learning models or intricate decision trees.

In the future, a 24kb limit on smart contracts may look just as quaint as the 640kb memory limit on early personal computers does today.

To appreciate the future impact of these advances on blockchain systems, it helps to reflect on where they came from: early blockchains were criticized for consuming extensive computational resources, an extravagance to anyone who remembers when computing power was scarce. Saving space was once so important that programmers routinely dropped the first two digits of the year ('85 instead of 1985). A proof-of-work system running vast numbers of parallel processes would have seemed unconscionably wasteful under those constraints.

In those days, computational power was too scarce for everyone to verify everyone else's results. My childhood home was filled with the punch cards my parents fed into machines whose time was as coveted as that restaurant table. Those days are ancient history now; I can no longer program with punch cards, but I did become skilled at folding them into efficient paper airplanes.

Moore's Law, the observation that the number of transistors on a microchip doubles roughly every two years, has carried us a long way from punch cards. Compounded over decades, that doubling yields performance gains that are hard to fully grasp: a chip held about 1,500 transistors in 1970 and nearly 50 billion by 2020.
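The compounding is easy to check for yourself: twenty-five doublings between 1970 and 2020 multiply the transistor count by roughly 33 million. A quick sketch:

```python
# Back-of-the-envelope check of Moore's Law: doubling roughly every
# two years turns ~1,500 transistors in 1970 into ~50 billion by 2020.
start_year, end_year = 1970, 2020
start_transistors = 1_500

doublings = (end_year - start_year) / 2          # 25 doublings
projected = start_transistors * 2 ** doublings   # ~50.3 billion

print(f"{doublings:.0f} doublings -> ~{projected:,.0f} transistors")
```

The projection lands at just over 50 billion, remarkably close to the actual figure, which is why the "law" has held up as a planning heuristic for half a century.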

In blockchain systems, we essentially trade a resource that is now cheap, computational power, for something far more valuable: trustworthy data and verifiable outcomes. Ethereum's growth has expanded this trade into a bustling ecosystem of useful applications, and the evolution is far from over, because Moore's Law, though slowing, remains resilient.

It was long predicted that Moore's Law would hit a wall by the current decade, as quantum effects make circuits unreliable when they shrink too far. That hasn't happened yet. Today's smallest chips use features as narrow as 4 nanometers, and the semiconductor industry has roadmaps down to 0.7nm, which would extend progress well into the next decade. A silicon atom, for reference, is only about 0.2nm wide, so we may finally be approaching a practical limit.

Beyond manufacturing faster chips with more logic, we have also sharpened our mathematics, particularly the intricate proofs that underpin modern blockchains: zero-knowledge proofs (ZKPs). A ZKP is a mathematical tool that lets one party prove a statement is true without revealing the underlying data. That makes it possible to compress many transactions into a single proof without attaching all of the associated data, or, if desired, without revealing anything about those transactions at all.
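To make the idea concrete, here is a toy Schnorr-style proof of knowledge, made non-interactive with the Fiat-Shamir heuristic. This is the classic textbook construction, not the SNARK systems that roll-ups or Nightfall actually use, and the parameters are demo-sized rather than production-safe. The prover demonstrates knowledge of a secret exponent x without ever transmitting it:

```python
# Toy Schnorr-style zero-knowledge proof (Fiat-Shamir, demo parameters).
# Proves knowledge of x with y = g^x mod p without revealing x.
import hashlib
import secrets

p = 2**127 - 1   # a Mersenne prime; far too small for real use
g = 3            # demo base

def challenge(t: int, y: int) -> int:
    # Fiat-Shamir: derive the verifier's challenge by hashing the transcript.
    digest = hashlib.sha256(f"{t}:{y}".encode()).digest()
    return int.from_bytes(digest, "big") % (p - 1)

def prove(x: int):
    y = pow(g, x, p)               # public key
    r = secrets.randbelow(p - 1)   # one-time nonce (never reuse!)
    t = pow(g, r, p)               # commitment
    c = challenge(t, y)
    s = (r + c * x) % (p - 1)      # response; r masks x
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    c = challenge(t, y)
    # Check g^s == t * y^c (mod p), which holds iff s was built from x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

secret = secrets.randbelow(p - 1)
y, t, s = prove(secret)
print(verify(y, t, s))   # True: the proof checks out, yet `secret` never left the prover
```

The verifier learns only that the prover knows some x behind y; a tampered response (say, s + 1) fails the check. Production ZKP systems prove far richer statements than a single exponent, but the shape is the same: commit, derive a challenge, respond, verify.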

ZKPs are indispensable for achieving higher transaction throughput while preserving user privacy. The catch is that they are complex to build and computationally expensive to generate.

In the space of just a few years, ZKPs have gone from proof-of-concept demonstrations to core technologies in the world of blockchain. Part of the credit goes to faster, better, cheaper computers, but it turns out our math skills in this space are improving enormously as well. While nobody has defined a kind of Moore’s Law for ZKPs, our own experience at EY has been very good: the performance of Nightfall, the privacy technology we developed, has improved by a factor of over 10,000 since we unveiled the prototype in 2018.

By pairing better chips with better mathematics, we anticipate significant changes in how blockchains operate. You can already see glimpses of this evolution: zero-knowledge roll-ups and zero-knowledge virtual machines use advanced math and substantial computing power to compress and execute Ethereum transactions. Where Nightfall testing once required expensive server time, the latest version runs on a high-end laptop.

At the current pace of advancement, nearly any device, including a smartphone, may soon be able to act as a blockchain node and process transactions directly rather than handing them off to the cloud. Already, you can generate basic ZKP transactions in a web browser for networks like Zcash. As these capabilities spread, we could see a genuinely decentralized blockchain ecosystem with far less concentration of compute-intensive services.

Another likely improvement is raising Ethereum's 24kb cap on smart contract size. Today, many prominent DeFi services are forced to stitch together several smaller contracts to work around the limit. A larger allowance would let developers streamline services, cut costs, and shrink the attack surface available to hackers.
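The cap in question comes from EIP-170, which fixes deployed contract bytecode at 24,576 bytes. Below is a minimal sketch of the size check a deployment tool might run against compiled runtime bytecode; the function name and interface are my own, not part of any standard tooling:

```python
# EIP-170 caps deployed Ethereum contract bytecode at 24,576 bytes.
# Sketch of a pre-deployment size check over hex-encoded runtime bytecode.
EIP170_LIMIT = 24_576  # bytes

def check_contract_size(runtime_bytecode_hex: str) -> bool:
    """Return True if the contract's runtime bytecode fits under EIP-170."""
    code = bytes.fromhex(runtime_bytecode_hex.removeprefix("0x"))
    size = len(code)
    print(f"{size:,} / {EIP170_LIMIT:,} bytes ({size / EIP170_LIMIT:.0%} of limit)")
    return size <= EIP170_LIMIT

# Hypothetical 24,000-byte contract: close to the ceiling, but deployable.
print(check_contract_size("0x" + "60" * 24_000))   # -> True
```

Contracts that overflow this budget today must be split and wired together with delegate calls or proxy patterns, which is exactly the complexity a higher limit would remove.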

I've been part of countless conversations about re-decentralizing the internet. For years we've looked to blockchains as the guiding light, but the journey hasn't always matched the initial excitement: despite the promises of Web3, much of this world remains heavily centralized.

Note: The views expressed here are solely those of the author and do not necessarily reflect the views of EY or of CoinDesk, Inc. or its owners and affiliates.
