Matthew Tyson
Contributing writer

Blockchain breakthroughs: A tech revolution told in whitepapers

Analysis
Jul 14, 2022 | 9 mins
Blockchain | Emerging Technology

Whitepapers are the standard mode of communicating innovation in the blockchain space. It’s something of a feast of technological creativity. Read on for a sampling of groundbreaking web3 whitepapers.


Blockchain is one of the great blindsides in the history of technology.  Major trends around cloud technology, virtualization, and mobile you could see coming, but a novel distributed computing model based on public key cryptography?  That came almost completely out of the blue. 

When the Nakamoto whitepaper dropped in 2008, it unceremoniously set off a ferment of innovation that continues to this day.  It wasn’t entirely unexpected if you were watching the digital currency scene—at the time an even nerdier backwater of the already highly nerdy cryptography frontier.  The paper itself tips its hat to several pieces of prior art, including Adam Back’s Hashcash whitepaper. 

In that light, Bitcoin looks like a sensible progression.  However, even as a natural outcome of compounding creativity, the Bitcoin paper is extraordinary. 

The proof of work (PoW) solution to the double-spend problem is non-obvious even knowing the prior art.  And that idea led to an unpredictable result: the possibility of decentralized, permissionless virtual machines. 

The first shoots in this spreading revolution were put out by Vitalik Buterin in the Ethereum whitepaper, which introduced the basic idea of leveraging a blockchain to build a Turing-complete virtual machine.  Once you have established the viability of a permissionless, compute-enabled network, you get what you might expect: a horde of smart computer scientists and engineers leaping into the space to find new ways of taking advantage of, and improving upon, the possibilities.

In short, we have seen an outpouring of genius.  Obviously there have been some blind alleys and questionable characters at work.  None of that discredits the real groundbreaking work that has been and is being done in the space.  Since the Ethereum virtual machine’s introduction, several promising avenues of progress have been proposed and implemented.  Here’s a look at some of the most prominent.

Ethereum and the virtual machine

If Bitcoin is the root from which the web3 tree has grown, Ethereum is the trunk from which the branches have spread.  Ethereum took the conceptual step of saying: with a system in hand for verifying that transactions are valid, maybe we can build a virtual machine.  There are devils in the details here—implementing such a system is a substantial challenge—but not only is it possible, it also opens up applications with unique characteristics.

In general, these applications are known as dApps, or decentralized applications.  dApps are composed of smart contracts that run on-chain and the traditional web apps that interface with them.

The smart contract is perhaps the best lens for understanding Ethereum’s basic innovation.  Smart contracts are so called because they represent contracts on the network—they specify what counts as a valid and binding agreement for the participants.  (Participants are cryptographically bound to these contracts via their private keys.)  In a sense, the blockchain structure enables code to describe a variety of verifiable agreements between people and systems.  

Smart contracts are “smart” because they are autonomous.  This is the characteristic that really distinguishes them from conventional applications: no intervention by outside entities is necessary for their action.
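
To make the autonomy point concrete, here is a minimal Python sketch of escrow-style contract logic.  It is purely illustrative—not EVM code, not any real contract, and the class and method names are invented for this example—but it shows the shape of the idea: every node runs the same deterministic rules, so funds move when the conditions are met, with no outside intervention.

```python
# Hypothetical, simplified illustration of smart contract logic -- not EVM code.
# Every validator runs the same deterministic rules, so no outside party
# has to intervene for the "contract" to execute.

class Escrow:
    def __init__(self, buyer: str, seller: str, price: int):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.balance = 0
        self.approved = set()

    def deposit(self, sender: str, amount: int) -> None:
        # Only the buyer can fund the escrow, and only with the agreed price.
        assert sender == self.buyer and amount == self.price
        self.balance += amount

    def approve(self, sender: str) -> None:
        # On a real chain, 'sender' would be proven by a signature check.
        assert sender in (self.buyer, self.seller)
        self.approved.add(sender)

    def release(self) -> str | None:
        # Funds move automatically once both parties have approved.
        if self.approved == {self.buyer, self.seller} and self.balance > 0:
            payout, self.balance = self.balance, 0
            return f"pay {payout} to {self.seller}"
        return None
```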

With a generally available network that can enforce contracts, and which participants can join in a permissionless way, many traditional business models are potentially vulnerable to disruption.  As this article describes in detail, finance is just the tip of the iceberg.  Also inherent in Ethereum’s basic design is the possibility of decentralized governance via DAOs (decentralized autonomous organizations). 

Many practical realities must be overcome in realizing the promise of blockchain, and many of the innovations in subsequent projects are targeted at doing just that.

Peercoin and proof of stake

The consensus mechanism is the means by which nodes in the network come to agreement on which transactions are valid.  Bitcoin originated PoW consensus, in which nodes cryptographically demonstrate work by mining for hash values that meet a difficulty target.  This works, but it is energy intensive and acts as a performance bottleneck. The Peercoin whitepaper introduced an alternative mechanism known as proof of stake (PoS). 
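
A toy sketch of the PoW idea in Python (illustrative only—this is not Bitcoin’s actual block format or difficulty scheme) shows why mining is expensive to do but cheap to verify:

```python
import hashlib

def mine(block_data: str, difficulty_bits: int = 20) -> tuple[int, str]:
    """Toy proof of work: find a nonce whose hash falls below a difficulty target."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest  # anyone can re-hash once to verify the work
        nonce += 1

nonce, digest = mine("block containing some transactions")
print(nonce, digest)
```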

The wealth of projects that have since been built using the PoS model is a wonderful testament to its efficacy, but perhaps the greatest endorsement is Ethereum itself moving to a PoS model with Ethereum 2.  Proof of stake opens up possibilities by streamlining the overall operations of blockchain networks.  

Proof of stake works by ensuring validator nodes are vested in the network.  In general, that means establishing that validators hold a certain amount of the crypto token used by the platform to denote value. At the very highest level, you can say proof of stake works by creating an incentive for nodes to behave correctly.  Compromising the network by means of a Byzantine network attack, like a Sybil attack or a 51% attack, will devalue the very coins held by the malicious nodes.  This increases the cost of attacks and acts as a disincentive: it’s simpler and more lucrative to participate in good faith.
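
Here is a hypothetical sketch of the mechanics—not any particular chain’s algorithm, and the names and numbers are invented for illustration.  Proposers are chosen in proportion to their stake, and provably bad behavior puts that stake at risk:

```python
import random

# Hypothetical sketch of stake-weighted validator selection -- not any
# particular chain's algorithm. Nodes with more value locked up are chosen
# to propose blocks more often, and misbehavior puts that stake at risk.

stakes = {"alice": 500, "bob": 300, "carol": 200}  # tokens locked as stake

def pick_proposer(stakes: dict[str, int], seed: int) -> str:
    rng = random.Random(seed)  # in practice the randomness is derived on-chain
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

def slash(stakes: dict[str, int], validator: str, fraction: float = 0.5) -> None:
    # Provably bad behavior (e.g. signing two conflicting blocks) burns stake.
    stakes[validator] = int(stakes[validator] * (1 - fraction))

print(pick_proposer(stakes, seed=42))
```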

PoS eliminates the high energy cost imposed on validator nodes, and it reduces the minimum time nodes need to process transactions.  That is because PoW, by its nature, depends on grinding through difficult computations, which costs both time and electricity.

PoS is not without drawbacks.  Nevertheless, it represents a real innovation: it not only opened up new implementations, but also got people thinking creatively about consensus mechanisms and other fundamentals of “layer 1” technology.

Solana and proof of history

Another breakthrough in blockchain thought is Solana’s proof of history (PoH) mechanism.  Its whitepaper describes a system wherein a verifiable delay function (VDF) is applied to the network, enabling validator nodes to largely sidestep the problem of agreeing on transaction ordering.

A verifiable delay function, like a mining function, proves that it has run by producing a cryptographic output: the hash it emits demonstrates both that the computation happened and that a certain amount of time must have elapsed.  The effect is a kind of cryptographic clock.
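
A simplified Python sketch of the hash-chain idea conveys the flavor (this is illustrative only, not Solana’s implementation).  Because each hash depends on the previous one, the chain cannot be parallelized: producing N ticks takes real time, and anyone can re-run the chain to verify it.

```python
import hashlib

def tick(state: bytes) -> bytes:
    # Each tick depends on the previous state, so the work is strictly sequential.
    return hashlib.sha256(state).digest()

def record_event(state: bytes, event: bytes) -> bytes:
    # Mixing an event into the running hash timestamps it relative to the ticks.
    return hashlib.sha256(state + event).digest()

state = b"genesis"
for _ in range(1_000_000):      # a million sequential ticks of the "clock"
    state = tick(state)
state = record_event(state, b"transaction: alice pays bob")
print(state.hex())
```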

Because Solana’s architecture allows validators to share a single VDF, the network boasts radically faster block verification times.  It’s important to note that PoH by itself is a performance optimization, not a validation mechanism; it must be combined with a consensus mechanism.  In Solana’s case, that is PoS, with the network’s token (SOL) serving as the stake. 

Avalanche and validation neighborhoods

The Avalanche whitepaper introduces an ingenious approach to consensus.  It proposes that nodes can agree upon valid transactions by sampling a small set of the network.  You can think of this as validation proceeding against a ‘flash’ subnet.  As the paper says, “Each node polls a […] randomly chosen set of neighbors, and switches its proposal if a supermajority supports a different value.”

This simple-seeming idea is well suited to the conditions of a distributed network.  It lowers overhead for nodes because they don’t have to refer to the entire network to be assured they have a valid copy of the blockchain state.  At the same time, the interconnected operation of all the different validator neighborhoods means the overall network state converges toward a valid consensus. 
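
A toy simulation in the spirit of the sampling idea (illustrative only; the parameter names and values here are made up, not Avalanche’s) shows how repeated random polls pull a split network toward agreement:

```python
import random
from collections import Counter

K = 10              # sample size per poll
ALPHA = 8           # supermajority threshold within a sample
ROUNDS = 20         # polling rounds

def poll_round(my_value: str, network: dict[str, str], rng: random.Random) -> str:
    sample = rng.sample(list(network.values()), K)
    winner, count = Counter(sample).most_common(1)[0]
    # Switch preference only if a supermajority of the sample disagrees.
    return winner if count >= ALPHA and winner != my_value else my_value

# A network of 1,000 nodes initially split between two conflicting values.
rng = random.Random(7)
network = {f"node{i}": ("tx-A" if i < 550 else "tx-B") for i in range(1000)}

for _ in range(ROUNDS):
    network = {n: poll_round(v, network, rng) for n, v in network.items()}

print(Counter(network.values()))   # the slight initial majority tends to win out
```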

Avalanche’s whitepaper is also notable for explicit and clear statements of two other principles that have gained traction.  The first is the idea of creating a “network of networks,” wherein the underlying network enables a variety of networks that can operate independently or, when desired, interdependently via the chain’s token (AVAX).  The example given in the paper is of one subnetwork that handles gold contracts and another that handles real estate.  The two operate independently unless someone wants to buy real estate using their gold, at which point AVAX becomes the medium of exchange.

The second idea Avalanche articulates well is self-governance.  In short, the protocol builds in the ability for nodes to alter the parameters of its operation.  In particular, member nodes can control staking timeframes, fee amounts, and minting rates. 

Internet Computer and partial synchrony

The Internet Computer project is founded on a whitepaper that introduces a new mechanism for combining beneficial characteristics of conventional and blockchain networks, thereby “obtaining the efficiency of a permissioned protocol while offering many of the benefits of a decentralized PoS protocol.”

This is done by treating the overall network as a collection of somewhat independent subnets.  For liveness, each subnet operates as a permissioned network that depends on a centralized PKI (public key infrastructure).  However, the context in which these subnets run is governed by a DAO.  That is, the protocol, the network topology, and the PKI itself are all under the control of the decentralized network.

This enables efficient transaction processing without sacrificing safety.  The paper calls this partial synchrony: each subnet operates for a defined period as a synchronous network, which lets it make rapid progress, and the subnets then collaborate asynchronously to confirm the validity of that progress.  This rests on the explicit assumption that less than ⅓ of the network is participating in a Byzantine-style attack—a common threshold in distributed computing.  The overall asynchronous network is then tuned to preserve safety and resilience, in harmony with the subnets being tuned to maximize throughput.
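
The ⅓ figure comes from classic Byzantine fault tolerance results rather than anything specific to this paper; a quick sketch of the arithmetic:

```python
# Classic Byzantine fault-tolerance bound: with n replicas and f faulty ones,
# safety requires n >= 3f + 1, and decisions need a quorum of 2f + 1 votes
# so that any two quorums overlap in at least one honest node.

def max_faulty(n: int) -> int:
    """Largest f such that n >= 3f + 1, i.e. strictly less than a third of n."""
    return (n - 1) // 3

def quorum(n: int) -> int:
    """Votes needed so any two quorums overlap in at least one honest node."""
    return n - max_faulty(n)

for n in (4, 7, 13, 28):
    print(f"n={n:>2}  tolerates f={max_faulty(n)}  quorum={quorum(n)}")
```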

Ongoing innovation

While we’ve covered a lot of ground here, there are other intriguing whitepapers, and more being proposed.  It’s a fascinating space to watch, with some daring and mind-expanding thinking going on.