
# I. Introduction {#introduction}

"I imagined the web as an open platform that would allow everyone, everywhere to share information, access opportunities and collaborate across geographic and cultural boundaries." Tim Berners-Lee [59]

## Background

The web started [18] as a way to share research and communicate between independent entities. Over time, as value was created and unlocked, the implementation of the web took a direction that favored centralization [56]: efficiency over resilience, and profit and control of markets over digital liberty. Walls have been built everywhere to keep users in, digital boundaries are hard to cross, access to opportunities is limited, a few control [48] what information we can share, how we share it, and how we find information in the first place, and the online individual is easily tracked and compromised [68]. The current web is in trouble.

"Catalini and Gans defined "the cost of networking" as a problem associated with the increasing market power of the FANG group. Reduction of that cost will lead to disentanglement of the network effect benefits from market power." The Blockchain Effect [55], Catalini & Gans [53]

Market power on the web is concentrated in a handful of corporations and their platforms. This has been achieved by building a platform and a data moat, in which the users and their data are aggregated in a single, massive database. The data is accessible to users and developers only through a proprietary application UI, but it is owned and controlled by the platform. Capturing the individual user, attracting their contacts, and limiting competitive external developer access, so that users can't transact outside the system, creates substantial network effects: a powerful incentive for centralization.

This consolidated market power has led to a situation where it's hard for new businesses to compete [66] with the incumbent platforms. New businesses can't access users or their data outside the major platforms, and very few can contend with the resources the incumbents have. Moreover, new businesses can only reach users and data by going through the incumbents, which further entrenches the platforms' control over a) their users, b) who can access those users, and c) what the rules for access are.

This is a massive risk to entrepreneurial businesses, as the rules can change at any point, which in turn reduces the number of new entrants, innovation [52], and competition. The major platforms face no pricing or service quality pressure, and they charge higher prices [31] for lower quality services. Any new business submitting to a platform can be shut down without notice. For the end user, this ultimately means fewer options to choose from and lock-in. Centralization on the web has led to data and market monopolization, with all the ills that it creates for the market itself.

At the same time, the vast amount of data has become a liability: data gets stolen or otherwise compromised every so often. For good reason, regulatory bodies are increasing their demands on the major platforms to take care of users' privacy. At other times, however, the same regulatory bodies want, and get, access to the platforms and their users' data. This makes the centralized platforms a lucrative environment for mass-surveillance and an easy target for censorship.

The situation is unsustainable and ultimately prevents innovation from happening, limiting the potential of the web. In order to protect users, enable a fair market, and encourage innovation and growth, a paradigm shift from centralized to decentralized models is needed.

The paradigm shift is achieved by reversing the authority.

Instead of platforms acting as gatekeepers between users and their data, and between users and other services, in the reversed model the users own their data and control who or which application can access it. While in the current model data is centralized around the platforms, in the decentralized model the data is "centralized" around the user.

"In a network of autonomous systems, an agent is only concerned with assertions about its own policy; no external agent can tell it what to do, without its consent. This is the crucial difference between autonomy and centralized management." Burgess [6]

Instead of users asking the application if they can access and use it, the applications ask [47] the user for permission to access their data. Instead of one big database for all users, there is a practically unlimited number of small databases, many for every user and application. Instead of having a separate account in thousands of services, self-certified identities and user profiles work across all applications. Blog posts, activity feeds, and friend lists are owned by the users, who decide which service and user interface they use to access and operate on them. The user can allow their data to be used by multiple applications simultaneously, making the data re-usable and interoperable between applications and networks. Keeping local copies [33] of small databases and operating on them locally [36] is efficient and makes the user experience feel instantaneous. Applications work in disconnected environments by default.
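As a minimal sketch of this reversed flow, consider a hypothetical capability-style API (all names below are illustrative assumptions, not an existing library): the application never receives the database itself, only a scoped, user-signed grant that the user can issue and revoke.

```typescript
// Hypothetical types illustrating user-controlled data access.
// None of these names come from an existing library; they sketch
// the reversed-authority model described above.

type Permission = "read" | "write";

interface AccessGrant {
  database: string;      // e.g. "alice/activity-feed"
  permissions: Permission[];
  issuedTo: string;      // the application's public key
  signature: string;     // signed by the user's key, so any peer can verify it
}

interface UserAgent {
  // The *application* asks the *user*, not the other way around.
  requestAccess(database: string, permissions: Permission[]): Promise<AccessGrant | null>;
}

async function openFeed(user: UserAgent): Promise<void> {
  // The app requests read access to a user-owned database.
  const grant = await user.requestAccess("alice/activity-feed", ["read"]);
  if (grant === null) {
    // The user declined; the app never sees the data.
    return;
  }
  // With the signed grant, the app can replicate a local copy of the
  // database and operate on it offline; other apps may hold grants
  // to the same database simultaneously.
  console.log("granted:", grant.permissions.join(", "));
}
```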

Instead of requiring permission from the platforms to reach users, businesses can communicate with users directly and request access to their data. They can talk to other services through a unified common language and create new services by composing other services and data, creating emergent value and new business opportunities. An environment where businesses don't need to build and operate expensive infrastructure to acquire and keep users at high cost allows them to focus on their core offering instead of building a platform and data moat.

Because the data, users, and applications are decoupled, and because everything is cryptographically verified, developers can build and compose applications and services swiftly and fearlessly, and compete on user experience, algorithms, data insights, service quality, price, ethical values such as respect for privacy, and more. Because the rules and access to users are not dictated by the platforms, businesses can move faster to and in the market, and even small players can enter, compete, and seize opportunities.

This level playing field creates a new wave of innovation and decentralized applications, services, and business models. Ultimately, it is the end users who benefit from more diverse service and application offerings, better user experience, improved privacy, and lowered prices. The decentralized web will be open [37], free (as in freedom), creative, and fun.

## A Glimpse of Hope

The rise of crypto-networks has created a new wave of interest in technologies that enable decentralization. Bitcoin [12] and Ethereum [20] have led the way, and as a whole we've built a lot in the past few years. We've developed technologies that can be used as building blocks for decentralized systems, such as IPFS [28] and libp2p [32], and most of these new technologies, protocols, and systems are open source. A variety of new cryptocurrency-based business models and digital governance models have been created. In a short time, systems that weren't possible to build before have been conceived, implemented, and deployed.

At the core of many crypto-networks is a global ledger. The global ledger works as a "single global truth", which means that all transactions between network participants are recorded on the same global ledger. This creates a bottleneck: requiring everyone to synchronize with everyone else reduces the maximum throughput of the network as a whole. Many projects have tackled improving the overall throughput of a network, but a single stream can only flow so fast.

Most ledger-based networks have a programming interface for programming the network with "smart contracts". Smart contracts are programs that are run in the network by the network participants. With them, we can create decentralized applications (dApps), make payments, run business logic, implement new protocols, and more. However, most smart contract languages and execution environments (that is, the ledgers they run on) are not compatible with each other [69]. This means that developers need to write their programs in a platform-specific language, effectively deciding up front in which network they wish to run their programs or services. This fragments the plethora of networks and creates an obstacle to interoperability between them. In this regard, most of the current blockchain systems create a technological silo for developers: you have to choose which platform to bet on before even building the application, because switching platforms means rewriting the application.

Furthermore, because smart contract programs are tied to the underlying crypto-network and ledger, they preclude the possibility of freely developing on open, non-cryptocurrency-based building blocks. That is, if a program is built on a specific blockchain platform, coins, tokens, or payments are required, so developers and users have to sign up with that network and acquire its tokens.

This reminds us of the problem we have with centralized platforms and, on a broader level, doesn't seem to align with the original motivations that created the whole field.

At the same time, we've become painfully aware of the realities of the centralized platforms and have started countermeasures through regulation, such as the GDPR [21] in the European Union, and even calls to break up the platform owners.

These opposing forces to centralization, especially the people and societies waking up to question the level of privacy and security the platforms offer, are a glimpse of hope. Fundamentally though, the problems still exist.

## Motivations

To build and deploy infrastructure that can realize the full potential of decentralized networks, reverse the authority and decouple data from network effects [55], we think the following aspects need to be addressed:

- Asynchronous message passing
- Eventual consistency [23]
- The ability to structure large networks as smaller networks
- A decoupling of computation from the platform
- Usability in disconnected or disrupted environments

The biggest limitation of the current blockchain networks is the requirement for a single, global source of truth, the ledger, which forces every participant to constantly synchronize and keep the global state in order to transact on it. While the motivations for doing so are understandable (strong consistency guarantees, crypto-economic incentives, etc.), it makes the network as a whole unbearably inefficient to operate.

To let infinitely diverse, offline-first applications be built on the decentralized web, the underlying protocols cannot be built on a single token or network, or be dependent on them. Rather, the core protocols need to be flexible enough to build such networks on top, without requiring a payment to use them. We need building blocks, not chains. The networks and protocols that require a single, global ledger are still needed and complementary to the core protocols, but the baseline needs to be as flexible, available, and efficient as possible, so that data can be decoupled from the underlying platform.

The real world is asynchronous: events occur and "things happen" disconnected from each other, at "some point in time", but rarely do we observe them at exactly the same time. All forms of consistency and consensus are in fact eventually consistent; messages are passed between participants according to a predefined set of rules (the protocol), and consensus is reached only when the required messages have been exchanged and the steps of the protocol have been executed. While the end result, the consensus, is a form of strong consistency and requires synchronization, the underlying mechanisms are asynchronous. Messages are sent and "at some point in time" they are received and acknowledged by the receivers. From this perspective, we need to consider eventual consistency as the baseline consistency level for all data and applications. Building on asynchronous message passing and eventual consistency, we can construct stronger consistency guarantees [35] as needed.
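To make this concrete, below is a minimal sketch of a grow-only counter, one of the simplest eventually consistent data types. The code is illustrative TypeScript with made-up names, not any particular library: replicas accept writes locally, exchange state asynchronously in any order, and converge to the same value once all messages have been delivered.

```typescript
// A grow-only counter (G-Counter): each replica tracks one slot per
// replica id, and merging takes the element-wise maximum. Merges are
// commutative, associative, and idempotent, so message order and
// timing don't matter: replicas converge eventually.

type GCounter = Record<string, number>; // replicaId -> count

function increment(state: GCounter, replicaId: string): GCounter {
  return { ...state, [replicaId]: (state[replicaId] ?? 0) + 1 };
}

function merge(a: GCounter, b: GCounter): GCounter {
  const result: GCounter = { ...a };
  for (const [id, count] of Object.entries(b)) {
    result[id] = Math.max(result[id] ?? 0, count);
  }
  return result;
}

function value(state: GCounter): number {
  return Object.values(state).reduce((sum, n) => sum + n, 0);
}

// Two replicas operate while disconnected from each other...
let alice: GCounter = {};
let bob: GCounter = {};
alice = increment(alice, "alice");
alice = increment(alice, "alice");
bob = increment(bob, "bob");

// ...and converge when their states are eventually exchanged,
// regardless of delivery order.
console.log(value(merge(alice, bob))); // 3
console.log(value(merge(bob, alice))); // 3
```

Stronger guarantees [35] can then be layered on top of such primitives, for example by waiting for acknowledgements from a quorum of replicas before treating an operation as final.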

"You must give up on the idea of a single database for all your data, normalized data, and joins across services. This is a different world, one that requires a different way of thinking and the use of different designs and tools" Jonas Bon茅r [49]

With asynchronous messaging and eventual consistency, we gain the superpower of being able to program applications and networks that can withstand disconnected and disrupted service environments. The programs operate locally first and are always available, that is, they work offline, and can synchronize with others when a network connection is available. Being constantly connected is no longer needed. Overall, this leads to a better user experience than what we have today with web apps.
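As a sketch of this local-first pattern (the names and API below are assumptions for illustration, not an existing library): writes complete against a local log immediately, reads are always served from the local copy, and a background task exchanges pending entries with peers whenever connectivity returns.

```typescript
// Hypothetical local-first store: all operations complete against
// local state immediately; synchronization happens opportunistically.

interface Entry {
  id: string;
  payload: string;
}

class LocalFirstLog {
  private entries: Entry[] = [];
  private pending: Entry[] = []; // not yet exchanged with any peer

  append(payload: string): Entry {
    // Completes instantly, with or without a network connection.
    const entry = { id: crypto.randomUUID(), payload };
    this.entries.push(entry);
    this.pending.push(entry);
    return entry;
  }

  read(): Entry[] {
    return [...this.entries]; // always served from the local copy
  }

  // Called by a background task whenever a connection is available.
  async sync(send: (batch: Entry[]) => Promise<void>): Promise<void> {
    if (this.pending.length === 0) return;
    const batch = this.pending;
    this.pending = [];
    try {
      await send(batch);
    } catch {
      // Peer unreachable: keep the entries queued and retry later.
      this.pending = [...batch, ...this.pending];
    }
  }
}
```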

Every participant interacting through a single global agreement, e.g. a blockchain, creates a bottleneck, which can only be so fast. Instead of requiring disconnected and unrelated interactions (for example, Alice buying a coffee from Bob and sharing photos with Charlie) to be verified by the full network, the individual programs should form sub-networks. The sub-networks can be seen as "micro-networks" per application or interaction. For example, Alice sharing photos with Charlie forms a network of two participants, and in a chat program, a channel with 50 users forms a network of 50 participants. Dividing large networks into sub-networks per application can be seen as "built-in sharding": each participant only stores and interacts with some parts of the overall network, but almost never all of them.
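The following sketch models that idea with hypothetical names: each interaction forms its own small network, identified by its participants, so a peer stores and verifies only the sub-networks it actually takes part in.

```typescript
// Hypothetical model of per-interaction sub-networks: instead of one
// global network verifying everything, each interaction spans only
// its participants.

interface SubNetwork {
  topic: string; // e.g. "photos:alice-charlie" or "chat:#general"
  participants: Set<string>;
}

class Peer {
  private joined = new Map<string, SubNetwork>();

  constructor(readonly id: string) {}

  join(topic: string, participants: string[]): SubNetwork {
    const network: SubNetwork = { topic, participants: new Set(participants) };
    this.joined.set(topic, network);
    return network;
  }

  // A peer stores and verifies state only for the sub-networks it has
  // joined; unrelated interactions elsewhere never concern it.
  isConcernedWith(topic: string): boolean {
    return this.joined.has(topic);
  }
}

// Alice sharing photos with Charlie forms a network of two participants.
const alice = new Peer("alice");
alice.join("photos:alice-charlie", ["alice", "charlie"]);

// Alice's coffee purchase from Bob is a separate two-party network;
// the photo-sharing sub-network never sees or verifies it.
console.log(alice.isConcernedWith("photos:alice-charlie")); // true
console.log(alice.isConcernedWith("payments:alice-bob"));   // false
```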

Each crypto-network has its own execution environment and a custom programming language, which causes fragmentation. This fragmentation is an inefficiency that prevents network effects from forming around programs and data: the accrued value is locked in each individual network. To overcome this challenge and to unlock that value, we need to decouple the program execution layer from the platform (the ledger) and realize an efficient, distributed computation model that makes it possible to run the same program in multiple, otherwise disconnected and incompatible networks.

For this purpose, we present the Ambients protocol. Ambients is a protocol to build and run databases and programs in a peer-to-peer network. It decouples data from the underlying platform and network and creates an interoperability layer between different networks and systems. Decentralized applications can use it to build, deploy, execute, and share code and data in a compositional, verifiably safe, and scalable way.