The Internet was built to route around failure. That origin story is not just nostalgia. It describes a set of architectural instincts that shaped how networks interconnect, how protocols evolve, and how services survive disruption. The Internet’s core was designed as a network of networks, with no single owner and no single choke point.
And yet, if you look at how the Internet is experienced today, the lived reality is far less decentralized than the mythology. A handful of companies now sit at critical junctions of hosting, delivery, identity, and user access. Much of the world’s web traffic flows through a small number of cloud platforms and large-scale intermediaries. A few browser engines effectively determine what “the web” is allowed to become. Even parts of the trust infrastructure that makes the modern web work are concentrated in ways that would have seemed improbable a generation ago.
This does not mean the Internet has a single switch that can be flipped off. It does mean that the practical control points, the economic leverage, and the operational single points of failure are far more centralized than the Internet’s original design implied.
Decentralized protocols, centralized power
It is important to distinguish architecture from power. The Internet still uses decentralized routing protocols and a distributed addressing system. Anyone can connect networks and exchange traffic. In that sense, the base layer remains resilient.
But power does not live only in protocols. It lives in who controls the infrastructure that most people depend on, who sets the defaults, and who can change the rules of participation. Mark Nottingham’s IETF draft on avoiding Internet centralization makes this distinction explicit. It notes that even when technical centralization can be mitigated, control can still fall into a few hands and that different forms of centralization have different consequences.
When people say “the Internet is decentralized,” they often mean “no single entity owns it.” That can be true while still missing the more relevant question: who has leverage over how it functions in practice?
Cloud consolidation turned the “open web” into rented space
A major driver of modern centralization is the shift from self-hosted infrastructure to hyperscale cloud infrastructure. Companies and organizations increasingly build on top of a small set of cloud providers because it is cheaper, faster, and more scalable than running their own data centers.
That migration has produced striking concentration. Synergy Research data summarized in late 2025 showed Amazon, Microsoft, and Google together accounting for 63 percent of enterprise spending on cloud infrastructure services in Q3 2025, and subsequent reporting on Synergy’s Q4 2025 data put the “big three” at about 68 percent of enterprise spending, roughly two-thirds of the market.
This is not just market-share trivia. It means that a considerable portion of what people think of as “the Internet” is physically hosted and operationally managed on a small number of platforms. Outages at those platforms ripple outward. Policy decisions at those platforms affect downstream services. Pricing changes become existential for businesses that have built on top of them.
It also means that the web’s apparent diversity can be deceptive. You can visit ten different websites and still be interacting with resources hosted on the same underlying cloud.
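One way to see that overlap for yourself is to resolve a handful of unrelated domains and compare where their addresses land. The sketch below, which uses placeholder domains, does only the DNS step; attributing each address to a specific cloud operator would take a further WHOIS or ASN lookup, which it omits.

```python
# A minimal sketch: resolve a few hostnames and print their IPv4 addresses.
# The domains here are placeholders. In practice you would map each address
# to its operator with a WHOIS or ASN lookup, which this sketch leaves out.
import socket

sites = ["example.com", "example.org", "example.net"]  # illustrative only

for host in sites:
    try:
        infos = socket.getaddrinfo(host, 443, socket.AF_INET, socket.SOCK_STREAM)
        addrs = sorted({info[4][0] for info in infos})  # info[4] is (ip, port)
        print(f"{host}: {', '.join(addrs)}")
    except socket.gaierror as err:
        print(f"{host}: lookup failed ({err})")
```

Run against real sites, the interesting signal is not the individual addresses but how often unrelated domains cluster into the same provider's address space.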
Centralization hides inside “performance”
Another engine of centralization is performance. The modern web is heavy, dynamic, and latency-sensitive. That naturally pushes sites toward content delivery networks, edge caching, and security intermediaries that can absorb attacks and keep sites online.
This arrangement is rational, and it is often beneficial. But it also makes a small set of intermediaries structurally central because they become the default way to achieve speed, reliability, and security at scale.
Cloudflare’s “Radar” year-in-review material illustrates how much visibility large intermediaries have into global web traffic patterns because so much traffic passes through their infrastructure. When a single company can publish global browser share estimates based on observed request traffic, it is a reminder that the network’s center of gravity has shifted toward large-scale platforms that sit in the flow.
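You can often spot these intermediaries directly in HTTP responses. The rough sketch below fetches a placeholder URL and checks for a few response headers that commonly betray a CDN or security proxy in front of the origin; the header list is illustrative, far from exhaustive, and sites can suppress or rewrite these headers.

```python
# A rough sketch: fetch a URL and look for response headers that commonly
# identify a CDN or security intermediary sitting in front of the origin.
import urllib.request

CDN_HINTS = {
    "cf-ray": "Cloudflare",            # Cloudflare request ID
    "x-amz-cf-id": "CloudFront",       # Amazon CloudFront request ID
    "x-served-by": "Fastly/Varnish",   # common on Fastly deployments
    "x-akamai-transformed": "Akamai",  # appears on some Akamai responses
}

def detect_cdn(url: str) -> list[str]:
    with urllib.request.urlopen(url, timeout=10) as resp:
        headers = {k.lower(): v for k, v in resp.headers.items()}
    hits = [name for header, name in CDN_HINTS.items() if header in headers]
    if "cloudflare" in headers.get("server", "").lower():
        hits.append("Cloudflare (Server header)")
    return hits

print(detect_cdn("https://example.com"))  # placeholder URL
```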
Centralization in this form is subtle because it presents as convenience. It feels like the Internet is working better, because it often is. But it also concentrates dependency.
The browser layer is closer to a monoculture than we like to admit
For most people, the Internet is the browser. That makes browser diversity a governance issue, not a tech hobbyist issue.
StatCounter’s global browser market share for January 2026 shows Chrome at roughly 71 percent worldwide, with Safari far behind and Firefox in the low single digits. Those numbers describe browsers, but the deeper issue is engines. Chrome, Edge, Opera, and most other browsers are built on Chromium’s Blink engine, while Safari runs on WebKit. In practice, most web users depend on just two dominant rendering engines, and the more widespread of the two is developed inside the Chromium project that Google leads.
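The engine point is easy to make concrete with back-of-the-envelope arithmetic. In the sketch below, only Chrome’s figure comes from the StatCounter number cited above; the other shares are illustrative placeholders, yet the rollup to engines still shows how lopsided the result is.

```python
# Back-of-the-envelope sketch: roll browser market share up to engine share.
# Chrome's figure is the StatCounter number cited in the text; the rest are
# illustrative placeholders, not measured values.
browser_share = {
    "Chrome": 71.0,   # StatCounter, January 2026 (cited above)
    "Safari": 14.0,   # placeholder
    "Edge": 5.0,      # placeholder
    "Firefox": 3.0,   # placeholder
    "Opera": 2.0,     # placeholder
}

engine_of = {
    "Chrome": "Blink", "Edge": "Blink", "Opera": "Blink",
    "Safari": "WebKit", "Firefox": "Gecko",
}

engine_share: dict[str, float] = {}
for browser, share in browser_share.items():
    engine = engine_of[browser]
    engine_share[engine] = engine_share.get(engine, 0.0) + share

for engine, share in sorted(engine_share.items(), key=lambda kv: -kv[1]):
    print(f"{engine}: ~{share:.0f}%")
# Even with rough inputs, Blink lands far above every other engine.
```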
Cloudflare’s 2025 year-in-review similarly notes that roughly two-thirds of the request traffic observed on its network came from Chrome, with Safari second.
When browser engines consolidate, the web becomes easier to develop for but also easier to ossify. If most users rely on the same implementation, then that implementation becomes the practical standard. Standards bodies can write specifications, but the lived web is what the dominant engines implement. The result is an Internet that looks decentralized on paper while being governed through a small number of implementation decisions.
Trust infrastructure is more concentrated than it appears
Encryption is one of the best things that happened to the Internet. HTTPS is now the expectation rather than the exception. But the success of encryption also highlights a quiet kind of centralization: a small number of certificate authorities and trust anchors carry enormous responsibility.
Let’s Encrypt, run by the nonprofit Internet Security Research Group, has become a major part of this story. In a retrospective post marking its tenth year, Let’s Encrypt said that by late 2025 it was frequently issuing around ten million certificates per day. Its published stats and annual reporting reinforce how central Let’s Encrypt has become to the web’s routine cryptographic operations.
This is not a criticism of Let’s Encrypt. It is a recognition that the web’s trust layer has its own concentration points, even when those points are mission-driven nonprofits. If a huge share of the web depends on a small number of organizations to issue certificates and maintain trust chains, then those organizations become structurally important.
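It is easy to observe this concentration firsthand. The small sketch below opens a TLS connection to a placeholder hostname and reads which certificate authority issued the server’s certificate; run it against any reasonable sample of sites and the same few issuer organizations tend to dominate.

```python
# A small sketch: open a TLS connection and read which CA issued the
# server's certificate. The hostname is a placeholder.
import socket
import ssl

def issuer_of(host: str, port: int = 443) -> str:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # getpeercert() returns the issuer as a tuple of RDN tuples,
    # e.g. ((('organizationName', "Let's Encrypt"),), ...)
    issuer = dict(rdn[0] for rdn in cert["issuer"])
    return issuer.get("organizationName", "unknown")

print(issuer_of("example.com"))  # placeholder hostname
```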
Naming remains distributed, but governance matters
The Domain Name System is frequently cited as proof that the Internet is decentralized, and in many operational respects it is. ICANN’s overview notes that 12 independent root server operators manage 13 root identities, implemented across more than 1,500 individual server instances distributed worldwide.
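Those identities are easy to enumerate: they live at a.root-servers.net through m.root-servers.net. The quick sketch below resolves all 13, though each name fronts many anycast instances around the world, so the count of names understates the physical spread.

```python
# A quick sketch: resolve the 13 root identities, a.root-servers.net
# through m.root-servers.net. Each name fronts many anycast instances,
# so 13 names correspond to well over a thousand physical servers.
import socket
import string

for letter in string.ascii_lowercase[:13]:  # a through m
    name = f"{letter}.root-servers.net"
    try:
        print(f"{name}: {socket.gethostbyname(name)}")
    except socket.gaierror:
        print(f"{name}: lookup failed")
```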
That distribution is real resilience. But the fact that ICANN convenes governance structures and consultative processes around the root server system is a reminder that even distributed infrastructure has institutional choke points in how it is coordinated. ICANN’s Root Server System governance working group process and associated public consultations illustrate that coordination and accountability frameworks matter even when the servers themselves are numerous.
This is not a claim that ICANN “controls the Internet.” It is a claim that modern Internet function depends on institutions and processes that shape critical shared resources. Decentralization does not mean “no governance.” It means governance should not collapse into capture by a few parties.
Standards bodies are openly worried about centralization
The most telling sign that decentralization is eroding is that Internet engineers and standards leaders are talking about it explicitly.
The IETF draft on avoiding Internet centralization opens with the blunt observation that the Internet, despite its decentralized design, is continuously subjected to forces that encourage centralization, and it explores what standards efforts can do about it. Separately, RFC 8890, “The Internet is for End Users,” argues that Internet standards decisions should favor end users when interests conflict, and it frames that as a principle the ecosystem must actively protect.
These documents exist because centralization is not just a market trend. It becomes a technical constraint. When too much traffic, too many applications, or too many users depend on a small set of platforms, the ability to introduce new protocols, new privacy protections, or more user-controlled architectures shrinks. The Internet becomes harder to evolve without the permission, cooperation, or at least non-opposition of the dominant intermediaries.
What we lose when the network recenters
Centralization can deliver real benefits. Hyperscale clouds make reliability and global scale accessible. CDNs reduce latency and absorb attacks. Browser consolidation reduces compatibility chaos. Let’s Encrypt helped encrypt the web by making certificates easy and free. Distributed root servers keep DNS resilient.
The cost is that these benefits often come with increased dependency. When the Internet recenters around a few providers, the practical meaning of “open” changes. The Internet remains open in theory, but in practice many services become tenants in privately governed ecosystems.
That is why “the Internet isn’t decentralized anymore” is not a romantic complaint. It is a warning about fragility and power. The original Internet design assumed many independent actors could fail without collapsing the whole. A centralized Internet can still be resilient, but its resilience begins to depend on the competence, incentives, and policies of fewer actors.
The decentralization story is not dead. It is just no longer automatic. If the open Internet is going to remain meaningfully open, it will require conscious choices by standards bodies, regulators, developers, and users to resist the gravitational pull toward a handful of default intermediaries. The architecture still allows decentralization. The market no longer guarantees it.
—Greg Collier