It’s easy to forget that the internet is, in many ways, a thing — a physical thing that inhabits 3D space, made of cables and electricity and server racks in refrigerated rooms. But it is! And it matters, sometimes greatly, for how the internet works, or doesn’t, around the globe.
Historically the internet has consisted of cables going to individual homes, though now it increasingly also consists of satellites roaming the skies, and cell antennas dotting the landscape. Sometimes these things are cleverly disguised, and sometimes they’re not.
But much of the internet — the broadband lines that bring wifi to homes and schools and offices (back when we had those) — exists as physical cables: buried below the earth, laid in trenches deep below the ocean surface, and extending from central hubs out toward homes and businesses in the “last mile” around them.
This is one of many reasons that internet speed and quality in rural and impoverished regions, even in wealthy nations like the United States, commonly lag behind those of wealthier urban areas: it’s simply less profitable to bring internet there. It also helps explain why many communities and local governments have tried to create their own cooperatively-built and -run internet services, known as “municipal broadband” — and why greedy telecommunications companies that have failed to serve those places frequently lobby governments to ban them. As of this writing, 16 states still have laws on the books restricting their creation. (Though Colorado repealed theirs in May 2023, and cities freed from such restrictions have moved quickly to improve internet access for their residents 🥳.)
But how do these cables get from one place to another — and especially, from one continent to another?
A Series of Tubes
In June of 2006, the former Alaska Senator Ted Stevens described the Internet “not as a big truck” that will carry as much as can be loaded onto it, but as “a series of tubes.” His analogy was met with ridicule by internet aficionados and even Jon Stewart, and quickly escalated to memedom; it is difficult to think of a single phrase that has as succinctly and continuously symbolized the view of politicians as out-of-touch and tech-illiterate as this one.
That said, Senator Orrin Hatch’s brilliantly naive question to Mark Zuckerberg in 2018 re: “how do ads work” is a close second, and well worth a quick watch.
But this ridicule of Senator Stevens’ analogy has always irked me, because it’s actually decent. The internet isn’t literal “tubes,” but you could find a far worse metaphor to describe it. And I like the tubes idea in particular because it serves as a reminder of the internet’s physicality in a way that other common terms — like the “cloud,” “a network,” “connectivity,” and more — simply do not. We know tubes: they’re in our homes; they hold our toothpaste. Those other words evoke an ephemeral magicality, a “now you see it, now you don’t”-ness, that has surely benefited internet evangelists for decades. But the same terms serve as a mystifying force that keeps anyone but the wonkiest internet nerds from understanding how the internet actually works.
And internet physicality, as we’ll see, often has surprising ramifications for how the internet works. Let’s look at two big factors now: network size and distance.
Size Matters 🤫😳
This section may be review for some readers, so skip to the next if you’re yawning.
Internet connections all have a finite capacity. Some have a lot, some a little, but there’s no such thing as “infinite” bandwidth, yet. This applies to satellite internet too, even though nothing like tubes is involved there. The whole capacity idea, in fact, is why my fellow Ted dug up his “tubes” analogy in the first place.
But let’s move away from tubes for a minute, and imagine a gumball dispenser filled with gumballs of various sizes. The spout, of course, is just one size — and the bigger the spout, the more gumballs can get through at once, and the larger they can be.
We can think of different-sized gumballs representing different types of internet activity:
Email, text messages, and text-heavy web pages would be small gumballs
Podcasts, photos, Instagram, and photo-heavy sites would be medium-sized
Video sites like TikTok and YouTube would be large
High-quality video formats like 4k would be very large
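To put rough numbers on those gumballs, here’s a quick back-of-the-envelope sketch in Python. (All the sizes and speeds below are my own ballpark estimates, not measured figures.)

```python
# Rough transfer-time math: how long does each "gumball" take to fit
# through spouts of different sizes? All sizes and speeds below are my
# own ballpark estimates, not measured figures.

CONTENT_SIZES_MB = {
    "email (text)": 0.05,   # ~50 KB
    "photo": 3,             # a typical smartphone photo
    "short video": 50,      # a few minutes of compressed video
    "4K video": 2000,       # an hour or so of high-quality video
}

SPOUT_SPEEDS_MBPS = {
    "56k dial-up": 0.056,
    "early DSL": 1.5,
    "modern broadband": 100,
}

for speed_name, mbps in SPOUT_SPEEDS_MBPS.items():
    print(f"\n{speed_name} ({mbps} Mbit/s):")
    for content, size_mb in CONTENT_SIZES_MB.items():
        seconds = size_mb * 8 / mbps  # megabytes -> megabits, then divide by speed
        print(f"  {content}: {seconds:,.1f} seconds")
```

On dial-up, that 4K gumball takes days to squeeze through the spout; on modern broadband, under three minutes. Same spout idea, wildly different spout sizes.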
Now imagine that everyone using the internet in a local area — Seattle, say, or a small island in Indonesia — is trying to get gumballs out of the one spout. We can see how, if everyone is uploading and downloading videos (“large gumballs”) at the same time, only a few might get through — and the video gumballs might even block emails and photos (“small gumballs”) from making it through the spout, or slow them down, too.
It’s easy to see how a bigger gumball spout would let through more and more gumballs. When many American homes moved from 56k dial-up modems to DSL and then broadband in the early 2000s, these were all progressive enlargements of the “gumball spout” — and the same happened when we went from 2G to 3G to 4G, and now 5G data, on our mobile phones.
The idea of many gumballs getting jammed while trying to get through the spout, by the way, is often referred to in internet jargon as “congestion.” This is why phone data can seem impossibly slow after a concert or sports game, or even in the middle of a really heinous traffic jam: everyone around you is trying to use their phones, too. (What monsters!)
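For the curious, here’s a toy sketch of congestion in miniature. It simply splits a link’s capacity evenly among everyone using it (a big simplification of how real networks schedule traffic, and the numbers are mine), but it shows why speeds crater when a crowd shows up:

```python
# Toy congestion model: a shared link's capacity gets split evenly
# among active users. Real networks schedule traffic in far smarter
# ways, but the basic crunch looks like this.

LINK_CAPACITY_MBPS = 1_000   # hypothetical capacity of one cell tower's link
VIDEO_NEEDS_MBPS = 5         # rough bitrate for smooth HD streaming

for users in (10, 100, 1_000, 10_000):
    per_user = LINK_CAPACITY_MBPS / users
    verdict = "smooth video" if per_user >= VIDEO_NEEDS_MBPS else "buffering…"
    print(f"{users:>6} users -> {per_user:8.2f} Mbit/s each ({verdict})")
```

Ten people sharing the tower stream happily; ten thousand people leaving the stadium at once each get a trickle.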
And this notion of “congestion” sits at the heart of why video streaming platforms like Netflix were so embroiled in US government net neutrality debates around the start of Trump’s presidency. Telecommunications companies weren’t happy with a small number of internet users streaming a disproportionately high number of videos — that is, demanding an outsized quantity of gigantic gumballs — and slowing down the network for everyone else using it.
In fact, when “other Ted” Senator Stevens first made his tubes analogy, it was in this precise context. The reason Jon Stewart panned Stevens on The Daily Show was not just his “tubes” comment, but the fact that he (the Senator) was arguing against the idea of net neutrality. I won’t go into that here — there’s a reason John Oliver devoted numerous lengthy segments to the topic, and even motivated viewers to crash the FCC’s website with protest comments in 2017. But suffice it to say I agree with the criticism of Stevens and believe vociferously in the value of net neutrality, even if I do rather like the analogy he used to argue against it.
Going the Distance
It’s hard to fathom that distance could have anything to do with internet speed, since so much of the internet seems to work at the click of a button (at least, it does today). But the information we access online has to come from somewhere, and the 🔥🔥 content we upload has to go somewhere.
As author and researcher Alex Pang notes on his website, high-frequency trading — the computer-dominated algorithmic trading taking place on Wall Street today — “has long stood as a great example of how the physicality of computing really matters, contrary to companies’ breezy declarations that everything is in The Cloud.” Because a nanosecond (one billionth of a second) can matter in who makes a trade first, the NYSE gave every trader an equal length of cable from their computer to the central trading server in its New Jersey data center. Otherwise, computers closer to the central server would have a nanosecond-level advantage, because their order signals would have to travel a few fewer feet of cable. (I’ve seen Alex Pang speak, by the way, and both his books Rest and Shorter are great reads filled with research on the benefits — even to companies — of employees working less.)
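To put a number on that: light in fiber travels at roughly two-thirds of its vacuum speed, about 200,000 km per second. A quick sketch (rounded physics, my own arithmetic):

```python
# How much time does one extra foot of cable cost? Light in fiber moves
# at roughly 2/3 of its vacuum speed: about 200,000 km per second.

SPEED_IN_FIBER_KM_PER_S = 200_000
FOOT_IN_KM = 0.0003048

delay_s = FOOT_IN_KM / SPEED_IN_FIBER_KM_PER_S
print(f"One foot of fiber ≈ {delay_s * 1e9:.2f} nanoseconds")
# Prints roughly 1.52 nanoseconds: at nanosecond-scale trading, even a
# single foot of cable is an edge.
```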
Let’s imagine an example slightly closer to our real lives, even if it is grossly oversimplified. Say I’m in San Francisco and want to watch a YouTube video — a video of a speech given by the city’s mayor the night before. So it’s popular locally, but not much beyond.
When I click “Play,” a request gets sent from my computer or phone to my internet provider saying “I want to watch this.”
My internet provider looks for a nearby copy of the video — it’s likely stored on a server (think of a big hard drive) close by, because it’s a popular video in my area.
My internet provider sends that video to my computer or phone, where I watch it.
This usually happens in the blink of an eye, which is why we don’t think about it.
But let’s say soon after that, I fly halfway across the world for an amazing holiday vacation with my partner in, oh, the Maldives! And I decide to show her that same mayor’s speech. Because no one in the little island nation has watched the mayor’s speech video, nor any other video like it, the local Maldives internet provider has to go out in search of it, possibly all the way across the Pacific to the Bay Area, to the same server that delivered me the video when I was in California. And then the video has to travel all the way back to me, on a tiny island in the Indian Ocean.
Even if I have fast internet service in the Maldives (which I probably don’t — it’s nowhere near, well, anything), this whole process will likely take longer than it would in California. Maybe only another second, perhaps barely noticeable, but my video request and the video itself are traveling much, much further.
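We can ballpark the difference using the same speed-of-light-in-fiber figure from the trading example above. The distances are my rough estimates, real routes zigzag far more than straight lines, and this ignores routers, servers, and congestion entirely, so treat it as a floor rather than a measurement:

```python
# Minimum round-trip travel time for a video request, assuming a
# straight fiber run. This is a lower bound: real paths zigzag, and
# routers, servers, and congestion all add delay on top.

SPEED_IN_FIBER_KM_PER_S = 200_000  # light in fiber, roughly

def round_trip_ms(distance_km):
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

print(f"Nearby cached copy (~50 km):  {round_trip_ms(50):6.1f} ms")
print(f"Maldives to SF (~14,000 km):  {round_trip_ms(14_000):6.1f} ms")
```

And that ~140-millisecond floor repeats on every round trip my video player makes, which is how “maybe only another second” adds up.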
This would be different if the video was popular worldwide — say, a recent Taylor Swift music video. In this case, it’s likely many others in the Maldives have already tried watching it, and the video is already stored somewhere on a server (again, a hard drive, probably deep in a big, ugly building) somewhere nearby, where I can access it nearly as fast as if I were home in San Francisco — at least, if the local internet is good, which on a tiny island is usually iffy.
The process of storing a video like this on a local network is called caching, by the way, and it’s a big part of how the internet works so seamlessly and efficiently. That popular Taylor Swift video doesn’t just exist in one place, on one server; there are probably copies of it in thousands of big, ugly gray buildings all around the world. This applies to just about any other internet content, too. Seen a popular TikTok lately? There’s probably a copy of the video somewhere on a server near you. (This is just a guess; I couldn’t find data on it, but I think it’s a good one.)
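At its core, a cache is just “check nearby first; fetch from far away only if you must.” Here’s a minimal sketch of that logic. (The function names are made up for illustration; real CDN internals involve expiry, eviction, regional tiers, and much more.)

```python
# A minimal get-or-fetch cache, in the spirit of what CDNs do. The
# function names here are hypothetical, purely for illustration.

local_cache = {}  # video_id -> video bytes stored "nearby"

def fetch_from_origin(video_id):
    # Hypothetical stand-in for the slow, cross-ocean trip to the
    # origin server back in the Bay Area.
    return f"<bytes of video {video_id}>"

def get_video(video_id):
    if video_id in local_cache:
        return local_cache[video_id]      # cache hit: fast and local
    video = fetch_from_origin(video_id)   # cache miss: the long haul
    local_cache[video_id] = video         # store it for the next viewer
    return video

get_video("taylor-swift-music-video")  # first viewer pays for the long trip
get_video("taylor-swift-music-video")  # everyone after gets it locally
```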
This doesn’t matter much for most of our day-to-day experience of the internet — for most of us, most of the time, the internet just works. But it helps explain how it works when it does, and it does matter in certain situations: say, when you’re watching a very San Francisco video while on an exotic vacation, or when you’re someone living in a tiny town in Tanzania.
I Was Promised Sharks?
Haha oh damn, was that this week? Oops!
Honestly though, this has gotten long — I promise (pinkies) we’ll talk about sharks and satellites and so much more next week.
Stay tuned, and thanks for joining this time ‘round. 🦈🤞🛰️
Song of the Week: Olivia Newton-John — Physical (how could I not?)