Antoine Beaupré: How to nationalize the internet in Canada
Rogers had a catastrophic failure in July
2022. It affected emergency services (as in: people couldn't call 911,
but also some 911 services themselves failed), hospitals (which
couldn't access prescriptions), banks and payment systems (as payment
terminals stopped working), and regular users as well. The outage
lasted almost a full day, and Rogers took days to give any technical
explanation of the outage, and even when they did, details were
sparse. So far the only detailed account comes from outside actors like
Cloudflare, which seems to point at an internal BGP failure.
Its impact on the economy has yet to be measured, but it probably cost
millions of dollars in wasted time and possibly led to
life-threatening situations. Apart from holding Rogers (criminally?)
responsible for this, what should be done in the future to avoid such
problems?
It's not the first time something like this has happened: it happened to
Bell Canada as well. The Rogers outage is also strangely similar
to the Facebook outage last year, but, to its credit, Facebook
did post a fairly detailed explanation only a day later.
The internet is designed to be decentralised, and having
large companies like Rogers hold so much power is a crucial mistake
that should be reversed. The question is how. Some critics were
quick to point out that we need more ISP diversity and competition,
but I think that's missing the point. Others have suggested that the
internet should be a public good or even straight out
nationalized.
I believe the solution to the problem of large, private, centralised
telcos and ISPs is to replace them with smaller, public, decentralised
service providers. The only way to ensure that works is to make sure
that public money ends up creating infrastructure controlled by the
public, which means treating ISPs as a public utility. This has been
implemented elsewhere: it works, it's cheaper, and provides better
service.
A modest proposal
Global wireless services (like phone services) and home internet
inevitably grow into monopolies. They are public utilities, just like
water, power, railways, and roads. The question of how they should be
managed is therefore inherently political, yet people don't seem to
question the idea that only the market (i.e. "competition") can solve
this problem. I disagree.
Ten years ago (in French), I suggested we, in Québec, should
nationalize large telcos and internet service providers. I no longer
believe that is a realistic approach: most of those companies have crap
copper-based networks (at least for the last mile), yet are worth
billions of dollars. It would be prohibitive, and a waste, to buy them
out.
Back then, I called this idea "Réseau-Québec", a reference to the
already nationalized power company, Hydro-Québec. (This idea,
incidentally, made it into the plan of a political party.)
Now, I think we should instead build our own, public internet. Start
setting up municipal internet services, fiber to the home in all
cities, progressively. Then interconnect cities with fiber, and build
peering agreements with other providers. This also includes bidding on
wireless spectrum to start competing with phone providers.
And while that sounds really ambitious, I think it's possible to take
this one step at a time.
Municipal broadband
In many parts of the world, municipal broadband is an elegant
solution to the problem, with solutions ranging from Stockholm's
city-owned fiber network (dark fiber, layer 1) to Utah's
UTOPIA network (fiber to the premises, layer 2) and
municipal wireless networks like Guifi.net, which connects
about 40,000 nodes in Catalonia.
A good first step would be for cities to start providing broadband
services to their residents, directly. Cities normally own sewage and
water systems that interconnect most residences and therefore have
direct physical access everywhere. In Montréal, in particular,
there is an ongoing project to replace a lot of old lead-based
plumbing, which would give an
opportunity to lay down a fiber network across the city.
This is a wild guess, but I suspect this would be much less
expensive than one would think. Some people agree with me and quote
this as low as 1000$ per household. There are about 800,000
households in the city of Montréal, so we're talking about an 800
million dollar investment here, to connect every household in
Montréal with fiber and, incidentally, a quarter of the province's
population. And this is not an up-front cost: this can be built
progressively, with expenses amortized over many years.
(We should not, however, connect Montréal first: it's used as an
example here because it has a large number of households to connect.)
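To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The 1000$-per-household figure and the 800,000 households are the numbers quoted above; the 10-year amortization period is a hypothetical assumption, used only to illustrate how the cost spreads out over time.

    # Back-of-the-envelope cost sketch for a municipal fiber build-out.
    # Per-household cost and household count come from the estimates
    # quoted above; the amortization period is a hypothetical assumption.
    cost_per_household = 1_000      # $CAD, quoted estimate
    households = 800_000            # city of Montréal, approximate
    amortization_years = 10         # assumption for illustration

    total_cost = cost_per_household * households
    yearly_capex = total_cost / amortization_years

    print(f"Total build-out: {total_cost / 1e6:.0f} M$")       # ~800 M$
    print(f"Amortized per year: {yearly_capex / 1e6:.0f} M$")  # ~80 M$/year

Spread over a decade, that is on the order of 80 million dollars a year for the largest city in the province, before counting any revenue from subscribers.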
Such a network should be built with a redundant topology.
I leave it as an open question whether we
should adopt Stockholm's more minimalist approach or provide direct IP
connectivity. I would tend to favor the latter, because then you can
immediately start to offer the service to households and generate
revenues to compensate for the capital expenditures.
Given the ridiculous profit margins telcos currently have (8 billion
$CAD net income for BCE in 2019, 2 billion $CAD for Rogers
in 2020), I also believe this would actually turn into a
profitable revenue stream for the city, the same way Hydro-Québec is
more and more treated as a revenue stream for the state. (I
personally believe that's actually wrong and we should treat those
resources as human rights and not cash cows, but I digress. The
point is: this is not a cost, it's a revenue stream.)
The other major challenge here is that the city will need competent
engineers to drive this project forward. But this is not different
from the way other public utilities run: we have electrical engineers
at Hydro, sewer and water engineers at the city; this is just another
profession. If anything, the computer science sector might be more at
fault than the city here in its failure to provide competent and
accountable engineers to society...
Right now, most of the network in Canada is copper: we are hitting the
limits of that technology with DSL, and while cable has some life
left to it (DOCSIS 4.0 does 4Gbps), that is nowhere near the
capacity of fiber. Take the town of Chattanooga, Tennessee: in
2010, the city-owned ISP EPB finished deploying a fiber network to
the entire town and provided gigabit internet to everyone. Now, 12
years later, they are using this same network to provide the
mind-boggling speed of 25 gigabit to the home. To give you an
idea, Chattanooga is roughly the size and density of Sherbrooke.
Provincial public internet
As part of building a municipal network, the question of getting
access to "the internet" will immediately come up. Naturally, this
will first be solved by using already existing commercial providers to
hook up residents to the rest of the global network.
But eventually, networks should interconnect: Montréal should connect
with Laval, and then Trois-Rivières, then Québec
City. This will require long haul fiber runs, but those links are
not actually that expensive, and many of those already exist as a
public resource at RISQ and CANARIE, which cross-connect
universities and colleges across the province and the
country. Those networks might not have the capacity to cover the
needs of the entire province right now, but that is a router upgrade
away, thanks to the amazing capacity of fiber.
There are two crucial mistakes to avoid at this point. First, the
network needs to remain decentralised. Long haul links should be IP
links with BGP sessions, and each city (or MRC) should have its
own independent network, to avoid Rogers-class catastrophic failures.
Second, skill needs to remain in-house: RISQ has already made that
mistake, to a certain extent, by selling its neutral datacenter.
Tellingly, MetroOptic, probably the largest commercial
dark fiber provider in the province, now operates the QIX, the second
largest "public" internet exchange in Canada.
Still, we have a lot of infrastructure we can leverage here. If RISQ
or CANARIE cannot be up to the task, Hydro-Québec has power lines
running into every house in the province, with high-voltage lines
running hundreds of kilometers to the far north. The logistics of
long-distance maintenance are already solved by that institution.
In fact, Hydro already has fiber all over the province, but it is
a private network, separate from the internet for security reasons
(and that should probably remain so). But this only shows they already
have the expertise to lay down fiber: they would just need to lay down
a parallel network to the existing one.
In that architecture, Hydro would be a "dark fiber" provider.
International public internet
None of the above solves the problem for
the entire population of Québec, which is notoriously dispersed, with
an area three times the size of France, but with only an eighth of its
population (8 million vs 67). More specifically, Canada was originally
a French colony, a land violently stolen from native people who
have lived here for thousands of years. Some of those people now live
in reservations, sometimes far from urban centers (but definitely
not always). So the idea of leveraging the Hydro-Québec
infrastructure doesn't always work to solve this, because while Hydro
will happily flood a traditional hunting territory for a hydroelectric
dam, they don't bother running power lines to the village they
forcibly moved, powering it instead with noisy and polluting diesel
generators. So before giving me fiber to the home, we should give
power (and potable water, for that matter) to those communities
first.
So we need to discuss international connectivity. (How else could we
consider those communities than as peer nations anyway?) Québec has
virtually zero international links. Even in Montréal, which likes to
style itself as a major player in gaming, AI, and technology, most
peering goes through either Toronto or New York.
That's a problem that we must fix,
regardless of the other problems stated here.
Looking at the submarine cable map, we
see very few international links actually landing in Canada. There is
Greenland Connect, which connects Newfoundland to Iceland
through Greenland. There's the EXA cable, which lands in Ireland, the UK,
and the US, and Google has the Topaz link on the west
coast. That's about it, and none of those land anywhere near any major
urban center in Québec.
We should have a cable running from France up to
Saint-Félicien. There should be a cable from Vancouver to
China. Heck, there should be a fiber cable running all the way
from the end of the Great Lakes through Québec, then up around the
northern passage and back down to British Columbia. Those cables are
expensive, and the idea might sound ludicrous, but Russia is actually
planning such a project for 2026. The US has cables running all the
way up (and around!) Alaska, neatly bypassing all of Canada in the
process. We just look ridiculous on that map.
(Addendum: I somehow forgot to talk about Teleglobe here, which was
founded as a publicly owned company in 1950, growing international phone
and (later) data links all over the world. It was privatized by the
Conservatives in 1984, along with rails and other "crown
corporations". So that's one major risk to any effort to make public
utilities work properly: some government might be elected and promptly
sell it out to its friends for peanuts.)
Wireless networks
I know most people will have rolled their eyes so far back their heads
have exploded. But I'm not done yet. I want wireless too. And by wireless, I
don't mean a bunch of geeks setting up OpenWRT routers on rooftops. I
tried that, and while it was fun and educational, it didn't scale.
A public networking utility wouldn't be complete without providing
cellular phone service. This involves bidding for frequencies at
the federal level, and deploying a rather large amount of
infrastructure, but it could be a later phase, when the engineers and
politicians have proven their worth.
At least part of the Rogers fiasco would have been averted if such a
decentralized network backend existed. One might even want to argue that a separate
institution should be set up to provide phone services, independently
from the regular wired networking, if only for reliability.
Because remember here: the problem we're trying to solve is not just
technical, it's about political boundaries, centralisation, and
automation. If everything is run by this one organisation again, we
will have failed.
However, I must admit that phone service is where my ideas fall a
little short. I can't help but think it's also an accessible goal
(maybe starting with a virtual operator), but it seems slightly
less so than the others, especially considering how closed the phone
ecosystem is.
Counterpoints
In debating these ideas while writing this article, the following
objections came up.
I don't want the state to control my internet
One legitimate concern I have about the idea of the state running the
internet is the potential it would have to censor or control the
content running over the wires.
But I don't think there is necessarily a direct relationship between
resource ownership and control of content. Sure, China has strong
censorship in place, partly implemented through state-controlled
businesses. But Russia also has strong censorship in place, based on
regulatory tools: they force private service providers to install
back-doors in their networks to control content and surveil their
users.
Besides, the USA has been doing warrantless wiretapping since
at least 2003 (and yes, that's 10 years before the Snowden
revelations), so a commercial internet is no assurance that we have
a free internet. Quite the contrary in fact: if anything, the
commercial internet goes hand in hand with the neo-colonial
internet, just like businesses
did in the "good old colonial days".
Large media companies are the primary
censors of content here. In Canada, the media cartel requested the
first site-blocking order in 2018. The plaintiffs (including
Qu becor, Rogers, and Bell Canada) are both content providers and
internet service providers, an obvious conflict of interest.
Nevertheless, there are some strong arguments against having a
centralised, state-owned monopoly on internet service providers. FDN
makes a good point on this. But this is not what I am suggesting:
at the provincial level, the network would be purely physical, and
regional entities (which could include private companies) would peer
over that physical network, ensuring decentralization. Delegating the
management of that infrastructure to an independent non-profit or
cooperative (but owned by the state) would also ensure some level of
independence.
Isn't the government incompetent and corrupt?
Also known as "private enterprise is better skilled at handling this,
the state can't do anything right".
I don't think this is a "fait accompli". If anything, I have found
publicly run utilities to be spectacularly reliable here. I rarely
have trouble with sewage, water, or power, and keep in mind I live in
a city where we receive about 2 meters of snow a year, which tends
to create lots of trouble with power lines. Unless there's a major
weather event, power just runs here.
I think the same can happen with an internet service provider. But it
would certainly need to have higher standards than what we're used to,
because frankly the internet is kind of janky.
A single monopoly will be less reliable
I actually agree with that, but that is not what I am
proposing anyway.
Current commercial or non-profit entities will be free to offer their services on
top of the public network.
And besides, the current "ha! diversity is
great" approach is exactly what we have now, and it's not working.
The pretense that we can have competition over a single network is
what led the US into the ridiculous situation where they also pretend
to have competition over the power utility market. This led to
massive forest fires in California and major power outages in
Texas. It doesn't work.
Wouldn't this create an isolated network?
One theory is that this new network would be so hostile to incumbent
telcos and ISPs that they would simply refuse to network with the
public utility. And while it is true that the telcos currently do
also act as a kind of "tier one" provider in some places, I
strongly feel this is also a problem that needs to be solved,
regardless of ownership of networking infrastructure.
Right now, telcos often hold both ends of the stick: they are the
gateway to users, the "last mile", but they also provide peering to
the larger internet in some locations. In at least one datacenter in
downtown Montr al, I've seen traffic go through Bell Canada that was
not directly targeted at Bell customers. So in effect, they are in a
position of charging twice for the same traffic, and that's not only
ridiculous, it should just be plain illegal.
And besides, this is not a big problem: there are other providers
out there. As bad as the market is in Québec, there is still some
diversity in tier one providers that could allow for some exits to the
wider network (e.g. yes, Cogent is here too).
What about Google and Facebook?
Nationalization of other service
providers like Google and Facebook is out of scope of this discussion.
That said, I am not sure the state should get into the business of
organising the web or providing content services, but I will
point out it already does some of that through its own
websites. It should probably limit itself to this, and also
consider providing normal services for people who don't or can't
access the internet.
(And I would also be ready to argue that Google and Facebook already
act as extensions of the state: certainly if Facebook didn't exist,
the CIA or the NSA would like to create it at this point. And Google
has lucrative business with the US Department of Defense.)
What does not work
So we've seen one thing that could work. Maybe it's too
expensive. Maybe the political will isn't there. Maybe it will
fail. We don't know yet.
But we know what does not work, and it's what we've been doing ever
since the internet has gone commercial.
Legal pressure and regulation
In 1984 (of all years), the US Department of Justice finally
broke up AT&T into half a dozen corporations, after a 10-year legal
battle. Yet a few decades later, we're back to only three large
providers doing essentially what AT&T was doing back then, and those
are regional monopolies: AT&T, Verizon, and Lumen (not counting
T-Mobile, which is a different breed). So the legal approach
really didn't work that well, especially considering the political
landscape changed in the US, and the FTC seems perfectly happy to let
those major mergers continue.
In Canada, we never even pretended we would solve this problem at all:
Bell Canada (the literal "father" of AT&T) is in the same
situation now. We have either a regional monopoly (e.g. Vidéotron for
cable in Québec) or an oligopoly (Bell, Rogers, and Telus controlling
more than 90% of the market). Telus does have one competitor in the
west of Canada, Shaw, but Rogers has been trying to buy it
out. The Competition Bureau seems to have blocked the merger
for now, but it didn't stop other recent mergers like Bell's
acquisition of one of its main competitors in Québec, eBox.
Regulation doesn't seem capable of ensuring those profitable
corporations provide us with decent pricing, which makes Canada one
of the most expensive countries (research) for mobile data on
the planet. The recent failure of the CRTC to properly protect smaller
providers has even led to price hikes. Meanwhile, the oligopoly
is actually agreeing on its own price hikes, therefore becoming
a real cartel, complete with price fixing and reductions in
output.
There are actually regulations in Canada supposed to keep the worst of
the Rogers outage from happening at all. According to CBC:
Under Canadian Radio-television and Telecommunications Commission
(CRTC) rules in place since 2017, telecom networks are supposed to
ensure that cellphones are able to contact 911 even if they do not
have service.
I can personally confirm that my phone couldn't reach 911 services,
because all calls would fail: the problem was that towers were still
up, so your phone wouldn't fall back to alternative service providers
(which could have resolved the issue). I can only speculate as to why
Rogers didn't take cell phone towers out of the network to let phones
work properly for 911 service, but it seems like a dangerous game to
play.
Hilariously, the CRTC itself didn't have a reliable phone service due
to the service outage:
Please note that our phone lines are affected by the Rogers network outage.
Our website is still available: https://crtc.gc.ca/eng/contact/
https://mobile.twitter.com/CRTCeng/status/1545421218534359041
I wonder if they will file a complaint against Rogers themselves about
this. I probably should.
It seems the federal government is thinking more of the same medicine
will fix the problem and has told companies they should "help" each other
in an emergency. I doubt this will fix anything, and could
actually make things worse if the competitors actually interoperate
more, as it could cause multi-provider, cascading failures.
Subsidies
The absurd price we pay for data does not actually mean everyone gets
high speed internet at home. Large swathes of the Québec countryside
don't get broadband at all, and it can be difficult or expensive, even
in large urban centers like Montréal, to get high speed internet.
That is despite having a series of subsidies that all avoided
investing in our own infrastructure. We had the "fonds de l'autoroute
de l'information", "information highway fund" (site dead since
2003, archive.org link) and "branchez les familles",
"connecting families" (site dead since 2003, archive.org
link)
which subsidized the development of a copper network. In 2014, more of
the same: the federal government poured hundreds of millions of
dollars into a program called Connecting Canadians to connect 280,000
households to "high speed internet". And now, the federal and
provincial governments are proudly announcing that "everyone is now
connected to high speed internet", after pouring more than 1.1
billion dollars to connect, guess what, another 380,000 homes, right
in time for the provincial election.
Of course, technically, the deadline won't actually be met
until 2023. Québec is a big area to cover, and you can guess what
happens next: the telcos threw up their hands and said some areas just
can't be connected. (Or they connect their CEO but not the poor folks
across the lake.) The story then takes the predictable twist of
giving more money out to billionaires, now subsidizing Musk's
Starlink system to connect those remote areas.
To give a concrete example: a friend who lives about 1000km away from
Montréal, 4km from a small, 2,500-inhabitant village, has recently
got symmetric 100Mbps fiber at home from Telus, thanks to those subsidies. But I can't get that
service in Montréal at all, presumably because Telus and Bell colluded
to split that market. Bell doesn't provide me with such a service
either: they tell me they have "fiber to my neighborhood", and only
offer me a 25/10Mbps ADSL service. (There is Vidéotron offering
400Mbps, but that's copper cable, again a dead technology, and
asymmetric.)
Conclusion
Remember Chattanooga? Back in 2010, they funded the development of a
fiber network, and now they have deployed a network roughly a
thousand times faster than what we have just funded with a billion
dollars. In 2010, I was paying Bell Canada
60$/month for 20Mbps and a 125GB cap, and now, I'm still (indirectly)
paying Bell for roughly the same speed (25Mbps). Back then, Bell was
throttling its competitors' networks until 2009, when it was
forced by the CRTC to stop throttling. Both
Bell and Vidéotron still explicitly forbid you from running your own servers
at home, and Vidéotron charges prohibitive prices which make it near
impossible for resellers to sell uncapped services. Those companies
are not spurring innovation: they are blocking it.
We have spent all this money for the private sector to build
us a private internet, over decades, without any assurance of quality,
equity or reliability. And while in some locations, ISPs did deploy
fiber to the home, they certainly didn't upgrade their entire network
to follow suit, much less allowed resellers to compete on that
network.
In 10 years, when 100Mbps will be laughable, I bet those service
providers will again punt the ball back into the public court and tell us
they don't have the money to upgrade everyone's equipment.
We got screwed. It's time to try something new.
Updates
There was a discussion about this article on Hacker News, which
was surprisingly productive. Trigger warning: Hacker News is kind of
right-wing, in case you didn't know.
Since this article was written, at least two more major acquisitions
happened, just in Québec:
In the latter case, vMedia was explicitly saying it couldn't grow
because of "lack of access to capital". So basically, we have given
those companies a billion dollars, and they are now using that very
money to buy out their competition. We could at least have given
that money to small players to level the playing field. But this is
not how that works at all. Also, in a bizarre twist, an "analyst"
believes the acquisition is likely to help Rogers acquire Shaw.
Also, since this article was written, the Washington Post published a
review of a book bringing similar ideas: Internet for the People:
The Fight for Our Digital Future, by Ben Tarnoff, at Verso
books. It's short, but even more ambitious than what I am suggesting
in this article, arguing that all big tech companies should be broken
up and better regulated:
He pulls from Ethan Zuckerman's idea of a web that is plural in
purpose: that just as pool halls, libraries and churches each have
different norms, purposes and designs, so too should different
places on the internet. To achieve this, Tarnoff wants governments
to pass laws that would make the big platforms unprofitable and, in
their place, fund small-scale, local experiments in social media
design. Instead of having platforms ruled by engagement-maximizing
algorithms, Tarnoff imagines public platforms run by local
librarians that include content from public media.
(Links mine: the Washington Post obviously prefers not to link to the
real web, and instead doesn't link to Zuckerman's site at all and
suggests Amazon for the book, in a cynical example.)
And in another example of how the private sector has failed us, there
was recently a failure in the AMBER alert system where the entire
province was warned about a shooter on the loose in Saint-Elzéar, except
the people in the town, because they have spotty cell phone
coverage. In other words, millions of people received a strongly
worded, "life-threatening" alert for a city sometimes hours away,
except the people most vulnerable to the alert. Not missing a beat,
the CAQ party is promising more of the same medicine again, giving
more money to telcos to fix the problem and suggesting we spend
three billion dollars on private infrastructure.
Provincial public internet
As part of building a municipal network, the question of getting
access to "the internet" will immediately come up. Naturally, this
will first be solved by using already existing commercial providers to
hook up residents to the rest of the global network.
But eventually, networks should inter-connect: Montr al should connect
with Laval, and then Trois-Rivi res, then Qu bec
City. This will require long haul fiber runs, but those links are
not actually that expensive, and many of those already exist as a
public resource at RISQ and CANARIE, which cross-connects
universities and colleges across the province and the
country. Those networks might not have the capacity to cover the
needs of the entire province right now, but that is a router upgrade
away, thanks to the amazing capacity of fiber.
There are two crucial mistakes to avoid at this point. First, the
network needs to remain decentralised. Long haul links should be IP
links with BGP sessions, and each city (or MRC) should have its
own independent network, to avoid Rogers-class catastrophic failures.
Second, skill needs to remain in-house: RISQ has already made that
mistake, to a certain extent, by selling its neutral datacenter.
Tellingly, MetroOptic, probably the largest commercial
dark fiber provider in the province, now operates the QIX, the second
largest "public" internet exchange in Canada.
Still, we have a lot of infrastructure we can leverage here. If RISQ
or CANARIE cannot be up to the task, Hydro-Qu bec has power lines
running into every house in the province, with high voltage power lines
running hundreds of kilometers far north. The logistics of long
distance maintenance are already solved by that institution.
In fact, Hydro already has fiber all over the province, but it is
a private network, separate from the internet for security reasons
(and that should probably remain so). But this only shows they already
have the expertise to lay down fiber: they would just need to lay down
a parallel network to the existing one.
In that architecture, Hydro would be a "dark fiber" provider.
International public internet
None of the above solves the problem for
the entire population of Qu bec, which is notoriously dispersed, with
an area three times the size of France, but with only an eight of its
population (8 million vs 67). More specifically, Canada was originally
a french colony, a land violently stolen from native people who
have lived here for thousands of years. Some of those people now live
in reservations, sometimes far from urban centers (but definitely
not always). So the idea of leveraging the Hydro-Qu bec
infrastructure doesn't always work to solve this, because while Hydro
will happily flood a traditional hunting territory for an electric
dam, they don't bother running power lines to the village they
forcibly moved, powering it instead with noisy and polluting diesel
generators. So before giving me fiber to the home, we should give
power (and potable water, for that matter), to those communities
first.
So we need to discuss international connectivity. (How else could we
consider those communities than peer nations anyways?c) Qu bec has
virtually zero international links. Even in Montr al, which likes to
style itself a major player in gaming, AI, and technology, most
peering goes through either Toronto or New York.
That's a problem that we must fix,
regardless of the other problems stated here.
Looking at the submarine cable map, we
see very few international links actually landing in Canada. There is
the Greenland connect which connects Newfoundland to Iceland
through Greenland. There's the EXA which lands in Ireland, the UK
and the US, and Google has the Topaz link on the west
coast. That's about it, and none of those land anywhere near any major
urban center in Qu bec.
We should have a cable running from France up to
Saint-F licien. There should be a cable from Vancouver to
China. Heck, there should be a fiber cable running all the way
from the end of the great lakes through Qu bec, then up around the
northern passage and back down to British Columbia. Those cables are
expensive, and the idea might sound ludicrous, but Russia is actually
planning such a project for 2026. The US has cables running all the
way up (and around!) Alaska, neatly bypassing all of Canada in the
process. We just look ridiculous on that map.
(Addendum: I somehow forgot to talk about Teleglobe here was
founded as publicly owned company in 1950, growing international phone
and (later) data links all over the world. It was privatized by the
conservatives in 1984, along with rails and other "crown
corporations". So that's one major risk to any effort to make public
utilities work properly: some government might be elected and promptly
sell it out to its friends for peanuts.)
Wireless networks
I know most people will have rolled their eyes so far back their heads
have exploded. But I'm not done yet. I want wireless too. And by wireless, I
don't mean a bunch of geeks setting up OpenWRT routers on rooftops. I
tried that, and while it was fun and educational, it didn't scale.
A public networking utility wouldn't be complete without providing
cellular phone service. This involves bidding for frequencies at
the federal level, and deploying a rather large amount of
infrastructure, but it could be a later phase, when the engineers and
politicians have proven their worth.
At least part of the Rogers fiasco would have been averted if such a
decentralized network backend existed. One might even want to argue that a separate
institution should be setup to provide phone services, independently
from the regular wired networking, if only for reliability.
Because remember here: the problem we're trying to solve is not just
technical, it's about political boundaries, centralisation, and
automation. If everything is ran by this one organisation again, we
will have failed.
However, I must admit that phone services is where my ideas fall a
little short. I can't help but think it's also an accessible goal
maybe starting with a virtual operator but it seems slightly
less so than the others, especially considering how closed the phone
ecosystem is.
Counter points
In debating these ideas while writing this article, the following
objections came up.
I don't want the state to control my internet
One legitimate concern I have about the idea of the state running the
internet is the potential it would have to censor or control the
content running over the wires.
But I don't think there is necessarily a direct relationship between
resource ownership and control of content. Sure, China has strong
censorship in place, partly implemented through state-controlled
businesses. But Russia also has strong censorship in place, based on
regulatory tools: they force private service providers to install
back-doors in their networks to control content and surveil their
users.
Besides, the USA have been doing warrantless wiretapping since
at least 2003 (and yes, that's 10 years before the Snowden
revelations) so a commercial internet is no assurance that we have
a free internet. Quite the contrary in fact: if anything, the
commercial internet goes hand in hand with the neo-colonial
internet, just like businesses
did in the "good old colonial days".
Large media companies are the primary
censors of content here. In Canada, the media cartel requested the
first site-blocking order in 2018. The plaintiffs (including
Qu becor, Rogers, and Bell Canada) are both content providers and
internet service providers, an obvious conflict of interest.
Nevertheless, there are some strong arguments against having a
centralised, state-owned monopoly on internet service providers. FDN
makes a good point on this. But this is not what I am suggesting:
at the provincial level, the network would be purely physical, and
regional entities (which could include private companies) would peer
over that physical network, ensuring decentralization. Delegating the
management of that infrastructure to an independent non-profit or
cooperative (but owned by the state) would also ensure some level of
independence.
Isn't the government incompetent and corrupt?
Also known as "private enterprise is better skilled at handling this,
the state can't do anything right"
I don't think this is a "fait accomplit". If anything, I have found
publicly ran utilities to be spectacularly reliable here. I rarely
have trouble with sewage, water, or power, and keep in mind I live in
a city where we receive about 2 meters of snow a year, which tend
to create lots of trouble with power lines. Unless there's a major
weather event, power just runs here.
I think the same can happen with an internet service provider. But it
would certainly need to have higher standards to what we're used to,
because frankly Internet is kind of janky.
A single monopoly will be less reliable
I actually agree with that, but that is not what I am
proposing anyways.
Current commercial or non-profit entities will be free to offer their services on
top of the public network.
And besides, the current "ha! diversity is
great" approach is exactly what we have now, and it's not working.
The pretense that we can have competition over a single network is
what led the US into the ridiculous situation where they also pretend
to have competition over the power utility market. This led to
massive forest fires in California and major power outages in
Texas. It doesn't work.
Wouldn't this create an isolated network?
One theory is that this new network would be so hostile to incumbent
telcos and ISPs that they would simply refuse to network with the
public utility. And while it is true that the telcos currently do
also act as a kind of "tier one" provider in some places, I
strongly feel this is also a problem that needs to be solved,
regardless of ownership of networking infrastructure.
Right now, telcos often hold both ends of the stick: they are the
gateway to users, the "last mile", but they also provide peering to
the larger internet in some locations. In at least one datacenter in
downtown Montr al, I've seen traffic go through Bell Canada that was
not directly targeted at Bell customers. So in effect, they are in a
position of charging twice for the same traffic, and that's not only
ridiculous, it should just be plain illegal.
And besides, this is not a big problem: there are other providers
out there. As bad as the market is in Qu bec, there is still some
diversity in Tier one providers that could allow for some exits to the
wider network (e.g. yes, Cogent is here too).
What about Google and Facebook?
Nationalization of other service
providers like Google and Facebook is out of scope of this discussion.
That said, I am not sure the state should get into the business of
organising the web or providing content services however, but I will
point out it already does do some of that through its own
websites. It should probably keep itself to this, and also
consider providing normal services for people who don't or can't
access the internet.
(And I would also be ready to argue that Google and Facebook already
act as extensions of the state: certainly if Facebook didn't exist,
the CIA or the NSA would like to create it at this point. And Google
has lucrative business with the US department of defense.)
What does not work
So we've seen one thing that could work. Maybe it's too
expensive. Maybe the political will isn't there. Maybe it will
fail. We don't know yet.
But we know what does not work, and it's what we've been doing ever
since the internet has gone commercial.
Legal pressure and regulation
In 1984 (of all years), the US Department of Justice finally
broke up AT&T in half a dozen corporations, after a 10 year legal
battle. Yet a decades later, we're back to only three large
providers doing essentially what AT&T was doing back then, and those
are regional monopolies: AT&T, Verizon, and Lumen (not counting
T-Mobile that is from a different breed). So the legal approach
really didn't work that well, especially considering the political
landscape changed in the US, and the FTC seems perfectly happy to let
those major mergers continue.
In Canada, we never even pretended we would solve this problem at all:
Bell Canada (the literal "father" of AT&T) is in the same
situation now. We have either a regional monopoly (e.g. Videotron for
cable in Qu bec) or an oligopoly (Bell, Rogers, and Telus controlling
more than 90% of the market). Telus does have one competitor in the
west of Canada, Shaw, but Rogers has been trying to buy it
out. The competition bureau seems to have blocked the merger
for now, but it didn't stop other recent mergers like Bell's
acquisition one of its main competitors in Qu bec, eBox.
Regulation doesn't seem capable of ensuring those profitable
corporations provide us with decent pricing, which makes Canada one
of the most expensive countries (research) for mobile data on
the planet. The recent failure of the CRTC to properly protect smaller
providers has even lead to price hikes. Meanwhile the oligopoly
is actually agreeing on their own price hikes therefore becoming
a real cartel, complete with price fixing and reductions in
output.
There are actually regulations in Canada supposed to keep the worst of
the Rogers outage from happening at all. According to CBC:
Under Canadian Radio-television and Telecommunications Commission
(CRTC) rules in place since 2017, telecom networks are supposed to
ensure that cellphones are able to contact 911 even if they do not
have service.
I could personally confirm that my phone couldn't reach 911 services,
because all calls would fail: the problem was that towers were still
up, so your phone wouldn't fall back to alternative service providers
(which could have resolved the issue). I can only speculate as to why
Rogers didn't take cell phone towers out of the network to let phones
work properly for 911 service, but it seems like a dangerous game to
play.
Hilariously, the CRTC itself didn't have a reliable phone service due
to the service outage:
Please note that our phone lines are affected by the Rogers network outage.
Our website is still available: https://crtc.gc.ca/eng/contact/
https://mobile.twitter.com/CRTCeng/status/1545421218534359041
I wonder if they will file a complaint against Rogers themselves about
this. I probably should.
It seems the federal government is thinking more of the same medicine
will fix the problem and has told companies should "help" each other
in an emergency. I doubt this will fix anything, and could
actually make things worse if the competitors actually interoperate
more, as it could cause multi-provider, cascading failures.
Subsidies
The absurd price we pay for data does not actually mean everyone gets
high speed internet at home. Large swathes of the Qu bec countryside
don't get broadband at all, and it can be difficult or expensive, even
in large urban centers like Montr al, to get high speed internet.
That is despite having a series of subsidies that all avoided
investing in our own infrastructure. We had the "fonds de l'autoroute
de l'information", "information highway fund" (site dead since
2003, archive.org link) and "branchez les familles",
"connecting families" (site dead since 2003, archive.org
link)
which subsidized the development of a copper network. In 2014, more of
the same: the federal government poured hundreds of millions of
dollars into a program called connecting Canadians to connect 280
000 households to "high speed internet". And now, the federal and
provincial governments are proudly announcing that "everyone is now
connected to high speed internet", after pouring more than 1.1
billion dollars to connect, guess what, another 380 000 homes, right
in time for the provincial election.
Of course, technically, the deadline won't actually be met
until 2023. Qu bec is a big area to cover, and you can guess what
happens next: the telcos threw up their hand and said some areas just
can't be connected. (Or they connect their CEO but not the poor folks
across the lake.) The story then takes the predictable twist of
giving more money out to billionaires, subsidizing now Musk's
Starlink system to connect those remote areas.
To give a concrete example: a friend who lives about 1000km away from
Montr al, 4km from a small, 2500 habitant village, has recently
got symmetric 100 mbps fiber at home from Telus, thanks to those subsidies. But I can't get that
service in Montr al at all, presumably because Telus and Bell colluded
to split that market. Bell doesn't provide me with such a service
either: they tell me they have "fiber to my neighborhood", and only
offer me a 25/10 mbps ADSL service. (There is Vid otron offering
400mbps, but that's copper cable, again a dead technology, and
asymmetric.)
Conclusion
Remember Chattanooga? Back in 2010, they funded the development of a
fiber network, and now they have deployed a network roughly a
thousand times faster than what we have just funded with a billion
dollars. In 2010, I was paying Bell
Canada
60$/mth for 20mbps and a 125GB cap, and now, I'm still (indirectly)
paying Bell for roughly the same speed (25mbps). Back then, Bell was
throttling their competitors networks until 2009, when they were
forced by the CRTC to stop
throttling. Both
Bell and Vid otron still explicitly forbid you from running your own servers
at home, Vid otron charges prohibitive prices which make it near
impossible for resellers to sell uncapped services. Those companies
are not spurring innovation: they are blocking it.
We have spent all this money for the private sector to build
us a private internet, over decades, without any assurance of quality,
equity or reliability. And while in some locations, ISPs did deploy
fiber to the home, they certainly didn't upgrade their entire network
to follow suit, and even less allowed resellers to compete on that
network.
In 10 years, when 100mbps will be laughable, I bet those service
providers will again punt the ball in the public courtyard and tell us
they don't have the money to upgrade everyone's equipment.
We got screwed. It's time to try something new.
Updates
There was a discussion about this article on Hacker News which
was surprisingly productive. Trigger warning: Hacker News is kind of
right-wing, in case you didn't know.
Since this article was written, at least two more major acquisitions
happened, just in Qu bec:
In the latter case, vMedia was explicitly saying it couldn't grow
because of "lack of access to capital". So basically, we have given
those companies a billion dollars, and they are not using that very
money to buy out their competition. At least we could have given
that money to small players to even out the playing field. But this is
not how that works at all. Also, in a bizarre twist, an "analyst"
believes the acquisition is likely to help Rogers acquire Shaw.
Also, since this article was written, the Washington Post published a
review of a book bringing similar ideas: Internet for the People
The Fight for Our Digital Future, by Ben Tarnoff, at Verso
books. It's short, but even more ambitious than what I am suggesting
in this article, arguing that all big tech companies should be broken
up and better regulated:
He pulls from Ethan Zuckerman s idea of a web that is plural in
purpose that just as pool halls, libraries and churches each have
different norms, purposes and designs, so too should different
places on the internet. To achieve this, Tarnoff wants governments
to pass laws that would make the big platforms unprofitable and, in
their place, fund small-scale, local experiments in social media
design. Instead of having platforms ruled by engagement-maximizing
algorithms, Tarnoff imagines public platforms run by local
librarians that include content from public media.
(Links mine: the Washington Post obviously prefers to not link to the
real web, and instead doesn't link to Zuckerman's site all and
suggests Amazon for the book, in a cynical example.)
And in another example of how the private sector has failed us, there
was recently a fluke in the AMBER alert system where the entire
province was warned about a loose shooter in Saint-Elz ar except
the people in the town, because they have spotty cell phone
coverage. In other words, millions of people received a strongly
toned, "life-threatening", alert for a city sometimes hours away,
except the people most vulnerable to the alert. Not missing a beat,
the CAQ party is promising more of the same medicine again and giving
more money to telcos to fix the problem, suggesting to spend
three billion dollars in private infrastructure.
Wireless networks
I know most people will have rolled their eyes so far back their heads
have exploded. But I'm not done yet. I want wireless too. And by wireless, I
don't mean a bunch of geeks setting up OpenWRT routers on rooftops. I
tried that, and while it was fun and educational, it didn't scale.
A public networking utility wouldn't be complete without providing
cellular phone service. This involves bidding for frequencies at
the federal level, and deploying a rather large amount of
infrastructure, but it could be a later phase, when the engineers and
politicians have proven their worth.
At least part of the Rogers fiasco would have been averted if such a
decentralized network backend existed. One might even want to argue that a separate
institution should be setup to provide phone services, independently
from the regular wired networking, if only for reliability.
Because remember here: the problem we're trying to solve is not just
technical, it's about political boundaries, centralisation, and
automation. If everything is ran by this one organisation again, we
will have failed.
However, I must admit that phone services is where my ideas fall a
little short. I can't help but think it's also an accessible goal
maybe starting with a virtual operator but it seems slightly
less so than the others, especially considering how closed the phone
ecosystem is.
Counter points
In debating these ideas while writing this article, the following
objections came up.
I don't want the state to control my internet
One legitimate concern I have about the idea of the state running the
internet is the potential it would have to censor or control the
content running over the wires.
But I don't think there is necessarily a direct relationship between
resource ownership and control of content. Sure, China has strong
censorship in place, partly implemented through state-controlled
businesses. But Russia also has strong censorship in place, based on
regulatory tools: they force private service providers to install
back-doors in their networks to control content and surveil their
users.
Besides, the USA have been doing warrantless wiretapping since
at least 2003 (and yes, that's 10 years before the Snowden
revelations) so a commercial internet is no assurance that we have
a free internet. Quite the contrary in fact: if anything, the
commercial internet goes hand in hand with the neo-colonial
internet, just like businesses
did in the "good old colonial days".
Large media companies are the primary
censors of content here. In Canada, the media cartel requested the
first site-blocking order in 2018. The plaintiffs (including
Qu becor, Rogers, and Bell Canada) are both content providers and
internet service providers, an obvious conflict of interest.
Nevertheless, there are some strong arguments against having a
centralised, state-owned monopoly on internet service providers. FDN
makes a good point on this. But this is not what I am suggesting:
at the provincial level, the network would be purely physical, and
regional entities (which could include private companies) would peer
over that physical network, ensuring decentralization. Delegating the
management of that infrastructure to an independent non-profit or
cooperative (but owned by the state) would also ensure some level of
independence.
Isn't the government incompetent and corrupt?
Also known as "private enterprise is better skilled at handling this,
the state can't do anything right"
I don't think this is a "fait accomplit". If anything, I have found
publicly ran utilities to be spectacularly reliable here. I rarely
have trouble with sewage, water, or power, and keep in mind I live in
a city where we receive about 2 meters of snow a year, which tend
to create lots of trouble with power lines. Unless there's a major
weather event, power just runs here.
I think the same can happen with an internet service provider. But it
would certainly need to have higher standards to what we're used to,
because frankly Internet is kind of janky.
A single monopoly will be less reliable
I actually agree with that, but that is not what I am
proposing anyways.
Current commercial or non-profit entities will be free to offer their services on
top of the public network.
And besides, the current "ha! diversity is
great" approach is exactly what we have now, and it's not working.
The pretense that we can have competition over a single network is
what led the US into the ridiculous situation where they also pretend
to have competition over the power utility market. This led to
massive forest fires in California and major power outages in
Texas. It doesn't work.
Wouldn't this create an isolated network?
One theory is that this new network would be so hostile to incumbent
telcos and ISPs that they would simply refuse to network with the
public utility. And while it is true that the telcos currently do
also act as a kind of "tier one" provider in some places, I
strongly feel this is also a problem that needs to be solved,
regardless of ownership of networking infrastructure.
Right now, telcos often hold both ends of the stick: they are the
gateway to users, the "last mile", but they also provide peering to
the larger internet in some locations. In at least one datacenter in
downtown Montr al, I've seen traffic go through Bell Canada that was
not directly targeted at Bell customers. So in effect, they are in a
position of charging twice for the same traffic, and that's not only
ridiculous, it should just be plain illegal.
And besides, this is not a big problem: there are other providers
out there. As bad as the market is in Qu bec, there is still some
diversity in Tier one providers that could allow for some exits to the
wider network (e.g. yes, Cogent is here too).
What about Google and Facebook?
Nationalization of other service
providers like Google and Facebook is out of scope of this discussion.
That said, I am not sure the state should get into the business of
organising the web or providing content services however, but I will
point out it already does do some of that through its own
websites. It should probably keep itself to this, and also
consider providing normal services for people who don't or can't
access the internet.
(And I would also be ready to argue that Google and Facebook already
act as extensions of the state: certainly if Facebook didn't exist,
the CIA or the NSA would like to create it at this point. And Google
has lucrative business with the US department of defense.)
I don't want the state to control my internet
One legitimate concern I have about the idea of the state running the
internet is the potential it would have to censor or control the
content running over the wires.
But I don't think there is necessarily a direct relationship between
resource ownership and control of content. Sure, China has strong
censorship in place, partly implemented through state-controlled
businesses. But Russia also has strong censorship in place, based on
regulatory tools: they force private service providers to install
back-doors in their networks to control content and surveil their
users.
Besides, the USA has been doing warrantless wiretapping since at least 2003 (and yes, that's 10 years before the Snowden revelations), so a commercial internet is no assurance that we have a free internet. Quite the contrary, in fact: if anything, the
commercial internet goes hand in hand with the neo-colonial
internet, just like businesses
did in the "good old colonial days".
Large media companies are the primary
censors of content here. In Canada, the media cartel requested the
first site-blocking order in 2018. The plaintiffs (including
Québecor, Rogers, and Bell Canada) are both content providers and
internet service providers, an obvious conflict of interest.
Nevertheless, there are some strong arguments against having a
centralised, state-owned monopoly on internet service providers. FDN
makes a good point on this. But this is not what I am suggesting:
at the provincial level, the network would be purely physical, and
regional entities (which could include private companies) would peer
over that physical network, ensuring decentralization. Delegating the
management of that infrastructure to an independent non-profit or
cooperative (but owned by the state) would also ensure some level of
independence.
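To make the layering concrete, here is a toy sketch of the separation I have in mind; the names and the exact split are mine and purely illustrative:

    # Illustrative only: who owns what in the proposed model.
    physical_layer = {   # owned by the public, purely passive infrastructure
        "owner": "provincial public utility",
        "assets": ["fiber to the home", "inter-city fiber", "wireless spectrum"],
    }
    operator = {         # arm's-length non-profit or cooperative, state-owned
        "role": "lights the fiber and sells wholesale access on equal terms",
    }
    service_providers = [  # anyone can compete at the retail layer
        "municipal ISP", "cooperative ISP", "commercial ISP",
    ]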
Isn't the government incompetent and corrupt?
Also known as "private enterprise is better skilled at handling this,
the state can't do anything right"
I don't think this is a "fait accompli". If anything, I have found publicly run utilities to be spectacularly reliable here. I rarely have trouble with sewage, water, or power, and keep in mind I live in a city that receives about 2 meters of snow a year, which tends to create lots of trouble with power lines. Unless there's a major weather event, power just runs here.
I think the same can happen with an internet service provider. But it would certainly need to be held to higher standards than what we're used to, because frankly the internet is kind of janky.
What does not work
So we've seen one thing that could work. Maybe it's too
expensive. Maybe the political will isn't there. Maybe it will
fail. We don't know yet.
But we know what does not work, and it's what we've been doing ever since the internet went commercial.
Legal pressure and regulation
In 1984 (of all years), the US Department of Justice finally broke up AT&T into seven regional companies, after a 10-year legal battle. Yet decades later, we're back to only three large providers doing essentially what AT&T was doing back then, and those are regional monopolies: AT&T, Verizon, and Lumen (not counting T-Mobile, which is a different breed). So the legal approach really didn't work that well, especially considering the political landscape has changed in the US, and the FTC seems perfectly happy to let those major mergers continue.
In Canada, we never even pretended we would solve this problem at all: Bell Canada (the literal "father" of AT&T) is in the same situation now. We have either a regional monopoly (e.g. Vidéotron for cable in Québec) or an oligopoly (Bell, Rogers, and Telus controlling more than 90% of the market). Telus does have one competitor in western Canada, Shaw, but Rogers has been trying to buy it out. The Competition Bureau seems to have blocked the merger for now, but that didn't stop other recent mergers, like Bell's acquisition of one of its main competitors in Québec, eBox.
Regulation doesn't seem capable of ensuring those profitable corporations provide us with decent pricing, which makes Canada one of the most expensive countries on the planet for mobile data (research). The recent failure of the CRTC to properly protect smaller providers has even led to price hikes. Meanwhile, the oligopoly is agreeing on its own price hikes, effectively becoming a real cartel, complete with price fixing and reductions in output.
There are actually regulations in Canada that are supposed to keep the worst of the Rogers outage from happening at all. According to CBC:
Under Canadian Radio-television and Telecommunications Commission
(CRTC) rules in place since 2017, telecom networks are supposed to
ensure that cellphones are able to contact 911 even if they do not
have service.
I can personally confirm that my phone couldn't reach 911 services, because all calls would fail: the problem was that the towers were still up, so phones wouldn't fall back to alternative service providers (which could have resolved the issue). I can only speculate as to why Rogers didn't take its cell towers off the air so that phones could reach 911 through other networks, but it seems like a dangerous game to play.
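To illustrate the argument (and only that: this is a deliberately simplified model, not actual 3GPP baseband behaviour), here is roughly the decision the phone ends up making:

    # Simplified model of emergency-call fallback during the outage.
    def attempt_911(towers_broadcasting: bool, core_working: bool,
                    other_networks_available: bool) -> str:
        if towers_broadcasting:
            if core_working:
                return "911 call completes on the home network"
            # Radio side looks fine, so the phone stays camped on its home
            # network... and the call just fails, over and over.
            return "call fails; no fallback, because the towers are still up"
        if other_networks_available:
            # Only with the towers off the air does the phone go looking
            # for another network to place an emergency call.
            return "911 call completes through another carrier"
        return "no service at all"

    # The Rogers outage, as described above: towers up, core down.
    print(attempt_911(True, False, True))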
Hilariously, the CRTC itself didn't have reliable phone service due to the outage:
Please note that our phone lines are affected by the Rogers network outage.
Our website is still available: https://crtc.gc.ca/eng/contact/
https://mobile.twitter.com/CRTCeng/status/1545421218534359041
I wonder if they will file a complaint against Rogers themselves about
this. I probably should.
It seems the federal government thinks more of the same medicine will fix the problem and has told companies they should "help" each other in an emergency. I doubt this will fix anything, and it could actually make things worse if the competitors interoperate more, as that could cause multi-provider, cascading failures.
Subsidies
The absurd price we pay for data does not actually mean everyone gets high-speed internet at home. Large swathes of the Québec countryside don't get broadband at all, and it can be difficult or expensive, even in large urban centers like Montréal, to get high-speed internet.
That is despite a series of subsidies that all avoided investing in our own infrastructure. We had the "fonds de l'autoroute de l'information", the "information highway fund" (site dead since 2003, archive.org link), and "branchez les familles", "connecting families" (site dead since 2003, archive.org link), which subsidized the development of a copper network. In 2014, more of the same: the federal government poured hundreds of millions of dollars into a program called Connecting Canadians to connect 280,000 households to "high speed internet". And now, the federal and provincial governments are proudly announcing that "everyone is now connected to high speed internet", after pouring more than 1.1 billion dollars to connect, guess what, another 380,000 homes, just in time for the provincial election.
Of course, technically, the deadline won't actually be met until 2023. Québec is a big area to cover, and you can guess what happens next: the telcos threw up their hands and said some areas just can't be connected. (Or they connect their CEO but not the poor folks across the lake.) The story then takes the predictable twist of handing more money to billionaires, now subsidizing Musk's Starlink system to connect those remote areas.
To give a concrete example: a friend who lives about 1,000 km away from Montréal, 4 km from a small village of 2,500 inhabitants, recently got symmetric 100 Mbps fiber at home from Telus, thanks to those subsidies. But I can't get that service in Montréal at all, presumably because Telus and Bell colluded to split that market. Bell doesn't provide me with such a service either: they tell me they have "fiber to my neighborhood", and only offer me a 25/10 Mbps ADSL service. (There is Vidéotron offering 400 Mbps, but that's over coaxial copper, again a dead-end technology, and asymmetric.)
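A quick back-of-the-envelope comparison shows why the asymmetry matters; the cable plan's upload speed isn't clearly advertised, so the 50 Mbps figure below is my own guess, used only for illustration:

    # How long does it take to push a 100 GB backup offsite on each plan?
    def hours_to_upload(gigabytes: float, upload_mbps: float) -> float:
        return gigabytes * 8000 / upload_mbps / 3600  # GB -> megabits -> hours

    for name, up_mbps in [("Telus fiber, 100/100", 100),
                          ("Bell ADSL, 25/10", 10),
                          ("cable, 400/50 (assumed upload)", 50)]:
        print(f"{name}: {hours_to_upload(100, up_mbps):.1f} hours")
    # fiber: ~2.2 hours; ADSL: ~22 hours; cable: ~4.4 hours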
Conclusion
Remember Chattanooga? Back in 2010, they funded the development of a fiber network, and now they have deployed a network roughly a thousand times faster than what we have just funded with a billion dollars. In 2010, I was paying Bell Canada $60 a month for 20 Mbps with a 125 GB cap, and now I'm still (indirectly) paying Bell for roughly the same speed (25 Mbps). Back then, Bell was throttling its competitors' networks, until the CRTC forced it to stop in 2009. Both Bell and Vidéotron still explicitly forbid you from running your own servers at home, and Vidéotron charges prohibitive prices that make it nearly impossible for resellers to sell uncapped services. Those companies are not spurring innovation: they are blocking it.
We have spent all this money, over decades, for the private sector to build us a private internet, without any assurance of quality, equity or reliability. And while in some locations ISPs did deploy fiber to the home, they certainly didn't upgrade their entire network to follow suit, much less allow resellers to compete on that network.
In 10 years, when 100 Mbps will be laughable, I bet those service providers will again throw the ball back into the public's court and tell us they don't have the money to upgrade everyone's equipment.
We got screwed. It's time to try something new.
Updates
There was a discussion about this article on Hacker News which
was surprisingly productive. Trigger warning: Hacker News is kind of
right-wing, in case you didn't know.
Since this article was written, at least two more major acquisitions happened, just in Québec:
In the latter case, vMedia was explicitly saying it couldn't grow because of "lack of access to capital". So basically, we have given those companies a billion dollars, and they are now using that very money to buy out their competition. At least we could have given that money to small players to level the playing field. But that is not how this works at all. Also, in a bizarre twist, an "analyst" believes the acquisition is likely to help Rogers acquire Shaw.
Also, since this article was written, the Washington Post published a review of a book with similar ideas: Internet for the People: The Fight for Our Digital Future, by Ben Tarnoff, published by Verso Books. It's short, but even more ambitious than what I am suggesting in this article, arguing that all big tech companies should be broken up and better regulated:
He pulls from Ethan Zuckerman's idea of a web that is plural in purpose: just as pool halls, libraries and churches each have different norms, purposes and designs, so too should different places on the internet. To achieve this, Tarnoff wants governments to pass laws that would make the big platforms unprofitable and, in their place, fund small-scale, local experiments in social media design. Instead of having platforms ruled by engagement-maximizing algorithms, Tarnoff imagines public platforms run by local librarians that include content from public media.
(Links mine: the Washington Post obviously prefers not to link to the real web; it doesn't link to Zuckerman's site at all and instead suggests Amazon for the book, in a cynical example.)
And in another example of how the private sector has failed us, there was recently a fluke in the AMBER alert system where the entire province was warned about a shooter on the loose in Saint-Elzéar, except the people in the town, because they have spotty cell phone coverage. In other words, millions of people received a strongly worded, "life-threatening" alert for a town sometimes hours away, while the people most at risk never got it. Not missing a beat, the CAQ party is promising more of the same medicine and more money for the telcos to fix the problem, proposing to spend three billion dollars on private infrastructure.