2019 has been a distinctly dramatic year for Facebook. Since January, the social media behemoth has been hit with a $5 billion fine for privacy violations and remains embroiled in U.S. antitrust investigations. In June, the company announced Libra, its very own cryptocurrency, sparking criticism and speculation around the world.
Amidst all this hubbub, you may have missed that Facebook has also begun using artificial intelligence to map most of the population of the African continent. Facebook researchers combined computer vision techniques, population data, and high-resolution satellite imagery to search for built-up structures across the continent. They then created population density maps based on the number of buildings they observed.
Facebook’s Connectivity Lab has already released similar population maps for 22 countries, including Malawi, South Africa, Ghana, Haiti, and Sri Lanka, but this is its first continent-wide effort. Eventually the company plans to map population density around the world.
Facebook positions its map-making as a humanitarian effort, emphasizing how the data (which is freely available to everyone) will enable aid agencies to “determine how populations are distributed even in remote areas, so that health care workers can better reach households and relief workers can better distribute aid.” It’s rhetoric that sounds rather similar to how Facebook is pitching the Libra cryptocurrency, which it claims will help poor people access financial services. Both Facebook’s altruistic-sounding maps and Libra invite the same key question: What’s in it for Facebook?
When Facebook first announced its mapping program in a February 2016 blog post, it didn’t use the word “humanitarian” at all. Instead, it described how the company was making better maps to “connect the unconnected and underserved in the world,” and how “accurate knowledge about the population distribution” was at the core of its efforts to get more people onto the internet. The map-making project was presented as a component of the company’s Internet.org project, a 2013 plan to get people around the world online, initially by partnering with telecom operators to offer internet services to people in developing markets via a Facebook-controlled mobile application, later rebranded as Free Basics.
Mark Zuckerberg pitched it as a humanitarian effort to extend internet connectivity to everyone, but it was met with considerable skepticism from the get-go. Researchers from the media organization Global Voices criticized Free Basics from multiple angles, observing that the service featured little content produced in-country, pushed users to sign up for Facebook, and constantly collected metadata about everyone who used it, including those without Facebook accounts. By February 2016 — the same month that the mapping program was announced — India’s telecom authorities had blocked Facebook’s Free Basics app in the name of net neutrality.
India’s decision proved to be a serious blow to the service. Although Free Basics is still used by millions around the world, it has “quietly ended” in numerous countries since 2016, according to an investigation by the Outline in 2018. Facebook has largely stopped talking about the service in public. Perhaps it’s recognized that public opinion on the intrinsic goodness of being online has changed a lot since 2016. But that doesn’t mean the company has stopped working toward its larger goal of connecting more people around the world to the internet, on terms friendly to Facebook. Facebook’s population density maps seem to be part of that strategy.
Don’t get me wrong: The maps are useful to some extent for disaster responders (at least according to the ones I’ve asked). But in the process of creating these maps, Facebook is also accumulating oodles of new spatial information that will eventually help it get millions more people onto Facebook and its associated services. The maps contain general population density information, not granular data about specific individuals. But they’re a necessary step in the process of getting more people onto Facebook — and by claiming that they’re making the maps for humanitarian reasons, Facebook is able to better shield itself from criticism of its overall data-hungry business strategy.
This is just one more example of how Facebook engages in “digital colonialism” — a framework for thinking about tech companies’ endless appetite for data about where we are, who we are, and what we’re doing, particularly when it comes to their practices in countries outside of the United States and Europe. Renata Avila of the World Wide Web Foundation defines digital colonialism as the “new deployment of a quasi-imperial power over a vast number of people, without their explicit consent, manifested in rules, designs, languages, cultures, and belief systems by a vastly dominant power.” Knowledge Commons Brasil, a digital research group, describes data colonialism as the “increasing economic, cultural, and social hegemony exercised through the internet by the Global North over Southern countries.”
Researchers Nick Couldry and Ulises A. Mejias, in a recent paper, write that technology companies engage in “data relations,” which turn our daily lives into a highly profitable “data stream.” This process enacts “a new form of data colonialism, normalizing the exploitation of human beings through data, just as historic colonialism appropriated territory and resources and ruled subjects for profit.” Ultimately, they believe, data colonialism “paves the way for a new stage of capitalism whose outlines we only glimpse: the capitalization of life without limit.”
So, uh, in this metaphor we’re all trees and Facebook is the chainsaw. (If you catch my drift.)
The overall point is this: Data-colonizing companies view human beings and societies as raw material — living mines or oil reserves of sorts — from which valuable information can be extracted and appropriated, in ways that are often distressingly similar to how history’s colonizers exploited the people who originally owned the land. By comparing today’s data colonialism to the territorial colonialism of the past, we can better contextualize (and resist) the digital power grab happening around the world today.
Under this line of criticism, projects like Facebook’s Internet.org and Google’s Project Loon (another effort to connect people in remote areas to the internet) are seen as nice-sounding means of gaining access to billions more data-generating people under the auspices of charitable, humanitarian-minded action. While we’re still learning about how Facebook’s Libra will work, we do know that the new currency will (if all goes according to plan) be another means by which Facebook can access information on millions of people’s financial transactions, often those carried out by people in poorer economies who may not be able to access banking services in any other way.
Again, a reasonable person could conclude that this is all pretty much a fair trade, that people are getting something valuable in exchange for permitting companies access to their personal data. But the rhetoric is eerily familiar. English colonizers attempted to justify their violent capture of other people’s land with elevated language, claiming their efforts would civilize “backward cultures.” Today, Facebook, Google, and the like use similar (if less overt) language to claim that pushing people toward their services is for their own good.
It’s also the case that this exchange of control over our own data for access to tech services — even if we accept it ourselves, or if political leaders accept it on our behalf — probably won’t end there. The history of colonialism features many examples of colonizers breaking promises or ripping up agreements as soon as they felt confident that the colonized could not meaningfully stop them. So too do Facebook, Google, and other tech companies repeatedly break their passionately stated promises regarding privacy and ethics when they feel they can get away with it. And they largely can get away with it, as we’ve seen over and over again, especially in less wealthy and well-connected nations. Consider Facebook’s decision in 2017 to roll out an experimental and unannounced change in how it displayed news stories in six countries. The decision blindsided news organizations, which found themselves with plummeting readership and little insight into why the decision had been made in the first place.
While Facebook did reverse this experiment after public outcry, the fact that it had the power to meaningfully damage the media in six sovereign nations is disturbing. And there’s not much stopping Facebook from doing the same thing elsewhere if it feels like it. In the absence of meaningful regulations, there’s little incentive for data colonizers not to come up with ever more innovative (and creepy) ways to use the data they hold, to push the boundaries of their original promises or agreements with their customers — and indeed, that’s what their shareholders demand. This is particularly disquieting when one considers that many of the places that Facebook and Google and their ilk want to connect have no specific data protection laws in place at this time (including, incidentally, the majority of African nations).
What will happen if Facebook pulls this off, if it gets its way and becomes the planet’s biggest and most inescapable information superpower? Well, we’ll be living in a much nastier world, one where we’ve lost control of both our personal information (from medical records to embarrassing late-night internet habits) and our political sovereignty. We shouldn’t stand for this — we can’t stand for this — and that’s the case even if the companies attempting to bring this world into being claim that they’re doing it for nice, humanitarian reasons, for our own good.
I’m painting a bleak picture here, and I’d be lying if I said I felt particularly optimistic about our chances of protecting the world’s most vulnerable from being subsumed by data-colonizing forces. But we have to try to fight back — and there are some things we can do to resist data colonialism. First, we must admit that we can’t personal-choice our way out of this situation. We are, as individuals, pretty much irrelevant to Facebook and Google: A few of us grandly pronouncing that we’re going to stop using the tech giants’ platforms doesn’t bother them in the slightest, because there are only a few of us and they’ll find ways to get our data anyway.
Instead, we need governments and regulators to accept the gravity of the situation and to impose penalties on Facebook and others that actually hurt; the record-breaking FTC fine imposed on Facebook this July is a good start. Second, we need to stop giving tech companies the benefit of the doubt when they say that they’re collecting data and creating new services for altruistic, humanitarian purposes. This doesn’t mean we shouldn’t use their data and services at all, but we should understand the privacy trade-offs: In other words, beware a Facebook that comes bearing gifts.
Finally, we need to extend our criticism beyond Facebook and Google themselves, to the business model that drives the entire data economy, and that motivates these companies to learn ever more about us. As Shoshana Zuboff points out in The Age of Surveillance Capitalism, Facebook and other tech companies are inherently dependent upon collecting as much data about people as they can, and upon developing ever more sophisticated means of using that data to manipulate our behavior. If they don’t surveil you and me and everyone we know, they’ll stop making money: They won’t exist. If they can’t find a way to survive without colonizing our data, then perhaps they shouldn’t survive at all.