We Don’t Need You

Ostav Nadezhdu
6 min read · Feb 6, 2020

And we don’t need your data.

[Image: a man in a desert. Caption: “tag yourself”]

Fearmongering is a niche industry but a very profitable one. It’s easy to write up a report on how much data Google is collecting on you and pass it around to a bunch of unaware proles, combined with a few dirty insinuations and hypothetical questions, and get a mob of scared people who (1) no longer trust Google and (2) suddenly trust in you. In fact it’s so profitable that it’s one of the oldest acts of social organizing, going as far back as the serpent in Eden whispering to Eve that perhaps God forbade the fruit of the tree of knowledge because He was hiding something from her. So to apply the same tactic to the issue of mass data collection is new wine in a very old wineskin, and it’s not surprising that not everyone is convinced. Just as quickly as the first, up springs a second batch of grifters to sneer away the first. “Um, ackshually, your data is totally anonymized and only used for advertising services.” Chances are you’ve had the opportunity to meet both of these camps in the wild, and heard the story from both sides.

The truth is both better and worse than you’ve been led to believe.

I’ll never endorse anything that comes out of a marketing department, mostly because I’ve spent too much time in one myself, but I also won’t feed hysteria. So yes, most/all of your data is anonymized when companies collect it, and it’s virtually all used for advertising, or else to target you with products/services/webpages more accurately. Depending on your philosophy this may or may not be a good thing, but it’s not an existential threat. Which makes sense, because most companies are not very concerned with you. In fact, you are remarkably valueless to Google et al.

Take Facebook, for example. In 2019, FB made $21.08B in revenue with 2.5B users. Discounting all the other ways they could make money, this comes out to a maximum of $8.43 per user. And data isn’t a liquid asset: most personal data sells for pennies per identity on the black market. A credit card number, verified, laundered and guaranteed still good, might fetch $10–20. The value of data only comes into play once you have millions of samples. The whole is greater than the sum of its parts: a single data point doesn’t give you much insight into anything beyond that one specific person’s quirks, but a million data points allow you to do statistical analysis, break out trends, run funnel analytics and even train machine learning algorithms. Collect more data, and the new data will compound the value of the old. This is the power of big tech: not in what kind of data they can scrape from you, but in how many people they can reach.
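Those numbers are easy to sanity-check yourself. A quick back-of-the-envelope sketch, using the revenue and user figures quoted above (and generously pretending every dollar came from user data):

```python
# Back-of-the-envelope: Facebook's 2019 revenue spread across its user base.
# Figures are the ones quoted in the text. Attributing ALL revenue to data
# makes this an upper bound on what any one user's data could be worth.
revenue_usd = 21.08e9   # 2019 revenue, dollars
users = 2.5e9           # user count

per_user = revenue_usd / users
print(f"${per_user:.2f} per user per year")  # → $8.43
```

Even under that maximally generous assumption, a single user is worth less than a movie ticket per year.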

You, individually, don’t matter. Your data contributes virtually nothing to Google’s algorithms. Nobody knows you exist, and they wouldn’t notice if you disappeared tomorrow. Only autonomic minds ever come into contact with you directly; humans just see the aggregations of thousands or millions of users at once. The biggest hurdle to overcome in understanding big data is the demarcation between statistics and individuals. Stalin knew this; it’s why he was able to purge so many. Rather than targeting individuals, he targeted groups. “So then shouldn’t we take care to recognize their humanity, so that people can’t get away with those kinds of purges?” What you should do is understand the relationship between groups of people and the people who comprise those groups: they are not the same. Consider the classic psychological example of the angry mob. Perhaps none of the people in the mob are murderous, but the group of them can still be itself murderous.
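That statistics-versus-individuals gap is easy to demonstrate concretely. A toy sketch (simulated data, not any real dataset): with a million samples, deleting any one person barely moves the aggregate.

```python
import random

random.seed(0)
# A million simulated "users", each contributing one data point
# (say, minutes spent on a site, normally distributed).
data = [random.gauss(50, 10) for _ in range(1_000_000)]

mean_all = sum(data) / len(data)
# Now you "disappear": drop one user entirely.
mean_without_you = sum(data[1:]) / (len(data) - 1)

# The shift is roughly (mean - your value) / N: a few hundred-thousandths.
print(abs(mean_all - mean_without_you))
```

The trendline the algorithm sees is effectively identical with or without you, which is exactly why no human ever needs to look at you in particular.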

[Image caption: “this is also a desert”]

Usually when we talk about data collection and surveillance, we break users down into two major categories: abnormal (outliers, unusual behavior) or normal (follow trends, blend in). It’s pretty obvious why abnormal users would want privacy: as outliers, they call attention to themselves. They stand out among all the trend followers and are the most likely to be investigated by any curious or suspicious data scientist. If you’re doing something illegal, subversive, anticorporate or just plain embarrassing, you have a vested interest in controlling what data about you can be harvested. “So if I’m normal, I’m fine,” you say. Not so fast. Normals have good reason to care about their data, too.

I recently read a short story that talked about a mythological computer virus released into the net by a Moreauvian tech lab, a virus which would infect your machines and perform the ultimate man-in-the-middle attack. It would begin editing and controlling all the information crossing your screen in or out, becoming Dante’s cyberdemon, manipulating your digital reality to keep you so engrossed in the virtual world that you would eventually die of starvation. A recent horror game did much the same thing in reverse, where the cyberdemon would fake its victims' social media to keep up the appearance of being alive even after it killed them irl. These stories are just spooky stories — I don’t think there will be Deceivers trawling the net for victims any time soon. But it’s significant that these stories exist at all, and that they resonate with people. We find these things scary because we see how they could happen, if those demons were real. We intuitively understand the significance of our online experience.

Our cyberbodies are an important part of us, and unless you’re totally unplugged, you interact with the online world, and it interacts with you. When Google plugs your data into its algorithm and plots you on a trendline, it begins to treat you differently: as a member of that trend, rather than as an isolated incident. You will get different ads, yes, but that’s just the part Google cares most about. You also get different search results, news headlines, social media posts, video recommendations…. Tech companies don’t care about these features as much; they just use them to make marginal improvements that keep you engaged on their websites longer. But to you, these features comprise most of your online experience, and they are informed by the data harvested from you.

Google starts serving you headlines that keep you engaged rather than informed. “That’s fine, I only engage with content that informs me anyway.” No you don’t, but let’s pretend for the sake of argument that that’s true. Can you say the same for the rest of your cohort? Because Google doesn’t actually serve you headlines that keep you engaged; it serves headlines that keep its estimated model of people like you engaged. As a member of a trendline, you are not the target audience: your trend is. As a member of a trendline, you are being handled. This is also true on Facebook, Twitter, YouTube, and anywhere else that uses user data to improve your experience. And the script that they follow has been designed to placate you, to slip by unnoticed, to make you a happy customer.

Maybe this doesn’t bother you, but it bothers me. I don’t want to be handled. I don’t want to feel cozy with a multinational cybercorp. I’d rather keep an edge and run into things that bother me. I’d rather my digital environs be a bit more hostile, to be completely honest. That’s how I’ve done the most learning and growing online to date: not by sitting in a trendline, but by traversing them.
