Data science (done wrong) in big tech can erode humanity's individuality

June 1, 2024

I’ve been reading The Myth of Normal by Gabor Maté. It’s a long, fairly heavy book about trauma, social connectivity and our emotional wellbeing in today’s world. I didn’t expect it to hit me right in the career feels. One particular chapter hit very hard in relation to a lot of the work we do in corporate (product) data science. Today’s interconnected world has uncovered a particular weakness in humans - our desire to fit in - and data science takes advantage of this to the detriment of humankind.

We are a social species. To survive as a whole, we need to behave according to society’s needs. On the whole, this has historically been a good thing. It has kept us from behaving in antisocial ways. It has kept us from hurting other people. However, the world has become more interconnected than we were prepared for. Instead of feeling social pressure to adhere to the needs of a small tribe, we now pick up these needs from far too many sources. Society - all of which is accessible on our phones - imbues us with expectations of our behaviour, slowly eroding what makes us individuals. Our need to fit in drives us to Botox our faces and to work in jobs we are not comfortable with, just to make enough money to afford that new Tesla.

The trouble is that what’s considered normal in society isn’t necessarily good for the individual. As a personal example: I am a fairly shy introvert type, but due to success in previous roles, society (and past managers) has tried to push me into management positions for more money and more status. For a while, they succeeded - and I took up management - but I quickly realised this was terrible for my mental health, at least in the big corporate environment.

But if we stay in those roles, we change. As Maté quotes one of his friends, “[my life was] a sham, an illusion, a fake… there was virtually none of me in it.” At its worst, society rewards us for it. We’re taught that money and power - to buy possessions and experiences - are paramount. None of us, as children, wanted to run a pharmaceutical company that prioritises profits over saving lives. Yet high-ups in these corporations are compensated very well for doing work that is no doubt in opposition to their core values.

Maté gives two very relevant examples of the social character imbued in us by today’s culture: a drift away from our true selves, and a compulsion to keep consuming. Both are things we take advantage of as data scientists.

Obviously, these are both bad for us as individuals. They put us on the hedonic treadmill, pull us out of the moment, and make us behave in ways that just aren’t us. Mostly, they just make us unhappy, because we are not true to ourselves and are always looking for more. But they are very good for business.

As far back as 1948, Thomas Merton wrote this passage, which is still frighteningly accurate today: “We live in a society whose whole policy is to excite every nerve in the human body and keep it at the highest pitch of artificial tension, to strain every human desire to the limit and to create as many new desires and synthetic passions as possible, in order to cater to them with the products of our factories and printing presses and movie studios and all the rest.”

By now, you can see where this is going from a data science perspective. To further our company’s bottom line:

We separate people from themselves, through optimising for engagement

Clustering and recommender algorithms nudge people into group personalities and points of view. YouTube, Instagram and Facebook, for instance, are all built around grabbing your attention by pushing your interests towards those of a cluster of individuals rather than towards your unique self. This way, they can maximise engagement. Instagram, for example, now heavily prioritises influencers and sponsored content over your own friends’ posts. Worse, some platforms literally push misinformation and harmful content in the name of engagement.
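To make the mechanism concrete, here is a minimal, hypothetical sketch in Python - not any platform’s actual system, and every name and number here is invented - of what “optimising for engagement via clusters” can look like: users are grouped with k-means, and content is then ranked by how well it matches the user’s cluster rather than the user themselves.

```python
# A hypothetical, simplified sketch of engagement-first recommendation.
# All data and names are invented for illustration; real systems are far more complex.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy data: per-user interest vectors and per-item topic vectors.
user_interests = rng.random((1000, 8))   # 1000 users, 8 interest dimensions
item_topics = rng.random((200, 8))       # 200 pieces of content

# Step 1: collapse individuals into a handful of cluster "personalities".
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit(user_interests)

def recommend(user_id: int, top_k: int = 10) -> np.ndarray:
    """Rank items by how well they match the user's *cluster*, not the user."""
    centroid = clusters.cluster_centers_[clusters.labels_[user_id]]
    # Engagement proxy: similarity to the cluster centroid.
    # The user's own interest vector never enters the score - that's the point.
    scores = item_topics @ centroid
    return np.argsort(scores)[::-1][:top_k]

print(recommend(user_id=42))
```

The design choice worth noticing is that the individual only matters insofar as they can be assigned to a group; once assigned, the group’s preferences do the rest.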

The result is that we now doom scroll through endless content telling us that other people are happier. That some people - the other side - are trying to take away our livelihood. That other people are more successful than us. That we’re not fitting in because they’re wearing ankle socks. That because we don’t fit in, we’re not enough for others. And so, we are pushed to:

Consume, consume, consume, through conversion optimisation

We make people feel the need to buy something, whilst also feeling lesser for not having what others do - whether experiences or possessions. Have you ever spent a typical day with your adblocker turned off and counted the things that are advertised to you? It’s scary. This consumption includes buying ideas, too. Canada, for instance, is currently obsessed with the idea that immigrants are driving housing prices up. To an extent that’s a mathematical fact that can’t be argued with, true, but the topic is literally all you see on social media. It has us all riled up, and the main driver behind it is capturing our advertising money (on news sites) and our votes in the next election, rather than the good of humanity.
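Again purely as an illustration - the features, model and data below are all invented - a conversion-optimisation loop can be as simple as this: fit a model that predicts purchase probability, then always serve whichever ad that model scores highest. Nothing in the objective asks whether the purchase actually benefits the person seeing the ad.

```python
# A hypothetical, simplified sketch of conversion optimisation.
# The features, model, and data are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Toy training data: 3 user features + 3 ad-creative features per row,
# with a binary label for whether a purchase followed.
X = rng.random((5000, 6))
y = (X[:, 1] + X[:, 4] + rng.normal(0, 0.3, 5000) > 1.2).astype(int)

conversion_model = LogisticRegression().fit(X, y)

def pick_ad(user_features: np.ndarray, candidate_ads: np.ndarray) -> int:
    """Serve whichever ad maximises predicted purchase probability.

    user_features: shape (3,); candidate_ads: shape (n_ads, 3).
    The objective is purely "will they buy?", never "should they buy?".
    """
    rows = np.hstack([np.tile(user_features, (len(candidate_ads), 1)), candidate_ads])
    p_convert = conversion_model.predict_proba(rows)[:, 1]
    return int(np.argmax(p_convert))

# Example: choose among 20 candidate ads for one user (all toy data).
user = rng.random(3)
ads = rng.random((20, 3))
print(pick_ad(user, ads))
```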

Of course, this is all just capitalism. Similarly, pushing for engagement and conversion is also just doing your job. My intention here is to ask you, when building a new algorithm or running a new analysis, to think about the consequences: are we as data scientists truly making our customers happy and fulfilled, or are we pushing our algorithms to the limit to create shallow, engaged, converting robots? Are we making people unhappy?

© 2024 Ryan Anderson