Mind Matters Reporting on Natural and Artificial Intelligence
Group of friends using smartphones to communicate in social media. Concept of a generation of millennials who are online all the time. Warm hipster filter.

The Social Dilemma: You’re Not the Customer, You’re the Product

A new Netflix documentary explores the techniques used to explore, then shape and sharpen, our attitudes, values, and beliefs

What is truth? This question has likely been pondered by man for as long as man has been able to ponder. How do you know that what you read or hear is true? How do you know that what you think is true? Why is it that people with different worldviews or belief systems can look at the exact same raw objective data and interpret it in radically different ways?

The answers to these questions are important to “know”, insofar as anyone can know anything within a reasonable degree of certainty. However, in our society today, it is becoming more and more difficult to determine what is true––with any degree of certainty.

A recent 90-minute Netflix documentary, The Social Dilemma, casts light on why it is becoming more difficult. It demonstrates the power of social media, not only in our own lives and our children’s lives, but also in our country. The power of the “Like” and “Share” buttons—of demonstrating how much (or how little) we like some image, document, report, news story, or information—goes far beyond what you might think.

Imagine, if you will, that everything you do online is tracked (it is, by the way). The more data you provide to those who are tracking you, the better they can predict what you will like. The better they predict what you like, the longer they are able to keep you engaged and online. The more they keep you engaged and online, the more advertisements you see. The more advertisements you see… the more money you spend on what is advertised.

Social media platforms and large internet companies use computers and artificial intelligence (AI) algorithms to analyze the data that you put into the system. Not just what you click on, but what you look at, how long you look at it, what you like, what you dislike, where you shop, what you shop for, and what opinions you have about politics, religion, cars, clothes, etc. Somewhere out there in that system is a file on you. It knows you. It knows who you are and what you like.
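As a purely illustrative sketch (not any platform’s actual code), the kind of profile-building described above can be imagined as scoring topics by engagement signals such as dwell time and likes. The event data and scoring weights here are hypothetical:

```python
from collections import defaultdict

# Hypothetical engagement events: (topic, seconds_viewed, liked)
events = [
    ("cars", 45, True),
    ("politics", 120, True),
    ("cooking", 5, False),
    ("politics", 90, True),
]

def build_profile(events):
    """Score each topic by dwell time, with a flat bonus for an explicit like."""
    profile = defaultdict(float)
    for topic, seconds, liked in events:
        profile[topic] += seconds + (30 if liked else 0)
    return dict(profile)

profile = build_profile(events)
# The highest-scoring topic is what the system would show more of next.
top_topic = max(profile, key=profile.get)
```

Even this toy version makes the point: you never told the system you care about politics, but your behavior did.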

As one film critic put it,

The commonly held belief that social media companies sell users’ data is quickly cast aside – the data is actually used to create a sophisticated psychological profile of you. What they’re selling is their ability to manipulate you, or as one interviewee puts it: “It’s the gradual, slight, imperceptible change in your own behaviour and perception. It’s the only thing for them to make money from: changing what you do, how you think, who you are.”

Alexi Duggins, “Why The Social Dilemma is the most important documentary of our times” at Independent


That, in and of itself, is scary. But wait, there’s more! When you have an AI-driven algorithm that has learned enough about you to know what you like, it will start showing you more and more things that it thinks you will like. And so, pretty soon, all you see are things that you want to see.

“Well, what’s wrong with that?”, some might ask. The problem is that the algorithm doesn’t know what is true and what is not. It has no truth detector. Fake news is a huge business these days. So if you start liking fake news stories, or even pausing on them to look a bit longer, the AI system knows that. You start getting more and more of them.

Another issue is that your thoughts and feelings about an issue become more and more biased if the only things you see and hear are things the AI thinks you want to see and hear (your filter bubble). You don’t see other sides of the story.

And when you do see other sides to the story, you start to disagree with them more and more. And so opinions get stronger, and divides get wider, and differences go deeper, than they used to. All because that’s what is being fed to you. The information you see is the information that you want to see… or is it?
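The feedback loop behind the filter bubble can be sketched in a few lines. This is a deliberately simple simulation under assumed mechanics (a recommender that ranks by past engagement, and a user who engages with whatever is shown), not a description of any real platform:

```python
def recommend(profile, catalog, k=3):
    """Rank catalog items by the user's past engagement; note: no truth check."""
    return sorted(catalog, key=lambda item: -profile.get(item, 0.0))[:k]

def simulate_feedback_loop(profile, catalog, rounds=10):
    """Each round, the user engages with what is shown, reinforcing its score."""
    for _ in range(rounds):
        for item in recommend(profile, catalog):
            profile[item] = profile.get(item, 0.0) + 1.0  # engagement boost
    return profile

catalog = ["topic_a", "topic_b", "topic_c", "topic_d"]
profile = {"topic_a": 1.0}  # a single early click seeds the bubble
final = simulate_feedback_loop(profile, catalog)
```

After ten rounds, the seeded topic dominates and the last topic was never surfaced at all: one early click was enough to wall it off.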

So, let’s get back to our question. What is truth? I think a case can be made that AI-driven algorithms skew and bias the information you see to the point that you have little chance of forming an “objective” opinion about a given topic without venturing off your chosen platform.

It’s hard enough for us to determine what is true as it is. We certainly don’t need to move further from that ability, yet that seems to be exactly what the AI systems behind social media are bringing about.

So what can you do? There are many things. One thing you have already done—if you’ve read this far—is simply to be aware! Now you know. The immediate (and for some people, scary) thing you can do is to remove yourself from (or at least significantly limit) social media. Turn it off. Disconnect. How, then, do you stay connected to family and friends? I have a suggestion: Go see them. Call them.

And if COVID-19 remains a consideration (and it is at the time of this writing), then use one of the many communication platforms that allows you to see and hear your friend or family member. Also, to do your part in reducing depression and suicide in our youth, severely limit access to social media and smartphones in children and adolescents. I know that sounds harsh, but the health and well-being of our next generation is at stake here.

For more help, and more information about what you can do, visit The Social Dilemma’s Take Action page.


Further reading:

New AI can create—and detect—fake news. But how good is it at either task? We tested some copy.

Are you trapped in a news bubble? The news filtered to you might leave out important things you need to know. But how can you tell?

and

Escaping the news filter bubble: Three simple tips Spoiler: Reduce the amount of information big providers have about YOU.


James C. Patterson II

James C Patterson II is the Chairman of the Department of Psychiatry and Behavioral Medicine at LSU Health Shreveport. Dr. Patterson received his undergraduate degree in biology from Lamar University followed by his combined MD and PhD in Neuroscience from the University of Texas Medical Branch in Galveston, where he also completed his Internship and Residency in Psychiatry. He completed a Psychiatry/Functional Neuroimaging Fellowship at the National Institute of Mental Health in Bethesda, Maryland. He is Board Certified by the American Board of Psychiatry and Neurology. He and his wife of many years have two children, two grandchildren, three cats, and two dogs. He has multiple hobbies including science apologetics, carpentry, landscaping, and computers.
