
The Dark Art of Online “Nudging”: How to Protect Yourself

Organizations of all kinds use psychological tricks to move our minds as we browse — but a handy acronym helps detect them

Beware of Internet FORCES aiming to change your mind or direct your decisions! That acronym, coined by behavioral scientist Patrick Fagan, helps people know when they’re being “nudged.”

The Behavioral Science of Nudging

A “nudge” is a technique for changing people’s thoughts and decisions in a predictable way. It doesn’t appeal directly to a person’s economic and practical interests in terms of features, functions, benefits, and prices. Instead, nudging triggers mental processes that favor a particular outcome. Unlike, say, propaganda, nudging is subtle: it presses an individual toward a certain decision without overtly blocking the other options.

What keeps us clicking, clicking…

In “Clicks and Tricks: The Dark Art of Online Persuasion” (ScienceDirect, Aug. 2024), Fagan scoured the scholarly research on online “digital nudging” to boil it down into a short, user-friendly article. There he introduced his clever acronym, FORCES, which stands for: Frame, Obstruct, Ruse, Compel, Entangle, Seduce. Let’s unpack the way these strategies can manipulate us online.

     Frame refers to presenting information in a way that biases the observer’s choice. Framing would include fake reviews and endorsements, and urgent but vague claims that a product is scarce or in high demand. Framing also makes one option stand out visually above alternatives. “Confirmshaming,” for example, makes you feel bad about yourself if you choose certain options. We’ve all seen options displayed as two buttons on the screen, such as:

○ “Yes, I want to find the love of my life”

○ “No, I want to drift through life frustrated and alone”
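
For readers curious what this looks like under the hood, here is a minimal TypeScript sketch of a hypothetical confirmshaming dialog. The styling values and button copy are invented for illustration, but the pattern — one visually dominant choice, one faint and shame-worded — is the essence of this kind of framing:

```typescript
// Hypothetical confirmshaming dialog: the "yes" button is styled to dominate,
// while the decline option is faint, small, and worded to embarrass the user.
function showSignupNudge(): void {
  const dialog = document.createElement("div");

  const accept = document.createElement("button");
  accept.textContent = "Yes, I want to find the love of my life";
  accept.style.cssText =
    "font-size: 1.2em; padding: 12px 24px; background: #2e7d32; color: #fff;";

  const decline = document.createElement("button");
  decline.textContent = "No, I want to drift through life frustrated and alone";
  // Rendered as faint, borderless small print -- technically a choice,
  // but framed so that clicking it feels like a self-indictment.
  decline.style.cssText =
    "font-size: 0.7em; background: none; border: none; color: #bbb;";

  dialog.append(accept, decline);
  document.body.append(dialog);
}
```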

     Obstruct refers to making it harder for users to do what they initially intended to do. Examples include (see the sketch after this list):

• Making it simple to subscribe or get access but then difficult or impossible to unsubscribe or log out

• Imposing time delays to slow people from unsubscribing or deleting an account

• Obscuring or hiding information such as pricing and automatic renewals

• Burdening the user with extra navigation steps to find specific pages and policies
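
What does obstruction look like in code? Here is a minimal, hypothetical sketch — the endpoints, function names, and ten-second delay are all invented — contrasting a one-step subscribe flow with a cancellation flow padded with detours and an artificial wait:

```typescript
// Hypothetical stub: pretend this navigates to a page and waits for the user.
async function showPage(url: string): Promise<void> {
  console.log("navigating to", url);
}

// Subscribing takes a single request -- friction-free by design.
async function subscribe(email: string): Promise<void> {
  await fetch("/api/subscribe", { method: "POST", body: email });
}

// Cancelling is deliberately padded with extra steps and delay.
async function cancelSubscription(userId: string): Promise<void> {
  await showPage("/account/cancel/survey"); // step 1: retention survey
  await showPage("/account/cancel/offers"); // step 2: refuse "special offers"
  // Step 3: an artificial ten-second wait before the final button activates.
  await new Promise((resolve) => setTimeout(resolve, 10_000));
  await fetch(`/api/cancel/${userId}`, { method: "POST" });
}
```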

     Ruse refers to tricking users into making a choice they didn’t intend. This tactic appears everywhere online, as “Clicks and Tricks” describes:

• Sponsored ads disguised as normal content

• Ads with a delayed appearance, so that users accidentally click on them when they meant to click on something else

• Disguised ads, such as when a “download here” button redirects to another sales page

• Ambiguous information that causes users to get an outcome different from what they expected

• Bait and switch, where the user sets out to do one thing but something else happens instead

• Trick questions, such as a list of checkboxes where the options are not in a natural order

• Distraction, such as focusing attention on one element so the user doesn’t see the small opt-out checkbox

• Hidden costs, like delivery fees added to the basket at the end
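
That last item, hidden costs, is simple to express in code. In this hypothetical sketch, the total shown while browsing omits a delivery fee that appears only in the amount actually charged at the final step:

```typescript
// Hypothetical hidden-cost pattern: the running cart total omits a fee
// that is quietly added only at the final checkout screen.
interface CartItem {
  name: string;
  price: number;
}

function displayedTotal(items: CartItem[]): number {
  // What the user sees while shopping and adding items.
  return items.reduce((sum, item) => sum + item.price, 0);
}

function finalCharge(items: CartItem[]): number {
  const DELIVERY_FEE = 7.99; // surfaced only on the last screen
  return displayedTotal(items) + DELIVERY_FEE;
}
```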

We see the obstruct and ruse methods combined when a radio advertisement sends you to an online order page for your “free” one-time trial, and the page asks for a credit card number “to cover shipping costs.” Somewhere in small print there is text saying you will be billed for a continuing subscription or repeated shipments unless you intentionally cancel after you’ve received the first delivery. If you later remember to cancel, you return to the website only to struggle to find the cancellation option.

     Compel refers to demanding, shoehorning, or practically forcing users to do something they didn’t necessarily want to do. For instance, with the common “forced continuity” tactic, automatic recurring credit card charges that you didn’t want are baked into the ordering process. Other examples include defaults and pre-selected options that appear on an online form.
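
A pre-selected default takes only a line of code. In this hypothetical sketch (the form field and wording are invented), the auto-renewal checkbox arrives already checked, so the user’s inaction becomes consent:

```typescript
// Hypothetical checkout form fragment: the recurring-billing option is
// pre-checked, turning inaction into agreement ("forced continuity").
function buildAutoRenewOption(): HTMLLabelElement {
  const checkbox = document.createElement("input");
  checkbox.type = "checkbox";
  checkbox.name = "autoRenew";
  checkbox.checked = true; // the nudge: the user must notice this and opt out

  const label = document.createElement("label");
  label.append(checkbox, " Renew my subscription automatically each month");
  return label;
}
```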

Gamers experience “grinding,” a form of compulsion in which the application requires them to repeat the same steps to win additional badges and rewards. Nagging reminders in apps and on web pages, often appearing as pop-ups or service interruptions, apply pressure to buy premium versions of services. The unpleasant terms “Privacy Zuckering” and “Contact Zuckering” point to the widespread tactic of tricking users into sharing their personal data and address book contacts when asked to “just use your Facebook or Gmail account to log in.”

     Entangle refers to how pages and apps keep users occupied for longer than they might have intended. Methods include: never-ending autoplay of videos, infinite scrolling that presents new content at the bottom of the feed, and repeating “notifications” that can be paused but never stopped.
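
Infinite scrolling, for instance, is easy to build with the browser’s standard IntersectionObserver API. In this minimal sketch (the /api/feed endpoint and element IDs are hypothetical), every time the user nears the bottom of the feed, more content silently arrives, so there is never a natural stopping point:

```typescript
// Minimal infinite-scroll sketch: when the invisible "sentinel" element at
// the bottom of the feed scrolls into view, fetch and append another page.
const feed = document.getElementById("feed")!;
const sentinel = document.getElementById("sentinel")!;
let page = 0;

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  const response = await fetch(`/api/feed?page=${page++}`);
  const posts: string[] = await response.json();
  for (const text of posts) {
    const item = document.createElement("article");
    item.textContent = text;
    feed.append(item); // the bottom of the feed keeps receding
  }
});
observer.observe(sentinel);
```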

     Seduce refers to engaging users emotionally instead of rationally. “Clicks and Tricks” concisely describes research findings showing that “one of the biggest predictors” of a message going viral “is emotional arousal.” One study examined “influencer tweets” and found “sharing is higher if emotions are more prevalent than argument quality.” Predictably, too, “News is similarly more likely to be shared if the headline uses surprise and exclamation marks.”

A nudge in the right direction

“Clicks and Tricks” drew from behavioral research to identify an important clue about what keeps us clicking:

Within emotions, there is a strong negativity bias. An investigation of 51 million tweets about current affairs … found that message virality is driven by three things: negativity, causal arguments [e.g., blaming someone], and threats to personal or societal values.

Nudging is not limited to websites selling products or services and memes getting mass-forwarded worldwide. Fagan and co-author Laura Dodsworth describe in their book, Free Your Mind (HarperCollins 2024), how government agencies have expressly used these techniques to promote ideas and policies among populations. Quite openly, in 2010 the UK government created an agency popularly called the Nudge Unit, which aims to influence public views and get people to comply happily with government directives.

     The FORCES acronym helps us sense when a website, app, news report, or video source is nudging us to think and act in ways that perhaps we hadn’t expected. Knowing when someone is trying to manipulate us is half the battle.


Richard Stevens

Fellow, Walter Bradley Center on Natural and Artificial Intelligence
Richard W. Stevens is a retiring lawyer, author, and a Fellow of Discovery Institute's Walter Bradley Center on Natural and Artificial Intelligence. He has written extensively on how code and software systems evidence intelligent design in biological systems. Holding degrees in computer science (UCSD) and law (USD), Richard practiced civil and administrative law litigation in California and Washington, D.C., taught legal research and writing at George Washington University and George Mason University law schools, and specialized in writing dispositive motion and appellate briefs. Author or co-author of four books, he has written numerous articles and spoken on subjects including intelligent design, artificial and human intelligence, economics, the Bill of Rights, and Christian apologetics. Available now at Amazon is his fifth book, Investigation Defense: What to Do When They Question You (2024).
