User looking down at smartphone screen (Photo by SHTTEFAN on Unsplash)

Your Phone Is Selling Your Secrets

You’d be shocked to know what it tells people who want your money

If you visit an emergency room, you may generate a wealth of data you never knew existed, even before you walk in the door. As one doctor explains:

As it is, there are companies with established digital geofencing around hospital perimeters who can capture entry of a mobile phone onto the premises. In so doing, they initiate a cascade of events that allows marketing agencies hired by personal injury law firms, for example, to solicit patients directly with ads to their phone (while still in the ER). Though these ads can be cast while in a clinic or other medical locale, the system is sparked by arrival to the emergency room.

Jamie Wells, M.D., “Is Patient Privacy Already Passé?” at American Council on Science and Health
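
How does a geofence like that work? At bottom, it is just a check of whether a phone’s reported coordinates fall inside a defined perimeter. Here is a minimal sketch in Python, assuming an entirely hypothetical hospital location, radius, and stream of location pings; none of the values or names below reflect any vendor’s actual system:

```python
import math

# Hypothetical geofence: a circle of radius 200 m around an ER entrance.
ER_LAT, ER_LON = 40.7680, -73.9540   # illustrative coordinates only
RADIUS_METERS = 200

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(lat, lon):
    """True if a location ping falls inside the hospital perimeter."""
    return haversine_m(lat, lon, ER_LAT, ER_LON) <= RADIUS_METERS

# A phone's location pings (device_id, lat, lon) -- entirely made up.
pings = [
    ("device-123", 40.7702, -73.9581),  # a few blocks away
    ("device-123", 40.7683, -73.9543),  # inside the perimeter
]

for device_id, lat, lon in pings:
    if inside_geofence(lat, lon):
        print(f"{device_id} entered the geofence; ad targeting could begin")
```

Once a ping lands inside the circle, the device identifier can be handed off to an ad platform, which is what makes the in-ER law-firm ads described above possible.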

And it’s not just your phone. Other systems are talking about you to strangers too. A recent New York Times article revealed that many hospitals use public sources of data on their patients, such as records of property owned and charitable or political donations. If you own anything, you might be hearing from a hospital VIP.

According to the Times, “Some hospitals train doctors and nurses to identify patients who have expressed gratitude for their care, and then put the patients in touch with staff fund-raisers.” That entirely innocent compliment could cost you… Even though your gratitude isn’t digital, most of what happens afterward probably is. And it stays in the system.

Some jurisdictions are moving to make such practices illegal. But the main handicap reformers face is not lack of public support; it’s that most people don’t know these practices exist.

The tracking does not stop when you leave the hospital. In a digital environment, you slowly accumulate a “shadow health record” outside the medical system:

Every time you shuffle through a line at the pharmacy, every time you try to get comfortable in those awkward doctor’s office chairs, every time you scroll through the web while you’re put on hold with a question about your medical bill, take a second to think about the person ahead of you and behind you.

Chances are, at least one of you is being monitored by a third party like data analytics giant Optum, which is owned by UnitedHealth Group, Inc. Since 1993, it’s captured medical data — lab results, diagnoses, prescriptions, and more — from 150 million Americans. That’s almost half of the U.S. population.

Jeanette Beebe, “What you don’t know about your health data will make you sick” at Fast Company

But this endless tracking involves more than hospitals and pharmacies. Your shadow health record includes what you buy at health food stores and gyms, as well as data from internet-connected medical devices, sleep monitors, and fitness trackers (and legal pot shops and vape shops). Your physician probably does not have all this data on you, but a data broker building a list of potential customers for a client very well might. Once you are connected, you don’t own the results anymore.

A privacy lawyer told Beebe that even he cannot opt out of the brokers’ right to collect data from his own health insurance contracts. And it is legal for healthcare providers to sell their data to any number of companies involved in pharmaceuticals, insurance, and highly targeted advertising. Unfortunately, these days it is not evidence of paranoia to believe you are being watched and followed.

It is much easier than we might think, given enough “anonymous” data, to guess individual identities. (The technical term for this is “deanonymizing” data.) Here’s an illustration: You may feel pretty anonymous. But suppose you are the only person in your neighborhood who shops at the Econo-Pet, pays bills to Merrywell Pet Hospital, drives a Ford Focus hatchback, and is a Bronze Level donor to a retired service dog foundation. A motivated group with digital data from those sources, however it was obtained, probably knows or can easily find your name, address, and phone number too. And the broker can likely get health and lifestyle information about your contacts, not just about you and your dogs, by using AI to follow trails.
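
To make the idea concrete, here is a minimal sketch in Python with entirely invented records. Each “anonymous” dataset holds only quasi-identifiers (ZIP code, birth year, gender), yet intersecting them narrows the field to a single profile; the store, clinic, car, and charity names simply echo the hypothetical example above:

```python
# Illustrative only: invented "anonymous" records from four sources.
# None contains a name, yet combining them singles out one profile.

# Each record is a quasi-identifier tuple: (ZIP code, birth year, gender).
econo_pet_loyalty   = {("10023", 1985, "F"), ("10023", 1971, "M"), ("10024", 1985, "F")}
merrywell_billing   = {("10023", 1985, "F"), ("10023", 1990, "F")}
focus_registrations = {("10023", 1985, "F"), ("10024", 1962, "M")}
servicedog_donors   = {("10023", 1985, "F"), ("10023", 1971, "M")}

# Anyone present in every dataset is a candidate for "the same person."
candidates = econo_pet_loyalty & merrywell_billing & focus_registrations & servicedog_donors

if len(candidates) == 1:
    zip_code, birth_year, gender = next(iter(candidates))
    print(f"Unique profile: {gender}, born {birth_year}, ZIP {zip_code}")
    # One further join against a public record keyed on the same fields,
    # such as a voter roll or property register, yields a name, address,
    # and phone number.
```

With only four small datasets the intersection already contains exactly one profile; real brokers work with far richer feeds, which is why “anonymous” data so rarely stays anonymous.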

Big tech companies have an ambiguous relationship with online invasions of privacy. As tech philosopher George Gilder points out in Life after Google: The Fall of Big Data and the Rise of the Blockchain Economy, if you’re not paying the social media companies for all that service, it’s because you’re the product, not the customer. The companies may be able to make much more money selling information about you than you would pay them to use their medium.

Not too surprisingly, the Do Not Track consortium, which attempted to rein all this in, held its last face-to-face meeting in 2013:

In October 2018, on its public mailing list, the W3C group discussed how to describe Do Not Track’s failure in a preface to its final piece of work. After some back and forth, the group agreed on the language that appears:

“… there has not been sufficient deployment of these extensions (as defined) to justify further advancement, nor have there been indications of planned support among user agents, third parties, and the ecosystem at large.”

It’s an artful self-own by the group’s participants, which included representatives from ad industry trade groups, large advertisers, and ad delivery platforms, as well as ones from privacy groups, governments, and browser makers. After a flurry of work from 2011 to 2013, the group hadn’t met face to face since 2013, according to its notes.

Glenn Fleishman, “How the tragic death of Do Not Track ruined the web for everyone” at Fast Company

Some put their faith in government to regulate the problem away, as with, for example, Europe’s General Data Protection Regulation (GDPR). An adversarial, winner-take-all process driven by legislation will probably not work as well as stricter industry standards in a competitive environment. But if, as Gilder believes, the underlying model is unworkable, reforms will be patchwork, no matter who is sponsoring them.

Apple, sensing a competitive advantage, has become a “vocal supporter of privacy.” Its CEO, Tim Cook, outlined the problem in a recent magazine essay:

One of the biggest challenges in protecting privacy is that many of the violations are invisible. For example, you might have bought a product from an online retailer — something most of us have done. But what the retailer doesn’t tell you is that it then turned around and sold or transferred information about your purchase to a “data broker” — a company that exists purely to collect your information, package it and sell it to yet another buyer.

The trail disappears before you even know there is a trail. Right now, all of these secondary markets for your information exist in a shadow economy that’s largely unchecked—out of sight of consumers, regulators and lawmakers.

Tim Cook, “You Deserve Privacy Online. Here’s How You Could Actually Get It” at Time Magazine

But Apple, like all big tech companies, is hampered by the fact that its wealth depends in part on data sales. Cook proposes an approach that enables the customer to find out what data has been harvested:

But Cook doesn’t only argue for legislation restricting data brokers from accessing your information from the shadows — he wants the Federal Trade Commission to establish a data broker database that would require all data brokers to register and provide tools that would allow anyone to do a simple search to find out who has their data and give the individual “the power to delete their data on demand, freely, easily, and online, once and for all.”

Mark Sullivan, “Apple’s inconvenient truth: It’s part of the data surveillance economy” at Fast Company

Of course, if Cook got his wish, the data streams would be worth far less and social media companies would need other ways to turn a profit. Paying for service is bound to be unpopular, but it would at least restore a normal marketplace, one where there is a clear distinction between the customer and the product.

The current situation puts one in mind of an old proverb: Three women and a goose make a market. That is, a minimal market is one seller, two potential buyers, and one product. But current social media skews the market: The two potential buyers are now the products, sold to unknown third parties (via their digitized lives).

Things haven’t changed as much as we might think. If information about us has any financial value, someone might want to buy it. And we have always had to pay for privacy, whether we have curtains, locks, sealed envelopes, or confidential advisers. It’s doubtful that any economic system, however high-tech, can overrule such perennial facts of nature.

See also: Your phone knows everything now. And it is talking.

and

The $60 billion medical data market is coming under scrutiny. As a patient, you do not own the data and are not as anonymous as you think

