19 April 2021

Indeed > THE PRIVACY PARADOX

Please take the time to read this. First, here is a forethought from the author, Mike Masnick:
"It would be nice, though, if we could have this kind of debate and conversation in a reasonable manner, rather than everyone jumping immediately to their own corners about who's evil and who's good. Every one of these decisions has tradeoffs, and it would be more productive if we could recognize that and debate the relative merits of all of those tradeoffs. But, having nuanced discussions about subjects with no easy answers does not seem to be in fashion these days."
-------------------------------------------------------------------------------------------------------------------------------
For many years I've tried to point out that no one seems to have a very good conceptual framework for "privacy." Many people act as if privacy is a concrete thing -- as if we simply want our information kept private. But as I've pointed out for years, that doesn't make much sense. Privacy is a set of tradeoffs. It's information about ourselves that we often offer up freely, if we feel the tradeoff is worth it. And, related to that, there's a big question about who is controlling the data in question -- things get confusing when we consider just who controls what. If we're controlling our own data, then we have some degree of autonomy over our privacy tradeoffs. But when we hand that data off to a third party, then they have much more say over our privacy -- and even if they agree to "lock down and protect" that data, the end result might not be what we want. For one, we're giving those companies more power over our data than we ourselves have. And that can be a problem!

Because of this, privacy questions are often highly contextual -- and often conflict with other issues. For example, after the Cambridge Analytica scandal, Facebook was yelled at over and over again about its poor data privacy efforts -- leading the company to say "okay, fine, we'll lock down your data and just keep it for ourselves." Which is a totally reasonable response to the complaints that "Oh, Facebook leaked our data." But, of course, the end result of that is... worse. Now we've handed Facebook even more control over our data, and left competitors with significantly less ability to come along. That's not good!

There's a similar issue with advertising and privacy, which we discussed just last month. Google clarified its plans to block third-party cookies. In many ways, this is good for privacy. Third-party cookies are often abused in creepy ways to track people, so it's good that Google won't support them (Firefox and Safari made this move earlier). But lots of people then vocally complained that this would only give more power to Google, because it can deal with the lack of data, while competing (smaller) advertising firms cannot.
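
To make the mechanism concrete, here is a minimal illustrative sketch (not from the article), using Python's standard library and a hypothetical tracker domain: any page that embeds a tiny pixel from the tracker lets it set and read back the same cookie, tying a user's visits together across otherwise unrelated sites. This is roughly the behavior that dropping third-party cookie support is meant to cut off.

# Minimal sketch of a hypothetical third-party tracker ("tracker.example").
# Any site embedding a pixel from this server lets it recognize the same
# browser across all of those sites via one cookie.
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.cookies import SimpleCookie
import uuid

class TrackerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        # Reuse the ID the browser sent back, or mint a new one on first sight.
        uid = cookies["uid"].value if "uid" in cookies else uuid.uuid4().hex
        self.send_response(200)
        # "SameSite=None; Secure" is what lets this cookie ride along on
        # cross-site (third-party) requests -- the behavior being phased out.
        self.send_header("Set-Cookie", f"uid={uid}; SameSite=None; Secure; Path=/")
        self.send_header("Content-Type", "image/gif")
        self.end_headers()
        # The embedding page shows up in the Referer header, so the tracker
        # can log which site this browser is visiting right now.
        print(f"user {uid} seen on {self.headers.get('Referer', 'unknown site')}")

if __name__ == "__main__":
    HTTPServer(("", 8080), TrackerHandler).serve_forever()

(Blocking third-party cookies means the browser simply refuses to store or send the "uid" cookie on those embedded requests, so the tracker sees a new, anonymous visitor on every site.)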

These issues are often in conflict -- and many of the big tech critics out there don't want to recognize that. In fact, refusing to recognize it lets them attack these companies no matter what they do. If a company does something that's good for privacy but bad for competition, focus on how it's bad for competition. If it does something that's good for competition but bad for privacy, focus on how it's bad for privacy.

A recent article in Wired by Gilad Edelman highlights this tension in the antitrust context, noting that in the big antitrust fights against Facebook and against Google, the two companies are being attacked in very different ways: one for being more protective of private data in a way that gives the company more power, and one for violating users' privacy.

Here's something to puzzle over. In December, the Federal Trade Commission and a coalition of states filed antitrust lawsuits against Facebook, alleging that as the company grew more dominant and faced less competition, it reneged on its promises to protect user privacy. In March, a different coalition of states, led by Texas, accused Google of exclusionary conduct related to its plan to get rid of third-party cookies in Chrome. In other words, one tech giant is being sued for weakening privacy protections while another is being sued for strengthening them. How can this be?

Edelman tries to solve this seeming paradox by suggesting that there might be a way to sort out the actual intent of these actions:

Maybe, then, the right way to think about what should happen when the privacy and competition dials diverge is to ask whether a company is cutting off access to personal data that it intends to keep using itself. That could help distinguish between a case like the Privacy Sandbox, on the one hand, and Apple’s App Tracking Transparency framework, on the other. Apple’s new policy will force all iPhone app developers to ask for permission before tracking users. That is expected to hurt companies that make money by tracking users across the web, most notably Facebook, which has reportedly considered filing an antitrust lawsuit to block the change. But since Apple doesn’t make its money by selling personalized ads based on surveilling user behavior, it’s harder to argue that it is hoarding access to user data for its own purposes. That makes the tension between privacy and competition easier to resolve.

Not that Apple will always come out ahead in this analysis. Contrast the Facebook spat with the ongoing feud between Apple and Tile, which sells tracking technology to help users find lost stuff and thus competes with Apple’s own “Find My” software. According to Tile, Apple has discriminated against the company by prohibiting certain practices, like background location tracking, that it requires to function. Apple says the rules are meant to protect user privacy. If it were to sue, Tile might have a stronger case than Facebook because its product competes more directly with Apple.

That's an interesting idea -- but I'm not sure it would be so easy in practice. There are so many competing interests at play, and so many actions may seem good for one particular concern, but less good for others.

Obviously, I've long been an advocate of simply removing much of the data from these large companies' control entirely -- via a system of protocols that moves control out to the end users. But in most versions of that system, most users are going to eventually entrust that data to some third-party company, and in some ways that puts us back where we started. My hope is that such a world would end up with more neutral third-party "data banks" -- and you could even pair that with an information fiduciary model, in which these companies are legally required to act in your best interests.

But, even that model runs into some trouble, and we end up talking about questionable ideas like a DMCA for privacy, which seems like it would be a true horror online.

It would be nice, though, if we could have this kind of debate and conversation in a reasonable manner, rather than everyone jumping immediately to their own corners about who's evil and who's good. Every one of these decisions has tradeoffs, and it would be more productive if we could recognize that and debate the relative merits of all of those tradeoffs. But, having nuanced discussions about subjects with no easy answers does not seem to be in fashion these days.

Filed Under: antitrust, big tech, competition, narrative, privacy
Companies: facebook, google
