Is It Time For Facebook To Finally Come Clean About Our Data?


Another week, another story about Facebook sharing its users’ data without their knowledge or consent. This past June, Facebook argued that granting more than 60 companies access to your personal private data didn’t count as “sharing” because those companies were “partners” and “providers.” In reaction to yesterday’s exposé by the New York Times documenting even more massive and egregious data sharing with a far wider assortment of companies, Facebook once again argues that they were merely all “partners” and that there is nothing to see here. More to the point, the company reminds us that it is under no obligation to actually clarify any of its data sharing practices and that it can pick and choose what information to share with the public about how our data is being used. Is it time for Facebook to finally just come clean about all of the ways it is using, sharing and selling access to our data?

As The New York Times puts it in the opening sentence of its latest exposé, “for years, Facebook gave some of the world’s largest technology companies more intrusive access to users’ personal data than it has disclosed, effectively exempting those business partners from its usual privacy rules.”

The Times documents an astonishing and bewildering array of partnerships and agreements that allowed companies across the world, including Russian and Chinese companies, privileged access to the personal, private and intimate data of Facebook’s user community.

The company once again argues that all of these companies were “partners” and “service providers,” and that it was therefore under no obligation to inform the FTC of the extent to which it was still sharing its users’ data. As Facebook clarified to the Times, it did not need to secure informed consent for external data sharing because even the Royal Bank of Canada and Microsoft were merely considered extensions of itself.

Facebook’s brief response to the Times’ latest exposé did not actually deny any of the allegations, but rather took on an almost indignant tone: users could use Facebook from other devices and services, so they obviously knew their data was being shared, and besides, you “sign[ed] in with your Facebook account,” so you gave full informed consent and knowing permission for the company to share your data in any way it wished.

The problem with this response is that a user who logs into an outside service with their Facebook login likely does not realize all of the permissions and rights to their data they are granting that website. Similarly, just because users see that they can post to Facebook from a streaming music app or get personalized results from a search engine does not mean they realize that the music app now has full, unrestricted access to all of their private messages or that Microsoft is building profiles of Facebook users on its own servers.

Just because users see a “Post to Facebook” button or glimpse Facebook posts on another site does not at all mean that they understand the privacy cost underlying those features.

Facebook’s two billion users all recognize that they are using a free website and that, because it is free, it must be generating revenue somehow. Few of those users, however, are fully aware of the extent to which Facebook is monetizing them and exploiting their personal information.

Facebook notes that in every one of these sharing agreements, the user clicked a button at some point in their history that legally authorized it.

In other words, in Facebook’s telling, you have no right to complain, you clicked a button at some point in your life that you didn’t understand and had no idea what it did, but you clicked it and in doing so granted it the legal right to do what it wanted with your data.

It is noteworthy that nowhere in Facebook’s response is there any form of apology. No acknowledgement that perhaps it could have done better in making its users aware of just how widely their data was accessible or all of the privacy tradeoffs that underlay all of those Facebook options and content they saw across the web and mobile worlds.

Imagine for a moment if Facebook followed Estonia’s model of absolute consent and for every single application or research project that wanted your data, it gave you a complete description of how it wanted to use your data and every single thing that would be done using your data and you had the right to grant or reject access. No “Facebook requires access to your microphone” popups when you’re installing an application and can’t get to the next screen without clicking yes. An actual human readable exhaustive description of each and every single proposed use of your data and most importantly, the right to say no.

What would happen if Facebook presented all two billion users an annual privacy report of every single access of their data that was not by a live human being? To close Facebook’s infamous “partner” loophole, this would include all accesses by applications creating “Facebook experiences.” If a partner downloads your data to their server as part of their Facebook integration, that would be included here too.

Every year you would see a massive list of every non-human that saw your data in any way, shape or form.

Why doesn’t Facebook do this? Likely because the results would be so frightening to most users that Facebook would suffer an overwhelming privacy backlash and threats of further regulation. Of course, as Facebook itself has clarified again and again, it actually has no idea who is accessing your data, and even today in Fall 2018 companies still routinely misuse Facebook data without Facebook having any idea.

What if Facebook manually required you to formally review and authorize every single non-human access to your data, whether by an outside company or by Facebook itself for its own research?

In fact, Facebook is rushing as fast as it can in the opposite direction. It is opening up all of your most personal, private and intimate information to academics across the globe to mine and to manipulate you to their hearts’ content. The company’s stance is that since the data will be anonymized, you have no right to opt out. Yet it simultaneously refuses to provide any real detail on the anonymization process or respond to concerns about key vulnerabilities in the anonymization approach it has selected for the project. Academia’s ethics review boards and its top journals have all enthusiastically embraced this approach to mass mining of your intimate data without your knowledge, informed consent or right to opt out. After all, why should you have any rights at all in the digital era?

Facebook won’t even comment on whether you can even delete data from its platform anymore since academics would love to mine all your deleted posts.

In an era of heightened scrutiny of cybersecurity, it is particularly noteworthy that Facebook did not turn off the privileged access it granted partners, even years after they stopped using the data feeds and APIs. The company acknowledges that it “shouldn’t have left the APIs in place” but a spokesperson declined to comment on why the company does not believe this was a major security risk.

The Times reports that Facebook’s privacy team did not review all data sharing agreements equally and that the level of review “depended on the specific partnership and the time it was created.”

This raises questions about Facebook’s data ethics review team, which has responsibility for overseeing all of Facebook’s own research using your personal data and research by outside academics granted access to your “anonymized” private data. Despite touting that its ethics board must review all research, the company has steadfastly refused to provide any detail on the board or its review criteria. When it emerged last year that two employees had conducted extraordinarily sensitive emotional research on young children without submitting their work for review by the board, the company declined to comment on how much of Facebook’s research the board actually evaluates.

The fact that its privacy team did not review all of Facebook’s partnerships with equal scrutiny raises similar questions about its research ethics board. Unsurprisingly, the company declined to comment.

A spokesperson declined to comment on any of the questions posed to the company, offering only a generic statement: “We know we've got work to do to regain people's trust. Protecting people's information requires stronger teams, better technology, and clearer policies, and that's where we've been focused for most of 2018. Partnerships are one area of focus and, as we've said, we're winding down the integration partnerships that were built to help people access Facebook,” along with a link to its public post.

The company declined to comment on most of the details of the Times exposé, especially those surrounding consent and the extent of those data sharing agreements.

When asked how the company would respond to criticism that its two billion users deserve to have answers to all of the concerns raised in the Times piece, since ultimately all of those questions revolve around how the company is handling their personal data, the company did not respond. It also did not respond when asked why it did not already have responses prepared for all of the issues identified by the Times given that all of those agreements have been in place for years now.

It also did not respond when asked why it had not been more forthcoming about the full and complete extent of all of these data sharing agreements and clearly educated its users about its practices and the privacy tradeoffs behind all of the Facebook integrations they saw across the web and mobile worlds.

Of course, why should it? It has no legal obligation to tell us absolutely anything about what it does with our data.

The fact that Facebook can pick and choose what to respond to, and simply decide not to tell its users anything more about data sharing agreements that leak to the press, underscores just how few rights we have in the digital era. The companies into which we pour our data have no legal obligation to tell us what they do with it, and we have no right to demand an accounting of what has happened to the data we entrusted to them.

Putting this all together: instead of playing perpetual defense, responding to each week’s latest privacy or security revelation by hiding behind contorted legalese or indignantly arguing that users have nothing to complain about because they clicked a button at some point in their lives that authorized all of this, perhaps Facebook should just come clean and tell us what it has done and is doing with all of our data. Perhaps Facebook could take a cue from the public concern over each of these stories and recognize that it needs to do a better job of educating its users about how it is using their data. In the end, perhaps the best option of all is for Facebook to simply tell us everything. Of course, it can’t, because as we’ve seen in bits and pieces over this year, the answer makes 1984 seem like child’s play.