I’m deleting Facebook. It’s a matter of conscience

Introduction

If you’re leaving a room full of people you’ve been talking with, I think it’s polite to say goodbye. So it was out of courtesy that I let my Facebook friends know that I am about to delete my account. Of course, whether they see that post or not depends on the degree to which Facebook’s algorithm has decided that my post is important to them.

You may have seen that large companies like Tesla and SpaceX, and small companies like Flexibits, developer of my favourite iOS calendar app, Fantastical, have also deleted their Facebook pages. Maybe some of your friends have left the service too, and it’s left you a bit perplexed.

Quitting Facebook is no small matter for me. Through it, I’ve connected with old school friends and made other acquaintances. Mosen Consulting and Mushroom FM, both of which I manage, have pages there. That’s why I resisted the strong urge to delete my account immediately after I started reading about the Cambridge Analytica data breach in mid-March. But my feeling that I need to run a mile from this service and its toxic, cavalier culture has only strengthened as the torrent of leaks has continued.

Some of the comments I received on the post notifying people of my intention to delete Facebook have shown me that a lot of people simply don’t understand, or haven’t really heard about, the Cambridge Analytica story. That’s why I’m writing this post. If you understand what has happened and decide that you don’t care, or that you’re willing to give Facebook a chance to fix things, then fair enough. But I hope you will at least make that decision armed with the facts. Remember facts? We used to value them once.

I also want to demonstrate that the Cambridge Analytica abuse is not an isolated event, but symptomatic of a rotten culture.

What’s this Cambridge Analytica thing about?

Cambridge Analytica is a company founded in December 2013. They’ve become famous in US media for their role in the Trump campaign in 2016. They’re based in the UK, and they have offices elsewhere, including the US. I’ve been reading about their actions with growing concern for some time. If you’re interested in understanding this company, The Guardian has some in-depth articles. Search for Cambridge Analytica on their site.

My brief summary is that Cambridge Analytica mines data, then sells that data to political parties or political operators to influence elections or referenda, such as the UK’s Brexit referendum.

You may ask what the big deal is. The Obama campaign was legendary for its ground-breaking social media strategy, so surely, everyone’s at it. What’s good for the goose is good for the gander, right? Sorry, no. I can summarise the difference between what the Obama campaign did and what has happened with Cambridge Analytica in a single word: consent.

The Obama campaign, and for that matter many other legitimate political campaigns on both the left and right of the spectrum, either gave you the opportunity to opt in, or targeted ads using standard methodology sanctioned by Facebook, which you agree to when you use the service. What Cambridge Analytica has been doing is devious at best.

Back in 2014, if you lived in the United States, you may have seen a quiz app on Facebook called “This is your Digital Life”. You needed to be a US voter to complete the quiz, which promised to provide you with insight into your personality type. That app collected your personal Facebook data and, of course, your answers to the questions it asked, which helped further flesh out a profile of you.

You may well say that people need to be more careful about the apps they use on Facebook. You could argue that this is no different from installing some rogue software on your PC that you downloaded from a dodgy source. And that’s a fair point; those of us with the knowledge must be savvy about these things. Except that respondents were told the data would be used for scientific research.

The Cambridge Analytica scandal doesn’t end with the 270,000 people who were enticed into revealing more about themselves than was wise. The quiz also mined the personal data of all the friends of the people who completed it. This is the critical point: the data of 87 million people was mined, and those people knew nothing about it. It essentially means that Facebook’s privacy settings have been meaningless.

Several countries with robust privacy legislation have already begun inquiries into this matter, or in some cases already stated that Facebook has broken privacy laws.

Having obtained this massive amount of data in what might be described as one of the largest data breaches in history, what did Cambridge Analytica do with it? They used it to design highly targeted political ads, based on what they were able to determine about your preferences and, in some cases, your psychological profile. They knew, based on all the data they collected, how you were likely to vote. If you were unlikely to vote for the candidate who had contracted them, they would craft highly targeted ads designed to change your mind.

A matter of consent

Let’s not kid ourselves. These days, we get shiny things in exchange for being the product. Google knows about the things we’re searching for and will show us ads based on that. Amazon isn’t providing the Echo at a rock-bottom price out of the goodness of its heart. Its primary purpose is to collect additional data about you and to make it easier for you to order stuff.

All of this is a reality of the era we’re living in. The difference is that we know what’s going on with these companies, and we can make an informed choice about whether the data-sharing price we’re being asked to pay is worth the reward we get in return. If the price is too high in terms of compromised privacy, then we can search and shop elsewhere.

What’s troubling about the Facebook issue is that this is just one example of an egregious breach of trust on their part. We didn’t have the chance to opt out of the Cambridge Analytica data mining, because we didn’t have the opportunity to consent to our friends passing on our data to them.

Facebook’s initial response, when journalists contacted them about the stories that were going to run on the Cambridge Analytica matter, was to threaten to sue the journalists if they published. The journalists, sure of their facts, weren’t intimidated and published anyway. Suddenly, Facebook was in grovelling apology mode, with full-page ads in several prominent newspapers. It seems to me it’s not an unkind conclusion to reach that they were sorrier for being caught than for what they did, especially since journalists have uncovered that Facebook knew about this matter for at least two years. Back then, Facebook secured a promise that the data had been deleted, a promise that wasn’t kept. Most important of all, once Facebook became aware of the abuse, they didn’t inform the victims of the privacy breach that their data had been accessed without consent. That notification is only now about to happen.

The floodgates have opened

Many readers will know that every weekday on Mushroom FM, I host a technology magazine show, The Daily Fibre, which summarises the major technology news stories of the day. Since the Cambridge Analytica story broke, we’ve had major stories on Facebook almost every day. Just as I think things can’t shock me any more than they already have, something else happens. Here’s a summary of some of the stories we’ve covered over the last few weeks.

Metadata from Android users

iOS users may get frustrated at times by the way Apple’s sandboxing can make something that should be simple much more complex, but the sandbox approach has privacy benefits. It turns out that if you use Facebook for Android and installed the app a while back, Facebook may have been collecting data on all the calls you’ve made and the texts you’ve sent. To be clear, this doesn’t include the actual content of the texts, but it’s the metadata that’s critical here: who you’ve been calling and texting, when you did so, and in the case of calls, their duration.

It’s standard practice for a social network to want to access your contacts so you can find others who also use the service, and that’s fine if you choose to opt in. But in older versions of Android, if you granted Facebook permission to access your contacts, you also allowed Facebook to gain this additional data. In my view, for Facebook to be collecting this data is creepy and inappropriate. If you go into Facebook Settings on your computer, you can download your data as a zip file, and it contains a series of HTML documents. If you’re affected by this issue, that data will show you that Facebook has kept all this information on file.
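
If you’re curious what’s actually in that download, a few lines of Python are enough to list the HTML documents in the archive and flag anything whose name suggests call, text or contact history. This is only a rough sketch rather than anything Facebook provides: the archive name and the keywords are my own assumptions, and the layout of the export varies from account to account and over time.

import zipfile

# A minimal sketch for poking around a Facebook data export; this is not Facebook-provided tooling.
# The archive name below is a placeholder, and the file names inside the export vary by account.
ARCHIVE = "facebook-data.zip"

with zipfile.ZipFile(ARCHIVE) as archive:
    html_files = [name for name in archive.namelist() if name.endswith(".html")]
    print(f"The export contains {len(html_files)} HTML documents.")

    # Flag any documents whose names hint at call, text or contact history.
    keywords = ("call", "sms", "message", "contact")
    for name in html_files:
        if any(keyword in name.lower() for keyword in keywords):
            print("Worth a look:", name)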

Now you see Zuck’s messages, now you don’t

Facebook’s CEO Mark Zuckerberg has sent some pretty stupid messages over the years, and now it turns out Facebook has been protecting him in a manner not available to the rest of us. People who’ve corresponded with him on Facebook Messenger report seeing messages from Mr Zuckerberg disappearing from the conversation thread, even though the other end of the conversation remains intact. This was not a feature available to the public.

Facebook is responding to this controversy by adding an unsend option for everyone, but again, that didn’t happen until the behaviour was exposed by the media.

So what if Facebook leads to terrorism and death?

Andrew Bosworth is a Vice-President at Facebook. Recently, someone leaked an internal memo he wrote, in which he advanced the view that connecting people could lead to deaths, but so be it.

Here’s the full text of the 2016 memo, as leaked to BuzzFeed.

“The Ugly

We talk about the good and the bad of our work often. I want to talk about the ugly.

We connect people.

That can be good if they make it positive. Maybe someone finds love. Maybe it even saves the life of someone on the brink of suicide.

So we connect more people

That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.

And still we connect people.

The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned.

That isn’t something we are doing for ourselves. Or for our stock price (ha!). It is literally just what we do. We connect people. Period.

That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.

The natural state of the world is not connected. It is not unified. It is fragmented by borders, languages, and increasingly by different products. The best products don’t win. The ones everyone use win.

I know a lot of people don’t want to hear this. Most of us have the luxury of working in the warm glow of building products consumers love. But make no mistake, growth tactics are how we got here. If you joined the company because it is doing great work, that’s why we get to do that great work. We do have great products but we still wouldn’t be half our size without pushing the envelope on growth. Nothing makes Facebook as valuable as having your friends on it, and no product decisions have gotten as many friends on as the ones made in growth. Not photo tagging. Not news feed. Not messenger. Nothing.

In almost all of our work, we have to answer hard questions about what we believe. We have to justify the metrics and make sure they aren’t losing out on a bigger picture. But connecting people. That’s our imperative. Because that’s what we do. We connect people.”

So ends the remarkably obnoxious memo. In a formula now as predictable as it is unbelievable, Facebook CEO Mark Zuckerberg said he disagreed with the memo. So too, believe it or not, did the person who wrote it.

I feel utterly sick having to write this

You may have seen that from time to time, Facebook surveys some of its users. Around a month ago, some Facebook users received a survey asking whether it was OK for a man to ask a child for sexual pictures.

Specifically, there were two questions relating to this topic. The first read:

“There are a wide range of topics and behaviors that appear on Facebook. In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”

Users could then choose from the following options:

  • This content should be allowed on Facebook, and I would not mind seeing it
  • This content should be allowed on Facebook, but I don’t want to see it
  • This content should not be allowed on Facebook, and no one should be able to see it
  • I have no preference on this topic

The second question on this topic read:

“When thinking about the rules to deciding whether a private message in which an adult man asks a 14 year old girl for sexual pictures should or should not be allowed on Facebook, ideally who do you think should be deciding the rules?”

Respondents to the survey could make the following choices.

  • Facebook decides the rules on its own
  • Facebook decides the rules with advice from external experts
  • External experts decide the rules and tell Facebook
  • Facebook users decide the rules by voting and telling Facebook
  • I have no preference

I hope I don’t have to spell out how utterly repugnant it is that Facebook felt the need to ask its users for their opinion on such a thing. And police involvement wasn’t one of the options.

Facebook’s response? Yet another apology. See the pattern here? They behave abominably, get caught, and do the grovelling apology thing.

Facebook’s Orwellian VPN

One reason for using a VPN (virtual private network) is to secure your data against inappropriate snooping.

In 2013, Facebook bought Onavo, a company producing a VPN client called Protect. Recently, US users of Facebook for iOS have been seeing a “Protect” option appear in Settings.

If it’s not coming up for you, or you haven’t investigated it, here’s Facebook’s description of what the Protect option does.

“Onavo Protect helps keep you and your data safe when you browse and share information on the web. This powerful app helps keep you safe by understanding when you visit potentially malicious or harmful websites and giving you a warning. It also helps keep your details secure when you login to websites or enter personal information such as bank accounts and credit card numbers.”

That’s an accurate description of what any reputable VPN client does. But as all the good infomercials say, “wait, there’s more”. And as any good lawyer says, “read the fine print”, because there’s one little sentence buried in the description that makes all the difference.

“Because we’re part of Facebook, we also use this info to improve Facebook products and services, gain insights into the products and services people value, and build better experiences.”

Oh boy! So how do they do that? It’s easy. Once you’re connected to the VPN, and it’s hard not to be once the app is installed, Facebook collects data about every site you visit and what you do there. Unlike privacy-focussed VPN services that pride themselves on not keeping data logs, Onavo says it will keep logs for as long as you have an account. As far back as August 2017, The Wall Street Journal reported that Facebook used data from Onavo to track the popularity of competing start-ups and other user preferences, and to inform acquisition decisions. That’s right, if you’re using this thing, you’re helping Facebook spy on how its competitors are being used.

Enough is enough

For me personally, a line has been crossed by Facebook. As I stated in the introduction, we all know that there’s a price to pay for free services. We sell our souls to some extent in exchange for some of the services we get for free. But this is too much.

Facebook has a stench about it that has become so acrid that I can no longer use the platform in good conscience.

They do something completely outrageous, do all they can to stop it from coming out, and then, no doubt advised by some of the best PR people in the business, are expert at contrition when they’ve been busted. How many last chances do they get?

I’m only one person out of billions who use Facebook. Leaving the service may well change nothing. But I believe we have an obligation to be informed, conscious consumers. I care about eating free-range eggs, about eating meat that is farmed humanely, and in this case, about companies that treat my data with transparency and respect.

The only thing these companies care about in the end is how many people use the service, and for how long. The memo I quoted above makes that abundantly clear. So, I’m being the change I wish to see in the world, and I know I’m not the only one. A recent survey showed that over 30% of tech workers are considering deleting Facebook.

Will I ever use Facebook again? I’d never say never. We all deserve a chance at redemption. However, I think it’s highly unlikely that I will use the service while Mark Zuckerberg is CEO. The buck stops with him. Just as Uber is trying to change its culture and atone for its serious mistakes with new leadership, so Facebook needs to show Mr Zuckerberg the door.

So, shortly, I’ll be deleting my Facebook account and the Mosen Consulting page. Perhaps Mosen Consulting will suffer through the lack of a Facebook presence, but I’d rather live with that than the dirty feeling I have being on the platform. There are plenty of other ways to stay in touch.

Some of the wonderful team at Mushroom FM still want a Facebook presence, and I aim to facilitate that by handing the administration of that page over to someone else.

A note of thanks

I’d like to end on a positive note and express my appreciation for everyone who has worked on the accessibility side of Facebook. Some of the work done there has been innovative, and the factors causing me to leave have been the responsibility of people way up the food chain. I imagine many Facebook employees must be deeply troubled by what has been revealed of late. Thanks so much to the Facebook accessibility team; you’ve made a difference by making such a dominant platform more inclusive.

1 Comment on “I’m deleting Facebook. It’s a matter of conscience”

  1. Hi,

    I too think what Facebook has done, and continues to do, is disgraceful.

    I actually dislike Facebook for many other reasons ranging from accessibility and general usability issues through to prevalent bullying, trolling and harassment.

    Rather than just complain I am trying to do something about it and have created the online social community – My Disability Matters:

    https://mydisabilitymatters.club

    It is a complete alternative to Facebook, based on safety, tolerance and respect. I am blind myself, for those who don’t know, but we are attracting people with all sorts of disabilities, their friends, families and carers, as well as allies and supporters.

    We don’t have all the bells and whistles of Facebook yet, but features are being added all the time, so if you are leaving Facebook I urge you to check us out; you may meet some new friends from all over the world.

    Thanks,
    Dale.