On the morning of March 26th 2018 I sat down at my PC with my usual cup of coffee and decided to check my notifications on Facebook. I opened two of them, each in a separate window, and in both windows I was presented with a page asking me if I was going through a difficult time and if I needed help. You can see a screenshot of the page in the above image.
This was the first time I had ever seen this on Facebook. From what I can gather, my behavior on Facebook during the previous days had triggered one or more of Facebook’s algorithms into concluding that I was probably ‘going through a difficult time’. Perhaps I was showing signs that other people on Facebook who were going through difficult times had previously shown. Or perhaps this was another one of those psychological experiments run by Facebook, where people are manipulated and their behavior monitored without their knowledge and consent.
It’s quite outrageous that Facebook would presume that they are allowed to monitor my behavior in this way and that I would welcome their help and support in my personal life. I don’t remember signing up for this kind of psychological monitoring, analysis and profiling when I made a Facebook account years ago. I also don’t remember giving Facebook permission to do this.
This just goes to show how far Facebook will take the (ab)use of its users and all of the information those users share on its platform. The backlash from the recent Cambridge Analytica scandal, in which Facebook knowingly gave third parties access to private data from millions of users worldwide, is well deserved. And it’s highly likely that they have the blessing and support of US intelligence agencies as well:
Millions of people’s personal information was stolen and used to target them in ways they wouldn’t have seen, and couldn’t have known about, by a mercenary outfit, Cambridge Analytica, who, Wylie says, “would work for anyone”.
Tamsin Shaw, a philosophy professor at New York University, […] calls Wylie’s disclosures “wild” and points out that “the whole Facebook project” has only been allowed to become as vast and powerful as it has because of the US national security establishment.
“It’s a form of very deep but soft power that’s been seen as an asset for the US. Russia has been so explicit about this, paying for the ads in roubles and so on. It’s making this point, isn’t it? That Silicon Valley is a US national security asset that they’ve turned on itself.”
Or, more simply: blowback.
Ironically, I was doing just fine that day, until I posted the above screenshot on Facebook and, in response to comments by some of my friends, replied with a meme of Hitler laughing. The next day, on March 27th 2018, Facebook removed that comment and again banned me for 30 days from posting on their platform. A picture of Hitler laughing goes against their precious ‘Community Standards’. But sharing private data from millions of users with third parties, without those users’ knowledge and consent, is apparently perfectly fine.
Facebook, and in particular its CEO Mark Fuckerberg, is such an unethical and immoral company that I’m surprised people still want to work there. Not too long ago, people at Facebook were already quitting their jobs over the censorship tool Facebook wanted to build for the criminal government of China, and after all the recent scandals Facebook’s reputation in Silicon Valley is badly damaged, and as far as I’m concerned, permanently so. When you damage people’s trust on the scale at which Facebook and Fuckerberg have been doing for many years now, there’s just no recovery possible anymore. One tech investor put it simply: ‘they’re fucked’:
To some cynical journalists or techno-skeptics, this maneuvering might seem like Facebook just being Facebook—that the Cambridge scandal is merely the latest in a litany of privacy intrusions; that Facebook’s de facto response is, as Dance noted, disingenuous. But this scandal really is different, and everyone in Silicon Valley knows it. Since the story broke a significant investor and entrepreneur, who has worked in tech for over two decades, recalled to me that the incident reminded him of what happened to Microsoft in the 1990s, when years of pugilistic corporate behavior caught up to the company in the form of significant antitrust regulation. One tech investor put it more succinctly: “This is a slow roll into serious fuckery. They’re fucked.”
Indeed, the repercussions are massive in both immediate and longitudinal ways. Just a couple of days into the Cambridge crisis, Facebook’s stock has dropped by more than 20 points, which has led its market capitalization to fall by tens of billions of dollars.
Scandals surrounding privacy issues are part of Facebook’s DNA. Back in his Harvard days, Zuckerberg told a friend that he could use The Facebook (as it was called back then) to find out anything on anyone. “I have over 4,000 emails, pictures, addresses, SNS,” Zuckerberg proudly wrote to his friend. “People just submitted it. I don’t know why. They ‘trust me.’ Dumb fucks.” Zuckerberg was still a teenager when he wrote that note, but that glaring blind spot has become a leitmotif of his tenure as an executive.
Fuckerberg’s ethical/moral standard is mirrored by other executives at Facebook. Here’s an example:
BuzzFeed on Thursday published a June 2016 memo by Andrew “Boz” Bosworth, who currently leads the company’s hardware division, in which Bosworth says he wants “to talk about the ugly” aspect of the company’s work.
“We connect people. Period,” Bosworth wrote. “That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.”
I discussed the kind of work they were looking at doing in China in a previous post. It’s incredible that anyone would think that’s a ‘justifiable’ thing to do to enable more growth. And notice how they KNOW that what they’re doing is ‘questionable’ and ‘ugly’.
So it comes as no surprise that engineers at Facebook are now scrambling to get transferred to a different department inside Facebook, or are quitting their jobs because of ethical concerns. I would have done exactly the same thing. And I’d probably also have been very ashamed to put Facebook on my resume as a place where I’d worked in the past and contributed to all of the abuse they’re responsible for. I would even go so far as to say that anyone who decides to go work at Facebook is either very naïve or has a very questionable sense of morality. I’d be very careful with such a person.
And these kinds of psychological tactics of manipulating, persuading and influencing people without their knowledge and consent are not unique to Facebook; most tech companies are guilty of them these days, as I discussed in more detail in a previous post. In that post I also mentioned some of the psychological tools they have in their arsenal and frequently (ab)use. That’s why it’s no surprise to me that most other tech companies remain silent as they watch what’s happening with Facebook and don’t comment on it much in the media; they know they’re guilty of doing the exact same crap. This also explains their need for secrecy and why they go to such great lengths to control and monitor their employees.
For example, on March 27th, 2018 — ‘coincidentally’ the same day I got banned on Facebook for posting the Hitler meme above — I received an email from YouTube telling me that they restricted access to one of the videos on my channel.
It’s an audio interview with Dennis Wise, the creator of the amazing documentary titled ‘Adolf Hitler: The Greatest Story NEVER Told’. Even just talking about a documentary about Hitler appears to be such a big issue that it needs to be censored (and if you want to learn why, click here).
Going forward, I’m planning to reduce my participation on Facebook and all the traditional social media websites. In fact, at the end of 2016, when I was building my new website, I had already planned to stop posting much on social media and to focus more on my blog, where I have full control over the content. The plan was to start the transition once my new blog design was finished in 2017. Due to other work I haven’t been able to focus on my new blog just yet, but I will be doing that in the future. From then on I’ll be posting almost daily on my blog instead of on social media.
I’m also looking at other social media websites and applications built on a peer-to-peer (P2P), decentralized architecture. Such an architecture lends itself better to respecting users and the data they share. There’s a lot of activity in that space (IPFS, Mastodon, DTube, PeerTube, etc.) and I’m confident that we’ll soon have a couple of high-quality P2P alternatives to choose from. As a recent article on Bloomberg mentions:
While YouTube has had to start taking a tougher line on censoring offensive videos that advertisers don’t want to be associated with, a growing swath of creators have fled to sites such as DTube to avoid the constraints. Like other upstart sites, DTube runs on the blockchain network Steem, and users can pay creators and commenters in digital tokens. […] Video creators with an interest in cryptocurrency say that’s also a factor driving them away from the big names. In the wake of Facebook’s data scandal, privacy is a third.
The less centralized platforms keep more power—and potentially, privacy—in the hands of creators and users, says Ned Scott, who runs the Steem-based social network Steemit.
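One reason these decentralized platforms can put more power in users’ hands is content addressing: instead of a central server deciding what a URL points to, data is addressed by a hash of its own bytes, so nobody can silently swap out or alter what an address refers to. Here’s a minimal sketch of that idea in Python; note this is a toy illustration, not how IPFS is actually implemented (real IPFS uses multihash-encoded CIDs and a distributed network of peers, while this `ContentStore` is just an in-memory dictionary):

```python
import hashlib


class ContentStore:
    """Toy content-addressed store: each block is keyed by the SHA-256
    digest of its own bytes, so an address can never silently point at
    altered data."""

    def __init__(self):
        self._blocks = {}

    def put(self, data: bytes) -> str:
        # The address is derived from the content itself. (IPFS uses
        # multihash-encoded CIDs; plain SHA-256 hex is used here.)
        address = hashlib.sha256(data).hexdigest()
        self._blocks[address] = data
        return address

    def get(self, address: str) -> bytes:
        data = self._blocks[address]
        # Any peer can verify the block without trusting whoever served it.
        if hashlib.sha256(data).hexdigest() != address:
            raise ValueError("block does not match its address")
        return data


store = ContentStore()
addr = store.put(b"hello, decentralized web")
assert store.get(addr) == b"hello, decentralized web"
```

Because the address commits to the content, any peer in the network can serve the block and the requester can still verify it locally; there’s no single platform that has to be trusted.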
So you might want to subscribe to my RSS feed, or to my mailing list via email (using the form in the right sidebar), because in the near future that’s the only way you’ll really be able to stay in touch and get updates from me.
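The nice thing about RSS is that it’s an open XML format any program can read, so your reader polls the blog directly with no platform in between deciding what you get to see. As a rough illustration, here’s a minimal feed parser using only Python’s standard library; the feed content below is a made-up placeholder (in practice you’d fetch the feed URL with `urllib.request.urlopen` and pass its text in):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document, inlined here for the example; a real
# reader would download this from the blog's feed URL instead.
RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>https://example.com/1</link></item>
    <item><title>Second post</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""


def list_entries(rss_text: str):
    """Return (title, link) pairs for every <item> in the feed."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]


for title, link in list_entries(RSS):
    print(title, "->", link)
```

A reader would simply re-fetch the feed periodically and show any `<item>` entries it hasn’t seen before; no account, login, or third-party algorithm is involved.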