Spyware and surveillance on Apple and Xiaomi devices
In the past I’ve warned multiple times not to use hardware and software from Chinese manufacturers such as Huawei, and more examples keep piling up showing why that’s a bad idea. Here’s from Reuters, “Lithuania says throw away Chinese phones due to censorship concerns” (September 21st 2021):
VILNIUS, Sept 21 (Reuters) – Lithuania’s Defense Ministry recommended that consumers avoid buying Chinese mobile phones and advised people to throw away the ones they have now after a government report found the devices had built-in censorship capabilities.
Flagship phones sold in Europe by China’s smartphone giant Xiaomi Corp (1810.HK) have a built-in ability to detect and censor terms such as “Free Tibet”, “Long live Taiwan independence” or “democracy movement”, Lithuania’s state-run cybersecurity body said on Tuesday.
The capability in Xiaomi’s Mi 10T 5G phone software had been turned off for the “European Union region”, but can be turned on remotely at any time, the Defence Ministry’s National Cyber Security Centre said in the report.
“Our recommendation is to not buy new Chinese phones, and to get rid of those already purchased as fast as reasonably possible,” Defence Deputy Minister Margiris Abukevicius told reporters in introducing the report.
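The censorship capability the report describes is, at its core, a remotely configurable keyword blacklist: a list of phrases shipped to the device, with a server-side switch that decides per region whether filtering is active. Here's a minimal sketch of that mechanism; the config format and function names are my own illustration, not Xiaomi's actual code:

```python
# Hypothetical sketch of a remotely configurable keyword blacklist.
# The phrases below are the examples cited in the Lithuanian report.

BLOCKED_PHRASES = {"free tibet", "long live taiwan independence", "democracy movement"}

# A remotely delivered policy can flip filtering on at any time, which is
# why "currently disabled for the EU region" offers no lasting guarantee.
remote_config = {"filter_enabled": False, "region": "EU"}

def apply_remote_update(update: dict) -> None:
    """Simulates the phone silently receiving a new policy from a server."""
    remote_config.update(update)

def censor(text: str):
    """Returns the text unchanged, or None if it is suppressed."""
    if not remote_config["filter_enabled"]:
        return text
    lowered = text.lower()
    if any(phrase in lowered for phrase in BLOCKED_PHRASES):
        return None  # content silently dropped
    return text

print(censor("Free Tibet"))                    # passes while filtering is off
apply_remote_update({"filter_enabled": True})  # flipped server-side, no user action
print(censor("Free Tibet"))                    # now suppressed: None
```

Note that nothing on the device changes when the switch is flipped; the capability was shipped from day one, dormant.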
Keep in mind that companies in China are obligated by law to cooperate with the Chinese criminal government when it comes to spying on people. That could mean installing backdoors or surveillance capabilities into their hardware and/or software products. And it's also known that they are exporting their “Surveillance State” technology to the rest of the world.

But even other companies such as Apple, who supposedly value people’s privacy, are now building backdoors into their hardware and software. Here’s from The Wall Street Journal, “Apple Plans to Have iPhones Detect Child Pornography, Fueling Privacy Debate” (August 5th 2021):
Apple Inc. plans to introduce new iPhone software designed to identify and report collections of sexually exploitative images of children, aiming to bridge the yearslong divide between the company’s pledge to protect customer privacy and law enforcement’s desire to learn of illegal activity happening on the device.
The software, slated for release in an update for U.S. users later this year, is part of a series of changes Apple is preparing for the iPhone to protect children from sexual predators, the company said Thursday.
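As publicly described by Apple, the detection works by fingerprinting images on the device and comparing those fingerprints against a database of known-image hashes, flagging an account only after a threshold number of matches. The sketch below is heavily simplified: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, whereas a plain cryptographic hash here is just a stand-in:

```python
# Simplified illustration of client-side hash matching. hashlib.sha256 is
# a stand-in for Apple's perceptual NeuralHash; the database contents and
# threshold value here are hypothetical.
import hashlib

KNOWN_BAD_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}
MATCH_THRESHOLD = 3  # Apple said accounts are flagged only past a threshold

def scan_library(images):
    """Returns True if the device would report this account."""
    matches = sum(
        1 for img in images
        if hashlib.sha256(img).hexdigest() in KNOWN_BAD_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

The architectural point worth noticing is that nothing in this design constrains what goes into the hash database; swapping in a different list of hashes repurposes the exact same scanner for any content at all.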
The reason given by Apple for wanting to include this spyware on their devices (the iPhone is just the beginning of course) is basically to “protect the children.” That’s one of the standard reasons given (along with preventing “money laundering” and ‘terrorism’) whenever corporations and criminal governments want to convince the general public to give up their freedom and accept nefarious policies. It’s always about ‘protecting’ you and especially the children.
And I’m sure you can imagine that once the above software has been installed on millions of Apple devices worldwide, it will be very easy to add capabilities to it for detecting other types of content. This would enable Apple and criminal governments worldwide to spy on people in real-time, much like already happens in China. The Electronic Frontier Foundation (EFF) had the following to say about this in “Apple’s Plan to ‘Think Different’ About Encryption Opens a Backdoor to Your Private Life” (August 5th 2021):
Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.
Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
…
We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging.
After the worldwide backlash that Apple received, they have “delayed rollout” of their backdoor, but it’ll probably be implemented in the very near future; they’ll just wait for the next opportune moment.
And pay very special attention: this backdoor developed by Apple also sounds a lot like the capability that Facebook wants to build into WhatsApp to spy on users. I blogged about that a while ago, and that feature in WhatsApp involved the following:
In Facebook’s vision, the actual end-to-end encryption client itself such as WhatsApp will include embedded content moderation and blacklist filtering algorithms. These algorithms will be continually updated from a central cloud service, but will run locally on the user’s device, scanning each cleartext message before it is sent and each encrypted message after it is decrypted.
The company even noted that when it detects violations it will need to quietly stream a copy of the formerly encrypted content back to its central servers to analyze further, even if the user objects, acting as true wiretapping service.
Facebook’s model entirely bypasses the encryption debate by globalizing the current practice of compromising devices by building those encryption bypasses directly into the communications clients themselves and deploying what amounts to machine-based wiretaps to billions of users at once.
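The flow described in that quote can be sketched in a few lines: the client scans each message in cleartext before it is encrypted, and on a policy hit quietly forwards a copy to a central server. All names here are illustrative, not Facebook's actual code; the point is that end-to-end encryption stays intact on the wire while the endpoint itself acts as the wiretap:

```python
# Hypothetical sketch of embedded client-side moderation in an
# "end-to-end encrypted" messenger, per the flow quoted above.

BLACKLIST = {"forbidden topic"}  # continually updated from a central service

def violates_policy(cleartext: str) -> bool:
    return any(term in cleartext.lower() for term in BLACKLIST)

def send_message(cleartext: str, encrypt, upload_to_server):
    # The scan runs BEFORE encryption, so the encryption is never broken,
    # merely bypassed: the client sees everything the user types.
    if violates_policy(cleartext):
        upload_to_server(cleartext)  # copy streamed back, per the quote
    return encrypt(cleartext)
```

The same applies on the receiving side, after decryption. This is why the EFF calls client-side scanning a backdoor regardless of how the encryption itself is implemented.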
The “embedded content moderation and blacklist filtering algorithms” that Facebook wants to build into WhatsApp also sound very similar to what’s already installed on the Xiaomi phones mentioned above. In the meantime, Facebook already reads supposedly private communications of users on WhatsApp. Here’s from the New York Post, “Facebook reads and shares WhatsApp private messages: report” (September 7th 2021):
Facebook’s encrypted messaging service WhatsApp isn’t as private as it claims, according to a new report. The popular chat app, which touts its privacy features, says parent Facebook can’t read messages sent between users. But an extensive report by ProPublica on Tuesday claims that Facebook is paying more than 1,000 contract workers around the world to read through and moderate WhatsApp messages that are supposedly private or encrypted.
What’s more, the company reportedly shares certain private data with law enforcement agencies, such as the US Department of Justice. The revelation comes after Facebook boss Mark Zuckerberg has repeatedly said that WhatsApp messages are not seen by the company. “We don’t see any of the content in WhatsApp,” the CEO said during testimony before the US Senate in 2018.
Privacy is touted even when new users sign up for the service, with the app emphasizing that “your messages and calls are secured so only you and the person you’re communicating with can read or listen to them, and nobody in between, not even WhatsApp.”
“Those assurances are not true,” said the ProPublica report. “WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users’ content.”
Facebook acknowledged that those contractors spend their days sifting through content that WhatsApp users and the service’s own algorithms flag, and they often include everything from fraud and child porn to potential terrorist plotting.
Are you surprised? I certainly am not. Fuckerberg keeps lying and keeps getting away with it.
Apple is even going a step further and is now working on features that will detect “cognitive issues” when people use their devices. Here’s from The Wall Street Journal, “Apple Wants iPhones to Help Detect Depression, Cognitive Decline” (September 21st 2021):
Apple Inc. is working on technology to help diagnose depression and cognitive decline, aiming for tools that could expand the scope of its burgeoning health portfolio, according to people familiar with the matter and documents reviewed by The Wall Street Journal.
Using an array of sensor data that includes mobility, physical activity, sleep patterns, typing behavior and more, researchers hope they can tease out digital signals associated with the target conditions so that algorithms can be created to detect them reliably, the people said. Apple hopes that would become the basis for unique features for its devices, according to the people and documents.
You can imagine that such features could easily be used to create psychological profiles of users, allowing Apple and criminal governments around the world to, easily and very early on, detect when someone starts to behave a lot like, for example, a “conspiracy theorist.” Facebook has been doing such profiling and psychological experiments on users for many years now, and I blogged about a personal case back in 2018 when Facebook thought I was “going through a difficult time.”
The above, and much more, is why I switched to using Signal instead of WhatsApp, and why I’m going to switch to using a privacy phone like the Librem 5 in the near future. Just like Lithuania’s Defense Ministry, I highly recommend throwing away all Apple and Chinese surveillance hardware and software, if you know what’s good for you.