Signal Welcomes Users Fleeing WhatsApp in Search of Privacy, But Could Be Targeted Next
Jan. 15, 2021 (EIRNS)—The dictatorial control over what can and cannot be said, exerted by Twitter, Facebook, and other social media platforms over the recent period, depends on those platforms knowing what their users are saying. But what if the platform itself does not know, and indeed cannot know, what its users are talking about?
This is the situation under end-to-end encryption, whereby communications between two (or more) parties can be read only by those parties: the provider of the communication service acts purely as a delivery service and has no way to read the messages it carries. Difficult to use in the 1990s, end-to-end encryption became somewhat more user friendly in the first decade of the 2000s. But it was in 2010 that the pseudonymous Moxie Marlinspike created TextSecure and RedPhone, which brought easy end-to-end-encrypted texting and voice calls to Android users. In the years that followed, Marlinspike worked with WhatsApp to introduce end-to-end encryption into that service's communications. But Facebook's ownership of WhatsApp raises questions about whether vulnerabilities have been deliberately introduced into the encryption.
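The core property described above can be illustrated with a toy key exchange. The sketch below uses textbook Diffie-Hellman with deliberately tiny, insecure parameters; Signal's real protocol is far more elaborate (elliptic-curve key agreement plus the Double Ratchet), and the variable names here are purely illustrative. The point is that the relay server sees only public values and still cannot derive the key the two endpoints share.

```python
import secrets

# Toy textbook Diffie-Hellman exchange (illustrative only: these
# parameters are far too small for real use, and Signal's actual
# protocol uses elliptic curves and the Double Ratchet on top).
P = 23   # public prime modulus, known to everyone including the server
G = 5    # public generator, likewise known to everyone

# Each party keeps a private exponent and publishes only G^x mod P.
alice_secret = secrets.randbelow(P - 2) + 1
bob_secret = secrets.randbelow(P - 2) + 1
alice_public = pow(G, alice_secret, P)  # crosses the wire in the clear
bob_public = pow(G, bob_secret, P)      # crosses the wire in the clear

# The relay (the messaging server) sees only the two public values,
# yet both endpoints arrive at the same shared key, which can then
# encrypt their messages end to end.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
```

Because recovering a private exponent from a public value is computationally hard (at realistic key sizes), the service in the middle can deliver the ciphertext but never decrypt it.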
After friction following Facebook's purchase of his company, WhatsApp co-founder Brian Acton left WhatsApp to start the non-profit Signal Foundation with Marlinspike. There, they created Signal, an end-to-end encrypted messaging and voice platform whose source code is made publicly available for auditing and to demonstrate that there are no backdoors.
Created with a $50 million interest-free loan from Acton, Signal relies on donations rather than sales, keeps no user data, holds non-profit status, which prevents it from ever being purchased by a corporation, and is recommended by Edward Snowden himself.
Following the banning of Parler from Google and Apple’s app stores and its expulsion from Amazon Web Services, Signal became, by the middle of this week, the most-downloaded iOS app in 70 countries and the top Android app in 45.
Attempts to mandate backdoors into communications technologies have been made over the years (such as the flawed Clipper Chip proposed in the U.S. in the mid-1990s). But backdoors, to which only the intelligence services would have the proverbial key, are fundamentally impossible to keep secure. If the door exists, there is no way to guarantee that no one else can discover the key.
When Apple refused to create special software to attempt to unlock the iPhone of the 2015 San Bernardino shooter, it stated, correctly, that it was itself incapable of breaking through the robust security features it had built into the product. There was no backdoor.
While strong encryption raises the issue of terrible things being discussed privately—such as murder plots or child abuse imagery—there remains the possibility, as Edward Snowden pointed out in his recent interview with John Stossel, of targeted hacking of the individual phones of people suspected of breaking the law. The widespread use of encrypted communication could force the transformation of mass surveillance into targeted surveillance, dealing a technical blow to the Orwellian horror of total data collection.
Will Signal be added to Google and Apple’s ban lists? Will privacy itself be considered criminal?