U.S. officials urge Americans to use encrypted apps amid unprecedented cyberattack
It’s probably also good practice to assume that not all encrypted apps are created equal. Google’s RCS messaging, for example, is advertised as “end-to-end encrypted,” which makes it sound like a direct and equal competitor to something like Signal. But Google makes its money off your personal data. Protecting that data is not in a company like Google’s interest.
Start assuming every corporation is evil. At worst you lose some time getting educated on options.
End to end is end to end. It’s either “the devices encrypt and sign the messages with keys that never leave the device, so no 3rd party can ever compromise them” or it’s not.
Signal is a more trustworthy org, but Google isn’t going to fuck around with this service to make money. They make their money off you by keeping you in the Google ecosystem and harvesting data elsewhere.
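To make the “keys never leave the device” point concrete, here’s a toy sketch of the underlying idea: each device generates a private key locally, only public values cross the wire, and the relaying server learns nothing it can decrypt with. This is a deliberately insecure classroom Diffie–Hellman with made-up tiny parameters, not Signal’s or RCS’s actual protocol (both use far more involved schemes like X3DH/Double Ratchet and MLS respectively).

```python
import hashlib
import hmac
import secrets

# Toy Diffie-Hellman parameters -- deliberately tiny, NOT secure.
P = 2**61 - 1  # small Mersenne prime used as a toy modulus
G = 2

def keypair():
    priv = secrets.randbelow(P - 2) + 1  # private key: never leaves the device
    pub = pow(G, priv, P)                # public key: the only value transmitted
    return priv, pub

def shared_key(my_priv, their_pub):
    secret = pow(their_pub, my_priv, P)  # both sides compute the same value
    # Derive a symmetric key from the shared secret.
    return hashlib.sha256(secret.to_bytes(8, "big")).digest()

def encrypt(key, data: bytes) -> bytes:
    # Toy XOR stream cipher with an HMAC-based keystream (illustration only).
    blocks = (len(data) + 31) // 32
    stream = b"".join(
        hmac.new(key, i.to_bytes(4, "big"), hashlib.sha256).digest()
        for i in range(blocks)
    )
    return bytes(a ^ b for a, b in zip(data, stream))

decrypt = encrypt  # XOR stream cipher: same operation both ways

# Each "device" generates keys locally and exchanges only public values.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob  # the server relaying a_pub/b_pub cannot derive this

ciphertext = encrypt(k_alice, b"hello from Alice")
print(decrypt(k_bob, ciphertext))
```

The point of the sketch is the trust boundary: whoever operates the relay sees only `a_pub`, `b_pub`, and ciphertext, so “end-to-end” holds or fails entirely on whether the private keys really stay on the devices.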
That’s a different threat model, one verging on the most astonishing corporate espionage in human history and the greatest possible threat to Google’s corporate personhood. It would require thousands, if not tens of thousands, of Google employees coordinating in utter secrecy to commit an unheard-of crime, one that would be punishable by death in some jurisdictions.
If they have backdoored all Android phones and are actively exploiting them in nefarious ways not explained in their various TOS, then they are exposing themselves to ungodly amounts of legal and regulatory risk.
I expect no board of directors wants a trillion dollars of company worth to evaporate overnight, and would likely not be okay with backdooring literally billions of phones, from a fiduciary standpoint alone.
It would require thousands if not tens of thousands of Google employees coordinating in utter secrecy
This is usually used for things like the Moon landing, where so many folks worked for NASA that faking the landing would have been entirely impossible to keep secret.
But it doesn’t really apply here. We know for example that NSA backdoors exist in Windows. Were those a concerted effort by MS employees? Does everyone working on the project have access to every part of the code?
It just isn’t how development works at this scale.
Telegram has its supposedly E2EE protocol, which most Telegram users don’t actually use, and a few questionable traits have been found in it as well.
Google is trusted a bit more than Pavel Durov, but it could well do something similar.
And yes, Android is a much larger haystack in which to hide a needle.
This is usually used for things like the Moon landing, where so many folks worked for NASA that faking the landing would have been entirely impossible to keep secret.
I think it was also confirmed by radio transmissions from the Moon, received in real time by the USSR and other countries.
How do spyware services used by nation-state customers, like Pegasus, work?
They use backdoors in commonly used platforms on an industrial scale.
Maybe some of them are vulnerabilities due to honest mistakes. The problem is that most vulnerabilities from honest mistakes also carry denial-of-service risks in widespread usage, which means they get found quickly enough.
So your stance is that Google is deploying self-designed malware against its own services, violating its own policies, to harvest data that could bring intense legal, financial, and reputational harm to it as an org if it were ever discovered?
Seems far fetched.
Legal and financial? Doubt it. Reputational? Counter-propaganda is a thing.
I think your worldview lags behind our current reality. Even in a 30-year-old reality it would seem a bit naive.
Also you’ve ignored me mentioning things like Pegasus, from our current, not hypothetical, reality.
So yes.
You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?
That is wildly conspiratorial thinking, and honestly plain FUD. It undermines serious, actual privacy issues the company has when you make up wild cabals that are running double secret malware attacks against themselves inside Google.
You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?
You think you are being the smart one here?
No, that’s not what I said. Also, cypherpunks and other hobbyists are not that much smarter than corporations and nation-states; they aren’t the only ones to think about plausible deniability.
For example, the complete Windows source code has been officially provided to various three-letter agencies of various countries (Russia included) for study, and a codebase of that size inevitably contains vulnerabilities. MS might not have left obvious backdoors and informed the FSB of them, but it has given interested parties the ability to find vulnerabilities themselves, which is only a matter of work, and perhaps made it easier to produce tampered versions of DLLs and the like.
Also they are legally obligated to silently comply with a lot of things.
That is wildly conspiratorial thinking, and honestly plain FUD.
WhatsApp and Facebook (before it bought WhatsApp) have both done this, Telegram has done this, MS has done this, even Apple has done this.
when you make up wild cabals that are running double secret malware attacks against themselves inside Google.
You made that up, not me. You should have read what you were being told first.