1. WhatsApp has a backdoor that can be used to decrypt your "end-to-end encrypted" texts.
2. People discover said backdoor (presumably through poking around or reverse engineering, since the app is proprietary and code inspection isn't possible) and warn WhatsApp about it.
3. WhatsApp acknowledges the backdoor's existence, says it's intentional ("it's not a bug, that's our design!"), and won't bother fixing it.
4. Millions of people continue using a communications channel that can be actively intercepted, because they were told that some "encryption" thing they know nothing about would protect them.

Why am I not surprised here?

https://gnusocial.club/url/40013

Don't use #WhatsApp. Just don't.

!privacy
@kzimmermann2 It's not a backdoor, but it's a terrible vulnerability dismissed as a usability feature. (Which does fit the definition of some backdoors, granted.)

With that said, WhatsApp is proprietary and they can decide at any point to do something malicious without the user's knowledge or consent.
I think that's the key point: it's proprietary, and we cannot inspect or change the way it works.

If this were Free Software, with some effort I could understand this being defended as some kind of feature ("hey, you don't like it, fork it!"). However, since users have no choice and this behavior can be used to compromise security, I don't see how this is any different from how Microsoft leaves some vulnerabilities unpatched or, worse, even shares them with the NSA.

And sadly, just like with Windows computers, sometimes we have to use WhatsApp too.
@kzimmermann2 If you're defining any proprietary software as a potential backdoor, then I'm with you.
Well, I'm not defining proprietary software that way, but if I were, I would include "not giving users a choice" somewhere in there. Including the choice to avoid or fix known backdoors.