Head of the Signal app threatens to withdraw from Europe

https://lemmy.zip/post/50130760


Signal CEO Whittaker said that in the worst case scenario, they would work with partners and the community to see if they could find ways to circumvent these rules. Signal also did this when the app was blocked in Russia or Iran. “But ultimately, we would leave the market before we had to comply with dangerous laws like these.”

This is why we need the ability to sideload apps.

Most likely one reason, among others, that they’re fighting tooth & nail to remove sideloading too.
Are they?
Google will soon stop you sideloading unverified apps – here’s what that means for you


Google will soon stop you sideloading unverified apps

unverified

i.e., unsigned, so they are not

fighting tooth & nail to remove side loading too

Sideloading is still available: you can sign it yourself or bypass verification with ADB, as they documented.

Will Android Debug Bridge (ADB) install work without registration? As a developer, you are free to install apps without verification with ADB.

If I want to modify or hack some apk and install it on my own device, do I have to verify? Apps installed using ADB won’t require verification.
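In practice, the two documented paths boil down to something like this (filenames and the keystore alias are hypothetical; this assumes the JDK’s keytool, the Android build-tools’ apksigner, and adb with a USB-debugging-enabled device attached):

```shell
# Path 1: sign the APK yourself with your own key.
keytool -genkeypair -keystore my.keystore -alias me -keyalg RSA -validity 10000
apksigner sign --ks my.keystore --out my-app-signed.apk my-app.apk

# Path 2: skip verification entirely by installing over ADB.
adb install my-app.apk
```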

So, cool misinformation.

Frequently asked questions  |  Android developer verification  |  Android Developers


Bruh, you’re trying to sanewash this of all things? Right now I can go to any third-party app store and click install on an app without me or the developer having to kiss the ring of Google or, by extension, the regulators (EU with Chat Control) that they are beholden to.

After this I’ll have to fucking install Google’s SDK on my computer, manually download application files, and deploy them to my device over USB with CLI commands. I will never ever ever be able to get friends and family access to third-party applications after this change.

And fuck, man, there’s not even a guarantee this solution will last, either. Google promised they would allow on-device sideloading back when they started adding deeper and deeper settings restrictions on enabling sideloaded app support, their word means fuck-all and you know that.

You misidentified your objection. It isn’t sideloading removal, which isn’t happening. It’s developer verification, which affects the sideloading that remains available.

Just because you don’t understand the value of verifying signatures doesn’t mean it lacks value.

I recall the same alarm over Secure Boot: there, too, we can (load our certificates into Secure Boot and) sign everything ourselves. This locks down the system against boot-time attacks.

I will never ever ever be able to get friends and family access to third-party applications after this change.

Then sign it: problem solved.

Developer verification should also make it hard enough for them to install trash that fucks their system and steals their information, when that trash is unsigned or signed & suspended.

Even so, it’s mentioned only in regard to devices certified for and shipping with Play Protect, which I’m pretty sure can be disabled.

Google promised they would allow on-device sideloading

Promise kept.

their word means fuck-all and you know that

No, I don’t. Developers are always going to need some way to load their unfinished work.

Android developer verification  |  Android Developers

ADB is functionally useless for most people.

That’s twice that you’ve missed the point that everyone else is saying. Read it again:

without me nor the developer having to kiss the ring of Google or by extension the regulators (EU with Chat Control) that they are beholden to

Google is irreversibly designating themselves the sole arbiter of what apps can be freely installed in the formerly-open Android ecosystem. It’s the same as if they just one day decided that Chromium-based browsers would require sites have a signature from Google and Google alone. I honestly don’t give a shit if they did it just on Pixel devices, but they’re doing it to the phones of ALL manufacturers by looping it into Play services.

I just don’t understand: why the fuck are you so pussy-whipped by Google that you’re stanning their blatant power grabs?

I just don’t understand: why the fuck are you so pussy-whipped by Google that you’re stanning their blatant power grabs?

Probably works at google or is a fanboy.

They’re being precise about their terms, while everyone else is being sloppy. Not stanning

I don’t understand why you can’t read: (1) developer verification can be disabled, bypassed, or worked with, (2) you called it sideloading removal, which it isn’t.

You just don’t like the extra steps that limit the ease for ignorant users to install software known to be malicious that could have been blocked. I don’t like handholding my dumbass folks through preventable IT problems they created.

This does fuck all for “security”. It mainly targets power users and just puts more hoops in front of developers. This has nothing to do with security (they should purge malware from the Play Store first) and everything to do with consolidating power over users.

It’s a blatant power grab and I’m surprised to see this interpreted as anything else. Arguing about semantics just helps Google fuck everyone over.

So let me buy a goddamn phone that I can install what I want in it. Again, I do not give a shit about any phone manufacturers that want to make a walled garden out of their Android installations. I agree, it’s perfect for the grandmas of the world. But Google is forcibly doing this to every goddamn phone, phone manufacturer, and Android enthusiast.

The only silver lining is that whenever Google decides that unregulated social media services like Lemmy are not family-safe I won’t have to listen to your malicious horseshit.

Seems you don’t care about grandmas & gen z.

forcibly doing this to every goddamn phone, phone manufacturer, and Android enthusiast

They can manage.

whenever Google decides that unregulated social media services like Lemmy are not family-safe I won’t have to listen to your malicious horseshit

So casual users can get wrecked, yet I’m malicious? Maybe think of users other than yourself, weigh the potential losses to them by successful attacks, and consider whether OS designers have a legitimate claim in preventing exposure of known threats to casual users while still allowing power users to bypass those checks.

You’re assuming I use an Android app (trash) to get on here, and not a proper workstation or web browser. You’re welcome to this “malicious horseshit” for eternity.

developer verification can be disabled, bypassed, or worked with

In reality, this is useless given the technical capabilities (or access to the technology necessary) of nearly every android user. What percentage of them do you think has the capacity or capability to use ADB?

you called it sideloading removal, which it isn’t.

Strictly, it ticks the box; effectively, it is sideloading removal. Arguing otherwise honestly makes me think you work for them. It’s such obvious marketing bullshit: “Oh, we left this tiny window open to tick the box, which people can use, but almost certainly not you, and even if you are capable, it’s a pain in the arse.” There are 7 intelligent people in my house. I’m the only one capable of using ADB without enormous effort, making it a deliberately huge barrier, and even I’m not going to do it to install a trusted open-source app.

Let’s be clear; the only reason they left that little window open was to have people like you say “no, sideloading is still possible” to cover their arses legally and also for actual developers, not because they care about an open ecosystem.

What percentage of them do you think has the capacity and capability to use ADB?

All of them: they can follow procedures, plug a cable, and push buttons if they really want to. Most won’t bother: capacity isn’t willpower.

it’s a pain in the arse

That’s the idea: welcome to an effective deterrent.

even I’m not going to do it to install a trusted open source app

Good, then it’ll deter as designed.

the only reason

Nah, the use cases are legitimate:

  • It will actually deter installation of malicious software once it’s been identified & flagged that way in their system.
  • It also verifies install packages haven’t been tampered with (possibly maliciously) since their original releases.

Malicious software on devices connected to everything, including highly sensitive information, poses high-cost risks that you & casual users overlook because muh inconvenience 😭. If casual users can’t be bothered with a straightforward procedure, as you say, then how prepared are they to handle the real consequences of a successful attack?

From a security perspective, it makes sense for OS designers to choose to limit exposure to that threat to power users who can be expected to at least have a better idea of what they’re getting themselves into.

Google employee confirmed. Absolute trash reasoning, verging on trolling it’s so ridiculous. Wild that you’re arguing so vehemently in favour of reduced access to use your hardware the way you want.

All of them

Laughable. You’ve obviously never worked in any kind of customer support role.

Most people are going to melt at the steps necessary to use adb.

capacity isn’t willpower.

By capacity I meant access to hardware. There are so many people in poorer countries who don’t have a laptop, or permission to use one to install ADB, but do have an Android phone.

welcome to an effective deterrent.

I don’t want an effective deterrent that effectively kills F-Droid and the like. That’s the whole point. I’ve favoured Android because it’s more open. The talking points in favour of this pale in comparison to the loss of freedom.

Honestly just jog on. Please.

Even the OPlus phones are planning to softlock their phones in newer models
That means nothing when the servers stop taking EU traffic. I get your point, but the real solution here is putting a bullet (double tap) in Chat Control, once and for all.
You can run your own server for signal by the look of it
Not officially, I don’t think. And even if you did, you’d need a customized app to point to said server, and then you wouldn’t be interoperable with the regular Signal network

That means nothing when the servers stop taking EU traffic

I don’t use any of these apps, so I’m not quite sure how they work. But couldn’t you just make an app that keeps a local private and public key pair? Then when you send a message (say via regular SMS), it includes your public key under the hood. Then the receiver, when they reply, uses your public key to encrypt the message before sending it to you?

Unless the sms infrastructure is going to attempt to detect and reject encrypted content, this seems like it can be achieved without relying on a server backend.
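As a toy sketch of that keypair idea (textbook RSA with tiny primes; a real app would use a vetted cryptography library, and the numbers here are purely illustrative):

```python
# Textbook RSA with tiny primes -- illustrative only, NOT secure.
p, q = 61, 53
n = p * q                           # public modulus (part of the public key)
e = 17                              # public exponent; (n, e) is what you'd text around
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, never leaves the device

def encrypt(m, pub):
    """Sender encrypts with the recipient's public key."""
    n, e = pub
    return pow(m, e, n)

def decrypt(c):
    """Only the private-key holder can invert this."""
    return pow(c, d, n)

ciphertext = encrypt(42, (n, e))
print(decrypt(ciphertext))  # 42
```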

That makes the assumption you want to use your phone number at all. And I’m sure the overhead of encryption would break SMS due to the limits on character counts.

That makes the assumption you want to use your phone number at all

Can’t use Signal without a phone number.

You CAN use it to interact with people without them knowing your number. The only current requirement is specific to registration.

That is how the Signal protocol works: it’s end-to-end encrypted, with the keys known only to the two ends.

The issue is that servers are needed to relay the connections (they only hold public keys) because your phone doesn’t have a static public IP that can reliably be communicated to. The servers are needed to communicate with people as they switch networks constantly throughout the day. And they can block traffic to the relay servers.

I think they’re suggesting doing it on top of SMS/MMS instead of a different transport protocol, like Signal does, which is IP based
Which is what TextSecure was. The precursor to Signal. Signal did it too, but removed it because it confused stupid people.

Signal does have a censorship circumvention feature in the advanced settings on iOS which may work when this hits provided you already have the app installed. Never had to use it though.

I think SimpleX removes the need for static relays.
SimpleX Chat - Contact

It was so hard getting people to use Signal, I’m imagining this’ll never catch on

It is potentially doable:

A short message is 140 bytes of GSM 7-bit packed characters (i.e. each character is mapped to a 7-bit “ASCII”-style alphabet, and the septets are packed together across byte boundaries), so we can get 160 characters per SMS.

According to crypto.stackexchange, a 2048-bit RSA key pair gives a base64-encoded public key of about 392 characters.

That would mean 3 SMSs per person you send your public key to. For a 4096-bit private key, this amounts to 5 SMSs.

As key exchange only has to be sent once per contact it sounds totally doable.

After you’ve sent your public key around, you should now be able to receive encrypted short messages from your contacts.

The output length of a ciphertext depends on the key size, according to crypto.stackexchange and RFC 8017. This means we have 256 bytes of ciphertext for each message encrypted with a 2048-bit key, and 512 bytes with 4096-bit keys. Translated into short messages, that’s 2 or 4 SMSs per text message respectively: a 1:2 or 1:4 ratio.

  • NIST recommends abandoning 2048-bit keys by 2030 and moving to 3072-bit keys (probably a 1:3 ratio)
  • the average number of text messages sent per day per subscriber seems to be around 5–6 SMS globally; this excludes WhatsApp and Signal messages, which seem to be more popular than SMS in many parts of the world [citation needed, I just quickly googled it]

Hope you have a good SMS plan 😉
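The arithmetic above can be sanity-checked in a few lines (the 392-character figure is from the post; 736 characters for a 4096-bit public key is my own estimate; 160 GSM-7 chars and 140 binary bytes are the single-SMS limits, ignoring concatenation headers):

```python
import math

def sms_count(units, per_sms):
    """SMS segments needed for a payload (ignoring concatenation headers)."""
    return math.ceil(units / per_sms)

# Key exchange: base64-encoded public key sent as GSM 7-bit text.
print(sms_count(392, 160))  # 3 SMSs for a 2048-bit key's public half
print(sms_count(736, 160))  # 5 SMSs for a 4096-bit key

# Each message: raw RSA ciphertext as an 8-bit binary payload.
print(sms_count(256, 140))  # 2 SMSs per text with a 2048-bit key (1:2 ratio)
print(sms_count(512, 140))  # 4 SMSs per text with a 4096-bit key (1:4 ratio)
```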

What is the public key length of RSA and Ed25519?

That’s how Signal started way back. Doesn’t work well - SMS is terrible.

putting a bullet (double tap) in Chat Control,

Yes, please.

once and for all.

LOL, no. They’ll come back again with some other bullshit to Save the Children!™, it’s a never-ending whack-a-mole.

And they only have to win once; we have to fight and win every time they introduce a new variant. It’s exhausting.
We need to get the right to privacy and control over our own devices enshrined as fundamental rights, like so many other rights the EU protects.
Signal has never done that. Whilst the app might not be available in some regions they’ve been proud to talk about how people can use it to avoid government barriers.
The CEO is saying they are willing to, that should be taken seriously.
I have become convinced by Cory Doctorow’s (tech writer and inventor of the term “enshittification”) argument that the fact that we’re even discussing this in terms of “sideloading” is a massive win for tech companies. We used to just call that “installing software” but now for some reason because it’s on a phone it’s something completely weird and different that needs a different term. It’s completely absurd to me that we as a society have become so accustomed to not being able to control our own devices, to the point of even debating whether or not we should be allowed to install our own software on our own computers “for safety.” It should be blatantly obvious that this is all just corporate greed and yet the general public can’t or refuses to see it.
TBH I was confused when I came across the term “sideloading” for the first few times because I thought it was something new. Part of the plan, I guess. Damn.
Most of the general public buries their head in the sand. They are convinced being politically involved is either a waste of time or makes you crazy.

Tbf both are true.

Source: I have gone mad and everything has only become worse.

There are groups to support:

And in the UK:

Some political groups are better than others, but most politicians are clueless.

The key is to get muggles to understand we are living in Technofeudalism and why being digital serfs is bad. The problems are ineffective competition law and monopolies; monopolies and standards are not the same thing. I have no idea how. Most people are just naturally compliant and unquestioning of something seemingly so abstract.

Electronic Frontier Foundation: EFF has hundreds of donors and thousands of active supporters throughout Europe, works with the many digital rights organizations across the continent, and is a member of EDRi, the international digital rights advocacy organization based in Brussels.

In the 80s (I’m that old), many home computers came with the programming manual, and the impetus was to learn to code and run your programs on your own device. Even with Android it’s not especially hard (with LLMs, even less so than it used to be) to download Android Studio, throw some shit onto the screen, hit build, and run your own helper app or whatever, sideloaded via USB cable (or wirelessly) onto your own device.

In certain cases (cars, health-related hardware, etc.) I get why it’s probably for the best if the user is not supposed to mod their device outside the preinstalled software’s preferences/settings. But when it comes to computers (i.e. smartphones, laptops, tablets, TV boxes, etc.) I fully agree with Cory here. Such a shame everything must go to shit.

About freedom, non-freedom and various other things: one might want to extend the common logic of gun laws to the rest of human societies’ dynamics.

Signal is scary in the sense that it’s a system based on cryptography. Cryptography is a reinforcement, not a basis, if we are not discussing a file encryption tool. And it’s centralized as a service and as a project. It’s not a standard, it’s an application.

It can be compared to a gun - being able to own one is more free, but in the real world that freedom affects different people differently, and makes some freer than the other.

Again, Signal is a system based on cryptography most people don’t understand. Why would there not be a backdoor? The things its developers call a threat to rapid reaction to new vulnerabilities and practical threats are, to the same extent, a threat to the monoculture of implementations and algorithms - a monoculture which allows backdoors in both.

It is a good tool for people whom its owners will never be interested in hurting - by using that backdoor out in the open that most people are not qualified to find, or by pushing a personalized update with a simpler backdoor, or by blocking their user account at the right moment in time.

It’s a bad tool even for them, if we account for the false sense of security of people who run Signal on their iOS and Android phones, or on PCs under popular OSes. I also distinctly remember how Signal was one of the applications that motivated me to get an Android device. Among the weird people who didn’t have one then (around 2014) I might be even weirder, but if not, this seems to be a tool of soft pressure to push people toward compromised suppliers.

Signal discourages alternative implementations, Signal doesn’t have a modular standard, and Signal doesn’t want federation. In my personal humble opinion this means that Signal has their own agenda which can only work in monoculture. Fuck that.

I think you may need some sleep man. wtf are you talking about
Perhaps you need to get some sleep if you don’t understand what I’m talking about.

I get it messenger = gun wow i didnt know!

Holstering my phone now thanks

Unironically yes, communications (information and roads) were historically as important. Lenin’s call to “take post, telegraph, telephone stations, bridges and rail stations” kinda illustrates that.

What I meant is that abstractly having fully private and free communications is just as universally good as everyone having a drone army. In reality both have problems. The problems with weapons are obvious, the problems with communications in my analogy are not symmetric to that, but real still - it’s that people can be deceived and backdoors and traps exist. Signal is one service, application and cryptographic system, it shouldn’t be relied upon this easily.

It’s sometimes hard to express things based only on someone with good experience telling them to me, making it an appeal to anonymous authority, but a person who participated in a project for a state security service once told me that in those services cryptography is never the basis of a system. It can only be a secondary part.

Also, other than backdoors and traps, imbalance exists. Security systems are tools for specific purposes, none are universal. 20 years ago anonymity and resilience and globalism (all those plethora of Kademlia-based and overlay routing applications, most of which are dead now) were more in fashion, and now privacy and political weight against legal bans (non-technical thing, like, say, the title of the article) are. The balance between these in popular systems determines which sides and powers lose and benefit from those being used by many people. In case of Signal the balance is such that we supposedly have absolute privacy and convenience (many devices, history), but anonymity, resilience and globalism are reduced to proverbial red buttons on Meredith Whittaker’s table.

Unfortunately, I don’t get most of your references, but sure, you can find similarities in wildly different things.

Signal being easy to rely on is its biggest benefit. No one will adopt something that’s more complex, but I don’t think extra complexity would offer better security for the average person. More complexity just means more things to go wrong.

People can be deceived anywhere in their life; this isn’t unique to an end-to-end encrypted chat.

Backdoors do exist and they are obviously bad, but Signal choosing to leave the market before implementing one sounds best to me.

state security service once told me that in those services cryptography is never the basis of a system. It can only be a secondary part.

Obviously I’m no smarter than this person, but without cryptography, how is any “secure” project actually secure? The only thing more important that I can imagine would be the physical location of a server (for example) being highly protected from bad actors.

In the end, I personally think having an easy-to-use platform that is secure gives everyone amazing power to recoup their free speech wherever it is eroded.

Signal being easy to rely on is its biggest benefit. No one will adopt something that’s more complex, but I don’t think extra complexity would offer better security for the average person. More complexity just means more things to go wrong.

My concern here is more that an acceptable share of anything in the internetworked world seems to be in percentages far smaller than the usual common-sense percentages. Like - there are political systems with quotas, and there are anti-monopoly regulations, but with computers and the Internet every system is a meta-system, allowing an endless supply of monopolies and monocultures.

Signal is so easy to rely on that if you ask which applications people use - with zero-knowledge cryptography and reliable group-chat encryption and so on, available without p2p (which drains battery and has connectivity requirements), with voice calls and file transfers - it’ll be mostly Signal.

It doesn’t matter that it’s only one IM application. In its dimension it’s almost a monoculture. One group of developers, one company, one update channel. An update comes with a backdoor and it’s done.

It’s not specifically about Signal; rather, the amount of effort and publicity that goes into a year-2002 schoolgirl’s webpage is as much as any single IM application should get, if we want to avoid dangers of the Internet that don’t exist in other spheres. And they usually get more. The threshold where something becomes too big is much smaller with computers than with, I don’t know, garden owner associations.

If there are already backdoors put by their developers into a few very “open”, ideologically nice, friendly and “honorable” things like Signal, then such backdoors can exist and be used for many years before being found.

I mean, there are precedents IRL, and with computers you are hiding the needle in a much bigger haystack.

Obviously I’m no smarter than this person

I’m bloody certain you are smarter than this person in everything not concerning the things they were directly proficient in. And while being an idiot, they would stick their nose into everything not their concern, in ways very dangerous for others (not for them).

but without cryptography how is any “secure” project actually “secure”.

There are security schemes, security protocols, security models, and then there is cryptography as one kind of building blocks, with, just like in construction materials, its own traits and behavior.

In the end, I personally think having an easy-to-use platform that is secure gives everyone amazing power to recoup their free speech wherever it is eroded.

And I think the moment anything specific and controlled by one party becomes popular enough to be a platform, we’re screwed and we’re not secure.

Reminds me of SG-1 and the Goa’uld (not good guys, I know) adjusting their spawn’s genome for different races.

Perhaps something like that should be made: a common DSL for describing application protocols, and maybe even transport protocols, where we’d have many different services and applications announcing themselves with a message in that DSL describing how to interact with them. (Also inspired by what Telegram’s creators have done with their MTProto thing, but even more general; Telegram sometimes seems like something that grew out of an attempt to do a very cool thing. I dunno if I was fair saying bad things about Durov on the Internet.)

A bit like how Han Solo and Chewbacca speak to each other in Star Wars.

And a common data model, fundamentally extensible: say, posts as data blobs with any number of tags of any length; it’s up to any particular application to decide on limits. Even which tag is the ID, and how it’s connected to the data blob’s contents and other tags, is up to any particular application. What matters is that posts can be indexed by tags and then replicated/shared/transferred/posted by various application protocols.

It should be a data-oriented system, so that one would, except for latency, use it as well by sharing post archives as they would by searching and fetching posts from online services, or even subscribing to posts of specific kind to be notified immediately. One can imagine many kinds of network services for this, relay services (like, say, IRC), notification services (like, say, SIP), FTP-like services, email-like services. The important thing would be that these are all transports, all variable and replaceable, and the data model is constant.
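The tagged-blob model described above might look something like this minimal sketch (all names hypothetical; a real system would persist and replicate the store over those transports):

```python
from collections import defaultdict

class PostStore:
    """Posts are opaque data blobs with arbitrary tags; transports
    would only need to index and exchange (tags, blob) pairs."""

    def __init__(self):
        self.posts = []                 # list of (tags, blob)
        self.index = defaultdict(set)   # (tag, value) -> post positions

    def put(self, blob, **tags):
        pos = len(self.posts)
        self.posts.append((tags, blob))
        for kv in tags.items():
            self.index[kv].add(pos)

    def find(self, **tags):
        """Return blobs matching all given tags, in insertion order."""
        hits = set.intersection(*(self.index[kv] for kv in tags.items()))
        return [self.posts[i][1] for i in sorted(hits)]

store = PostStore()
store.put(b"hello", kind="message", author="alice")
store.put(b"cat.jpg", kind="file", author="alice")
print(store.find(author="alice", kind="message"))  # [b'hello']
```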

There can also be a DSL that describes the basics of how a certain way of interpreting posts and their tags works, and which buttons, levers and text fields it presents, somewhat similar to how we use the Web. That should be a layer above the DSL that describes verification of checksums, identities, connections, trust, who has which privileges, and so on.

Except all these DSLs should be concise and comprehensible, because otherwise they will turn into something like TG’s protocol in complexity and ugliness.

OK, I have a fever and I think I’ve lost my train of thought.

I am starting to agree with the new point. I still think everyone should move to Signal for now because it works and works well, but I see your point that one authority can become dangerous if any one malicious party in power tried anything.

There are probably solutions that could exist because it’s open source (e.g. a different trusted entity like F-Droid managing builds from source, so Signal themselves can’t add extra code to their builds; or just a way to verify that no extra code is present in Signal’s builds vs any build from source).

In the future, I would prefer we moved to something more decentralised, like what the Matrix protocol is trying to achieve. This could come with further issues, but while those are being fixed, Signal is my main go-to.

With Matrix, I believe we would end up with pretty much the common data model you were describing. Anyone can build their own server and/or client and interact with others, knowing at least that their software is safe.

I don’t think you understand anything you wrote about. Signal is open source, is publicly audited by security researchers, and publishes its protocol, which has multiple implementations in other applications. Messages are encrypted end-to-end, so the only weaknesses are the endpoints: the sender or recipients.

Security researchers generally agree that backdoors introduce vulnerabilities that render security protocols unsound. Other than creating opportunities for cybercriminals to exploit, they only serve to amplify the powers of the surveillance state to invade the privacy of individuals.

I don’t think you understand anything you wrote about. Signal is open source,

I don’t think you should comment on security if “open source” means anything to you in that regard. For finding backdoors binary disassembly is almost as easy or hard as looking in that “open source”. It’s very different for bugs introduced unintentionally, of course.

Also, why the hell are you even saying this? Have you looked at that source for long enough? If not, then what good is it to you? Magic?

I suppose you are an illustration of the joke about Raymond’s “enough eyeballs” quote: people talking about “enough eyeballs” are not using their eyeballs to find bugs/backdoors; they are using them, and their hands, to type the “enough eyeballs” bullshit.

“Given enough good people with guns, all streets in a town are safe”. That’s how this reads for a sane person who has at least tried to question that idiotic narrative about “open source” being the magic pill.

Stallman’s ideology was completely different, sort of digital anarchism, and it has some good parts. But the “open source” thing - nah.

is publicly audited by security researchers,

Exactly, and it’s not audited by you, because you, for the life of you, won’t understand WTF happens there.

Yes, it’s being audited by some security researchers out there, mostly American. If you don’t see the problem you are blind.

and publishes its protocol, which has multiple implementations in other applications.

No, there are no multiple implementations of the same Signal thing. There are implementations of some mechanisms from Signal. Also, have you considered that this is all a fucking circus, a steel gate in a flimsy wooden fence? Or fashion, if that’s easier to swallow.

Can you confidently describe what zero-knowledge means there, how it is achieved, and why any specific part of the articles they’ve published matters? If you can’t, what’s the purpose of it being published? It’s like a schoolboy saying “but Linux is open, I can read the code and change it for my needs”, yeah, lol.

Security researchers generally agree that backdoors introduce vulnerabilities that render security protocols unsound.

Do security researchers have anything to say about DARPA, which funds many of them? That being an American military agency.

And on how that affects what they say and what they don’t say, what they highlight and what they pretend not to notice.

In particular, with a swarm of drones in the sky at some point, do you need to read someone’s messages, or is it enough to know that said someone connected to Signal’s servers 3 minutes ago from a very specific location, and send one of those drones? Hypothetically.

Other than create opportunities for cybercriminals to exploit, they only serve to amplify the powers of the surveillance state to invade the privacy of individuals.

Oh, the surveillance state will be fine in any case!

And we should all praise cybercriminals for showing us what the surveillance state would want to keep hidden behind a false notion of security and privacy. Back when cybercriminals hadn’t yet lost the war to said surveillance state, every computer user knew not to store anything too personal in digital form on a thing connected to the Internet. Now people expose everything, because they think that if cybercriminals can no longer abuse them, neither can the surveillance state.

Do you use Facebook, with TLS up to its servers and nothing at all beyond that? Or Google, the same?

Now Signal gives you the feeling that at least what you say is hidden from the service. But can you verify that? Maybe there is relevant research that is still classified, possibly done independently in a few countries. This is a common thing in cryptography: scientific work on it is often a state secret.

You are also trusting NSA-influenced primitives all the time; recall DES, whose s-boxes came with NSA-mandated changes and design criteria that stayed classified for decades.

I suggest you do some playing with cryptography in practice. Too few people do, while it’s very interesting and enlightening.
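For example, you can rebuild the AES s-box yourself from its published definition (multiplicative inverse in GF(2^8), then a fixed affine map with constant 0x63) and check it against the standard’s table. A minimal Python sketch of that exercise:

```python
# Rebuild the AES s-box from its published definition (FIPS-197):
# byte -> multiplicative inverse in GF(2^8) mod x^8+x^4+x^3+x+1,
# followed by a fixed affine transform with constant 0x63.

def gf_mul(a: int, b: int) -> int:
    """Multiply in GF(2^8) with the AES reduction polynomial 0x11B."""
    p = 0
    while b:
        if b & 1:
            p ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return p

def gf_inv(x: int) -> int:
    """Multiplicative inverse; 0 maps to 0, per the AES spec."""
    if x == 0:
        return 0
    return next(y for y in range(1, 256) if gf_mul(x, y) == 1)

def sbox(x: int) -> int:
    """Inverse in GF(2^8), then the AES affine transform, bit by bit."""
    inv = gf_inv(x)
    out = 0
    for i in range(8):
        bit = ((inv >> i) ^ (inv >> ((i + 4) % 8)) ^ (inv >> ((i + 5) % 8))
               ^ (inv >> ((i + 6) % 8)) ^ (inv >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
        out |= bit << i
    return out

# Spot-check against the published s-box table.
assert sbox(0x00) == 0x63
assert sbox(0x01) == 0x7C
assert sbox(0x53) == 0xED  # the worked example in the standard
```

Twenty minutes with something like this teaches more about what is and isn’t verifiable in a cipher than a year of forum arguments.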

I don’t think you should comment on security if “open source” means anything to you

Anyone can look at the source, brah, and security auditors do.

For finding backdoors, binary disassembly is almost as easy, or as hard, as reading that “open source”.

Are you in the dark ages? Beyond code review, there are all kinds of automations to catch vulnerabilities early in the development process, and static code analysis is one of the most powerful.

Analysts review the design & code and subject it to various security analyzers, including tools that inspect source code, analyze dependencies, check data flow, and test dynamically at runtime.
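To make “static code analysis” concrete, here is a toy rule built on Python’s stdlib `ast` module that flags calls to `eval` without ever running the code; real analyzers apply thousands of such rules automatically on every commit (the snippet and the rule are illustrative, not any specific tool):

```python
import ast

# Code under review; line 3 contains the dangerous call.
SOURCE = """
def handler(data):
    return eval(data)
"""

tree = ast.parse(SOURCE)
findings = [
    node.lineno
    for node in ast.walk(tree)
    if isinstance(node, ast.Call)
    and isinstance(node.func, ast.Name)
    and node.func.id == "eval"  # toy rule: flag any bare eval() call
]
print(findings)  # [3]
```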

There are implementations of some mechanisms from Signal.

Right, the protocol.

Can you confidently describe

Stop right there: I don’t need to. It’s wide open for review by anyone in the public including independent security analysts who’ve reviewed the system & published their findings. That suffices.

Do security researches have to say anything on DARPA that funds many of them?

They don’t. Again, anyone in the public including free agents can & do participate. The scholarly materials & training on this aren’t exactly secret.

Information security analysts aren’t exceptional people and analyzing that sort of system would be fairly unexceptional to them.

Oh, the surveillance state will be fine in any case!

Even with state-level resources, it’s pretty well understood that some mathematical problems underpinning cryptography are computationally beyond the reach of current hardware to solve in any reasonable amount of time. And that cryptography is straightforward for any competent programmer to implement.

Legally obligating backdoors only limits true information security to criminals while compromising the security of everyone else.

I do agree, though: the surveillance state has so many resources to surveil that it doesn’t need another one.

In short: “everyone being able to look at it” is not an argument by itself. The real-world analogies are landmines, drug dealers, and snake oil.

Even with state-level resources, it’s pretty well understood some mathematical problems underpinning cryptography are computationally beyond the reach of current hardware to solve in any reasonable amount of time.

You are not speaking from your own experience. Which problems get solved and which do not is not determined solely by the hardware you have for brute force. Obviously.

And nation states can and do pay researchers whose work is classified. And agencies like the NSA did not, for example, explain the formation process behind their recommended s-boxes.

Solving problems is sometimes done analytically; mostly, that is what “solving problems” means. If that yields some power benefit, it can be classified, you know. And kept as a state secret.

Are you in the dark ages? Beyond code review, there are all kinds of automations to catch vulnerabilities early in the development process, and static code analysis is one of the most powerful.

People putting those in are also not in the dark ages.

Stop right there: I don’t need to. It’s wide open for review by anyone in the public including independent security analysts who’ve reviewed the system & published their findings. That suffices.

There are things that were wide open for review by anyone for thousands of years, yet we got internal combustion engines less than two centuries ago, and electricity, and so on. And in the case of computers, you can make very sophisticated riddles.

So no, that doesn’t suffice.

They don’t.

Oh, denial.

Again, anyone in the public including free agents can & do participate. The scholarly materials & training on this aren’t exactly secret.

There have been plenty of backdoors found out in the open in big open source projects. I don’t see how this is different. I don’t see why you have to argue; is it some religion?

Have you been that free agent? Have you participated? How many people do you think actually check the things they use? How often, and how deeply?

Information security analysts aren’t exceptional people and analyzing that sort of system would be fairly unexceptional to them.

Yes, but you seem to be claiming they have eagle eyes and owl wisdom, to see and understand everything, as if all of mathematics were already invented.

Legally obligating backdoors only limits true information security to criminals while compromising the security of everyone else.

It’s not about obligating someone. It’s about people not working for free: the people working on that free (for you) stuff might have put in backdoors which are very hard to find. Backdoors usually don’t come with “backdoor” written on them.
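A classic illustration of the “no writing on it” point, using a well-known bug pattern (the code is hypothetical): a MAC check that looks like an ordinary byte-by-byte comparison, but accepts an empty tag, because `zip()` silently stops at the shorter sequence.

```python
def verify_mac(supplied: bytes, expected: bytes) -> bool:
    # Looks like a plain byte-by-byte comparison...
    for a, b in zip(supplied, expected):
        if a != b:
            return False
    return True

# ...but zip() truncates to the shorter input, so an attacker
# who sends an empty MAC passes the check every time:
print(verify_mac(b"", b"\x8f\x1c\x22\x90"))  # True
```

Was it a backdoor or an honest mistake? From the code alone, you cannot tell, which is exactly the problem.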

I do agree, though: the surveillance state has so many resources to surveil that it doesn’t need another one.

Perhaps the reason they have so many resources is that they don’t miss opportunities, and they don’t miss opportunities because they have the resources.

You sound paranoid but it doesn’t mean you aren’t right, at least to some extent.
So what’s your solution for secure messaging?
Getting rid of the monoculture by making transports and cryptography pluggable, meaning the resulting system would be fit for sneakernet as well as for some kind of federated relays as well as for something Kademlia-based. The point is that the common standard would describe the data structure, not the transports or the verification and protection.
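A rough sketch of that idea in Python (all names hypothetical): the standard pins down only the envelope encoding, while delivery is an interchangeable interface, so the same blob could travel over a relay, a DHT node, or a USB stick.

```python
import json
from typing import Protocol

def encode_envelope(sender: str, ciphertext: bytes) -> bytes:
    """The only standardized part: the data structure on the wire or disk."""
    return json.dumps({"v": 1, "from": sender, "ct": ciphertext.hex()}).encode()

class Transport(Protocol):
    """Anything that can move an opaque blob; the standard doesn't care how."""
    def deliver(self, blob: bytes) -> None: ...

class SneakernetTransport:
    """Toy back-end: queues blobs destined for a USB stick."""
    def __init__(self) -> None:
        self.outbox: list[bytes] = []
    def deliver(self, blob: bytes) -> None:
        self.outbox.append(blob)

blob = encode_envelope("alice-key-fingerprint", b"\x01\x02")
t = SneakernetTransport()
t.deliver(blob)
assert json.loads(t.outbox[0])["v"] == 1  # any transport carries the same envelope
```

Swapping `SneakernetTransport` for a relay or DHT back-end changes nothing about the envelope, which is the anti-monoculture point.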

That’s a lot of words to say you generally accuse any program that isn’t federated of having an agenda targeted at its user base.

And a lot of social woo-woo that doesn’t extend much further than “people don’t understand cryptography and therefore find it scary”.

A pretty weird post, and one I don’t support a single statement of, because I think you’re wrong.