can someone explain why the guy who posted 45 patches to "rewrite the kernel in c++" in 2018 is posting completely insane modifications to crypto/ in his linux-fs tree (edit: this is all on master and i assume it's going in the 7.0 rc) https://git.kernel.org/pub/scm/linux/kernel/git/dhowells/linux-fs.git/commit/?h=keys-pqc&id=f3eccecd782dbaf33d5ad0d1fd22ea277300acdb

pkcs7: Allow the signing algo to do whatever digestion it wants itself

Allow the data to be verified in a PKCS#7 or CMS message to be passed
directly to an asymmetric cipher algorithm (e.g. ML-DSA) if it wants to do
whatever passes for hashing/digestion itself. The normal digestion of the
data is then skipped as that would be ignored unless another signed info in
the message has some other algorithm that needs it.

so already this is framing ML-DSA as doing its own hashing, like that's some extra step it wants, but the NIST publication says pre-hashing is the job of a separate variant, HashML-DSA

https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.204.pdf

this exact fucking guy has an entirely separate "keyutils" repo (he apparently has a patent in this space?) https://git.kernel.org/pub/scm/linux/kernel/git/dhowells/keyutils.git/ no activity for 3 years

Rename ->digest and ->digest_len to ->m and ->m_size to represent the input
to the signature verification algorithm, reflecting that ->digest may no
longer actually *be* a digest.

https://git.kernel.org/pub/scm/linux/kernel/git/dhowells/linux-fs.git/commit/?h=keys-pqc&id=f728074f1f577565c97e465652c3d4afb0964013

diff --git a/crypto/asymmetric_keys/asymmetric_type.c b/crypto/asymmetric_keys/asymmetric_type.c
index 348966ea2175c9..2326743310b1ee 100644
--- a/crypto/asymmetric_keys/asymmetric_type.c
+++ b/crypto/asymmetric_keys/asymmetric_type.c
@@ -593,10 +593,10 @@ static int asymmetric_key_verify_signature(struct kernel_pkey_params *params,
{
struct public_key_signature sig = {
.s_size = params->in2_len,
- .digest_size = params->in_len,
+ .m_size = params->in_len,
.encoding = params->encoding,
.hash_algo = params->hash_algo,
- .digest = (void *)in,
+ .m = (void *)in,
.s = (void *)in2,
};

i'm gonna have to send an email. this got accepted into the 7.0 rc and it's on my laptop because i pulled master
reading the nist doc first. you will never guess: it says "strongly unforgeable" without citation and with no glossary entry. looking it up, "strongly unforgeable" appears to be strictly a module-learning-with-errors lattice thing

https://eprint.iacr.org/2010/070.pdf

Our construction offers the same efficiency as the “bonsai tree” scheme but supports the stronger notion of strong unforgeability.

hmmmmm sounds fascist

bonsai tree definitely sounds tougher. you know they don't choose specially cultured trees for those? that's a real tree and it was made tough and beautiful

Over the past several years, there has been steady progress toward building quantum computers. The
security of many commonly used public-key cryptosystems will be at risk if large-scale quantum computers
are ever realized. This would include key-establishment schemes and digital signatures that are based on
integer factorization and discrete logarithms (both over finite fields and elliptic curves).

with you

As a result, in 2016, NIST

you've lost me there

choose a normal fucking name what's wrong with you

ML-DSA is derived from one of the selected schemes, CRYSTALS-DILITHIUM [5, 6]

well into the foreseeable future, including after the advent of cryptographically relevant quantum computers.

"cryptographically relevant" makes no fucking sense. 25. cmon. a horse can be trained to factor 25

destroy:
An action applied to a key or a piece of secret data. After a key or a piece of
secret data is destroyed, no information about its value can be recovered.

the definition of digital signature is complete word salad

The result of a cryptographic transformation of data that, when properly implemented, provides a mechanism to verify origin authenticity and data integrity and to enforce signatory non-repudiation.

"when properly implemented". psh yeah you can do cryptographic transformation of data, but can you provide a mechanism to verify origin authenticity and data integrity and to enforce signatory non-repudiation?????

🚨 fedi entities: NIST has a glossary entry for you

An individual person, organization, device, or process. Used interchangeably with party.

🥳

oh my god yes this definition of equivalence

Two processes are equivalent if the same output is produced when the same values are input to each process

ok, cool

(either as input parameters, as values made
available during the process, or both).

literally what the fuck

A function on bit strings in which the output can be extended to any desired length.

any length. ok

as long as the specified output length is
sufficiently long to prevent trivial attacks

wouldn't really say any
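
the "any desired length" thing is just SHAKE, for what it's worth. a minimal sketch with python's hashlib (which, like the implementations the standard keeps apologizing for, works in bytes, not bits):

```python
import hashlib

# XOF in practice: SHAKE256 state, output length picked by the caller.
# shorter outputs are prefixes of longer ones -- it's a stream, not a
# fixed-width digest
x = hashlib.shake_256(b"any length. ok")
short = x.hexdigest(16)   # read 16 bytes of output
longer = x.hexdigest(64)  # read 64 bytes from the same state
assert longer.startswith(short)  # extending the read doesn't change the prefix
```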

oh lmao and they're like "it's one-way", a definition that can only be disproven. collision-resistant: yes it has a meaning. but we're past the c entries. you're too late honey!
is a hash value a message digest?
a hash value is much broader than a message digest
only glossary entry that's a redirect
bro they have shall and should in bold. it's so over

oh my GOD

trusted third party (TTP)
An entity other than the key pair owner and the verifier that is trusted by the owner, the verifier, or both. Sometimes shortened to “trusted party.”

is it? is it sometimes shortened to that?

signature validation
The mathematical verification of the digital signature along with obtaining the appropriate assurances (e.g., public-key validity, private-key possession, etc.).

LMAO literally "the math part plus the government surveillance"

tagging the astral engineer who wrote this because pgp keys were not susceptible to his little wikihow tutorial on cryptographic surveillance https://blog.yossarian.net/2023/05/21/PGP-signatures-on-PyPI-worse-than-useless

PyPI has done everything right, including purposely removing frontend support for PGP years ago.

literally intended to make you think an Event had happened. nope!

Type casting. The variable 𝑥 is assigned a value in a set 𝑆 that is constructed from
the value of an expression 𝑦 in a possibly different set 𝑇. The set 𝑇 and the mapping
from 𝑇 to 𝑆 are not explicitly specified, but they should be obvious from the context
in which this statement appears.

this is like. a math notation? "should be obvious from the context" honey no you need to tell me about this hash scheme and shut up

The Number Theoretic Transform (NTT) is a specific isomorphism between the rings 𝑅𝑞 and 𝑇𝑞.

that is nippon telegraph and telephone

The motivation for using NTT is that multiplication is considerably faster in the ring 𝑇𝑞

that's good. i was worried there was any history of ulterior motives for NIST choosing a specific set of parameters to construct a ring

they mention things that make absolutely no sense and next line oh yeah this symbol means composition of matrix multiplication
i still hate the capital R_m used for the ring of single-variable polynomials. hated it when dan jackoff bernstein used it too
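
for what it's worth the NTT thing is just CRT: X^n + 1 splits into linear factors mod q, so multiplying in R_q is pointwise multiplication of evaluations at its roots. toy sketch with parameters i made up (n=4, q=17, psi=2 -- NOT the ML-DSA ring, which uses n=256, q=8380417):

```python
# toy demo: multiplication in Z_q[X]/(X^n + 1) becomes pointwise
# multiplication of the polynomials' values at the n roots of X^n + 1.
# toy parameters, not the ML-DSA ones.
q, n = 17, 4
psi = 2  # primitive 2n-th root of unity mod q: 2**4 = 16 = -1 (mod 17)

def ntt(a):
    # evaluate a(X) at the odd powers psi**(2j+1): exactly the roots of X^n + 1
    return [sum(a[i] * pow(psi, (2 * j + 1) * i, q) for i in range(n)) % q
            for j in range(n)]

def negacyclic_mul(a, b):
    # schoolbook multiplication mod (X^n + 1, q): X^n wraps around as -1
    c = [0] * n
    for i in range(n):
        for j in range(n):
            k = (i + j) % n
            sign = -1 if i + j >= n else 1
            c[k] = (c[k] + sign * a[i] * b[j]) % q
    return c

a, b = [1, 2, 3, 4], [5, 6, 7, 8]
lhs = ntt(negacyclic_mul(a, b))                      # multiply, then transform
rhs = [(x * y) % q for x, y in zip(ntt(a), ntt(b))]  # transform, then multiply pointwise
assert lhs == rhs
```

that equality is the entire "motivation": pointwise products are n multiplications instead of n^2.
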
oh great the pdf links mysteriously fail at hashml-dsa.sign yesssssss and for the verify phase
algorithms "4" and "5"
YOU CAN'T BE FUCKING SERIOUS

In the HashML-DSA version, the message input to ML-DSA.Sign_internal is the result of applying either a hash function or a XOF to the content to be signed.

so the red-hat guy making insane crypto changes is also just saying insane shit about the crypto that people can't call him out on. ml-dsa does not "do its own hashing"

For some cryptographic modules that generate ML-DSA signatures, hashing the message in step 6 of ML-DSA.Sign_internal may result in unacceptable performance if the message 𝑀 is large.

this fucking high school science project is safe against quantum computers which not only will exist, but could even exist RIGHT NOW!

which is why we are doing this in the open together

come on man don't do this

For some use cases, this may be addressed by signing a digest of the message along with some domain separation information rather than signing the message directly. This version of ML-DSA is known as “pre-hash” ML-DSA or HashML-DSA . In general, the “pure” ML-DSA version is preferred.
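if i'm reading algorithms 2 and 4 right, the pure/pre-hash split is one domain-separator byte in the framed message M' that gets fed to ML-DSA.Sign_internal. a sketch (the OID bytes are the DER encoding of the SHA-512 OID, 2.16.840.1.101.3.4.2.3, assuming SHA-512 as the pre-hash):

```python
import hashlib

# sketch of the FIPS 204 message framing as i read it: both variants
# hand M' to ML-DSA.Sign_internal, and the first byte says which
# variant built it, so a pre-hash signature can never verify as pure
SHA512_OID = bytes.fromhex("0609608648016503040203")  # DER OID for SHA-512

def pure_mprime(msg: bytes, ctx: bytes = b"") -> bytes:
    assert len(ctx) <= 255
    return bytes([0, len(ctx)]) + ctx + msg  # "pure" ML-DSA: the message itself

def prehash_mprime(msg: bytes, ctx: bytes = b"") -> bytes:
    assert len(ctx) <= 255
    ph = hashlib.sha512(msg).digest()        # PH(M): hash once, up front
    return bytes([1, len(ctx)]) + ctx + SHA512_OID + ph  # HashML-DSA: a digest

assert pure_mprime(b"hello")[0] == 0
assert prehash_mprime(b"hello")[0] == 1
```

so "doing its own hashing" isn't a property of ML-DSA; it's a separate framing with a separate separator byte.
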

CRYPTOGRAPHICALLY RELEVANT

These algorithms are used to support the key compression optimization of ML-DSA.

"compression"

if you could "compress" it, it wouldn't be "encrypted"

OPTIMIZATION!!!!!!!

They involve dropping the 𝑑 low-order bits of each coefficient of the polynomial vector 𝐭 from the public key using the function Power2Round.

sure that's fine they're lower order right that's fine right

However, in order to make this optimization work,

that's what i wanna hear

additional information called a “hint” needs to be provided in the signature

i dare you to use a single normal name

to allow the verifier to reconstruct enough of the information in the dropped public-key bits to verify the signature.

hm is that what the verifier is doing? reconstructing information? tbh don't care
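
the bit-dropping itself is at least simple. Power2Round as i read FIPS 204 (their d is 13): peel off d low-order bits as a centered remainder t0, publish only the high part t1:

```python
# sketch of Power2Round per my reading of FIPS 204: split a coefficient
# r into r1 (published high bits) and r0 (dropped low bits), where r0 is
# the centered residue in (-2**(d-1), 2**(d-1)]
Q, D = 8380417, 13  # the actual ML-DSA modulus and bit-drop count

def power2round(r: int, d: int = D) -> tuple[int, int]:
    r = r % Q
    r0 = r % (1 << d)
    if r0 > (1 << (d - 1)):   # recentre into (-2**(d-1), 2**(d-1)]
        r0 -= 1 << d
    return (r - r0) >> d, r0  # (r1, r0) with r == r1 * 2**d + r0

r1, r0 = power2round(123456)
assert r1 * (1 << D) + r0 == 123456
assert -(1 << (D - 1)) < r0 <= (1 << (D - 1))
```

the "hint" exists because the verifier only ever sees r1 and has to patch up the carry the missing r0 would have caused.
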

This standard makes use of the functions SHAKE256 and SHAKE128, as defined in FIPS 202 [7]. While FIPS 202 specifies these functions as inputting and outputting bit strings, most implementations treat inputs and outputs as byte strings.

honey fips 202 is not the part you need to cite

literally

Using this approach, security strength is not described by a single number, such as “128 bits of security.”

this is gonna be good

Instead, each ML-DSA parameter set is claimed to be at least as secure as a generic block cipher with a prescribed key size or a generic hash function with a prescribed output length.

you fucking prick that's the same thing fuck you

YES!!!! YES!!!!

More precisely, it is claimed that the computational resources needed to break ML-DSA are greater than or equal to the computational resources needed to break the block cipher or hash function when these computational resources are estimated using any realistic model of computation.

yeah see it's the quantum factor

oh they set themselves up so hard

Different models of computation can be more or less realistic and, accordingly, lead to more or less accurate estimates of security strength.

nodding yes jack nicholson gif

zero normal names

For the default “hedged” version of ML-DSA signing

shut the fuck up you're not cute

for the optional deterministic variant,

that's definitely a line to only write out in the pseudocode of algorithm 2 and nowhere else. bet you wish you could do that to my code huh
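
and as far as i can tell the whole hedged/deterministic distinction is one variable: hedged draws 32 fresh random bytes per signature, deterministic pins them to zero, and the per-signature seed downstream is derived from that plus the message representative. a sketch of just that choice (function name is mine):

```python
import secrets

# the hedged-vs-deterministic fork in ML-DSA.Sign, as i read FIPS 204:
# rnd is 32 fresh random bytes (default, "hedged") or 32 zero bytes
# (the optional deterministic variant). everything else is identical.
def signing_rnd(deterministic: bool = False) -> bytes:
    return bytes(32) if deterministic else secrets.token_bytes(32)

assert signing_rnd(deterministic=True) == b"\x00" * 32
assert len(signing_rnd()) == 32
```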