🚨 fedi entities: NIST has a glossary entry for you

An individual person, organization, device, or process. Used interchangeably with party.

🥳

oh my god yes this definition of equivalence

Two processes are equivalent if the same output is produced when the same values are input to each process

ok, cool

(either as input parameters, as values made available during the process, or both).

literally what the fuck

A function on bit strings in which the output can be extended to any desired length.

any length. ok

as long as the specified output length is sufficiently long to prevent trivial attacks

wouldn't really say any
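to be fair to the glossary, the "any desired length" part is real and you can poke it yourself: SHAKE256 is a stream, so a short output is literally a prefix of a longer one from the same input. a minimal sketch with Python's stdlib:

```python
import hashlib

# A XOF lets you request however many output bytes you want; shorter
# outputs are prefixes of longer ones from the same input.
msg = b"fedi entities"
short = hashlib.shake_256(msg).digest(16)   # 16 bytes
long = hashlib.shake_256(msg).digest(64)    # 64 bytes

assert long[:16] == short   # "any length" -- it's one stream
assert len(long) == 64
```

(the "sufficiently long to prevent trivial attacks" caveat is exactly why you don't get to ask for, like, 4 bytes and call it a hash.)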

oh lmao and they're like "it's one-way", a definition that can only be disproven. collision-resistant: yes it has a meaning. but we're past the c entries. you're too late honey!
is a hash value a message digest?
a hash value is much broader than a message digest
only glossary entry that's a redirect
bro they have shall and should in bold. it's so over

oh my GOD

trusted third party (TTP)
An entity other than the key pair owner and the verifier that is trusted by the owner, the verifier, or both. Sometimes shortened to “trusted party.”

is it? is it sometimes shortened to that?

signature validation
The mathematical verification of the digital signature along with obtaining the appropriate assurances (e.g., public-key validity, private-key possession, etc.).

LMAO literally "the math part plus the government surveillance"

tagging the astral engineer who wrote this because pgp keys were not susceptible to his little wikihow tutorial on cryptographic surveillance https://blog.yossarian.net/2023/05/21/PGP-signatures-on-PyPI-worse-than-useless
PGP signatures on PyPI: worse than useless

PyPI has done everything right, including purposely removing frontend support for PGP years ago.

literally intended to make you think an Event had happened. nope!

Type casting. The variable x is assigned a value in a set S that is constructed from the value of an expression y in a possibly different set T. The set T and the mapping from T to S are not explicitly specified, but they should be obvious from the context in which this statement appears.

this is like. a math notation? "should be obvious from the context" honey no you need to tell me about this hash scheme and shut up

The Number Theoretic Transform (NTT) is a specific isomorphism between the rings R_q and T_q.

that is nippon telephone and teletype

The motivation for using NTT is that multiplication is considerably faster in the ring T_q

that's good. i was worried there was any history of ulterior motives for NIST choosing a specific set of parameters to construct a ring
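for the record, the isomorphism itself is not spooky: T_q is just "the polynomial evaluated at every root of x^n + 1", and multiplication over there is pointwise. a toy sketch with a deliberately tiny ring (q=257, n=8, and a naive O(n^2) evaluation instead of the butterfly NTT) rather than ML-DSA's actual q=8380417, n=256:

```python
# Toy demo of the NTT isomorphism R_q -> T_q. Parameters are illustrative,
# NOT ML-DSA's; the map is done as naive evaluation, not the fast NTT.
q, n = 257, 8
psi = 249  # primitive 2n-th root of unity mod q, i.e. psi**n % q == q - 1

def poly_mul(a, b):
    """Schoolbook multiplication in R_q = Z_q[x]/(x^n + 1)."""
    c = [0] * (2 * n)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] = (c[i + j] + ai * bj) % q
    # reduce mod x^n + 1, using x^n == -1
    return [(c[i] - c[i + n]) % q for i in range(n)]

def ntt(f):
    """Evaluate f at the odd powers of psi (the roots of x^n + 1)."""
    return [sum(fj * pow(psi, (2 * i + 1) * j, q) for j, fj in enumerate(f)) % q
            for i in range(n)]

a = [1, 2, 3, 4, 0, 0, 0, 0]
b = [5, 6, 7, 8, 0, 0, 0, 0]
# multiplication in T_q is pointwise -- that's the entire speedup
assert ntt(poly_mul(a, b)) == [(x * y) % q for x, y in zip(ntt(a), ntt(b))]
```

n^2 coefficient products become n pointwise products once you're on the T_q side, which is the whole "considerably faster" claim.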

they mention things that make absolutely no sense and next line oh yeah this symbol means composition of matrix multiplication
i still hate the capital R_m used for the ring of single-variable polynomials. hated it when dan jackoff bernstein used it too
oh great the pdf links mysteriously fail at hashml-dsa.sign yesssssss and for the verify phase
algorithms "4" and "5"
YOU CAN'T BE FUCKING SERIOUS

In the HashML-DSA version, the message input to ML-DSA.Sign_internal is the result of applying either a hash function or a XOF to the content to be signed.

so the red-hat guy making insane crypto changes is also just saying insane shit about the crypto that people can't call him out on. ml-dsa does not "do its own hashing"

For some cryptographic modules that generate ML-DSA signatures, hashing the message in step 6 of ML-DSA.Sign_internal may result in unacceptable performance if the message M is large.

this fucking high school science project is safe against quantum computers which not only will exist, but could even exist RIGHT NOW!

which is why we are doing this in the open together

come on man don't do this

For some use cases, this may be addressed by signing a digest of the message along with some domain separation information rather than signing the message directly. This version of ML-DSA is known as “pre-hash” ML-DSA or HashML-DSA. In general, the “pure” ML-DSA version is preferred.
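and the "domain separation information" is concrete: as I read FIPS 204's framing, pure and pre-hash modes prepend different one-byte separators (and pre-hash additionally binds the hash function's DER OID), so the two can never collide. a sketch of just that framing, with the actual lattice machinery of ML-DSA.Sign_internal left out:

```python
import hashlib

# Sketch of the pure vs. pre-hash message framing per FIPS 204.
# Only the M' construction is shown; ML-DSA.Sign_internal is elsewhere.
SHA512_OID_DER = bytes.fromhex("0609608648016503040203")  # id-sha512 (assumed DER)

def pure_mprime(msg: bytes, ctx: bytes = b"") -> bytes:
    assert len(ctx) <= 255
    return bytes([0, len(ctx)]) + ctx + msg  # domain separator 0 = "pure"

def prehash_mprime(msg: bytes, ctx: bytes = b"") -> bytes:
    assert len(ctx) <= 255
    ph = hashlib.sha512(msg).digest()        # PH(M): hash the content first
    return bytes([1, len(ctx)]) + ctx + SHA512_OID_DER + ph  # separator 1 = "pre-hash"

big = b"x" * 10_000_000
# the pre-hash framing is fixed-size no matter how large M is -- that is
# the entire "unacceptable performance" escape hatch
assert len(prehash_mprime(big)) == 2 + len(SHA512_OID_DER) + 64
assert pure_mprime(big)[0] == 0 and prehash_mprime(big)[0] == 1
```

so no, ml-dsa does not "do its own hashing" of your content in pure mode — the separator byte is the only thing distinguishing the variants at the framing layer.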

CRYPTOGRAPHICALLY RELEVANT

These algorithms are used to support the key compression optimization of ML-DSA.

"compression"

if you could "compress" it, it wouldn't be "encrypted"

OPTIMIZATION!!!!!!!

They involve dropping the d low-order bits of each coefficient of the polynomial vector t from the public key using the function Power2Round.

sure that's fine they're lower order right that's fine right

However, in order to make this optimization work,

that's what i wanna hear

additional information called a “hint” needs to be provided in the signature

i dare you to use a single normal name

to allow the verifier to reconstruct enough of the information in the dropped public-key bits to verify the signature.

hm is that what the verifier is doing? reconstructing information? tbh don't care
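ok for my own sanity, the bit-dropping part at least is dead simple — Power2Round as FIPS 204 specifies it is just a centered split of each coefficient into high and low parts, sketched here with ML-DSA's actual q and d:

```python
# Sketch of Power2Round per FIPS 204, with ML-DSA's real parameters.
Q, D = 8380417, 13

def mod_pm(x: int, m: int) -> int:
    """Centered residue: result in (-m/2, m/2]."""
    r = x % m
    return r - m if r > m // 2 else r

def power2round(r: int) -> tuple[int, int]:
    r = r % Q
    r0 = mod_pm(r, 1 << D)       # the d low-order bits that get dropped...
    return (r - r0) >> D, r0     # ...and the high part kept in the public key

r = 1234567
r1, r0 = power2round(r)
# the signer publishes only r1; the "hint" in the signature is what lets
# the verifier compensate for the missing r0 bits
assert (r1 << D) + r0 == r % Q
assert -(1 << (D - 1)) < r0 <= (1 << (D - 1))
```

the "hint" machinery (MakeHint/UseHint) is the hairy part; this is only the compression half.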

This standard makes use of the functions SHAKE256 and SHAKE128, as defined in FIPS 202 [7]. While FIPS 202 specifies these functions as inputting and outputting bit strings, most implementations treat inputs and outputs as byte strings.

honey fips 202 is not the part you need to cite

literally

Using this approach, security strength is not described by a single number, such as “128 bits of security.”

this is gonna be good

Instead, each ML-DSA parameter set is claimed to be at least as secure as a generic block cipher with a prescribed key size or a generic hash function with a prescribed output length.

you fucking prick that's the same thing fuck you

YES!!!! YES!!!!

More precisely, it is claimed that the computational resources needed to break ML-DSA are greater than or equal to the computational resources needed to break the block cipher or hash function when these computational resources are estimated using any realistic model of computation.

yeah see it's the quantum factor

oh they set themselves up so hard

Different models of computation can be more or less realistic and, accordingly, lead to more or less accurate estimates of security strength.

nodding yes jack nicholson gif

zero normal names

For the default “hedged” version of ML-DSA signing

shut the fuck up you're not cute

for the optional deterministic variant,

that's definitely a line to only write out in the pseudocode of algorithm 2 and nowhere else. bet you wish you could do that to my code huh
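and the entire hedged-vs-deterministic difference really is one line: whether the 32-byte rnd mixed into the per-signature seed is fresh randomness or all zeros. a sketch of just that seed derivation (K and mu here are placeholder stand-ins for the signer's secret seed and the message representative, not the full FIPS 204 derivation chain):

```python
import hashlib
import secrets

# Sketch of the one line where "hedged" and deterministic ML-DSA signing
# differ (per FIPS 204): the 32-byte rnd fed into the per-signature seed.
def rho_prime(K: bytes, mu: bytes, deterministic: bool = False) -> bytes:
    rnd = bytes(32) if deterministic else secrets.token_bytes(32)
    return hashlib.shake_256(K + rnd + mu).digest(64)

K = bytes(32)                                 # placeholder secret seed
mu = hashlib.shake_256(b"msg").digest(64)     # placeholder message digest
assert rho_prime(K, mu, deterministic=True) == rho_prime(K, mu, deterministic=True)
assert rho_prime(K, mu) != rho_prime(K, mu)   # hedged: fresh randomness each time
```

hedging means a fault or bad RNG doesn't silently turn every signature on the same message into the same nonce — which, given history, is a reasonable thing to not leave to pseudocode footnotes.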

hey just so it's clear all of linux-crypto and some guy from cloudflare signed off on the c++ rewrite guy modifying the entire cryptographic implementation because of shit he made up

bro oh my god
https://git.kernel.org/pub/scm/linux/kernel/git/dhowells/linux-fs.git/commit/?h=keys-pqc&id=2c62068ac86bdd917a12eef49ba82ec8b091208b

It may be that this is inadvisable.

you can't say that man. your code is on my laptop and you are actively obfuscating it in linux-fs

x509: Separately calculate sha256 for blacklist - kernel/git/dhowells/linux-fs.git - VFS and Filesystem bits

Note that whilst ML-DSA does allow for an "external mu", CMS doesn't yet have that standardised.

hey is it good when the IETF refers to the CMS and doesn't link it

Errata Exist

oh thanks

This is because in most situations, CMS signatures are computed over a set of signed attributes that contain a hash of the content, rather than being computed over the message content itself.

see the thing about the IETF? they never let me down

ML-DSA signature generation and verification is significantly faster than SLH-DSA.

no clue what SLH is but it must be a new form of quantum

Federal Information Processing Standard (FIPS) 205, Stateless Hash-Based Digital Signature Standard

This standard specifies the stateless hash-based digital signature algorithm (SLH-DSA). Digital signatures are used to detect unauthorized modifications to data and to authenticate the identity of the signatory. In addition, the recipient of signed data can use a digital signature as evidence in demonstrating to a third party that the signature was, in fact, generated by the claimed signatory. This is known as non-repudiation since the signatory cannot easily repudiate the signature at a later time. SLH-DSA is based on SPHINCS+, which was selected for standardization as part of the NIST Post-Quantum Cryptography Standardization process.


@hko i was wondering why i had never ever heard that word that sounds like aliens making fun of english "repudiation"

This is known as non-repudiation since the signatory cannot easily repudiate the signature at a later time.

you know what? i can't repudiate that

@hko NO WAY IT'S LITERALLY JUST A SIGNATURE

@hipsterelectron oh yes, ml-dsa and slh-dsa are both signing algorithms.
The latter is ridiculously heavy.

In the OpenPGP pqc draft, the first of the two is only allowed as one part of a "composite" signature that includes a traditional signature (e.g. ml-dsa + ed25519)

@hko i quite like composite keys! kyber is neat and while the nist reference implementation had that hilarious miscompile in 2023-2024 nist will do anything to harm good crypto
@hko from reading the table of contents this sounds like "what if we did tree-based hashing and made it bad"
@hipsterelectron i have not tried to understand the innards of the algorithms so far 😬 (I've just used them via libraries)
@hko i don't tend to waste my time on obfuscated work