The official LineageOS for microG project (https://lineage.microg.org/) leaked the private keys used to log into their servers and to sign releases:

https://github.com/lineageos4microg/l4m-wiki/wiki/December-2025-security-issues

We make our official builds on local machines, and our signing machine's keys are never stored unencrypted.

Our roadmap for improving the security of update verification is based on taking advantage of reproducible builds. We plan to have multiple official build locations and a configurable signoff verification system in the update clients, also usable with third-party signoff providers.
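The signoff scheme described above could be sketched as an N-of-M quorum check: the update client accepts a release only if enough independent build locations attest to the same image hash. This is a hypothetical illustration, not their implementation; the signer names are made up, and HMAC stands in for real public-key signatures purely to keep the sketch self-contained.

```python
import hashlib
import hmac

# Illustrative signer names and (stand-in) keys; a real client would
# hold public keys for each official build location and any configured
# third-party signoff providers.
SIGNERS = {
    "build-location-1": b"key-1",
    "build-location-2": b"key-2",
    "third-party": b"key-3p",
}
QUORUM = 2  # require 2 of 3 independent signoffs

def attest(signer: str, image: bytes) -> bytes:
    """Produce a signoff over the image hash (stand-in for a signature)."""
    digest = hashlib.sha256(image).digest()
    return hmac.new(SIGNERS[signer], digest, hashlib.sha256).digest()

def quorum_met(image: bytes, signoffs: dict[str, bytes]) -> bool:
    """Count signoffs that verify against this image; compare to quorum."""
    digest = hashlib.sha256(image).digest()
    valid = sum(
        1 for signer, sig in signoffs.items()
        if signer in SIGNERS
        and hmac.compare_digest(
            hmac.new(SIGNERS[signer], digest, hashlib.sha256).digest(), sig)
    )
    return valid >= QUORUM

image = b"ota-update-payload"
sigs = {s: attest(s, image) for s in ("build-location-1", "third-party")}
print(quorum_met(image, sigs))  # two valid signoffs meet the quorum
```

Because the builds are reproducible, each location can independently produce the identical image and sign the same hash, so a single compromised build machine can no longer ship a malicious update on its own.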
We don't have faith in any available commercial HSM products being more secure than keeping keys encrypted at rest on the primary local build machine. Instead, we're planning to develop software for using the secure element on GrapheneOS phones as an HSM for signing our releases.

@GrapheneOS would you be willing to elaborate on your position on commercial HSM products? Does this include products like Yubikey?

(Read: I use a Yubikey to decrypt my LUKS drive and now I'm wondering if I made a huge mistake.)

@Canine3625 YubiKeys don't support firmware updates at all. Instead of having verified boot with cryptographic signature verification and downgrade protection, along with update signing and downgrade protection, their firmware simply cannot be updated. They cannot fix security flaws in their products beyond releasing new ones. Many of their products have been found to be vulnerable to firmware- or hardware-level exploits, and they don't even have the option of releasing patches for those issues.
@Canine3625 We have very little faith in the security of products from Yubico and other HSM vendors. We would consider it to be a downgrade from having the keys stored encrypted at rest on the primary build machine. The primary build machine being exploited would already be a disaster and needs to be protected against. Signing keys aren't actually super special compared to the security of the builds. We don't want to create a weak point for signing keys through an attempt at improving things.
@GrapheneOS @Canine3625 The main use for a hardware token I could think of would be to make it impossible for the key to be leaked somewhere by accident or in backups.

@GrapheneOS @Canine3625 For that purpose, you don’t need it to be more secure, or even for the key to be unexportable. The only requirement is that the key not be accessible through normal filesystem operations, so that things like a full filesystem backup don’t expose it.

The idea I had was not to mitigate a build machine compromise. It was to prevent the key from being leaked from an uncompromised build machine due to human error.

@GrapheneOS Makes sense, thank you for the answer.

(I had an exchange with them about the inability to update firmware for a device with a vulnerability and they told me to kick rocks. I probably should've taken it as a red flag.)

@GrapheneOS @Canine3625 Why do they design their products to have non-updateable firmware? It seems to be standard in that industry, but I don’t understand why.
@alwayscurious @GrapheneOS The logic is essentially "firmware updates are an attack vector." Which is true, but there are mitigations (signature verification, downgrade protection, e-fuses), and the consequence of this design is that all vulnerabilities are permanent.
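The mitigations listed above can be sketched in a few lines: an update is accepted only if it verifies against the vendor key and its version is not below the device's monotonic rollback counter (the role an e-fuse plays in hardware). This is a hypothetical sketch, not any vendor's code; HMAC stands in for a real signature scheme to keep it self-contained.

```python
import hashlib
import hmac

VENDOR_KEY = b"illustrative-vendor-key"  # stands in for the vendor's signing key

def sign(version: int, payload: bytes) -> bytes:
    """Vendor-side: sign the firmware version plus payload hash."""
    msg = version.to_bytes(4, "big") + hashlib.sha256(payload).digest()
    return hmac.new(VENDOR_KEY, msg, hashlib.sha256).digest()

class Device:
    def __init__(self) -> None:
        self.rollback_counter = 0  # monotonic, e-fuse-like: never decreases

    def apply_update(self, version: int, payload: bytes, sig: bytes) -> bool:
        msg = version.to_bytes(4, "big") + hashlib.sha256(payload).digest()
        expected = hmac.new(VENDOR_KEY, msg, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, sig):
            return False  # signature verification failed
        if version < self.rollback_counter:
            return False  # downgrade protection: reject older firmware
        self.rollback_counter = version  # record the new minimum version
        return True

d = Device()
print(d.apply_update(2, b"fw-v2", sign(2, b"fw-v2")))  # True: valid update
print(d.apply_update(1, b"fw-v1", sign(1, b"fw-v1")))  # False: downgrade
```

With these two checks in place, an attacker needs the vendor's signing key *and* a version number at or above the counter, so updatable firmware doesn't have to mean a wide-open attack vector.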
@Canine3625 @GrapheneOS I did get a clue from someone on Reddit who claimed to be an employee of theirs. They stated that they had customers in “government and defense” and that this made for a different threat model than for a company like Ledger. This makes me wonder if they are concerned about a hostile intelligence agency using physical violence to coerce one of their developers into releasing a malicious update.
@Canine3625 @GrapheneOS As an aside, I think Ledger hardware wallets are probably some of the more secure HSMs one can buy.

@alwayscurious @Canine3625 Trezor Safe 7 includes both their own open source secure element and an additional proprietary one where an attacker would need to compromise both:

https://trezor.io/trezor-safe-7

SLIP39 is a very nice way to handle deriving keys with backups, but it isn't really what we want for the release signing keys for the OS and apps. We want the flexibility to use a different approach rather than being locked into SLIP39 key derivation, among other things.


@alwayscurious @Canine3625 We use per-device keys which means natural key rotation over time with new device generations. We don't want to give that up and it doesn't really fit into this approach where the whole point is using a single seed that's backed up securely for everything.
@GrapheneOS Why not sign with ssh and just add an additional check to update_engine?
@alwayscurious @Canine3625 They could make a setting people can use to burn an efuse disabling updates, or sell variants with it already burned. They don't need to cripple the security of all of their devices in order to do it. They could also use a similar insider attack protection approach as the Pixel secure elements have since the Pixel 2, where user authentication is required before firmware updates will be accepted. We're definitely not interested in an HSM without security updates.
@GrapheneOS @Canine3625 My understanding is that their large customers, as well as the customers of the secure element vendors who make products for bank cards and passports, prefer immutable firmware. I thought https://www.bunniestudios.com/blog/2022/towards-a-more-open-secure-element-chip/ had relevant info, but I could not find anything there.

@alwayscurious Pixels aren't even vulnerable to this type of attack. You must unlock the device to upgrade Titan M firmware.