The official microG OS project (https://lineage.microg.org/) leaked their private keys for logging into their servers and signing releases:

https://github.com/lineageos4microg/l4m-wiki/wiki/December-2025-security-issues

We make our official builds on local machines. Our signing machine's keys are never written to storage unencrypted.

Our roadmap for improving security of verifying updates is based on taking advantage of the reproducible builds. We plan to have multiple official build locations and a configurable signoff verification system in the update clients also usable with third party signoff providers.
We don't have faith in any available commercial HSM products being more secure than keeping keys encrypted at rest on the primary local build machine. Instead, we're planning to develop software for using the secure element on GrapheneOS phones as an HSM for signing our releases.
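To make the signoff idea concrete, here is a rough shell sketch of the kind of quorum check an update client could perform over reproducible-build signoffs. Everything in it (file names, RSA demo keys, the threshold of 2) is an illustrative assumption, not the planned GrapheneOS design:

```shell
#!/bin/sh
# Hypothetical quorum check over reproducible-build signoffs.
# Each signoff party rebuilds the release, confirms it matches,
# and signs the artifact's hash; the updater requires a threshold
# of valid signatures before accepting the update.
set -eu

WORK=$(mktemp -d)
cd "$WORK"
mkdir signoff-keys signoffs

# Stand-in release artifact and the hash the parties sign.
printf 'pretend release artifact' > release.zip
openssl dgst -sha256 -r release.zip | awk '{print $1}' > release.sha256

# Demo only: two signoff parties generate keys and sign the hash.
for party in alpha beta; do
  openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 \
    -out "$party.key" 2>/dev/null
  openssl pkey -in "$party.key" -pubout -out "signoff-keys/$party.pub"
  openssl dgst -sha256 -sign "$party.key" \
    -out "signoffs/$party.sig" release.sha256
done

# Updater side: count valid signoffs and enforce a quorum.
THRESHOLD=2
valid=0
for pub in signoff-keys/*.pub; do
  party=$(basename "$pub" .pub)
  if openssl dgst -sha256 -verify "$pub" \
       -signature "signoffs/$party.sig" release.sha256 >/dev/null 2>&1; then
    valid=$((valid + 1))
  fi
done

if [ "$valid" -ge "$THRESHOLD" ]; then
  echo "quorum reached: $valid valid signoffs"
else
  echo "insufficient signoffs" >&2
  exit 1
fi
```

A real client would ship a pinned set of trusted signoff keys and let users configure which parties, and how many of them, are required.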

@GrapheneOS would you be willing to elaborate on your position on commercial HSM products? Does this include products like Yubikey?

(Read: I use a Yubikey to decrypt my LUKS drive and now I'm wondering if I made a huge mistake.)

@Canine3625 YubiKeys don't support firmware updates at all. Instead of pairing verified boot (cryptographic signature verification with downgrade protection) with signed, downgrade-protected updates, their firmware simply cannot be changed. They cannot fix security flaws in existing products beyond releasing new ones. Many of their products have been found vulnerable to firmware- or hardware-level exploits, and they don't even have the option of shipping patches for those issues.
@Canine3625 We have very little faith in the security of products from Yubico and other HSM vendors. We would consider it to be a downgrade from having the keys stored encrypted at rest on the primary build machine. The primary build machine being exploited would already be a disaster and needs to be protected against. Signing keys aren't actually super special compared to the security of the builds. We don't want to create a weak point for signing keys through an attempt at improving things.
@GrapheneOS @Canine3625 The main use for a hardware token I could think of would be to make it impossible for the key to be leaked somewhere by accident or in backups.

@GrapheneOS @Canine3625 For that purpose, you don’t need it to be more secure, or even for the key to be unexportable. The only requirement is that the key not be accessible through normal filesystem operations, so that things like a full filesystem backup don’t expose it.

The idea I had was not to mitigate a build machine compromise. It was to prevent the key from being leaked from an uncompromised build machine due to human error.

@GrapheneOS Makes sense, thank you for the answer.

(I had an exchange with them about the inability to update firmware for a device with a vulnerability and they told me to kick rocks. I probably should've taken it as a red flag.)

@GrapheneOS @Canine3625 Why do they design their products to have non-updateable firmware? It seems to be standard in that industry, but I don’t understand why.
@alwayscurious @GrapheneOS The logic is essentially "firmware updates are an attack vector." Which is true, but there are mitigations (signature verification, downgrade protection, e-fuses), and the consequence to this is all vulnerabilities are permanent.
@Canine3625 @GrapheneOS I did get a clue from someone on Reddit who claimed to be an employee of theirs. They stated that they had customers in “government and defense” and that this made for a different threat model than for a company like Ledger. This makes me wonder if they are concerned about a hostile intelligence agency using physical violence to coerce one of their developers into releasing a malicious update.
@Canine3625 @GrapheneOS As an aside, I think Ledger hardware wallets are probably some of the more secure HSMs one can buy.

@alwayscurious @Canine3625 Trezor Safe 7 includes both their own open source secure element and an additional proprietary one, so an attacker would need to compromise both:

https://trezor.io/trezor-safe-7

SLIP39 is a very nice way to handle deriving keys with backups but it isn't really what we want for the release signing keys for the OS and apps. We want the flexibility to be able to use a different approach rather than being locked into the SLIP39 key derivation approach among other things.


@alwayscurious @Canine3625 We use per-device keys which means natural key rotation over time with new device generations. We don't want to give that up and it doesn't really fit into this approach where the whole point is using a single seed that's backed up securely for everything.
@GrapheneOS Why not sign with ssh and just add an additional check to update_engine?
@alwayscurious @Canine3625 They could make a setting people can use to burn an efuse disabling updates, or sell variants with it already burned. They don't need to cripple the security of all of their devices in order to do that. They could also use a similar insider attack protection approach to the Pixel secure elements since the Pixel 2, where user authentication is required before firmware updates are accepted. We're definitely not interested in an HSM without security updates.
@GrapheneOS @Canine3625 My understanding is that their large customers, as well as the customers of the secure element vendors who make products for bank cards and passports, prefer immutable firmware. I thought https://www.bunniestudios.com/blog/2022/towards-a-more-open-secure-element-chip/ had relevant info, but I could not find anything there.

@alwayscurious Pixels aren't even vulnerable to this type of attack. You must unlock the device before Titan M firmware upgrades are accepted.
@GrapheneOS
Having worked with "real" HSMs before (Gemalto, Safenet, Thales, FIPS 140)... yeah. They're just Linux boxes with a bunch of stuff turned off, and a button on the back to quick erase it. They've all merged down to two companies last I looked, and fired everyone who understood secure development.
@GrapheneOS Cool to see this level of thought and discussion. But there's a lot of nuance and specifics being left unsaid that are likely to confuse many, IMVHO.
@GrapheneOS what do you use to encrypt keys at rest and make them available when needed?
@cedric_cvl Standard OpenSSL scrypt + AES key encryption, with a noswap-configured tmpfs for using the decrypted keys, where they're wiped after usage. It's not anything complex. It avoids the unencrypted keys ever touching storage, so they're at rest whenever builds aren't being signed. Otherwise they wouldn't be at rest much, since these machines are rarely turned off, and disk encryption alone wouldn't be enough protection, especially since desktops aren't nearly as secure as the phones.
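For readers unfamiliar with this kind of setup, here's a minimal sketch of one way to do it with stock OpenSSL: `pkcs8 -scrypt` produces a scrypt+AES-encrypted PKCS#8 blob, which is the only form of the key that ever touches disk. The paths, demo passphrase, and use of a plain temp dir in place of a real noswap tmpfs mount are illustrative assumptions, not the project's actual scripts:

```shell
#!/bin/sh
# Hypothetical sketch of signing keys encrypted at rest with stock
# OpenSSL: only a scrypt+AES-encrypted PKCS#8 blob ever touches disk,
# and the key is decrypted into memory-backed storage just long
# enough to sign a release.
set -eu
WORK=$(mktemp -d)
cd "$WORK"
PASS="pass:demo-passphrase"   # prompted interactively in practice

# One-time setup: generate a demo key, keep only the encrypted form.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 \
  -out plain.key 2>/dev/null
openssl pkcs8 -topk8 -scrypt -in plain.key \
  -passout "$PASS" -out releasekey.enc
rm plain.key

# Signing time: decrypt into a temp dir standing in here for a
# noswap tmpfs mount (a dedicated tmpfs in the real setup).
KEYDIR=$(mktemp -d)
openssl pkcs8 -in releasekey.enc -passin "$PASS" \
  -out "$KEYDIR/releasekey.pem"
# ... sign the release with "$KEYDIR/releasekey.pem" ...
rm -rf "$KEYDIR"              # wipe the decrypted key after use
```

`openssl pkcs8 -scrypt` defaults to AES-256-CBC for the key encryption; without `-topk8` the same command reverses the operation, decrypting the PKCS#8 blob back to a usable private key.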
@GrapheneOS Shouldn’t signing be separate from the build machine since compilation brings its own attack vectors? (Although the transmission would also add different attack vectors)

@TheAlgorythm No. Android's signing system trusts the output of the build, so the only way to secure things further is running the builds themselves in a container or VM with restricted access.

Android Open Source Project has an upstream container system for the build process although the intended purpose is not security. The purpose of their build sandbox is avoiding build determinism issues and also avoiding accidental damage being done to the host system or account.

@GrapheneOS Well yes. But it was not about trust in the binary produced by the compiler; there is not much you can do to increase that. It was about (malicious) side effects the compilation process could have on the build system, although the likelihood should be low.

@GrapheneOS

Hi,

This is a new account, so I wasn’t sure where to ask this.

I noticed the NFC APK signing key was only recently added to the build scripts (a7b69d9). From what I understand, it was previously signed with the AOSP public test key.

Since those keys are publicly available, wouldn’t that be a security issue for a system component like NFC?

Was this communicated somewhere and I missed it? Should users update ASAP?

https://github.com/GrapheneOS/script/commit/a7b69d9

https://github.com/GrapheneOS/grapheneos.org/commit/ecc26ba

Thanks.


@Occasion_Antique GrapheneOS blocks system app updates not done via OS updates or App Store. Android has a recent protection which stops this from being a serious issue itself too.

Handling the changes adding new keys for signing components within APEX modules was included in the release notes for 2026021200 and documented on the build page:

https://grapheneos.org/releases#2026021200

> update release signing to handle AOSP APEX changes

We also made changes to prevent further added keys in AOSP regressing it.


@GrapheneOS

Thanks for the clarification.

I was just worried after noticing the change.

You mentioned GrapheneOS blocks system app updates not done via OS updates or the App Store. Could you point to where this is enforced (code or docs)?

If someone got ADB access, would `adb install` still be blocked from updating a system component like the NFC APK?

A link would be helpful.

Thanks.

@Occasion_Antique Why would you be worried about this of all things that you could worry about? There are Critical and High severity issues being fixed on a regular basis. Why worry about a lightly privileged app?

> If someone got ADB access, would `adb install` still be blocked from updating a system component like the NFC APK?

No, but then they already have far more privileges than this app. Android began blocking apps like this from hijacking privileged permissions via shared UID.

@Occasion_Antique

https://github.com/GrapheneOS/platform_frameworks_base/commit/64789c4939e92f99a0c7a41c1d8165364cdfa35c

It's only possible to update system components via App Store or ADB.

Android doesn't have verified boot for system APK updates but rather that's a feature added by GrapheneOS, so it only weakened a protection we added rather than a standard one.

https://github.com/GrapheneOS/platform_frameworks_base/blob/8209fe0ab30eaf097c380bb373bc6433b29c3556/services/core/java/com/android/server/pm/InstallPackageHelper.java#L5041

This is a standard protection which prevents it from becoming far more privileged.

It was a minor issue and won't happen again when they quietly add more within-APEX APK keys again.


@GrapheneOS

Thanks, so it's safe to use any recent build, right? I noticed the NFC key has been in AOSP for a while, more than 2 years.

Is there a reason this wasn't handled earlier and only addressed recently, even if it's a minor issue?

@Occasion_Antique

This was never a practical issue. It's strange that you keep questioning whether releases are safe to use due to a minor issue which was present for a few months and addressed a month ago.

We've been signing the NFC APEX with the main releasekey since it was introduced and it was still being properly signed. We didn't need to make a dedicated key for it until recently.

This only recently became an issue after Android 16. We now have detection for future AOSP key changes.

@Occasion_Antique Look at our release signing script which you linked. We use releasekey for signing nearly all of the APEX components because it doesn't provide any additional privileges. On the stock OS, they make separate keys for each one because they update them out-of-band across multiple devices. If we ever want to start doing out-of-band APEX updates we could do the same but it has the downside of no longer getting easy automatic key rotation without explicitly choosing to do it.

@GrapheneOS

Since the NFC key has existed for a while, what changed in Android 16 that made this start being treated as an issue?

I’m curious about the technical reason for the delay in enforcing the dedicated key. Was this intentional, or was it simply missed until the changes in Android 16?

Reposted, as I see my messages are locked.

@Occasion_Antique We were always signing the NFC APEX for as long as it existed, but there's an APK inside of the APEX which needs to be signed separately. We already explained why it happened: a deficiency in the AOSP signing scripts where they don't error out for missing keys, which is what we want them to do. We've addressed that so the issue can't happen again. We already block updating system apps without a first party source or ADB, which along with an AOSP protection made this unimportant.

@GrapheneOS

I appreciate the precautions already in place to prevent such APKs from being installed accidentally. Was it mainly missed because the AOSP signing script didn’t error when the key was missing and the build completed successfully?

I noticed Bluetooth was handled earlier and NFC was only added in the next Android version. How was Bluetooth noticed but not NFC?

https://github.com/GrapheneOS/platform_build/commit/c70f942

I'm not trying to raise an issue or controversy. I’m just trying to understand how it was missed.

@Occasion_Antique We were always signing the NFC APEX from the beginning. There's an APK inside of the NFC APEX which started no longer being signed with releasekey but rather the nfc key we weren't using. That's what caused the issue. AOSP signing scripts don't error out if keys are missing and we've now addressed that with our own checks for similar issues. There was never a case of the NFC APEX not being signed, it was signed with releasekey by us. An APK inside was being special cased.

@GrapheneOS

Also I saw the release notes, but the NFC signing change isn’t mentioned specifically. Is there a reason it wasn’t listed?

Thanks

@Occasion_Antique That's one part of the change we listed. We determined it wasn't a serious issue and couldn't think of any relevant attack vector beyond a minor reduction in our added verified boot security. There are other cases where our added security features were found to have holes in them which we resolved. There's a fix for one of them in the latest release via closing an upstream hole in the INTERNET permission which impacts Network.

The NFC app is not the more privileged NFC code.

@Occasion_Antique There are Critical and High severity issues being fixed upstream on a regular basis. We also often have to patch issues we find in the AOSP code ourselves.

The NFC APEX itself was properly signed already prior to this. This was about a specific APK inside of the NFC APEX which they started signing with a new key. The way the signing scripts work doesn't result in them failing if more keys get added but not replaced which is something we've addressed for them now.

@GrapheneOS so are you building on diversified machines in multiple jurisdictions and enforcing a quorum to release?
@GrapheneOS Do you use an hardware token? Some projects do and some don’t. I’m curious what the reasoning is.
@alwayscurious We definitely don't trust any of the available HSM products and have far more confidence in keeping signing keys secure by having them stored encrypted at rest on the primary build machine. However, we do plan to improve things by having a reproducible build signoff system with configurable signoff parties along with using Pixels with GrapheneOS to provide an HSM via the secure element. We need to develop an app to run on GrapheneOS and a client side part for the build machine.
@GrapheneOS @alwayscurious How do reproducible builds help here?
GrapheneOS (@[email protected])

Our roadmap for improving security of verifying updates is based on taking advantage of the reproducible builds. We plan to have multiple official build locations and a configurable signoff verification system in the update clients also usable with third party signoff providers.

GrapheneOS Mastodon