The PIN is the important part there; encrypted sessions (and/or EK cert verification) without a PIN are not much more than obfuscation, and are defeated by both the interposer attack and the tweezer attack. (Or the TPM hack to rule them all: desoldering the chip and connecting it to a microcontroller you control.)
I suppose a PIN is a slight improvement over a regular password, but a big appeal of TPM FDE, in my opinion, is unattended unlock.
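To make the trade-off concrete, here is a toy Python sketch of the sealing idea (nothing here is a real TPM API; all names and the XOR "encryption" are made up for illustration): a secret is bound to a measured-boot digest *and* a PIN, so an attacker who can replay or spoof the PCR state on the bus still cannot unseal without the PIN.

```python
import hashlib
import hmac

# Toy model, not a real TPM: a secret is "sealed" to a PCR digest
# plus a PIN. Unsealing requires both, so faking the PCR state alone
# (interposer / tweezer attack) is not enough. Secrets up to 64 bytes.

def seal(secret: bytes, pcr_digest: bytes, pin: bytes) -> dict:
    key = hashlib.sha256(pcr_digest + pin).digest()
    # XOR keystream keeps the toy short; a real TPM wraps with its SRK.
    blob = bytes(a ^ b for a, b in zip(secret, (key * 2)[:len(secret)]))
    return {"blob": blob, "mac": hmac.new(key, blob, hashlib.sha256).digest()}

def unseal(sealed: dict, pcr_digest: bytes, pin: bytes) -> bytes:
    key = hashlib.sha256(pcr_digest + pin).digest()
    if not hmac.compare_digest(
            hmac.new(key, sealed["blob"], hashlib.sha256).digest(),
            sealed["mac"]):
        raise PermissionError("wrong PCR state or PIN")
    return bytes(a ^ b for a, b in
                 zip(sealed["blob"], (key * 2)[:len(sealed["blob"])]))
```

Unattended unlock is exactly the case where `pin` is empty, which is why it only resists software attacks, not a physical attacker on the bus.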
I think discrete TPMs don't really have a future in systems that need robust system state attestation (both local and remote) against attackers with physical access. TPMs should be integrated into the CPU/SoC to defend against such attacks.
> discrete TPMs don't really have a future in systems that need robust system state attestation (both local and remote) against attackers with physical access. TPMs should be integrated into the CPU/SoC
What are your thoughts on Microsoft Pluton and Google OpenTitan as TPM alternatives/emulators?
Should system attestation roots of trust be based on open-source firmware?
Recent AI/Copilot PCs based on Qualcomm SDXE/Oryon/Nuvia, AMD Zen5 and Intel Lunar Lake include Microsoft Pluton.
> What are your thoughts on Microsoft Pluton and Google OpenTitan as TPM alternatives/emulators?
I am not familiar enough with the technical details of Pluton or OpenTitan to make a meaningful statement on their security.
> Should system attestation roots of trust be based on open-source firmware?
Yes, and not only roots of trust; I am a strong believer in open-source firmware in general. I have been developing coreboot as a hobby for a long time. I wish there were more industry support for such things, especially at the lowest levels of modern systems.
> encrypted sessions (and/or EK cert verification) without PIN are not much more than obfuscation
this is completely incorrect; encrypted sessions defeat TPM interposers when there is a factory burned-in, processor-side secret to use. lol at calling it just "obfuscation" when the attack means spending $5m to decap the chip, fetch the key, and then put the processor back into working order.
that just requires a vertically integrated device instead of a consumer part-swappable PC.
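A toy Python sketch of what a salted/bound session buys you (this is not the real TPM 2.0 wire format; the key derivation and stream cipher are stand-ins for illustration): both dies derive a per-session key from a shared factory secret that never crosses the bus, so an interposer sees only nonces and ciphertext and cannot forge command MACs.

```python
import hashlib
import hmac
import os

# Toy model of an encrypted session: CPU and TPM share a
# factory-provisioned secret that never appears on the bus.
SHARED_SECRET = os.urandom(32)  # burned into both dies at the factory

def session_key(nonce_cpu: bytes, nonce_tpm: bytes) -> bytes:
    # Both ends derive the same per-session key from fresh nonces
    # exchanged over the (tapped) bus.
    return hmac.new(SHARED_SECRET, nonce_cpu + nonce_tpm,
                    hashlib.sha256).digest()

def protect(key: bytes, command: bytes) -> tuple[bytes, bytes]:
    # XOR keystream + HMAC stands in for TPM 2.0's CFB parameter
    # encryption and session HMAC. Commands up to 256 bytes here.
    stream = hashlib.sha256(key + b"enc").digest()
    ct = bytes(a ^ b for a, b in zip(command, (stream * 8)[:len(command)]))
    return ct, hmac.new(key, ct, hashlib.sha256).digest()

def unprotect(key: bytes, ct: bytes, mac: bytes) -> bytes:
    if not hmac.compare_digest(
            hmac.new(key, ct, hashlib.sha256).digest(), mac):
        raise ValueError("tampered on the bus")
    stream = hashlib.sha256(key + b"enc").digest()
    return bytes(a ^ b for a, b in zip(ct, (stream * 8)[:len(ct)]))
```

The whole argument upthread is about where `SHARED_SECRET` lives: with a discrete TPM it has to be provisioned into the host side somewhere an attacker with physical access cannot read it.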
What you are saying is sound, and I agree it could be done.
But there are multiple caveats:
- How do you hide the secret so that only "legitimate" operating systems can use it for establishing their sessions and not "Mate's bootleg totally not malware live USB"?
- And unfortunately current CPUs don't implement this.
- Additionally, don't be so smug as to think you need to decap a CPU to extract on-die secrets. Fault injection attacks are very effective and hard to defend against.
I agree the security of this can be somewhat improved, but if you are building a custom CPU anyhow, you might as well move the TPM on-die and avoid this problem entirely.
before the popularity of ARM SoCs that contain everything on-die, there were far fewer choices for vertically integrated devices. it's a different segment.
if you look at apple's vertically integrated devices, they originally chose a cryptography coprocessor that was not on-die. with a key accessible only to the trusted execution environments of both pieces of silicon, rather than to the operating system directly, encrypted comms are established in much the same fashion as the TPM 2.0 proposal.
>robust system state attestation (both local and remote) against attackers with physical access
Phrases like this give me the shivers, as it translates into "mandatory surveillance by some authority telling me what I can and can't do with my computer".
TPM is an evil concept. Physical access should be final.
That "attestation" in the full disk encryption case means your disk encryption key is available only to the operating system you chose to install. And it denies a laptop thief the ability to change that.
Or remote attestation can be used to restrict access to a corporate network to corporate-controlled devices only. No one surveils you, or has access to your device, in this scenario either; the TPM there is used to produce a certificate of the device state that can effectively act as an access credential for a resource.
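A toy Python sketch of that challenge-response flow (not a real attestation protocol; an HMAC stands in for the attestation-key signature over a real TPM quote, and all names are invented): the server sends a fresh nonce, the device returns a quote over its PCR digest plus the nonce, and access is granted only if the signature verifies and the measured state matches policy.

```python
import hashlib
import hmac
import os

# Toy remote-attestation flow. HMAC with a shared attestation key
# stands in for the asymmetric AK signature of a real TPM quote.
AK = os.urandom(32)  # attestation key the server trusts
GOOD_PCRS = hashlib.sha256(b"known-good boot chain").digest()

def quote(pcr_digest: bytes, nonce: bytes) -> bytes:
    # Device side: the "TPM" authenticates its measured state plus
    # the server's nonce (the nonce prevents replaying old quotes).
    return hmac.new(AK, pcr_digest + nonce, hashlib.sha256).digest()

def grant_access(pcr_digest: bytes, nonce: bytes, sig: bytes) -> bool:
    # Server side: verify the quote, then check the state against policy.
    expected = hmac.new(AK, pcr_digest + nonce, hashlib.sha256).digest()
    return (hmac.compare_digest(sig, expected)
            and pcr_digest == GOOD_PCRS)
```

Note the policy check is entirely server-side: nothing in this flow reads anything off the device beyond the digest it chose to report.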
This is about recognising the fact that the person in physical possession of a device isn't necessarily the legitimate owner.
I get the reaction, but what about the trust factor of a box you own and have running on the other side of the world? TPM isn't an evil concept; it's fairly useful in some scenarios. Coercion to use TPMs, now that sounds evil.
>So I get my hands on your laptop for a few minutes, there should be nothing you can do to impede me from doing whatever I want to it?
Correct. This is true of all my other possessions as well.
Ultimately, the physical hardware of the computer cannot tell the difference between a legitimate user and an illegitimate one. The distinction is social, not mathematical - the kind of thing one might litigate in court, rather than by multiplying some large primes together. Technologically enforcing the concept of ownership over an object implies the construction of a parallel, extra-legal system of rights management, with some final higher authority that is neither you nor in all likelihood your government. Here's how that plays out: yes, you paid for the computer, yes, you "legally" own it, but you did something to it that Microsoft doesn't approve of and so we're afraid it doesn't work anymore. Might makes right. Too bad!
Problem is that the BMC and the BIOS/UEFI and every component talking to the TPM all need to store one (or more) public keys for it (and the corresponding templates and/or save files) in order to set up encrypted sessions to the TPM.
BitLocker is the implementation traditionally susceptible to this attack, but for that I'll just defer to Chris Fenner.
https://www.dlp.rip/tpm-genie