pol, drm 

remote attestation goes against software freedom and shouldn't ever be accepted

it's also not a security feature, it's just drm. the actual security feature is verified boot, and that's only a security feature if the user can install their own keys, otherwise it's just a tool for oppression

pol, drm 

@lumi

I agree on undesirability, but want to point out two situations where it can be a security feature (if there weren't a stream of exploits against the variants I know of): selling computational power without the ability to see what the buyer is doing, and a poor man's cloneable HSM with potentially much more complicated logic inside (which would export keys, but only to instances of itself running somewhere else).

pol, drm 

@robryk i believe in the first one it just can't give you high enough assurances, and the failure modes are quite disastrous. as you state in your post, there is a stream of exploits against them, and i feel this is because it's trying to solve an inherently impossible task: determining what's running on a remote host

at least, i believe it is impossible in the general case

you can see the first situation and safetynet as duals: who is verifying and who is being verified is what's switched

because they're trying to do the fundamentally impossible thing of telling what's running on a remote host, it fails in both cases

but for safetynet this doesn't matter, because your freedom is severely infringed either way (drm always fails in this way, all of it is defective by design)

and for the case you're talking about, it does matter, because it's just broken

as for the second part, as i understand it, this is a use case for secure enclaves, TPMs and such, not remote attestation? unless you mean this HSM is running on a remote host? in which case, refer to the response for the first situation

for what it's worth, i do think things like secure enclaves can be valuable and can have a purpose (i hope this won't come bite me in the ass in the future :neocat_3c:), but remote attestation has no place in computing


pol, drm 

@lumi

The second one is worse than the first, because the second has larger vulnerability windows (potentially even unboundedly large).

Yeah, the first situation is basically the same thing as SafetyNet, but with the socially important properties switched around.

In the second case I mean an HSM that's running somewhere (and has standard rules for being asked to perform operations with keys it stored, no attestation involved there), but can migrate/duplicate itself to a different host (so that it doesn't die if the host dies). That requires some reason to trust that whoever we ship off a copy of all our secrets to will behave just as we behave (or rather, whoever has the session key that will decrypt the secrets we're shipping out).
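A minimal sketch of that export gate, with a made-up HMAC-based "quote" standing in for a vendor-signed attestation (real schemes such as SGX or SEV-SNP use asymmetric signatures chained to the manufacturer; every name here is invented for illustration):

```python
import hashlib
import hmac
import os

# Stand-in for the CPU vendor's attestation root. In reality this would be
# an asymmetric signing key held by the manufacturer, not a shared secret.
MANUFACTURER_KEY = os.urandom(32)

def make_quote(measurement: bytes, session_pubkey: bytes) -> bytes:
    """Vendor-rooted 'quote' binding a software measurement to a session key."""
    return hmac.new(MANUFACTURER_KEY, measurement + session_pubkey,
                    hashlib.sha256).digest()

def should_export(my_measurement: bytes, peer_measurement: bytes,
                  peer_pubkey: bytes, peer_quote: bytes) -> bool:
    """Export secrets only if the peer attests as running *our own* code."""
    expected = make_quote(peer_measurement, peer_pubkey)
    return (hmac.compare_digest(expected, peer_quote)
            and peer_measurement == my_measurement)

# A copy of ourselves passes the gate; anything else is refused.
code = hashlib.sha256(b"hsm-v1").digest()
peer_key = os.urandom(32)
assert should_export(code, code, peer_key, make_quote(code, peer_key))

other = hashlib.sha256(b"something-else").digest()
assert not should_export(code, other, peer_key, make_quote(other, peer_key))
```

The point of the sketch is only the gating logic: the secrets get encrypted to `peer_pubkey` only after the quote checks out and the measurement equals our own.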

pol, drm 

@robryk from a failure mode perspective, yeah, remote attestation where the remote server is the verified one has a much worse failure mode

from a perspective of power, the safetynet case is way worse, because when trying to figure out what software a remote server runs, the user's freedom is not affected. in the case of safetynet, the user's freedom is severely compromised

in the HSM case, well, that's the thing, you need to trust whoever you ship it off to, attestation can't make it trustworthy

for something like this, maybe homomorphic encryption is a better route to take?

pol, drm 

@lumi

> from a failure mode perspective, yeah, remote attestation where the remote server is the verified one has a much worse failure mode

Sorry, I think I wasn't clear. I meant that my example number two (HSM) has worse failure modes than my example number one (buying computational resources). I agree completely that it'd be better if we didn't have remote attestation.

> in the HSM case, well, that's the thing, you need to trust whoever you ship it off to, attestation can't make it trustworthy

If attestation works, you can surely ship it off only to someone who can attest to being an enclave running the same software as you? (More precisely, only encrypt it to public keys such that you trust the corresponding private key is only present in an enclave running a copy of yourself.)
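A toy sketch of that last step, wrapping the secret to the peer's session key once the attestation check has passed (the Diffie-Hellman group and the XOR "wrap" are illustrative placeholders, not real cryptography):

```python
import hashlib
import os

# Toy DH group: a Mersenne prime modulus and small generator.
# These are NOT secure parameters; they only make the sketch runnable.
P = (2 ** 127) - 1
G = 5

def keygen():
    """Generate a session keypair inside each enclave instance."""
    priv = 1 + int.from_bytes(os.urandom(16), "big") % (P - 2)
    return priv, pow(G, priv, P)

def wrap(secret: bytes, my_priv: int, peer_pub: int) -> bytes:
    """XOR the secret with a pad derived from the DH shared value.

    Applying wrap() again with the other side's keys unwraps it.
    Secrets are limited to 32 bytes in this toy (one SHA-256 pad).
    """
    shared = pow(peer_pub, my_priv, P).to_bytes(16, "big")
    pad = hashlib.sha256(shared).digest()[: len(secret)]
    return bytes(a ^ b for a, b in zip(secret, pad))

# Sender encrypts to the peer's (attested) session public key;
# the peer recovers the secret with its private key.
a_priv, a_pub = keygen()
b_priv, b_pub = keygen()
secret = b"root key bits"
ciphertext = wrap(secret, a_priv, b_pub)
assert wrap(ciphertext, b_priv, a_pub) == secret
```

The crucial property from the thread lives outside this snippet: you only run `wrap()` against a `peer_pub` that an attestation quote has bound to an enclave running your own measurement.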

pol, drm 

@robryk well that's the thing: i don't think it can realistically work. if you want to do this, i think you just need to trust whoever you send it to

pol, drm 

@lumi

Is that because you need to trust the manufacturer of the CPUs anyway (since they are able to create fake attestations), because tamper-resistant hardware is basically impossible, or for some other reasons?

pol, drm 

@robryk i think the big ones here are how much you need to trust the manufacturer, and the risk of key compromise

pol, drm 

@lumi

BTW I don't really see a reason to use enclaves without remote attestation (apart from weird implementation details that cause them to be better than just virtualisation at combating side channel bugs in CPUs).

pol, drm 

@robryk i think having them on your own cpu, to keep them isolated from the rest of your system, could potentially be useful

but yeah, this is just me going off some vague feelings, and all of this is complex. just having an external hsm is likely much safer by virtue of it being much simpler

Qoto Mastodon