pol, drm
I agree on undesirability, but want to point out two situations where it could be a security feature (were it not for the stream of exploits against the variants I know of): selling computational power without the ability to see what the buyer is doing, and a poor man's cloneable HSM with potentially much more complicated logic inside (one that would export keys, but only to instances of itself running elsewhere).
The second one is worse than the first, because it has larger vulnerability windows (potentially even unboundedly large).
Yeah, the first situation is basically the same thing as SafetyNet, but with the socially important properties switched around.
In the second case I mean an HSM that's running somewhere (and has standard rules for being asked to perform operations with the keys it stores; no attestation involved there), but can migrate/duplicate itself to a different host (so that it doesn't die if the host dies). That requires some reason to trust that whoever we ship a copy of all our secrets off to will behave just as we do (or rather, whoever holds the session key that will decrypt the secrets we're shipping out).
> from a failure mode perspective, yeah, remote attestation where the remote server is the verified one has a much worse failure mode
Sorry, I think I wasn't clear. I meant that my example number two (HSM) has worse failure modes than my example number one (buying computational resources). I agree completely that it'd be better if we didn't have remote attestation.
> in the HSM case, well, that's the thing, you need to trust whoever you ship it off to, attestation can't make it trustworthy
If attestation works, you can surely ship it off only to someone who can attest to being an enclave that runs the same software as you? (More precisely, only encrypt it to public keys such that you trust the corresponding private key is only present in an enclave running a copy of yourself.)
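To make the "only export to a clone of yourself" rule concrete, here's a minimal sketch of the gate an enclave might apply before releasing its secrets. All the names and structures here are illustrative assumptions (real enclaves like SGX or SEV have their own attestation report formats), and an HMAC with a shared key stands in for the manufacturer's attestation signature, which is a deliberate simplification:

```python
# Hypothetical sketch of the "only export keys to a clone of yourself" rule.
# All names here are illustrative; real attestation (SGX/SEV) uses signed
# report structures from the CPU. HMAC with a shared key stands in for the
# manufacturer's attestation signature (a simplification for the demo).
import hashlib
import hmac

MANUFACTURER_KEY = b"cpu-vendor-attestation-key"  # trusted root (assumption)

def measurement(code: bytes) -> bytes:
    """Hash of the enclave's code, as the attestation would report it."""
    return hashlib.sha256(code).digest()

def make_report(code: bytes, enclave_pubkey: bytes) -> dict:
    """The 'CPU' signs (measurement, pubkey), binding the key to the code."""
    m = measurement(code)
    sig = hmac.new(MANUFACTURER_KEY, m + enclave_pubkey, hashlib.sha256).digest()
    return {"measurement": m, "pubkey": enclave_pubkey, "sig": sig}

def may_export_to(report: dict, own_code: bytes) -> bool:
    """Release secrets only if the report is genuine AND the remote enclave
    runs exactly the same code as we do."""
    expected = hmac.new(MANUFACTURER_KEY,
                        report["measurement"] + report["pubkey"],
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, report["sig"]):
        return False  # forged or tampered report
    return report["measurement"] == measurement(own_code)

my_code = b"hsm-v1"
good = make_report(my_code, b"clone-pubkey")      # a genuine clone of us
evil = make_report(b"keylogger-v1", b"attacker")  # attested, but different code
print(may_export_to(good, my_code))  # True
print(may_export_to(evil, my_code))  # False
```

Note the two distinct trust assumptions this bakes in: that the manufacturer's signing key is uncompromised, and that nobody can extract a key from inside an enclave — which is exactly where the objections below land.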
@robryk well that's the thing, i don't think it can realistically work. if you want to do this i think you just need to trust whoever you send it to
Because you anyway need to trust the manufacturer of the CPUs (as they are able to create fake attestations), because tamper-resistant hardware is basically impossible, or for some other reasons?
@robryk i think the big ones here are how much you need to trust the manufacturer and key compromise
BTW I don't really see a reason to use enclaves without remote attestation (apart from weird implementation details that make them better than plain virtualisation at combating side-channel bugs in CPUs).
@robryk i think having them on your own cpu, to keep them isolated from the rest of your system, could potentially be useful
but yeah, this is just me going off some vague feelings, and all of this is complex. just having an external hsm is likely much safer just by virtue of it being much simpler
@robryk from a failure mode perspective, yeah, remote attestation where the remote server is the verified one has a much worse failure mode
from a perspective of power, the safetynet case is way worse, because in the case of trying to figure out what software a remote server runs, the user's freedom is not affected. in the case of safetynet, the user's freedom is severely compromised
in the HSM case, well, that's the thing, you need to trust whoever you ship it off to, attestation can't make it trustworthy
for something like this, maybe homomorphic encryption is a better route to take?
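The appeal of homomorphic encryption here is that it removes the need to trust the remote machine at all: the server computes on ciphertexts and never sees the plaintexts. A toy Paillier cryptosystem (additively homomorphic; tiny demo primes, not remotely secure) shows the idea:

```python
# Toy Paillier cryptosystem: the holder of ciphertexts can add the
# underlying plaintexts without being able to read them.
# Tiny primes for demonstration only -- completely insecure.
from math import gcd
import secrets

p, q = 293, 433                                  # demo primes
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)     # lcm(p-1, q-1)

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)              # decryption constant

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

a, b = encrypt(15), encrypt(20)
# Multiplying ciphertexts adds the underlying plaintexts:
print(decrypt((a * b) % n2))  # 35
```

Paillier only gives addition, though; fully homomorphic schemes that support arbitrary computation exist but are orders of magnitude slower than enclaves, which is the usual trade-off between the two approaches.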