France bans Office 365 and Google Workspace for schools and public administration. I can't read the thing but hopefully somewhere it says "BECAUSE CLOUD IS STUPID STUPID STUPID FOR THINGS LIKE THIS"
@jrm4 It’s mostly about overcollecting personal data (telemetry) and taking it outside the jurisdiction of European regulators.
I thought the original reason for this decision was some sort of "can our data be accessed from outside the EU" concern. If so, I'd wager that whatever agreement exists doesn't specify that with enough precision to actually have that effect. (OTOH I do believe that the agreement is probably good enough to ensure that _in the ordinary course of business_ that data will not be accessed from outside the EU, or some similar notion.)
I'm confused. You described why ordinary course of business is the yardstick you want to measure it by, and then said it's not.
By "ordinary course of business" I meant, e.g., assuming that no (or too few) malicious insiders exist. Did you understand it in some other way?
Where I saw you say it's not the yardstick:
> Ordinary course of business is not a relevant criterion here.
Where I thought you described it as the yardstick: you talk about the purposes for which data is used and the ways existing data is used. Well, if data is accessible from someplace, it _can_ be used for any purpose. We're relying on the company doing its business the way it claims to/intends to in order to ensure that the data is used only for some purposes and made accessible only from some places. IOW, technical controls do not understand "purpose", so they can't filter on that, even in principle.
@robryk @jrm4 If the data is held in, say, Ireland, the Irish government is still bound by the European Convention on Human Rights if it demands the data for, say, prosecution of a crime or for intelligence purposes. That is a safeguard that is in place for access that is not ordinary, but can and will happen. If it is held in the USA, the ECHR no longer applies to the same type of access by the US government. See now why "ordinary course of business" isn't the yardstick here?
What does "held in" mean?
Motivating examples:
- there's a machine in Ireland that contains the data and happily provides it on request to a machine in the US,
- data in encrypted form is in X, and the key is in Y,
- we've split the data into two pieces so that the XOR of the two pieces yields the original data, and each piece alone is uniformly random (a small sketch of this follows below).
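For the third example, here's a minimal sketch of that kind of XOR split (essentially two-party secret sharing); the function names are just illustrative, not from any particular library:

```python
import os

def split_xor(data: bytes) -> tuple[bytes, bytes]:
    """Split `data` into two shares whose XOR is `data`."""
    share_a = os.urandom(len(data))                         # uniformly random share
    share_b = bytes(d ^ a for d, a in zip(data, share_a))   # data XOR share_a
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares back together to recover the data."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

secret = b"some personal data"
a, b = split_xor(secret)
assert recombine(a, b) == secret
# Either share on its own is uniformly random, so it reveals nothing about `secret`.
```

So the question becomes: if one share sits in X and the other in Y, where is the data "held"?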
Thanks.
I'm curious whether you think the software update scenario is similar to scenario 1 or not.
I find your interpretation of scenario 2 weird and I can't really build a model that produces it. Let's imagine that I store some data in location X, but additionally encrypt it before storing with a key that's totally public. ISTM that this interpretation of scenario 2 forces us to claim that the data then resides everywhere. Am I missing something?
@robryk @jrm4 The software update scenario is one that so far has been treated as distinct from providing access. There’s a lot of unexplored territory here, even decades after the 1995 Directive. As for scenario 2: cryptography is never perfect or eternal, so the data is being processed in X. Whoever holds the key can presumably decrypt it, and having access is generally considered equivalent to processing. Though there’s no jurisprudence there either. And you’re not wrong that things …
> cryptography is never perfect or eternal, so the data is being processed in X.
Does this mean that my ISP processes all the data I exchange with any website?
> Whoever holds the key can presumably decrypt it
Whoever has the key *and the data*. Compare the case of "X stores data in plaintext" vs "X encrypts with a key that everyone knows and stores the ciphertext". Doing the latter instead of the former doesn't change who can access the data in any way.
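To make that comparison concrete, here's a minimal sketch (it uses the third-party `cryptography` package purely as an illustration, and the constant name is made up):

```python
from cryptography.fernet import Fernet

# Pretend this key value has been published for the whole world to see.
PUBLICLY_KNOWN_KEY = Fernet.generate_key()

ciphertext = Fernet(PUBLICLY_KNOWN_KEY).encrypt(b"some personal data")

# Anyone who can fetch the ciphertext can also fetch the public key,
# so the set of people who can read the data is exactly the same as
# if it had been stored in plaintext.
assert Fernet(PUBLICLY_KNOWN_KEY).decrypt(ciphertext) == b"some personal data"
```

The encryption step changes nothing about who can access the data; only keeping the key away from some parties would.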
I think these cases get weird even when the personal data isn't public.
@robryk @jrm4 There are at least three issues in play: 1) these solutions gather way more telemetry than can be justified from a data protection perspective, 2) for purposes that are not controlled by the user organisations, and 3) the data is possibly transferred to third countries (non-EEA) with insufficient checks and balances on their state security apparatuses' surveillance. Ordinary course of business is not a relevant criterion here.