MtG Phoenix edition: your identity is in Apple's compote
Hello dear reader,
It's been a while since you've received our "weekly" content update. I take full responsibility for that: there's plenty of relevant privacy news every week, but I underestimated the effort involved in providing background on 3-4 stories each week. So I'll take smaller bites going forward.
"Taking a bite" of a certain fruit is a topic for this edition. Today's focus:
- If you choose an iPhone because Apple touts itself as the most private option, you might have made the wrong choice.
- My co-founder Bart and I share some observations from a compliance conference we attended in London last week.
- We announced our launch in AWS as a packaged install.
Happy reading!
Pim @ STRM
Apple's anonymous analytics is not so anonymous 🍐
The finding exposes the difference between the privacy policy you may think you’re covered under and the privacy policy that’s actually being applied. - The Verge
On to a story that I don't quite understand isn't bigger news: it appears that even the white knights of privacy wear black clothes underneath their marketing armour (hello, DuckDuckGo). Apple is becoming an ad company, and that's just very hard to do on nothing but private data.
Apple has been "caught" using much less anonymous data than its general privacy policy promises. Two nosey iOS security researchers found an unexpected connection between an analytics service identifier and iCloud accounts, linking even "anonymous" App Store data to individual users. If unintentional, it's a profound (architectural) mistake, and the data is sent even if you've opted out of App Store analytics.
The researchers themselves neatly summarise it in this tweet:
Failing to conceal linked identities is one thing. Applying different policies to make it "compliant" is quite another. You can also take a lesson from Google's latest book: Maps just moved from its own subdomain to the google.com root, so allowing location access for google.com in your browser now grants it to all Google services!
Learn more about Apple's compote in this totally unopinionated piece by Security Boulevard:
"Privacy is where security was 15 years ago"
Last week we visited the #RISK conference in London ("Mind the Gap"!).
It was geared more towards risk, compliance and the legal perspective than towards tech. I still find that interesting, because "data privacy" is where data and privacy meet... We had fun!
My main takeaway from the conference: it's genuinely new to many risk professionals that you can actually enforce policies inside the data itself. Bart summarised three more as follows:
1️⃣ Legal is not tech is not legal is not compliance = risk
2️⃣ Tools to the rescue. But for what problem and which persona?
3️⃣ “Privacy is where security was 15 years ago”
Read the full post and join the ongoing discussion on LinkedIn:
Product news: STRM is now privacy infra as a one-click install in AWS
This week's last snack is a release announcement we're pretty proud of: STRM is now available as a packaged install in the AWS Marketplace.
In line with our takeaways from #RISK (see above), we've taken this step because we see a need for practically applicable solutions that go beyond merely "providing an up-to-date view" of data privacy compliance in an organisation.
Native availability (all the way through to billing) is an important step towards removing some of the inherent friction of going privacy-first/by-design (it also keeps all your sensitive data where it already is: inside your own cloud account).
Against that background, I wrote up why practically applicable solutions are a necessary next step in privacy, and how "privacy infra" can help: