MtG Phoenix edition: your identity is in Apple's compote

Photo by Nikolai Chernichenko / Unsplash

Hello dear reader,

It's been a while since you've received our "weekly" content update. I take full blame for that: there's tons of relevant news and developments in privacy every week, but I underestimated the effort involved in providing background on 3-4 of them each week. So, I'll just have to take smaller bites going forward.

"Taking a bite" of a certain fruit is a topic for this edition. Today's focus:

Happy reading!

Pim @ STRM


Apple's anonymous analytics is not so anonymous 🍐

We now know the guy in the ad was just filming everyone instead of hiding his face behind that iPhone.
The finding exposes the difference between the privacy policy you may think you’re covered under and the privacy policy that’s actually being applied. - The Verge

On to a story that I don't quite understand isn't bigger news: it appears that even the white knights of privacy wear black clothes underneath their marketing armour (hello, DuckDuckGo). Apple is becoming an ad company, and that's just very hard to do on nothing but private data.

Apple has been "caught" using much less anonymous data than its general privacy policy promises. Two nosey iOS security researchers found an unexpected connection between an analytics identifier (the DSID, or Directory Services Identifier) and iCloud accounts, linking even "anonymous" App Store data to individual users. That's a profound (architectural) mistake if it was unintended, and it happens even if you've opted out of sharing analytics.
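To make the problem concrete: any "anonymous" event stream that carries a stable, account-linked identifier is only one join away from being personal data. A minimal, hypothetical sketch in Python (the identifiers and payloads are made up; this is not Apple's actual pipeline):

```python
# Hypothetical illustration, NOT Apple's actual pipeline: "anonymous"
# analytics events that carry a stable, account-linked identifier
# (such as a DSID) are trivially joinable to user records.

analytics_events = [
    {"dsid": "12345", "event": "app_store_search", "query": "vpn"},
    {"dsid": "12345", "event": "app_page_view", "app": "some.app.id"},
]

icloud_accounts = {
    "12345": {"name": "Jane Doe", "email": "jane@example.com"},
}

# One dictionary lookup de-anonymises the whole event stream.
for event in analytics_events:
    owner = icloud_accounts.get(event["dsid"])
    if owner:
        print(f"{owner['email']} -> {event['event']}")
```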

The researchers themselves neatly summarise it in this tweet:

Linking identities while claiming not to is one thing. Applying a different policy than the one users think they're covered under to make it "compliant" is quite another. You can also take a lesson out of Google's latest book: Maps just moved from its own subdomain to the google.com root, so if you allow location access for google.com in your browser, every Google service on that origin gets it!
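Why does that domain move matter? Browser permissions such as geolocation are granted per origin (scheme + host + port), not per path or per product. A toy Python sketch of that scoping rule (the grant table is made up for illustration):

```python
# Toy model of origin-scoped browser permissions: the grant attaches
# to the origin (scheme + host + port), not to a path or a product.

from urllib.parse import urlsplit

def origin(url: str) -> str:
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}"

# Suppose the user granted geolocation to google.com at some point.
grants = {"https://www.google.com": {"geolocation"}}

def allowed(url: str, permission: str) -> bool:
    return permission in grants.get(origin(url), set())

# Maps on its own subdomain: a separate origin, no grant.
print(allowed("https://maps.google.com/", "geolocation"))       # False
# Maps under the root domain: inherits the origin's grant...
print(allowed("https://www.google.com/maps", "geolocation"))    # True
# ...and so does every other service living on that origin.
print(allowed("https://www.google.com/search", "geolocation"))  # True
```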

Learn more about Apple's compote in this totally unopinionated piece by Security Boulevard:

iPhone Privacy ‘Lies’ Exposed Again: Apple Analytics not Anonymous
Apple has been caught lying in a privacy policy. So say the now-notorious security researchers at Mysk.

"Privacy is where security was 15 years ago"

Bart: "So, are there any DeLorean's in the room so we can do some time travel?" Picture by Pavel Rosca, http://www.pr-photographer.com/

Last week we visited the #RISK conference in London ("Mind the Gap"!).

It was tuned more to risk, compliance and the legal perspective than to tech. I still find that interesting, because "data privacy" is where data and privacy meet... We had fun!

My main takeaway from the conference: to many risk professionals it's genuinely new that you can align policies inside the data itself (see the sketch after the list below). Bart summarised three more takeaways as follows:

1️⃣ Legal is not tech is not legal is not compliance = risk
2️⃣ Tools to the rescue. But for what problem and which persona?
3️⃣ “Privacy is where security was 15 years ago”
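On that main takeaway, here's a toy sketch of what "policy inside the data" can look like: each field carries its own policy tag, and the policy is applied at read time. The field names and tags are invented for illustration; this is not STRM's actual interface:

```python
# Toy sketch of "policy inside the data": every field carries its own
# policy tag, and reads apply the policy. Names and tags are made up
# for illustration; this is not STRM's actual interface.

import hashlib

record = {
    "email":   {"value": "jane@example.com", "policy": "pseudonymise"},
    "country": {"value": "NL",               "policy": "plain"},
}

def read(record: dict, field: str) -> str:
    entry = record[field]
    if entry["policy"] == "pseudonymise":
        # Return a deterministic pseudonym instead of the raw value.
        return hashlib.sha256(entry["value"].encode()).hexdigest()[:12]
    return entry["value"]

print(read(record, "email"))    # a pseudonym, not the raw address
print(read(record, "country"))  # "NL"
```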

Read along for the full post and ongoing discussions @ LinkedIn:

“Privacy is where security was 15 yrs ago”: my London thoughts on #risk, privacy and data (5 min 🕑)
Our team visited #risk2022 in London with lots of privacy professionals, in-depth panel discussions and many vendors offering tech and tools to mitigate risk. What should you know? 1️⃣ Legal is not tech is not legal is not compliance = risk First and foremost, a lot of legal, compliance, risk and pr…

Product news: STRM is now privacy infra as a one-click install in AWS

This week's last snack is a release announcement we're pretty proud of: we're now available as a packaged install in the AWS Marketplace.

Aligning nicely with the learnings from #RISK (see above), we've taken this step because we see a need for practically applicable solutions that go beyond just "providing an up-to-date view" of data privacy compliance in an org.

Native availability (all the way through to billing) is an important step in removing some of the inherent friction of going privacy-first/by-design (it also keeps all your sensitive data where it already is: inside your cloud account).

Against that background I gave some consideration to why practically applicable solutions are a necessary next step in privacy and how "privacy infra" can help:

STRM in AWS: privacy infra as a one-click install
We’re now available as a packaged install on AWS. Here’s why that’s a necessary step for privacy.

Make your data do what your policies say

Request a demo