I must admit: we skipped a beat. This week's Mind the Gap comes a few days late.
My excuse: I just bought a TV and tried to fix some privacy issues it introduced. That turned out to be quite the task. More on that below.
(Also, the weather was just very nice last week.)
Without further ado, in this fifth episode of Mind the Gap:
- my new Smart-vertising TV is chillingly unprivate by design
- Firefox is adding privacy to its design
- the French DPA is setting an example of privacy reverse-engineering on location data
Happy reading, let me know what you think, and recommend Mind the Gap to get us some fellow subscribers!
I'm just a boy, standing in front of a TV, asking it to not track me.
Last week we bought a new TV. Hardly ever is that worth an item in a newsletter.
But this one is, because geez is it hungry for my data.
Now I'm not a journalist and this is not an investigation. That's up to a DPA.
But I see a few issues.
TVs have become one of the many devices that often represent a 1:1 relationship between a machine and a person, and so are rife with "private" data. That's a great opportunity for manufacturers to design and build their products privacy-first (if they care about data privacy beyond privacywashing).
And so this new TV of ours has a "privacy centre" where I can "easily manage" my privacy choices to "protect my privacy". Hey, that's promising!
Delivering on the promise: protecting advertiser interests
But when diving into those choices, my enthusiasm waned: in the Privacy Centre you need to scroll endless (and I mean endless) lists of ad network parties to deselect one-by-one. I wasn't the only one to notice, and this video/tweet captures it better than the quick shots I took.
Just looking at it you understand it's virtually impossible to not give up your privacy and data to the manufacturer and its advertising buddies:
Consent to all is easy, opting out is hard. Data acquisition purposes are sketchy at best (third-party data acquisition as legitimate use?). Under which logic is it absolutely "necessary" to enable cross-device tracking and device identifiers? Also, why does a car manufacturer need my TV data!?
Even I couldn't go through the effort to "manage my privacy choices".
UX design for lean-modality platforms like a TV is hard, but this clearly protects advertisers' interests over the user's. After all, a "decline all" button isn't any harder to design or build than the "accept all" opposite.
One way to achieve privacy by design is through design. Perhaps the intersection of Privacy + UX should be taught in design schools?
Firefox introduces Total Cookie Protection for everyone
Total Cookie Protection creates a separate cookie jar for each website you visit.
Leaving the TV behind, we switch to that other window into the world of media consumption: the browser.
Now, being the geek I am, I find the tech behind cross-device linking very interesting (whether deterministic, such as through account IDs, or probabilistic, such as fingerprint estimations). Cookies were invented as an elegant, useful and lightweight way to store some state on a local machine. But their use has expanded far beyond that original intent, and as a consumer you don't stand a chance against invasive tracking practices. Real (practical!) solutions are most probably found at a different control point from legislation or the individual: in the layer between you and the data acquirer, like the OS or the browser.
And that's what Firefox, being a browser, did with a strikingly elegant solution: Total Cookie Protection. In a nutshell, they give each website (not just each session!) their own little jar to place their cookies, so any linking or recognition is confined to that website.
This approach strikes the balance between eliminating the worst privacy properties of third-party cookies – in particular the ability to track you – and allowing those cookies to fulfill their less invasive use cases (e.g. to provide accurate analytics)
Now, that's privacy by design.
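The jar-per-site idea is simple enough to sketch in a few lines. This is an illustrative model only (the class and names below are made up, not Firefox internals): cookies are keyed by the pair (top-level site, cookie origin) instead of the cookie origin alone, so a tracker embedded on two sites ends up with two unlinkable jars.

```python
# Minimal sketch of per-site cookie partitioning, in the spirit of
# Total Cookie Protection. Illustrative only; not Firefox's actual code.

class PartitionedCookieStore:
    """Cookies are keyed by (top_level_site, cookie_origin) instead of
    just cookie_origin, so a third party embedded on two different
    sites gets two separate, unlinkable jars."""

    def __init__(self):
        # (top_level_site, cookie_origin) -> {cookie_name: value}
        self.jars = {}

    def set_cookie(self, top_level_site, cookie_origin, name, value):
        jar = self.jars.setdefault((top_level_site, cookie_origin), {})
        jar[name] = value

    def get_cookie(self, top_level_site, cookie_origin, name):
        return self.jars.get((top_level_site, cookie_origin), {}).get(name)


store = PartitionedCookieStore()
# tracker.example sets an ID while embedded on site A...
store.set_cookie("siteA.example", "tracker.example", "uid", "abc123")
# ...but when embedded on site B, it finds an empty jar: no cross-site link.
assert store.get_cookie("siteB.example", "tracker.example", "uid") is None
```

The cookie still works normally within each site (sessions, analytics), which is exactly the balance the Firefox approach aims for.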
What is Privacy Engineering?
the full potential of privacy engineering will only be unleashed when it is aimed at building trustworthy systems just as much, if not more, as at building systems that are “just” compliant.
When starting STRM, we never anticipated the amount of education we would need to do on the domain we're in. It's necessary to build a business out of it, but at the same time being ahead is more about helping others to keep up than to leap further. That's why we're happy to drive some traffic to (other) educators in this domain.
In a series of one - two - three blogposts, Christian Zimmerman explores what "Privacy Engineering" means and where it comes from, mapping the field from definitions all the way to an overview of Privacy Enhancing Technologies, with a discussion of the most common approaches and their pros and cons.
Class starts here:
Privacy Reverse-Engineering: Data Protection Authorities as a Red Team
In cybersecurity, a Red Team is a well-known concept. It basically means you task a team of (ethical) hackers with breaching your systems, so you gain insight into your weak spots before they are exploited.
We're not sure this was what inspired the CNIL to come up with their plan, but there's a clear parallel: acquire data from brokers and actively try to break privacy measures inside it, specifically for location data. And location data has quite some unintended "benefits".
As such, the interesting pattern here is that a regulator is looking to move from reactive to proactive enforcement in a domain it classifies as high-risk (data brokerage for location data).
Is this the first data point in a pattern of regulators moving beyond (establishing) paper realities and into data realities?
And... that's it for this week! We're onto deploying that Data Plane of ours into some cloud marketplaces for your convenience and will be doing some brainstorming on how to bring our Platform into your Process.