Apple clearly has power, but it isn’t accountable

By John Naughton

The only body that has, to date, been able to exert real control over the data-tracking industry is a giant private company which itself is subject to serious concerns about its monopolistic behaviour. Where is democracy in all this?

A few weeks ago, Apple dropped its long-promised bombshell on the data-tracking industry.

The latest version (14.5) of iOS — the operating system of the iPhone — included a provision requiring apps to obtain users’ explicit confirmation that they wished to have their online activities tracked across the internet.

At the heart of the switch is a code known as the “Identifier for Advertisers”, or IDFA. It turns out that every iPhone comes with one of these identifiers, the object of which is to provide participants in the hidden real-time bidding system with data about the user’s interests.
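To make the mechanism concrete, here is an illustrative Python sketch (not Apple’s or any ad exchange’s actual code) of where a device identifier like the IDFA sits in an OpenRTB-style bid request, and what changes when a user declines tracking: Apple then hands apps an all-zero identifier, severing the link between the request and the user’s profile. The request structure and field values are invented for illustration; only the “device.ifa” naming follows the OpenRTB convention.

```python
# Illustrative sketch only: a simplified, OpenRTB-style bid request showing
# where a device identifier such as the IDFA would sit. All values invented.

ZEROED_IDFA = "00000000-0000-0000-0000-000000000000"

def build_bid_request(idfa: str, tracking_allowed: bool) -> dict:
    """Assemble a minimal bid request for the real-time bidding system.

    When the user has declined tracking (as the iOS 14.5 prompt allows),
    Apple returns an all-zero identifier to apps, so bidders can no longer
    key this request to a profile of the user's interests.
    """
    ifa = idfa if tracking_allowed else ZEROED_IDFA
    return {
        "id": "req-001",                           # invented request id
        "device": {"os": "iOS", "ifa": ifa},       # the identifier bidders key on
        "imp": [{"banner": {"w": 320, "h": 50}}],  # one ad slot up for auction
    }

opted_in = build_bid_request("6D92078A-8246-4BA4-AE5B-76104861E7DC", True)
opted_out = build_bid_request("6D92078A-8246-4BA4-AE5B-76104861E7DC", False)
```

The point of the sketch is simply that the opt-out does not stop bid requests flowing; it strips out the stable identifier that made those requests valuable to trackers.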

For years, iPhone users had had the option to switch it off by digging into the privacy settings of their devices; but, because they’re human, very few had bothered to do that.

From 14.5 onwards, however, they couldn’t avoid making a decision, and you didn’t have to be a Nobel laureate to guess that most iPhone users would opt out.

Which explains why those who profit from the data-tracking racket had for months been terminally anxious about Apple’s perfidy.

Some of the defensive PR mounted on their behalf — for example Facebook’s weeping about the impact on small, defenceless businesses — defied parody.

Other counter-offensives included attacks on Apple’s monopolistic control over its Apps store, plus charges of rank hypocrisy – that changes in version 14.5 were not motivated by Apple’s concerns for users’ privacy but by its own plans to enter the advertising business. And so on.

It’ll be a while before we know for sure whether the apocalyptic fears of the data-trackers were justified.

It takes time for most iPhone users to install operating system updates, and so these are still relatively early days. But the first figures are promising. One data-analytics company, for example, has found that in the early weeks the daily opt-out rate for American users has been around 94 per cent.

This is much higher than surveys conducted in the run-up to the change had suggested — one had estimated an opt-out rate closer to 60 per cent.

If the opt-out rate is as high as we’ve seen so far, then it’s bad news for the data-tracking racket and good news for humanity. And if you think that description of what the Financial Times estimates to be a $350B industry is unduly harsh, then a glance at a dictionary may be helpful.

Merriam-Webster, for example, defines ‘racket’ as “a fraudulent scheme, enterprise, or activity” or “a usually illegitimate enterprise made workable by bribery or intimidation”.

It’s not clear whether the computerised, high-speed auction system in which online ads are traded benefits from ‘bribery or intimidation’, but it is certainly illegal — and currently unregulated.

That is the conclusion of a remarkable recent investigation by two legal scholars, Michael Veale and Frederik Zuiderveen Borgesius, who set out to examine whether this ‘real-time bidding’ (RTB) system conforms to European data-protection law.

They asked whether RTB complies with three rules of the GDPR (General Data Protection Regulation) — the requirement for a legal basis, transparency, and security. They showed that for each of the requirements, most RTB practices do not comply. “Indeed”, they wrote, “it seems close to impossible to make RTB comply”. So, they concluded, it needs to be regulated.

It does.

Often the problem with tech regulation is that our legal systems need to be overhauled to deal with digital technology. But the irony in this particular case is that there’s no need for such an overhaul: Europe already has the law in place.

It’s the GDPR, which is part of the legal code of every EU country and has provision for swingeing punishments of infringers. The problem is it’s not being effectively enforced.

Why not? The answer is that the EU delegates regulatory power to the relevant institutions — in this case Data Protection Authorities — of its member states. And these local outfits are overwhelmed by the scale of the task – and are lamentably under-resourced for it.

Half of Europe’s DPAs have only five technical experts or fewer. And the Irish Data Protection Commission, on whose patch most of the tech giants have their European HQs, has the heaviest enforcement workload in Europe and is clearly swamped.

So here’s where we are: an illegal online system has been running wild for years, generating billions of profits for its participants.

We have evidence of its illegitimacy, and a powerful law on the statute book which in principle could bring it under control — but which we appear unable to enforce.

And the only body that has, to date, been able to exert real control over the aforementioned racket is… a giant private company which itself is subject to serious concerns about its monopolistic behaviour. And the question for today: where is democracy in all this? You only have to ask to know the answer.


A version of this post appeared in The Observer on 23 May, 2021.

Clubhouse in China shows that even “harmless” apps may put individuals in harm’s way

by Alina Utrata

Up until last week, most people hadn’t heard of the voice chatroom app Clubhouse — popular mostly with LA celebrities and Silicon Valley elites (the venture capital firm Andreessen Horowitz has invested tens of millions of dollars in the platform). The app gave off such an air of exclusivity because not just anyone could join: new users needed an invite from a current user in order to sign up. Clubhouse’s growth has accelerated recently, especially after Elon Musk tweeted about it, with some invites being auctioned off for up to $125. But the recent uptick in user growth has underscored a privacy nightmare that is almost certainly putting Clubhouse users and non-users at risk — and in some cases even putting them in danger.

The problem centers on Clubhouse invites. In order to invite someone onto the platform, you must allow the app access to your contact list. The app then uploads your entire contact list — and lets you know how many Clubhouse users also have your contacts in their phones. Because of Clubhouse’s aggressive contact-list collection policy (you literally cannot be invited onto the app unless one of your contacts has allowed Clubhouse access to their contacts), the app has the capacity to quickly accumulate a “social graph.”
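As a rough illustration of why this matters, here is a minimal Python sketch (all data and names hypothetical, not Clubhouse’s actual implementation) of how uploaded contact lists can be inverted into an index that tells a service, for any phone number, exactly which of its users hold that number — whether or not the number’s owner has ever joined:

```python
from collections import defaultdict

# Hypothetical sketch of contact-list aggregation. Each user who grants
# contact access uploads their whole address book; the service can then
# invert those lists into a map from phone number -> users who hold it.

uploaded_contacts = {                               # invented example data
    "user_a": {"+15551000", "+15552000", "+15553000"},
    "user_b": {"+15552000", "+15553000"},
    "user_c": {"+15553000"},
}

def build_reverse_index(contacts_by_user):
    """Invert per-user contact lists into number -> set of users holding it."""
    index = defaultdict(set)
    for user, numbers in contacts_by_user.items():
        for number in numbers:
            index[number].add(user)
    return index

index = build_reverse_index(uploaded_contacts)

# "+15553000" has never signed up, yet the service knows three users hold
# that number, and which ones: edges of a social graph around a non-user.
holders_of_non_user = index["+15553000"]
```

The same index is what would let an app tell you that “N users also have this contact”, and it exists entirely without the contact’s knowledge or consent.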

Some, like Will Oremus, have pointed out that this is a little bit “creepy.” You can suddenly see how many people have your therapist (or drug dealer) in their contact list, revealing networks or connections that were previously invisible. This “invite a friend” practice is also, as Alexander Hanff pointed out, almost certainly illegal under GDPR. By providing your contact list to Clubhouse, you have shared your contacts’ personal information with the company without their consent. And while you may know that a friend shared your phone number with Clubhouse if you received an invite, Clubhouse also knows how many of its users have your phone number in their contacts — whether or not you are on the app at all.

For places where GDPR does not apply, it is unclear whether this type of non-consensual data collection is illegal. As a California resident, I have the right to request that companies delete my personal information under California’s Consumer Privacy Act of 2018 — and I wrote to the company to specifically request that Clubhouse delete my phone number that other users have shared. Clubhouse (or its parent company, Alpha Exploration Co) has 45 days to respond or request a 90-day extension.

While I personally find this type of data collection irritating, it may literally be a matter of life or death for others. Just this week, some articles have triumphantly proclaimed that “Clubhouse cracked the Great Firewall.” Chat rooms titled “Does Xinjiang have concentration camps?” and “Friends from Tibet and Xinjiang, we want to invite you over for a chat” appeared on the platform, and were supposedly attended by individuals from mainland China (who downloaded the app via VPN, as Clubhouse is not available in China’s Apple app store). Clubhouse has since been blocked by China’s censors.

However, a report by the Stanford Internet Observatory pointed out that there are major security flaws in the Clubhouse app that almost certainly have put Chinese users and non-users of the app at risk. The SIO report found multiple problems with Clubhouse’s security — including a lack of encryption for sensitive information; the use of back-end infrastructure software from a company located in Shanghai, which may therefore be legally obligated to share information with the Chinese government; and unclear policies surrounding Clubhouse’s storage and retention of chatroom audio. The report noted that “in at least one instance, SIO observed room metadata being relayed to servers we believe to be hosted in the PRC, and audio to servers managed by Chinese entities and distributed around the world via Anycast. It is also likely possible to connect Clubhouse IDs with user profiles.”

Clubhouse has almost certainly put attendees of those discussions who live in or have connections to mainland China at risk of retaliation from the Chinese government. It may also, by implication, have put non-Clubhouse users at risk, depending on how securely the users’ contact list data was stored. Even individuals who did not join the app or participate in the chatrooms may find themselves implicated if their phone number is found in the contact lists of individuals who are now associated with politically sensitive issues by virtue of participating in the Clubhouse chatrooms. 

This issue extends beyond the China context. As Dr Matt Mahmoudi discussed with me on my most recent podcast episode, data collection can be a death sentence. If individuals who work with undocumented immigrants, for instance, join Clubhouse, the network effects of their combined contact lists can reveal the phone numbers, and therefore the identities, of undocumented individuals — and, if shared with or demanded by ICE, lead to deportations.

In the wake of the Stanford Internet Observatory report, Clubhouse has said that it is reviewing its data protection practices. But the fact remains that this comes only after individuals have already been put at risk by Clubhouse’s poor data practices. These issues could easily have been predicted and prevented, underscoring the need for a duty of care and risk assessment for even seemingly “harmless” apps. And — as a rule of thumb — the less data collected, the better.
