Clubhouse in China shows that even “harmless” apps may put individuals in harm’s way

by Alina Utrata

Up until last week, most people hadn’t heard of the voice chatroom app Clubhouse, which was popular mostly with LA celebrities and Silicon Valley elites (the venture capital firm Andreessen Horowitz has invested tens of millions of dollars in the platform). The app gave off such an air of exclusivity because not just anyone could join: new users needed an invite from a current user in order to sign up. Clubhouse’s growth has surged recently, especially after Elon Musk tweeted about it, with some invites being auctioned off for up to $125. But the recent uptick in user growth has underscored a privacy nightmare that is almost certainly putting Clubhouse users and non-users at risk, and in some cases even putting them in danger.

The problem centers on Clubhouse invites. In order to invite someone onto the platform, you must allow the app access to your contact list. The app then uploads your entire contact list and tells you how many Clubhouse users also have each of your contacts in their phones. Because of Clubhouse’s aggressive contact-list collection policy (you literally cannot be invited onto the app unless one of your contacts has allowed Clubhouse access to their contacts), the app has the capacity to quickly accumulate a “social graph.”
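To see why this aggregation is so powerful, consider a minimal sketch (this is purely illustrative, not Clubhouse’s actual code; the user names and numbers are invented): once several users upload their contact lists, the platform can build a reverse index from every phone number to the set of users who hold it, including numbers belonging to people who never signed up.

```python
from collections import defaultdict

def build_reverse_index(uploads):
    """Map each phone number to the set of users whose uploaded
    contact lists contain it."""
    index = defaultdict(set)
    for user, contacts in uploads.items():
        for number in contacts:
            index[number].add(user)
    return index

# Three users grant contact access; +1-555-0000 never joined the app.
uploads = {
    "alice": {"+1-555-0000", "+1-555-1111"},
    "bob":   {"+1-555-0000", "+1-555-2222"},
    "carol": {"+1-555-1111"},
}

index = build_reverse_index(uploads)

# The platform now knows two of its users hold +1-555-0000,
# even though that person never consented or signed up.
print(len(index["+1-555-0000"]))  # prints 2
```

The point of the sketch is that the non-user at +1-555-0000 contributed nothing, yet the platform already knows how many of its users are connected to them, which is exactly the network-revealing effect described above.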

Some, like Will Oremus, have pointed out that this is a little bit “creepy.” You can suddenly see how many people have your therapist (or drug dealer) in their contact list, revealing networks and connections that were previously invisible. This “invite a friend” practice is also, as Alexander Hanff pointed out, almost certainly illegal under the GDPR. By providing your contact list to Clubhouse, you have shared your contacts’ personal information with the company without their consent. And while you may know that a friend shared your phone number with Clubhouse if you received an invite, Clubhouse also knows how many of its users have your phone number in their contacts, whether or not you are on the app at all.

For places where the GDPR does not apply, it is unclear whether this type of non-consensual data collection is illegal. As a California resident, I have the right to request that companies delete my personal information under California’s Consumer Privacy Act of 2018, and I wrote to the company specifically to request that Clubhouse delete the phone number that other users have shared. Clubhouse (or its parent company, Alpha Exploration Co.) has 45 days to respond or to request a 90-day extension.

While I personally find this type of data collection irritating, it may literally be a matter of life or death for others. Just this week, some articles triumphantly proclaimed that “Clubhouse cracked the Great Firewall.” Chat rooms titled “Does Xinjiang have concentration camps?” and “Friends from Tibet and Xinjiang, we want to invite you over for a chat” appeared on the platform, and were reportedly attended by individuals from mainland China (who downloaded the app via VPN, as Clubhouse is not available in China’s Apple App Store). Access to Clubhouse has since been blocked by China’s censors.

However, a report by the Stanford Internet Observatory pointed out major security flaws in the Clubhouse app that have almost certainly put Chinese users and non-users of the app at risk. The SIO report found multiple problems with Clubhouse’s security, including a lack of encryption for sensitive information; the use of back-end infrastructure software from a company based in Shanghai, which may therefore be legally obligated to share information with the Chinese government; and unclear policies surrounding Clubhouse’s storage and retention of chatroom audio. The report noted that “in at least one instance, SIO observed room metadata being relayed to servers we believe to be hosted in the PRC, and audio to servers managed by Chinese entities and distributed around the world via Anycast. It is also likely possible to connect Clubhouse IDs with user profiles.”

Clubhouse has almost certainly put attendees of those discussions who live in or have connections to mainland China at risk of retaliation from the Chinese government. It may also, by implication, have put non-Clubhouse users at risk, depending on how securely the users’ contact list data was stored. Even individuals who did not join the app or participate in the chatrooms may find themselves implicated if their phone number is found in the contact lists of individuals who are now associated with politically sensitive issues by virtue of participating in the Clubhouse chatrooms. 

This issue extends beyond the China context. As Dr Matt Mahmoudi discussed with me on my most recent podcast episode, data collection can be a death sentence. If individuals who work with undocumented immigrants, for instance, join Clubhouse, the network effects of their combined contact lists can reveal the phone numbers, and therefore the identities, of undocumented individuals; if that data is shared with or demanded by ICE, it could lead to deportations.

In the wake of the Stanford Internet Observatory report, Clubhouse has said that it is reviewing its data protection practices. But the fact remains that this comes only after individuals have already been put at risk by Clubhouse’s poor data practices. These issues were easily predictable and preventable, underscoring the need for a duty of care and risk assessment for even seemingly “harmless” apps. And, as a rule of thumb, the less data collected, the better.