From the outset, there were signs that Clubhouse was speeding up the platform’s lifecycle. Weeks after its launch, it faced claims that it had enabled harassment and hate speech, including large rooms where speakers reportedly made anti-Semitic comments. The startup scrambled to update its community guidelines and add basic blocking and reporting features, and the founders did the requisite Zuckerbergian apology tour. (“We unequivocally condemn anti-Blackness, anti-Semitism and all other forms of racism, hate speech and abuse on Clubhouse,” a company blog post read in October.)
The company was also accused of being careless with user data, including in a Stanford report finding that it may have routed certain data through servers in China, potentially allowing the Chinese government to access sensitive user information. (The company pledged to lock down user data and submit to an external audit of its security practices.) And privacy advocates have objected to the app’s aggressive growth tactics, including asking users to upload their entire contact lists in order to send invitations to others.
“Great concerns about privacy and security, a lot of data extraction, use of dark patterns, growth without a clear business model. When will we learn?” Elizabeth M. Renieris, the director of the Notre Dame-IBM Tech Ethics Lab, wrote in a tweet this week comparing Clubhouse to the early days of Facebook.
To be fair, there are some important structural differences between Clubhouse and existing social networks. Unlike Facebook and Twitter, which revolve around central, algorithmically curated feeds, Clubhouse is organized more like Reddit – a cluster of live rooms, moderated by users, with a central “hallway” where users can browse rooms. Clubhouse rooms disappear when they are over, and recording a room is against the rules (although it still happens), which means that “going viral,” in the traditional sense, is not really possible. Users must be invited to speak on the “stage” of a room, and moderators can easily remove unruly or disruptive speakers, so there is less risk of a civil discussion being hijacked by trolls. And Clubhouse has no ads, which lessens the incentive to maximize engagement at any cost.
But there are still many similarities. Like other social networks, Clubhouse has a number of “discovery” features and aggressive growth-hacking tactics designed to draw new users deeper into the app, including algorithmic recommendations, personalized push alerts and a list of suggested users to follow. These features, combined with Clubhouse’s ability to host private and semi-private rooms with thousands of people in them, create some of the same bad incentives and opportunities for abuse that have hurt other platforms.
The app’s reputation for lax moderation has also attracted a number of people barred from other social networks, including figures associated with QAnon, “Stop the Steal” and other extremist groups.
Clubhouse has also become a home for those disillusioned with social media censorship and critical of various gatekeepers. Attacking The New York Times, in particular, has become something of an obsession among Clubhouse addicts, for reasons that would take another full column to explain. (A room called “How to Destroy the NYT” ran for many hours and attracted thousands of listeners.)