Can Clubhouse Move Fast Without Breaking Things?


From the start, there were signs that Clubhouse was speed-running the platform life cycle. Weeks after launching, it ran into claims that it was allowing harassment and hate speech to proliferate, including large rooms where speakers allegedly made anti-Semitic comments. The start-up scrambled to update its community guidelines and add basic blocking and reporting features, and its founders did the requisite Zuckerbergian apology tour. (“We unequivocally condemn Anti-Blackness, Anti-Semitism, and all other forms of racism, hate speech and abuse on Clubhouse,” read one company blog post in October.)

The company has also faced accusations of mishandling user data, including a Stanford report that found that the company may have routed some data through servers in China, possibly giving the Chinese government access to sensitive user information. (The company pledged to lock down user data and submit to an outside audit of its security practices.) And privacy advocates have balked at the app’s aggressive growth practices, which include asking users to upload their entire contact lists in order to send invitations to others.

“Major privacy & security concerns, lots of data extraction, use of dark patterns, growth without a clear business model. When will we learn?” Elizabeth M. Renieris, the director of the Notre Dame-IBM Tech Ethics Lab, wrote in a tweet this week that compared Clubhouse at this moment to the early days of Facebook.

To be fair, there are some important structural differences between Clubhouse and existing social networks. Unlike Facebook and Twitter, which revolve around central, algorithmically curated feeds, Clubhouse is organized more like Reddit — a cluster of topical rooms, moderated by users, with a central “hallway” where users can browse rooms in progress. Clubhouse rooms disappear after they’re over, and recording a room is against the rules (although it still happens), which means that “going viral,” in the traditional sense, isn’t really possible. Users have to be invited to a room’s “stage” to speak, and moderators can easily boot unruly or disruptive speakers, so there’s less risk of a civilized discussion’s being hijacked by trolls. And Clubhouse doesn’t have ads, which reduces the risk of profit-seeking mischief.

But there are still plenty of similarities. Like other social networks, Clubhouse has a number of “discovery” features and aggressive growth-hacking tactics meant to draw new users deeper into the app, including algorithmic recommendations, personalized push alerts and a list of suggested users to follow. Those features, combined with Clubhouse’s ability to form private and semiprivate rooms with thousands of people in them, create some of the same bad incentives and opportunities for abuse that have hurt other platforms.

The app’s reputation for lax moderation has also attracted a number of people who have been barred by other social networks, including figures associated with QAnon, Stop the Steal and other extremist groups.

Clubhouse has also become a home for people who are disillusioned with social media censorship and critical of various gatekeepers. Attacking The New York Times, in particular, has become something of an obsession among Clubhouse addicts for reasons that would take another full column to explain. (A room called, in part, “How to Destroy the NYT” ran for many hours, drawing thousands of listeners.)




