On Tuesday, Jack Dorsey, the CEO of Twitter, came to TED 2019 to answer for the sins of his platform. In his signature black hoodie and jeans, unkempt facial hair, and black beanie, he sat down with head of TED Chris Anderson and Whitney Pennington Rodgers, who curates current affairs for the conference, for a conversation that left all three participants, and the audience, frustrated.
"We're on this great voyage with you on the Twittanic," Anderson told Dorsey after roughly 20 minutes of interrupted back and forth. "There are people in steerage who are saying, 'We are worried about the iceberg ahead!' And you say, 'That is a good point' and 'our boat hasn't been built to handle it,' and we're waiting, and you are showing this extraordinary calm and we're all worried but we're outside saying, 'Jack, turn the fucking wheel!'"
Dorsey stoically listened to this comparison, like the meditative yogi he often talks about aspiring to be. "It's democracy at stake! It's our culture at stake! It's our world at stake!" Anderson continued. "You're doing a brilliant role of listening, Jack, but can you actually dial up the urgency and move on this stuff? Will you do that?"
"Yeah, yeah, yes," Dorsey replied, but then added, "We could do a bunch of superficial things to address what you're talking about, but we need to go deep."
It's been more than a year since Dorsey publicly committed to "fixing" Twitter and to figuring out what a platform that encourages healthy discussions looks like. He's been on a mea culpa tour since then, telling the world, and regulators, that he knows Twitter is broken, that it's toxic and terrible, and that he and the team are planning to radically rebuild it. He reiterated all of this on the TED stage, explaining that he wants to rethink what behavior the site incentivizes, for instance by possibly getting rid of the like button, de-emphasizing follower counts, and emphasizing topical interests instead. He repeated that he wants to focus on maximizing the health of conversations and on prioritizing people spending their time learning on the site rather than getting outraged or harassed. He admitted Twitter was full of problems, problems he didn't anticipate 13 years ago when the site was founded and is still trying to figure out how to solve.
The urgency of this task couldn't have been made clearer in the days leading up to Dorsey's appearance. Over the weekend, Rep. Ilhan Omar, a woman of color, an immigrant, and a Muslim, reported an increase in death threats after President Trump tweeted a video that intercut a speech she recently gave with footage of 9/11. Many of the threats were made on Twitter. Then on Monday, as Notre Dame burned, people came to the platform to mourn the loss in real time, but also to spread lies and hate as quickly as the flames engulfed the cathedral's spire. When Rep. Omar tweeted her own heartfelt condolences, people replied with more death threats. Twitter was very much itself, showcasing the power of its network as well as its danger.
Dorsey didn't address any of these incidents specifically at TED. In fact, his answers lacked specificity overall. When he was asked pointed questions, he evaded them, as he often does. Rodgers asked him how many people work on content moderation at Twitter, a number the company has never published; Tuesday continued that streak.
"It varies," Dorsey said. "We want to be flexible on this. There are no amount of people that can actually scale this, which is why we have done so much work on proactively taking down abuse."
That proactive work was the big news Dorsey announced from the stage: A year ago, Twitter wasn't proactively monitoring abuse with machine learning at all. Instead, it relied entirely on human reporting, a burden Dorsey was quick to recognize was unfairly placed on the victims of the abuse. "We've made progress," he said. "Thirty-eight percent of abusive tweets are now proactively recognized by machine learning algorithms, but those that are recognized are still reviewed by humans. But that was from zero percent just a year ago." As he uttered those words, Twitter sent out a press release with more information on the effort, highlighting that three times more abusive accounts are being suspended within 24 hours of being reported compared with this time last year.
That progress is good, but 38 percent is not exactly a lot. Facebook's most recent transparency report, for comparison, says that over 51 percent of the content it acted on for violating policies against hate speech was flagged before users reported it. Nor did Dorsey or the official Twitter announcement provide many details about how the technology that proactively flags abuse works. Relying on algorithms and automation won't solve all of Twitter's problems, either. Facebook just announced a slew of changes to better fight abuse and misinformation, which, for all its technological sophistication, it hasn't come close to eradicating. And on Monday, YouTube briefly flagged news broadcasts' live video of the Notre Dame fire with a link to information about the September 11 attacks, an effort at automated fact-checking that in this case demonstrated how imperfect such systems can be.
For years, organizations like Amnesty International have urged Twitter to be more transparent about abuse on its platform and the steps the company is taking to combat it. Rodgers noted that last year, a crowdsourced study by Amnesty found that a problematic or abusive tweet is sent to a woman every 30 seconds. For women of color, one in every ten tweets they receive is abusive.
By bringing up the very real suffering of people on his platform, Rodgers and Anderson tried to bring a sense of urgency to the conversation. But Dorsey's signature laconic and intensely calm style of speaking was at odds with the tone they were trying to set. When Dorsey tried to get into the specifics of how Twitter is measuring healthy conversations on the site, using four metrics developed by MIT's Cortico team, Anderson cut him off.
"How hard is it to get rid of Nazis from Twitter?" he asked.
Dorsey sighed. Deeply. He explained that the team has taken hateful accounts down, and that when it can see an account is associated with a hate group, the account is banned. "We're in a situation right now where that term is used fairly loosely and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform," he said. Twitter, and Dorsey in particular, have long upheld free speech as a defining value for the platform.
The conversation was clearly frustrating for all three participants. "You didn't let me finish," Dorsey told Anderson at one point, after being cut off again. In that way, the TED event was also quite meta: To use Twitter is to be frustrated by its promise and limitations, by how much of a garbage fire it is while also being so useful to modern life, and by how obvious some of its problems are and yet how apparently elusive solutions can be.
Dorsey did bring up one specific fix. "The first thing you see when you go to [the page to report abuse] is about intellectual property protection. You scroll down and you get to abuse and harassment," he noted. "I don't know how that happened in the company's history, but we put that above the thing that people actually want the most information on. Just our ordering shows the world what we believed was important. We are changing all that; we are ordering it the right way."
For all his insistence on the bigger picture, this was a very small problem for Dorsey to point out, and one with a very obvious solution. And yet it's not fixed. Why? The reasoning here is frustratingly circular: Dorsey says he doesn't want to do a bunch of small, iterative quick fixes; he wants to fundamentally rebuild the site to encourage better conversations, and that will take time.
Time it's unclear the world can afford.
The day before Dorsey appeared at TED, Carole Cadwalladr, the British journalist who broke the story about Cambridge Analytica's role in the Brexit vote, stood on the same stage and issued a challenge to all the "gods of Silicon Valley," listing them by name: Zuckerberg, Sandberg, Brin, Page, and Dorsey last among them. "This technology that you have invented has been amazing, but now it's a crime scene," she said. "My question to you is, is this what you want? Is this how you want history to remember you? As the handmaidens to authoritarianism all across the world? You set out to connect people and the same technology is now driving us apart."
Of those gods, only Dorsey showed up. But unlike an omniscient being, Dorsey doesn't have all the answers. He's more like the captain of a ship, wondering aloud how to avoid the many icebergs in his path while continuing ahead at full steam.