Are you about to post a tweet containing “harmful” language? Don’t worry — Twitter is experimenting with a feature that will warn you ahead of time, so you don’t accidentally infect any users with your wrongthink.
“When things get heated, you may say things you don’t mean,” wrote Twitter’s official support account in a tweet yesterday.
“To let you rethink a reply, we’re running a limited experiment on iOS with a prompt that gives you the option to revise your reply before it’s published if it uses language that could be harmful.”
It’s a bit like “precrime” in the 2002 Tom Cruise film Minority Report. Twitter will notice when you’re about to use a bad word, giving you the chance to avoid a rule violation and save your account from the digital gulag.
“We’re trying to encourage people to rethink their behavior and rethink their language before posting because they often are in the heat of the moment and they might say something they regret,” said Sunita Saligram, Twitter’s global head of site policy for trust and safety, in a comment to Reuters.
Twitter’s “trust and safety” department was acknowledged to be “controversial” by its then-policy manager, Olinda Hassan, in an undercover video recorded by Project Veritas in 2018.
In the same secretly recorded exchange, Hassan said one of the things the Trust & Safety team was working on in 2018 was finding ways to “get the shitty people to not show up.” (The comment was made in response to an undercover reporter asking her about the prominence of tweets from author and filmmaker Mike Cernovich.)
Twitter now appears to be taking a different approach. Instead of trying to get you to not show up, the platform intends to train its users out of their bad behavior, like unruly dogs.
Are you an insider at Google, Reddit, Facebook, Twitter, or any other tech company who wants to confidentially reveal wrongdoing or political bias at your company? Reach out to Allum Bokhari at his secure email address email@example.com.
Allum Bokhari is the senior technology correspondent at Breitbart News.