Elon Musk says he can stop child abuse on Twitter. So far, he’s cut jobs and safeguards.
Less than a month after taking control of Twitter, Elon Musk said tackling child sexual exploitation content on the social media platform was “Priority 1.”
But there is little evidence that the company is taking tougher action under his watch, or putting more resources toward the platform’s long-running problem of child exploitation content, according to interviews with four former employees and a current employee, internal company records, and interviews with people working to stop online child abuse.
Meanwhile, Musk has folded the topic of online safety into a larger effort to disparage Twitter’s former leaders and to portray his ownership of the company as part of a sociopolitical battle against what he calls the “woke mind virus,” his term for progressive ideals. The shift comes as he has embraced far-right rhetoric online that often includes false allegations of pedophilia.
“It is a crime that they have refused to take action on child exploitation for years!” Musk tweeted on Friday in response to a resignation letter from a member of the company’s Trust and Safety Board who worked on child abuse cases.
Former CEO Jack Dorsey replied, “This is false.”
Under Musk’s new management, Twitter said the number of accounts it suspended in November for child sexual exploitation content was higher than in any other month of 2022, thanks to new partnerships with unnamed organizations and new “detection and enforcement methods” that the company did not explain.
Meanwhile, Twitter’s resources for combating content that sexually exploits children (sometimes called child pornography or child sexual abuse material) have been depleted by the layoffs, mass firings and resignations that have swept the company.
While headcount at Twitter remains in flux, internal records obtained by NBC News and CNBC indicate that as of early December, approximately 25 employees held titles related to “trust and safety,” out of roughly 1,600 people still at the company. That total includes more than 100 people whom Musk has brought in to work at Twitter but who are employed by his other companies, Tesla, SpaceX and The Boring Co., along with a variety of investors and advisers.
A former Twitter employee who worked on child safety said they know of “a handful” of people at the company still working on the issue, but that most or all of the product managers and engineers who were on the team are no longer there. The employee spoke on condition of anonymity for fear of reprisal for discussing the company.
Twitter’s staff had swelled to more than 7,500 by the end of 2021, and some layoffs were likely even if Musk hadn’t taken over, former employees said.
Twitter did not respond to requests for comment.
Under Musk, the company has also pulled back from some outside commitments to child safety groups.
On Monday, for example, Twitter disbanded its Trust and Safety Council, which included 12 groups that advised the company on its efforts to address child sexual exploitation.
The National Center for Missing & Exploited Children (NCMEC), the organization tasked by the US government with tracking reports of child sexual abuse material online, said little has changed under Musk’s leadership regarding Twitter’s reporting practices thus far.
“Despite the rhetoric and some of what we’ve seen from people posting online, their CyberTipline numbers are nearly identical to what they were before Musk joined the board,” said NCMEC representative Gavin Portnoy, referring to the organization’s centralized CSAM reporting system.
Portnoy said one change the group has noticed is that Twitter did not send a representative to the organization’s annual social media roundtable.
“The person who previously attended was one of the people who quit,” Portnoy said. Asked whether Twitter was invited to send a replacement, Portnoy said the company declined.
Most recently, Musk has used the issue of child exploitation content to attack former Twitter employees, most notably Yoel Roth, who led the company’s trust and safety efforts and whom Musk initially praised when he took over in October. Roth left Twitter a few weeks later, shortly after the U.S. midterm elections.
Musk insinuated that Roth’s doctoral dissertation about the LGBTQ dating app Grindr advocated pedophilia, when the opposite was true. Roth, who is openly gay, wrote that Grindr was not a safe place for minors and discussed how to create age-appropriate content and spaces for LGBTQ teens to connect online and through apps.
Musk’s misleading allegations left Roth facing widespread abuse online, including on Twitter, where users hurled antisemitic threats and slurs at him. Roth reportedly left his home Monday over the threats that followed Musk’s unsubstantiated accusations.
Laurel Powell, deputy director of communications for programs at the Human Rights Campaign, an LGBTQ advocacy nonprofit, said Musk’s incendiary tweets targeting Roth fit with an escalating series of far-right attacks on LGBTQ people using false claims of “grooming.”
“This grooming rhetoric is really in many cases just a recycling of anti-LGBT hate speech,” Powell said. “This is a really dangerous moment we’re in — someone with as big a platform as Mr. Musk is feeding into this unproven false rhetoric.”
Twitter’s imperfect efforts to combat child exploitation content are well documented. In 2013, the company said it would deploy PhotoDNA, a technology that blocks the publication of material matching a database of known child sexual abuse material (CSAM). That technique, however, cannot detect newly created material.
The company’s transparency reports, which detail things like legal requests and account removals, show that Twitter removed more than 1 million accounts in 2021 for violating its rules against child sexual exploitation content.
In 2021, Twitter reported 86,666 instances of CSAM detected on its platform, a number Portnoy said should have been higher. “We’ve always felt there should have been more reports coming out of Twitter, no matter how you cut it, just given the sheer number of users out there,” he said.
Child sexual exploitation content has remained a persistent issue for Twitter, though most major social media platforms contend with it in one form or another.
Some advertisers left Twitter earlier this year after discovering ads were appearing alongside problematic content. A lawsuit was filed against Twitter in 2021 by a child sexual abuse victim and their mother who alleged that the company did not act quickly enough when alerted to a video of a child circulating on the platform. A second child was later added to the lawsuit, which is currently before the Ninth Circuit Court of Appeals.
Moderation of such content typically relies on a combination of automated detection systems, specialized internal teams and external contractors to identify and remove child abuse content. Twitter’s policy defines such content as “images and videos that are flagged as child pornography, but also written solicitations and other materials that promote child sexual exploitation.”
According to people familiar with the situation and internal records, the layoffs, firings and resignations have cut the number of engineers at Twitter by more than half, including many employees and leaders who worked on trust and safety features and improvements to the platform. Ella Irwin, Twitter’s current head of trust and safety, told Reuters that Musk has also cut contractors, as the company leans on automation for its moderation needs.
“You tend to think that more bodies means more safety,” Portnoy said. “So, I mean, this is frustrating.”
It’s unclear how many Twitter employees are still working on child safety issues.
A LinkedIn search for current Twitter employees who say they work on child safety turned up just a handful of accounts. Bloomberg and Wired previously reported that the layoffs and firings under Musk have reduced the number of people moderating content on Twitter, including those focused on child exploitation.
Musk, however, has insisted that he is reorienting the company in a way that prioritizes child safety.
Twitter has turned to at least one outside researcher for help — Andrea Stroppa, an Italian cybersecurity researcher who says he is friendly with Elon Musk, and who has praised Musk online since his takeover of Twitter.
Stroppa previously analyzed bots and propaganda on Twitter, and told NBC News that he is now working with employees at the company to find and remove child sexual exploitation content and accounts. Irwin, Twitter’s current head of trust and safety, has publicly thanked Stroppa “for his partnership and dedication.”
Stroppa, who remains an independent researcher, said he felt Twitter’s previous efforts were lacking and that the company is now moving quickly to find and suspend accounts posting child sexual exploitation content. He said the company has also shifted from removing individual tweets to immediately banning accounts found to be violating its policies.
“I think it’s a drastic change,” he said in a phone interview.
Marijki Chartoni, a survivor of human trafficking and abuse who now works to raise awareness of the problem, said she had previously had success reporting problematic accounts and content on Twitter, starting in 2020.
The platform wasn’t perfect, she said, but it wasn’t as negligent as Musk has claimed. “The old Twitter was very responsive and removed the accounts,” she said in an email. “I felt like I was making some progress.”