According to a review conducted by cybersecurity group Ghost Data, ads from more than 30 companies, including The Walt Disney Company, The Coca-Cola Company and NBCUniversal Media, appeared alongside Twitter accounts connected to online CSA. Verification by Reuters revealed that these accounts were peddling exploitative material using keywords related to "rape" and "teens."
Reuters cited two examples of this in an article. In one, a promoted tweet for footwear brand Cole Haan appeared next to a post seeking to trade exploitative content. In another, a user's tweet about content centering on "young girls only" was followed by a promoted tweet from Scottish Rite Children's Hospital in Dallas.
The hospital did not respond to multiple requests for comment on the matter. Cole Haan, meanwhile, expressed outrage.
David Maddocks, the footwear brand's president, remarked: "We're horrified. Either Twitter is going to fix this, or we'll fix it by any means we can – which includes not buying Twitter ads."
Other brands, however, moved swiftly in response.
Dyson, Mazda, Forbes and PBS Kids confirmed that they had either suspended their marketing campaigns or removed their ads from parts of Twitter, joining several other major advertisers. Satellite TV provider DIRECTV and tech consultancy Thoughtworks also confirmed that they paused their ad campaigns on the site. (Related: Major corporations cut ties with Twitter after discovering their ads were placed alongside child porn tweets.)
Twitter spokeswoman Celeste Carswell reiterated in a statement that the company "has zero tolerance for child sexual exploitation." She added that Twitter is working closely with its advertising clients and partners to investigate and take steps to prevent the situation from happening again.
According to Carswell, Twitter is investing more resources in child safety, including hiring for new positions to write policy and implement solutions.
Like other social media platforms, Twitter bans CSA material. However, the site founded by Jack Dorsey permits adult content. In fact, an internal document seen by Reuters revealed that pornographic images make up about 13 percent of all Twitter content.
In a separate study shared exclusively with Reuters, Ghost Data identified more than 500 Twitter accounts in its September 2022 review alone. The cybersecurity company lamented that Twitter failed to remove more than 70 percent of the accounts it analyzed during a 20-day period that month. Reuters separately verified that dozens of these accounts were soliciting child pornography.
Reuters later shared a sample of 20 accounts with the tech giant on Sept. 29, which prompted the removal of an additional 300 accounts. But a review by Ghost Data and the publication found that more than 100 accounts remained up as of Sept. 30.
Following the review, Reuters sent the full list of more than 500 accounts that showed up alongside promoted tweets to the company on Sept. 26 for action. Carswell confirmed the next day that the offending accounts had been reviewed and permanently suspended for terms of service violations.
The company then sent an email to advertisers on the morning of Sept. 28. It stated that "ads were running within Profiles that were involved with publicly selling or soliciting CSA material."
Back in February 2021, a team of Twitter employees concluded that the company needed more investment to identify and remove CSA content at scale. The team noted in a report that moderators had a backlog of cases to review for possible reporting to law enforcement.
"While the amount of [CSA content] has grown exponentially, Twitter's investment in technologies to detect and manage the growth has not," it stated.