The UN-backed group Tech Against Terrorism said Monday that tech platforms should prohibit or restrict the distribution of content produced by the Taliban.
To that end, the group has added the Taliban, which retook control of Afghanistan following the withdrawal of US forces, to its Terrorist Content Analytics Platform (TCAP), a system that detects verified terrorist content online and notifies platforms of its presence.
The group stated:
Tech Against Terrorism recommends that tech companies remove or restrict access to content produced by the Taliban. Platforms should, in addition to looking to TCAP alerts and designation lists of democratic states, also assess groups, actors and the content they produce against their own rules on terrorism, violent extremism, and incitement to and/or glorification of violence. Whilst we appreciate that this is a challenging moderation issue, the fact that the Taliban now effectively constitutes the Afghan government should not prevent platforms from implementing their rules in this area and from removing material produced by a designated terrorist organisation.
Tech Against Terrorism also said that it initially concentrated its efforts on a small number of designated violent Islamist and far-right terrorist organisations. This approach has proven the most effective, it said, because it ensures the organisation does not contribute to undue norm-setting and “content cartelization” in its tech company assistance mechanisms.
According to its website, the TCAP is available for use by “all types of tech platforms” and is geared toward supporting smaller platforms. The alerts are issued on an advisory basis; it is up to each company’s decision-makers to determine how best to respond, based on the company’s own content standards.
As social media companies faced questions about how they would deal with the Taliban’s rapid seizure of Afghanistan, YouTube said it has a long-standing policy of not allowing accounts believed to be operated by the Taliban on its platform. Similarly, Facebook earlier confirmed that it will continue to prohibit Taliban content from its platforms because it considers the group a terrorist organisation. The company says it has a dedicated team of Afghan experts responsible for monitoring and removing content associated with the group.