FREQUENTLY ASKED QUESTIONS
Disclaimer: Please note that the opinions expressed on this website and as a result of the Terrorist Content Analytics Platform (TCAP) project represent the views of Tech Against Terrorism and do not necessarily reflect the official policy or position of the Government of Canada.
What is the Terrorist Content Analytics Platform (TCAP)?
Why is Tech Against Terrorism developing the TCAP?
How is development of the TCAP funded?
Will the TCAP cost money to use or access?
Who will have access to the TCAP?
Do governments have access to the TCAP?
Governments do not have access to the TCAP. Whilst the TCAP is sponsored by the Government of Canada, neither it nor any other government has access. Only tech companies, academics, and civil society members with a legitimate reason for accessing the TCAP will be granted access.
If, while monitoring terrorist channels and content, the TCAP team notices an immediate and credible threat to life, Tech Against Terrorism will alert the United Kingdom police.
Which tech platforms will be able to use the TCAP?
How can I learn more about the TCAP?
How can I support the TCAP?
When will TCAP be available to use?
What other services does Tech Against Terrorism provide?
How will the TCAP support smaller tech companies?
What other services are provided to tech companies?
How will the TCAP support academics and researchers?
Why aren't all terrorist groups included in Phase 1?
What type of files will be available?
Can users download content from the platform?
How will content be added to TCAP?
How will content be verified?
Our in-house experts verify terrorist content that is up for inclusion in the TCAP to ensure that it falls in scope of the TCAP and is aligned with our policies, particularly our group inclusion policy.
They do this according to two criteria: the source of the content and the content itself. To verify the source, our experts identify core beacon channels through which a terrorist group's messaging and propaganda is shared.
To assess the content, our team conducts an intelligence assessment that evaluates attributes of the content indicating a high probability that the material was produced by a designated terrorist organisation in scope of the TCAP.
We will provide more detail in our soon-to-be-released content verification policy.
Will the TCAP ingest terrorist content that is referred by others?
What do we mean by content moderation?
What specific considerations is Tech Against Terrorism taking in developing the TCAP?
In developing the TCAP, Tech Against Terrorism is paying particular attention to the following:
• Accuracy of content included
• Privacy and security of those using the platform
• Transparency and accountability – ensuring that the TCAP is developed in a transparent manner where there is opportunity to challenge specific content inclusions
• Tech platform autonomy in making moderation decisions
• Welfare and mental health of all platform users
Has there been any public consultation on the TCAP?
Developing the TCAP in a transparent and accountable manner is our top priority. In late 2019, Tech Against Terrorism commenced a consultation process to ensure that these considerations are accounted for. This process included a meeting during UN General Assembly Week in New York, as well as an online public consultation, the results of which are available here. Tech Against Terrorism also carries out consultations with tech companies, academics, and civil society groups on an ongoing basis. If you would like to contribute to TCAP feature ideas, development, and testing, including potential early access to new features, please contact us.