| 2023 | Type: Website
The report, JUST JOKING!, analyses more than seventy recent cases of a wide range of deepfakes. Some are examples of potent satire, art, or activism, from mocking authoritarian leaders to synthetically resurrecting victims of injustice to demand action. But others demonstrate how bad actors use comedy as both a sword and a shield, to glorify the powerful and attack marginalized communities, while seeking to escape culpability. Increasingly, satire is used as a defensive excuse — “just joking!” — after a video has circulated and caused harm.
Thorn | 2023 | Type: Report
In this report, Thorn shares critical insights and action steps for tech companies on emerging online trends in child sexual abuse material (CSAM) in 2023.
Tech Against Terrorism | 2023
The Knowledge Sharing Platform (KSP), developed by Tech Against Terrorism, provides smaller tech companies with a collection of interactive tools and resources to support their operational and policy efforts to develop and implement an effective, human rights compliant counterterrorism response. Please note: Access to the KSP requires registration; however, this is primarily a security measure to vet access to sensitive content. Anyone with an email address affiliated with a tech company is automatically approved.
Brittan Heller | 2023 | Type: Journal article
Hate speech is a contextual phenomenon. What offends or inflames in one context may differ from what incites violence in a different time, place, and cultural landscape. This paper discusses how online hate speech may operate differently in a postcolonial context. While hate speech impacts all societies, the Global South, and Africa in particular, has been sorely understudied. The author posits that in postcolonial circumstances, the interaction of multiple cultural contexts and social meanings forms concurrent layers of interpretation that are often inaccessible to outsiders. This study expands the concept of online harms by examining the political, social, and cultural dimensions of data-intensive technologies.
Thorn | 2023 | Type: Report
Thorn's latest research sought to understand kids' online social networks to better disentangle high-value versus high-risk relationships. In a survey of 1,200 youth (aged 9-17), we explored young people's attitudes toward and experiences with friendships and flirting online, and how they respond to threats of manipulation, grooming, and abuse. With 2 in 5 kids reporting they have been approached online by someone they thought was attempting to "befriend and manipulate" them, relevant and scalable interventions are urgently needed.
Thorn | 2023 | Type: Report
Thorn's latest research monitors the evolution of youth attitudes toward and experiences with self-generated child sexual abuse material (SG-CSAM) among 9-17-year-olds for the third consecutive year. Findings from 2021 reveal a sustained increase in the number of young people sharing their own SG-CSAM, as well as in the perceived normalcy of non-consensually re-sharing another child's SG-CSAM. The data also underscore heightened risk among boys and Hispanic/Latino youth.
Thorn | 2023 | Type: Report
Thorn's latest research on youth’s attitudes toward online safety features found that 22 percent of minors report having online sexual interactions with adults—the same percentage of minors who report having sexual interactions with peers their own age. The research, which builds on our 2020 report on the same topic, also underscores that the features provided by platforms are simply insufficient when it comes to keeping kids safe.
Tech Coalition | 2023
A voluntary, open-source image classification system adopted by Tech Coalition members and used by many electronic service providers to categorize images and videos that depict apparent child sexual abuse and exploitation.
Tech Coalition | 2023 | Type: Report
Trust: Voluntary Framework for Industry Transparency (the Framework) was developed by the Tech Coalition to provide principles-based guidance to tech companies seeking to build trust around their efforts to address online child sexual exploitation and abuse (CSEA) risks on their services. The Framework outlines principles that offer a general basis for approaching transparency reporting, along with recommended report categories.
Pearson, E., Whittaker, J., Baaken, T., Zeiger, S., Atamuradova, F. and Conway, M. | 2023 | Type: Report
VOX-Pol's new report presents findings from the REASSURE (Researcher, Security, Safety, and Resilience) project's in-depth interviews with 39 online extremism and terrorism researchers. Based at universities, research institutes, and think tanks in Europe and North America, the interviewees studied mainly, albeit not exclusively, far-right and violent jihadist online activity. The report catalogues for the first time the range of harms these researchers have experienced, the lack of formalised systems of care or training, and, as a result, their reliance on informal support networks to mitigate those harms.