New competition allows students to “hack” hate speech online
The rise of hate speech, including antisemitism, on social media is overwhelming, partly because the platforms seem reluctant to stop it, experts say. On Twitter in particular, hate speech has soared since Elon Musk bought the platform at the end of October, according to a study by Montclair State University.
“Reducing hate speech online is a long-term project,” says Dr Günther Jikeli, Erna B. Rosenfeld Associate Professor at the Institute for the Study of Contemporary Antisemitism at Indiana University. “We have to approach this in a variety of ways. Empowering students to know about it, to recognise it manually and perhaps automatically, is one important step.”
“It’s only in the past three to four years that the mainstream social media companies have started taking hate speech more seriously,” says Jikeli. “Some have pledged not to accept Holocaust denial, but antisemitism is all too often tolerated, especially if it comes in disguised forms. Some platforms might take messages down that call for the killing of Jews. But they rarely take messages down if they call for the killing of Zionists or the destruction of the Jewish state, which is often just a codeword for Jews.”
In addition, “social media is a completely new tool of communication, allowing for multidirectional communication on a global scale. Our online and offline societies have to develop new rules for these spaces. It’s increasingly clear that hate speech is a serious problem on these platforms, and that we need to think more creatively about how to address it.”
The Montclair State University study tracked how often homophobic, antisemitic, and racist terms were used immediately after Musk’s takeover. Researchers found about 398 hate tweets an hour in the 12 hours after the acquisition was finalised – more than quadruple the 84 tweets an hour in the week leading up to Musk’s takeover. In total, hate-driven tweets skyrocketed to 4 778 over that 12-hour span, compared with an average of about 1 000 in an equivalent period beforehand.
Musk, the billionaire business mogul who is also the chief executive of SpaceX and Tesla, took over Twitter on 27 October, promising to loosen the social media platform’s content restrictions.
“The character of what Twitter will look like with Musk as the head remains speculative, in spite of his stated intentions,” the Montclair State report reads. “What’s not speculative, however, is the extent to which his date of formal acquisition was celebrated by racist and extremist users on the platform.”
Similar research by the Network Contagion Research Institute determined that the use of the “N-word” racial slur increased more than 500% on Twitter the day after Musk’s takeover. Another word used to attack transgender people appeared 33 926 times in tweets and retweets, 53% higher than the 2022 average. Hateful language describing gay people, Jews, and Hispanics has also increased since Musk took over the company.
“Twitter influences your life and the information you consume, even if you’ve never used it,” wrote journalist Yair Rosenberg in his Deep Shtetl newsletter on 10 November. “Twitter allows people who previously couldn’t be heard to be heard. Twitter often intensifies pre-existing human tendencies.”
In light of this, Jikeli’s department has launched a datathon, hackathon, and machine learning competition that will allow students to compete for prizes, learn skills for future careers, meet others from around the world, and most importantly, play a part in reducing hate speech online.
“The competition is open to all high school and undergraduate students, including from South Africa,” says Jikeli. “Our goal is to sharpen awareness of the threats posed by online hate speech, and to teach participants how to detect and combat it using machine learning.
“In these workshops, led by Indiana University professors, participants will learn how to recognise, monitor, and track biased messages on social media, specifically Twitter. Participants will be placed into teams and asked to work together on their own scripts and annotations. The three workshops will ask questions like, ‘What’s hate speech online? What are manifestations of bias against Asians, Black people, Hispanics, Jews, and Muslims? Why do we need machine learning to observe and combat hate speech?’
“In a datathon, students look at data and classify it,” Jikeli says. “In our case, it’s social media posts that might disseminate hatred against minority groups, such as Asians, Black people, Jews, Muslims, or Latinos. In a hackathon, students develop algorithms that can detect hateful messages, based on earlier classification of data by our research group and by the participating students. We combine both in an educational project so that students learn about different forms of biases, recognise stereotypes and biases in an online environment, and get an idea of how to deal with big data and how to develop automated detection – at least in principle.”
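To make that concrete, here is a minimal sketch of the kind of baseline a hackathon team might build: a bag-of-words (TF-IDF) model feeding a logistic-regression classifier, trained on annotated posts. The data, labels, and model choice below are illustrative assumptions, not the competition’s actual pipeline, and the placeholder strings stand in for real annotated tweets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical annotated posts (placeholders, not real tweets):
# label 1 = biased/hateful, label 0 = benign.
posts = [
    "placeholder post repeating an antisemitic conspiracy theory",
    "placeholder post attacking a minority group with a slur",
    "benign post about the weather in Bloomington",
    "benign post sharing a link to a news article",
]
labels = [1, 1, 0, 0]

# A common datathon baseline: unigram/bigram TF-IDF features
# feeding a simple logistic-regression classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(posts, labels)

# predict_proba gives the probability that a new message is biased,
# so unseen posts can be ranked rather than hard-classified.
new_posts = ["an unseen post to be scored"]
for post, prob in zip(new_posts, model.predict_proba(new_posts)[:, 1]):
    print(f"{prob:.2f}  {post}")
```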
The competition came about because “we noticed that students are interested in looking at live data and working hands-on on research projects that have an impact. We mostly focus on online antisemitism in all its forms, including anti-Zionist forms of antisemitism. However, other minorities are also targeted, and a common denominator of perpetrators is often a conspiratorial way of thinking, which is the essence of modern antisemitism.”
“Participants will also learn how to signal biased messages to the platforms so that they can be taken down or downgraded and made less visible,” he says. “The algorithms that the participants develop can be used to identify messages that are probably biased. This is important because in most cases, hate speech doesn’t get called out.”
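Continuing the hypothetical baseline above, a team’s script might turn those probability scores into a ranked list of messages to flag; the threshold below is an illustrative choice, not something prescribed by the competition.

```python
# Continuing the sketch above: 'model' is the trained pipeline.
REVIEW_THRESHOLD = 0.8  # illustrative cut-off, not from the competition

def flag_for_review(model, posts, threshold=REVIEW_THRESHOLD):
    """Return (probability, post) pairs at or above the threshold,
    highest-scoring first, e.g. for manual review or for signalling
    to the platform."""
    probs = model.predict_proba(posts)[:, 1]
    flagged = [(p, post) for p, post in zip(probs, posts) if p >= threshold]
    return sorted(flagged, reverse=True)

for prob, post in flag_for_review(model, ["an unseen post to score"]):
    print(f"{prob:.2f}  {post}")
```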
Asked whether students could go on to use these skills to combat hate speech and in their professions, he says, “Absolutely. Social media literacy is increasingly important. Much of our social and professional life is already on social media, and it will only increase in the future. We also use data from other social media platforms for our research. For the competition, we keep it simple and concentrate on one platform [Twitter].”
Students taking part will work in international teams. “It’s a global world, and we hope to develop stronger ties with talented students and community members around the world,” says Jikeli. “We need to stand up against antisemitism together.”
To apply or find out more, visit https://isca.indiana.edu/publication-research/social-media-project/datathon-2023/index.html
Photo credit: Patrick Pleul/Pool/AFP via Getty Images (JTA)