The electoral commission’s ground rules alone can’t be expected to insulate the country from harmful tactics.
As South Africa goes to the polls on 29 May, online campaigning is expected to play an unprecedented role in the high-stakes contest. While the online space can inform voters about parties’ policies and manifestos, the risk of disinformation – the intentional distortion of information – is considerable. About 26-million people use social media in South Africa, and the number is rising.
The World Economic Forum identifies misinformation and disinformation as top global short-term risks. Disinformation threatens democracy by eroding the checks and balances that underpin open societies. So how can South Africa guard against influence operations, while protecting freedom of expression?
Information integrity is not simply about what is said online, but how it’s said. As one politician remarked at a recent conference on disinformation hosted by the European Union, Spain and Institute for Security Studies (ISS) in Cape Town, the creation of echo chambers by influence merchants creates the ‘false impression of being informed.’ With many traditional media houses putting content behind paywalls, citizens may turn to social media for their news, which in many instances is unverified.
The creation of digitally contrived ‘communities’ can be used to perpetuate prejudice, hatred and violence. Witness the xenophobic rhetoric of Operation Dudula – a movement that started online and has morphed into a political party.
In its extreme form, disinformation amounts to information warfare. Globally, the key players are classified by motivation or location. They may be driven by political ideology, commercial gain, or recreational ‘kicks’ and satire.
Home-grown product influencers – like those seen in an ISS study of online influence during Kenya’s 2022 poll – pivot their carefully groomed audiences towards political narratives during election season. These influencers use their ‘prepackaged’ audience to command considerable payment by simply creating a hashtag, liking a post, or sharing content with embedded political messaging.
Product influencers’ messages range from the blunt ‘Vote for candidate X’ to persuading users not to vote at all. These influencers aren’t politically aligned but commercially driven. They are used from time to time, the ISS study shows, by foreign clients and domestic political actors.
Other types of influencers include political strategists, external nation states or their proxies. They ‘stir the pot’ online to achieve domestic political or geopolitical objectives – tapping into racial, economic and religious divisions or simply sowing confusion or fear on election day. During last year’s Democratic Republic of the Congo elections, Code for Africa identified various techniques, from creating confusion over the electoral process to homophobic slurs against one of the candidates.
Whether foreign or local, disinformation threatens democracies. Campaigns often seek to delegitimise electoral authorities or rubbish professional mainstream media, whose job is to hold power to account. Both tactics were used in Kenya’s 2022 polls.
The Independent Electoral Commission of South Africa seeks to insulate itself from such moves by establishing ground rules and principles for social media use during elections.
Although some techniques used in Kenya may be deployed in South Africa, the latter’s geopolitical position and robust professional media may result in a different dynamic. South Africa’s historic ties to Russia, and divisions over the Ukraine-Russia and Israel-Gaza conflicts, provide fertile ground for foreign friends and foes of the country’s political elite to meddle.
The challenge will be how to respond swiftly and proportionately. Plausible deniability is the disinformation merchant’s friend. And in democracies such as South Africa, robust online engagement is par for the course.
South Africa’s experience of Bell Pottinger, the United Kingdom-based public relations firm that used racial fault lines to drive the Guptas’ white monopoly capital narrative, has perhaps primed voters to expect online meddling during elections. Bell Pottinger showed that external influence campaigns aren’t the preserve of undemocratic states. Its antics resulted in professional and political consequences in the UK – an unlikely outcome in authoritarian states.
In the current environment, now supercharged with artificial intelligence (AI), Russia is accused number one in information operations, using experience from its international troll farm – the Internet Research Agency. Russia also appears to consider Africa an attractive target, given the weak checks and balances in many of the continent’s fragile democracies.
The Africa Center for Strategic Studies identified 23 campaigns targeting Africa since 2014; 16 linked to Russia. The Digital Forensic Research Lab warns that the ‘political and social instability caused by influence operations’ has ramifications beyond countries’ borders.
As campaigning in South Africa ramps up, a video ‘deepfake’ of former United States president Donald Trump apparently claiming to support the new uMkhonto weSizwe party may already be a harbinger of what’s to come. Given AI advances, sophisticated ‘deepfakes’ using audio, video and text, distributed at scale and speed, may make it hard for mainstream media and other watchdogs to react timeously.
The European Union has just voted on landmark legislation seeking to control AI use, but some say it doesn’t go far enough. Legal regulations are probably unsuitable in an African context where many countries don’t have the ‘institutional strength’ to make them work.
There is also the risk that an overzealous state might target legitimate conversations on social media, as happened during Nigeria’s #EndSARS campaign. A partnership approach that promotes public digital literacy would probably be more practical than new laws.
Protecting South Africa and Africa against information manipulation should include building resilience in the mainstream media. In the short term, that means ensuring that traditional media don’t inadvertently amplify influence campaigns by being drawn into online echo chambers.
Fact-checking organisations such as Africa Check debunk disinformation at source and are building a coalition of fact checkers to identify disinformation early. In Kenya, early detection helped limit attempts at voter suppression. South Africa should do the same.
Engaging directly with social media platforms may also work. The South African National Editors’ Forum, Media Monitoring Africa and the electoral commission have jointly called on the major social media platforms to ‘co-create a conducive information environment in the upcoming elections.’ They acknowledge the existential threat disinformation can pose to democracies.
Part of their agreement is to create a complaints platform, Real411, to enable a swift response to online harms. The onus will be on social media platforms to proactively remove content and issue ‘advisory warnings’ when potential harms are identified.
To date, motivating the social media industry to understand the African context in which information operations thrive has been hard. With its geopolitical prominence and expanding tech marketplace, South Africa is well positioned to profile these concerns with Google, TikTok, Meta and other social media enterprises.
Written by Karen Allen, Consultant, ISS