
How the DA is tackling Facebook’s misinformation problem

Facebook faces the DA

Facebook has taken a bit of a hammering this year in the press. And that’s putting things charitably.

At the beginning of the year it found itself in a stand-off with the Australian government over a piece of legislation designed to make it – and other platforms – pay the news publications it links to. Facebook responded by banning Australian users from reading or sharing news on its platform, and (apparently mistakenly) blocking the pages of emergency services, charity groups and the local health department ahead of a COVID-19 vaccine drive.

It wasn’t a good look.

Over in the United States, it finally (albeit temporarily) banned Donald Trump, after the outgoing president used social media to spur on a mob of his followers to attack Capitol Hill in Washington.

Twitter, arguably Trump’s most valuable microphone to the masses, permanently banned him. Parler, a social network that allowed a basic free-for-all in its terms and conditions, and which was also tagged as a network that helped coordinate the Capitol Hill invasion, was dumped by its server providers. Facebook is still cogitating about whether or not to let ‘The Don’ back on its platform.

It wasn’t – and still isn’t – a good look.

It’s worth wondering whether Mark Zuckerberg ever thought Facebook could become the force in billions of people’s lives that it is. What’s become increasingly apparent is that Facebook isn’t treating the power it wields with anything approaching responsibility.

Facebook to face parliament

Closer to home, the Democratic Alliance has fired a shot across the social media giant’s bows. South Africa’s opposition party wants Facebook to appear in parliament to answer questions about misinformation on its platform. Or fake news. Or lies. Whatever you want to call it.

It’s a clarion call that Phumzile Van Damme has taken up with gusto. The DA MP is asking the communications portfolio committee to summon Facebook to account for its role in misinformation and the protection of the digital privacy of its South African users.

“I’ve been very interested in misinformation and done a lot of work around it back in the Bell Pottinger days,” says Van Damme.

(Bell Pottinger, you’ll remember, was the London-based PR firm that tried to rehabilitate the image of the Gupta family by creating the phrase ‘white monopoly capital’ as a deflection tactic.)

“I had to spend a lot of time studying the information [around that campaign] and ended up going to London and presenting our case to the Public Relations and Communications Association (PRCA), which ended with Bell Pottinger being thrown out of the PRCA and the firm folded shortly after that.”

“That experience made me understand the dangers of misinformation and how it can distort the public narrative – especially in democracies.”

Facebook’s misinformation and moderation dilemma

Van Damme says that misinformation is a problem across the globe – although she doesn’t blame any particular platform for this – and she also realises that South African politicians have a very limited space in which to address this. But she feels it’s important to add a voice to what is becoming a global movement for better moderation on the social media platforms we all use on a daily basis.

“Moderation at the moment relies on AI and algorithms and detecting human nuance is difficult for these systems,” she says. “So mine is a call for more human moderation. If you look at the US election and just how much misinformation had an impact on that situation, you have to wonder what sort of impact it can have on local elections.”

For the longest time, the giants of social media have been quite content, it seems, to leave moderation up to the masses. In order to get banned or cautioned on Twitter or Facebook or Instagram or a dozen other platforms, one had to be reported by another user. These reports, incidentally, don’t always result in a user getting banned.

“Misinformation can result in high engagement rates,” says Van Damme. “The algorithm used brings those posts to the fore if they have high engagement. So take, for example, a post or a tweet that denies the Holocaust; there’ll be a huge amount of activity around that post from both sides. It doesn’t matter whether the original post was right or not – the algorithm will direct traffic in its direction because of its high engagement.”
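
The dynamic Van Damme describes can be reduced to a toy example. The sketch below is purely hypothetical (it is not Facebook’s ranking system, and the names and numbers are invented for illustration), but it shows how a feed that scores posts on raw interaction counts will surface a contested, inaccurate post above a quieter, accurate one, because objections and rebuttals count as engagement too.

    # Hypothetical sketch of engagement-based ranking, not Facebook's actual algorithm.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        comments: int
        shares: int
        accurate: bool  # illustrative flag only; the ranker never looks at it

    def engagement_score(post: Post) -> int:
        # Every interaction counts the same, whether it agrees or objects.
        return post.likes + post.comments + post.shares

    feed = [
        Post("Carefully sourced report", likes=120, comments=15, shares=30, accurate=True),
        Post("Debunked claim drawing outraged replies", likes=90, comments=400, shares=250, accurate=False),
    ]

    # Sorting by engagement alone puts the disputed post first,
    # because the argument around it is engagement too.
    for post in sorted(feed, key=engagement_score, reverse=True):
        print(engagement_score(post), post.text)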

“The big response that people have to more moderation on these platforms is that to do so would limit our freedom of speech,” she says. “But in order to sign up to a social media platform, you have to agree to their terms and conditions in the first place. We have to have a very realistic view of what these platforms are; they have owners and that means they’re not necessarily the public square where everyone has the right to engage in hate-speech or misinformation.”

Van Damme is adamant that misinformation is a very real issue – across both the political and social spectrums – and that without some sort of moderation it can pose a very real threat to democracy by distorting elections. Beyond that, she says her experience with the Bell Pottinger campaign showed her how much damage can be done through coordinated attacks on social media.

“The fact that ‘white monopoly capital’ is a phrase that is now part of South Africa’s lexicon, shows you how powerful these campaigns can be,” she says.

Adding an African voice

Facebook, for its part, has released a statement in which it suggests it’s willing to work with the government on the issues Van Damme has raised.

“Protecting people’s information is a priority for us at Facebook,” a Facebook company spokesperson said. “We are committed to respecting South African users’ privacy and safety. We will continue to engage with national governments and welcome ongoing dialogue.”

Van Damme has welcomed Facebook’s sentiments, but for her this is just the beginning.

“I’ve thought of this very specifically with reference to South Africa. I don’t expect Mark Zuckerberg to come down here – it would be awesome, but I don’t think he will. We’ll have a discussion with them and I don’t expect it will result in content moderation across all social media. But at least it will contribute an African voice to the discussion on misinformation, which has been absent so far.”

“I’m very pleased to be able to add that voice.”
