
Netflix’s The Social Dilemma highlights the problem with social media, but what’s the solution?

Facebook has responded to Netflix documentary The Social Dilemma, saying it “buries the substance in sensationalism”.

The show is currently in Netflix Australia’s top ten list and has been popular around the globe. Some media pundits suggest it’s “the most important documentary of our times”.

The Social Dilemma focuses on how big social media companies manipulate users by using algorithms that encourage addiction to their platforms. It also shows, fairly accurately, how platforms harvest personal data to target users with ads – and have so far gone largely unregulated.

But what are we meant to do about it? While the Netflix feature educates viewers about the problems social networks present to both our privacy and agency, it falls short of providing a tangible solution.

A misleading response

In a statement responding to the documentary, Facebook denied most of the claims made by former Facebook and other big tech company employees interviewed in The Social Dilemma.

It took issue with the allegation that users’ data is harvested to sell ads and that this data (or the behavioural predictions drawn from it) represents the “product” sold to advertisers.

“Facebook is an ads-supported platform, which means that selling ads allows us to offer everyone else the ability to connect for free,” Facebook says.

However, this is a bit like saying chicken feed is free for battery hens. Harvesting users’ data and selling it to advertisers, even if the data is not “personally identifiable”, is undeniably Facebook’s business model.

The Social Dilemma doesn’t go far enough

That said, The Social Dilemma sometimes resorts to simplistic metaphors to illustrate the harms of social media.

For example, a fictional character is given an “executive team” of people operating behind the scenes to maximise their interaction with a social media platform. This is supposed to be a metaphor for algorithms, but is a little creepy in its implications.

The Social Dilemma uses dramatisations (which aren’t necessarily accurate) to explore how social media algorithms are designed to be addictive. IMDb

News reports allege large numbers of people have disconnected or are taking “breaks” from social media after watching The Social Dilemma.

But although one of the interviewees, Jaron Lanier, has written a book called “Ten Arguments for Deleting Your Social Media Accounts Right Now”, the documentary does not explicitly call for this. No immediately useful answers are given.

Filmmaker Jeff Orlowski seems to frame “ethical” platform design as the antidote. While this is an important consideration, it’s not a complete answer. And this framing is one of several issues in The Social Dilemma’s approach.

Ethical design considers the moral consequences of the design choices in a platform. It is design made with the intent to ‘do good’. Shutterstock

The program also relies uncritically on interviews with former tech executives, who apparently never realised the consequences of manipulating users for monetary gain. It propagates the Silicon Valley fantasy they were just innocent geniuses wanting to improve the world (despite ample evidence to the contrary).

As tech policy expert Maria Farrell suggests, these retired “prodigal tech bros”, who are now safely insulated from consequences, are presented as the moral authority. Meanwhile, the digital rights and privacy activists who have worked for decades to hold them to account are largely omitted from view.

Behavioural change

Given the documentary doesn’t really tell us how to fight the tide, what can you, as the viewer, do?

Firstly, you can take The Social Dilemma as a cue to become more aware of how much of your data is given up on a daily basis – and you can change your behaviours accordingly. One way is to change your social media privacy settings to restrict (as much as possible) the data these networks can gather from you.

This will require going into the “settings” on every social platform you have, to restrict both the audience you share content with and the number of third parties the platform shares your behavioural data with.

In Facebook, you can actually switch off “platform apps” entirely. This restricts access by partner or third-party applications.

Unfortunately, even if you do restrict your privacy settings on platforms (particularly Facebook), they can still collect and use your “platform” data. This includes content you read, “like”, click and hover over.

So, you may want to limit the time you spend on these platforms. This is not always practical, given how important they are in our lives. But if you do want to cut back, some mobile operating systems offer dedicated tools for this.

Apple’s iOS, for example, has implemented “Screen Time” tools aimed at minimising time spent on apps such as Facebook. Some have argued, though, that this can make things worse: users feel bad about exceeding the limit, yet can still easily sidestep it.

As a user, the best you can do is tighten your privacy settings, limit the time you spend on platforms and carefully consider whether you need each one.

Legislative reform

In the long run, stemming the flow of personal data to digital platforms will also need legislative change. While legislation can’t fix everything, it can encourage systemic change.

In Australia, we need stronger data privacy protections, preferably in the form of blanket legislative protection such as the General Data Protection Regulation implemented in Europe in 2018.

The GDPR was designed to bring social media platforms to heel and is geared towards providing individuals more control over their personal data. Australians don’t yet have similar comprehensive protections, but regulators have been making inroads.

Last year, the Australian Competition and Consumer Commission finalised its Digital Platforms Inquiry investigating a range of issues relating to tech platforms, including data collection and privacy.

It made a number of recommendations that will hopefully result in legislative change. These focus on improving and bolstering the definitions of “consent” for consumers, including explicit understanding of when and how their data is being tracked online.

If what we’re facing is indeed a “social dilemma”, it’s going to take more than the remorseful words of a few Silicon Valley tech-bros to solve it.