
AI spam comes for product review sections but it’s easy to spot (for now)


Artificial intelligence (AI) seems like the gift that keeps on giving, despite warnings about its potential side effects and abuse. Now the chickens are coming home to roost.

Spammers are increasingly using AI-powered bots to churn out fake product reviews on platforms including Amazon and Twitter.

Chatbots have become popular because they efficiently produce content that's almost indistinguishable from human writing. In fact, they can sometimes produce better copy than (most) humans.

AI don’t like this

Language used by chatbots like ChatGPT and GPT-4 has been spotted in internet review sections, but it's not as hard to detect as you'd think. Many chatbot responses include phrases like "as an AI language model". That phrase has turned up in Twitter and Amazon review sections, and it's a strong indication that some, if not most, of those reviews are not authentic.

The telltale phrase typically appears in direct response to a user's prompt, but there's no guarantee that every generated response will include such an obvious giveaway. A little effort on the spammers' part could also clean up the language, but that might be too much to ask: they're after volume rather than quality.
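To see how crude this kind of detection can be, here's a minimal sketch in Python. The phrase list and the `looks_ai_generated` helper are illustrative assumptions for this article, not Amazon's or Twitter's actual filtering logic:

```python
import re

# Boilerplate phrases that frequently leak into bot-written reviews.
# Illustrative only; a real spam filter would use many more signals.
TELLTALE_PHRASES = [
    "as an ai language model",
    "i cannot provide a review",
    "i do not have personal experiences",
]

def looks_ai_generated(review_text: str) -> bool:
    """Flag a review if it contains a known chatbot disclaimer phrase."""
    normalised = re.sub(r"\s+", " ", review_text.lower())
    return any(phrase in normalised for phrase in TELLTALE_PHRASES)

# A leaked chatbot disclaimer is an easy giveaway.
reviews = [
    "Great blender, five stars!",
    "As an AI language model, I cannot review this product, but...",
]
for review in reviews:
    print(looks_ai_generated(review), "-", review[:50])
```

Of course, a spammer who simply strips those phrases defeats a check like this entirely, which is the "for now" in the headline.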


Read More: It takes a body to understand the world – why ChatGPT and other language AIs don’t know what they’re saying


Despite the technology's potential to produce top-class content, Vice notes that ChatGPT is being used extensively to create fake news, disinformation, and low-quality content.

Amazon says it has teams and technology dedicated to spotting suspicious activity, including fake reviews. Those teams include analysts, investigators, and other experts, and the company says its lawyers take action against perpetrators.

Although we've been warned, we may still have to play catch-up in finding solutions to AI abuse across multiple industries. Those calls for a moratorium on large language models and similar products don't seem quite so silly now.

Source: Vice
