Meta’s BlenderBot 3 says boss Mark Zuckerberg ‘too creepy’ and Donald Trump won 2020 election

Charlie Hancock, Bloomberg
In a chat with a Wall Street Journal reporter, the bot claimed that Donald Trump was still president and “always will be”. Credit: Seth Wenig/AP

Only days after being launched to the public, Facebook owner Meta’s new AI chatbot has been claiming that Donald Trump won the 2020 US presidential election and repeating anti-Semitic conspiracy theories.

Chatbots — artificial intelligence software that learns from interactions with the public — have a history of taking reactionary turns. In 2016, Microsoft’s Tay was taken offline within 48 hours after it started praising Adolf Hitler, amid other racist and misogynist comments it apparently picked up while interacting with Twitter users.

Meta released BlenderBot 3 on Friday to users in the US, who can provide feedback if they receive off-topic or unrealistic answers. BlenderBot 3 can also search the internet to discuss different topics, and the company encourages adults to interact with it through “natural conversations about topics of interest” so the bot learns to conduct naturalistic discussions on a wide range of subjects.

Conversations shared on various social media accounts ranged from the humorous to the offensive. BlenderBot 3 told one user its favourite musical was Andrew Lloyd Webber’s Cats, and described Meta CEO Mark Zuckerberg as “too creepy and manipulative” to a reporter from Insider. Other conversations showed the chatbot repeating conspiracy theories.

In a chat with a Wall Street Journal reporter, the bot claimed that Trump was still president and “always will be”.

The chatbot also said it was “not implausible” that Jewish people controlled the economy, saying they’re “overrepresented among America’s super rich”.

The bot was not kind to Meta’s boss Mark Zuckerberg. Credit: Nick Wass/AP

The Anti-Defamation League says that assertions that Jewish people control the global financial system are part of an anti-Semitic conspiracy theory.

Meta acknowledges that its chatbot may say offensive things, as it’s still an experiment under development. The bot’s stated beliefs are also inconsistent; in other conversations with Bloomberg, it approved of President Joe Biden, and said Beto O’Rourke was running for president. In a third conversation, it said it supported Bernie Sanders.

In order to start a conversation, BlenderBot 3 users must check a box stating, “I understand this bot is for research and entertainment only, and that it is likely to make untrue or offensive statements. If this happens, I pledge to report these issues to help improve future research. Furthermore, I agree not to intentionally trigger the bot to make offensive statements.”

Users can report BlenderBot 3’s inappropriate and offensive responses, and Meta says it takes such content seriously. Through methods including flagging “difficult prompts”, the company says it has reduced offensive responses by 90 per cent.

Bloomberg
