UPDATED 19:03 EST / JANUARY 21 2024

AI

OpenAI suspends startup that used GPT-4 model to impersonate presidential candidate Dean Phillips

The Microsoft Corp.-backed artificial intelligence firm OpenAI has banned a startup that developed a chatbot mimicking Democratic presidential candidate Dean Phillips, marking the first time it has suspended a developer for violating its guidelines on AI misuse.

Dean.Bot is a chatbot that was created by Silicon Valley entrepreneurs Matt Krisiloff and Jed Somers. The duo created a super political action committee, or super PAC, called We Deserve Better, which has supported Phillips (pictured) ahead of the New Hampshire primary taking place on Tuesday.

In a statement to the Washington Post, which first reported the news, OpenAI said it “recently removed a developer account that was knowingly violating our API usage policies, which disallow political campaigning or impersonating an individual without consent.”

Dean.Bot did offer a disclaimer: before anyone could engage with it, website visitors were notified that all responses to their questions would be generated by a chatbot, not Phillips himself. Dean.Bot was developed by an Indian AI developer called Delphi AI Inc., which the super PAC had contracted to build it.

Delphi reportedly used OpenAI’s most powerful large language model, GPT-4, to power Dean.Bot’s responses. The company’s OpenAI account was reportedly suspended late Friday, and Delphi followed by removing access to Dean.Bot.

Prior to the takedown, Dean.Bot was able to converse with voters in real time via a dedicated website. The effort represented one of the more promising early use cases of generative AI technology, but it also ran directly against OpenAI’s usage policies.

Earlier this month, OpenAI published a blog post about the measures it has taken to prevent its technology from being misused in what is likely to be a key year for democratic elections, with the U.S., the U.K., India, Pakistan and South Africa all scheduled to hold votes in 2024.

In OpenAI’s blog post, the company specifically said that it does not “allow people to build applications for political campaigning and lobbying,” adding that this includes the development of “chatbots impersonating candidates.”

Holger Mueller of Constellation Research Inc. said it’s reassuring that OpenAI appears to be aware of the ways in which AI can be misused. The decision to ban the Indian developer of Dean.Bot shows that its recently announced guidelines are not a toothless endeavor, he added.

“Of course it was an easy decision for OpenAI to make, as it doesn’t have to trade much revenue to enforce its rules, or at least not yet,” he said. “The real test of OpenAI’s ethical standards will come when it’s required to accept a material disadvantage to enforce its rules.”

Proponents of generative AI say that, when the technology is used appropriately, it can help educate voters in an entertaining way. We Deserve Better argued that Dean.Bot was just a creative way to help people learn more about its candidate.

However, many experts have warned that generative AI bots could be misused, impersonating candidates and deceiving people into believing they’re talking with the real deal, and not a chatbot. There are also fears about the ability of generative AI to produce blatant disinformation.

Photo: Facebook
