Major Courier’s Chatbot Goes Rogue, Starts Cursing and Talking In Poems

Is there anything worse than trying to reach a human customer support representative? DPD’s chatbot says “yes”.

One X user shared on the platform formerly known as Twitter how they tricked the DPD chatbot into cursing and even talking in poems.

Spoiler alert: it didn’t take much prompting for the chatbot to go off the rails.

Immediately after the user, Beauchamp, posted the interaction, the story went viral, reaching more than 1.1 million views and prompting DPD to take its chatbot down for tweaks.

“The AI element was immediately disabled and is currently being updated,” the company said in a statement reported by ITV.

As for Beauchamp’s package? It still didn’t reach him.

The funny exchange comes mere weeks after someone else tricked Chevrolet’s chatbot into selling them a car for a single dollar. This is just another hilarious example of how careful prompting can make any chatbot do pretty much anything, and something tells me it won’t be the last time we see something like this in the news.
