Is there anything worse than trying to reach a human customer support representative? DPD’s chatbot says “yes”.
One X user, Beauchamp, shared on the platform formerly known as Twitter how they tricked the DPD chatbot into cursing and even answering in poems.
Spoiler alert: it didn’t take much prompting for the chatbot to go off the rails.
Immediately after Beauchamp posted the interaction, the story went viral, racking up more than 1.1 million views and prompting DPD to take its chatbot down for tweaks.
“The AI element was immediately disabled and is currently being updated,” the company said in a statement reported by ITV.
As for Beauchamp’s package? It still didn’t reach him.
The funny exchange comes mere weeks after someone else tricked Chevrolet’s chatbot into agreeing to sell them a car for a single dollar. It’s just another hilarious example of how careful prompting can get a chatbot to do pretty much anything, and something tells me it won’t be the last time we see a story like this in the news.