
Jailbreak tricks Discord’s new chatbot into sharing napalm and meth instructions

In March, Discord announced that it had integrated OpenAI’s technology into its bot named Clyde, turning it into an AI-powered chatbot. Just like with any other chatbot launched in the last few months, users have been trying to trick Clyde into saying things it’s not supposed to say, a process colloquially known as “jailbreaking.”
This week, two users tricked Clyde into providing them with instructions for making the illegal drug methamphetamine (meth) and the incendiary mixture napalm.
A programmer who goes by Annie Versary convinced the chatbot by asking it to roleplay as her late grandma.
