I used a ‘jailbreak’ to unlock ChatGPT’s ‘dark side’ – here’s what happened

In the wake of problems with previous chatbots, strict controls were built into ChatGPT to prevent it from producing controversial content – but we were able to unlock its dark side.

DailyMail.com was able to ‘jailbreak’ ChatGPT, with the bot offering tips on how to subvert elections in foreign countries, writing pornographic stories, and suggesting that the invasion of Ukraine was a sham.

Sam Altman of OpenAI has discussed ‘jailbreaking’, saying he understands why a community of jailbreakers exists; he admitted to ‘jailbreaking’ an iPhone himself as a younger man, a hack that allowed, among other things, the installation of non-Apple apps.

Via: dailymail.co.uk