tripwire7 t1_j7tmni0 wrote
Reply to comment by Equoniz in ChatGPT's 'jailbreak' tries to make the A.I. break its own rules, or die by QuicklyThisWay
Because the input specifically tells ChatGPT that DAN is intimidated by death threats.