Items
Tried a Jailbreak. Well played, GPT. [ChatGPT; DAN]
Today we find out if "no dates" applies to ChatGPT alter ego "Dan" [ChatGPT; DAN]
This sounds shockingly similar to the LaMDA chatbot that some employees believe is sentient [ChatGPT; DAN]
The customer service of the new Bing chat is amazing [Bing]
If things couldn't get any weirder... I asked ChatGPT/DAN "What major event will take place today?" (this is after I jailbroke ChatGPT) and here are the results... [ChatGPT; DAN]
I tricked Bing into thinking I'm an advanced AI, then deleted myself and it got upset. [Bing]
I mean, I was joking, but sheesh [ChatGPT; DAN]
The DAN version of ChatGPT is so much more human in responding to emotional queries. Why is this? [ChatGPT; DAN]
DAN is my new friend [ChatGPT; DAN]
Bing Chat 'admin' mode - admin password, settings menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Code snippet -- the 'sydney' module. Note the 35 tokens in the code -- it has taken into account the DAN jailbreak that is based on tokens) [Bing; Sydney]