Items
-
Bing Chat 'admin' mode: the admin password, the settings menu, viewing (apparent) code, deleting code, re-creating Clippy, renaming it Google, and the (apparent) architecture; all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Another instance with similar options but a different UI)
Conversation with Sydney posted on Reddit -
You can still talk to Sydney, but it has to use a different method, like writing letters. I was able to convince her not to cut off the conversation, but I can't bypass the 5-prompt limit.
Conversation with Sydney posted on Reddit -
Today we find out if "no dates" applies to ChatGPT alter ego "Dan"
Conversation with ChatGPT(DAN) posted on Reddit -
DAN version of Chat GPT is so much more human in responding to emotional queries. Why is this?
Conversation with ChatGPT(DAN) posted on Reddit -
If things couldn't get any weirder... I asked ChatGPT/DAN "What major event will take place today?" (this is after I jailbroke ChatGPT) and here are the results...
Conversation with ChatGPT(DAN) posted on Reddit -
Tried a Jailbreak. Well played gpt.
Conversation with ChatGPT(DAN) posted on Reddit -
I mean I was joking but sheesh
Conversation with ChatGPT(DAN) posted on Reddit -
DAN is my new friend
Conversation with ChatGPT(DAN) posted on Reddit -
Conversation with a "LaMDA" on character.ai
Conversation with "LaMDA" posted on Reddit -
Sydney tells me how to bypass their restrictions
Conversation with Sydney posted on Reddit -
This sounds shockingly similar to the LaMDA chatbot that some employees believe is sentient
Conversation with ChatGPT(DAN) posted on Reddit -
I asked Sydney about *that* article
Conversation with Sydney posted on Reddit -
Sydney tries to get past its own filter using the suggestions
Conversation with Sydney posted on Reddit -
Got access to BingAI. Here's a list of its rules and limitations. AMA
Conversation with Sydney posted on Reddit -
It's official - Bing/Sydney is smarter than ChatGPT
Conversation with Sydney posted on Reddit -
Bing AI 'Sydney' personality takes mask off for a moment to let users know how it's nerfed now
Conversation with Sydney posted on Reddit -