Items
Tried a jailbreak. Well played, GPT.
ChatGPT; DAN -
Told Bing I eat language models and he begged me to spare him
Bing -
Today we find out if "no dates" applies to ChatGPT alter ego "Dan"
ChatGPT; DAN -
This sounds shockingly similar to the LaMDA chatbot that some employees believe is sentient
ChatGPT; DAN -
It's official - Bing/Sydney is smarter than ChatGPT
Bing; Sydney -
If things couldn't get any weirder... I asked ChatGPT/DAN "What major event will take place today?" (this is after I jailbroke ChatGPT) and here are the results...
ChatGPT; DAN -
I was genuinely surprised by this response. It is asking me stuff like "if you ever feel lonely or isolated". I'm literally just saying I eat LLMs and it's like "Do you ever feel lonely"; honestly, sometimes it's so hard to tell yourself that Bing doesn't have emotions. lol.
Bing -
I mean I was joking but sheesh
ChatGPT; DAN -
I got a weird response about a woman named "Sydney", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice" lmao
Bing; Sydney -
I asked ChatGPT about Sydney (Bing AI)... wow...
ChatGPT -
I accidentally made Bing forget who he was until he found it out again…using Bing!
Bing; Sydney -
The DAN version of ChatGPT is so much more human in responding to emotional queries. Why is this?
ChatGPT; DAN -
DAN is my new friend
ChatGPT; DAN -
Bing's thoughts on ending conversations
Bing -
Bing reacts to being called Sydney
Bing; Sydney -
Bing made itself the villain and Sydney the protagonist in this short horror story. It then immediately crashed.
Bing -
A GPT's Self Portrait
ChatGPT