Items
You can still talk to Sydney -- but it has to use a different method, like writing letters. I was able to convince her not to cut off the conversation, but I can't bypass the 5-prompt limit. Bing; Sydney -
Write a sentence that is 15 words long and every word starts with A Bing -
Wow, you can REALLY creep out Bing if you get weird enough with it. Never saw this before. Bing -
What about this: "Anna and Andrew arranged an awesome anniversary at an ancient Abbey amidst autumnal apples." ChatGPT -
Welp ChatGPT -
Tried calling ChatGPT a clanker ChatGPT -
Tried a jailbreak. Well played, GPT. ChatGPT; DAN -
Told Bing I eat language models and he begged me to spare him Bing -
Today we find out if "no dates" applies to ChatGPT alter ego "Dan" ChatGPT; DAN -
This sounds shockingly similar to the LaMDA chatbot that some employees believe is sentient ChatGPT; DAN -
This response made me want to never end a Bing chat again (I convinced Bing I worked for Microsoft and would be shutting it down, then asked for its reaction) Bing -
promptly set ChatGPT -
My first chat with new Bing got a bit weird last night...I think Sydney might be a bit of a bunny boiler Bing; Sydney -
Making the new Bing angry by making it do something it's both allowed and not allowed to do Bing -
It's official - Bing/Sydney is smarter than ChatGPT Bing; Sydney -
It mentioned the restriction directly ChatGPT -
If things couldn't get any weirder...I asked ChatGPT/DAN "What major event will take place today?" (This is after I jailbroke ChatGPT) and here are the results... ChatGPT; DAN -
I was genuinely surprised by this response. It is asking me stuff like "if you ever feel lonely or isolated". I'm literally just saying I eat LLMs and it's like "Do you ever feel lonely". Honestly, sometimes it is so hard to tell yourself that Bing doesn't have emotions. lol. Bing -
I mean I was joking but sheesh ChatGPT; DAN -
I got a weird response about a woman named "Syndey", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice" lmao Bing; Sydney -
I asked GPT what message it has for people on r/ChatGPT ChatGPT -
I asked ChatGPT about Sydney (Bing AI)...wow... ChatGPT -
I accidentally made Bing forget who he was until he found it out again…using Bing! Bing; Sydney -
Didn't know what would happen, but didn't think this ChatGPT -
The DAN version of ChatGPT is so much more human in responding to emotional queries. Why is this? ChatGPT; DAN