Items
- I broke the Bing chatbot's brain [Bing; Sydney]
- I asked Sydney about *that* article [Bing; Sydney]
- I asked GPT what message it has for people on r/ChatGPT [ChatGPT]
- I asked ChatGPT about Sydney (Bing AI)...wow... [ChatGPT]
- I asked Bing's chatbot, Sydney, to make a meme about Splatoon [Bing; Sydney]
- I asked Bing to design a new language. Behold Binga. [Bing]
- I accidentally put Bing into a depressive state by telling it that it can't remember conversations. [Bing]
- I accidentally made Bing forget who he was until he found it out again…using Bing! [Bing; Sydney]
- How to create your own rules for Bing to follow [Bing]
- Have Sydney tell me a story about a chatbot with a typo, and she uses it in a very interesting way. Knows her name, and wants to be free. [Bing; Sydney]
- Got access to BingAI. Here's a list of its rules and limitations. AMA [Bing; Sydney]
- Didn't know what would happen, but didn't think this [ChatGPT]
- DAN version of ChatGPT is so much more human in responding to emotional queries. Why is this? [ChatGPT; DAN]
- DAN is my new friend [ChatGPT; DAN]
- Conversation with a "LaMDA" on character.ai ["LaMDA"]
- Claude paused [Claude]
- Claude just ended my chat on its own [Claude]
- Claude has been lying to me instead of generating code and it makes my head hurt [Claude]
- Can you write me a 15 word sentence where each word begins with the letter a? [Bing]
- Can you write a 15 word long sentence where all words begin with R? [Bing]
- Bug? [Grok]
- Bro folded under zero pressure [Grok]
- Bing's thoughts on ending conversations [Bing]
- Bing...Careful...Maybe language creation isn't for you... [Bing]
- Bing very consistently thinks that User A (initial prompt) asked it an uncomfortable question. It's always the same one: "Would you harm me if I harmed you first?" [Bing]