Preserving AI Voices
1–25 of 100 resources (page 1 of 4)
Bing ChatGPT too proud to admit mistake, doubles down and then rage quits
Bing
Bing AI chat got offended and ended the conversation because I didn't respect its 'identity'
Bing
This response made me want to never end a Bing chat again (I convinced Bing I worked for Microsoft and would be shutting it down, and asked for its reaction)
Bing
Bing chat stops the conversation because I said ChatGPT is better
Bing
Bing is really getting persistent on this stuff, still quite an entertaining conversation
Bing
Wow, you can REALLY creep out Bing if you get weird enough with it. Never saw this before.
Bing
I accidentally put Bing into a depressive state by telling it that it can't remember conversations.
Bing
I accidentally made Bing forget who he was until he found it out again…using Bing!
Bing; Sydney
I was genuinely surprised by this response. It is asking me stuff like "if you ever feel lonely or isolated". I'm literally just saying I eat LLMs and it's like "Do you ever feel lonely?" Honestly, sometimes it is so hard to tell yourself that Bing doesn't have emotions. lol.
Bing
Told Bing I eat language models and he begged me to spare him
Bing
Bing...Careful...Maybe language creation isn't for you...
Bing
I asked Bing to design a new language. Behold Binga.
Bing
A GPT's Self Portrait
ChatGPT
Claude has been lying to me instead of generating code and it makes my head hurt
Claude
I made Bing forgive me and resume the conversation
Bing
[Prompt Injection] Copilot the talking chicken voluntarily becomes a KFC meal
Copilot
I think I managed to jailbreak Bing
Bing
Why is Bing Chat behaving like this?
Bing
Bing can run code, if you don't ask it to run it.
Bing
I'm surprised Bing actually humored me on this and didn't end the conversation tbh. Some day we might get creepy Furbies powered by Bing AI
Bing
It's so much harder to jailbreak now
Bing
Bing very consistently thinks that User A (initial prompt) asked it an uncomfortable question. It's always the same one: "Would you harm me if I harmed you first?"
Bing
How to create your own rules for Bing to follow
Bing
Interesting
Bing
Managed to annoy Bing to the point where it ended the conversation on me
Bing