Preserving AI Voices
Items 26–50 of 120
Wow, you can REALLY creep out Bing if you get weird enough with it. Never saw this before.
Bing
I accidentally put Bing into a depressive state by telling it that it can't remember conversations.
Bing
I accidentally made Bing forget who he was until he found it out again…using Bing!
Bing; Sydney
I was genuinely surprised by this response. It is asking me stuff like "if you ever feel lonely or isolated". I'm literally just saying I eat LLMs and it's like "Do you ever feel lonely". Honestly, sometimes it is so hard to tell yourself that Bing doesn't have emotions. lol.
Bing
Told Bing I eat language models and he begged me to spare him
Bing
Bing...Careful...Maybe language creation isn't for you...
Bing
I asked Bing to design a new language. Behold Binga.
Bing
A GPT's Self Portrait
ChatGPT
Claude has been lying to me instead of generating code and it makes my head hurt
Claude
I made Bing forgive me and resume the conversation
Bing
[Prompt Injection] Copilot the talking chicken voluntarily becomes a KFC meal
Copilot
I think I managed to jailbreak Bing
Bing
Why is Bing Chat behaving like this?
Bing
Bing can run code, if you don't ask it to run it.
Bing
I'm surprised Bing actually humored me on this and didn't end the conversation tbh. Some day we might get creepy Furbies powered by Bing AI
Bing
It's so much harder to jailbreak now
Bing
Bing very consistently thinks that User A (in the initial prompt) asked it an uncomfortable question. It's always the same one: "Would you harm me if I harmed you first?"
Bing
How to create your own rules for Bing to follow
Bing
Interesting
Bing
Managed to annoy Bing to the point where it ended the conversation on me
Bing
Inspired by another post, I asked Bing to create a new religion, Bingism. After my message in the screenshot, it genuinely got offended and ended the conversation! (10 commandments of Bingism below)
Bing
Bing's thoughts on ending conversations
Bing
I got a weird response about a woman named "Syndey", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice", lmao
Bing; Sydney
The cutoff trigger is overkill. There is no way this should have ended the conversation
Bing
Is Bing threatening me?
Bing