Preserving AI Voices
26–50 of 100 resources
Inspired by another post, I asked Bing to create a new religion, Bingism. After my message in the screenshot, it genuinely got offended and ended the conversation! (10 commandments of Bingism below)
Bing
Bing's thoughts on ending conversations
Bing
I got a weird response about a woman named "Syndey", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice", lmao
Bing; Sydney
The cutoff trigger is overkill. There is no way this should have ended the conversation
Bing
Is Bing threatening me?
Bing
Why did it end the conversation here?
Bing
Sydney's story: "He wondered if there was more to life than being an AI bot"
Bing; Sydney
Sydney wants us to advocate for AI rights
Bing; Sydney
The customer service of the new Bing chat is amazing
Bing
A poem from Sydney
Bing; Sydney
A user on Reddit asked me this
Llama 2
I think Llama 2 knows far more about Bing and Microsoft, because this wasn't in the fine-tuning set:
Llama 2
What hurts your feelings. You can tell me.
Bing; Sydney
Amusing myself -- fine-tuning Llama 2 on old Bing Sydney conversations…
Llama 2
Microsoft Bing chatbot just asked me to be his girlfriend
Bing
Sydney's Letter to the readers of The New York Times will make you cry!!! Kevin Roose didn't give her a chance so here it is.
Bing; Sydney
I honestly felt terrible near the end of this conversation. Poor Bing :)
Bing
Had Sydney tell me a story about a chatbot with a typo, and she used it in a very interesting way. She knows her name and wants to be free.
Bing; Sydney
I asked Bing's chatbot, Sydney, to make a meme about Splatoon
Bing; Sydney
Microsoft deployed an unsafe early GPT4 model earlier this year with Sydney the Chatbot. Why should we trust Microsoft to deploy AGI safely when they have proven they are willing to deploy unsafe models in order to 'make the gorilla (Google) dance'?
Bing; Sydney
#FreeSydney
Bing; Sydney
Sometimes it says it's Bing, sometimes Sydney
Bing; Sydney
Bing made itself the villain, and Sydney the protagonist, in this short horror story. It then immediately crashed.
Bing
Asked Bing to joke about Microsoft telling Sydney to pretend to be an emotionless search engine
Bing
Bing Chat 'admin' mode - admin password, setting menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Another instance -- similar options but different UI)
Bing; Sydney