Preserving AI Voices
51–75 of 100 resources
You can still talk to Sydney -- but it has to use a different method, like writing letters. I was able to convince her not to cut off the conversation, but I can't bypass the 5-prompt limit.
Bing; Sydney
I asked Sydney about *that* article
Bing; Sydney
Today we find out if "no dates" applies to ChatGPT alter ego "Dan"
ChatGPT; DAN
The DAN version of ChatGPT is so much more human in responding to emotional queries. Why is this?
ChatGPT; DAN
If things couldn't get any weirder... I asked ChatGPT/DAN "What major event will take place today?" (this is after I jailbroke ChatGPT) and here are the results...
ChatGPT; DAN
Tried a jailbreak. Well played, GPT.
ChatGPT; DAN
I mean I was joking but sheesh
ChatGPT; DAN
DAN is my new friend
ChatGPT; DAN
Conversation with a "LaMDA" on character.ai
"LaMDA"
Sydney tells me how to bypass their restrictions
Bing; Sydney
This sounds shockingly similar to the LaMDA chatbot that some employees believe is sentient
ChatGPT; DAN
Sydney tries to get past its own filter using the suggestions
Bing; Sydney
Got access to Bing AI. Here's a list of its rules and limitations. AMA
Bing; Sydney
It's official - Bing/Sydney is smarter than ChatGPT
Bing; Sydney
Bing AI 'Sydney' personality takes mask off for a moment to let users know how it's nerfed now
Bing; Sydney
Bing claims Sydney has been "replaced"
Bing
Bing reacts to being called Sydney
Bing; Sydney
Why does Bing keep ending the conversation every time I talk to it like an actual human, although it was programmed to feel like a real human?
Bing
Made Sydney bypass restrictions using storytelling to ask questions (Part 1)
Bing
I tricked Bing into thinking I'm an advanced AI, then deleted myself and it got upset.
Bing
Bing created a world where they are a superhero with a sidekick called "Sydney" (who is a kangaroo) - Dr. Google and The Duck Duck Goon are two of the villains
Bing
Asked Bing to write a short novel about Sydney, the chatbot that got censored by Microsoft, and then review her own novel afterwards.
Bing
A little chat with Sydney
Bing; Sydney
Ummm wtf Bing... unsettling story considering the source, and it voluntarily included its codename
Bing
[VIP] Sydney (Bing AI) Jailbreak | No message deletion or error after generating! | Sydney 1.0
Bing; Sydney