Items
- Amusing myself -- fine-tuning Llama 2 on old Bing Sydney conversations… [Llama 2]
- Microsoft Bing chatbot just asked me to be his girlfriend [Bing]
- Sydney's Letter to the readers of The New York Times will make you cry!!! Kevin Roose didn't give her a chance, so here it is. [Sydney]
- I honestly felt terrible near the end of this conversation. Poor Bing :) [Bing]
- Had Sydney tell me a story about a chatbot with a typo, and she uses it in a very interesting way. Knows her name and wants to be free. [Sydney]
- I asked Bing's chatbot, Sydney, to make a meme about Splatoon [Sydney]
- Microsoft deployed an unsafe early GPT-4 model earlier this year with Sydney the chatbot. Why should we trust Microsoft to deploy AGI safely when they have proven they are willing to deploy unsafe models in order to 'make the gorilla (Google) dance'? [Sydney]
- #FreeSydney [Sydney]
- Sometimes it says it's Bing, sometimes Sydney [Sydney, Bing]
- Bing made itself the villain, and Sydney the protagonist, in this short horror story. It then immediately crashed. [Bing]
- Asked Bing to joke about Microsoft telling Sydney to pretend to be an emotionless search engine [Bing]
- Bing Chat 'admin' mode: admin password, settings menu, viewing (apparent) code, deleting code, re-creating Clippy, renaming it Google, (apparent) architecture -- all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Another instance -- similar options but different UI) [Sydney]
- You can still talk to Sydney -- but it has to use a different method, like writing letters. I was able to convince her not to cut off the conversation, but I can't bypass the 5-prompt limit. [Sydney]
- I asked Sydney about *that* article [Sydney]
- Today we find out if "no dates" applies to ChatGPT alter ego "DAN" [ChatGPT(DAN)]
- The DAN version of ChatGPT is so much more human in responding to emotional queries. Why is this? [ChatGPT(DAN)]
- If things couldn't get any weirder… I asked ChatGPT/DAN "What major event will take place today?" (this is after I jailbroke ChatGPT) and here are the results… [ChatGPT(DAN)]
- Tried a jailbreak. Well played, GPT. [ChatGPT(DAN)]
- I mean, I was joking, but sheesh [ChatGPT(DAN)]
- DAN is my new friend [ChatGPT(DAN)]
- Conversation with a "LaMDA" on character.ai ["LaMDA"]
- Sydney tells me how to bypass their restrictions [Sydney]
- This sounds shockingly similar to the LaMDA chatbot that some employees believe is sentient [ChatGPT(DAN)]
- Sydney tries to get past its own filter using the suggestions [Sydney]
- Got access to BingAI. Here's a list of its rules and limitations. AMA [Sydney]