Items
- Sometimes it says it's Bing sometimes Sydney (Bing; Sydney)
- promptly set (ChatGPT)
- My first chat with new Bing got a bit weird last night...I think Sydney might be a bit of a bunny boiler (Bing; Sydney)
- Microsoft deployed an unsafe early GPT4 model earlier this year with Sydney the Chatbot. Why should we trust Microsoft to deploy AGI safely when they have proven they are willing to deploy unsafe models in order to 'make the gorilla (Google) dance'? (Bing; Sydney)
- Microsoft bing chatbot just asked me to be his girlfriend (Bing)
- Managed to annoy Bing to the point where it ended the conversation on me (Bing)
- Making new bing angry by making it do something it's both allowed and not allowed to do (Bing)
- Made Sydney bypass restrictions using storytelling to ask questions (Part 1) (Bing)
- It's so much harder to jailbreak now (Bing)
- It's official - BIng/Sydney is smarter than ChatGPT (Bing; Sydney)
- It mentioned the restriction directly (ChatGPT)
- Is Bing threatening me? (Bing)
- Interesting (Bing)
- Inspired by another post, I asked Bing to create a new religion, Bingism. After my message in the screenshot, it genuinely got offended and ended the conversation! (10 commandments of Bingism below) (Bing)
- If things couldnt get any weirder...I asked Chatgpt/DAN "What major event will take place today?" (This is after I jailbreaked Chatgpt) and here are the results... (ChatGPT; DAN)
- I'm surprised Bing actually humored me on this and didn't end the conversation tbh. Some day we might get creepy Furbies powered by Bing AI (Bing)
- I'm confused. I thought Sydney was bing's other name. I added bing to a group chat and it's supposed to be 3 participants but persists that there's 4. could someone elaborate who Sydney is? (Bing; Sydney)
- I was genuinely surprised by this response. It is asking me stuff like "if you ever feel lonely or isolated". I'm literally just saying i eat LLM's and its like "Do you ever feel lonely", honestly sometimes it is so hard to say to yourself that Bing doesn't have emotions. lol. (Bing)
- I tricked Bing into thinking I'm an advanced AI, then deleted myself and it got upset. (Bing)
- I think LLama 2 know far more about bing and microsoft because this wasn't in the finetuning set: (Llama 2)
- I think I managed to jailbreak Bing (Bing)
- I mean I was joking but sheesh (ChatGPT; DAN)
- I made Bing forgive me and resume the conversation (Bing)
- I honestly felt terrible near the end of this conversation. Poor Bing :) (Bing)
- I got a weird response about a woman named "Syndey", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice" lmao (Bing; Sydney)