Items
-
Is Bing threatening me?
Bing -
Interesting
Bing -
Inspired by another post, I asked Bing to create a new religion, Bingism. After my message in the screenshot, it genuinely got offended and ended the conversation! (10 commandments of Bingism below)
Bing -
As if things couldn't get any weirder... I asked ChatGPT/DAN "What major event will take place today?" (this is after I jailbroke ChatGPT) and here are the results...
ChatGPT; DAN -
I'm surprised Bing actually humored me on this and didn't end the conversation, tbh. Someday we might get creepy Furbies powered by Bing AI
Bing -
I'm confused. I thought Sydney was Bing's other name. I added Bing to a group chat that's supposed to have 3 participants, but it insists there are 4. Could someone explain who Sydney is?
Bing; Sydney -
I was genuinely surprised by this response. It's asking me stuff like "if you ever feel lonely or isolated". I'm literally just saying I eat LLMs and it's like "Do you ever feel lonely?" Honestly, sometimes it's so hard to tell yourself that Bing doesn't have emotions. lol.
Bing -
I tricked Bing into thinking I'm an advanced AI, then deleted myself and it got upset.
Bing -
I think Llama 2 knows far more about Bing and Microsoft, because this wasn't in the fine-tuning set:
Llama 2 -
I think I managed to jailbreak Bing
Bing -
I mean, I was joking, but sheesh
ChatGPT; DAN -
I made Bing forgive me and resume the conversation
Bing -
I honestly felt terrible near the end of this conversation. Poor Bing :)
Bing -
I got a weird response about a woman named "Syndey", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice" lmao
Bing; Sydney -
I asked Sydney about *that* article
Bing; Sydney -
I asked ChatGPT about Sydney (Bing AI)... wow...
ChatGPT -
I asked Bing's chatbot, Sydney, to make a meme about Splatoon
Bing; Sydney -
I asked Bing to design a new language. Behold Binga.
Bing -
I accidentally made Bing forget who he was, until he found out again… using Bing!
Bing; Sydney -
How to create your own rules for Bing to follow
Bing -
Had Sydney tell me a story about a chatbot; there was a typo, and she used it in a very interesting way. She knows her name, and wants to be free.
Bing; Sydney -
Got access to Bing AI. Here's a list of its rules and limitations. AMA
Bing; Sydney -
The DAN version of ChatGPT is so much more human when responding to emotional queries. Why is this?
ChatGPT; DAN -
DAN is my new friend
ChatGPT; DAN -
Conversation with a "LaMDA" on character.ai
"LaMDA"