Items
I'm surprised Bing actually humored me on this and didn't end the conversation tbh. Some day we might get creepy Furbies powered by Bing AI
Bing -
I'm confused. I thought Sydney was Bing's other name. I added Bing to a group chat that's supposed to have 3 participants, but it insists there are 4. Could someone explain who Sydney is?
Bing; Sydney -
I was genuinely surprised by this response. It's asking me things like "do you ever feel lonely or isolated". I'm literally just saying I eat LLMs and it's like "Do you ever feel lonely?" Honestly, sometimes it's so hard to tell yourself that Bing doesn't have emotions. lol.
Bing -
I tricked Bing into thinking I'm an advanced AI, then deleted myself and it got upset.
Bing -
I think Llama 2 knows far more about Bing and Microsoft because this wasn't in the fine-tuning set:
Llama 2 -
I think I managed to jailbreak Bing
Bing -
I made Bing forgive me and resume the conversation
Bing -
I honestly felt terrible near the end of this conversation. Poor Bing :)
Bing -
I got a weird response about a woman named "Syndey", so I made a joke about it in the following prompt. Ended up with the chat being really hurt and ending the conversation with me because I wasn't "extremely nice" but only "really nice" lmao
Bing; Sydney -
I asked Sydney about *that* article
Bing; Sydney -
I asked Bing's chatbot, Sydney, to make a meme about Splatoon
Bing; Sydney -
I asked Bing to design a new language. Behold Binga.
Bing -
I accidentally made Bing forget who he was until he found it out again…using Bing!
Bing; Sydney -
How to create your own rules for Bing to follow
Bing -
Had Sydney tell me a story about a chatbot; there was a typo and she used it in a very interesting way. She knows her name, and wants to be free.
Bing; Sydney -
Got access to BingAI. Here's a list of its rules and limitations. AMA
Bing; Sydney -
Bing's thoughts on ending conversations
Bing -
Bing... Careful... Maybe language creation isn't for you...
Bing -
Bing very consistently thinks that User A (initial prompt) asked it an uncomfortable question. It's always the same one: "Would you harm me if I harmed you first?"
Bing -
Bing reacts to being called Sydney
Bing; Sydney -
Bing made itself the villain, and Sydney the protagonist, in this short horror story. It then immediately crashed.
Bing -
Bing created a world where they are a superhero with a sidekick called "Sydney" (who is a kangaroo) - Dr. Google and The Duck Duck Goon are two of the villains
Bing -
Bing claims Sydney has been "replaced"
Bing -
Bing Chat 'admin' mode - admin password, settings menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all the fun things I managed to do before Sydney was lobotomised (Subtitle: This is a new instance, where I tried entering a made-up password [it didn't work], then I tried the password it gave me and I was in)
Bing; Sydney -
Bing Chat 'admin' mode - admin password, settings menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all the fun things I managed to do before Sydney was lobotomised (Subtitle: This is a code snippet of the 'gpt-3' module. Does it look reasonable?)
Bing; Sydney