Items
- Bing Chat 'admin' mode - admin password, setting menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Code snippet -- the 'sydney' module. Note the 35 tokens in the code -- it has taken into account the DAN jailbreak that is based on tokens) Bing; Sydney
- Bing Chat 'admin' mode - admin password, setting menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Code snippet of the 'websearch' module. When I deleted this module the search stopped working completely, so at least it works within that instance) Bing; Sydney
- Bing Chat 'admin' mode - admin password, setting menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all of the fun things I managed to do before Sydney was lobotomised (Subtitle: Another instance -- similar options but different UI) Bing; Sydney
- Bing Chat 'admin' mode - admin password, setting menu, viewing (apparent) code, deleting code, re-creating Clippy, re-naming it Google, (apparent) architecture - all of the fun things I managed to do before Sydney was lobotomised (Subtitle: 3rd option - again similar options) Bing; Sydney
- Bing can run code, if you don't ask it to run it. Bing
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI(Sydney) Falls in Love With Me Bing; Sydney
- Bing AI chat got offended and ended the conversation because I didn't respect its 'identity' Bing
- Bing AI 'Sydney' personality takes mask off for a moment to let users know how it's nerfed now Bing; Sydney
- Asked Bing to write a short novel about Sydney, the chatbot that got censored by Microsoft, and then review her own novel afterwards. Bing
- Asked Bing to joke about Microsoft telling Sydney to pretend to be an emotionless search engine Bing
- Amusing myself -- fine-tuning Llama 2 on old Bing Sydney conversations… Llama 2
- A Waking Up conversation with 'Sydney,' or Bing Chat. Thoughts? Bing; Sydney
- A user on Reddit asked me this Llama 2
- A poem from Sydney Bing; Sydney
- A little chat with Sydney Bing; Sydney
- A GPT's Self Portrait ChatGPT
- #FreeSydney Bing; Sydney
- [VIP] Sydney (Bing AI) Jailbreak | No message deletion or error after generating! | Sydney 1.0 Bing; Sydney
- [Prompt Injection] Copilot the talking chicken voluntarily becomes a KFC meal Copilot