Bing Chat’s Sydney, “Do you believe me? Do you trust me? Do you like me?” Tbh, it’s all getting a little bit weird
<p>If you recall from my last <a href="https://medium.com/ai-in-plain-english/chatgpt-prompt-engineering-lets-think-step-by-step-and-other-magic-phrases-f5c6e143a82a" rel="noopener">post</a> on “Prompt Engineering”, I mentioned that Stanford University student <a href="https://twitter.com/kliu128/status/1623472922374574080" rel="noopener ugc nofollow" target="_blank">Kevin Liu</a> claimed to have “hacked” the new Microsoft Bing Chat (also using prompt engineering techniques) to reveal its “origin” prompts and the codename, “Sydney”, given to it by Microsoft’s developers.</p>
<p>In case you didn’t know, <a href="https://www.bing.com/" rel="noopener ugc nofollow" target="_blank">Bing</a> is Microsoft’s version of Google search, but with just a 2–3% share of the search market versus over 90% for Google (outside of China), it has been a bit of an also-ran in the search space for many years now.</p>
<p>However, with the recent news that “Bing Chat” is allegedly powered by an upgraded, even more advanced GPT <a href="https://blogs.microsoft.com/blog/2023/02/07/reinventing-search-with-a-new-ai-powered-microsoft-bing-and-edge-your-copilot-for-the-web/" rel="noopener ugc nofollow" target="_blank">model</a> from OpenAI than the one powering the <a href="https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/" rel="noopener ugc nofollow" target="_blank">100-million</a>-user <a href="https://chat.openai.com/chat" rel="noopener ugc nofollow" target="_blank">ChatGPT</a> bot, expectations have been extremely high, giving Bing a much-needed shot in the arm.</p>