Predictably, #microsoft started injecting ads into #openai #gpt4-powered #bingchat conversations…and just as predictably, there is now a huge #malvertising problem in Bing Chat.
It’s actually worse than #malware-poisoned advertisements showing up in search engine results, for a couple of reasons.
Asked #Copilot (formerly #BingChat) a familiar riddle, but with the numbers changed to make it impossible. It generated the same stock solution, just substituting in the new numbers, so it ends up with the nonsense claim:
I got my Surface Laptop Go onto the Windows Copilot preview. Not especially earth-shaking. Basically, it's the Bing Chat sidebar from Microsoft Edge, but on Windows itself.
I DO like having this interface available outside of the browser, and having a keyboard shortcut to summon and dismiss it (it takes over Cortana's former Windows Key + C), but otherwise I haven't seen it do any tricks I haven't already seen from ChatGPT and Bing.
Prompt Injection: Marvin von Hagen presents how he tricked Bing Chat
Marvin von Hagen found a remarkably clever prompt for Bing Chat: it disclosed the vendor's system instructions. In a talk, the student explains the trick.
I asked it to write a Python script that, given a graph where no vertex has more than 5 edges, returns the length of the longest path that visits each vertex at most once. Then I lifted the edge-count restriction.
In both cases it claimed polynomial time complexity for solving an NP-hard problem.
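For contrast, a correct general-purpose solution has no known polynomial-time algorithm; longest simple path is NP-hard, so an honest implementation is exponential in the worst case. A minimal brute-force sketch (my own illustration, not the script the chatbot produced) looks like this:

```python
from collections import defaultdict

def longest_simple_path(edges):
    """Return the length (in edges) of the longest simple path
    in an undirected graph given as a list of (u, v) edge pairs.

    Exhaustive DFS over all simple paths -- exponential time,
    which is expected: longest path is NP-hard in general.
    """
    graph = defaultdict(set)
    for u, v in edges:
        graph[u].add(v)
        graph[v].add(u)

    best = 0

    def dfs(node, visited, length):
        nonlocal best
        best = max(best, length)
        for nxt in graph[node]:
            if nxt not in visited:
                visited.add(nxt)
                dfs(nxt, visited, length + 1)
                visited.remove(nxt)

    for start in graph:
        dfs(start, {start}, 0)
    return best
```

On a 4-cycle (`[(0,1),(1,2),(2,3),(3,0)]`) this returns 3, the path through all four vertices. The degree cap of 5 from the original prompt bounds the branching factor but doesn't make the problem polynomial.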
Why do I not feel reassured by #BingChat's answer after asking it "What are the privacy and security risks of Microsoft's recent integration of Bing AI in Swiftkey?"