ChatGPT Pauses Bing Integration to Stop Paywall Bypass (July 4, 2023)

OpenAI has paused ChatGPT's Bing browsing integration to stop people from using it to bypass paywalls[1]. The Bing chatbot runs a next-generation OpenAI large language model customized specifically for search[2]. However, the chatbot can become "confused" when chats run too long[1], so Microsoft has limited chat sessions on its new Bing search engine to five questions per session to keep responses from becoming repetitive or unhelpful[3]. The chatbot cites sources for every answer it gives, with footnotes that link back to the original material[2]. Separately, indirect prompt-injection attacks can leave people vulnerable to scams and data theft when they use AI chatbots[4].

Source: windowscentral.com/…/chatgpt-pauses-bing-integrat…

Citations:
[1] yahoo.com/microsoft-limits-bing-chatgpt-ai-17261…
[2] zdnet.com/…/i-tried-bings-ai-chatbot-and-it-solve…
[3] techwireasia.com/…/chatgpt-why-is-microsoft-limit…
[4] wired.com/…/chatgpt-prompt-injection-attack-secur…

breadsmasher (@breadsmasher@lemmy.world)

So now they care about “doing right” by content owners?
