gombang,
@gombang@social.nancengka.com

An example of LLM hallucination: tried asking Google Gemini about a programming problem and it produced inaccurate Python code (it spat out nonexistent methods of a Python module).
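For illustration only (the original prompt and module aren't shown here), this is the kind of mistake meant by a nonexistent method, using the standard json module as a hypothetical stand-in; models sometimes borrow JavaScript's JSON.parse, which Python's json module does not have:

    import json

    data = '{"answer": 42}'

    # Hallucinated call: Python's json module has no parse();
    # this line would raise AttributeError.
    # result = json.parse(data)

    # The method that actually exists in the standard library:
    result = json.loads(data)
    print(result["answer"])  # 42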

gombang,
@gombang@social.nancengka.com

Tried the same question on ChatGPT. It seems more accurate (I haven't tested the code yet, but at least it refers to modules and methods that actually exist).
