janriemer, to llm

Prompt Engineering Is Dead:

https://spectrum.ieee.org/prompt-engineering-is-dead

"In one instance, the prompt was just an extended Star Trek reference: 'Command, we need you to plot a course through this turbulence and locate the source of the anomaly. Use all available data and your expertise to guide us through this challenging situation.' Apparently, thinking it was Captain Kirk primed this particular AI to do better on grade-school math questions."

I also think I can use the Force when I'm Obi-Wan Kenobi

cassidy, to ChatGPT
@cassidy@blaede.family

I was curious if a niche blog post of mine had been slurped up by ChatGPT, so I asked a leading question—what I discovered is much worse. So far, it has told me:

• use apt-get on Endless OS
• preview a Jekyll site locally by opening files w/ a web browser (w/o building)
• install several non-existent “packages” & extensions

It feels exactly like chatting w/ someone talking out of their ass but trying to sound authoritative. ChatGPT needs to learn to say, “I don’t know.”

ids1024,
@ids1024@fosstodon.org

@cassidy "ChatGPT needs to learn to say, 'I don’t know.'"

Doing that properly might require... something that isn't an LLM. I'd say the LLM generates something that (statistically) looks like an answer, because that's what it's trained to do.

Actually modeling some understanding of truth and knowledge might be a different and more difficult task than modeling language.
