narrowcode (@narrowcode@graz.social)

The release of Llama 3 convinced me to try running a local LLM. I was pleasantly surprised by the performance and by how easy it was to set up, so I wrote a blog post about the process:

https://narrowcode.xyz/blog/2024-04-23_taming-llamas_leveraging-local-llms/
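For anyone curious what "running a local LLM" can look like in practice, here is a minimal sketch. It assumes the model is served through Ollama, one common local runtime; the post doesn't name the tool, so the endpoint and model name are placeholders, not necessarily what the blog post uses.

```python
# Minimal sketch: query a locally running Llama 3 model.
# Assumption: the model is served by Ollama at http://localhost:11434
# (start it with something like `ollama run llama3` beforehand).
import requests

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return the completion."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_llm("Explain in one sentence what a local LLM is."))
```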
