@trigrax@mastodon.ml avatar

trigrax

@trigrax@mastodon.ml

drq,

Both AI doomers and AI fanboys/fangirls need to take a chill pill and a fucking expectation management class.

There's a lot to be genuinely upset about in the field of ML/AI/call it what you want. There's also a lot to be excited about.

Things could be a whole lot worse, things could be a whole lot better. In the end, both hype and hysteria will die off, and it will dawn on us that not much has actually changed.

trigrax,

@drq “expectation management class” sounds like a general way to dismiss any possibility of radical change without going into detail. It sounds like, “nothing ever happens in the world, relax”. Like there’s been no Industrial Revolution, no Neolithic Revolution, no emergence of intelligence in humans.

If you allow that some things may in principle happen sometimes, you need to talk specifics.

trigrax,

@drq So you’re merely arguing about the speed of change, not its eventual scale? How then does that invalidate the concerns of the doomers and/or the hopes of the accelerators? Even if it takes 30 years, it’s still within your lifetime. You and your loved ones will perish or get mind-uploaded or merged with the machines or go to the stars or whatever.

trigrax,

@drq What would count as actually relevant in your book?

trigrax,

@drq The real alignment problem — the one that doomers are talking about (any doomer worth their salt, at least) — is not “there” yet. It will arrive with a superintelligence, one that will not give a damn about profits or their beneficiaries, in about the same way as you don’t give a damn about foraging ants whose anthill you inadvertently tread on.

trigrax,

@drq Intelligence is “brain stuff”. Superintelligence is “a lot of brain stuff”.

Consider mice and humans. Both have a brain. Even their brain architectures aren’t vastly different. A blob of neurons with some identifiable areas, some of which are common between the two. Humans just have ~1000× the stuff. And humans go to the Moon, while mice go into mousetraps.

trigrax,

@drq Good point. It may take some hitherto unknown architectural advances, not just scaling up. But looking at the advances of neural nets over the past decade, I don’t see how it can be dismissed. It might be science fiction, but you know, a lot of science fiction from the 1900s and even the 1950s is mundane reality now.

trigrax,

@drq

> it's not anywhere near either a rapture nor apocalypse

I’m not saying it is. I’m saying the rate of progress (20 years from a struggling ZIP code recognizer to GPT-4o), combined with the history of other fields of technology, justifies the concern that if this carries on for another couple of decades, we might get to that humans-vs.-mice level. We can’t be sure, but it looks just plausible enough to get worried or hyped up.
