I know there are other plausible reasons, but I thought I’d use this juicy title.

What does everyone think? As someone who works outside of tech I’m curious to hear the collective thoughts of the tech minds on Lemmy.

  • Sekrayray@lemmy.worldOP
    8 months ago

    Probably all done in the name of alignment. We only really have one shot to make an AGI that doesn’t kill everyone (or do other weird unaligned stuff).

    • FaceDeer@kbin.social
      8 months ago

      I think we need to start distinguishing better between AGI and ASI. We may have only one shot at ASI (though that’s hard to predict since it’s inherently something unknowable at the current time) but AGI will be “just this guy, you know?” I don’t see why a murderous rogue AGI would be harder to put down than a murderous rogue human.

      • Sekrayray@lemmy.worldOP
        8 months ago

        Absolutely true. Thanks for the distinction.

I think the argument could be made that AGIs could expedite the creation of a singularity, but you are correct in saying that the alignment problem matters less with rudimentary AGI.