Discussion about this post

Lachlan Cannon

I was thinking along similar lines just this weekend. It seems the terms AGI and ASI have become increasingly meaningless, whereas I remember them having pretty standardised and straightforward definitions a decade or so ago. I would have said AGI meant a system that can perform as well as or better than the average human at all tasks humans can do, while ASI meant a system that could perform all tasks as well as or better than humanity as a whole.

My take, though, was that we've been seeing a lot of fuzziness, semantic drift, etc. from motivated bad actors. There are plenty of people who, faced with a precise definition of AGI or ASI that roughly speaking means a very hard and very impressive thing, find it useful to claim we've already met it, massaging the definitions as needed, no matter what damage that does to more precise technical language.

This has been a really useful post for me as a recalibration towards the fuzziness in the term itself, and jaggedness as a catalyst for fragmenting definitions. I do still think I see a lot of motivated bad definitions, though, and I don't share your hope that ASI will remain a useful term, since I already see the same cycle of weird, bad, and misleading definitions starting up there.

Peter A. Jensen

BAN Superintelligence Until Safe.

