You’re not wrong, but that’s also a bit misleading. “AI” is the all-encompassing term, while AGI and ASI are subsets of it. From the 1950s onward, AI was expected to advance quickly as computing advanced, but that never happened. Instead, AI mostly topped out at decision trees, like those used for AI in videogames. ML pried the field back open, though not in the ways we expected.
AGI and ASI were coined in the early 2000s to distinguish the goal of human-level intelligence from other kinds of AI, like videogame AI. That’s a natural result of the field advancing in unexpected, divergent directions. The terms weren’t meant to move the goalposts, but to clarify future goals against past progress.
It’s entirely possible we’ll develop multiple approaches to AGI that require new terminology to tell them apart. That’s the nature of all evolution, including the evolution of technology and language.