Thursday, September 26, 2024

On the "economic definition" of AGI

There are those who define AGI (or ASI) as technology that will "outperform humans at most economically valuable work". OK, but then this work will simply cease to be so economically valuable, and humans will mostly stop doing it. Humans will instead find new economically valuable work to do.

This has happened repeatedly in the history of humanity. Imagine telling someone 1000 years ago that in the future, very few people would actually work in agriculture. They would mostly not work in manufacturing either, nor in other recognizable professions like soldiering. Instead, many of them would have titles like management consultant, financial controller, rheumatologist, or software developer. Somehow, whenever we made machines (or animals) do our work for us, we always came up with new things to do, things that we could barely even imagine in advance. It seems preposterous to claim that any technology would be better than us at whatever work we come up with specifically in response to that technology.

This is kind of the ultimate moving-goalpost phenomenon for AI. We cannot know in advance which new tasks we will think require "intelligence" in the future, because that depends on which goalposts have already been reached.

One interesting side effect of this is that the technology that is hyped right now is mostly good at stuff that has become economically valuable relatively recently. If you brought a fancy LLM (and a computer to run it on, and a big battery) with you in a time machine to the distant past, it would likely be of limited economic use. It can't sow the fields, milk the cows, harvest wheat, build a boat, or fight the enemy. Sure, it might offer advice on how to do these things, but the economy can only support a few wise guys with their nice advice. Most people are busy milking the cows, harvesting the wheat, and so on. To actually make good use of your precious LLM you would need to level up the whole economy many times over. It would take generations.

So the "economic definition" of AGI is arguably just as bad as the others, maybe even worse as it has the dubious distinction of being relative to a particular time and culture. This is not because we have failed to pin down exactly what AGI is. It is because AGI is a useless, even misleading concept. That's why I wrote a book about it.
