Alberto Romero, who writes some of the most knowledgeable and thoughtful commentary on AI, has just posted "GPT-4: The Bitterer Lesson".

thealgorithmicbridge.substack.

Originally an idea credited to Rich Sutton, the "Bitter Lesson" is that "humans have contributed little to the best AI systems we have built".

And with each iteration of GPT-X that contribution is becoming less and less. From bitter, to bitterer.

Worth a read.


@boris_steipe

I'm not sure if you can have a qualitative leap with just scale. You can definitely "scale up" some existing capability by adding more computational power, but can you get (evolve to) something radically new?

@pj

Absolutely. All of biological evolution is like that. Once you start selecting for something, and you have enough parameters, you will achieve it. LLM training and biological evolution are analogous in that respect.

What's rather surprising is the richness of emergent behaviour that we get from merely predicting the next token.

Apparently, training for language creates thinking as a byproduct.

🙂
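To make "merely predicting the next token" concrete, here is a minimal sketch of that training objective — my illustration, not from the thread. Real LLMs learn the conditional distribution with a neural network; this toy stands in for it with bigram counts over a made-up corpus.

```python
# Toy next-token predictor: count which token follows which,
# then predict the most frequent continuation. The corpus and
# all names here are illustrative assumptions.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ran".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(token):
    # Return the continuation seen most often in training.
    return follows[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once
```

The surprising claim in the thread is precisely that scaling this bare objective up — billions of parameters instead of a count table — yields the emergent behaviour discussed below.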

@boris_steipe

I don't know. Language is a fairly new "improvement" in biological evolution, and I'm not sure you can reverse engineer (artificial) intelligence from it.
You could argue that intelligence evolved ***before*** language. After all, you have quite a few intelligent animals with no language or with a very simple vocabulary.

@boris_steipe
Reading this from *The Bitter Lesson* by *Rich Sutton*:

> researchers seek to leverage their human knowledge of the domain, but the only thing that matters in the long run is the leveraging of computation. These two need not run counter to each other, but in practice they tend to. ***Time spent on one is time not spent on the other***.

incompleteideas.net/IncIdeas/B

For me, the last sentence means that the ***real value*** of tools like GPT for humans is that they can free us from tedious, repetitive, unimaginative work in favor of more elaborate creative thinking.

People have always used previously developed, more primitive tools to develop better ones. This was true for all tools and machines we have invented so far, and it is also true for AI's ability to "upgrade itself".

@pj

:-) When I pointed out the analogy between LLM training and biological evolution, I wasn't referring to biology's ability to evolve language, but to its capacity to develop almost anything at all: given enough tuneable parameters and a system for inheritance with variation under selective pressure, we get molecular rotors, nanofabrication, diffraction gratings for colours, magnetic compasses, eyes, ears, labyrinths, social collaboration ... and even language.

The number of tuneable parameters in the human genome is almost two orders of magnitude smaller than the number of parameters in GPT-3 (and each one is less precise: only 2 bits, one of four nucleotides). And training a language model means following a trajectory in a high-dimensional parameter space, just as evolution is a trajectory in genetic-sequence space. (The technical difference is that the former is a directed walk along a gradient, while the latter is a random walk under selection.)
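The two kinds of walks can be sketched side by side on a toy fitness landscape — an illustration of mine, with an arbitrary loss function, not anything from the thread: gradient descent moves straight downhill, while mutation-plus-selection wanders randomly and keeps only the steps that improve fitness.

```python
# Directed walk (gradient descent) vs. random walk under selection,
# both minimizing the same toy bowl-shaped loss. All numbers are
# illustrative assumptions.
import random

def loss(params):
    # Lower is "fitter" on this landscape.
    return sum(p * p for p in params)

def gradient_step(params, lr=0.1):
    # Directed walk: step down the analytic gradient (2p per dimension).
    return [p - lr * 2 * p for p in params]

def selection_step(params, sigma=0.1):
    # Random walk under selection: mutate, keep the mutant only if fitter.
    mutant = [p + random.gauss(0, sigma) for p in params]
    return mutant if loss(mutant) < loss(params) else params

random.seed(0)
start = [1.0, -2.0, 0.5]
directed, evolved = list(start), list(start)
for _ in range(200):
    directed = gradient_step(directed)
    evolved = selection_step(evolved)

print(f"gradient descent loss: {loss(directed):.6f}")
print(f"selection loss:        {loss(evolved):.6f}")
```

Both trajectories end up near the optimum; the directed walk just gets there far more efficiently, which is one reason training a model takes hours while evolution takes generations.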

And that is exactly what happened: when trained on token prediction, LLMs started showing emergent aspects of "intelligence".

Quanta just ran an article on that five days ago: quantamagazine.org/the-unpredi

... and Google's Jason Wei, who has a number of articles on arXiv on emergence, lists emergent abilities on his blog: jasonwei.net/blog/emergence

Molecular biology is one of my areas of expertise, so I am not surprised this is happening: how could it not? But then again, to actually see such emergence is profound.

@boris_steipe

I glanced over the sources you listed but won't pretend I understand everything that's in there 😉
I guess what I'm trying to say is that, for example, a bird and an airplane both show the emergent property of *flying* while being two totally different "machines".
I believe where we differ in our views is that for you their flying is identical, or the flying of the airplane might be even superior to that of the bird, while for me they are quite different processes that cannot be compared so easily.
Also, evolution and learning are two completely different processes. Evolution depends on large pools of (imperfect) copies of the same "thing", while learning is more like the adaptation of a single individual that has the ability to "learn" (modify its internal model).

Qoto Mastodon
