@echoteecat I believe four has already been trained.
A lot of the issue is in storage complexity, not in processing.
@echoteecat Well, if you give the model an exponentially larger set of coefficients to work with, then you do get exponential growth without necessarily needing all that much more processing power.
It does take a lot more storage, though, which is what I hear the next versions of these AI models are aiming for.
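To put a rough number on the storage point: the space needed just to hold a model's coefficients grows linearly with parameter count, so scaling the coefficient set up quickly dominates storage even if per-step compute grows more slowly. A minimal sketch (the function name, byte sizes, and parameter counts are illustrative assumptions, not figures from this thread):

```python
# Back-of-the-envelope: storage for raw model weights.
# Assumes 2 bytes per coefficient (fp16); figures are hypothetical.

def weights_storage_gb(num_params: int, bytes_per_param: int = 2) -> float:
    """Storage in GB to hold the weights alone, e.g. fp16 = 2 bytes/param."""
    return num_params * bytes_per_param / 1e9

for params in (10**9, 10**10, 10**11, 10**12):  # 1B .. 1T parameters
    print(f"{params:>15,d} params -> {weights_storage_gb(params):9.1f} GB (fp16)")
```

So a 10x jump in coefficients means roughly 10x the storage for the weights alone, before counting optimizer state or activations during training.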
To be clear, I'm not really arguing. Yeah, it's probably going to involve more processing as well, but to me the storage limitations of the model are pretty interesting.
@volkris
It took months on hundreds of GPUs. You're not going to get exponential growth when an AI can train itself.