I see a lot of takes to the effect that generative AI is only predictive and doesn’t actually “know what it’s doing.” That’s true. But nobody has shown that human beings “know what they are doing” when they think, either. And not just in the “lol humans are incompetent” sense: nobody knows what they mean when they say generative AI has “no understanding,” because we don’t know what it means to have an understanding in the first place.