Now this is interesting! I've always been very dubious about Chomsky's innateness theories; does #ChatGPT show anything interesting about whether particular innate structures are, or more likely are not, actually needed to learn human language?
Even if they aren't needed, one could still claim that humans do have them and that it shows in the way we learn language, which is arguably different from the way an LLM does; but this is still an interesting start.
@ceoln I think Chomsky didn't claim that language can't be learned from lots of examples; he said children aren't exposed to enough of it to have the data to learn.
ChatGPT can't do it with just the amount of talking a 5 year old has heard; it needs to read nearly the entire internet.
@pre
Hm, that's an interesting angle. I don't remember him ever saying that language could be learned without intrinsic structure if only you had enough data, but maybe he did somewhere.