Now this is interesting! I've always been very dubious about Chomsky's innateness theories; does #ChatGPT show anything interesting about whether particular innate structures are, or more likely are not, actually needed to learn human language?
Even if they aren't needed, there could still be a claim that humans do have them and that it shows in the way we learn language, which one could argue is different from the way an LLM does; but this is still an interesting start.
@pre
Right; I wonder if he's ever thought about how much WOULD be enough, or if as you say he hadn't encountered that idea until very recently. 😁
@ceoln His point was definitely that the kids don't get enough exposure to learn it without innate clues.
To my knowledge, he didn't also say that it could be learned if you did get enough data. He probably never imagined a machine that could read the whole internet.
He just thought kids didn't get enough data to do it without innate clues.