Now this is interesting! I've always been very dubious about Chomsky's innateness theories; does this show anything interesting about whether particular innate structures are, or more likely are not, actually needed to learn human language?

Even if they aren't needed, one could still claim that humans do have them, and that this shows in the way we learn language, which is arguably different from the way an LLM does; but this is still an interesting start.

tehrantimes.com/news/483187/Ex

@ceoln I think Chomsky didn't claim that language can't be learned from lots of examples; he said children aren't exposed to enough of it to have the data to learn.

ChatGPT can't do it with just the amount of talking a 5-year-old has heard; it needs to read nearly the entire internet.

@pre
Hm, that's an interesting angle. I don't remember him ever saying that language could be learned without intrinsic structure, if only you had enough data, but maybe he did somewhere.

@ceoln His point was definitely that kids don't get enough exposure to learn it without innate clues.

To my knowledge, he didn't also say that it could be learned if you did get enough data. He probably never imagined a machine that could read the whole internet.

He just thought kids didn't get enough data to do it without innate clues.


@pre
Right; I wonder if he's ever thought about how much WOULD be enough, or if as you say he hadn't encountered that idea until very recently. 😁

Qoto Mastodon