Now this is interesting! I've always been very dubious about Chomsky's innateness theories; does this show anything interesting about whether particular innate structures are, or more likely are not, actually needed to learn human language?

Even if they aren't needed, there could still be a claim that humans do have them and that this shows in the way we learn language, which one could argue is different from the way an LLM learns; but this is still an interesting start.

tehrantimes.com/news/483187/Ex

@ceoln
Here’s a survey article on large language models and how they undermine Chomsky’s approach to language: lingbuzz.net/lingbuzz/007180

It’s a good survey article, but I take it with a grain of salt, and want to spend some time with the bibliography.

Here’s another good survey article on the 60-year feud between generative linguistics and neural-net-based theories of how the mind works:

sites.socsci.uci.edu/~lpearl/c


@lain_7
"Note that this specific example was not in the model’s training set—there is no
possibility that Trump understands prime numbers." 😂
