
I think it is kind of interesting that assembly and Forth are regular-expression-tier languages (beefy modern macro assemblers aside). They can be handled by a scanner alone, so compilation runs at its theoretical top speed: one linear pass over the source. And that really is good enough to get work done. Nobody actually needs to include infinity in the program's search space; the grammar can stay dead simple. Even something like Java bytecode gets machine independence at that level.
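To make the "scanner alone" point concrete, here is a minimal sketch of a toy assembler. The instruction set, mnemonics, and byte encodings are all made up for illustration; the point is just that one regular expression per line is the entire front end, with no parse tree anywhere.

```python
import re

# Hypothetical toy ISA: mnemonics and encodings are invented for this sketch.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

# One flat regular expression recognizes a whole line:
# optional whitespace, a mnemonic, an optional numeric operand, an optional comment.
LINE = re.compile(r"^\s*([A-Z]+)(?:\s+(\d+))?\s*(?:;.*)?$")

def assemble(source: str) -> bytes:
    out = bytearray()
    for lineno, line in enumerate(source.splitlines(), 1):
        if not line.strip() or line.lstrip().startswith(";"):
            continue  # skip blank lines and pure comments
        m = LINE.match(line)
        if not m:
            raise SyntaxError(f"line {lineno}: cannot scan {line!r}")
        mnemonic, operand = m.groups()
        out.append(OPCODES[mnemonic])          # emit the opcode byte
        if operand is not None:
            out.append(int(operand) & 0xFF)    # emit the operand byte
    return bytes(out)

print(assemble("LOAD 7\nADD 3\nSTORE 0\nHALT").hex())  # 010702030300ff
```

Each line is translated the moment it is recognized, which is why compilation speed for this class of language is bounded only by how fast you can read the input.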

But the optimization and correctness steps of a compiler are really desirable, and that is where this dream of simplicity dies. Optimization and correctness passes lean on a lot of potentially exponential-time algorithms, or ones that may not terminate for every input. They also invite far more intricate grammars, well beyond what a parse tree covers. Languages like C end up theoretically slower because they can't express that richer structure for the compiler to exploit. That might become even more obvious a few decades from now, as more people get into developing optimizations for higher-level language compilers.
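Even the most trivial optimization already needs structure a scanner never builds. Here is a sketch of constant folding, using Python's own `ast` module as a stand-in for a compiler IR (the example expression is made up); the pass has to walk a tree and rewrite subexpressions, not just recognize lines.

```python
import ast

class FoldConstants(ast.NodeTransformer):
    """Rewrite constant arithmetic subexpressions into their computed values."""

    def visit_BinOp(self, node: ast.BinOp) -> ast.AST:
        self.generic_visit(node)  # fold the children first (bottom-up)
        if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
            if isinstance(node.op, ast.Add):
                return ast.copy_location(ast.Constant(node.left.value + node.right.value), node)
            if isinstance(node.op, ast.Mult):
                return ast.copy_location(ast.Constant(node.left.value * node.right.value), node)
        return node

tree = ast.parse("x = 2 * 3 + y")
folded = ast.fix_missing_locations(FoldConstants().visit(tree))
print(ast.unparse(folded))  # x = 6 + y
```

And this is still the cheap end of the spectrum; once you get into alias analysis, register allocation, or proving termination, the algorithms stop being linear and in some cases stop being guaranteed to finish at all.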

Then there are neural-network-based solutions, or solutions that fall out of higher math proofs, which add an even higher order of infinity to the search space. Everybody wants safe and fast code, but it is impractical for any one person to learn all of this machinery; these kinds of optimizations and correctness additions come out of whole bodies of research. And that is also kind of a problem, because a language can end up impossible to specify except by pointing at the compiler itself as the specification.
