Another day... another 10 hours to dump into type theory studies.

For the logic gods.

Huh, that is kind of cool. Removing noise from physical sensors. It is like learning to see at a base level.

arxiv.org/pdf/2106.11936.pdf

Bertrand had it great. He never had to worry about money. He pretty much just did what he wanted all day.

Scientific journals kind of suck. Yeah, yeah, quality control. Engineering journals especially are obsessed with prestige hype for a bunch of initialisms. Bite me, OOPSLA.

I also have hyperfocus. So for some things I have really good small scale problem solving.


I noticed something the other day. My intelligence is different.

For the moment I will say I am not just stupid. Scatterbrain thought is a huge handicap, but it lends itself to what I would call large scale problem solving.

In small scale problem solving, attention span is really important: a large working memory means being able to have more complex concept interactions. I, and probably most other people, get around this by "chunking" blocks of thought. But it is very noticeable when someone is possibly more intelligent than other people, because they do not require a bunch of chunking to handle new stuff. So their ability to handle new ideas is both faster and broader.

In large scale problem solving, attention span is not as important. There is time to write down complex objects and interactions. What is more helpful is what I will describe as creative ability. This comes down to two parts. The first is the ability to gather a lot of information and do synthesis on it. This lends itself to the ability to recognize morphisms: "x is an example of y, so some set of tricks in y can be used in x". The second part is a randomness of thought. A strictly structured thought process gets stuck in local spaces of a problem too readily. And a natural evasion of distractions can also lead the generated logical spaces to overfit, because it is assumed, without knowing the actual theory, that more precision is more scientific or intellectual.

Scatterbrain behavior is a trade-off, not a total loss.

Trying not to let on that I have ADHD to a colleague. I was starting to wonder if I count as sentient.

Well, birthday is coming up. Please slow down, time... 😅

I think it is kind of interesting that assembly and Forth are regular-expression-tier languages (aside from beefy modern macro assemblers). They can be done by scanners alone. Hitting the theoretical top speed of compilation is trivial for these. And it really is good enough to get work done. Nobody actually needs to include infinity in a program's search space. The grammar level can be really simple. Even something like Java bytecode has machine independence.
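A minimal sketch of the "scanner is enough" idea, in Python: a toy Forth-like language where compilation is one linear pass over whitespace-separated tokens, with no parse tree at all. The word set and instruction names here are made up for illustration.

```python
# Toy single-pass "compiler" for a Forth-like language.
# The entire front end is a scanner: split on whitespace, classify,
# and emit a stack-machine instruction per token. No grammar needed.

def compile_forth(source):
    """Translate each token directly to a stack-machine instruction."""
    ops = {"+": "ADD", "-": "SUB", "*": "MUL", "dup": "DUP", ".": "PRINT"}
    program = []
    for token in source.split():          # this loop is the whole scanner
        if token.lstrip("-").isdigit():   # integer literal
            program.append(("PUSH", int(token)))
        elif token in ops:                # known word
            program.append((ops[token],))
        else:
            raise ValueError(f"unknown word: {token}")
    return program

print(compile_forth("2 3 + dup *"))
# [('PUSH', 2), ('PUSH', 3), ('ADD',), ('DUP',), ('MUL',)]
```

One token in, one instruction out; compilation time is linear in the source with a tiny constant, which is the point.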

But the optimization and correctness steps of a compiler are really desirable. And that is where this dream of simplicity dies. Optimization and correctness use a lot of potentially exponential-time algorithms, or may not successfully terminate for every problem. They also allow for more intricate grammars, far beyond what a parse tree covers. Languages like C are theoretically much worse off here because their grammars lack this expressiveness. And that might become even more obvious a few decades from now as more people get into developing optimizations for higher-level language compilers.
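For a concrete feel of why optimization needs more than a scanner: even the cheapest classic pass, constant folding, works over a tree, not a token stream. A hedged sketch in Python, with a made-up tuple representation for expression nodes:

```python
# Constant folding over a tiny expression tree. Nodes are either leaves
# (ints or variable names) or tuples (op, left, right). This is a toy
# representation invented for illustration, not any real compiler's IR.

def fold(node):
    """Recursively replace operator nodes with two constant children
    by the computed constant; leave everything else alone."""
    if not isinstance(node, tuple):       # leaf: constant or variable name
        return node
    op, left, right = node
    left, right = fold(left), fold(right)
    if isinstance(left, int) and isinstance(right, int):
        return left + right if op == "+" else left * right
    return (op, left, right)

# ('+', ('*', 2, 3), 'x')  folds to  ('+', 6, 'x')
print(fold(("+", ("*", 2, 3), "x")))
```

Even this trivial pass already assumes structure (a tree with typed leaves) that a pure scanner never builds, which is one way to see where the single-pass simplicity ends.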

Then there are neural network based solutions, or solutions that come out of higher math proofs, which have an even higher level infinity to their search space. Everybody wants safe and fast code, but it is impractical to learn all of it. These kinds of optimizations and correctness additions are coming out of bodies of research. And this is also kind of a problem because languages could be made impossible to specify outside of using the compiler as the specification.

There are scanner generators. There are parser generators. But why do we avoid semantics generators? What would be a good language to specify this part of a compiler?
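One guess at what input to a "semantics generator" might look like: semantics written as a data table mapping node kinds to rules, with the evaluator derived mechanically from the table. Everything here (the table format, the node shape) is a made-up sketch, not an existing tool.

```python
# Semantics-as-data: a table of rules, one per node kind. Each rule gets
# the node and a callback for evaluating subterms.
SEMANTICS = {
    "num": lambda node, ev: node[1],
    "add": lambda node, ev: ev(node[1]) + ev(node[2]),
    "mul": lambda node, ev: ev(node[1]) * ev(node[2]),
}

def make_evaluator(rules):
    """'Generate' an evaluator from the semantics table."""
    def ev(node):
        return rules[node[0]](node, ev)
    return ev

evaluate = make_evaluator(SEMANTICS)
print(evaluate(("add", ("num", 2), ("mul", ("num", 3), ("num", 4)))))  # 14
```

This is basically a denotational spec flattened into a dictionary; attribute grammars and tools like K take the same "semantics as input data" stance much further.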

Reasoning About Recursive Tree Traversals
engineering.purdue.edu/~xqiu/p

I am not sure why algorithms and parallelization are taught in the depth that they are. Schools may drop that stuff in the next decade. It could at least be compressed into a single course, I think, kind of like computer architecture is in some places. Compiler/PL research looks like it is succeeding at making the compiler do that stuff.

Although to be fair, we are only half a decade into crazy-lucrative neural network research. If we tossed every competent programmer at deep learning research, pretty much the entire CS/CE/SE curriculum would have to be redone.

Utilizing some of these programs really breaks convention. Computational complexity goes to nothing if a program just converges onto a function that can compute solutions in constant time. It is not always possible for this to happen. But it is super easy to do for so many problem domains.
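The cleanest toy version of a loop "collapsing" into a constant-time function is an exact algebraic identity rather than a learned model, but the shape of the win is the same: the iterative program and the closed form compute the same map, and one of them is O(1).

```python
# O(n) iteration vs the O(1) closed form for summing 1..n.
# A converged neural approximation would play the closed form's role,
# just without the exactness guarantee.

def sum_loop(n):
    total = 0
    for i in range(1, n + 1):   # n additions
        total += i
    return total

def sum_closed_form(n):
    return n * (n + 1) // 2     # one multiply, one shift-ish divide

assert sum_loop(1000) == sum_closed_form(1000) == 500500
```

The catch, as noted above, is that a converged approximation only matches the loop on the inputs it generalizes to; the identity here happens to hold everywhere.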

Using GPUs for cryptomining will probably also boost this research long term. It is causing a shortage now. But manufacturers are receiving so much funding for their product from both gamers and miners that they can set up a ton more R&D for computer engineers to make GPUs cheaper and faster.

Commutativity and associativity are some weird properties.

Like, a few mathematicians I know prefer commutative algebra quite a bit over the non-commutative stuff. Symmetry is nice because it is simple. But a basic feature of vector spaces, and grammars, is that they are naturally non-commutative. It is like there is something about non-commutativity that does not scale, I guess.

And associativity is basically the entire parallel programming research field in a nutshell (at least the PL side; electronics is different).
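Associativity is exactly what licenses splitting a reduction into independent chunks: (a+b)+(c+d) equals ((a+b)+c)+d, so the chunk partials can be computed in any grouping (and in parallel). A sequential Python sketch of just the regrouping; real parallelism would dispatch the chunks to threads or processes.

```python
# Chunked reduction: reduce each slice independently, then combine the
# partial results. Correct only when `op` is associative.
from functools import reduce
from operator import add

def chunked_reduce(op, xs, chunks=4):
    """Reduce each chunk on its own, then fold the partials together."""
    size = max(1, len(xs) // chunks)
    partials = [reduce(op, xs[i:i + size]) for i in range(0, len(xs), size)]
    return reduce(op, partials)

data = list(range(1, 101))
assert chunked_reduce(add, data) == sum(data)  # regrouping changes nothing
```

Swap `add` for a non-associative op like subtraction and the regrouping gives a different answer, which is the whole point: the PL-side research problem is largely about finding or exposing the associative structure.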

Eh, I guess a lot of things are age old algebra with a different coat of paint.

The most successful area of AI is compiler theory.

Yep. I have a raging boner for research maths this summer.

I am pretty anti-intelligence for someone that works in artificial intelligence.

Intelligence is not false. Just so much of the idea of it is an anthropomorphism.

It is weird how AI has a lot to do with programming language theory. Everything is always blending together in the formal sciences. The actual boundaries between topics are nuanced.

It is kind of like the feeling that math is all one subject, even though to an undergrad something like analysis and algebra might feel miles apart.

Feels kind of weird to do official research work from my house. Not sure why. It is not like I ever needed a physical lab.

A lot of what I think makes a person seem smart, is their belief that they can overcome intellectual obstacles.

Qoto Mastodon

QOTO: Question Others to Teach Ourselves. A STEM-oriented instance.
