Go has Google, for now.
Java was Oracle's when it was a big company. They were probably the first to come up with the big-tech money == popularity formula.
Javascript (2 of 5 big tech)
https://openjsf.org/
PHP, I am finding, only has small-tech supporters for its foundation. They also have an open budget and spending model for their foundation.
https://opencollective.com/phpfoundation
Good point. Julia is Rust-grade hyped.
"bro.. 13th time today. I am busy."
Seems reasonable to recommend one if they brought up traveling near that place in conversation. Not all franchises are the same.
I hate these kinds of arguments. It was the same reasoning Canonical used to sell people's data on their operating system.
Arguments like that usually come from dealing with a lack of intelligence, either their own or their coworkers'.
There is also a cost-benefit difference in the tooling. An individual can write their own C compiler. It takes teams of people months to update C++ compilers, so very few modern C++ compilers exist. The most a team can hope for is retargeting the back end.
Contracting out technology is a liability. It is why circuit boards tend not to be custom-fit to modules: the company that makes that module could go out of business, or change the spec.
Hmm.. Who are your corporate overlords, languages?
Rust (4 of big 5 tech)
https://foundation.rust-lang.org/members/
Julia (3 of the big 5 tech)
https://juliacomputing.com/
LLVM (3 of big 5 tech)
https://foundation.llvm.org/docs/sponsors/
Haskell (1 of big 5 tech)
https://haskell.foundation/
OCaml (financial companies)
https://ocaml-sf.org/
Zig (individuals)
https://github.com/sponsors/ziglang
D (1 financial firm + individuals)
https://dlang.org/foundation/sponsors.html
R (researchers and universities)
https://www.r-project.org/foundation/donors.html
My computer brick still lives. Reject modernity. Return to monke.
I do kind of wonder how Linux made it so far sometimes. And then I remember that IBM dumped millions into development time on it.
Loops be like, a whole research field.
Hmm... compilers, modules. That is honestly not too bad. Deep learning compiler expertise is in demand because of this kind of stuff.
Seriously, I think we just need to call them modules. Tensors do not have non-linear activation functions. In most cases "tensor" is just slang for an n-box of numbers.
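To illustrate the distinction, a toy sketch in Python (plain nested lists standing in for tensors): the tensor is just the n-box of numbers, and the non-linearity lives in the module that uses it, not in the data.

```python
# Toy sketch: a "tensor" here is just a plain list of numbers (an n-box).
# The non-linear activation belongs to the *module*, not to the tensor.

def relu(xs):
    # Non-linearity: a property of the module, not of the number box.
    return [max(0.0, v) for v in xs]

def linear(weights, bias, xs):
    # weights: list of rows, xs: flat list -> flat list of numbers.
    return [sum(w * v for w, v in zip(row, xs)) + b
            for row, b in zip(weights, bias)]

def module(weights, bias, xs):
    # A "module": parameters plus a non-linearity, applied to number boxes.
    return relu(linear(weights, bias, xs))

out = module([[1.0, -1.0], [0.5, 0.5]], [0.0, -2.0], [3.0, 1.0])
print(out)  # [2.0, 0.0] -- the negative pre-activation gets clipped by relu
```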
http://adam.chlipala.net/papers/AtlPOPL22/AtlPOPL22.pdf
Do I care? Eh.. Is the math and programming language stuff nice? Yeah.
Yeah. I should check it out more though. It looks like it grew a lot.
I guess this week is compiler week.
A compiler-compiler design. It lets you write compiler optimizations as many separate, easy-to-reason-about passes, and then the compiler compiler combines them.
https://dl.acm.org/doi/pdf/10.1145/3140587.3062346
I feel like this existed already. It is a little too obvious of an optimization.
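The flavor of the idea can be sketched in a few lines of Python. This is my own toy version, not the paper's system: each optimization is a tiny independent rewrite on an expression tree, and a generic driver fuses them into one bottom-up rewriter that runs to a fixpoint.

```python
# Toy sketch (not the paper's system): optimizations as tiny independent
# rewrite passes over an expression tree, combined by a generic driver.
# Expressions are numbers, or tuples like ("+", a, b) and ("*", a, b).

def const_fold(e):
    # Fold ("+", 2, 3) -> 5 when both operands are already numbers.
    if isinstance(e, tuple) and all(isinstance(a, (int, float)) for a in e[1:]):
        return {"+": e[1] + e[2], "*": e[1] * e[2]}[e[0]]
    return e

def mul_identity(e):
    # Rewrite ("*", x, 1) and ("*", 1, x) -> x.
    if isinstance(e, tuple) and e[0] == "*":
        if e[2] == 1:
            return e[1]
        if e[1] == 1:
            return e[2]
    return e

def combine(passes):
    # The "compiler compiler" part: fuse many small passes into one
    # bottom-up rewriter that reapplies them until nothing changes.
    def run(e):
        if isinstance(e, tuple):
            e = (e[0],) + tuple(run(a) for a in e[1:])
        changed = True
        while changed:
            changed = False
            for p in passes:
                new = p(e)
                if new != e:
                    e, changed = new, True
        return e
    return run

opt = combine([const_fold, mul_identity])
print(opt(("+", ("*", ("+", 1, 1), 1), 3)))  # 5
```

Each pass stays small enough to reason about on its own; the driver owns the traversal and fixpoint logic once, instead of every pass reimplementing it.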
Yeah, it looks pretty great. I prefer a language that is fast by default, without extra effort from the user. Python is mostly number 2 because it is the most popular language in general, it is easy to use, and I am enjoying PyTorch and all of the SciPy libraries. So maybe. :)
Did you check out this paper? Julia syntax is now directly differentiable.
https://arxiv.org/abs/1810.07951
And its code.
https://github.com/FluxML/Zygote.jl
Neat
I am one of those, and same.
Doesn't matter. Had sex.
I am pretty curious about how to use automated reasoning systems to help discover new things, use and verify old ideas, and generally make my life easier.
Current events I try to keep up on
- Math Logic community (The Journal of Symbolic Logic)
- Statistics community (JASML, AoS)
- Algebra community (JoA, JoAG, JoPaAA, SIGSAM)
- Formal Methods community (CAV/TACAS)
Getting past the learning curve, up to current events
- Abstract Algebra (Dummit, Foote)
- Commutative Algebra (Eisenbud)
- Algebraic Geometry (Hartshorne)
- Mathematical Logic (Mendelson)
- Model Theory (Marker)