@paullammers@eldritch.cafe
Decompilation, and compiler work in general, is getting a large boost from ML and DL methods. It now has its own sessions at PL conferences. In the broader picture, computer scientists have rediscovered how flexible real-number computation is and are filling an intellectual gap.
DL especially is filling that gap. A lot of math research uses what are called modules, a generalization of vector spaces; tensors are an example. There is a lot of potential here. An active area of mathematics called representation theory uses modules and vector spaces to represent many other structures. So for problems too complex to write by hand, such as optimization scheduling, an approximate solution could be obtained with tensor code.
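Sticking with the scheduling example, here is a minimal sketch of that idea in plain Python. The `priority` function, the three task features, and the synthetic targets are all invented for illustration: instead of hand-writing the heuristic, we fit its weights from examples by gradient descent, the core move of DL-style programming.

```python
import random

random.seed(0)

# Hypothetical setup: each task has three features (say, runtime estimate,
# deadline slack, queue length). A hand-written priority heuristic is hard
# to get right, so we fit a tiny linear model to examples of good priorities.
def priority(weights, features):
    return sum(w * f for w, f in zip(weights, features))

# Synthetic "ground truth" standing in for an expensive exact scheduler.
true_w = [2.0, -1.0, 0.5]
tasks = [[random.random() for _ in range(3)] for _ in range(200)]
targets = [priority(true_w, t) for t in tasks]

# Stochastic gradient descent on squared error.
w = [0.0, 0.0, 0.0]
lr = 0.1
for _ in range(500):
    for feats, y in zip(tasks, targets):
        err = priority(w, feats) - y
        w = [wi - lr * err * fi for wi, fi in zip(w, feats)]

print([round(wi, 2) for wi in w])  # recovers something close to true_w
```

The point is not the linear model itself but the workflow: the representation is learned from data rather than designed, and the answer is approximate rather than proven correct.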
So I think of DL as another style of programming, one that gives up correctness in exchange for automatic representations, probabilistic solutions, and a way around traditional computational-complexity barriers.
The intellectual gap will eventually close, and the output of DL papers will level off. People will come to understand the general principles, and not every new program will be considered paper-worthy.