Hi all,
I'm a PhD student in #MachineLearning at the Technical University of Munich #TUM. I'm currently working on machine learning on graphs and machine learning-driven computational chemistry.
#ml #GraphNeuralNetworks #GNNs #compchem
@nicholasgao Hey Nicholas, what are you doing with machine learning on chemical graphs?
Is it some kind of QSAR?
@rastinza
Until now, my work has mostly focused on machine learning potentials and ab initio quantum chemistry, e.g.,
https://openreview.net/forum?id=apv504XsysP
@nicholasgao The idea looks very cool! Does it work?
@rastinza In our work, we found that by training a neural wave function on multiple geometries we do not sacrifice any accuracy, yet only have to train a single model, which reduces training times significantly.
While there is still work to be done to scale neural wave functions, we believe this is an important step in reducing their computational cost.
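To give a feel for the amortization idea (a toy sketch only, nothing like an actual neural wave function): instead of fitting one model per geometry, a single shared model is fit over samples from many geometries at once, and can then be evaluated at unseen geometries. Here a polynomial fit to a Morse-like potential stands in for the learned model; all names and the potential are invented for illustration.

```python
import numpy as np

# Toy illustration (NOT the paper's method): fit ONE shared energy model
# across many geometries instead of training a separate model per geometry.
rng = np.random.default_rng(0)

def true_energy(r):
    # Morse-like potential as a stand-in "ground truth" energy surface
    return (1.0 - np.exp(-(r - 1.0))) ** 2

# Training set: energies sampled at many geometries (bond lengths)
r_train = rng.uniform(0.7, 2.0, size=50)
e_train = true_energy(r_train)

# A single shared model: one polynomial fit over ALL geometries at once
shared_model = np.poly1d(np.polyfit(r_train, e_train, deg=6))

# The one model predicts energies at geometries it was never fit on
r_test = np.linspace(0.8, 1.9, 5)
errors = np.abs(shared_model(r_test) - true_energy(r_test))
print(errors.max())
```

The point of the sketch is only the structure: one fit, many geometries, continuous evaluation in between the training samples.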
@nicholasgao Sorry, I did not read the article at all.
What do you mean by not sacrificing performance? You get results comparable to ab initio calculations? What order of approximation are we talking about?
@nicholasgao Wow, that's really impressive! Such small errors are not what I was expecting.
Can this be applied to larger systems?
@rastinza In a recent preprint (https://arxiv.org/abs/2205.14962), we have shown that one can also obtain continuous energy surfaces (denoted PlaNet) for multi-dimensional data and "larger" molecules such as ethanol. Unfortunately, these methods become quite expensive for systems with more than 40 electrons, so their scaling with system size is still very much active research.