Comparing transferability in neural network approaches and linear models for machine-learning interaction potentials
AKA Kandy and K Rossi and A Raulin-Foissac and G Laurens and J Lam, PHYSICAL REVIEW B, 107, 174106 (2023).
DOI: 10.1103/PhysRevB.107.174106
Atomistic simulations using machine-learning interatomic potentials (MLIP) have gained a lot of popularity owing to their accuracy in comparison to conventional empirical potentials. However, the transferability of MLIP to systems outside the training set poses a significant challenge. Here, we compare the transferability of three MLIP approaches: (i) neural network potentials (NNP), (ii) physical LassoLars interaction potentials (PLIP), and (iii) linear potentials with Behler-Parrinello descriptors, trained on a small but diverse set of zinc oxide polymorph configurations. We compared the obtained models with density functional theory reference results for physical properties including bulk lattice parameters, surface energies, and vibrational density of states, and showed the superiority of both the NNP and PLIP models. However, the NNP model performed poorly compared to the two linear models for the structural optimization of nanoparticles and molecular dynamics simulations of liquid phases, which are systems outside the training set. While providing less accurate predictions for solid zinc oxide phases, both linear models appear more transferable than NNP when tested on nanoscale systems and liquid phases. Our results are finally rationalized by a combination of different statistical analyses including the spread in force evaluation, information imbalance, convex hull calculation, and density in descriptor space.
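To illustrate what fitting a linear MLIP by LassoLars regression involves (as in the PLIP and Behler-Parrinello linear models discussed above), here is a minimal sketch that is not the authors' code: it assumes the per-configuration descriptor matrix and reference DFT energies are already computed, and uses synthetic data with hypothetical dimensions purely for demonstration.

```python
# Minimal sketch: sparse linear fit of configuration energies from descriptors.
# X (descriptors) and y (reference energies) are synthetic stand-ins here;
# in practice they would come from a descriptor code and DFT calculations.
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(0)

n_configs, n_features = 200, 50                       # hypothetical dataset size
X = rng.normal(size=(n_configs, n_features))          # per-configuration descriptors
true_w = np.zeros(n_features)
true_w[:5] = rng.normal(size=5)                       # only a few descriptor terms matter
y = X @ true_w + 0.01 * rng.normal(size=n_configs)    # "DFT" energies plus noise

# LassoLars selects a sparse subset of descriptor terms, which is what keeps
# the resulting linear potential compact and cheap to evaluate.
model = LassoLars(alpha=1e-3)
model.fit(X, y)

selected = np.flatnonzero(model.coef_)
print(f"{selected.size} descriptors retained out of {n_features}")
print("train RMSE:", np.sqrt(np.mean((model.predict(X) - y) ** 2)))
```

The sparsity-inducing regression is one design choice behind the linear models' transferability: with far fewer fitted parameters than a neural network, the fit extrapolates more gracefully to configurations outside the training set, at the cost of some accuracy on the solid phases.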