TY  - JOUR
AU  - Zguns, Pjotrs
AU  - Pudza, Inga
AU  - Kuzmin, Aleksejs
TI  - Benchmarking CHGNet Universal Machine Learning Interatomic Potential against DFT and EXAFS: The Case of Layered WS2 and MoS2
JO  - Journal of Chemical Theory and Computation
VL  - 21
IS  - 16
SN  - 1549-9618
CY  - Washington, DC
PB  - American Chemical Society
M1  - PUBDB-2025-03829
SP  - 8142
EP  - 8150
PY  - 2025
AB  - Universal machine learning interatomic potentials (uMLIPs) deliver near ab initio accuracy in energy and force calculations at a low computational cost, making them invaluable for materials modeling. Although uMLIPs are pretrained on vast ab initio data sets, rigorous validation remains essential for their ongoing adoption. In this study, we use the CHGNet uMLIP to model thermal disorder in isostructural layered 2Hc-WS2 and 2Hc-MoS2, benchmarking it against ab initio data and extended X-ray absorption fine structure (EXAFS) spectra, which capture thermal variations in bond lengths and angles. Fine-tuning CHGNet with compound-specific ab initio (density functional theory (DFT)) data mitigates the systematic softening (i.e., force underestimation) typical of uMLIPs and simultaneously improves the alignment between molecular dynamics-derived and experimental EXAFS spectra. While fine-tuning with a single DFT structure is viable, using ∼100 structures is recommended to accurately reproduce EXAFS spectra and achieve DFT-level accuracy. Benchmarking the CHGNet uMLIP against both DFT and experimental EXAFS data reinforces confidence in its performance and provides guidance for determining optimal fine-tuning data set sizes.
LB  - PUB:(DE-HGF)16
DO  - 10.1021/acs.jctc.5c00955
UR  - https://bib-pubdb1.desy.de/record/637337
ER  -