metatrain is a command line interface (CLI) to train and evaluate atomistic
models of various architectures. It features a common YAML input format to configure
training and evaluation. Trained models are exported as standalone files that can be
used directly in various molecular dynamics (MD) engines (e.g. ASE, LAMMPS, i-PI,
TorchSim, ESPResSo,...) using the metatomic
interface.
The idea behind metatrain is to have a general training hub that provides a
homogeneous environment and user interface, transforming every ML architecture into an
end-to-end model that can be connected to MD engines. Any custom architecture compatible
with TorchScript can be integrated into
metatrain, gaining automatic access to a training and evaluation interface, as well as
compatibility with various MD engines.
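As a rough illustration of what "TorchScript compatible" means in practice (a toy sketch only — `ToyPotential` and its "energy" are invented for this example and are not a metatrain architecture; the real model interface is defined by metatomic), any `torch.nn.Module` that survives `torch.jit.script` is a candidate:

```python
import torch


class ToyPotential(torch.nn.Module):
    """A minimal TorchScript-compatible module: typed forward, no Python-only tricks."""

    def forward(self, positions: torch.Tensor) -> torch.Tensor:
        # Toy "energy": sum of squared coordinates (illustrative, not physical)
        return positions.pow(2).sum()


# torch.jit.script raises an error if the module uses constructs
# TorchScript cannot compile, so this line is the compatibility check.
scripted = torch.jit.script(ToyPotential())
energy = scripted(torch.ones(2, 3))
print(float(energy))  # 6.0
```

Modules that pass this check can be serialized to a standalone file and executed outside Python, which is what lets exported models run inside MD engines.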
Currently metatrain supports the following architectures for building an atomistic
model:
| Name | Description |
|---|---|
| PET | Point Edge Transformer (PET), an interatomic machine learning potential |
| SOAP-BPNN | A Behler-Parrinello neural network with SOAP features |
| MACE | A higher-order equivariant message-passing neural network |
| PhACE | An SO(3)-equivariant message-passing model with physical radial functions and fast tensor products |
| GAP | A sparse Gaussian Approximation Potential (GAP) using Smooth Overlap of Atomic Positions (SOAP) |
| FlashMD | An architecture for the direct prediction of molecular dynamics |
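At the options-file level, switching between these architectures is just a change of the `name` key in the `architecture` section (a minimal, hypothetical fragment — the exact name strings and the hyperparameters each architecture accepts are listed in its documentation):

```yaml
architecture:
  name: pet  # e.g. soap_bpnn, mace, gap, ... (see each architecture's docs)
```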
For details, tutorials, and examples, please visit our documentation.
Install metatrain with pip:

```bash
pip install metatrain
```

Install specific models by specifying the model name. For example, to install the SOAP-BPNN model:

```bash
pip install metatrain[soap-bpnn]
```

We also offer a conda installation:

```bash
conda install -c conda-forge metatrain
```
⚠️ The conda installation does not install model-specific dependencies; it will only work for architectures, such as PET, that have no optional dependencies.
After installation, you can use `mtt` from the command line to train your models!
To train a model, use the following command:
```bash
mtt train options.yaml
```

where `options.yaml` is a configuration file specifying training options. For example, the following configuration trains a SOAP-BPNN model on the QM9 dataset:

```yaml
# architecture used to train the model
architecture:
  name: soap_bpnn
  training:
    num_epochs: 5 # a very short training run

# Mandatory section defining the parameters for system and target data of the training set
training_set:
  systems: "qm9_reduced_100.xyz" # file where the positions are stored
  targets:
    energy:
      key: "U0" # name of the target value
      unit: "eV" # unit of the target value
test_set: 0.1 # 10% of the training_set is randomly split off for testing
validation_set: 0.1 # 10% of the training_set is randomly split off for validation
```

metatrain comes with completion definitions for its commands for bash and zsh. You
must manually configure your shell to enable completion support.
To make the completions available, source the definitions in your shell’s startup file
(e.g., ~/.bash_profile, ~/.zshrc, or ~/.profile):
```bash
source $(mtt --shell-completion)
```

Having a problem with metatrain? Please let us know by submitting an issue.
Submit new features or bug fixes through a pull request.
Thanks go to all the people who make metatrain possible:
The overall metatrain project is maintained by @frostedoyster, @pfebrer, and @PicoCentauri, who will reply to issues and pull requests opened on this repository as soon as possible. You can mention them directly if you do not receive an answer after a couple of days. Additionally, individual architectures are maintained by separate maintainers; you can find their names in the corresponding documentation.
If you found metatrain useful for your work, please cite the corresponding article:
F. Bigi, J. W. Abbott, P. Loche et al.,
Metatensor and metatomic: foundational libraries for interoperable atomistic machine learning, (2026).
https://doi.org/10.1063/5.0304911
```bibtex
@article{bigi_metatensor_2026,
  title = {Metatensor and Metatomic: {{Foundational}} Libraries for Interoperable Atomistic Machine Learning},
  shorttitle = {Metatensor and Metatomic},
  author = {Bigi, Filippo and Abbott, Joseph W. and Loche, Philip and Mazitov, Arslan and Tisi, Davide and Langer, Marcel F. and Goscinski, Alexander and Pegolo, Paolo and Chong, Sanggyu and Goswami, Rohit and Febrer, Pol and Chorna, Sofiia and Kellner, Matthias and Ceriotti, Michele and Fraux, Guillaume},
  year = 2026,
  month = feb,
  journal = {J. Chem. Phys.},
  volume = {164},
  number = {6},
  pages = {064113},
  issn = {0021-9606},
  doi = {10.1063/5.0304911},
}
```