Code for our paper Towards Principled Graph Transformers.
Our implementation of `EdgeAttention` is built on top of the code provided in Bergen et al. 2021, Systematic Generalization with Edge Transformers, available at https://github.com/bergen/EdgeTransformer.
We recommend using the package manager conda. Once installed, run

```bash
conda create -n towards-principled-gts python=3.10
conda activate towards-principled-gts
```

Install all dependencies via

```bash
pip install -e .
```

We use hydra for configuring experiments. See the hydra documentation for a tutorial on the override syntax.
NOTE: By default, logging with `wandb` is disabled. To enable it, set `wandb_project` on the command line. Optionally, set `wandb_entity` and `wandb_name` to configure your entity and run name, respectively.
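For example, a run with `wandb` logging enabled might look like the following; the project, entity, and run names are placeholders, and the options are passed as regular hydra `key=value` overrides:

```bash
python expressivity/main.py root=/path/to/data/root \
    wandb_project=my-project wandb_entity=my-entity wandb_name=my-run
```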
For the BREC benchmark, run

```bash
python expressivity/main.py root=/path/to/data/root
```

where /path/to/data/root specifies the path to your data folder. This folder will be created if it does not exist.
To run the ZINC, Alchemy, or QM9 dataset, run

```bash
python molecular-regression/[zinc|alchemy|qm9].py root=/path/to/data/root
```

respectively, where /path/to/data/root specifies the path to your data folder. This folder will be created if it does not exist. Add `rrwp=True` to use the RRWP encodings.
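For instance, a ZINC run with RRWP encodings would combine both options (the data path is a placeholder):

```bash
python molecular-regression/zinc.py root=/path/to/data/root rrwp=True
```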
You may skip the first step below if you just want to reproduce the timing figure in the paper.
To run the timing experiments, run

```bash
python timing/timing.py results_path=/path/to/results
```

The results will be written to a file called results.csv under /path/to/results.
You can now reproduce the timing figure in the paper by running

```bash
python timing/results.py
```

which will create a file called results.pdf containing the figure.
For the CLRS experiments see our dedicated fork at https://github.com/ksmdnl/clrs.