This repo contains a PyTorch implementation of the paper "Discrete Neural Flow Samplers with Locally Equivariant Transformer"
by Zijing Ou, Ruixiang Zhang, and Yingzhen Li.
We propose Discrete Neural Flow Samplers (DNFS), a trainable and efficient framework for discrete sampling. DNFS learns the rate matrix of a continuous-time Markov chain such that the resulting dynamics satisfy the Kolmogorov equation. As this objective involves the intractable partition function, we employ control variates to reduce the variance of its Monte Carlo estimation, leading to a coordinate-descent learning algorithm. To further improve computational efficiency, we propose the locally equivariant Transformer, a novel parameterisation of the rate matrix that significantly improves training efficiency while preserving the network's expressiveness.
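As a toy illustration of the object DNFS learns (this is not the paper's parameterisation, just a minimal sketch): a valid rate matrix of a continuous-time Markov chain has non-negative off-diagonal entries and rows that sum to zero, and the Kolmogorov forward equation dp/dt = p Q evolves the marginal distribution while conserving probability mass.

```python
import numpy as np

# Toy rate matrix Q for a 3-state continuous-time Markov chain.
# Off-diagonal entries are non-negative transition rates; each diagonal
# entry is set so the row sums to zero, as a valid rate matrix requires.
rng = np.random.default_rng(0)
Q = rng.random((3, 3))
np.fill_diagonal(Q, 0.0)
np.fill_diagonal(Q, -Q.sum(axis=1))

# Kolmogorov forward equation dp/dt = p @ Q, one explicit Euler step:
p = np.array([0.5, 0.3, 0.2])
dt = 0.01
p_next = p + dt * (p @ Q)

# Probability mass is conserved because the rows of Q sum to zero.
print(np.isclose(p_next.sum(), 1.0))
```

DNFS trains a neural network so that the learned rate matrix transports a simple initial distribution to the target; the sketch above only shows the constraint structure such a matrix must satisfy.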
We provide minimal code to reproduce DNFS on sampling from Ising models. To train the model, please run:
# 5x5 Ising model
python main.py --ising_dim 5 --ising_sigma 0.1 --ising_bias 0.2 --eval_every 1
# 10x10 Ising model
python main.py --ising_dim 10 --ising_sigma 0.1 --ising_bias 0.0 --eval_every 5
# 10x10 Ising model with a larger coupling strength
python main.py --ising_dim 10 --ising_sigma 0.22305 --ising_bias 0.0 --eval_every 5