Pulse-PPG: An Open-Source Field-Trained PPG Foundation Model for Wearable Applications Across Lab and Field Settings (UbiComp 2025)

Mithun Saha¹†, Maxwell A. Xu²†, Wanting Mao², Sameer Neupane¹, James M. Rehg², Santosh Kumar¹

†Co-first authors     |     ¹University of Memphis     ²University of Illinois Urbana-Champaign

Here is a quick Colab demo showing how to run our code. Because Colab does not make it easy to install packages into a specific environment, the notebook acts as a "controller" that drives the codebase as if from a terminal.

🔎 Code Overview

Below is an outline of the overall structure of our codebase. The code is modularized with class-based configs that define specific components of an experiment, such as a config for model training or one for the network backbone. Extending this codebase to your own use cases should be fairly straightforward.
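As a purely illustrative sketch (the class and field names below are hypothetical, not the repo's actual config classes), composing an experiment from small class-based configs might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class TrainConfig:
    # Hypothetical training knobs
    lr: float = 1e-4
    epochs: int = 100
    batch_size: int = 64

@dataclass
class NetConfig:
    # Hypothetical backbone settings
    backbone: str = "resnet"
    embedding_dim: int = 256

@dataclass
class ExpConfig:
    # An experiment composes the smaller component configs
    name: str
    train: TrainConfig = field(default_factory=TrainConfig)
    net: NetConfig = field(default_factory=NetConfig)

cfg = ExpConfig(name="pulseppg_demo")
cfg.train.lr = 5e-5  # override one component without touching the rest
```

The appeal of this pattern is that swapping a backbone or a training schedule means editing one small config object rather than a monolithic script.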

run_exp.py           # Main file used to launch experiments  
pulseppg/            # Source code  
├── experiments/      
│   └── configs/     # Configs for defining experiments
├── models/          # Training pipeline
│   ├── RelCon/      # RelCon trainer for the Pulse-PPG FM
│   └── MotifDist/   # Motif-distance trainer (run before RelCon)
├── nets/            # Network backbones (e.g. ResNet)  
├── data/            
│   └── process/     # Downloading and preprocessing data  
└── eval/            # Evaluation pipeline  

🛠️ Code Setup

Get started by cloning our codebase.

git clone https://github.com/maxxu05/pulseppg.git
cd pulseppg

(A) Download Model Weights

The pre-trained model weights are available on Zenodo at DOI 10.5281/zenodo.17270930. For convenience, we provide a bash script that downloads and unpacks the weights.

bash ./download_pulseppg.sh

(B) Python Environment

For this project we use miniconda to manage dependencies. After installing miniconda, create the pulseppg environment with the following terminal commands:

conda env create -f env.yml
conda activate pulseppg
pip install -e . 

(C) Download and Preprocess Evaluation Data

Here you can download and preprocess our public evaluation datasets:

python pulseppg/data/process/PPGBP.py
python pulseppg/data/process/PPGDALIA.py
python pulseppg/data/process/SDB.py
python pulseppg/data/process/WESAD.py
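If you prefer a single driver for all four preprocessing scripts, a small sketch like the one below builds and prints each command; the actual `subprocess.run` call is commented out so the snippet stays self-contained, and you would uncomment it when running inside the repo:

```python
import subprocess  # used once the run call below is uncommented

# The four public evaluation datasets, matching the script names above
DATASETS = ["PPGBP", "PPGDALIA", "SDB", "WESAD"]

for name in DATASETS:
    cmd = ["python", f"pulseppg/data/process/{name}.py"]
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment inside the repo
```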

👨‍💻 Code Usage

(A) Evaluate PulsePPG

To run our evaluations, after downloading the model weights or re-training the model, simply run

python run_exp.py --config pulseppg --retrain_eval

(B) Evaluate PulsePPG with your own Evaluation data

To run evaluations on your own data, add your data to pulseppg/data/datasets and add an Eval_Config in experiments/out/PulsePPG_expconfigs.py, then run python run_exp.py --config pulseppg again. Note that the configs currently support only linear probing; fine-tuning will be added later for best task-specific performance.

(C) Re-train PulsePPG with your own Pre-training data

If you want to re-train from scratch, change the data_folder parameter in experiments/out/MotifDist_expconfigs.py and experiments/out/PulsePPG_expconfigs.py to point to your own pre-training data folder. Ensure the new pre-training data_folder follows the same hierarchy (e.g. train/subject_id/data_input_{i}.npy). See pulseppg/data/process/DUMMY.py for more details on the expected file hierarchy.
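To sanity-check that your folder matches the expected train/subject_id/data_input_{i}.npy layout, a short script like this writes dummy segments into that hierarchy (the segment length and sampling rate here are illustrative assumptions, not values taken from the repo):

```python
import os
import tempfile

import numpy as np

def write_dummy_pretraining_data(root, n_subjects=2, n_files=3, seconds=240, hz=25):
    """Create <root>/train/<subject_id>/data_input_<i>.npy dummy PPG segments."""
    for s in range(n_subjects):
        subj_dir = os.path.join(root, "train", f"subject_{s}")
        os.makedirs(subj_dir, exist_ok=True)
        for i in range(n_files):
            # One synthetic 1-D PPG segment per file (seconds * hz samples)
            segment = np.random.randn(seconds * hz).astype(np.float32)
            np.save(os.path.join(subj_dir, f"data_input_{i}.npy"), segment)

root = tempfile.mkdtemp()
write_dummy_pretraining_data(root)
```

Pointing data_folder at a tree produced this way is a quick way to verify the training pipeline runs end to end before committing your real data.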

Note that Pulse-PPG was pre-trained with 4-minute-long data inputs, but you can pre-train with inputs of any length (e.g. 30 seconds), because we use a temporal pooling mechanism that collapses the time dimension.
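To illustrate why input length is flexible: pooling over the time axis (mean pooling is shown here as one common choice; the actual pooling mechanism in the codebase may differ) yields a fixed-size embedding regardless of input duration:

```python
import numpy as np

def temporal_mean_pool(features):
    # features: (channels, time) -> (channels,); the time axis is collapsed
    return features.mean(axis=-1)

hz = 25  # illustrative sampling rate, not necessarily the repo's value
emb_4min = temporal_mean_pool(np.random.randn(64, 4 * 60 * hz))
emb_30s = temporal_mean_pool(np.random.randn(64, 30 * hz))
print(emb_4min.shape, emb_30s.shape)  # both (64,)
```

Because the embedding size depends only on the channel dimension, downstream heads see identically shaped inputs whether the segment was 30 seconds or 4 minutes long.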

After setting up the new data_folder, to retrain, simply run

python run_exp.py --config motifdist --retrain
python run_exp.py --config pulseppg --retrain

TODO Code Additions

These will be added over the next few weeks. Feel free to follow up with us via email or GitHub so we know you are interested.

  • Add interactive colab/ipynb for easier usage
  • Add fine-tuning evals
  • Add PaPaGei and other time-series FM baselines

🙏 Acknowledgements

We are very grateful to the authors of the PaPaGei codebase for constructing a robust PPG FM evaluation framework, which we used extensively for benchmarking. Thank you!

📜 Citation

If you use our work in your research, please cite

@article{saha2025pulse,
  title={{Pulse-PPG}: An Open-Source Field-Trained {PPG} Foundation Model for Wearable Applications Across Lab and Field Settings},
  author={Saha, Mithun and Xu, Maxwell A and Mao, Wanting and Neupane, Sameer and Rehg, James M and Kumar, Santosh},
  journal={Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies},
  volume={9},
  number={3},
  pages={1--35},
  year={2025},
  publisher={ACM New York, NY, USA}
}

If you have any further questions, please feel free to email me at [email protected]
