Hahaha is an education-first numerical computing and machine learning library written in C++23. It aims to provide a clear, inspectable implementation of tensors, automatic differentiation, and training loops, together with visualization tooling so learners can see what happens during training.
- Learning: read the code and understand how tensors, broadcasting, and autograd work end-to-end.
- Teaching / Demos: run examples that visualize the training process (loss curves, parameter updates, etc.).
- Small experiments: simple models and datasets (e.g. MNIST) without depending on a large framework.
- Tensor core: `Tensor`/`TensorWrapper` with shape (`TensorShape`), stride (`TensorStride`), broadcasting, reshape, transpose, and basic math.
- Autograd + compute graph: define-by-run graph building via `compute/graph/*` and backward propagation with unit tests.
- Broadcasting as a first-class op: explicit `Broadcast` node with correct gradient reduction (see the sketch after this list).
- Visualization + educational UX:
  - ImGui-based visualizer and demos under `examples/` to show training dynamics.
  - Code is intentionally written to be readable and traceable, with tests as executable documentation.
- Tooling: Meson + Ninja build, GoogleTest, formatting script `python3 dev/format.py`.
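Why gradient reduction matters: when a (1, 2) tensor is broadcast against a (2, 2) tensor, each broadcast element feeds two outputs, so its gradient must sum both contributions. The sketch below illustrates this, assuming the quick-start API shown later (`Tensor`, `NestedData`, `setRequiresGrad`, `backward`, `grad`) and that a single-row `NestedData` yields a (1, 2) tensor; treat it as illustrative, not as exact library output.

```cpp
#include "public/Tensor.h"

using hahaha::Tensor;
using hahaha::math::NestedData;

int main() {
    // a has shape (2, 2); b has shape (1, 2) and is broadcast across rows.
    Tensor<float> a(NestedData<float>{{1.0f, 2.0f}, {3.0f, 4.0f}});
    Tensor<float> b(NestedData<float>{{10.0f, 20.0f}});
    b.setRequiresGrad(true);

    // Define-by-run: the explicit Broadcast node enters the graph here.
    auto c = a + b;
    c.backward();

    // Gradient reduction: each element of b contributed to two outputs,
    // so b.grad() should hold {{2, 2}} (a sum over the broadcast axis),
    // assuming backward() seeds the output gradient with ones.
    return 0;
}
```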
We provide a Python script to automatically set up the complete development environment.
```sh
python3 dev/setup_dev_env.py
```

What this script does:
- 📦 Installs System Dependencies: Automatically detects your package manager (`apt` for Debian/Ubuntu, `pacman` for Arch Linux) and installs compilers, build tools, and graphics libraries.
- 🔧 Sets up Build Tools: Installs or configures Meson, Ninja, and Python tools.
- ⚓ Configures Git Hooks: Sets up pre-commit to ensure code quality before you commit.
- 📥 Manages Dependencies: Downloads necessary subprojects like GoogleTest and ImGui.
- ✅ Verifies Installation: Checks that all required tools are present and correctly configured.
If you prefer to set up manually or use a different OS:
- Compiler: C++23 compliant (GCC 13+ or Clang 16+)
- Build System: Meson and Ninja
- Libraries: `libglfw3-dev`, `libgl1-mesa-dev` (for visualization); `googletest` (handled by Meson)
- Tools: `git`, `python3`, `pre-commit`
```sh
meson setup builddir --buildtype=debug
meson compile -C builddir
meson test -C builddir -v
```

After building, you can run:
```sh
# Basic tensor usage
./builddir/examples/basic_usage/hahaha_example_tensor_basic_usage

# Autograd demo
./builddir/examples/autograd/hahaha_example_autograd

# ML training demo (CLI)
./builddir/examples/ml_basic_usage/hahaha_example_ml_basic_usage

# Visualization demo (requires GLFW/OpenGL; the visuals are still rough)
./builddir/examples/ml_visualizer/hahaha_example_ml_visualizer
```

A minimal usage example:

```cpp
#include "public/Tensor.h"

using hahaha::Tensor;
using hahaha::math::NestedData;

int main() {
    Tensor<float> a(NestedData<float>{{1.0f, 2.0f}, {3.0f, 4.0f}});
    Tensor<float> b(NestedData<float>{{10.0f, 20.0f}, {30.0f, 40.0f}});
    a.setRequiresGrad(true);
    b.setRequiresGrad(true);

    auto c = a * b + 2.0f;
    c.backward();

    // Inspect gradients via a.grad() and b.grad()
    return 0;
}
```
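Here c = a * b + 2 elementwise, so dc/da = b and dc/db = a. Assuming `backward()` on a non-scalar output seeds the gradient with ones (the usual demo convention), `a.grad()` should hold {{10, 20}, {30, 40}} and `b.grad()` should hold {{1, 2}, {3, 4}}.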
Where to look in the code:

- Tensor & shape/stride: `core/include/math/TensorWrapper.h`, `core/include/math/ds/*`
- Compute graph & autograd: `core/include/compute/graph/*`, especially `ComputeNode` and `compute_funs/*`
- Visualization: `core/src/display/*` and `examples/ml_visualizer/*`
- Tests as documentation: `tests/core/*` (broadcast + autograd tests show expected behavior; a sketch of the style follows below)
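For a flavor of that style, a test of the quick-start computation might read like the hypothetical sketch below. It is not copied from `tests/core/*`: the test name is invented, and the final assertion is a placeholder because the exact gradient accessors are not shown here.

```cpp
#include <gtest/gtest.h>

#include "public/Tensor.h"

using hahaha::Tensor;
using hahaha::math::NestedData;

// Hypothetical sketch of the tests-as-documentation style; real tests in
// tests/core/* assert through the library's own gradient accessors.
TEST(AutogradSketch, ProductPlusConstantGradients) {
    Tensor<float> a(NestedData<float>{{1.0f, 2.0f}, {3.0f, 4.0f}});
    Tensor<float> b(NestedData<float>{{10.0f, 20.0f}, {30.0f, 40.0f}});
    a.setRequiresGrad(true);
    b.setRequiresGrad(true);

    auto c = a * b + 2.0f;  // dc/da = b and dc/db = a, elementwise
    c.backward();

    // Expected: a.grad() mirrors b's values and b.grad() mirrors a's
    // values. Replace this placeholder with element-wise checks against
    // a.grad() / b.grad() in a real test.
    SUCCEED();
}
```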
Contributions are welcome! Please follow these guidelines:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a pull request

For details, refer to how-to-contribute.
Please ensure your code follows the project's coding standards by running `python3 dev/format.py` before submitting.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.