Explore 3D point clouds in VR and in Python
ImmersivePoints is a Python package and web application that lets you visualize 3D point clouds in VR and Jupyter notebooks. Whether you're working with LiDAR data, 3D scans, ML embeddings, or any spatial data, ImmersivePoints makes it easy to see your data in an immersive, natural way.
- 🎮 VR Visualization: View point clouds in virtual reality (Oculus Quest, PC VR headsets)
- 📊 Jupyter Integration: Render point clouds inline in notebooks with a single function call
- 🎨 Flexible Formats: Support for XYZ, XYZI (with hue coloring), and XYZRGB point clouds
- ⚡ Fast: Efficient binary format and Three.js renderer for smooth performance
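The feature list above mentions an efficient binary format. The exact on-disk layout ImmersivePoints uses is not spelled out in this README, but a flat little-endian float32 dump of an (N, columns) array is a reasonable mental model. A minimal sketch (the helix data and the raw-bytes layout are illustrative assumptions, not the package's documented format):

```python
import numpy as np

# Build a small XYZRGB cloud: 1000 points on a helix, colored by height.
# (Illustrative data; the layout assumption is a flat float32 dump.)
n = 1000
t = np.linspace(0, 8 * np.pi, n)
xyz = np.stack([np.cos(t), np.sin(t), t / (8 * np.pi)], axis=1)
rgb = np.stack([xyz[:, 2], np.zeros(n), 1 - xyz[:, 2]], axis=1)
cloud = np.hstack([xyz, rgb]).astype(np.float32)  # shape (1000, 6)

# Serialize to a contiguous float32 blob: 4 bytes per value.
blob = cloud.tobytes()
assert len(blob) == n * 6 * 4
```

Because every value is a fixed-width float32, the reader can recover the array from the blob alone given the column count.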
- **Astyx LiDAR + Radar**: Automotive sensor fusion data
- **PandaSet RGB**: Full-color urban scene
- **Semantic Segmentation**: Labeled autonomous driving data
→ View more examples on immersivepoints.com
The examples/ directory contains comprehensive tutorials:
- `inline_visualization.ipynb` - Complete guide to Jupyter rendering
- `export_subsample.ipynb` - Subsample large point clouds
- `export_csv.ipynb` - Load and export CSV data
- `export_ply.ipynb` - Work with PLY 3D scan files
- `export_embeddings.ipynb` - Visualize neural network embeddings
- `export_AEV_data.ipynb` - AEV self-driving car dataset
- `Astyx dataset lidar and radar.ipynb` - Sensor fusion visualization
- `Export PandaSet.ipynb` - PandaSet with semantic labels
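As a taste of what the subsampling tutorial covers: large clouds are commonly thinned with a uniform random subset before rendering. A hedged sketch in plain NumPy (the notebook itself may use a different method):

```python
import numpy as np

rng = np.random.default_rng(0)
cloud = rng.random((1_000_000, 3), dtype=np.float32)  # a large XYZ cloud

# Keep a uniform random subset to stay within a comfortable render budget.
target = 100_000
idx = rng.choice(cloud.shape[0], size=target, replace=False)
subsampled = cloud[idx]
assert subsampled.shape == (target, 3)
```

Uniform sampling preserves the overall shape of the cloud; density-aware methods (e.g. voxel downsampling) trade more compute for more even coverage.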
ImmersivePoints supports three point cloud formats:
| Format | Columns | Description |
|---|---|---|
| XYZ | 3 | [x, y, z] - Positions only (auto-colored) |
| XYZI | 4 | [x, y, z, hue] - Positions + hue (0.0-1.0) |
| XYZRGB | 6 | [x, y, z, r, g, b] - Positions + RGB (0.0-1.0) |
All values should be float32 NumPy arrays of shape (N, columns).
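For example, the three formats can be constructed as float32 arrays like this (the data is illustrative; only the column layout, value ranges, and dtype come from the table above):

```python
import numpy as np

n = 4
xyz = np.random.rand(n, 3).astype(np.float32)            # XYZ: positions only

hue = np.linspace(0.0, 1.0, n, dtype=np.float32)         # hue in [0.0, 1.0]
xyzi = np.column_stack([xyz, hue]).astype(np.float32)    # XYZI: (n, 4)

rgb = np.random.rand(n, 3).astype(np.float32)            # r, g, b in [0.0, 1.0]
xyzrgb = np.column_stack([xyz, rgb]).astype(np.float32)  # XYZRGB: (n, 6)

for arr, cols in [(xyz, 3), (xyzi, 4), (xyzrgb, 6)]:
    assert arr.dtype == np.float32 and arr.shape == (n, cols)
```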
Explore pre-loaded datasets:
- Astyx LiDAR + Radar Fusion
- PandaSet RGB Point Cloud
- PandaSet Semantic Segmentation
- AEV Autonomous Driving
- Notre Dame 3D Scan
- Brain Clustering Visualization
For detailed documentation, see **[immersivepoints.com](https://immersivepoints.com/)** - live demos and upload tool.
ImmersivePoints works with:
- Oculus Quest 1/2/3 (standalone)
- Meta Quest Pro
- PC VR headsets (Valve Index, HTC Vive, Oculus Rift, etc.)
- Any WebXR-compatible browser
Roland Meertens is a robotics engineer and machine learning researcher with a passion for making data exploration more intuitive and accessible.
The story of ImmersivePoints began ~10 years ago when Roland was working on clustering algorithms for 3D brain data. Existing visualization tools were slow and uninformative, so he learned JavaScript to build better 3D visualizations. When Google launched Cardboard, he added VR support. Years later, when the Oculus Quest made untethered VR accessible, he finally realized his vision: walking freely through data in virtual reality.
Website: pinchofintelligence.com
This project is licensed under the MIT License - see the LICENSE file for details.
If you use ImmersivePoints in your research, please cite:
```bibtex
@software{immersivepoints2024,
  author = {Meertens, Roland},
  title  = {ImmersivePoints: 3D Point Cloud Visualization in VR},
  year   = {2024},
  url    = {https://github.com/rmeertens/ImmersivePoints}
}
```

