This repo contains a simple, safe workflow to export CSV data from Umami Cloud and import it into a self‑hosted Umami instance backed by Postgres (Neon compatible).
- Stage the CSV into a temporary table
- Transform into `session` and `website_event`
- Idempotent inserts using `ON CONFLICT DO NOTHING`
The repo ships with a small, synthetic sample CSV file so you can test end‑to‑end without exposing real data.
- `psql` installed
- A self‑hosted Umami using Postgres (Neon works great)
- The target `website_id` for each website (from your self‑hosted Umami DB)
- `import_setup.sql` — creates/truncates the staging table `public.umami_import_raw`
- `import_transform.sql` — inserts sessions/events into the production tables; takes `TARGET_WEBSITE_ID`
- `bin/import-umami.sh` — wrapper to run setup → copy → transform for a CSV
- Sample CSV: `export.csv` (synthetic)
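As a rough idea of what `import_setup.sql` does, a minimal sketch is below. The column names are illustrative placeholders; the real file mirrors the exact Umami Cloud export header.

```sql
-- Illustrative sketch: staging columns are all text so COPY never fails on types.
CREATE TABLE IF NOT EXISTS public.umami_import_raw (
    website_id text,
    session_id text,
    event_id   text,
    created_at text,
    url_path   text,
    hostname   text
    -- ...one text column per remaining CSV field
);
TRUNCATE public.umami_import_raw;
```

Keeping every staging column as `text` defers type casting to the transform step, so a malformed cell surfaces as a transform error rather than a failed bulk load.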
Replace the DSN and `TARGET_WEBSITE_ID` with values from your environment.

```sh
# Example DSN (replace!)
DSN='postgresql://user:pass@host/db?sslmode=require'

# Import sample data into your website id
bin/import-umami.sh "$DSN" export.csv 00000000-0000-0000-0000-000000000001
```
- Keep the CSV header exactly as exported by Umami Cloud.
- Run the wrapper script with your DSN, CSV path, and target website id:
```sh
bin/import-umami.sh "$DSN" /path/to/your-export.csv <YOUR_WEBSITE_ID>
```
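Under the hood, the wrapper chains three `psql` calls: setup, a client-side `\copy` of the CSV into the staging table, then the transform. A minimal sketch of that pipeline follows; the structure is assumed, and the exact flags and error handling live in `bin/import-umami.sh` itself.

```sh
# Sketch of the setup → copy → transform pipeline (assumed; see bin/import-umami.sh).
DSN="${DSN:-postgresql://user:pass@host/db?sslmode=require}"   # placeholder DSN
CSV="${CSV:-export.csv}"
WEBSITE_ID="${WEBSITE_ID:-00000000-0000-0000-0000-000000000001}"

psql "$DSN" -f import_setup.sql                                                            # 1. (re)create staging table
psql "$DSN" -c "\copy public.umami_import_raw FROM '$CSV' WITH (FORMAT csv, HEADER true)"  # 2. load the CSV client-side
psql "$DSN" -v TARGET_WEBSITE_ID="$WEBSITE_ID" -f import_transform.sql                     # 3. transform into session/website_event
```

`\copy` runs the copy from the client machine, so the CSV never needs to be readable by the database server, which matters for managed hosts like Neon.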
- The transform rewrites `website_id` to your target, so it's safe if your CSV came from a different project id.
- Empty `url_path` values become `/` to satisfy Umami constraints.
Imported:

- `session` (earliest `created_at` per `session_id`)
- `website_event` (all rows)
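The "earliest `created_at` per `session_id`" dedup plus the idempotent insert can be sketched in a single statement. This is illustrative; the real query in `import_transform.sql` carries more columns.

```sql
-- Illustrative: one row per session_id, keeping its earliest created_at.
INSERT INTO public.session (session_id, website_id, created_at)
SELECT DISTINCT ON (session_id)
       session_id::uuid,
       :'TARGET_WEBSITE_ID'::uuid,   -- rewrite to the target website
       created_at::timestamptz
FROM public.umami_import_raw
ORDER BY session_id, created_at
ON CONFLICT (session_id) DO NOTHING;
```

`DISTINCT ON (session_id)` combined with `ORDER BY session_id, created_at` keeps the first (earliest) row per session, and `ON CONFLICT DO NOTHING` makes re-runs a no-op.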
Not included: `event_data` and `session_data` key/value payloads.
Drop the staging table when done:
```sh
psql "$DSN" -c "DROP TABLE IF EXISTS public.umami_import_raw;"
```
- All inserts use `ON CONFLICT DO NOTHING` on primary keys (`session_id`, `event_id`)
- Scripts are re‑runnable
- Sample files contain only synthetic values: example hostnames, placeholder UUIDs, and harmless timestamps
MIT — see LICENSE for details.