Suggestions and recommendations for import/export of old data into PocketBase #6287
Replies: 3 comments 1 reply
As for pocketbase-import, it is not much more than PB client scripts with a small parser that lets it create typed collections from your input. I tried to make it reasonably flexible, though the core premise is support for CSV and flat JSON data. As it is actually used by some people, I will try my best to maintain it and keep it compatible. PRs are welcome, but I don't intend to add any connectors like the one mentioned for PostgreSQL.
CSV Importer Library

I've created a PocketBase JSVM module library that lets you pass a CSV string and the name of a collection/table. It'll create the table as a PocketBase collection and columns if they don't exist (by reading from the CSV), and it'll truncate the table and import the data from the CSV.

Reading the CSV is up to you. You can read the file from the disk or a URL, depending on your implementation. In my case, I'm reading directly from a public URL. Your usage may vary.

Caution: It is not tested thoroughly for all types of CSV formats. However, simple CSV with just commas will work. Cell values with quotes are not tested.

Important: Everything happens in a transaction, ensuring data safety. It will either import the whole CSV, or fail and import nothing at all. Make sure your CSV data is clean before performing imports.

Usage:

const utils = require(__hooks + "/import.js");
utils.import_csv(csvString, "<collection_name>");

Source Code:
/// <reference path="../pb_data/types.d.ts" />

module.exports = {
  //-------------------------------------------//
  // Import CSV to a table
  //-------------------------------------------//
  import_csv: (csv_string, table_name) => {
    // Run everything in a single transaction
    $app.runInTransaction(($txApp) => {
      // Split into rows
      const rows = csv_string.split("\n");

      // Normalize the header row into safe field names
      const headers = rows[0]
        .split(",")
        .map((header) =>
          header
            .trim()
            .toLowerCase()
            .replace(/[^a-z0-9]/g, "_")
        )
        .filter((h) => h.length > 0);

      // Get or create the collection
      let collection;
      try {
        collection = $txApp.findCollectionByNameOrId(table_name);
      } catch (err) {
        collection = new Collection({ type: "base", name: table_name });
        $txApp.save(collection);
      }

      // Add the CSV columns as text fields and persist the collection once
      headers.forEach((h) => {
        collection.fields.add(new TextField({ name: h }));
      });
      $txApp.save(collection);

      // Truncate the collection
      $txApp.truncateCollection(collection);

      // Import new records from the CSV (skip the header row and blank lines)
      for (let i = 1; i < rows.length; i++) {
        if (rows[i].trim() === "") continue;
        const values = rows[i].split(",");
        const data = Object.fromEntries(headers.map((h, j) => [h, values[j]]));
        $txApp.save(new Record(collection, data));
      }
    });
  },
  //-------------------------------------------//
};

Note: If anyone comes up with any cool modifications, please share them with me so I can improve my own version.
Since there were several posts related to importing old db data into PocketBase, this post attempts to combine in a single place the previously discussed possible approaches for doing that.
As mentioned in the recently closed related issue, at least for me, import/export of data is a difficult topic to generalize because the desired behavior usually would vary from one project to another.
I'm not sure if the below instructions would be of any help, but they could at least be used as pseudo-code examples and a research starting point to create your own custom import/export scripts.
With that said, and while it may sound strange coming from the author of the project: if you have an existing and working application that you are not planning to develop further, you might be better off sticking with your existing stack instead of migrating to PocketBase. The migration most likely won't be as straightforward as you think, especially if you expect a direct 1:1 migration from PostgreSQL or another database to work out of the box.
v1: Using the PocketBase Web APIs.
Based on the past discussions, this seems to be one of the most popular ways of importing/exporting external data.
While potentially slow, it is a simple and reasonable approach (especially for data sets < 100k records).
It usually requires the collection(s) that you are going to import the data into to be created beforehand, either via the UI or programmatically via Go or JS pb_hooks/pb_migrations.
Here is what a minimal Node.js import script could look like:
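As a rough sketch (this uses Node 18+'s built-in fetch against the REST API instead of the official JS SDK; the base URL, auth token, collection name and file name are placeholders to adjust for your setup):

```javascript
// Minimal import sketch using Node 18+'s built-in fetch against the
// PocketBase REST API. In practice you may prefer the official JS SDK.
// The base URL, token, collection name and file name are placeholders.

const BASE_URL = "http://127.0.0.1:8090";

// Normalize legacy column names into PocketBase-friendly field names.
function normalizeKeys(row) {
  return Object.fromEntries(
    Object.entries(row).map(([key, value]) => [
      key.trim().toLowerCase().replace(/[^a-z0-9]/g, "_"),
      value,
    ])
  );
}

// POST each item to the collection's records endpoint.
async function importAll(collection, items, authToken) {
  for (const item of items) {
    const res = await fetch(`${BASE_URL}/api/collections/${collection}/records`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: authToken, // superuser/admin token with write access
      },
      body: JSON.stringify(normalizeKeys(item)),
    });
    if (!res.ok) {
      console.error("failed to import record:", await res.text());
    }
  }
}

// Example invocation (fill in a real auth token first):
// const items = JSON.parse(require("node:fs").readFileSync("./data.json", "utf-8"));
// importAll("example", items, "<auth token>").catch(console.error);
```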
For exporting collection data, you could simply call
const data = await pb.collection("example").getFullList();

v2: Custom PocketBase console commands
This is similar to the previous approach but instead of using the Web APIs we can register custom import/export console commands to interact directly with the database.
The below examples use the JS pb_hooks because they are not documented that well at the moment, but you can do the same (and more) with Go.
A minimal JS pb_hooks/import.pb.js import command could look like the following, and for export we can do something similar:
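A combined sketch of both commands (hedged: the "example" collection, the data.json/export.json file names, and the use of the $os file helpers with the global toString() byte-slice helper are assumptions about your setup, not a definitive implementation):

```javascript
/// <reference path="../pb_data/types.d.ts" />
// pb_hooks/import.pb.js -- a sketch only; the "example" collection and
// the data.json/export.json file names are placeholders.

// "./pocketbase import" console command
$app.rootCmd.addCommand(
  new Command({
    use: "import",
    run: (cmd, args) => {
      // assumes data.json contains an array of plain objects
      const items = JSON.parse(toString($os.readFile("data.json")));
      const collection = $app.findCollectionByNameOrId("example");

      // import everything in a single transaction
      $app.runInTransaction((txApp) => {
        for (const item of items) {
          txApp.save(new Record(collection, item));
        }
      });
    },
  })
);

// "./pocketbase export" console command
$app.rootCmd.addCommand(
  new Command({
    use: "export",
    run: (cmd, args) => {
      const records = $app.findAllRecords("example");
      $os.writeFile("export.json", JSON.stringify(records), 0o644);
    },
  })
);
```

The commands would then be invoked as ./pocketbase import and ./pocketbase export.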
v3: Mini Go program that connects to both pb_data and the old PostgreSQL/MySQL/etc. database
Because PocketBase is distributed as a regular Go package, you could create a mini Go program that connects to both the PocketBase pb_data and your old external database within the same process, and perform the import/export programmatically from there using the PocketBase APIs and the dbx abstraction.
This is a little more advanced and a lot more involved than the previous 2 options, but it is also more flexible and allows performing the import while the old application is still running, which could potentially minimize the downtime.
It is a little outdated, but you can find one such example in https://github.com/presentator/v2tov3migrate.
v4: Using external tools
If your exported data is in CSV, one possible solution could be to use the sqlite3 .import/.excel/.dump commands.
Another option could be to use a community maintained import tool, such as https://github.com/michal-kapala/pocketbase-import (I haven't personally tested it and I'm not sure how well it works in practice).
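The sqlite3 .import route could, for example, look like the sketch below (the file names and the "example" table are placeholders; always point this at a *copy* of pb_data/data.db, never the live file):

```shell
# Sketch of a CSV import with the sqlite3 CLI. File names and the
# "example" table are placeholders; run it against a *copy* of
# pb_data/data.db rather than the live file.

rm -f data.db   # scratch db just for this demo

# sample CSV standing in for your real export
printf 'id,name\n1,ann\n2,bob\n' > export.csv

# in a real pb_data/data.db the target table/collection already exists
sqlite3 data.db "CREATE TABLE example (id TEXT, name TEXT);"

# ".import --skip 1" skips the CSV header row (requires sqlite3 >= 3.32)
sqlite3 data.db <<'SQL'
.mode csv
.import --skip 1 export.csv example
SQL
```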
v5: RAW SQLite scripts
This is another, more traditional approach, but if you are migrating from a non-SQLite database it requires manually adjusting your PostgreSQL/MySQL/etc. dump to be SQLite syntax compatible.
This approach also requires the related _collections table entries to be created beforehand, so that the inserted data is accessible in the PocketBase dashboard and Web APIs (I recommend creating the collections from the UI, or programmatically with the Collection model helpers via Go or JS pb_hooks/pb_migrations).
Note that while we use and store the collections data in plain SQLite tables, PocketBase has its own nuances in the column types it uses and in how the data is structured (e.g. multivalued fields are stored as a serialized JSON array). Therefore, if you are going with the raw SQLite scripts approach, my recommendation is to first create and play with a test PocketBase application and explore the structure of the generated pb_data/data.db, to get a general idea of the format and constraints your imported data is expected to follow.