
Import history #116

Merged

ManukMinasyan merged 26 commits into 3.x from import-history on Feb 13, 2026

Conversation

@ManukMinasyan
Contributor

No description provided.

Design for persistent import history with DB-backed metadata,
replacing the filesystem-based meta.json with an Import model
on the existing imports table.

19-task plan covering database migration, Import model,
ImportStore refactor, component updates, Filament history page,
failed rows CSV download, and test updates.

Add columns for entity_type, status, headers, column_mappings, results,
failed_rows_data, and row counters. Make file_path and importer nullable
to support the new import wizard workflow where metadata is stored in the
database instead of the filesystem.

…etadata
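The schema change described above could be sketched as a migration like the following; the column types, JSON columns, and zero defaults are assumptions for illustration, not the PR's actual code:

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::table('imports', function (Blueprint $table): void {
            // New metadata columns backing the Import model (types assumed).
            $table->string('entity_type')->nullable();
            $table->string('status')->nullable();
            $table->json('headers')->nullable();
            $table->json('column_mappings')->nullable();
            $table->json('results')->nullable();
            $table->json('failed_rows_data')->nullable();
            $table->unsignedInteger('created_rows')->default(0);
            $table->unsignedInteger('updated_rows')->default(0);
            $table->unsignedInteger('skipped_rows')->default(0);

            // Relax constraints for the DB-backed wizard flow.
            $table->string('file_path')->nullable()->change();
            $table->string('importer')->nullable()->change();
        });
    }
};
```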

Refactor all wizard components, steps, and jobs to use the Import
Eloquent model for metadata (status, headers, column_mappings, results,
team_id, user_id) and the slimmed ImportStore for SQLite-only operations
(query, connection, ensureProcessedColumn).

Key changes per file:
- WithImportStore: provides both import() and store() accessors
- UploadStep: creates Import record first, then ImportStore for SQLite
- ImportWizard: loads Import model for restore/status sync
- MappingStep: reads/saves mappings via Import model
- ReviewStep: uses Import for validation dispatch and mappings hash
- PreviewStep: uses Import for status, results, and transition
- ExecuteImportJob: loads both Import and ImportStore, writes results
  and failed rows to DB, creates FailedImportRow records
- ValidateColumnJob: loads Import for metadata, store for SQLite
- ResolveMatchesJob: passes Import to MatchResolver
- MatchResolver: accepts Import as constructor dependency

… architecture

- Rewrite CleanupImportsCommand to query Import records from DB instead of reading meta.json files
- Update all Job tests (ExecuteImportJob, ResolveMatchesJob, ValidateColumnJob) to use Import model
- Fix UploadStepTest to use Import model for metadata and ImportStore for SQLite queries
- Update helper functions to create Import records first, then ImportStore
- Remove ImportStore::load teamId parameter across all tests
- Replace store metadata calls with Import model property access (status, results, headers, etc.)

Updated 4 Livewire test files to use the new ImportStore API:
- ImportWizardTest.php
- MappingStepTest.php
- PreviewStepTest.php
- ReviewStepTest.php

Key changes:
- Import::create([...]) creates DB record, ImportStore::create($import->id) creates SQLite
- ImportStore::load($id) no longer takes teamId parameter
- Metadata methods (setHeaders, setStatus, etc.) moved from ImportStore to Import model
- MatchResolver now takes ($store, $import, $importer) instead of ($store, $importer)
- Match resolution batch ID stored in cache, not in Import model

All 241 ImportWizard tests now passing.
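The creation flow from the key changes above might look roughly like this; the class names come from the PR, but the field values and status enum are purely illustrative:

```php
// DB record is the source of truth for metadata.
$import = Import::create([
    'team_id' => $teamId,
    'user_id' => $userId,
    'entity_type' => 'contacts',       // illustrative value
    'status' => ImportStatus::Pending, // assumed enum name
    'headers' => $headers,
]);

// The SQLite store now holds only row data, keyed by the Import id.
$store = ImportStore::create($import->id);

// Loading later no longer takes a teamId parameter.
$store = ImportStore::load($import->id);
```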
Add integration test verifying Import record fields (status, completed_at,
created_rows, updated_rows, skipped_rows, results) are correctly persisted
after ExecuteImportJob completes. Apply Pint formatting fixes to
ExecuteImportJob and FailedImportRow.
- writeFailedRowsToDb now stores actual raw_data from SQLite rows instead
  of only row_number, making CSV download useful for diagnosing failures
- Generate ULIDs for failed_import_rows primary keys since insert()
  bypasses HasUlids trait
- Strip raw data from failed_rows_data JSON summary on Import model
- DownloadFailedRowsController uses Import headers for CSV columns
  instead of deriving from first row's data keys
- Signed route URLs now expire after 1 hour via temporarySignedRoute
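The expiring download link mentioned in the last bullet is typically generated with Laravel's temporarySignedRoute; a sketch, with the route name being an assumption:

```php
use Illuminate\Support\Facades\URL;

// Signed URL that the framework rejects after one hour.
$url = URL::temporarySignedRoute(
    'import-wizard.failed-rows.download', // assumed route name
    now()->addHour(),
    ['import' => $import->id],
);
```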
Copilot AI review requested due to automatic review settings February 12, 2026 23:49

Copilot AI left a comment


Pull request overview

This pull request implements an "Import History" feature by refactoring the import wizard from a filesystem-based metadata system (meta.json) to a database-backed architecture. The Import model becomes the source of truth for all import metadata, while ImportStore is slimmed down to manage only temporary SQLite databases for row data. A new Filament page allows users to view past imports and download failed rows as CSV.

Changes:

  • Replaced filesystem-based metadata storage with Import/FailedImportRow database models
  • Refactored ImportStore to be SQLite-only (removed all metadata methods)
  • Updated all Livewire components, jobs, and support classes to use Import model
  • Added ImportHistory Filament page with table view and failed rows download
  • Updated CleanupImportsCommand to work with database records instead of meta.json files
  • Updated all tests to create Import models alongside ImportStore instances

Reviewed changes

Copilot reviewed 30 out of 31 changed files in this pull request and generated 13 comments.

Show a summary per file:

- package-lock.json: Contains unresolved merge conflict markers
- database/migrations/2026_02_12_222910_expand_imports_table.php: Adds columns to imports table for metadata storage
- app-modules/ImportWizard/src/Models/Import.php: New model for import metadata with column mappings, status tracking, and relationships
- app-modules/ImportWizard/src/Models/FailedImportRow.php: New model for storing failed import row data with pruning
- app-modules/ImportWizard/src/Store/ImportStore.php: Refactored to SQLite-only wrapper, removing all metadata methods
- app-modules/ImportWizard/src/Livewire/Concerns/WithImportStore.php: Updated to provide both Import model and ImportStore access
- app-modules/ImportWizard/src/Livewire/Steps/*.php: All step components updated to use Import model for metadata
- app-modules/ImportWizard/src/Livewire/ImportWizard.php: Main wizard updated to load Import model instead of ImportStore
- app-modules/ImportWizard/src/Jobs/*.php: All jobs updated to accept Import model and use it for metadata
- app-modules/ImportWizard/src/Support/MatchResolver.php: Updated constructor to accept Import model
- app-modules/ImportWizard/src/Commands/CleanupImportsCommand.php: Rewritten to query Import models and clean up based on DB records
- app-modules/ImportWizard/src/Filament/Pages/ImportHistory.php: New Filament page for viewing import history with table and filters
- app-modules/ImportWizard/src/Http/Controllers/DownloadFailedRowsController.php: New controller for downloading failed rows as CSV
- app-modules/ImportWizard/routes/web.php: Added signed route for failed rows download
- tests/Feature/ImportWizard/**/*.php: All tests updated to create Import models alongside ImportStore
- docs/plans/*.md: Comprehensive planning documents for the implementation

Comment on lines 349 to 356
@@ -345,12 +356,31 @@ private function recordFailedRow(int $rowNumber, \Throwable $e): void
}

Copilot AI Feb 12, 2026


The recordFailedRow method at line 356 adds to the $this->failedRows array, but this data structure only contains row number and error message. However, the writeFailedRowsToDb method (line 367) needs access to the actual row data from the SQLite store to populate the FailedImportRow.data field correctly.

Consider storing the ImportRow object reference or fetching the raw_data when recording failures, so that writeFailedRowsToDb can access the complete row data for each failed row.

Comment on lines +21 to +24
$firstFailedRow = $import->failedRows()->first();
$columnHeaders = $firstFailedRow ? array_keys($firstFailedRow->data) : [];
$columnHeaders[] = 'Import Error';


Copilot AI Feb 12, 2026


The DownloadFailedRowsController uses array_keys on $firstFailedRow->data to determine column headers, but based on the ExecuteImportJob implementation, the data field will only contain 'row_number', not the actual CSV column names. This will result in incorrect CSV headers.

The column headers should come from the Import model's headers field instead. Use $import->headers to get the correct column names.

Suggested change
-$firstFailedRow = $import->failedRows()->first();
-$columnHeaders = $firstFailedRow ? array_keys($firstFailedRow->data) : [];
+$columnHeaders = is_array($import->headers) ? $import->headers : [];
 $columnHeaders[] = 'Import Error';

Comment on lines +35 to +37
$values = array_values($row->data);
$values[] = $row->validation_error ?? 'System error';
fputcsv($handle, $values, escape: '\\');

Copilot AI Feb 12, 2026


The DownloadFailedRowsController uses array_values on $row->data which will not provide the correct column ordering when data is keyed by column names. Since the headers are determined separately, you need to map each header to its corresponding value from $row->data, similar to how PreviewStep.downloadFailedRows does it (lines 321-323).
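The fix the reviewer describes, mapping each header to its value instead of relying on array order, can be sketched as a small standalone helper (the function name is hypothetical, not from the PR):

```php
<?php

/**
 * Order a failed row's values to match the CSV headers,
 * substituting '' for any column missing from the row data.
 */
function orderRowByHeaders(array $headers, array $data): array
{
    return array_map(
        fn (string $header) => $data[$header] ?? '',
        $headers,
    );
}

$headers = ['name', 'email', 'phone'];
$data = ['email' => 'a@example.com', 'name' => 'Ada'];

// Values come out in header order regardless of key order in $data:
// ['Ada', 'a@example.com', '']
$ordered = orderRowByHeaders($headers, $data);
```

The ordered array can then be passed straight to fputcsv alongside the error column.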

$table->string('file_path')->nullable()->change();
$table->string('importer')->nullable()->change();
});
}

Copilot AI Feb 12, 2026


The migration lacks a down() method for rollback support. While this may be intentional for a production system, it's good practice to include a down() method that reverses the changes. This would drop the added columns and restore the original nullable constraints on file_path and importer columns.

Suggested change
     }
+
+    public function down(): void
+    {
+        Schema::table('imports', function (Blueprint $table): void {
+            $table->dropColumn([
+                'entity_type',
+                'status',
+                'headers',
+                'column_mappings',
+                'results',
+                'failed_rows_data',
+                'created_rows',
+                'updated_rows',
+                'skipped_rows',
+            ]);
+            $table->string('file_path')->nullable(false)->change();
+            $table->string('importer')->nullable(false)->change();
+        });
+    }
 }

/** @return \Illuminate\Database\Eloquent\Builder<static> */
public function prunable(): \Illuminate\Database\Eloquent\Builder
{
    return self::where('created_at', '<=', now()->subMonth());

Copilot AI Feb 12, 2026


The prunable() method uses a static method call self::where() but the return type annotation specifies \Illuminate\Database\Eloquent\Builder<static>. For consistency with Laravel conventions and to avoid potential issues, use static::where() instead of self::where() when working with Eloquent models, especially in traits and methods that should respect late static binding.

Suggested change
-return self::where('created_at', '<=', now()->subMonth());
+return static::where('created_at', '<=', now()->subMonth());

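For context, the Prunable setup on the model looks roughly like this, with the static:: form the reviewer suggests; the one-month window comes from the PR, the rest of the class body is a sketch:

```php
use Illuminate\Database\Eloquent\Builder;
use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Prunable;

class FailedImportRow extends Model
{
    use Prunable;

    /** @return Builder<static> */
    public function prunable(): Builder
    {
        // static:: respects late static binding if the model is extended.
        return static::where('created_at', '<=', now()->subMonth());
    }
}
```

With this in place, a scheduled `model:prune` run deletes matching records.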
Comment on lines +373 to 381
$rows = collect($this->failedRows)->map(fn (array $row): array => [
    'import_id' => $import->id,
    'team_id' => $this->teamId,
    'data' => json_encode(['row_number' => $row['row']]),
    'validation_error' => $row['error'],
    'created_at' => now(),
    'updated_at' => now(),
]);


Copilot AI Feb 12, 2026


The ExecuteImportJob's writeFailedRowsToDb method is storing incomplete data. The 'data' field only includes the row_number, but the FailedImportRow model and DownloadFailedRowsController expect the full row data (all CSV column values) to be present. This will cause the failed rows CSV download to be empty or incorrect.

The data field should contain the actual CSV row data from the ImportStore SQLite database, not just the row number. You need to fetch the raw_data from the store for each failed row before writing to the database.

Suggested change
-$rows = collect($this->failedRows)->map(fn (array $row): array => [
-    'import_id' => $import->id,
-    'team_id' => $this->teamId,
-    'data' => json_encode(['row_number' => $row['row']]),
-    'validation_error' => $row['error'],
-    'created_at' => now(),
-    'updated_at' => now(),
-]);
+$rowNumbers = collect($this->failedRows)
+    ->pluck('row')
+    ->all();
+
+$rawDataByRowNumber = ImportRow::query()
+    ->where('import_id', $import->id)
+    ->whereIn('row_number', $rowNumbers)
+    ->pluck('raw_data', 'row_number');
+
+$rows = collect($this->failedRows)->map(function (array $row) use ($import, $rawDataByRowNumber): array {
+    $rawData = $rawDataByRowNumber[$row['row']] ?? json_encode(['row_number' => $row['row']]);
+
+    return [
+        'import_id' => $import->id,
+        'team_id' => $this->teamId,
+        'data' => $rawData,
+        'validation_error' => $row['error'],
+        'created_at' => now(),
+        'updated_at' => now(),
+    ];
+});

Comment on lines 367 to 384
private function writeFailedRowsToDb(Import $import): void
{
    if ($this->failedRows === []) {
        return;
    }

    $rows = collect($this->failedRows)->map(fn (array $row): array => [
        'import_id' => $import->id,
        'team_id' => $this->teamId,
        'data' => json_encode(['row_number' => $row['row']]),
        'validation_error' => $row['error'],
        'created_at' => now(),
        'updated_at' => now(),
    ]);

-   if ($this->failedRows !== []) {
-       $store->updateMeta(['failed_rows' => $this->failedRows]);
+   foreach ($rows->chunk(100) as $chunk) {
+       $import->failedRows()->insert($chunk->all());
+   }

Copilot AI Feb 12, 2026


The writeFailedRowsToDb method stores incomplete data in the failed_import_rows table. Line 376 only stores ['row_number' => $row['row']] in the data field, but the DownloadFailedRowsController (lines 21-37) and PreviewStep.downloadFailedRows (lines 316-327) expect the data field to contain the full CSV row data (all column values).

This needs to query the ImportStore SQLite database to fetch the raw_data for each failed row number before inserting into the database. For example:

$failedRowNumbers = collect($this->failedRows)->pluck('row')->all();
$store = ImportStore::load($import->id);
$rawDataByRow = $store->query()
    ->whereIn('row_number', $failedRowNumbers)
    ->get()
    ->keyBy('row_number')
    ->map(fn($row) => $row->raw_data->all());

$rows = collect($this->failedRows)->map(fn (array $row) use ($rawDataByRow): array => [
    'import_id' => $import->id,
    'team_id' => $this->teamId,
    'data' => $rawDataByRow[$row['row']] ?? [],
    'validation_error' => $row['error'],
    'created_at' => now(),
    'updated_at' => now(),
]);

Comment on lines 55 to 65
// Clean up orphaned directories
$importsPath = storage_path('app/imports');

if (File::isDirectory($importsPath)) {
    foreach (File::directories($importsPath) as $directory) {
        $id = basename($directory);

        if (! Import::where('id', $id)->exists()) {
            $this->info("Cleaning up orphaned directory {$id}");
            File::deleteDirectory($directory);
            $deleted++;
        }
    }

Copilot AI Feb 12, 2026


The cleanup logic for terminal imports (Completed/Failed) only destroys the ImportStore files but doesn't actually delete the Import database record as shown on line 37. However, the comment on line 27 states "Clean up completed/failed import files (keep DB records for history)" which suggests this is intentional.

The issue is that line 56-65 cleans up "orphaned directories" - directories without matching Import records. But since terminal imports keep their DB records (line 27 comment), there should never be orphaned terminal import directories unless something went wrong.

Consider adding a comment clarifying that orphaned directories are expected only for abandoned imports that were deleted, or add logging to investigate why a directory exists without a corresponding Import record.

Comment on lines +16 to +17
abort_unless(
(string) $import->team_id === (string) $request->user()?->currentTeam?->id,

Copilot AI Feb 12, 2026


The authorization check uses $request->user()?->currentTeam?->id which may not work correctly in a Filament multi-tenancy context. Since this is accessed via a signed route from the ImportHistory Filament page, consider using filament()->getTenant()?->getKey() for consistency with how the ImportHistory page filters imports (line 39 of ImportHistory.php), or verify that the user's currentTeam is correctly set when accessing signed routes.

Suggested change
-abort_unless(
-    (string) $import->team_id === (string) $request->user()?->currentTeam?->id,
+$tenantId = filament()->getTenant()?->getKey();
+
+abort_unless(
+    (string) $import->team_id === (string) $tenantId,

protected string $view = 'import-wizard-new::filament.pages.import-history';

protected static string|null|BackedEnum $navigationIcon = 'heroicon-o-clock';

Copilot AI Feb 12, 2026


The type annotation string|null|BackedEnum for $navigationIcon is unusual. Based on Filament's API, this should typically be just ?string since icon identifiers are strings. The BackedEnum is not a standard type for navigation icons in Filament. Consider removing BackedEnum from the type annotation unless there's a specific use case for it.

Extract focused methods in CleanupImportsCommand and ImportWizard to
reduce duplication, apply early-return pattern, use Heroicon enum for
Filament v4 icons, and replace FQCNs with proper imports.

Three-phase plan covering data model cleanup (drop 6 dead columns,
add failed_rows), job safety (failed handler, incremental error
writes, remove silent 100-error cap), and UX state fixes.

Drop file_path, importer, processed_rows, successful_rows, results,
and failed_rows_data columns that were written but never read. Add the
missing failed_rows integer column. Update ExecuteImportJob, Import
model, PreviewStep, ImportHistory, and all tests to use individual
counter columns instead of the JSON results blob.

…cuteImportJob

Fixes three safety issues: (1) adds failed() handler so imports get marked
Failed when the job exhausts retries, (2) flushes failed rows per chunk
alongside other flush operations instead of only on the success path,
(3) removes MAX_STORED_ERRORS cap that silently discarded errors beyond 100.
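A failed() handler of the kind described in (1) might look like the following sketch; the importId property and ImportStatus enum name are assumptions, not confirmed by the PR:

```php
/**
 * Called by the queue worker after the job exhausts its retries,
 * so the import is not left stuck in a non-terminal status.
 */
public function failed(\Throwable $exception): void
{
    Import::query()
        ->whereKey($this->importId)
        ->update(['status' => ImportStatus::Failed]);
}
```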
Prevent users from navigating back to MappingStep while validation
batches are running, which caused stale data corruption. ReviewStep
now dispatches validation-state-changed events to ImportWizard.

Replace property-based mappings hash tracking with cache-based
validation state. Remove validationInProgress from ImportWizard
since backward navigation during validation is now allowed.
Cancel old batches on re-entry and skip dispatching when cache
hash matches.
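The cache-based skip described above could be sketched as follows; the cache key format, hash scheme, and TTL are assumptions for illustration:

```php
use Illuminate\Support\Facades\Cache;

$hash = md5(json_encode($import->column_mappings));
$cacheKey = "import-validation:{$import->id}"; // assumed key format

if (Cache::get($cacheKey) === $hash) {
    return; // mappings unchanged since the last run; skip dispatching
}

// Cancel any in-flight batch here before starting a new one, then
// remember the hash so re-entering the step with identical mappings
// becomes a no-op.
Cache::put($cacheKey, $hash, now()->addHour());
```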
Apply rector automated refactoring: query builder over magic
methods, scope attributes, instanceof checks, first-class
callables, abort_if, remove unused params. Fix all 38 phpstan
errors by adding @property annotations to models, guarding
fopen resource types, and removing dead constructor properties.
@ManukMinasyan ManukMinasyan merged commit 1e12a63 into 3.x Feb 13, 2026
4 of 6 checks passed
