OBPIH-7198 Add possibility to migrate old product inventory transactions #5372
alannadolny merged 2 commits into develop
Conversation
Force-pushed ff3debd to e4514cf
Codecov Report
❌ Patch coverage is
Additional details and impacted files:

@@ Coverage Diff @@
##           develop    #5372    +/-  ##
==========================================
  Coverage     8.38%    8.39%
- Complexity    1030     1039      +9
==========================================
  Files          666      667      +1
  Lines        44144    44300    +156
  Branches     10658    10682     +24
==========================================
+ Hits          3701     3717     +16
- Misses       39876    40013    +137
- Partials       567      570      +3

View full report in Codecov by Sentry.
Force-pushed e4514cf to 078ed48
ewaterman
left a comment
I haven't reviewed the migration itself yet. Will do that tomorrow. Can you please post a screenshot of what the migration page looks like for reference?
grails-app/controllers/org/pih/warehouse/data/MigrationController.groovy
...-app/services/org/pih/warehouse/inventory/ProductInventoryTransactionMigrationService.groovy
alannadolny
left a comment
Apart from the performance concerns, it looks good to me
grails-app/services/org/pih/warehouse/data/MigrationService.groovy
    productAvailabilityService.triggerRefreshProductAvailability(location.id, products?.id, true)
    }

    return [
@awalkowiak consider storing this somewhere (an additional log or CSV)
    // Needs to flush it first to be able to overwrite dateCreated
    baselineTransaction.save(flush: true)
    // Unfortunately, we need to use raw SQL to update the dateCreated field (otherwise it will be auto-timestamped)
It is true that date_created changes to the time of the migration; it represents the creation of the row in the database. Are there any scenarios where we can't simply rely on transaction_date? I know the creation date is used as a tiebreaker when two transactions have the same transaction date, but does that actually happen in practice? We really shouldn't allow that to happen, imo, since it would likely lead to confusion.
We also have to update the created date to the original transaction's created date if we don't want to treat all of these transactions as backdated. Currently, I think comparing those two dates is our only way to detect a backdated transaction, so without the update we would end up with a transaction date in the past and a date created on the day the migration was run.
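The backdated check being discussed boils down to comparing the two timestamps. A minimal sketch in Java (class and method names here are illustrative, not the actual OpenBoxes implementation) shows why migrating without fixing date_created would flag every migrated transaction:

```java
import java.time.LocalDate;
import java.time.LocalDateTime;

// Illustrative sketch: a transaction is considered backdated when its
// business date (transactionDate) falls before the day its row was
// created (dateCreated). If migration leaves dateCreated as "now" while
// transactionDate stays in the past, every migrated row trips this check.
public class BackdatedCheck {
    public static boolean isBackdated(LocalDate transactionDate, LocalDateTime dateCreated) {
        return transactionDate.isBefore(dateCreated.toLocalDate());
    }
}
```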
    // 5. Delete the old transaction (with flush, to get rid of that transaction for next calculations)
    it.disableRefresh = true
    it.delete(flush: true, failOnError: true)
I imagine the slow part of this is all the saves/deletes with flushing, but it'd be good to test it out somehow. If we could zero in on a specific piece of code that's really slow, it'd probably be easier to refactor. You could even just run it locally after adding a bunch of log statements that time each block of code.
One option for speeding this process up is to simply collect a list of transaction ids to delete and then run a single DELETE FROM transaction WHERE id IN (${transactionIdsToDelete}); query at the end.
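The batch-delete idea can be sketched as a small helper that assembles one statement from the collected ids (a hypothetical illustration in Java, not the project's Groovy code; in real code the ids should be bound as parameters, or the list chunked, to avoid injection and statement-size limits):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical helper: instead of it.delete(flush: true) per row, collect
// the transaction ids during the loop and issue one DELETE at the end.
public class BatchDelete {
    public static String buildDeleteSql(List<String> transactionIds) {
        String inList = transactionIds.stream()
                .map(id -> "'" + id + "'")   // real code: use bind parameters
                .collect(Collectors.joining(", "));
        return "DELETE FROM transaction WHERE id IN (" + inList + ")";
    }
}
```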
Another thought would be to take a totally different strategy for this migration and do it kinda like we do for building stock history.
- First, select all products at a location that have a product inventory transaction (like you already do).
- Then select ALL transactions for the products at the facility ordered by transaction date and creation date
- Then loop through the transactions, summing the quantity for the product until you hit a product inventory, in which case you'd:
- add the transaction id to a list to delete
- init (but don't save yet) a new adjustment and baseline transaction based on the product inventory transaction (like you already do) and add those to a list to be saved.
- Then once you finish looping, you have all the transaction ids that you need to delete, and all the new transactions that you need to save. You can do the deletes in a single query with a flush, then do the inserts (with no flush required).
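The single-pass strategy above can be sketched like this (simplified Java stand-ins for the real Grails domain classes, with made-up record types; a transaction here is just id, product, quantity, and a product-inventory flag):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the proposed single pass: walk transactions in order, keep a
// running quantity per product, and whenever a product inventory
// transaction is hit, queue its id for deletion plus an adjustment (count
// minus running sum) and a baseline (the count itself) to insert later.
public class SinglePassMigration {
    public record Txn(String id, String productId, int quantity, boolean isProductInventory) {}
    public record NewTxn(String productId, String type, int quantity) {}
    public record Plan(List<String> idsToDelete, List<NewTxn> toInsert) {}

    public static Plan plan(List<Txn> orderedTxns) {
        Map<String, Integer> runningQty = new HashMap<>();
        List<String> idsToDelete = new ArrayList<>();
        List<NewTxn> toInsert = new ArrayList<>();
        for (Txn t : orderedTxns) {
            if (t.isProductInventory()) {
                int runSum = runningQty.getOrDefault(t.productId(), 0);
                idsToDelete.add(t.id());
                // Adjustment brings the running total up (or down) to the counted quantity
                toInsert.add(new NewTxn(t.productId(), "ADJUSTMENT", t.quantity() - runSum));
                // Baseline records the counted quantity itself
                toInsert.add(new NewTxn(t.productId(), "BASELINE", t.quantity()));
                runningQty.put(t.productId(), t.quantity());
            } else {
                runningQty.merge(t.productId(), t.quantity(), Integer::sum);
            }
        }
        return new Plan(idsToDelete, toInsert);
    }
}
```

The deletes can then go out as one flushed query and the inserts as plain saves with no per-row flush.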
Actually, @kchelstowski did some additional investigation locally, and the worst part is validation and transaction number generation. Even saving/deleting in batches does not really help here. We also discussed it a bit on yesterday's tech huddle, and for now, something that might help would be processing products in parallel (but there is an issue with transactions that have multiple products).
I am not sure the loop over all transactions would perform much better. Initially, I thought the issue was the available items calculation, for which it could help, but it turned out that part is not that bad (validation and transaction number generation are the bigger issues, and I think we cannot omit them in this approach) 🤔
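The "products in parallel" idea from the huddle could look roughly like this (a hedged sketch in Java with made-up names; the per-product work is faked as a quantity sum, and the multi-product-transaction caveat raised above would still need those transactions filtered out and handled serially):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch: partition transactions by product and migrate each partition
// independently on the common fork-join pool. A transaction spanning
// multiple products would land in more than one partition, so in a real
// implementation those must be separated out first (not shown here).
public class ParallelByProduct {
    public static Map<String, Integer> migrateInParallel(Map<String, List<Integer>> txnsByProduct) {
        return txnsByProduct.entrySet().parallelStream()
                .collect(Collectors.toConcurrentMap(
                        Map.Entry::getKey,
                        // Stand-in for the per-product migration work:
                        // here we just sum the quantities.
                        e -> e.getValue().stream().mapToInt(Integer::intValue).sum()));
    }
}
```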
Force-pushed 1c2a60b to 4264bba
Force-pushed 4264bba to a98acdd
This is the first iteration of the migration of old product inventory transactions to the Inventory Baseline and Adjustment pair. It is not the best performance-wise, but I am not sure if I can improve it by a lot (or at all?). There are some questionable parts of the code, like the use of a raw SQL update, so if you have any questions, let me know. Some further improvements that I am considering:
Current data migration page example:
