Commit 77eb3c2

fix: prevent race condition in flushModels with refresh
Keep old cache in memory when refresh=true to avoid gap in cache availability. This prevents getModels() from triggering a fresh API fetch if called immediately after flushModels(router, true). The refreshModels() function will atomically replace the memory cache when the refresh completes, maintaining graceful degradation.
1 parent effc83e commit 77eb3c2

File tree

1 file changed: +6 −3 lines changed

src/api/providers/fetchers/modelCache.ts (6 additions, 3 deletions)

@@ -258,13 +258,16 @@ export async function initializeModelCacheRefresh(): Promise<void> {
  * @param refresh - If true, immediately fetch fresh data from API
  */
 export const flushModels = async (router: RouterName, refresh: boolean = false): Promise<void> => {
-	memoryCache.del(router)
-
 	if (refresh) {
-		// Trigger background refresh - don't await to avoid blocking
+		// Don't delete memory cache - let refreshModels atomically replace it
+		// This prevents a race condition where getModels() might be called
+		// before refresh completes, avoiding a gap in cache availability
 		refreshModels({ provider: router } as GetModelsOptions).catch((error) => {
 			console.error(`[flushModels] Refresh failed for ${router}:`, error)
 		})
+	} else {
+		// Only delete memory cache when not refreshing
+		memoryCache.del(router)
 	}
 }
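The pattern behind this fix can be sketched with a toy in-memory cache. The names `fetchModelsFromApi` and the `Models` shape are stand-ins invented for illustration (the real module uses `NodeCache`, `RouterName`, and `GetModelsOptions`); the point is that a flush with `refresh=true` leaves the stale entry readable until the background refresh atomically replaces it, while a plain flush deletes it immediately:

```typescript
// Minimal sketch of the race-free flush pattern; cache and fetch are stand-ins.
type Models = Record<string, { maxTokens: number }>

const memoryCache = new Map<string, Models>()

// Stand-in for the real API fetch performed by refreshModels.
async function fetchModelsFromApi(router: string): Promise<Models> {
	return { [`${router}/default`]: { maxTokens: 8192 } }
}

// Replaces the cache entry in a single step once fresh data arrives,
// so readers see either the old entry or the new one - never a gap.
async function refreshModels(router: string): Promise<Models> {
	const models = await fetchModelsFromApi(router)
	memoryCache.set(router, models) // atomic replacement
	return models
}

// refresh=true: keep the stale entry and refresh in the background.
// refresh=false: drop the entry immediately.
async function flushModels(router: string, refresh = false): Promise<void> {
	if (refresh) {
		refreshModels(router).catch((error) => {
			console.error(`[flushModels] Refresh failed for ${router}:`, error)
		})
	} else {
		memoryCache.delete(router)
	}
}
```

A caller that hits the cache immediately after `flushModels(router, true)` still gets the old entry instead of triggering a redundant API fetch, which is the graceful degradation the commit message describes.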
