Feature/custom cache #4726
Conversation
Codecov Report

```
@@           Coverage Diff           @@
##               v5    #4726   +/-   ##
========================================
  Coverage        ?   90.73%
========================================
  Files           ?       89
  Lines           ?     3487
  Branches        ?      897
========================================
  Hits            ?     3164
  Misses          ?      301
  Partials        ?       22
```

View full report at Codecov.
Hello, my understanding of the feature is to allow users to provide a cache with different strategies. My hope would be to provide an LRU cache like this one:

```ts
class LRUCache<K, V> {
  private cache: Map<K, V>;
  private order: K[];
  private capacity: number;

  constructor(capacity: number) {
    this.capacity = capacity;
    this.cache = new Map<K, V>();
    this.order = [];
  }

  public get(key: K): V | undefined {
    if (!this.cache.has(key)) {
      return undefined;
    }
    this.updateOrder(key);
    return this.cache.get(key);
  }

  public set(key: K, value: V): void {
    if (this.cache.has(key)) {
      this.updateOrder(key);
    } else {
      if (this.order.length === this.capacity) {
        this.evict();
      }
      this.order.push(key);
    }
    this.cache.set(key, value);
  }

  public clear(): void {
    this.cache.clear();
    this.order = [];
  }

  public entries(): IterableIterator<[K, V]> {
    return this.cache.entries();
  }

  public values(): IterableIterator<V> {
    return this.cache.values();
  }

  public forEach(callbackfn: (value: V, key: K, map: Map<K, V>) => void, thisArg?: any): void {
    this.cache.forEach(callbackfn, thisArg);
  }

  public has(key: K): boolean {
    return this.cache.has(key);
  }

  private updateOrder(key: K): void {
    const index = this.order.indexOf(key);
    this.order.splice(index, 1);
    this.order.push(key);
  }

  private evict(): void {
    // order[0] is the least recently used key
    const keyToEvict = this.order.shift();
    if (keyToEvict !== undefined) {
      this.cache.delete(keyToEvict);
    }
  }
}
```

So to allow me to provide this cache, I expect to be able to provide something else than a plain `Map`, something like:

```ts
interface Cache<K, V> {
  has(key: K): boolean;
  get(key: K): V | undefined;
  set(key: K, value: V): Cache<K, V>;
  delete(key: K): boolean;
  clear(): void;
  entries(): IterableIterator<[K, V]>;
  values(): IterableIterator<V>;
  forEach(callbackfn: (value: V, key: K, map: Map<K, V>) => void, thisArg?: any): void;
}

interface QueryCacheConfig {
  createCache?: () => Cache<string, Query>
  onError?: (error: unknown, query: Query<unknown, unknown, unknown>) => void
  onSuccess?: (data: unknown, query: Query<unknown, unknown, unknown>) => void
}

class QueryCache extends Subscribable<QueryCacheListener> {
  config: QueryCacheConfig
  private queries: Cache<string, Query>
  // ...
}
```
- rename from createCache
- expose store interface
- pass in the cache into the createStore function (this is a bit weird because cache.queries doesn't exist yet, but it does once the store is created and returned :/)
- add another test that checks if the "size limit" use-case works
- mark as experimental because the interface might not be final
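For context, a rough usage sketch under this design. This is hedged: the `experimental_createStore` option name matches the snippet later in this thread, but the feature was marked experimental and may not exist in released versions, and `LRUCache` stands in for one of the implementations sketched in this thread:

```ts
import { QueryCache, QueryClient } from '@tanstack/query-core'
import type { Query } from '@tanstack/query-core'

// Hypothetical wiring: cap the cache at 50 queries via a custom store.
// `experimental_createStore` is the option name discussed in this PR.
const queryCache = new QueryCache({
  experimental_createStore: () => new LRUCache<string, Query>(50),
})

const queryClient = new QueryClient({ queryCache })
```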
@ecyrbe have a look at my last commit: I've made the interface more generic. One problem with an LRU cache is that if you set the size to 5 and you have 6 queries on screen, it will remove an active query. With the next re-render, this query will go into a hard loading state and fetch, and once data comes back, it will want to write to the cache, at which point you'll remove another active query. This will probably go on forever. Have a look at the test in the commit.

The weird thing is the invocation: we pass the cache into the createStore function before cache.queries exists; it only does once the store is created and returned.
Oh, I see the issue now. So we can only apply cache mutations on inactive queries. So if we want to do LRU, the LRU cache would need to be able to:

- know whether a query is active, so that it only ever evicts inactive ones
- notify the QueryCache when it deletes or evicts a query

I saw the test. Maybe the query store is the wrong abstraction for this feature? We want to delegate the cache handling strategy but still maintain control of how we handle active/inactive queries and notifications of the cache. Here are other options:

Option 1: pass the QueryCache into `set`:

```ts
export interface QueryStore {
  has: (queryKey: string) => boolean
  set: (queryKey: string, query: Query, cache: QueryCache) => void
  get: (queryKey: string) => Query | undefined
  delete: (queryKey: string) => void
  values: () => IterableIterator<Query>
}
```

Option 2: let the store emit events:

```ts
export interface QueryStore {
  has: (queryKey: string) => boolean
  set: (queryKey: string, query: Query) => void
  get: (queryKey: string) => Query | undefined
  delete: (queryKey: string) => void
  values: () => IterableIterator<Query>
  on: (event: string, callback: (payload: { event: string; key: string; query: Query }) => void) => void
}
```

I like this last solution better since it breaks the cyclic dependency. What do you think?
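To make the compatibility argument concrete, here is a hedged sketch of a default `Map`-backed store satisfying Option 2. `defaultQueryStore` is the name used later in this thread; the event plumbing is an assumption for illustration, not the library's actual code:

```ts
type StoreEvent = { event: string; key: string; query: Query }
type StoreListener = (payload: StoreEvent) => void

// Sketch: a plain Map plus a minimal event hook (names assumed).
export function defaultQueryStore(): QueryStore {
  const map = new Map<string, Query>()
  const listeners = new Map<string, StoreListener[]>()

  const emit = (event: string, key: string, query: Query) => {
    listeners.get(event)?.forEach((cb) => cb({ event, key, query }))
  }

  return {
    has: (queryKey) => map.has(queryKey),
    get: (queryKey) => map.get(queryKey),
    set: (queryKey, query) => {
      map.set(queryKey, query)
      emit('set', queryKey, query)
    },
    delete: (queryKey) => {
      const query = map.get(queryKey)
      if (query && map.delete(queryKey)) {
        emit('delete', queryKey, query)
      }
    },
    values: () => map.values(),
    on: (event, callback) => {
      listeners.set(event, [...(listeners.get(event) ?? []), callback])
    },
  }
}
```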
Here is an example implementation of an LRU cache with events:

```ts
import { EventEmitter } from "events";

class LRUCache<K, V extends Query> extends EventEmitter {
  private cache: Map<K, V>;
  private order: K[];
  private capacity: number;

  constructor(capacity: number) {
    super();
    this.capacity = capacity;
    this.cache = new Map<K, V>();
    this.order = [];
  }

  public get(key: K): V | undefined {
    if (!this.cache.has(key)) {
      return undefined;
    }
    this.updateOrder(key);
    return this.cache.get(key);
  }

  public set(key: K, value: V): void {
    if (this.cache.has(key)) {
      this.updateOrder(key);
    } else {
      // >= because evict() can be a no-op when every query is active,
      // letting the order array grow past the capacity
      if (this.order.length >= this.capacity) {
        this.evict();
      }
      this.order.push(key);
    }
    this.cache.set(key, value);
    this.emit("change", key, value);
  }

  public clear(): void {
    this.cache.clear();
    this.order = [];
    this.emit("clear");
  }

  public delete(key: K): boolean {
    const value = this.cache.get(key);
    if (value) {
      this.cache.delete(key);
      const index = this.order.indexOf(key);
      this.order.splice(index, 1);
      this.emit("delete", value);
    }
    return Boolean(value);
  }

  public entries(): IterableIterator<[K, V]> {
    return this.cache.entries();
  }

  public values(): IterableIterator<V> {
    return this.cache.values();
  }

  public forEach(callbackfn: (value: V, key: K, map: Map<K, V>) => void, thisArg?: any): void {
    this.cache.forEach(callbackfn, thisArg);
  }

  public has(key: K): boolean {
    return this.cache.has(key);
  }

  private updateOrder(key: K): void {
    const index = this.order.indexOf(key);
    this.order.splice(index, 1);
    this.order.push(key);
  }

  private evict(): void {
    // scan from the least recently used end and evict the first
    // *inactive* query; active queries are never removed
    for (let i = 0; i < this.order.length; i++) {
      const key = this.order[i];
      const value = this.cache.get(key);
      if (value !== undefined && !value.isActive()) {
        this.order.splice(i, 1);
        this.cache.delete(key);
        this.emit("evict", value);
        break;
      }
    }
  }
}
```

Then on QueryCache:

```ts
export class QueryCache extends Subscribable<QueryCacheListener> {
  private queries: QueryStore // was: private queries: Map<string, Query>

  constructor(public config: QueryCacheConfig = {}) {
    super()
    this.queries = config.experimental_createStore?.() ?? defaultQueryStore()
    this.queries.on('delete', (query) => this.notify({ type: 'removed', query }))
    this.queries.on('evict', (query) => {
      query.destroy()
      this.notify({ type: 'removed', query })
    })
  }

  remove(query: Query<any, any, any, any>): void {
    const queryInMap = this.queries.get(query.queryHash)
    if (queryInMap) {
      query.destroy()
      if (queryInMap === query) {
        this.queries.delete(query.queryHash) // on delete callback called
      }
    }
  }
}
```

PS: all this was generated with the help of ChatGPT, so it might have bugs.
Where would there still be a cyclic dependency in this example? You won't be able to access the QueryCache from inside the store. I honestly like this solution best. It would still be compatible with a standard `Map`.

We already have that kind of event on the whole QueryCache (see `packages/query-core/src/queryCache.ts`, lines 66 to 73 at `a358cb5`).
If such an event were enough, we wouldn't even need the custom store 😅. Maybe it is...? You could listen to the event when a query is added, find all other inactive queries, and remove them...
I'm really starting to think these are two different issues:

- limiting the size of the cache, and
- being able to provide a custom store implementation.

Also, you could still remove queries from the cache by finding out if they are active or not, even without having access to the QueryCache.
so that we can access query.queryCache to find and remove other queries
Hello @TkDodo,

The issue I have is the "explicitly" part. I like when things are modular. Here, if tomorrow you change the behaviour of QueryCache, you'll also need to change the QueryStore.
@ecyrbe yeah, I think we do have circular things in there already, because a Query lives in the QueryCache, but the Query also has access to the QueryCache itself. In my last commit, I've removed passing the QueryCache to the createStore function.
Yes, you have an internal circular dependency and I think that's OK, but exposing the circular dependency in userland is what I think you should try to avoid.
Hmm yeah, we are exposing the dependency by making a public getter on the Query. What could that break, in your opinion? Note that the fields were only TypeScript-private, so at runtime, they were always there...
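For illustration, a rough sketch of the exposure being discussed, based on the `query.queryCache` access mentioned in the commit message above. The field and constructor shape are assumptions, not the library's actual code:

```ts
// Hypothetical sketch: exposing the cache a query belongs to.
class Query {
  #cache: QueryCache

  constructor(cache: QueryCache) {
    this.#cache = cache
  }

  // Previously a TypeScript-private field (still present at runtime),
  // now reachable in userland via a public getter:
  get queryCache(): QueryCache {
    return this.#cache
  }
}
```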
**Proposed solution**

So exposing the cache linked to the query means users will for sure use it now.

**Another approach**

So the current solution is to plug a custom store into the cache. What do you think about this second approach? If it's not clear, I'm all for setting up a call on Discord if you want.
most things can be done by doing queryCache.subscribe, and we can still add this later if necessary
So I worked on this today with @ecyrbe and we found out that size limitation of the cache can actually be achieved better by subscribing to the cache, as shown in this test: `packages/query-core/src/tests/queryCache.test.tsx`, lines 101 to 114 at `f141be1`.
Since we couldn't come up with a good use-case that actually needs the custom store, we've reverted the change for now (but kept the refactoring towards an internal map). We can expose this feature in a future minor if we want to.
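For reference, a minimal sketch of the subscription-based size limit. This is not the exact test from the PR; `MAX_QUERIES` and the pruning policy are illustrative assumptions:

```ts
import { QueryClient } from '@tanstack/query-core'

const MAX_QUERIES = 5 // example limit, not a library default
const queryClient = new QueryClient()
const queryCache = queryClient.getQueryCache()

// Whenever a query is added, prune inactive queries over the limit.
const unsubscribe = queryCache.subscribe((event) => {
  if (event.type !== 'added') return

  const queries = queryCache.getAll()
  const excess = queries.length - MAX_QUERIES
  if (excess <= 0) return

  // Only remove inactive queries, so active ones are never evicted
  // (avoiding the refetch loop discussed earlier in this thread).
  queries
    .filter((query) => query.getObserversCount() === 0)
    .slice(0, excess)
    .forEach((query) => queryCache.remove(query))
})
```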