redis key "conj:xxxxxxx" keeps getting bigger #340
There is an unfinished but working PR with a command to reap those conj keys: #323. You may try it.
@Suor: curious what does
This is an empty key
Suor
added a commit
that referenced
this issue
Feb 25, 2023
The idea is that instead of saving all dependent cache keys in a conj set, we put a simple random stamp in those and store a checksum of all related conj stamps along with the cache data. This makes cache reads more complicated - `MGET` key + conj keys, validate the stamps checksum. However, we no longer need to store potentially big conj sets, and invalidation becomes faster, including model-level invalidation. It also removes the strong link between conj and cache keys, i.e. loss of conj keys no longer leads to a stale cache; instead we will simply drop the key on the next read. This opens an easier way for maxmemory and cluster.

So:
- more friendly to `maxmemory`, even assumes that, see #143
- eliminates issues with big conj sets and long invalidation, see #340
- `reapconjs` is not needed with it, see #323, #434

Followups:
- docs
- remove `CACHEOPS_LRU` as it's superseded by this generally
- make insideout default or even drop the old ways?
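The stamp-and-checksum scheme described in the commit message can be sketched roughly as follows. This is a hypothetical illustration, not cacheops' actual implementation: the `store` dict stands in for Redis, and all function names are made up for the sketch.

```python
# Sketch of the "insideout" idea: conj keys hold random stamps instead of
# sets of cache keys; each cache entry stores a checksum of its stamps.
import hashlib
import secrets

store = {}  # stands in for Redis


def conj_stamp(conj_key):
    # Each conj key holds a small random stamp, created on first use.
    if conj_key not in store:
        store[conj_key] = secrets.token_hex(8)
    return store[conj_key]


def checksum(stamps):
    return hashlib.md5("".join(stamps).encode()).hexdigest()


def cache_write(cache_key, data, conj_keys):
    # Store data together with its conj keys and the stamps' checksum.
    stamps = [conj_stamp(k) for k in conj_keys]
    store[cache_key] = (data, conj_keys, checksum(stamps))


def cache_read(cache_key):
    entry = store.get(cache_key)
    if entry is None:
        return None
    data, conj_keys, saved = entry
    # MGET-style read: fetch current stamps and validate the checksum.
    current = [store.get(k) for k in conj_keys]
    if any(s is None for s in current) or checksum(current) != saved:
        del store[cache_key]  # stale or lost conj key: drop on read
        return None
    return data


def invalidate(conj_key):
    # Invalidation just deletes the stamp; there is no big set to walk.
    store.pop(conj_key, None)
```

Note how losing a conj key (e.g. via `maxmemory` eviction) is harmless here: the checksum no longer validates, so the dependent cache entry is simply dropped on the next read instead of going stale.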
Suor
added a commit
that referenced
this issue
Feb 25, 2023
The idea is that instead of saving all dependent cache keys in a conj set, we put a simple random stamp in those and store a checksum of all related conj stamps along with the cache data. This makes cache reads more complicated - `MGET` key + conj keys, validate the stamps checksum. However, we no longer need to store potentially big conj sets, and invalidation becomes faster, including model-level invalidation. It also removes the strong link between conj and cache keys, i.e. loss of conj keys no longer leads to a stale cache; instead we will simply drop the key on the next read. This opens an easier way for maxmemory and cluster.

So:
- more friendly to `maxmemory`, even assumes that, see #143
- eliminates issues with big conj sets and long invalidation, see #340, #350, #444
- `reapconjs` is not needed with it, see #323, #434

Followups:
- docs
- remove `CACHEOPS_LRU` as it's superseded by this generally
- make insideout default or even drop the old ways?
django-cacheops==4.0.4
redis==5.0.4

I have been suffering from a memory leak while using cacheops. Redis memory usage keeps growing until it hits max memory and then fails. First, I tried to find big keys with the `--bigkeys` option. The biggest key is 'conj:live_live:author_id=userdata: (nil)&type=0&status=1', and below is the output of several Redis commands.

I don't know which query causes this situation, how to track the issue down further, or even what each key means. Please give me any suggestions.
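For context on why such a conj key can grow unboundedly: in the classic scheme, every distinct cached queryset adds its own cache key to the conj set, and while the cached results expire via TTL, the set members do not. A minimal illustration, with made-up names standing in for Redis structures:

```python
# Hypothetical illustration of how a conj set outgrows its cache keys
# in the classic (pre-insideout) scheme.
conj_set = set()  # stands in for the Redis set 'conj:live_live:...'
cache = {}        # cached query results, which expire via TTL

for i in range(1000):
    cache_key = f"q:{i}"     # every distinct queryset gets its own key
    cache[cache_key] = "rows"
    conj_set.add(cache_key)  # SADD conj_key cache_key

cache.clear()  # TTLs expire the cached results...
# ...but the conj set still holds every key ever added:
print(len(conj_set))  # 1000
```

The set only shrinks when the model is invalidated (or when something like the `reapconjs` command from #323 prunes dead members), which is why a hot model with many distinct queries produces ever-growing conj keys.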