
Not all keys are backed up. #5

Open
victorKochkarev opened this issue Feb 7, 2022 · 2 comments
Comments

@victorKochkarev

The backup operation does not back up all values.

The test was performed using memcached version 1.6.9 with the memory limit set to 4096 MB.

The Memcached server data was populated using the following Python script:

from pymemcache.client.base import Client

def main():
    print("it works")
    # Connect to the local memcached instance (default port 11211).
    client = Client("127.0.0.1")
    # Set keys z_0 .. z_9999999, each with a 7200-second TTL.
    for i in range(10000000):
        new_key = 'z_' + str(i)
        client.set(new_key, i, 7200)


if __name__ == "__main__":
    main()

As a result, 9999999 key values were set, each with a 7200-second TTL. The same test was also performed with no TTL and produced a similar result.

Then I performed a data dump using the following command:

./memcached-util --addr "127.0.0.1:11211" --op "backup" --filename "mem.json"

The command backed up only 58254 values, so 9941745 values are missing.
Please see the utility output:

2022-02-07T22:02:11.721038 main ▶ INFO 001 address 127.0.0.1:11211
2022-02-07T22:02:11.830346 backupCache ▶ INFO 002 58254 values found in the storage
2022-02-07T22:02:13.024945 backupCache ▶ INFO 003 Output file successfully generated at: mem.json

After the test I verified that there are records present on the Memcached server that are missing from the mem.json file.

@victorKochkarev
Author

I did a quick investigation and found that the function func (client memClient) ListKeys() []Key uses the stats cachedump telnet command.
The problem with this command is that its output is limited to 1 MB, and this limit is causing the issue described above.

The problem could be solved by using the lru_crawler metadump all command instead, which has no such output limit; a rough sketch follows.
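For illustration, here is a minimal Go sketch of how key enumeration via lru_crawler metadump all could work. This is only an assumption of the approach, not the existing memcached-util code; the function name listKeysViaMetadump and the plain TCP handling are made up for the example:

package main

import (
	"bufio"
	"fmt"
	"net"
	"net/url"
	"strings"
	"time"
)

// listKeysViaMetadump connects to memcached and enumerates keys using
// "lru_crawler metadump all", which streams key metadata line by line and
// is not capped the way "stats cachedump" output is.
func listKeysViaMetadump(addr string) ([]string, error) {
	conn, err := net.DialTimeout("tcp", addr, 5*time.Second)
	if err != nil {
		return nil, err
	}
	defer conn.Close()

	if _, err := fmt.Fprintf(conn, "lru_crawler metadump all\r\n"); err != nil {
		return nil, err
	}

	var keys []string
	scanner := bufio.NewScanner(conn)
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		if line == "END" {
			break
		}
		// Each line looks like:
		// key=z_42 exp=1644274931 la=1644267731 cas=43 fetch=no cls=1 size=67
		for _, field := range strings.Fields(line) {
			if strings.HasPrefix(field, "key=") {
				// Keys are URL-encoded in metadump output.
				k, err := url.QueryUnescape(strings.TrimPrefix(field, "key="))
				if err == nil {
					keys = append(keys, k)
				}
			}
		}
	}
	return keys, scanner.Err()
}

func main() {
	keys, err := listKeysViaMetadump("127.0.0.1:11211")
	if err != nil {
		panic(err)
	}
	fmt.Println(len(keys), "keys found")
}

Because the metadump stream is terminated by an END line, the client can read keys incrementally instead of depending on a fixed-size response buffer.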

I would be happy to work on that problem.

@qeepcologne

Is there any size limit? The dump is only 32 MB while the service reached its limit of 1 GB.
