Only 50 posts are backed up using --incremental #223
At the time you ran the incremental backup, were there really more than 50 new posts?
If you at any point just want to regenerate the archive and index.html, use my fork and pass |
Sorry, I got sidetracked, but this is still not working as before. --incremental works as you describe: it only does 50 posts, and if I try again, it does nothing.
I just saw the update, and no-clobber works; I am backing up beyond 50 posts now. I made a backup of the folder beforehand, but hopefully it doesn't overwrite the media, because it's over 50 GB.
So now I guess I'd need to run without incremental, with no-clobber and timestamping, which would basically download the entire blog again (around 70,000 posts) so that the index and related files are updated, and then I could do incremental beyond 50 posts? Where do I go from here for the future? I don't want to download the whole blog again every time, and there are always more than 100 posts. Can I grab only what is missing and regenerate the index somehow? It's weird that incremental stopped working like it did before; it used to grab hundreds of posts. I have my own API key.
I forgot to mention that, assuming none of your previous incremental backups were interrupted, it sounds like |
We have reproduced the issue twice: the -i parameter takes only the 50 most recent posts into account. We run the script daily now as a workaround, but a fix would be welcome just in case.
@jorong1 @slowcar If I run these commands, I get the expected result:
The first command makes a small backup that is 175 posts out of date, and the next command backs up the missing posts using |
Hi. I finally did another blog backup with cebtenzzre@ce10f29 and it seems incremental is working properly now.
My process was to do a full non-incremental backup, as @cebtenzzre recommended, followed by an incremental one. This "reset" whatever state I was stuck in. I don't think the issue should be closed, because @slowcar is still having a problem with it, so it must be something else, but I am good now, so I don't mind if it is closed. One possible solution is to use cebtenzzre's fork if you're having issues; I don't know if that's appropriate to recommend.
I was hoping I wouldn't have to post this, but I am now getting the same error again. There's a good couple of weeks of content missing.
Version: f8ae83d
Command:
python2.7 tumblr-utils/tumblr_backup.py --incremental --save-video-tumblr --no-ssl-verify --save-audio --json BLOGNAME
Output:
50 posts backed up
I am backing up to a folder that already contains post, JSON, media, archive, and index files.
Without --incremental it grabs posts without a problem, but it will probably want to grab all the posts, which I don't want. I do want the archive and index.html files to be generated, but cancelling a mass run doesn't produce them.
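As an aside on the "regenerate the index without re-downloading" idea raised here: below is a hedged sketch of what such an offline step could look like. The `POSTS_DIR` layout and the markup are invented for illustration; tumblr_backup.py's real index format differs.

```python
import html
import os

POSTS_DIR = "posts"        # hypothetical layout: one saved HTML file per post
INDEX_FILE = "index.html"

def regenerate_index():
    """Rebuild a bare-bones index.html from files already on disk.

    No network access is needed: it only links to posts already saved,
    which is the kind of offline regeneration discussed above.
    """
    names = sorted(os.listdir(POSTS_DIR), reverse=True)  # rough newest-first order
    lines = ["<html><body><ul>"]
    for name in names:
        link = "%s/%s" % (POSTS_DIR, name)
        lines.append('<li><a href="%s">%s</a></li>'
                     % (html.escape(link), html.escape(name)))
    lines.append("</ul></body></html>")
    with open(INDEX_FILE, "w") as f:
        f.write("\n".join(lines))
```

Because this reads only what is on disk, it can be run after a cancelled or partial download without touching the media files.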
My last successful --incremental backup was end of September using 08cbe44 with my own API key
I use --incremental to avoid overwriting existing older posts and media, and to avoid going too far back. If I don't use it, will the backup just stop once it finds existing files?
I could use count this time to catch up, but I'm sure it will still only grab 50 posts on future incremental runs.
I have my own API key and I'm using the latest git version of this project. I also tried cebtenzzre@e5537c0 and got the same 50-post output.
I don't know what to do here. Thanks!