Script not scraping any content #2121
Comments
Same for me; I think OF changed something.
Same here.
Seems like OF changed the API response. Try this: in onlyfans.py, in the apis\onlyfans\classes post / message_models, at the end of
With this I'm able to download source quality; previews come out wrong, but I don't care about those.
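For anyone trying to reconstruct the idea above, here is a minimal sketch of what such a patch might look like. This is not the project's actual code: the response structure, the key names ("files", "full", "source", "url"), and the helper name pick_source_url are all assumptions based on the comment, and the exact location (the post/message model classes) would need to be confirmed against the current repository.

```python
# Hedged sketch only -- not UltimaScraper's real code. It assumes the changed
# OF API response now nests media URLs under item["files"], e.g.
# item["files"]["full"]["url"] for source quality. All key names are guesses;
# adjust them to match what the real response actually contains.

def pick_source_url(media_item: dict) -> str | None:
    """Return the best-guess source-quality URL from a media item, or None."""
    files = media_item.get("files") or {}
    # Prefer the full/source rendition; fall back to older-style top-level fields.
    for key in ("full", "source"):
        entry = files.get(key) or media_item.get(key) or {}
        url = entry.get("url") if isinstance(entry, dict) else entry
        if url:
            return url
    return None
```

The idea is simply to prefer the full/source rendition when picking a download link, which would match the comment's observation that source quality downloads work while previews come out wrong.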
Could you double-check, please?
TIA!
I just realized I'm on a very old version. The relevant files are probably in https://github.com/UltimaHoarder/UltimaScraperAPI. The structure has changed a lot from the old version and is much less readable to me. I'm not able to find the relevant sections in the current version, so I think the only one who would be able to help is @UltimaHoarder
Thanks for verifying, though! :D
I'm seeing the same thing. Here's the output of "poetry run start_us.py":
[2024-09-03 09:55:35] Assigning Job
Same here, mine is logging in and revealing all the OF subs, but crashes on attempting to download, whether all or specific types:
Using Python version 3.10.11. Tried these troubleshooting steps without a resolution:
- config.json regenerated and configured
- Latest UltimaScraper from GitHub
Over the last two days the script has stopped scraping any content for me. The script passes through all stages as normal but does not download anything other than the OnlyFans profile picture before saying "archiving complete", even though there is content to be scraped.
I have checked all settings in my auth.json and config.json and everything looks normal. As mentioned, this worked earlier in the week and nothing has changed since.