Free up memory #103
I think the seedoutput data is a contender for getting removed from the R environment. It has substantially more rows than any of the other outputs, and the model only needs the last generation to work with at any given time.
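For illustration, a minimal sketch of what dropping everything but the last generation could look like, assuming seedoutput is a data frame with a hypothetical `generation` column (the column name is a placeholder, not confirmed by the project):

```r
# Keep only the most recent generation in memory; earlier rows are
# dropped, since the model only works with the last generation.
last_gen   <- max(seedoutput$generation)
seedoutput <- seedoutput[seedoutput$generation == last_gen, ]
gc()  # ask R's garbage collector to release the freed memory
```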
Useful how-to here for reading in and binding loads of files: https://stackoverflow.com/a/32888918
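As a sketch of that general pattern (list the files, read each one, bind the pieces into a single data frame), assuming the outputs are CSV files in a `data/` folder; the folder name and file pattern are placeholders:

```r
# List every CSV in the data folder, read each, then bind them
# row-wise into one data frame.
files    <- list.files("data", pattern = "\\.csv$", full.names = TRUE)
all_runs <- do.call(rbind, lapply(files, read.csv))

# Equivalent data.table version, usually faster for many large files:
# library(data.table)
# all_runs <- rbindlist(lapply(files, fread))
```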
See use of …
@rosemckeon All three simulations still running, on Generations 106, 119, and 154. Memory is near maxed out at 16 GB.
@bradduthie Seems like the manual …
Ah, okay @rosemckeon -- I do have 5-10 of each in the data folders now. Suspect this should be fine. Memory usage is quite high, but nothing seems to be slowing down my computer.
@bradduthie excellent! That may be a fair bit faster, or it could just be that quite a few have finished because you're running the most disturbed first. Really good to know they're running OK anyway!
@rosemckeon Now with 7-12 files saved, all still running.
Brill, thanks @bradduthie! Can you push some tomorrow?
@rosemckeon Whoa! That is quite memory heavy. Too late for this now I suspect, but if you build on these simulations for a future project, it might make sense to free unused memory during the simulation (e.g., by writing prior generations to a file, then removing them from the R environment). The gc() function might be able to free up a bit via garbage collection. I've got 16 GB to work with here. I'll start with one run to see how things look, then start trying everything in parallel.
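A sketch of that write-then-drop idea, assuming each generation's results sit in a data frame called `gen_out` inside the simulation loop; the object name and the `output.csv` path are placeholders, not the project's actual names:

```r
# Append this generation's results to disk, then drop them from the
# R environment so memory use stays roughly constant across generations.
write.table(gen_out, "output.csv", sep = ",", append = TRUE,
            col.names = !file.exists("output.csv"), row.names = FALSE)
rm(gen_out)  # remove the object from the R environment
gc()         # prompt garbage collection to release the memory
```

Note that gc() mostly matters for returning memory to the operating system promptly; R will eventually collect unreferenced objects on its own.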
Originally posted by @bradduthie in #61 (comment)