
Free up memory #103

Open
rosemckeon opened this issue Aug 19, 2019 · 9 comments

@rosemckeon (Owner)

@rosemckeon Whoa! That is quite memory heavy. Too late for this now I suspect, but if you build on these simulations for a future project, it might make sense to get rid of unused memory during the simulation (e.g., by printing prior generations to a file, then removing them from the R environment). The gc() function might be able to free up a bit by garbage collection.

I've got 16 GB to work with here. I'll start with one run to see how things look, then start trying everything in parallel.

Originally posted by @bradduthie in #61 (comment)
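A minimal sketch of @bradduthie's suggestion, in R. The names `step_simulation`, `pop`, and the toy loop are hypothetical stand-ins, not the actual simulation code: each generation is written straight to disk with `saveRDS()` so only the current generation stays in the R environment, and `gc()` is called to encourage R to return freed memory.

```r
## Hypothetical stand-in for one simulation step (doubles the population).
step_simulation <- function(pop) rbind(pop, pop)

out_dir <- file.path(tempdir(), "gens")
dir.create(out_dir, showWarnings = FALSE)

pop <- data.frame(id = 1, fitness = 0.5)
n_gens <- 3

for (gen in seq_len(n_gens)) {
  pop <- step_simulation(pop)
  # Write this generation to disk instead of accumulating it in memory;
  # only the current `pop` is kept in the R environment.
  saveRDS(pop, file.path(out_dir, sprintf("gen_%04d.rds", gen)))
  gc()  # ask R to run garbage collection and release freed memory
}
```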

@rosemckeon rosemckeon self-assigned this Aug 19, 2019
@rosemckeon rosemckeon added the enhancement New feature or request label Aug 19, 2019
@rosemckeon rosemckeon added this to the version 2 milestone Aug 19, 2019
@rosemckeon (Owner, Author)

I think the seedoutput data is a good candidate for removal from the R environment. It has substantially more rows than any of the other objects, and at any given time the model only needs the last generation to work with.

@rosemckeon (Owner, Author)

Useful how-to here for reading in and binding many files: https://stackoverflow.com/a/32888918
I could save each generation as a tmp file, so I don't need to keep appending to a huge R object, then bind them all together at the end.
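The read-and-bind pattern from the linked answer can be sketched like this (the directory and file names are hypothetical, purely for illustration):

```r
# Write a couple of toy per-generation files, then read them all back in
# and bind the rows into one data frame at the end.
out_dir <- file.path(tempdir(), "bind-demo")
dir.create(out_dir, showWarnings = FALSE)
saveRDS(data.frame(gen = 1, n = 10), file.path(out_dir, "gen_0001.rds"))
saveRDS(data.frame(gen = 2, n = 12), file.path(out_dir, "gen_0002.rds"))

# list.files() returns the names sorted, so zero-padded file names keep
# the generations in order.
files <- list.files(out_dir, pattern = "\\.rds$", full.names = TRUE)
all_gens <- do.call(rbind, lapply(files, readRDS))
```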

@rosemckeon (Owner, Author)

See the use of tempdir() in the unlink() examples with dir_to_clean: https://www.rdocumentation.org/packages/base/versions/3.6.1/topics/unlink
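Along the lines of that documentation example, cleanup after binding could look like this (directory and file names hypothetical):

```r
# Make a temporary directory, drop a toy .rds file in it, then remove
# the whole directory once results have been bound and saved elsewhere.
dir_to_clean <- file.path(tempdir(), "sims-to-clean")
dir.create(dir_to_clean, showWarnings = FALSE)
saveRDS(data.frame(x = 1), file.path(dir_to_clean, "gen_0001.rds"))

# recursive = TRUE deletes the directory together with its contents.
unlink(dir_to_clean, recursive = TRUE)
```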

@bradduthie (Collaborator) commented Aug 20, 2019

@rosemckeon All three simulations still running on Generations 106, 119, and 154. Memory is near maxed out at 16 GB.

@rosemckeon (Owner, Author)

@bradduthie Seems like the manual gc() might only be a small help. Are the generations you mentioned from the first set of tests? (Each simulation file has several different tests to go through.) You can check quickly by looking in each data folder to see how many files end in *10.rds.

@bradduthie (Collaborator)

Ah, okay @rosemckeon -- I do have 5-10 of each in the data folders now. Suspect this should be fine. Memory usage is quite high, but nothing seems to be slowing down my computer.

@rosemckeon (Owner, Author)

@bradduthie excellent! That may be a fair bit faster, or it could just be that quite a few have finished because you're running the most disturbed treatments first. Really good to know they're running OK anyway!

@bradduthie (Collaborator)

@rosemckeon Now with 7-12 files saved, all still running.

@rosemckeon (Owner, Author)

Brill, thanks @bradduthie! Can you push some tomorrow?
