Postprocessing memory issue #78

Open
OliS1311 opened this issue Dec 7, 2024 · 3 comments
Comments

@OliS1311

OliS1311 commented Dec 7, 2024

When running cell_inference.py on larger WSI files (currently just piloting this on individual slides), the postprocessing stage does not complete. I can also see that memory usage increases from a maximum of ~20 GB during inference to the full amount I've requested from my HPC once inference has finished (most recently it went up to 63 GB):

2024-12-07 19:08:55,566 [INFO] - Detected cells before cleaning: 913943
2024-12-07 19:08:55,566 [INFO] - Initializing Cell-Postprocessor

No errors are thrown, but the session times out at this stage. Do you have any advice for resolving this issue? I'm able to run inference on smaller files without any problems. I'm running the PanNuke-pretrained CellViT-SAM-H-x40.pth model with --geojson.
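
One way to confirm that the spike comes from the postprocessing stage rather than the model itself is to log the process's resident memory around the stage boundary. A minimal sketch using psutil; the run_inference / run_postprocessing calls are hypothetical placeholders, not CellViT functions:

```python
import psutil

def log_rss(stage: str) -> None:
    # Resident set size of the current process, in GB.
    rss_gb = psutil.Process().memory_info().rss / 1024 ** 3
    print(f"[{stage}] RSS: {rss_gb:.1f} GB")

log_rss("before inference")
# run_inference(...)         # hypothetical placeholder for the inference call
log_rss("after inference / before postprocessing")
# run_postprocessing(...)    # hypothetical placeholder for the postprocessing call
log_rss("after postprocessing")
```

If the RSS only climbs after the "Initializing Cell-Postprocessor" log line, that points at the postprocessing step rather than inference.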

@FabianHoerst
Collaborator

Sorry, I think the postprocessing is very memory intensive in its current state. Our inference machines have a lot of RAM. You may need to change the postprocessor. I have this on my list for a new version, but have not found the time to implement it myself yet.
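
Until the postprocessor is reworked, one direction a lighter version could take is streaming the detected cells into the GeoJSON output in fixed-size chunks instead of building the full feature collection in memory. A rough sketch only; the cell dict layout ("contour" and "type" keys) is assumed here and is not CellViT's actual data structure:

```python
import json
from typing import Iterable

def write_geojson_streaming(cells: Iterable[dict], path: str, chunk_size: int = 10_000) -> None:
    """Write cells as a GeoJSON FeatureCollection without holding all features in memory.

    Each cell is assumed to be a dict with "contour" (list of [x, y] points)
    and "type" keys -- this layout is illustrative, not CellViT's actual one.
    """
    with open(path, "w") as f:
        f.write('{"type": "FeatureCollection", "features": [\n')
        first_chunk = True
        buffer = []
        for cell in cells:
            feature = {
                "type": "Feature",
                "geometry": {"type": "Polygon", "coordinates": [cell["contour"]]},
                "properties": {"classification": cell["type"]},
            }
            buffer.append(json.dumps(feature))
            if len(buffer) >= chunk_size:
                if not first_chunk:
                    f.write(",\n")
                f.write(",\n".join(buffer))
                first_chunk = False
                buffer = []
        if buffer:
            if not first_chunk:
                f.write(",\n")
            f.write(",\n".join(buffer))
        f.write("\n]}\n")
```

Writing incrementally keeps peak memory roughly proportional to the chunk size rather than to the total number of detected cells.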

@OliS1311
Author

Thanks! I'll try again and request more memory.

@OliS1311
Author

I can confirm that more memory (~120 GB) does fix the issue, cheers.
