Process killed when computing Voronoi indices for a 250,000-point data array #17

Open
liangtianumich opened this issue Jan 16, 2019 · 2 comments

Comments

liangtianumich commented Jan 16, 2019

I have tested this and found that it works for 225,000 data points but fails for 250,000 data points, which appears to be a memory error. Is there any solution for larger point sets?
Both pyvoro and tess have this issue.
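For reference, a synthetic reproduction might look like the sketch below. This is an assumption-laden sketch, not the reporter's actual script: the pyvoro call assumes the `compute_voronoi(points, limits, block_size)` signature from pyvoro's README, and `N` is kept small here so the sketch runs quickly; raising it towards 250,000 is where the process was reportedly killed.

```python
import numpy as np

# Hypothetical reproduction: random points in a cubic box.
# N is deliberately small here; the reported failure is near N = 250_000.
N = 5_000
box = 50.0
rng = np.random.default_rng(42)
points = rng.uniform(0.0, box, size=(N, 3))

try:
    import pyvoro  # pip install pyvoro
    cells = pyvoro.compute_voronoi(
        points.tolist(),       # point coordinates
        [[0.0, box]] * 3,      # axis-aligned bounding box
        4.0,                   # block size for voro++'s internal grid
    )
    print(len(cells))          # one Voronoi cell dict per input point
except ImportError:
    print("pyvoro not installed; only the point set was generated")
```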

utf commented Jun 10, 2019

@liangtianumich I am also running into this problem. Did you manage to find another package that can handle this many points?

liangtianumich commented Jun 10, 2019

@utf I didn't find any. I think it may not be needed, since we are usually not interested in computing the Voronoi index of every atom in a large sample. By shrinking the number of atoms of interest, we can make it work. See this issue:
#18
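That workaround can be sketched with scipy.spatial instead of pyvoro/tess (the function name and parameters below are illustrative, not from either library): compute the Voronoi index of one atom from its local neighborhood only, so memory scales with the neighborhood size rather than the full sample. The cutoff must be chosen large enough that all true geometric neighbors of the atom fall inside it.

```python
import numpy as np
from scipy.spatial import Voronoi, cKDTree

def local_voronoi_index(points, center_idx, cutoff):
    """Voronoi index of one atom, computed from its local neighborhood only.

    Returns a dict mapping face-edge-count -> number of faces (e.g. {4: 6}
    means six four-edged faces). Memory use depends on the neighborhood
    size, not on the total number of points in the sample.
    """
    tree = cKDTree(points)
    neigh = tree.query_ball_point(points[center_idx], r=cutoff)
    local = points[neigh]                   # small sub-array around the atom
    center_local = neigh.index(center_idx)  # atom's index within it
    vor = Voronoi(local)
    counts = {}
    for (p, q), verts in zip(vor.ridge_points, vor.ridge_vertices):
        # each ridge is one face of the Voronoi cell shared by points p, q;
        # skip unbounded faces (they contain the sentinel vertex -1)
        if center_local in (p, q) and -1 not in verts:
            counts[len(verts)] = counts.get(len(verts), 0) + 1
    return counts

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 10.0, size=(2000, 3))
    # pick the atom nearest the box centre so its cell is bounded
    center = int(np.argmin(np.linalg.norm(pts - 5.0, axis=1)))
    print(local_voronoi_index(pts, center, cutoff=3.0))
```

Looping this over only the atoms of interest avoids ever holding all 250,000 Voronoi cells in memory at once.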
