
Implement a method to calculate only largest eigenvalues #112

Open
StHagel opened this issue Mar 14, 2024 · 6 comments
Labels
enhancement New feature or request

Comments

@StHagel

StHagel commented Mar 14, 2024

Is your feature request related to a problem? Please describe.
faer is very fast at computing all eigenvalues of a matrix, but oftentimes you only need a few eigenvalues (e.g. the largest/smallest ones). In fact, when I benchmark it against the spectra C++ library (which is based on eigen3), faer outperforms it by a lot when calculating all or almost all eigenvalues of a 4096x4096 matrix (~23s for faer, ~650s for spectra), but when only calculating the largest eigenvalue, spectra outperforms faer by a lot (again 23s for faer, ~0.22s for spectra).

Describe the solution you'd like
It would be great if faer incorporated a pure-Rust implementation of an eigenvalue-finding algorithm that can compute only a specified number of the largest/smallest eigenvalues. Typically, other libraries (Spectra, ARPACK, SLEPc, ...) use methods based on Krylov subspaces, such as Arnoldi iteration, for these tasks (a rough sketch is included after the links below). Maybe something along those lines would be a feasible and worthwhile extension to faer?

Describe alternatives you've considered
So far, I have not found any pure-Rust crates that efficiently solve eigenvalue problems for dense, non-Hermitian matrices where only a few eigenvalues are needed. The ones that come closest are arpack-ng, which is essentially a wrapper around the Fortran library ARPACK, and faer. arpack-ng has its own fair share of problems, which is why I have so far always used the cpp crate to call into the Spectra library.

Additional context
Benchmarks were run on an AMD Ryzen 9 5950X processor.
https://en.wikipedia.org/wiki/Arnoldi_iteration
https://en.wikipedia.org/wiki/Krylov_subspace
https://spectralib.org/
https://slepc.upv.es/
https://crates.io/crates/arpack-ng
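
For concreteness, here is a rough sketch of the Arnoldi iteration on plain row-major `Vec<f64>` data; it is not based on faer's API, and the helper names (`matvec`, `arnoldi`, ...) are only for illustration. It builds an orthonormal Krylov basis Q and a small upper-Hessenberg matrix H whose eigenvalues (the Ritz values) approximate the extremal eigenvalues of A.

```rust
// Sketch only: dense row-major storage, no faer types, illustrative helper names.

/// y = A * x for a dense row-major n x n matrix.
fn matvec(a: &[f64], n: usize, x: &[f64]) -> Vec<f64> {
    (0..n)
        .map(|i| (0..n).map(|j| a[i * n + j] * x[j]).sum())
        .collect()
}

fn norm(x: &[f64]) -> f64 {
    x.iter().map(|v| v * v).sum::<f64>().sqrt()
}

fn dot(x: &[f64], y: &[f64]) -> f64 {
    x.iter().zip(y).map(|(a, b)| a * b).sum()
}

/// Builds m steps of the Arnoldi factorization A * Q_m = Q_{m+1} * H.
/// Returns the orthonormal basis vectors and the (m+1) x m Hessenberg matrix
/// (row-major); the eigenvalues of its leading m x m block are the Ritz values.
fn arnoldi(a: &[f64], n: usize, b: &[f64], m: usize) -> (Vec<Vec<f64>>, Vec<f64>) {
    let mut q: Vec<Vec<f64>> = Vec::with_capacity(m + 1);
    let mut h = vec![0.0; (m + 1) * m];

    let b_norm = norm(b);
    q.push(b.iter().map(|v| v / b_norm).collect());

    for k in 0..m {
        // Expand the Krylov subspace with A * q_k.
        let mut w = matvec(a, n, &q[k]);

        // Modified Gram-Schmidt: orthogonalize w against the current basis.
        for j in 0..=k {
            let hjk = dot(&q[j], &w);
            h[j * m + k] = hjk;
            for (wi, qji) in w.iter_mut().zip(q[j].iter()) {
                *wi -= hjk * *qji;
            }
        }

        let w_norm = norm(&w);
        h[(k + 1) * m + k] = w_norm;
        if w_norm < 1e-12 {
            break; // The Krylov subspace is invariant; no need to continue.
        }
        q.push(w.iter().map(|v| v / w_norm).collect());
    }
    (q, h)
}
```

Production implementations (ARPACK, Spectra, SLEPc) restart this factorization, e.g. via implicit restarts or the Krylov-Schur scheme, so the subspace stays small while the wanted Ritz values converge.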

StHagel added the enhancement (New feature or request) label Mar 14, 2024
@guiburon

Hi,

I am not affiliated with the faer crate, but I am curious.

Was your 4096x4096 matrix dense in your example?
Spectra seems to be a sparse eigensolver. I think Arnoldi-iteration eigensolvers are in general much better suited to sparse eigenproblems. Also, the rule of thumb for Arnoldi iteration is that you need a Krylov subspace about twice as large as the number of eigenpairs wanted, so it is very ill-suited to computing all the eigenpairs.

May I suggest taking a look at power iteration and subspace iteration if you want the dominant or the few largest eigenpairs? Otherwise, I think for general dense matrices it is typically not faster to extract fewer than all of the eigenpairs. I am curious about what the faer approach would be, though.
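
To illustrate the suggestion, here is a minimal power-iteration sketch for the dominant eigenpair of a dense row-major matrix; the function and parameter names are hypothetical and unrelated to faer or Spectra.

```rust
/// Sketch only: estimates the dominant eigenpair of a dense row-major n x n
/// matrix by repeated multiplication and normalization. Hypothetical helper,
/// not part of faer.
fn power_iteration(a: &[f64], n: usize, max_iters: usize, tol: f64) -> (f64, Vec<f64>) {
    // Arbitrary nonzero starting vector.
    let mut x = vec![1.0; n];
    let mut lambda = 0.0;

    for _ in 0..max_iters {
        // y = A * x.
        let y: Vec<f64> = (0..n)
            .map(|i| (0..n).map(|j| a[i * n + j] * x[j]).sum())
            .collect();

        // Normalize to keep the iterate well scaled.
        let y_norm = y.iter().map(|v| v * v).sum::<f64>().sqrt();
        let x_next: Vec<f64> = y.iter().map(|v| v / y_norm).collect();

        // Rayleigh quotient x^T A x (x has unit norm) estimates the eigenvalue.
        let lambda_next: f64 = (0..n)
            .map(|i| {
                let ax_i: f64 = (0..n).map(|j| a[i * n + j] * x_next[j]).sum();
                x_next[i] * ax_i
            })
            .sum();

        if (lambda_next - lambda).abs() < tol * lambda_next.abs().max(1.0) {
            return (lambda_next, x_next);
        }
        lambda = lambda_next;
        x = x_next;
    }
    (lambda, x)
}
```

Convergence is linear with rate roughly |λ2/λ1|, so this only works well when a real dominant eigenvalue is well separated; subspace (block) iteration generalizes the same idea to a few leading eigenpairs.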

@sarah-quinones
Owner

I've recently been studying iterative solvers, so this is on my to-do list for the near future.

@StHagel
Author

StHagel commented Mar 14, 2024

> Hi,
>
> I am not affiliated with the faer crate, but I am curious.
>
> Was your 4096x4096 matrix dense in your example? Spectra seems to be a sparse eigensolver. I think Arnoldi-iteration eigensolvers are in general much better suited to sparse eigenproblems. Also, the rule of thumb for Arnoldi iteration is that you need a Krylov subspace about twice as large as the number of eigenpairs wanted, so it is very ill-suited to computing all the eigenpairs.
>
> May I suggest taking a look at power iteration and subspace iteration if you want the dominant or the few largest eigenpairs? Otherwise, I think for general dense matrices it is typically not faster to extract fewer than all of the eigenpairs. I am curious about what the faer approach would be, though.

Spectra does have methods for dense matrices as well (https://spectralib.org/doc/classspectra_1_1geneigssolver), which I have been using in my example benchmarks (the 4096x4096 matrices have all been dense).
Spectra is in fact not well suited for finding all the eigenvalues, but in my applications I only need the leading eigenpairs, hence this issue, asking for a way to get those using faer.

> I've recently been studying iterative solvers, so this is on my to-do list for the near future.

Awesome! By the way, if such a method also calculated the eigenvectors, that would be even more awesome :)

@StHagel
Author

StHagel commented Jun 9, 2024

Hey there, it's me again.

Have you had time to look into this yet?
If not, is there a way I can help out with it? If you point me towards some specific algorithm(s) that would suit this use case in faer, and towards where in the repo such a method would make the most sense, I could see if I can write up a draft PR to work from.

@Makogan

Makogan commented Sep 1, 2024

@StHagel
I am not affiliated with the faer project, but if you want to implement a state-of-the-art iterative eigenvalue solver that only finds the leading eigenvalues, I recommend looking into the Krylov-Schur method.

@StHagel
Author

StHagel commented Sep 24, 2024

I recently saw commit bb213d1. I think that means it is being worked on.
