partialschur finds only one of three degenerate eigenvectors #112
Is this still happening? I also have a problem with multiple repeated eigenvalues, and failure is sometimes observed.
I get 2 out of 3 with default settings for the current version, but all 3 with …
@Jutho did you ever look into restart issues with repeated/multiple/tightly clustered eigenvalues in IRA / Krylov-Schur? In #114 it seems to sometimes completely destroy the accuracy of the partial Schur decomposition. From the KrylovKit docs: …
I haven't checked if/how classic ARPACK deals with this differently. See also Jutho/KrylovKit.jl#23. It would be worth comparing the Schur sorts side by side; they're slightly different in how many vectors they keep when shrinking the subspace, but other than that it's mostly the same. I think ArnoldiMethod should be slightly more efficient by doing pretty much everything in place, but as a result it's more bug-prone. The question is where ArnoldiMethod and KrylovKit deviate; maybe that's the easiest way to debug.
Maybe I should revise the docs, as this description sounds a bit too optimistic. I think that in general it is very finicky to rely on numerical noise to generate the new linearly independent eigenvectors that are not present in the initial starting vector. I have also seen several issues reported about something like this, with the default values for KrylovKit.eigsolve (…). But something is strange with the ArnoldiMethod results reported above. It is not just that it fails to find 3 linearly independent eigenvectors and thus reports higher-lying eigenvalues: the second value it reports is not a correct eigenvalue for this matrix at all. I think this may point towards a bug.
This may account for the problem seen here: ISTM that the re-sorting of Ritz values in …
@RalphAS I think that might be an issue, yes. So a repeated eigenpair converges later than the others, and then gets inserted somewhere between already locked vectors, whereas typically eigenpairs converge in order. However, at the very end, reordering the Schur form should still work. Will have a look.
Looks like #116 resolves the issue, but I don't immediately see why. Initially the following is locked:
but then a duplicate is found and the last …
Looks like purging is where things are going wrong, because suddenly the Arnoldi relation no longer holds. I think the bug is that a not-yet-converged eigenvalue ends up between the locked ones, because of sorting.
I'm afraid the change did not fix things at my end.
That's why I closed it ;p
@RalphAS thanks for pointing me somewhat in the right direction. In the end though, sorting only the active part is incorrect, as the algorithm would return the first eigenvalues that converged instead of the best eigenvalues it can detect over time. For most applications those coincide (dominant eigenvalues converge first), but with repeated eigenvalues that was not the case: they start to converge rather late, sometimes by accident, and at that point you'd better toss out some already converged eigenvalues further from the target in favor of those repeated eigenvalues closer to the target, and give them a few more iterations to converge. PR #116 implements a double bug fix for purging of converged, unwanted Schur vectors. On top of that, I noticed there that this package's … Since I'm the only maintainer of this package, feel free to review; otherwise I'll self-merge.
One thing I'm not 100% sure about: my implementation of purging retains converged vectors in the search subspace; they're simply at index > number of requested eigenvalues. I think ARPACK may actually completely remove them from the search subspace: put them at index > mindim and they're automatically truncated at restart. The latter gives more space for new eigenvectors to converge. But is that helpful at all? I'm pretty sure the vector you completely toss out starts to converge immediately again, whereas by keeping it in the subspace it's deflated.
That is an interesting comparison.
In the end, purging was implemented to actually remove the converged but unwanted vectors, because they may cause stability issues in the Krylov-Schur restart due to their tiny residuals. After they're removed they may reappear due to rounding noise, but that's better than loss of precision at restart.
I have a matrix with a three-fold degenerate extremal eigenvalue, and ArnoldiMethod.partialschur finds only one of them, while e.g. Arpack.eigs finds all three. See the following MWE: … which prints: …