
Lazy identities via FillArrays #94

Merged
merged 18 commits into from
May 4, 2023
Conversation

amilsted
Collaborator

@amilsted amilsted commented Apr 19, 2023

Makes identityoperator() return a DataOperator containing FillArrays.Eye, which is a lazy representation of an identity matrix, as suggested in #90.

  • Adds support for skipping over square identity matrices in LazyTensor mul!().
  • Replaces SimpleIsometry, used internally by LazyTensor mul!(), with Eye.
  • Fixes handling of beta=0 in mul!() on lazy operators.
  • Does less work in lazy operator mul!() when alpha==0.

NOTE: Currently, a tensor product of lazy identities gives us a Diagonal DataOperator, rather than a big Eye.
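As a rough illustration of the underlying idea (using FillArrays directly; this is not the package's actual code path), `Eye` stores no elements yet behaves like a full identity matrix:

```julia
# Illustrative only: FillArrays.Eye is a lazy identity with O(1) storage.
using FillArrays, LinearAlgebra

id = Eye(4)                 # 4×4 identity; no elements actually stored
v = [1.0, 2.0, 3.0, 4.0]

@assert id * v == v         # multiplying by the identity changes nothing
@assert size(id) == (4, 4)
# A LazyTensor-style mul!() can detect such square identity factors
# and skip them entirely, which is what this PR adds.
```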

@amilsted amilsted requested a review from david-pl April 19, 2023 23:55
@codecov

codecov bot commented Apr 21, 2023

Codecov Report

Merging #94 (c569b86) into master (65c0d0a) will increase coverage by 0.15%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master      #94      +/-   ##
==========================================
+ Coverage   96.49%   96.65%   +0.15%     
==========================================
  Files          26       26              
  Lines        3398     3410      +12     
==========================================
+ Hits         3279     3296      +17     
+ Misses        119      114       -5     
Impacted Files                 Coverage Δ
src/operators.jl               100.00% <ø> (ø)
src/QuantumOpticsBase.jl       100.00% <100.00%> (ø)
src/operators_lazyproduct.jl   100.00% <100.00%> (ø)
src/operators_lazysum.jl       99.25% <100.00%> (+0.03%) ⬆️
src/operators_lazytensor.jl    97.91% <100.00%> (+1.13%) ⬆️
src/operators_sparse.jl        100.00% <100.00%> (ø)


@amilsted
Collaborator Author

https://github.com/qojulia/QuantumOptics.jl/blob/master/test/test_timeevolution_schroedinger.jl#L103 now fails because u0 is a lazy identity. Arguably, one should always use dense operators for propagators?

We could catch calls to schroedinger() where the initial state is the wrong type and return a more sensible error.

@Krastanov
Collaborator

> We could catch calls to schroedinger() where the initial state is the wrong type and return a more sensible error.

Maybe we can add schroedinger(tspan, u0::NotDenseOrSomething, ...) = schroedinger(tspan, dense(u0), ...)
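The shape of that suggestion, sketched generically (all names below are hypothetical stand-ins, not the real QuantumOptics API):

```julia
# Hypothetical sketch of the convert-then-redispatch pattern: a catch-all
# method densifies the initial state, then forwards to the dense method.
using FillArrays

densify(u0) = Matrix(u0)                        # stand-in for dense()
solve_dense(tspan, u0::Matrix) = (tspan, u0)    # stand-in for the real solver
solve_any(tspan, u0::AbstractMatrix) = solve_dense(tspan, densify(u0))

# A lazy identity passed as the initial state gets densified first:
ts, u = solve_any((0.0, 1.0), Eye(3))
@assert u isa Matrix
```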

@amilsted
Collaborator Author

> > We could catch calls to schroedinger() where the initial state is the wrong type and return a more sensible error.
>
> Maybe we can add schroedinger(tspan, u0::NotDenseOrSomething, ...) = schroedinger(tspan, dense(u0), ...)

Yeah, we could. I'm slightly wary because sparse operators apparently worked before. I suppose there could be cases where one knows some symmetry is preserved, so sparsity makes sense? The simplest example is a diagonal Hamiltonian and initial state.

@amilsted
Collaborator Author

@Krastanov Do you know the best way to avoid Aqua complaining about this without disabling too many tests?

@Krastanov
Collaborator

> @Krastanov Do you know the best way to avoid Aqua complaining about this without disabling too many tests?

I will look into this tonight (Apr 22nd)

@Krastanov
Collaborator

> > @Krastanov Do you know the best way to avoid Aqua complaining about this without disabling too many tests?
>
> I will look into this tonight (Apr 22nd)

Actually, it seems you have already set the exclude filter. I believe this is the proper way to do it, although I would suggest excluding the FillArrays package rather than the Base.reshape function.

If not all flags are supported, I occasionally set ambiguities=false and call the dedicated test_ambiguities with more detailed arguments: https://juliatesting.github.io/Aqua.jl/dev/#Aqua.test_ambiguities-Tuple{Any}
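Concretely, that split might look like the following (a sketch only; the package name and the Base.reshape exclusion are taken from this thread, and I have not run this against the repo's actual test setup):

```julia
# Sketch: run the full Aqua suite minus ambiguity checks, then run the
# ambiguity test separately with an explicit exclude list.
using Aqua, QuantumOpticsBase

Aqua.test_all(QuantumOpticsBase; ambiguities=false)
Aqua.test_ambiguities([QuantumOpticsBase, Base]; exclude=[Base.reshape])
```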

@david-pl
Member

@amilsted the code here looks good as far as I can tell. Performance also matches the previous implementation (except for the explicit-isometries case), though I only ran some quick checks.

Regarding the test failure in schroedinger: Would it be fair to leave the conversion to an explicit sparse/dense operator to the user here? After all, essentially the same error would occur if you attempted in-place multiplication on a Diagonal or solving an ODEProblem with one as initial state. The case we're seeing is really just a test and I wouldn't expect a lot of cases where people throw an identityoperator into a time evolution. We could then just fix the failure downstream. What do you think?

@amilsted amilsted changed the title Draft: Lazy identities via FillArrays Lazy identities via FillArrays Apr 25, 2023
@amilsted
Collaborator Author

Breakage should no longer fail after qojulia/QuantumOptics.jl#362 is merged.

@amilsted
Collaborator Author

amilsted commented May 2, 2023

@david-pl @Krastanov Do you think this is okay to merge?

@Krastanov
Collaborator

Looks good to me! Should we make an issue for the note you have?

> Currently, a tensor product of lazy identities gives us a Diagonal DataOperator, rather than a big Eye.

@david-pl david-pl merged commit c0336cf into qojulia:master May 4, 2023