
Implement missing LinearAlgebra wrappers and add support for uplo parameter #51

Merged (8 commits) on Oct 23, 2023

Conversation

danielwe
Contributor

Finally got around to #46 (long since closed but never fixed).

Bonus: I noticed that the existing support for Symmetric was buggy: it ignored the uplo parameter, so adapt_structure would do the wrong thing with a Symmetric instance constructed with non-default uplo. I fixed that for both Symmetric and Hermitian.
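For context, the shape of the fix looks roughly like the following sketch (schematic, not the PR's exact diff): the adapt rule must forward the wrapper's stored uplo flag instead of letting the constructor default to :U.

```julia
using Adapt, LinearAlgebra

# Buggy pattern: drops the uplo flag, so an :L-stored matrix silently
# becomes :U after adaptation.
#   Adapt.adapt_structure(to, A::Symmetric) = Symmetric(adapt(to, parent(A)))

# Fixed pattern: A.uplo is a Char ('U' or 'L'); forward it as a Symbol.
Adapt.adapt_structure(to, A::Symmetric) =
    Symmetric(adapt(to, parent(A)), Symbol(A.uplo))
Adapt.adapt_structure(to, A::Hermitian) =
    Hermitian(adapt(to, parent(A)), Symbol(A.uplo))
```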

@danielwe
Contributor Author

Reading the docstring for WrappedArray, shouldn't the {D,Bid,Trid,SymTrid}iagonal wrappers use the Src type parameter rather than Dst? They wrap 1-3 vectors and turn them into a matrix, so the wrapper has different dimensionality from the parent(s).

In this PR I just followed the pattern already in use for {D,Trid}iagonal, so I used Dst for {Bi,SymTri}diagonal.
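The dimensionality mismatch behind this question can be seen directly in the REPL: the wrapper presents more dimensions than the parent it wraps, which is why the parent's type constraint should come from the source (Src) side.

```julia
using LinearAlgebra

d = Diagonal([1.0, 2.0, 3.0])
ndims(d)          # 2: the wrapper presents a matrix
ndims(parent(d))  # 1: but it wraps a vector, so a WrappedArray-style union
                  # entry should constrain the parent with the Src array
                  # type, not the Dst type of the wrapper itself
```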

@maleadt
Member

maleadt commented Mar 21, 2022

Thanks for the PR!

> Reading the docstring for WrappedArray, shouldn't the {D,Bid,Trid,SymTrid}iagonal wrappers use the Src type parameter rather than Dst? They wrap 1-3 vectors and turn them into a matrix, so the wrapper has different dimensionality from the parent(s).

It seems that way, yes. Can you come up with a test to make sure this doesn't regress?

@danielwe
Contributor Author

I think that should do it: the tests now check that every wrapped array is a subtype of the appropriate dimension-specific AnyCustomArray{T,N} union.
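A schematic version of such a regression test might look like the sketch below. CustomArray and AnyCustomArray are hypothetical test-suite names (not Adapt.jl exports); the union relies on Adapt's documented WrappedArray{T,N,Src,Dst} type.

```julia
using Adapt, LinearAlgebra

# A minimal custom array type standing in for, e.g., a GPU array.
struct CustomArray{T,N} <: AbstractArray{T,N}
    data::Array{T,N}
end
Base.size(A::CustomArray) = size(A.data)
Base.getindex(A::CustomArray, i...) = A.data[i...]

# Union of bare custom arrays and LinearAlgebra wrappers around them,
# parameterized on element type and the *wrapper's* dimensionality.
const AnyCustomArray{T,N} =
    Union{CustomArray{T,N},
          Adapt.WrappedArray{T,N,CustomArray,CustomArray{T,N}}}

# A Diagonal of a 1-d custom array must land in the 2-d union; this only
# holds if Diagonal's entry constrains the parent via Src, not Dst.
@assert Diagonal(CustomArray(rand(3))) isa AnyCustomArray{Float64,2}
```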

@maleadt
Member

maleadt commented Mar 24, 2022

Thanks. Did you validate these changes with any of the packages that rely on Adapt.jl (like CUDA.jl)?

@codecov

codecov bot commented Mar 24, 2022

Codecov Report

Attention: 12 lines in your changes are missing coverage. Please review.

Comparison is base (5063b0f) 83.33% compared to head (d29a1e8) 77.77%.
Report is 6 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master      #51      +/-   ##
==========================================
- Coverage   83.33%   77.77%   -5.56%     
==========================================
  Files           6        6              
  Lines          72       81       +9     
==========================================
+ Hits           60       63       +3     
- Misses         12       18       +6     
Files            Coverage          Δ
src/wrappers.jl  64.44% <53.84%>   -7.78% ⬇️


@danielwe
Contributor Author

Thanks for the reminder. The entire CUDA.jl test suite passes on my end.

Looks like UpperHessenberg only exists from 1.3 onwards. Is it important to maintain support for 1.2?

@maleadt
Member

maleadt commented Mar 24, 2022

No. We can bump the requirement to the latest LTS, 1.6, and then also include the other recent versions that are missing from the CI roster (1.6, 1.7, 1.8-beta1).

@mtfishman

Can this be merged? I just hit a scalar-indexing issue caused by missing adapt rules for Hermitian.

@maleadt maleadt merged commit adf4bde into JuliaGPU:master Oct 23, 2023
15 checks passed
@mtfishman

Thanks!

maleadt added two commits that referenced this pull request on Oct 26, 2023
@maleadt
Member

maleadt commented Oct 26, 2023

Sorry, reverting this due to CUDA.jl breakage: #70 (comment)
