From 218eb2ff36a44b65719e91b17ee772474df2f362 Mon Sep 17 00:00:00 2001 From: Willow Ahrens Date: Mon, 8 May 2023 15:29:23 -0400 Subject: [PATCH 1/4] cleanup embedding docs --- docs/make.jl | 2 +- docs/src/algebra.md | 7 ++- docs/src/embed.md | 20 --------- docs/src/fibers.md | 57 +----------------------- docs/src/interop.md | 74 +++++++++++++++++++++++++++++++ src/Finch.jl | 22 --------- src/annihilate.jl | 5 ++- src/fibers.jl | 18 ++++---- src/levels/denselevels.jl | 8 ++-- src/levels/elementlevels.jl | 8 ++-- src/levels/patternlevels.jl | 6 +-- src/levels/repeatrlelevels.jl | 8 ++-- src/levels/sparsebytemaplevels.jl | 8 ++-- src/levels/sparsecoolevels.jl | 8 ++-- src/levels/sparsehashlevels.jl | 8 ++-- src/levels/sparselistlevels.jl | 8 ++-- src/levels/sparsevbllevels.jl | 8 ++-- 17 files changed, 128 insertions(+), 147 deletions(-) delete mode 100644 docs/src/embed.md create mode 100644 docs/src/interop.md diff --git a/docs/make.jl b/docs/make.jl index 678d56fee..1e181677e 100644 --- a/docs/make.jl +++ b/docs/make.jl @@ -18,9 +18,9 @@ makedocs(; "Home" => "index.md", "Array Formats" => "fibers.md", "The Deets" => "listing.md", - "Embedding" => "embed.md", "Custom Functions" => "algebra.md", "Tensor File I/O" => "fileio.md", + "C, C++, ..." => "interop.md", "Development Guide" => "development.md", ], ) diff --git a/docs/src/algebra.md b/docs/src/algebra.md index dd68e9ad5..d97c7eb2b 100644 --- a/docs/src/algebra.md +++ b/docs/src/algebra.md @@ -6,7 +6,8 @@ CurrentModule = Finch ## User Functions -Finch supports arbitrary Julia Base functions over [`isbits`](@ref) types. You +Finch supports arbitrary Julia Base functions over +[`isbits`](https://docs.julialang.org/en/v1/base/base/#Base.isbits) types. You can also use your own functions and use them in Finch! Just remember to define any special algebraic properties of your functions so that Finch can optimize them better. 
You must declare the properties of your functions before you call @@ -85,7 +86,9 @@ the behavior of Finch in different ways, and call those Finch functions during precompilation, the resulting behavior is undefined. There are several packages that take similar, but different, approaches to -allow user participation in staged Julia programming (not to mention Base `eval` or `@generated`): [StagedFunctions.jl](https://github.com/NHDaly/StagedFunctions.jl), +allow user participation in staged Julia programming (not to mention Base `eval` +or `@generated`): +[StagedFunctions.jl](https://github.com/NHDaly/StagedFunctions.jl), [GeneralizedGenerated.jl](https://github.com/JuliaStaging/GeneralizedGenerated.jl), [RuntimeGeneratedFunctions.jl](https://github.com/SciML/RuntimeGeneratedFunctions.jl), or [Zygote.jl](https://github.com/FluxML/Zygote.jl). diff --git a/docs/src/embed.md b/docs/src/embed.md deleted file mode 100644 index 7f9ed0729..000000000 --- a/docs/src/embed.md +++ /dev/null @@ -1,20 +0,0 @@ -```@meta -CurrentModule = Finch -``` - -# Public Functions - -```@docs -Finch.h.FINCH_SCOPE -Finch.h.finch_escape -Finch.h.finch_eval -Finch.h.finch_consume_vector -Finch.h.finch_free -Finch.h.finch_mirror_vector -Finch.h.finch_initialize -Finch.h.finch_root -Finch.h.finch_exec -Finch.h.finch_T -Finch.h.finch_call -Finch.h.finch_finalize -``` \ No newline at end of file diff --git a/docs/src/fibers.md b/docs/src/fibers.md index cc3678639..af0880d0c 100644 --- a/docs/src/fibers.md +++ b/docs/src/fibers.md @@ -181,59 +181,4 @@ ElementLevel SparseListLevel SparseCOOLevel SparseHashLevel -``` - -## 0-Index Compatibility - -Julia, Matlab, etc. index arrays [starting at -1](https://docs.julialang.org/en/v1/devdocs/offset-arrays/). C, python, etc. -index starting at 0. 
In a dense array, we can simply subtract one from the -index, and in fact, this is what Julia will does under the hood when you pass a -vector [between C to -Julia](https://docs.julialang.org/en/v1/manual/embedding/#Working-with-Arrays). - -However, for sparse array formats, it's not just a matter of subtracting one -from the index, as the internal lists of indices, positions, etc all start from -zero as well. To remedy the situation, Finch defines a handy zero-indexed integer -type called `CIndex`. The internal representation of `CIndex` is one less than the -value it represents, and we can use `CIndex` as the index or position type of -a Finch array to represent arrays in other languages. - -For example, if `idx_c`, `ptr_c`, and `val_c` are the internal arrays of a CSC -matrix in a zero-indexed language, we can represent that matrix as a one-indexed -Finch array without copying by calling -```@meta -DocTestSetup = quote - using Finch - using Finch: Cindex -end -``` -```jldoctest example2 -julia> m = 4; n = 3; ptr_c = [0, 3, 3, 5]; idx_c = [1, 2, 3, 0, 2]; val_c = [1.1, 2.2, 3.3, 4.4, 5.5]; - -julia> ptr_jl = unsafe_wrap(Array, reinterpret(Ptr{Cindex{Int}}, pointer(ptr_c)), length(ptr_c); own = false) -4-element Vector{Cindex{Int64}}: - Cindex{Int64}(0) - Cindex{Int64}(3) - Cindex{Int64}(3) - Cindex{Int64}(5) -julia> idx_jl = unsafe_wrap(Array, reinterpret(Ptr{Cindex{Int}}, pointer(idx_c)), length(idx_c); own = false) -5-element Vector{Cindex{Int64}}: - Cindex{Int64}(1) - Cindex{Int64}(2) - Cindex{Int64}(3) - Cindex{Int64}(0) - Cindex{Int64}(2) -julia> A = Fiber(Dense(SparseList{Cindex{Int}, Cindex{Int}}(Element{0.0, Float64}(val_c), m, ptr_jl, idx_jl), n)) -Dense [:,1:3] -├─[:,1]: SparseList (0.0) [1:Cindex{Int64}(3)] -│ ├─[Cindex{Int64}(1)]: 1.1 -│ ├─[Cindex{Int64}(2)]: 2.2 -│ ├─[Cindex{Int64}(3)]: 3.3 -├─[:,2]: SparseList (0.0) [1:Cindex{Int64}(3)] -├─[:,3]: SparseList (0.0) [1:Cindex{Int64}(3)] -│ ├─[Cindex{Int64}(0)]: 4.4 -│ ├─[Cindex{Int64}(2)]: 5.5 -``` - 
-We can also convert between representations by by copying to or from `Cindex` fibers. \ No newline at end of file +``` \ No newline at end of file diff --git a/docs/src/interop.md b/docs/src/interop.md new file mode 100644 index 000000000..f951efc00 --- /dev/null +++ b/docs/src/interop.md @@ -0,0 +1,74 @@ +# Using Finch with Other Languages + +You can use Finch in other languages through our C interface! We also include +convenience types for converting between 0-indexed and 1-indexed arrays. + +## finch.h + +Refer to +[finch.h](https://github.com/willow-ahrens/Finch.jl/blob/main/embed/finch.h) for
detailed documentation. The public functions include a few shortcuts for
+constructing Finch datatypes, as well as convenience functions for calling Julia
+from C. Refer also to the [Julia
+documentation](https://docs.julialang.org/en/v1/manual/embedding/) for more
+general advice. Refer to the tests for a [working
+example](https://github.com/willow-ahrens/Finch.jl/blob/main/test/embed/test_embed_simple.c)
+of embedding in C. Note that calling `finch_init` will call `jl_init`, as well
+as initialize a few function pointers for the interface. Julia cannot see C
+references to Julia objects, so `finch.h` includes a few functions to introduce
+references on the Julia side that mirror C objects.
+
+## 0-Index Compatibility
+
+Julia, Matlab, etc. index arrays [starting at
+1](https://docs.julialang.org/en/v1/devdocs/offset-arrays/). C, Python, etc.
+index starting at 0. In a dense array, we can simply subtract one from the
+index, and in fact, this is what Julia does under the hood when you pass a
+vector [between C and
+Julia](https://docs.julialang.org/en/v1/manual/embedding/#Working-with-Arrays).
+
+However, for sparse array formats, it's not just a matter of subtracting one
+from the index, as the internal lists of indices, positions, etc. all start from
+zero as well. To remedy the situation, Finch defines a handy zero-indexed integer
+type called `CIndex`. 
The internal representation of `CIndex` is one less than the
+value it represents, and we can use `CIndex` as the index or position type of
+a Finch array to represent arrays in other languages.
+
+For example, if `idx_c`, `ptr_c`, and `val_c` are the internal arrays of a CSC
+matrix in a zero-indexed language, we can represent that matrix as a one-indexed
+Finch array without copying by calling
+```@meta
+DocTestSetup = quote
+    using Finch
+    using Finch: Cindex
+end
+```
+```jldoctest example2
+julia> m = 4; n = 3; ptr_c = [0, 3, 3, 5]; idx_c = [1, 2, 3, 0, 2]; val_c = [1.1, 2.2, 3.3, 4.4, 5.5];
+
+julia> ptr_jl = unsafe_wrap(Array, reinterpret(Ptr{Cindex{Int}}, pointer(ptr_c)), length(ptr_c); own = false)
+4-element Vector{Cindex{Int64}}:
+ Cindex{Int64}(0)
+ Cindex{Int64}(3)
+ Cindex{Int64}(3)
+ Cindex{Int64}(5)
+julia> idx_jl = unsafe_wrap(Array, reinterpret(Ptr{Cindex{Int}}, pointer(idx_c)), length(idx_c); own = false)
+5-element Vector{Cindex{Int64}}:
+ Cindex{Int64}(1)
+ Cindex{Int64}(2)
+ Cindex{Int64}(3)
+ Cindex{Int64}(0)
+ Cindex{Int64}(2)
+julia> A = Fiber(Dense(SparseList{Cindex{Int}, Cindex{Int}}(Element{0.0, Float64}(val_c), m, ptr_jl, idx_jl), n))
+Dense [:,1:3]
+├─[:,1]: SparseList (0.0) [1:Cindex{Int64}(3)]
+│ ├─[Cindex{Int64}(1)]: 1.1
+│ ├─[Cindex{Int64}(2)]: 2.2
+│ ├─[Cindex{Int64}(3)]: 3.3
+├─[:,2]: SparseList (0.0) [1:Cindex{Int64}(3)]
+├─[:,3]: SparseList (0.0) [1:Cindex{Int64}(3)]
+│ ├─[Cindex{Int64}(0)]: 4.4
+│ ├─[Cindex{Int64}(2)]: 5.5
+```
+
+We can also convert between representations by copying to or from `Cindex` fibers. 
\ No newline at end of file diff --git a/src/Finch.jl b/src/Finch.jl index a7b7efdd0..a7bf3fceb 100644 --- a/src/Finch.jl +++ b/src/Finch.jl @@ -83,28 +83,6 @@ include("modifiers.jl") export fsparse, fsparse!, fsprand, fspzeros, ffindnz, countstored -module h - using Finch - function generate_embed_docs() - finch_h = read(joinpath(dirname(pathof(Finch)), "../embed/finch.h"), String) - blocks = map(m -> m.captures[1], eachmatch(r"\/\*\!(((?!\*\/)(.|\n|\r))*)\*\/", finch_h)) - map(blocks) do block - block = strip(block) - lines = collect(eachline(IOBuffer(block))) - key = Meta.parse(strip(lines[1])) - body = strip(join(lines[2:end], "\n")) - @eval begin - """ - $($body) - """ - $key - end - end - end - - generate_embed_docs() -end - include("base/abstractarrays.jl") include("base/abstractunitranges.jl") include("base/broadcast.jl") diff --git a/src/annihilate.jl b/src/annihilate.jl index 138a5ecad..b8e62f3e8 100644 --- a/src/annihilate.jl +++ b/src/annihilate.jl @@ -15,8 +15,9 @@ end choose(z)(a, b) `choose(z)` is a function which returns whichever of `a` or `b` is not -[isequal](@ref) to `z`. If neither are `z`, then return `a`. Useful for getting -the first nonfill value in a sparse array. +[isequal](https://docs.julialang.org/en/v1/base/base/#Base.isequal) to `z`. If
+neither is `z`, then return `a`. Useful for getting the first nonfill value in
+a sparse array. 
```jldoctest setup=:(using Finch) julia> a = @fiber(sl(e(0.0)), [0, 1.1, 0, 4.4, 0]) SparseList (0.0) [1:5] diff --git a/src/fibers.jl b/src/fibers.jl index 3c3a8d305..f5a9b00d5 100644 --- a/src/fibers.jl +++ b/src/fibers.jl @@ -252,7 +252,7 @@ end function Base.show(io::IO, mime::MIME"text/plain", fbr::Fiber) if get(io, :compact, false) - print(io, "@fiber($(summary_f_code(fbr.lvl)))") + print(io, "@fiber($(summary_fiber_abbrev(fbr.lvl)))") else display_fiber(io, mime, fbr, 0) end @@ -260,7 +260,7 @@ end function Base.show(io::IO, mime::MIME"text/plain", fbr::VirtualFiber) if get(io, :compact, false) - print(io, "VirtualFiber($(summary_f_code(fbr.lvl)))") + print(io, "VirtualFiber($(summary_fiber_abbrev(fbr.lvl)))") else show(io, fbr) end @@ -272,7 +272,7 @@ end function Base.show(io::IO, mime::MIME"text/plain", fbr::SubFiber) if get(io, :compact, false) - print(io, "SubFiber($(summary_f_code(fbr.lvl)), $(fbr.pos))") + print(io, "SubFiber($(summary_fiber_abbrev(fbr.lvl)), $(fbr.pos))") else display_fiber(io, mime, fbr, 0) end @@ -280,7 +280,7 @@ end function Base.show(io::IO, mime::MIME"text/plain", fbr::VirtualSubFiber) if get(io, :compact, false) - print(io, "VirtualSubFiber($(summary_f_code(fbr.lvl)))") + print(io, "VirtualSubFiber($(summary_fiber_abbrev(fbr.lvl)))") else show(io, fbr) end @@ -327,7 +327,7 @@ function f_decode(ex) elseif ex isa Expr return Expr(ex.head, map(f_decode, ex.args)...) elseif ex isa Symbol - return :(@something($f_code($(Val(ex))), Some($(esc(ex))))) + return :(@something($fiber_abbrev($(Val(ex))), Some($(esc(ex))))) else return esc(ex) end @@ -340,7 +340,7 @@ Construct a fiber using abbreviated level constructor names. To override abbreviations, expressions may be interpolated with `\$`. For example, `Fiber(DenseLevel(SparseListLevel(Element(0.0))))` can also be constructed as `@fiber(sl(d(e(0.0))))`. Consult the documentation for the helper function -[f_code](@ref) for a full listing of level format codes. 
+[fiber_abbrev](@ref) for a full listing of level format codes. Optionally, an argument may be specified to copy into the fiber. This expression allocates. Use `fiber(arg)` for a zero-cost copy, if available. @@ -364,10 +364,10 @@ end end end -@inline f_code(@nospecialize ::Any) = nothing +@inline fiber_abbrev(@nospecialize ::Any) = nothing -Base.summary(fbr::Fiber) = "$(join(size(fbr), "×")) @fiber($(summary_f_code(fbr.lvl)))" -Base.summary(fbr::SubFiber) = "$(join(size(fbr), "×")) SubFiber($(summary_f_code(fbr.lvl)))" +Base.summary(fbr::Fiber) = "$(join(size(fbr), "×")) @fiber($(summary_fiber_abbrev(fbr.lvl)))" +Base.summary(fbr::SubFiber) = "$(join(size(fbr), "×")) SubFiber($(summary_fiber_abbrev(fbr.lvl)))" Base.similar(fbr::AbstractFiber) = Fiber(similar_level(fbr.lvl)) Base.similar(fbr::AbstractFiber, dims::Tuple) = Fiber(similar_level(fbr.lvl, dims...)) \ No newline at end of file diff --git a/src/levels/denselevels.jl b/src/levels/denselevels.jl index 4d0eee924..dc5624c0b 100644 --- a/src/levels/denselevels.jl +++ b/src/levels/denselevels.jl @@ -37,10 +37,10 @@ DenseLevel{Ti, Lvl}(lvl) where {Ti, Lvl} = DenseLevel{Ti, Lvl}(lvl, zero(Ti)) const Dense = DenseLevel """ -`f_code(d)` = [DenseLevel](@ref). +`fiber_abbrev(d)` = [DenseLevel](@ref). """ -f_code(::Val{:d}) = Dense -summary_f_code(lvl::Dense) = "d($(summary_f_code(lvl.lvl)))" +fiber_abbrev(::Val{:d}) = Dense +summary_fiber_abbrev(lvl::Dense) = "d($(summary_fiber_abbrev(lvl.lvl)))" similar_level(lvl::DenseLevel) = Dense(similar_level(lvl.lvl)) similar_level(lvl::DenseLevel, dims...) 
= Dense(similar_level(lvl.lvl, dims[1:end-1]...), dims[end]) @@ -115,7 +115,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualDenseLevel) end end -summary_f_code(lvl::VirtualDenseLevel) = "d($(summary_f_code(lvl.lvl)))" +summary_fiber_abbrev(lvl::VirtualDenseLevel) = "d($(summary_fiber_abbrev(lvl.lvl)))" function virtual_level_size(lvl::VirtualDenseLevel, ctx) ext = Extent(literal(lvl.Ti(1)), lvl.shape) diff --git a/src/levels/elementlevels.jl b/src/levels/elementlevels.jl index bdc03cf13..7c65b8d21 100644 --- a/src/levels/elementlevels.jl +++ b/src/levels/elementlevels.jl @@ -29,10 +29,10 @@ ElementLevel{D}(val::Vector{Tv}) where {D, Tv} = ElementLevel{D, Tv}(val) ElementLevel{D, Tv}() where {D, Tv} = ElementLevel{D, Tv}(Tv[]) """ -`f_code(e)` = [ElementLevel](@ref). +`fiber_abbrev(e)` = [ElementLevel](@ref). """ -f_code(::Val{:e}) = Element -summary_f_code(::Element{D}) where {D} = "e($(D))" +fiber_abbrev(::Val{:e}) = Element +summary_fiber_abbrev(::Element{D}) where {D} = "e($(D))" similar_level(::ElementLevel{D}) where {D} = ElementLevel{D}() pattern!(lvl::ElementLevel) = Pattern() @@ -89,7 +89,7 @@ function virtualize(ex, ::Type{ElementLevel{D, Tv}}, ctx, tag=:lvl) where {D, Tv VirtualElementLevel(sym, Tv, D) end -summary_f_code(lvl::VirtualElementLevel) = "e($(lvl.D))" +summary_fiber_abbrev(lvl::VirtualElementLevel) = "e($(lvl.D))" virtual_level_resize!(lvl::VirtualElementLevel, ctx) = lvl virtual_level_size(::VirtualElementLevel, ctx) = () diff --git a/src/levels/patternlevels.jl b/src/levels/patternlevels.jl index 1cb65966f..2af30c9a2 100644 --- a/src/levels/patternlevels.jl +++ b/src/levels/patternlevels.jl @@ -19,10 +19,10 @@ struct PatternLevel end const Pattern = PatternLevel """ -`f_code(p)` = [PatternLevel](@ref). +`fiber_abbrev(p)` = [PatternLevel](@ref). 
""" -f_code(::Val{:p}) = Pattern -summary_f_code(::Pattern) = "p()" +fiber_abbrev(::Val{:p}) = Pattern +summary_fiber_abbrev(::Pattern) = "p()" similar_level(::PatternLevel) = PatternLevel() countstored_level(lvl::PatternLevel, pos) = pos diff --git a/src/levels/repeatrlelevels.jl b/src/levels/repeatrlelevels.jl index d64875112..86404f09c 100644 --- a/src/levels/repeatrlelevels.jl +++ b/src/levels/repeatrlelevels.jl @@ -43,10 +43,10 @@ RepeatRLELevel{D, Ti, Tp, Tv}() where {D, Ti, Tp, Tv} = RepeatRLELevel{D, Ti, Tp RepeatRLELevel{D, Ti, Tp, Tv}(shape) where {D, Ti, Tp, Tv} = RepeatRLELevel{D, Ti, Tp, Tv}(Ti(shape), Tp[1], Ti[], Tv[]) """ -`f_code(rl)` = [RepeatRLELevel](@ref). +`fiber_abbrev(rl)` = [RepeatRLELevel](@ref). """ -f_code(::Val{:rl}) = RepeatRLE -summary_f_code(::RepeatRLE{D}) where {D} = "rl($(D))" +fiber_abbrev(::Val{:rl}) = RepeatRLE +summary_fiber_abbrev(::RepeatRLE{D}) where {D} = "rl($(D))" similar_level(::RepeatRLELevel{D}) where {D} = RepeatRLE{D}() similar_level(::RepeatRLELevel{D}, dim, tail...) where {D} = RepeatRLE{D}(dim) data_rep_level(::Type{<:RepeatRLELevel{D, Ti, Tp, Tv}}) where {D, Ti, Tp, Tv} = RepeatData(D, Tv) @@ -141,7 +141,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualRepeatRLELevel) end end -summary_f_code(lvl::VirtualRepeatRLELevel) = "rl($(lvl.D))" +summary_fiber_abbrev(lvl::VirtualRepeatRLELevel) = "rl($(lvl.D))" function virtual_level_size(lvl::VirtualRepeatRLELevel, ctx) ext = Extent(literal(lvl.Ti(1)), lvl.shape) diff --git a/src/levels/sparsebytemaplevels.jl b/src/levels/sparsebytemaplevels.jl index d899a2a01..21da64bbc 100644 --- a/src/levels/sparsebytemaplevels.jl +++ b/src/levels/sparsebytemaplevels.jl @@ -16,10 +16,10 @@ SparseByteMapLevel{Ti, Tp, Lvl}(lvl, shape) where {Ti, Tp, Lvl} = SparseByteMapLevel{Ti, Tp, Lvl}(lvl, Ti(shape), Tp[1], Bool[], Tuple{Tp, Ti}[]) """ -`f_code(sbm)` = [SparseByteMapLevel](@ref). +`fiber_abbrev(sbm)` = [SparseByteMapLevel](@ref). 
""" -f_code(::Val{:sbm}) = SparseByteMap -summary_f_code(lvl::SparseByteMapLevel) = "sbm($(summary_f_code(lvl.lvl)))" +fiber_abbrev(::Val{:sbm}) = SparseByteMap +summary_fiber_abbrev(lvl::SparseByteMapLevel) = "sbm($(summary_fiber_abbrev(lvl.lvl)))" similar_level(lvl::SparseByteMapLevel) = SparseByteMap(similar_level(lvl.lvl)) similar_level(lvl::SparseByteMapLevel, dims...) = SparseByteMap(similar_level(lvl.lvl, dims[1:end-1]...), dims[end]) @@ -121,7 +121,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualSparseByteMapLevel) end end -summary_f_code(lvl::VirtualSparseByteMapLevel) = "sbm($(summary_f_code(lvl.lvl)))" +summary_fiber_abbrev(lvl::VirtualSparseByteMapLevel) = "sbm($(summary_fiber_abbrev(lvl.lvl)))" function virtual_level_size(lvl::VirtualSparseByteMapLevel, ctx) ext = Extent(literal(lvl.Ti(1)), lvl.shape) diff --git a/src/levels/sparsecoolevels.jl b/src/levels/sparsecoolevels.jl index 289f18caa..50407cc1b 100644 --- a/src/levels/sparsecoolevels.jl +++ b/src/levels/sparsecoolevels.jl @@ -56,10 +56,10 @@ SparseCOOLevel{N, Ti, Tp, Tbl, Lvl}(lvl, shape) where {N, Ti, Tp, Tbl, Lvl} = SparseCOOLevel{N, Ti, Tp, Tbl, Lvl}(lvl, Ti(shape), ((Vector{ti}() for ti in Ti.parameters)...,), Tp[1]) """ -`f_code(sc)` = [SparseCOOLevel](@ref). +`fiber_abbrev(sc)` = [SparseCOOLevel](@ref). """ -f_code(::Val{:sc}) = SparseCOO -summary_f_code(lvl::SparseCOOLevel{N}) where {N} = "sc{$N}($(summary_f_code(lvl.lvl)))" +fiber_abbrev(::Val{:sc}) = SparseCOO +summary_fiber_abbrev(lvl::SparseCOOLevel{N}) where {N} = "sc{$N}($(summary_fiber_abbrev(lvl.lvl)))" similar_level(lvl::SparseCOOLevel{N}) where {N} = SparseCOOLevel{N}(similar_level(lvl.lvl)) similar_level(lvl::SparseCOOLevel{N}, tail...) 
where {N} = SparseCOOLevel{N}(similar_level(lvl.lvl, tail[1:end-N]...), (tail[end-N+1:end]...,)) @@ -164,7 +164,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualSparseCOOLevel) end end -summary_f_code(lvl::VirtualSparseCOOLevel) = "sc{$(lvl.N)}($(summary_f_code(lvl.lvl)))" +summary_fiber_abbrev(lvl::VirtualSparseCOOLevel) = "sc{$(lvl.N)}($(summary_fiber_abbrev(lvl.lvl)))" function virtual_level_size(lvl::VirtualSparseCOOLevel, ctx::LowerJulia) ext = map((ti, stop)->Extent(literal(ti(1)), stop), lvl.Ti.parameters, lvl.shape) diff --git a/src/levels/sparsehashlevels.jl b/src/levels/sparsehashlevels.jl index 548b2d9b2..a9b73d2e3 100644 --- a/src/levels/sparsehashlevels.jl +++ b/src/levels/sparsehashlevels.jl @@ -58,10 +58,10 @@ SparseHashLevel{N, Ti, Tp, Tbl, Lvl}(lvl, shape, tbl) where {N, Ti, Tp, Tbl, Lvl SparseHashLevel{N, Ti, Tp, Tbl, Lvl}(lvl, Ti(shape), tbl, Tp[1], Pair{Tuple{Tp, Ti}, Tp}[]) """ -`f_code(sh)` = [SparseHashLevel](@ref). +`fiber_abbrev(sh)` = [SparseHashLevel](@ref). """ -f_code(::Val{:sh}) = SparseHash -summary_f_code(lvl::SparseHashLevel{N}) where {N} = "sh{$N}($(summary_f_code(lvl.lvl)))" +fiber_abbrev(::Val{:sh}) = SparseHash +summary_fiber_abbrev(lvl::SparseHashLevel{N}) where {N} = "sh{$N}($(summary_fiber_abbrev(lvl.lvl)))" similar_level(lvl::SparseHashLevel{N}) where {N} = SparseHashLevel{N}(similar_level(lvl.lvl)) similar_level(lvl::SparseHashLevel{N}, tail...) 
where {N} = SparseHashLevel{N}(similar_level(lvl.lvl, tail[1:end-N]...), (tail[end-N+1:end]...,)) @@ -173,7 +173,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualSparseHashLevel) end end -summary_f_code(lvl::VirtualSparseHashLevel) = "sh{$(lvl.N)}($(summary_f_code(lvl.lvl)))" +summary_fiber_abbrev(lvl::VirtualSparseHashLevel) = "sh{$(lvl.N)}($(summary_fiber_abbrev(lvl.lvl)))" function virtual_level_size(lvl::VirtualSparseHashLevel, ctx::LowerJulia) ext = map((ti, stop)->Extent(literal(ti(1)), stop), lvl.Ti.parameters, lvl.shape) diff --git a/src/levels/sparselistlevels.jl b/src/levels/sparselistlevels.jl index a4209c9a3..80755ef47 100644 --- a/src/levels/sparselistlevels.jl +++ b/src/levels/sparselistlevels.jl @@ -50,10 +50,10 @@ SparseListLevel{Ti, Tp, Lvl}(lvl, shape) where {Ti, Tp, Lvl} = SparseListLevel{Ti, Tp, Lvl}(lvl, Ti(shape), Tp[1], Ti[]) """ -`f_code(l)` = [SparseListLevel](@ref). +`fiber_abbrev(l)` = [SparseListLevel](@ref). """ -f_code(::Val{:sl}) = SparseList -summary_f_code(lvl::SparseListLevel) = "sl($(summary_f_code(lvl.lvl)))" +fiber_abbrev(::Val{:sl}) = SparseList +summary_fiber_abbrev(lvl::SparseListLevel) = "sl($(summary_fiber_abbrev(lvl.lvl)))" similar_level(lvl::SparseListLevel) = SparseList(similar_level(lvl.lvl)) similar_level(lvl::SparseListLevel, dim, tail...) 
= SparseList(similar_level(lvl.lvl, tail...), dim) @@ -147,7 +147,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualSparseListLevel) end end -summary_f_code(lvl::VirtualSparseListLevel) = "sl($(summary_f_code(lvl.lvl)))" +summary_fiber_abbrev(lvl::VirtualSparseListLevel) = "sl($(summary_fiber_abbrev(lvl.lvl)))" function virtual_level_size(lvl::VirtualSparseListLevel, ctx) ext = Extent(literal(lvl.Ti(1)), lvl.shape) diff --git a/src/levels/sparsevbllevels.jl b/src/levels/sparsevbllevels.jl index 253b2c4c8..437ac7d42 100644 --- a/src/levels/sparsevbllevels.jl +++ b/src/levels/sparsevbllevels.jl @@ -17,10 +17,10 @@ SparseVBLLevel{Ti, Tp, Lvl}(lvl, shape) where {Ti, Tp, Lvl} = SparseVBLLevel{Ti, Tp, Lvl}(lvl, shape, Tp[1], Ti[], Ti[]) """ -`f_code(svb)` = [SparseVBLLevel](@ref). +`fiber_abbrev(svb)` = [SparseVBLLevel](@ref). """ -f_code(::Val{:svb}) = SparseVBL -summary_f_code(lvl::SparseVBLLevel) = "svb($(summary_f_code(lvl.lvl)))" +fiber_abbrev(::Val{:svb}) = SparseVBL +summary_fiber_abbrev(lvl::SparseVBLLevel) = "svb($(summary_fiber_abbrev(lvl.lvl)))" similar_level(lvl::SparseVBLLevel) = SparseVBL(similar_level(lvl.lvl)) similar_level(lvl::SparseVBLLevel, dim, tail...) 
= SparseVBL(similar_level(lvl.lvl, tail...), dim) @@ -130,7 +130,7 @@ function (ctx::Finch.LowerJulia)(lvl::VirtualSparseVBLLevel) end end -summary_f_code(lvl::VirtualSparseVBLLevel) = "svb($(summary_f_code(lvl.lvl)))" +summary_fiber_abbrev(lvl::VirtualSparseVBLLevel) = "svb($(summary_fiber_abbrev(lvl.lvl)))" function virtual_level_size(lvl::VirtualSparseVBLLevel, ctx) ext = Extent(literal(lvl.Ti(1)), lvl.shape) From 4ea187ce2a0fecb4c66c305540bea21ac177c1ba Mon Sep 17 00:00:00 2001 From: Willow Ahrens Date: Mon, 8 May 2023 15:54:04 -0400 Subject: [PATCH 2/4] cleanup unresolved paths --- docs/make.jl | 1 - docs/src/development.md | 4 ++-- docs/src/fibers.md | 1 + docs/src/listing.md | 4 ++-- src/FinchNotation/syntax.jl | 5 +++-- src/fibers.jl | 15 ++++++++------- src/levels/denselevels.jl | 4 ++-- src/levels/elementlevels.jl | 4 ++-- src/levels/patternlevels.jl | 4 ++-- src/levels/repeatrlelevels.jl | 4 ++-- src/levels/sparsebytemaplevels.jl | 2 +- src/levels/sparsecoolevels.jl | 4 ++-- src/levels/sparsehashlevels.jl | 4 ++-- src/levels/sparselistlevels.jl | 4 ++-- src/levels/sparsevbllevels.jl | 2 +- 15 files changed, 32 insertions(+), 30 deletions(-) diff --git a/docs/make.jl b/docs/make.jl index 1e181677e..a6a562e1b 100644 --- a/docs/make.jl +++ b/docs/make.jl @@ -17,7 +17,6 @@ makedocs(; pages=[ "Home" => "index.md", "Array Formats" => "fibers.md", - "The Deets" => "listing.md", "Custom Functions" => "algebra.md", "Tensor File I/O" => "fileio.md", "C, C++, ..." => "interop.md", diff --git a/docs/src/development.md b/docs/src/development.md index 86479e998..dda6e9d9d 100644 --- a/docs/src/development.md +++ b/docs/src/development.md @@ -138,7 +138,7 @@ TODO more on the way... Every virtual tensor must be in one of two modes: read-only mode or update-only mode. The following functions may be called on virtual tensors throughout their life cycle. ```@docs -initialize! +declare! get_reader get_updater freeze! 
@@ -151,7 +151,7 @@ Fiber levels are implemented using the following methods: ```@docs default -initialize_level! +declare_level! assemble_level! reassemble_level! freeze_level! diff --git a/docs/src/fibers.md b/docs/src/fibers.md index af0880d0c..235281d7a 100644 --- a/docs/src/fibers.md +++ b/docs/src/fibers.md @@ -171,6 +171,7 @@ of supported formats is described below: @fiber fiber fiber! +fiber_abbrev ``` ### Level Constructors diff --git a/docs/src/listing.md b/docs/src/listing.md index dbe4c6923..a429ba8df 100644 --- a/docs/src/listing.md +++ b/docs/src/listing.md @@ -2,8 +2,8 @@ CurrentModule = Finch ``` -# Public Functions +# All Documentation ```@autodocs Modules = [Finch] -``` +``` \ No newline at end of file diff --git a/src/FinchNotation/syntax.jl b/src/FinchNotation/syntax.jl index d0363baa6..58dadbfd3 100644 --- a/src/FinchNotation/syntax.jl +++ b/src/FinchNotation/syntax.jl @@ -68,8 +68,9 @@ end """ initwrite(z)(a, b) -`initwrite(z)` is a function which may assert that `a` [isequal](@ref) to `z`, and -`returns `b`. By default, `lhs[] = rhs` is equivalent to `lhs[] +`initwrite(z)` is a function which may assert that `a` is
+[`isequal`](https://docs.julialang.org/en/v1/base/base/#Base.isequal) to `z`,
+and returns `b`. By default, `lhs[] = rhs` is equivalent to `lhs[]
<>= rhs`. """ initwrite(z) = InitWriter{z}() diff --git a/src/fibers.jl b/src/fibers.jl index f5a9b00d5..3a7722aa2 100644 --- a/src/fibers.jl +++ b/src/fibers.jl @@ -52,7 +52,7 @@ FinchNotation.finch_leaf(x::VirtualSubFiber) = virtual(x) """ level_ndims(::Type{Lvl}) -The result of `level_ndims(Lvl)` defines [ndims](@ref) for all subfibers +The result of `level_ndims(Lvl)` defines [ndims](https://docs.julialang.org/en/v1/base/arrays/#Base.ndims) for all subfibers in a level of type `Lvl`. 
""" function level_ndims end @@ -62,7 +62,7 @@ function level_ndims end """ level_size(lvl) -The result of `level_size(lvl)` defines the [size](@ref) of all subfibers in the +The result of `level_size(lvl)` defines the [size](https://docs.julialang.org/en/v1/base/arrays/#Base.size) of all subfibers in the level `lvl`. """ function level_size end @@ -71,7 +71,7 @@ function level_size end """ level_axes(lvl) -The result of `level_axes(lvl)` defines the [axes](@ref) of all subfibers in the +The result of `level_axes(lvl)` defines the [axes](https://docs.julialang.org/en/v1/base/arrays/#Base.axes-Tuple{Any}) of all subfibers in the level `lvl`. """ function level_axes end @@ -80,8 +80,9 @@ function level_axes end """ level_eltype(::Type{Lvl}) -The result of `level_eltype(Lvl)` defines [eltype](@ref) for all subfibers in a -level of type `Lvl`. +The result of `level_eltype(Lvl)` defines +[`eltype`](https://docs.julialang.org/en/v1/base/collections/#Base.eltype) for +all subfibers in a level of type `Lvl`. """ function level_eltype end @inline Base.eltype(::AbstractFiber{Lvl}) where {Lvl} = level_eltype(Lvl) @@ -90,7 +91,7 @@ function level_eltype end """ level_default(::Type{Lvl}) -The result of `level_default(Lvl)` defines [default](@ref) for all subfibers in a +The result of `level_default(Lvl)` defines [`default`](@ref) for all subfibers in a level of type `Lvl`. """ function level_default end @@ -340,7 +341,7 @@ Construct a fiber using abbreviated level constructor names. To override abbreviations, expressions may be interpolated with `\$`. For example, `Fiber(DenseLevel(SparseListLevel(Element(0.0))))` can also be constructed as `@fiber(sl(d(e(0.0))))`. Consult the documentation for the helper function -[fiber_abbrev](@ref) for a full listing of level format codes. +[`fiber_abbrev`](@ref) for a full listing of level format abbreviations. Optionally, an argument may be specified to copy into the fiber. This expression allocates. 
Use `fiber(arg)` for a zero-cost copy, if available. diff --git a/src/levels/denselevels.jl b/src/levels/denselevels.jl index dc5624c0b..c3af4f691 100644 --- a/src/levels/denselevels.jl +++ b/src/levels/denselevels.jl @@ -5,7 +5,7 @@ A subfiber of a dense level is an array which stores every slice `A[:, ..., :, i]` as a distinct subfiber in `lvl`. Optionally, `dim` is the size of the last dimension. `Ti` is the type of the indices used to index the level. -In the [@fiber](@ref) constructor, `d` is an alias for `DenseLevel`. +In the [`@fiber`](@ref) constructor, `d` is an alias for `DenseLevel`. ```jldoctest julia> ndims(@fiber(d(e(0.0)))) @@ -37,7 +37,7 @@ DenseLevel{Ti, Lvl}(lvl) where {Ti, Lvl} = DenseLevel{Ti, Lvl}(lvl, zero(Ti)) const Dense = DenseLevel """ -`fiber_abbrev(d)` = [DenseLevel](@ref). +`fiber_abbrev(d)` = [`DenseLevel`](@ref). """ fiber_abbrev(::Val{:d}) = Dense summary_fiber_abbrev(lvl::Dense) = "d($(summary_fiber_abbrev(lvl.lvl)))" diff --git a/src/levels/elementlevels.jl b/src/levels/elementlevels.jl index 7c65b8d21..f6d6f957c 100644 --- a/src/levels/elementlevels.jl +++ b/src/levels/elementlevels.jl @@ -4,7 +4,7 @@ A subfiber of an element level is a scalar of type `Tv`, initialized to `D`. `D` may optionally be given as the first argument. -In the [@fiber](@ref) constructor, `e` is an alias for `ElementLevel`. +In the [`@fiber`](@ref) constructor, `e` is an alias for `ElementLevel`. ```jldoctest julia> @fiber(d(e(0.0)), [1, 2, 3]) @@ -29,7 +29,7 @@ ElementLevel{D}(val::Vector{Tv}) where {D, Tv} = ElementLevel{D, Tv}(val) ElementLevel{D, Tv}() where {D, Tv} = ElementLevel{D, Tv}(Tv[]) """ -`fiber_abbrev(e)` = [ElementLevel](@ref). +`fiber_abbrev(e)` = [`ElementLevel`](@ref). 
"""
fiber_abbrev(::Val{:e}) = Element summary_fiber_abbrev(::Element{D}) where {D} = "e($(D))" diff --git a/src/levels/patternlevels.jl b/src/levels/patternlevels.jl index 2af30c9a2..9b5672757 100644 --- a/src/levels/patternlevels.jl +++ b/src/levels/patternlevels.jl @@ -5,7 +5,7 @@ A subfiber of a pattern level is the Boolean value true, but it's `default` is false. PatternLevels are used to create tensors that represent which values are stored by other fibers. See [`pattern`](@ref) for usage examples. -In the [@fiber](@ref) constructor, `p` is an alias for `ElementLevel`. +In the [`@fiber`](@ref) constructor, `p` is an alias for `PatternLevel`. ```jldoctest julia> @fiber(d(p(), 3)) @@ -19,7 +19,7 @@ struct PatternLevel end const Pattern = PatternLevel """ -`fiber_abbrev(p)` = [PatternLevel](@ref). +`fiber_abbrev(p)` = [`PatternLevel`](@ref). """ fiber_abbrev(::Val{:p}) = Pattern summary_fiber_abbrev(::Pattern) = "p()" diff --git a/src/levels/repeatrlelevels.jl b/src/levels/repeatrlelevels.jl index 86404f09c..a2e2b4022 100644 --- a/src/levels/repeatrlelevels.jl +++ b/src/levels/repeatrlelevels.jl @@ -9,7 +9,7 @@ The fibers have type `Tv`, initialized to `D`. `D` may optionally be given as the first argument. `Ti` is the type of the last fiber index, and `Tp` is the type used for positions in the level. -In the [@fiber](@ref) constructor, `rl` is an alias for `RepeatRLELevel`. +In the [`@fiber`](@ref) constructor, `rl` is an alias for `RepeatRLELevel`. ```jldoctest julia> @fiber(rl(0.0), [11, 11, 22, 22, 00, 00, 00, 33, 33]) @@ -43,7 +43,7 @@ RepeatRLELevel{D, Ti, Tp, Tv}() where {D, Ti, Tp, Tv} = RepeatRLELevel{D, Ti, Tp RepeatRLELevel{D, Ti, Tp, Tv}(shape) where {D, Ti, Tp, Tv} = RepeatRLELevel{D, Ti, Tp, Tv}(Ti(shape), Tp[1], Ti[], Tv[]) """ -`fiber_abbrev(rl)` = [RepeatRLELevel](@ref). +`fiber_abbrev(rl)` = [`RepeatRLELevel`](@ref). 
""" fiber_abbrev(::Val{:rl}) = RepeatRLE summary_fiber_abbrev(::RepeatRLE{D}) where {D} = "rl($(D))" diff --git a/src/levels/sparsebytemaplevels.jl b/src/levels/sparsebytemaplevels.jl index 21da64bbc..b168f297f 100644 --- a/src/levels/sparsebytemaplevels.jl +++ b/src/levels/sparsebytemaplevels.jl @@ -16,7 +16,7 @@ SparseByteMapLevel{Ti, Tp, Lvl}(lvl, shape) where {Ti, Tp, Lvl} = SparseByteMapLevel{Ti, Tp, Lvl}(lvl, Ti(shape), Tp[1], Bool[], Tuple{Tp, Ti}[]) """ -`fiber_abbrev(sbm)` = [SparseByteMapLevel](@ref). +`fiber_abbrev(sbm)` = [`SparseByteMapLevel`](@ref). """ fiber_abbrev(::Val{:sbm}) = SparseByteMap summary_fiber_abbrev(lvl::SparseByteMapLevel) = "sbm($(summary_fiber_abbrev(lvl.lvl)))" diff --git a/src/levels/sparsecoolevels.jl b/src/levels/sparsecoolevels.jl index 50407cc1b..312393fd6 100644 --- a/src/levels/sparsecoolevels.jl +++ b/src/levels/sparsecoolevels.jl @@ -12,7 +12,7 @@ order. Optionally, `dims` are the sizes of the last dimensions. `Ti` is the type of the last `N` fiber indices, and `Tp` is the type used for positions in the level. -In the [@fiber](@ref) constructor, `sh` is an alias for `SparseCOOLevel`. +In the [`@fiber`](@ref) constructor, `sh` is an alias for `SparseCOOLevel`. ```jldoctest julia> @fiber(d(sc{1}(e(0.0))), [10 0 20; 30 0 0; 0 0 40]) @@ -56,7 +56,7 @@ SparseCOOLevel{N, Ti, Tp, Tbl, Lvl}(lvl, shape) where {N, Ti, Tp, Tbl, Lvl} = SparseCOOLevel{N, Ti, Tp, Tbl, Lvl}(lvl, Ti(shape), ((Vector{ti}() for ti in Ti.parameters)...,), Tp[1]) """ -`fiber_abbrev(sc)` = [SparseCOOLevel](@ref). +`fiber_abbrev(sc)` = [`SparseCOOLevel`](@ref). 
""" fiber_abbrev(::Val{:sc}) = SparseCOO summary_fiber_abbrev(lvl::SparseCOOLevel{N}) where {N} = "sc{$N}($(summary_fiber_abbrev(lvl.lvl)))" diff --git a/src/levels/sparsehashlevels.jl b/src/levels/sparsehashlevels.jl index a9b73d2e3..fd687bdfe 100644 --- a/src/levels/sparsehashlevels.jl +++ b/src/levels/sparsehashlevels.jl @@ -11,7 +11,7 @@ in the subfiber, so fibers in the sublevel are the slices `A[:, ..., :, i_1, `Ti` is the type of the last `N` fiber indices, and `Tp` is the type used for positions in the level. -In the [@fiber](@ref) constructor, `sh` is an alias for `SparseHashLevel`. +In the [`@fiber`](@ref) constructor, `sh` is an alias for `SparseHashLevel`. ```jldoctest julia> @fiber(d(sh{1}(e(0.0))), [10 0 20; 30 0 0; 0 0 40]) @@ -58,7 +58,7 @@ SparseHashLevel{N, Ti, Tp, Tbl, Lvl}(lvl, shape, tbl) where {N, Ti, Tp, Tbl, Lvl SparseHashLevel{N, Ti, Tp, Tbl, Lvl}(lvl, Ti(shape), tbl, Tp[1], Pair{Tuple{Tp, Ti}, Tp}[]) """ -`fiber_abbrev(sh)` = [SparseHashLevel](@ref). +`fiber_abbrev(sh)` = [`SparseHashLevel`](@ref). """ fiber_abbrev(::Val{:sh}) = SparseHash summary_fiber_abbrev(lvl::SparseHashLevel{N}) where {N} = "sh{$N}($(summary_fiber_abbrev(lvl.lvl)))" diff --git a/src/levels/sparselistlevels.jl b/src/levels/sparselistlevels.jl index 80755ef47..098a23d5d 100644 --- a/src/levels/sparselistlevels.jl +++ b/src/levels/sparselistlevels.jl @@ -9,7 +9,7 @@ slices are stored. Optionally, `dim` is the size of the last dimension. `Ti` is the type of the last fiber index, and `Tp` is the type used for positions in the level. -In the [@fiber](@ref) constructor, `sl` is an alias for `SparseListLevel`. +In the [`@fiber`](@ref) constructor, `sl` is an alias for `SparseListLevel`. ```jldoctest julia> @fiber(d(sl(e(0.0))), [10 0 20; 30 0 0; 0 0 40]) @@ -50,7 +50,7 @@ SparseListLevel{Ti, Tp, Lvl}(lvl, shape) where {Ti, Tp, Lvl} = SparseListLevel{Ti, Tp, Lvl}(lvl, Ti(shape), Tp[1], Ti[]) """ -`fiber_abbrev(l)` = [SparseListLevel](@ref). 
+`fiber_abbrev(sl)` = [`SparseListLevel`](@ref).
"""
fiber_abbrev(::Val{:sl}) = SparseList
summary_fiber_abbrev(lvl::SparseListLevel) = "sl($(summary_fiber_abbrev(lvl.lvl)))"
diff --git a/src/levels/sparsevbllevels.jl b/src/levels/sparsevbllevels.jl
index 437ac7d42..1374f0b07 100644
--- a/src/levels/sparsevbllevels.jl
+++ b/src/levels/sparsevbllevels.jl
@@ -17,7 +17,7 @@ SparseVBLLevel{Ti, Tp, Lvl}(lvl, shape) where {Ti, Tp, Lvl} =
SparseVBLLevel{Ti, Tp, Lvl}(lvl, shape, Tp[1], Ti[], Ti[])
"""
-`fiber_abbrev(svb)` = [SparseVBLLevel](@ref).
+`fiber_abbrev(svb)` = [`SparseVBLLevel`](@ref).
"""
fiber_abbrev(::Val{:svb}) = SparseVBL
summary_fiber_abbrev(lvl::SparseVBLLevel) = "svb($(summary_fiber_abbrev(lvl.lvl)))"

From fd01897ec022fb55eb970ff0da749c972a21c9f6 Mon Sep 17 00:00:00 2001
From: Willow Ahrens
Date: Mon, 8 May 2023 15:54:07 -0400
Subject: [PATCH 3/4] rm listing

---
 docs/src/listing.md | 9 ---------
 1 file changed, 9 deletions(-)
 delete mode 100644 docs/src/listing.md

diff --git a/docs/src/listing.md b/docs/src/listing.md
deleted file mode 100644
index a429ba8df..000000000
--- a/docs/src/listing.md
+++ /dev/null
@@ -1,9 +0,0 @@
-```@meta
-CurrentModule = Finch
-```
-
-# All Documentation
-
-```@autodocs
-Modules = [Finch]
-```
\ No newline at end of file

From 9e96c16403a171126b8853cc519735783141d796 Mon Sep 17 00:00:00 2001
From: Willow Ahrens
Date: Mon, 8 May 2023 15:58:01 -0400
Subject: [PATCH 4/4] rudimentary landing page cleanup

---
 docs/src/index.md | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/docs/src/index.md b/docs/src/index.md
index 3c1183bc9..c4bd81337 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -5,11 +5,10 @@ CurrentModule = Finch
# Finch

[Finch](https://github.com/willow-ahrens/Finch.jl) is an adaptable compiler for
-loop nests over structured arrays. Finch can specialize to tensors with runs of
-repeated values, or to tensors which are sparse (mostly zero).
Finch supports
-general sparsity as well as many specialized sparsity patterns, like clustered
-nonzeros, diagonals, or triangles. In addition to zero, Finch supports
-optimizations over arbitrary fill values and operators.
+loop nests over sparse or otherwise structured arrays. Finch supports general
+sparsity as well as many specialized sparsity patterns, like clustered nonzeros,
+diagonals, or triangles. In addition to zero, Finch supports optimizations over
+arbitrary fill values and operators, as well as run-length compression.

At it's heart, Finch is powered by a domain specific language for coiteration,
breaking structured iterators into units we call Looplets. The Looplets are
@@ -24,6 +23,7 @@ julia> using Pkg; Pkg.add("Finch")

## Usage:

-We're working on adding more documentation, for now take a look at the
-[benchmarks](https://github.com/willow-ahrens/Finch.jl/blob/main/benchmark/benchmarks.jl)
-for a few example algorithms.
\ No newline at end of file
+We're working on adding more documentation; for now, take a look at the examples
+for [linear
+algebra](https://github.com/willow-ahrens/Finch.jl/blob/main/apps/linalg.jl) or
+[graphs](https://github.com/willow-ahrens/Finch.jl/blob/main/apps/graphs.jl).
\ No newline at end of file