added knowledgepack creation scripts

commit 6c15260 (1 parent: f70588a)

Showing 12 changed files with 179 additions and 12 deletions.
example_scripts/creating_knowledge_packs/genie_knowledge_pack.jl (18 additions, 0 deletions)
# The example below demonstrates the creation of the Genie knowledge pack.

using Pkg
Pkg.activate(temp = true)
Pkg.add(url = "https://github.com/JuliaGenAI/DocsScraper.jl")
using DocsScraper

# The crawler will run on these URLs to look for more URLs with the same hostname
crawlable_urls = ["https://learn.genieframework.com/"]

index_path = make_knowledge_packs(crawlable_urls;
    target_path = joinpath("knowledge_packs", "dim=3072;chunk_size=384;Float32"),
    index_name = "genie", custom_metadata = "Genie ecosystem")

# The index created here has 3072 embedding dimensions with Float32 embeddings and a max chunk size of 384.

# The example above creates an output directory named after index_name, which contains the sub-directories "Scraped" and "Index".
# "Scraped" contains .jls files of the chunks and sources of the scraped URLs. "Index" contains the created index along with a .txt
# file containing the artifact info. The output directory also contains the URL mapping CSV.
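The `target_path` above encodes the index configuration (embedding dimension, chunk size, element type) in the directory name. A minimal sketch of building that name programmatically; the helper `index_config_dir` is hypothetical and not part of DocsScraper:

```julia
# Hypothetical helper: build the knowledge-pack target directory name
# from the embedding configuration, matching the convention used above.
function index_config_dir(; dim::Int = 3072, chunk_size::Int = 384, eltype::String = "Float32")
    joinpath("knowledge_packs", "dim=$dim;chunk_size=$chunk_size;$eltype")
end

index_config_dir()  # e.g. "knowledge_packs/dim=3072;chunk_size=384;Float32" on POSIX systems
```

Keeping the configuration in the path makes it easy to host several differently-sized indexes side by side without them overwriting each other.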
example_scripts/creating_knowledge_packs/juliaLang_knowledge_pack.jl (24 additions, 0 deletions)
# The example below demonstrates the creation of the JuliaLang knowledge pack.

using Pkg
Pkg.activate(temp = true)
Pkg.add(url = "https://github.com/JuliaGenAI/DocsScraper.jl")
using DocsScraper

# The crawler will run on these URLs to look for more URLs with the same hostname
crawlable_urls = [
    "https://docs.julialang.org/en/v1/", "https://julialang.github.io/IJulia.jl/stable/",
    "https://julialang.github.io/PackageCompiler.jl/stable/", "https://pkgdocs.julialang.org/dev/",
    "https://julialang.github.io/JuliaSyntax.jl/dev/",
    "https://julialang.github.io/AllocCheck.jl/dev/", "https://julialang.github.io/PrecompileTools.jl/stable/",
    "https://julialang.github.io/StyledStrings.jl/dev/"]

index_path = make_knowledge_packs(crawlable_urls;
    target_path = joinpath("knowledge_packs", "dim=3072;chunk_size=384;Float32"),
    index_name = "julialang", custom_metadata = "JuliaLang ecosystem")

# The index created here has 3072 embedding dimensions with Float32 embeddings and a max chunk size of 384.

# The example above creates an output directory named after index_name, which contains the sub-directories "Scraped" and "Index".
# "Scraped" contains .jls files of the chunks and sources of the scraped URLs. "Index" contains the created index along with a .txt
# file containing the artifact info. The output directory also contains the URL mapping CSV.
example_scripts/creating_knowledge_packs/makie_knowledge_pack.jl (26 additions, 0 deletions)
# The example below demonstrates the creation of the Makie knowledge pack.

using Pkg
Pkg.activate(temp = true)
Pkg.add(url = "https://github.com/JuliaGenAI/DocsScraper.jl")
using DocsScraper

# The crawler will run on these URLs to look for more URLs with the same hostname
crawlable_urls = ["https://docs.juliahub.com/MakieGallery/Ql23q/0.2.17/",
    "https://beautiful.makie.org/dev/",
    "https://juliadatascience.io/DataVisualizationMakie",
    "https://docs.makie.org/v0.21/explanations/backends/glmakie", "https://juliadatascience.io/glmakie",
    "https://docs.makie.org/v0.21/explanations/backends/cairomakie", "https://juliadatascience.io/cairomakie",
    "http://juliaplots.org/WGLMakie.jl/stable/",
    "http://juliaplots.org/WGLMakie.jl/dev/", "https://docs.makie.org/v0.21/explanations/backends/wglmakie",
    "https://docs.juliahub.com/MakieGallery/Ql23q/0.2.17/abstractplotting_api.html", "http://juliaplots.org/StatsMakie.jl/latest/",
    "https://docs.juliahub.com/StatsMakie/RRy0o/0.2.3/manual/tutorial/", "https://geo.makie.org/v0.7.3/", "https://geo.makie.org/dev/",
    "https://libgeos.org/doxygen/geos__c_8h.html", "https://docs.makie.org/v0.21/"]

index_path = make_knowledge_packs(crawlable_urls;
    target_path = joinpath("knowledge_packs", "dim=3072;chunk_size=384;Float32"),
    index_name = "makie", custom_metadata = "Makie ecosystem")

# The index created here has 3072 embedding dimensions with Float32 embeddings and a max chunk size of 384.

# The example above creates an output directory named after index_name, which contains the sub-directories "Scraped" and "Index".
# "Scraped" contains .jls files of the chunks and sources of the scraped URLs. "Index" contains the created index along with a .txt
# file containing the artifact info. The output directory also contains the URL mapping CSV.
example_scripts/creating_knowledge_packs/plots_knowledge_pack.jl (29 additions, 0 deletions)
# The example below demonstrates the creation of the Plots knowledge pack.

using Pkg
Pkg.activate(temp = true)
Pkg.add(url = "https://github.com/JuliaGenAI/DocsScraper.jl")
using DocsScraper

# The crawler will run on these URLs to look for more URLs with the same hostname
crawlable_urls = [
    "https://docs.juliaplots.org/stable/", "https://docs.juliaplots.org/dev/",
    "https://docs.juliaplots.org/latest/",
    "https://docs.juliaplots.org/latest/generated/statsplots/", "https://docs.juliaplots.org/latest/ecosystem/",
    "http://juliaplots.org/PlotlyJS.jl/stable/",
    "http://juliaplots.org/PlotlyJS.jl/stable/manipulating_plots/", "https://docs.juliaplots.org/latest/gallery/gr/",
    "https://docs.juliaplots.org/latest/gallery/unicodeplots/",
    "https://docs.juliaplots.org/latest/gallery/pgfplotsx/",
    "https://juliaplots.org/RecipesBase.jl/stable/",
    "https://juliastats.org/StatsBase.jl/stable/", "https://juliastats.org/StatsBase.jl/stable/statmodels/",
    "http://juliagraphs.org/GraphPlot.jl/",
    "https://docs.juliahub.com/GraphPlot/bUwXr/0.6.0/"]

index_path = make_knowledge_packs(crawlable_urls;
    target_path = joinpath("knowledge_packs", "dim=3072;chunk_size=384;Float32"),
    index_name = "plots", custom_metadata = "Plots ecosystem")

# The index created here has 3072 embedding dimensions with Float32 embeddings and a max chunk size of 384.

# The example above creates an output directory named after index_name, which contains the sub-directories "Scraped" and "Index".
# "Scraped" contains .jls files of the chunks and sources of the scraped URLs. "Index" contains the created index along with a .txt
# file containing the artifact info. The output directory also contains the URL mapping CSV.
example_scripts/creating_knowledge_packs/sciml_knowledge_pack.jl (51 additions, 0 deletions)
# The example below demonstrates the creation of the SciML knowledge pack.

using Pkg
Pkg.activate(temp = true)
Pkg.add(url = "https://github.com/JuliaGenAI/DocsScraper.jl")
using DocsScraper

# The crawler will run on these URLs to look for more URLs with the same hostname
crawlable_urls = ["https://sciml.ai/", "https://docs.sciml.ai/DiffEqDocs/stable/",
    "https://docs.sciml.ai/DiffEqDocs/stable/types/sde_types/",
    "https://docs.sciml.ai/ModelingToolkit/dev/", "https://docs.sciml.ai/DiffEqFlux/stable/",
    "https://docs.sciml.ai/NeuralPDE/stable/", "https://docs.sciml.ai/NeuralPDE/stable/tutorials/pdesystem/",
    "https://docs.sciml.ai/Optimization/stable/",
    "https://docs.sciml.ai/SciMLSensitivity/stable/", "https://docs.sciml.ai/DataDrivenDiffEq/stable/", "https://turinglang.org/",
    "https://turinglang.org/docs/tutorials/docs-00-getting-started/",
    "https://juliamath.github.io/MeasureTheory.jl/stable/", "https://docs.sciml.ai/DiffEqGPU/stable/",
    "https://chevronetc.github.io/DistributedOperations.jl/dev/", "https://docs.sciml.ai/DiffEqBayes/stable/",
    "https://turinglang.org/docs/tutorials/10-bayesian-differential-equations/index.html", "https://docs.sciml.ai/OrdinaryDiffEq/stable/",
    "https://docs.sciml.ai/Overview/stable/", "https://docs.sciml.ai/DiffEqDocs/stable/solvers/sde_solve/",
    "https://docs.sciml.ai/SciMLSensitivity/stable/examples/dde/delay_diffeq/", "https://docs.sciml.ai/DiffEqDocs/stable/tutorials/dde_example/",
    "https://docs.sciml.ai/DiffEqDocs/stable/types/dae_types/", "https://docs.sciml.ai/DiffEqCallbacks/stable/",
    "https://docs.sciml.ai/SciMLBase/stable/",
    "https://docs.sciml.ai/DiffEqDocs/stable/features/callback_library/", "https://docs.sciml.ai/LinearSolve/stable/",
    "https://docs.sciml.ai/ModelingToolkit/stable/",
    "https://docs.sciml.ai/DataInterpolations/stable/", "https://docs.sciml.ai/DeepEquilibriumNetworks/stable/",
    "https://docs.sciml.ai/DiffEqParamEstim/stable/",
    "https://docs.sciml.ai/Integrals/stable/", "https://docs.sciml.ai/EasyModelAnalysis/stable/",
    "https://docs.sciml.ai/GlobalSensitivity/stable/",
    "https://docs.sciml.ai/ExponentialUtilities/stable/", "https://docs.sciml.ai/HighDimPDE/stable/",
    "https://docs.sciml.ai/SciMLTutorialsOutput/stable/",
    "https://docs.sciml.ai/Catalyst/stable/", "https://docs.sciml.ai/Surrogates/stable/",
    "https://docs.sciml.ai/SciMLBenchmarksOutput/stable/",
    "https://docs.sciml.ai/NeuralOperators/stable/", "https://docs.sciml.ai/NonlinearSolve/stable/",
    "https://docs.sciml.ai/RecursiveArrayTools/stable/",
    "https://docs.sciml.ai/ReservoirComputing/stable/", "https://docs.sciml.ai/MethodOfLines/stable/", "https://lux.csail.mit.edu/dev/"
]

# The crawler will not look for more URLs on these pages
single_page_urls = [
    "https://johnfoster.pge.utexas.edu/hpc-book/DifferentialEquations_jl.html",
    "https://julialang.org/blog/2019/01/fluxdiffeq/", "https://juliapackages.com/p/galacticoptim",
    "https://julianlsolvers.github.io/Optim.jl/stable/"]

index_path = make_knowledge_packs(crawlable_urls; single_urls = single_page_urls,
    target_path = joinpath("knowledge_packs", "dim=3072;chunk_size=384;Float32"),
    index_name = "sciml", custom_metadata = "SciML ecosystem")

# The index created here has 3072 embedding dimensions with Float32 embeddings and a max chunk size of 384.

# The example above creates an output directory named after index_name, which contains the sub-directories "Scraped" and "Index".
# "Scraped" contains .jls files of the chunks and sources of the scraped URLs. "Index" contains the created index along with a .txt
# file containing the artifact info. The output directory also contains the URL mapping CSV.
example_scripts/creating_knowledge_packs/tidier_knowledge_pack.jl (21 additions, 0 deletions)
# The example below demonstrates the creation of the Tidier knowledge pack.

using Pkg
Pkg.activate(temp = true)
Pkg.add(url = "https://github.com/JuliaGenAI/DocsScraper.jl")
using DocsScraper

# The crawler will run on these URLs to look for more URLs with the same hostname
crawlable_urls = ["https://tidierorg.github.io/Tidier.jl/dev/",
    "https://tidierorg.github.io/TidierPlots.jl/latest/",
    "https://tidierorg.github.io/TidierData.jl/latest/",
    "https://tidierorg.github.io/TidierDB.jl/latest/"]

index_path = make_knowledge_packs(crawlable_urls;
    target_path = joinpath("knowledge_packs", "dim=3072;chunk_size=384;Float32"),
    index_name = "tidier", custom_metadata = "Tidier ecosystem")

# The index created here has 3072 embedding dimensions with Float32 embeddings and a max chunk size of 384.

# The example above creates an output directory named after index_name, which contains the sub-directories "Scraped" and "Index".
# "Scraped" contains .jls files of the chunks and sources of the scraped URLs. "Index" contains the created index along with a .txt
# file containing the artifact info. The output directory also contains the URL mapping CSV.