diff --git a/README.md b/README.md
index ec240e4e..acf06192 100644
--- a/README.md
+++ b/README.md
@@ -4,7 +4,7 @@ Dive into building GenAI applications!
This repository contains examples, applications, starter code, & tutorials to help you kickstart your GenAI projects.
- These are built using LanceDB, a free, open-source, serverless vectorDB that **requires no setup**.
-- It **integrates into python data ecosystem** so you can simply start using these in your existing data pipelines in pandas, arrow, pydantic etc.
+- It **integrates into the Python data ecosystem**, so you can simply start using it in your existing data pipelines built on pandas, Arrow, Pydantic, etc.
- LanceDB has **native Typescript SDK** using which you can **run vector search** in serverless functions!
@@ -15,10 +15,10 @@ Join our community for support - Discord
---
-This repository is divided into 3 sections:
+This repository is divided into 2 sections:
- [Examples](#examples) - Get right into the code with minimal introduction, aimed at getting you from an idea to PoC within minutes!
- [Applications](#projects--applications) - Ready to use Python and web apps using applied LLMs, VectorDB and GenAI tools
-- [Tutorials](#tutorials) - A curated list of tutorials, blogs, Colabs and courses to get you started with GenAI in greater depth.
+
## Examples
Applied examples that get right into the code with minimal introduction, aimed at getting you from an idea to PoC within minutes!
@@ -27,46 +27,137 @@ Examples are available as:
* **Python scripts** - for cases where you'd like directly to use the file or snippets to integrate in your application
* **JS/TS scripts** - Some examples are written using lancedb's native js library! These script/snippets can also be directly integrated in your web applications.
-If you're looking for in-depth tutorial-like examples, checkout the [tutorials](#tutorials) section!
+The examples below are grouped into tables by category so that similar examples are easy to find.
-| Example | Notebook & Scripts | Read The Blog! |
-|-------- | ------------- | ------------- |
-| | | |
-| [Youtube transcript search bot](/examples/Youtube-Search-QA-Bot/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/Youtube-Search-QA-Bot/main.py) [![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/Youtube-Search-QA-Bot/index.js) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)||
-| [Langchain: Code Docs QA bot](/examples/Code-Documentation-QA-Bot/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/Code-Documentation-QA-Bot/main.py) [![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/Code-Documentation-QA-Bot/index.js) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)||
-| [Databricks DBRX Website Bot](./examples/databricks_DBRX_website_bot/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/databricks_DBRX_website_bot/main.py) [![Databricks LLM](https://img.shields.io/badge/databricks-api-red)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|
-| [CLI-based SDK Manual Chatbot with Phidata](/examples/CLI-SDK-Manual-Chatbot-Locally/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/CLI-SDK-Manual-Chatbot-Locally/assistant.py) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|
-| [TransformersJS Embedding example](./examples/js-transformers/) |[![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/js-transformers/index.js) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| |
-| [Inbuilt Hybrid Search](/examples/Inbuilt-Hybrid-Search) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)||
-| [Audio Search](./examples/audio_search/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/audio_search/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
-| [Multi-lingual search](/examples/multi-lingual-wiki-qa) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/multi-lingual-wiki-qa/main.py) [![LLM](https://img.shields.io/badge/cohere-api-pink)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
-| [Hybrid search BM25 & lancedb ](./examples/Hybrid_search_bm25_lancedb/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/hybrid-search-combining-bm25-and-semantic-search-for-better-results-with-lan-1358038fe7e6)|
-| [Search Within Images](/examples/search-within-images-with-sam-and-clip/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/search-within-images-with-sam-and-clip/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/search-within-an-image-331b54e4285e)|
-| [Accelerate Vector Search Applications Using OpenVINO](/examples/Accelerate-Vector-Search-Applications-Using-OpenVINO/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/Accelerate-Vector-Search-Applications-Using-OpenVINO/clip_text_image_search.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/accelerate-vector-search-applications-using-openvino-lancedb/)|
+### Build from Scratch
+
+Build applications and examples from the ground up, using LanceDB for efficient vector-based document retrieval.
+
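+The tables in each section link to full examples; the snippet below is only a minimal, hand-rolled sketch of the core loop they build on: embed documents, store them in a LanceDB table, and query by vector. The model name and sample data are illustrative placeholders.
+
+```python
+import lancedb
+from sentence_transformers import SentenceTransformer  # any embedding model will do
+
+model = SentenceTransformer("all-MiniLM-L6-v2")
+docs = [
+    "LanceDB is a serverless vector database that requires no setup.",
+    "RAG combines retrieval from a vector store with LLM generation.",
+]
+
+# Embed the documents and store them in a local LanceDB table
+db = lancedb.connect("./lancedb")
+table = db.create_table(
+    "docs",
+    data=[{"text": d, "vector": model.encode(d).tolist()} for d in docs],
+    mode="overwrite",
+)
+
+# Retrieve the most relevant document for a query
+query_vec = model.encode("what is lancedb?").tolist()
+print(table.search(query_vec).limit(1).to_pandas()["text"][0])
+```
+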
+| Build from Scratch | Interactive Notebook & Scripts |
+| -------- | ------------- |
+|||
+| [Build RAG from Scratch](./tutorials/RAG-from-Scratch) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/RAG-from-Scratch/RAG_from_Scratch.ipynb) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+| [Local RAG from Scratch with Llama3](./tutorials/Local-RAG-from-Scratch) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./tutorials/Local-RAG-from-Scratch/rag.py) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+||||
+
+### MultiModal
+
+Create a multimodal search application using LanceDB for efficient vector-based retrieval of text and image data. Input text or image queries to find the most relevant documents and images from your corpus.
+
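+These examples share one pattern: embed images and text into the same CLIP space and search across both. Below is a minimal, hedged sketch of that pattern; the file paths and model name are placeholders, not taken from any specific example.
+
+```python
+import lancedb
+from PIL import Image
+from sentence_transformers import SentenceTransformer
+
+clip = SentenceTransformer("clip-ViT-B-32")  # embeds both images and text
+image_paths = ["cat.jpg", "beach.jpg"]       # placeholder files
+
+db = lancedb.connect("./lancedb")
+table = db.create_table(
+    "images",
+    data=[
+        {"path": p, "vector": clip.encode(Image.open(p)).tolist()}
+        for p in image_paths
+    ],
+    mode="overwrite",
+)
+
+# Text and images share the same embedding space, so a text query retrieves images
+query_vec = clip.encode("a photo of a cat").tolist()
+print(table.search(query_vec).limit(1).to_pandas()["path"][0])
+```
+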
+| Multimodal | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
| [Multimodal CLIP: DiffusionDB](/examples/multimodal_clip_diffusiondb/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/multimodal_clip_diffusiondb/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/multi-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939/)|
| [Multimodal CLIP: Youtube videos](/examples/multimodal_video_search/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/multimodal_video_search/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/multi-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939/)|
| [Multimodal Image + Text Search](/examples/multimodal_search/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/multimodal_search/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/multi-modal-ai-made-easy-with-lancedb-clip-5aaf8801c939/)|
-| [Movie Recommender](/examples/movie-recommender/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/movie-recommender/main.py) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
-| [Product Recommender](./examples/product-recommender/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/product-recommender/main.py)[![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| |
-| [Arxiv paper recommender](/examples/arxiv-recommender) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/arxiv-recommender/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+||||
+
+### RAG
+
+Develop a Retrieval-Augmented Generation (RAG) application using LanceDB for efficient vector-based information retrieval. Input text queries to retrieve relevant documents and generate comprehensive answers by combining retrieved information.
+
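+All of these examples are variations on the same retrieve-then-generate loop. Here is a minimal, hedged sketch of that loop, reusing the table from the Build from Scratch sketch above; the model names are placeholders and the prompt is deliberately simple.
+
+```python
+import lancedb
+from openai import OpenAI
+from sentence_transformers import SentenceTransformer
+
+embedder = SentenceTransformer("all-MiniLM-L6-v2")
+db = lancedb.connect("./lancedb")
+table = db.open_table("docs")  # the table built in the Build from Scratch sketch
+client = OpenAI()              # reads OPENAI_API_KEY from the environment
+
+def answer(question: str, k: int = 3) -> str:
+    # 1. Retrieve the k chunks most similar to the question
+    hits = table.search(embedder.encode(question).tolist()).limit(k).to_pandas()
+    context = "\n".join(hits["text"])
+    # 2. Generate an answer grounded in the retrieved context
+    response = client.chat.completions.create(
+        model="gpt-4o-mini",  # placeholder; any chat model works
+        messages=[
+            {"role": "system", "content": "Answer using only the provided context."},
+            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
+        ],
+    )
+    return response.choices[0].message.content
+
+print(answer("What is LanceDB?"))
+```
+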
+| RAG | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
| [Improve RAG with Re-ranking](/examples/RAG_Reranking/) | [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/simplest-method-to-improve-rag-pipeline-re-ranking-cf6eaec6d544)|
-| [Improve RAG with FLARE](/examples/Advanced-RAG-with-FLARE) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/Advanced-RAG-with-FLARE/app.py) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/better-rag-with-active-retrieval-augmented-generation-flare-3b66646e2a9f)|
+| [Instruct-Multitask](./examples/instruct-multitask) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/instruct-multitask/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/multitask-embedding-with-lancedb-be18ec397543)|
| [Improve RAG with HyDE](/examples/Advance-RAG-with-HyDE/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/advanced-rag-precise-zero-shot-dense-retrieval-with-hyde-0946c54dfdcb)|
| [Improve RAG with LOTR ](/examples/Advance_RAG_LOTR/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/better-rag-with-lotr-lord-of-retriever-23c8336b9a35)|
| [Advanced RAG: Parent Document Retriever](/examples/parent_document_retriever/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/modified-rag-parent-document-bigger-chunk-retriever-62b3d1e79bc6)|
+| [Corrective RAG with Langgraph](./tutorials/Corrective-RAG-with_Langgraph/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Corrective-RAG-with_Langgraph/CRAG_with_Langgraph.ipynb) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/implementing-corrective-rag-in-the-easiest-way-2/)|
+| [Contextual-Compression-with-RAG](/examples/Contextual-Compression-with-RAG/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/Contextual-Compression-with-RAG/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/enhance-rag-integrate-contextual-compression-and-filtering-for-precision-a29d4a810301/) |
+| [Improve RAG with FLARE](./examples/better-rag-FLAIR) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/better-rag-FLAIR/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/better-rag-with-active-retrieval-augmented-generation-flare-3b66646e2a9f/) |
| [Query Expansion and Reranker ](/examples/QueryExpansion&Reranker/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/improving-rag-with-query-expansion-reranking-models/)|
| [RAG Fusion](/examples/RAG_Fusion/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)|
-| [Contextual-Compression-with-RAG](/examples/Contextual-Compression-with-RAG/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/Contextual-Compression-with-RAG/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/enhance-rag-integrate-contextual-compression-and-filtering-for-precision-a29d4a810301/) |
-| [Instruct-Multitask](./examples/instruct-multitask) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/instruct-multitask/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/multitask-embedding-with-lancedb-be18ec397543)|
+| [Agentic RAG ](/tutorials/Agentic_RAG/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)|
+||||
+
+### Vector Search
+
+Build a vector search application using LanceDB for efficient vector-based document retrieval. Input text queries to find the most relevant documents from your corpus.
+
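+Beyond plain nearest-neighbour lookup, most of these examples combine vector search with extra signals (keywords, metadata, other modalities). The hedged sketch below shows the simplest such combination: a vector query restricted by a SQL-style metadata filter. The data and field names are illustrative only.
+
+```python
+import lancedb
+from sentence_transformers import SentenceTransformer
+
+model = SentenceTransformer("all-MiniLM-L6-v2")
+corpus = [
+    {"text": "LanceDB runs embedded, with no server to manage.", "lang": "en"},
+    {"text": "La recherche vectorielle trouve des documents similaires.", "lang": "fr"},
+]
+
+db = lancedb.connect("./lancedb")
+table = db.create_table(
+    "corpus",
+    data=[{**doc, "vector": model.encode(doc["text"]).tolist()} for doc in corpus],
+    mode="overwrite",
+)
+
+# Vector search combined with a metadata filter (SQL-style predicate)
+query_vec = model.encode("serverless vector database").tolist()
+hits = table.search(query_vec).where("lang = 'en'").limit(3).to_pandas()
+print(hits["text"].tolist())
+```
+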
+| Vector Search | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
+| [Inbuilt Hybrid Search](/examples/Inbuilt-Hybrid-Search) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)||
+| [Hybrid search BM25 & lancedb ](./examples/Hybrid_search_bm25_lancedb/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#) |[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/hybrid-search-combining-bm25-and-semantic-search-for-better-results-with-lan-1358038fe7e6)|
+| [NER powered Semantic Search](./tutorials/NER-powered-Semantic-Search) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/NER-powered-Semantic-Search/NER_powered_Semantic_Search_with_LanceDB.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/ner-powered-semantic-search-using-lancedb-51051dc3e493) |
+| [Audio Search](./examples/audio_search/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/audio_search/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+| [Multi-lingual search](/examples/multi-lingual-wiki-qa) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/multi-lingual-wiki-qa/main.py) [![LLM](https://img.shields.io/badge/cohere-api-pink)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+| [Facial Recognition](./examples/facial_recognition) | [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|
+| [Sentiment Analysis: Analysing Hotel Reviews](/examples/Sentiment-Analysis-Analyse-Hotel-Reviews/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/Sentiment-Analysis-Analyse-Hotel-Reviews/Sentiment_Analysis_using_LanceDB.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/sentiment-analysis-using-lancedb-2da3cb1e3fa6)|
+| [Imagebind demo app](./examples/imagebind_demo/) | [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|
+| [Search Within Images](/examples/search-within-images-with-sam-and-clip/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/search-within-images-with-sam-and-clip/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/search-within-an-image-331b54e4285e)|
+| [Vector Search with TransformersJS](./examples/js-transformers/) |[![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/js-transformers/index.js) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| |
+| [Accelerate Vector Search Applications Using OpenVINO](/examples/Accelerate-Vector-Search-Applications-Using-OpenVINO/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/Accelerate-Vector-Search-Applications-Using-OpenVINO/clip_text_image_search.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/accelerate-vector-search-applications-using-openvino-lancedb/)|
+||||
+
+### Chatbot
+
+Create a chatbot application that uses LanceDB for efficient vector-based context retrieval. User queries pull in relevant context, which is used to generate coherent, context-aware replies.
+
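+A chatbot is essentially the RAG loop above wrapped in a conversation: keep a running message history and prepend freshly retrieved context on each turn. A minimal, hedged sketch follows, reusing the `docs` table from the earlier sketches; model names are placeholders.
+
+```python
+import lancedb
+from openai import OpenAI
+from sentence_transformers import SentenceTransformer
+
+embedder = SentenceTransformer("all-MiniLM-L6-v2")
+table = lancedb.connect("./lancedb").open_table("docs")  # built in the earlier sketches
+client = OpenAI()  # reads OPENAI_API_KEY from the environment
+history = [{"role": "system", "content": "Use the supplied context when it is relevant."}]
+
+def chat(user_msg: str) -> str:
+    # Retrieve fresh context for this turn, then keep the running conversation history
+    hits = table.search(embedder.encode(user_msg).tolist()).limit(3).to_pandas()
+    context = "\n".join(hits["text"])
+    history.append({"role": "user", "content": f"Context:\n{context}\n\n{user_msg}"})
+    reply = client.chat.completions.create(model="gpt-4o-mini", messages=history)
+    answer = reply.choices[0].message.content
+    history.append({"role": "assistant", "content": answer})
+    return answer
+
+print(chat("What does LanceDB require to set up?"))
+```
+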
+| Chatbot | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
+| [Databricks DBRX Website Bot](./examples/databricks_DBRX_website_bot/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/databricks_DBRX_website_bot/main.py) [![Databricks LLM](https://img.shields.io/badge/databricks-api-red)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|
+| [CLI-based SDK Manual Chatbot with Phidata](/examples/CLI-SDK-Manual-Chatbot-Locally/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/CLI-SDK-Manual-Chatbot-Locally/assistant.py) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|
+| [Youtube transcript search bot](/examples/Youtube-Search-QA-Bot/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/Youtube-Search-QA-Bot/main.py) [![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/Youtube-Search-QA-Bot/index.js) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)||
+| [Langchain: Code Docs QA bot](/examples/Code-Documentation-QA-Bot/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/Code-Documentation-QA-Bot/main.py) [![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/Code-Documentation-QA-Bot/index.js) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)||
+| [Context-Aware Chatbot using Llama 2 & LanceDB](./tutorials/chatbot_using_Llama2_&_lanceDB) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/chatbot_using_Llama2_&_lanceDB/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/context-aware-chatbot-using-llama-2-lancedb-as-vector-database-4d771d95c755) |
+||||
+
+
+### Evaluation
+
+Develop an evaluation application. Provide reference and candidate texts to score the generated responses on various metrics.
+
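+The listed examples use dedicated frameworks (Prompttools, RAGAs). As a framework-free illustration of the idea only, the hedged sketch below scores a candidate answer against a reference answer with embedding cosine similarity; this is not how those frameworks compute their metrics.
+
+```python
+from sentence_transformers import SentenceTransformer, util
+
+model = SentenceTransformer("all-MiniLM-L6-v2")
+
+reference = "LanceDB is a serverless vector database that requires no setup."
+candidate = "LanceDB is an embedded vector database you can use without any setup."
+
+# Cosine similarity between embeddings as a crude answer-quality signal
+score = util.cos_sim(model.encode(reference), model.encode(candidate)).item()
+print(f"semantic similarity: {score:.3f}")
+```
+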
+| Evaluation | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
| [Evaluating Prompts with Prompttools](/examples/prompttools-eval-prompts/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| |
+| [Evaluating RAG with RAGAs](./examples/Evaluating_RAG_with_RAGAs/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| |
+||||
+
+### AI Agents
+
+Design an AI agent coordination application with LanceDB as the vector store for communication and collaboration. Queries let agents exchange information, coordinate tasks, and work toward shared goals.
+
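+The frameworks differ (CrewAI, Autogen, etc.), but a recurring pattern is a shared vector store that one agent writes notes into and another queries before acting. A minimal, framework-free sketch of that shared-memory idea follows; all names here are hypothetical.
+
+```python
+import lancedb
+from sentence_transformers import SentenceTransformer
+
+embedder = SentenceTransformer("all-MiniLM-L6-v2")
+db = lancedb.connect("./lancedb")
+memory = db.create_table(
+    "agent_memory",
+    data=[{"agent": "bootstrap", "note": "init", "vector": embedder.encode("init").tolist()}],
+    mode="overwrite",
+)
+
+def remember(agent: str, note: str) -> None:
+    """One agent publishes a finding to the shared memory."""
+    memory.add([{"agent": agent, "note": note, "vector": embedder.encode(note).tolist()}])
+
+def recall(question: str, k: int = 3):
+    """Another agent retrieves the most relevant notes before acting."""
+    return memory.search(embedder.encode(question).tolist()).limit(k).to_pandas()["note"].tolist()
+
+remember("researcher", "The Q3 report flags rising vector search adoption.")
+print(recall("What did the researcher find about vector search?"))
+```
+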
+| AI Agents | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
| [AI Agents: Reducing Hallucination](/examples/reducing_hallucinations_ai_agents/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/reducing_hallucinations_ai_agents/main.py) [![JS](https://img.shields.io/badge/javascript-%23323330.svg?style=for-the-badge&logo=javascript&logoColor=%23F7DF1E)](./examples/reducing_hallucinations_ai_agents/index.js) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#) |[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/how-to-reduce-hallucinations-from-llm-powered-agents-using-long-term-memory-72f262c3cc1f/)|
| [AI Trends Searcher with CrewAI](./examples/AI-Trends-with-CrewAI/) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/track-ai-trends-crewai-agents-rag/)|
| [SuperAgent Autogen](/examples/SuperAgent_Autogen) | [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)||
-[Sentiment Analysis : Analysing Hotel Reviews](/examples/Sentiment-Analysis-Analyse-Hotel-Reviews/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/examples/Sentiment-Analysis-Analyse-Hotel-Reviews/Sentiment_Analysis_using_LanceDB.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/sentiment-analysis-using-lancedb-2da3cb1e3fa6)|
-| [Facial Recognition](./examples/facial_recognition) | [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)|
-| [Imagebind demo app](/examples/imagebind_demo/) | [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)|
+||||
+
+### Recommender Systems
+
+Create a recommender system application with LanceDB for efficient vector-based item recommendation. Input user preferences or item features to generate personalized recommendations and enhance user experience.
+
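+Under the hood, these recommenders reduce to nearest-neighbour lookup over item embeddings (the actual examples use richer features such as collaborative-filtering or Doc2Vec vectors). A toy, hedged sketch of that lookup:
+
+```python
+import lancedb
+from sentence_transformers import SentenceTransformer
+
+model = SentenceTransformer("all-MiniLM-L6-v2")
+movies = ["The Martian", "Interstellar", "Paddington", "Gravity"]
+
+db = lancedb.connect("./lancedb")
+table = db.create_table(
+    "movies",
+    data=[{"title": t, "vector": model.encode(t).tolist()} for t in movies],
+    mode="overwrite",
+)
+
+def recommend(title: str, k: int = 2):
+    # Nearest neighbours of the item's own embedding, excluding the item itself
+    vec = model.encode(title).tolist()
+    hits = table.search(vec).where(f"title != '{title}'").limit(k).to_pandas()
+    return hits["title"].tolist()
+
+print(recommend("Interstellar"))
+```
+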
+| Recommender Systems | Interactive Notebook & Scripts | Blog |
+| --------- | -------------------------- | ----------- |
+||||
+| [Movie Recommender](/examples/movie-recommender/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/movie-recommender/main.py) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+| [Movie Recommender with Genre](./examples/movie-recommendation-with-genres/) | [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/movie-recommendation-system-using-lancedb-and-doc2vec/)|
+| [Product Recommender](./examples/product-recommender/) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/product-recommender/main.py)[![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| |
+| [Arxiv paper recommender](/examples/arxiv-recommender) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./examples/arxiv-recommender/main.py) [![LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+||||
+### Concepts
+
+Check out core concepts behind the LLM application pipeline that help ensure accurate information retrieval.
+
+| Concepts | Interactive Notebook | Blog |
+| --------- | -------------------------- | ----------- |
+| | | |
+| [A Primer on Text Chunking and its Types](./tutorials/different-types-text-chunking-in-RAG) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/different-types-text-chunking-in-RAG/Text_Chunking_on_RAG_application_with_LanceDB.ipynb) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/a-primer-on-text-chunking-and-its-types-a420efc96a13) |
+| [Langchain LlamaIndex Chunking](./tutorials/Langchain-LlamaIndex-Chunking) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Langchain-LlamaIndex-Chunking/Langchain_Llamaindex_chunking.ipynb) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/chunking-techniques-with-langchain-and-llamaindex/) |
+| [Comparing Cohere Rerankers with LanceDB](./tutorials/cohere-reranker) | [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/benchmarking-cohere-reranker-with-lancedb/) |
+| [Product Quantization: Compress High Dimensional Vectors](https://blog.lancedb.com/benchmarking-lancedb-92b01032874a-2/) |[![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#) | [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/benchmarking-lancedb-92b01032874a-2/) |
+| [LLMs, RAG, & the missing storage layer for AI](https://blog.lancedb.com/llms-rag-the-missing-storage-layer-for-ai-28ded35fa984) | [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/llms-rag-the-missing-storage-layer-for-ai-28ded35fa984/) |
+| [Fine-Tuning LLM using PEFT & QLoRA](./tutorials/fine-tuning_LLM_with_PEFT_QLoRA) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/fine-tuning_LLM_with_PEFT_QLoRA/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/optimizing-llms-a-step-by-step-guide-to-fine-tuning-with-peft-and-qlora-22eddd13d25b) |
+| [Extracting Complex tables-text from PDFs using LlamaParse ](./tutorials/Advace_RAG_LlamaParser) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Advace_RAG_LlamaParser/main.ipynb) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![LlamaCloud](https://img.shields.io/badge/Llama-api-pink)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
+||||
## Projects & Applications
These are ready to use applications built using LanceDB serverless vector database. You can explore these open source projects, use parts of them in your projects or build your applications on top of these.
@@ -89,27 +180,7 @@ These are ready to use applications built using LanceDB serverless vector databa
| [ Fastapi RAG template ](https://github.com/lancedb/vectordb-recipes/tree/main/applications/Chatbot_RAG_with_FASTAPI) | FastAPI based RAG template with Websocket support | ![image](./assets/chatbot_fastapi.png)|
| [ GTE MLX RAG ](https://github.com/lancedb/vectordb-recipes/tree/main/applications/GTE_mlx_RAG) | mlx based RAG model using lancedb api support | ![image](./assets/rag-mlx.png)|
| [ Healthcare Chatbot ](https://github.com/lancedb/vectordb-recipes/tree/main/applications/Healthcare_chatbot/) | Healthcare chatbot using domain specific LLM & Embedding model | ![image](./assets/chatbot_medical.png)|
-
-
-
-## Tutorials
-Looking to get started with LLMs, vectorDBs, and the world of Generative AI? These in-depth tutorials and courses cover these concepts with practical follow along colabs where possible.
-| Tutorial | Interactive Environment | Blog Link |
-| --------- | -------------------------- | ----------- |
-| | | |
-| [Build RAG from Scratch](./tutorials/RAG-from-Scratch) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/RAG-from-Scratch/RAG_from_Scratch.ipynb) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
-| [Local RAG from Scratch with Llama3](./tutorials/Local-RAG-from-Scratch) | [![Python](https://img.shields.io/badge/python-3670A0?style=for-the-badge&logo=python&logoColor=ffdd54)](./tutorials/Local-RAG-from-Scratch/rag.py) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| |
-| [A Primer on Text Chunking and its Types](./tutorials/different-types-text-chunking-in-RAG) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/different-types-text-chunking-in-RAG/Text_Chunking_on_RAG_application_with_LanceDB.ipynb) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/a-primer-on-text-chunking-and-its-types-a420efc96a13) |
-| [Langchain LlamaIndex Chunking](./tutorials/Langchain-LlamaIndex-Chunking) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Langchain-LlamaIndex-Chunking/Langchain_Llamaindex_chunking.ipynb) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/chunking-techniques-with-langchain-and-llamaindex/) |
-| [Comparing Cohere Rerankers with LanceDB](./tutorials/cohere-reranker) | [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/benchmarking-cohere-reranker-with-lancedb/) |
-| [NER powered Semantic Search](./tutorials/NER-powered-Semantic-Search) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/NER-powered-Semantic-Search/NER_powered_Semantic_Search_with_LanceDB.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![beginner](https://img.shields.io/badge/beginner-B5FF33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/ner-powered-semantic-search-using-lancedb-51051dc3e493) |
-| [Product Quantization: Compress High Dimensional Vectors](https://blog.lancedb.com/benchmarking-lancedb-92b01032874a-2/) |[![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#) | [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/benchmarking-lancedb-92b01032874a-2/) |
-| [Corrective RAG with Langgraph](./tutorials/Corrective-RAG-with_Langgraph/) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Corrective-RAG-with_Langgraph/CRAG_with_Langgraph.ipynb) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/implementing-corrective-rag-in-the-easiest-way-2/)|
-| [LLMs, RAG, & the missing storage layer for AI](https://blog.lancedb.com/llms-rag-the-missing-storage-layer-for-ai-28ded35fa984) | [![intermediate](https://img.shields.io/badge/intermediate-FFDA33)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/llms-rag-the-missing-storage-layer-for-ai-28ded35fa984/) |
-| [Fine-Tuning LLM using PEFT & QLoRA](./tutorials/fine-tuning_LLM_with_PEFT_QLoRA) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/fine-tuning_LLM_with_PEFT_QLoRA/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/optimizing-llms-a-step-by-step-guide-to-fine-tuning-with-peft-and-qlora-22eddd13d25b) |
-| [Context-Aware Chatbot using Llama 2 & LanceDB](./tutorials/chatbot_using_Llama2_&_lanceDB) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/chatbot_using_Llama2_&_lanceDB/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)| [![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/context-aware-chatbot-using-llama-2-lancedb-as-vector-database-4d771d95c755) |
-| [Better RAG with FLARE](./tutorials/better-rag-FLAIR) | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/better-rag-FLAIR/main.ipynb) [![local LLM](https://img.shields.io/badge/local-llm-green)](#) [![LLM](https://img.shields.io/badge/openai-api-white)](#) [![advanced](https://img.shields.io/badge/advanced-FF3333)](#)|[![Ghost](https://img.shields.io/badge/ghost-000?style=for-the-badge&logo=ghost&logoColor=%23F7DF1E)](https://blog.lancedb.com/better-rag-with-active-retrieval-augmented-generation-flare-3b66646e2a9f/) |
-
+||||
**🌟 New! 🌟 Applied GenAI and VectorDB course on Udacity**
diff --git a/assets/imagebind-demo.png b/assets/imagebind-demo.png
new file mode 100644
index 00000000..c1e28ae1
Binary files /dev/null and b/assets/imagebind-demo.png differ
diff --git a/assets/movie-recommendation-with-genre.png b/assets/movie-recommendation-with-genre.png
new file mode 100644
index 00000000..6b26c968
Binary files /dev/null and b/assets/movie-recommendation-with-genre.png differ
diff --git a/assets/rag_evaluation_flow.png b/assets/rag_evaluation_flow.png
new file mode 100644
index 00000000..4b74b6cb
Binary files /dev/null and b/assets/rag_evaluation_flow.png differ
diff --git a/assets/superagent-autogen.png b/assets/superagent-autogen.png
index d41d2484..9908c624 100644
Binary files a/assets/superagent-autogen.png and b/assets/superagent-autogen.png differ
diff --git a/examples/Code-Documentation-QA-Bot/lancedb_cloud/README.md b/examples/Code-Documentation-QA-Bot/lancedb_cloud/README.md
new file mode 100644
index 00000000..88b6eca7
--- /dev/null
+++ b/examples/Code-Documentation-QA-Bot/lancedb_cloud/README.md
@@ -0,0 +1,33 @@
+# Code documentation Q&A bot example with LangChain
+
+![imgonline-com-ua-twotoone-RaRlTe66ft3RUvK](https://github.com/lancedb/vectordb-recipes/assets/15766192/4682b39d-62f4-4722-bc64-f45d45ec8a22)
+
+
+This Q&A bot will allow you to query your own documentation using natural-language questions. We'll also demonstrate the use of LangChain and LanceDB Cloud with the OpenAI API. In this example we'll use the **NumPy 1.26** documentation, but this could be replaced with your own docs as well.
+Colab walkthrough -
+
+
+### Set credentials
+If you would like to set the API key through an environment variable:
+```bash
+export LANCEDB_API_KEY="sk_..."
+```
+or, from Python:
+```python
+import os
+import getpass
+
+os.environ["LANCEDB_API_KEY"] = getpass.getpass("Enter Your LANCEDB API Key:")
+```
+
+Replace the following lines in main.py with your project slug and API key:
+```
+db_url="db://your-project-slug-name"
+api_key="sk_..."
+region="us-east-1"
+```
+
+### Run the script
+```bash
+OPENAI_API_KEY=... python main.py --query "what is a vectordb?"
+```
\ No newline at end of file
diff --git a/examples/Code-Documentation-QA-Bot/lancedb_cloud/main.ipynb b/examples/Code-Documentation-QA-Bot/lancedb_cloud/main.ipynb
new file mode 100644
index 00000000..5b40c699
--- /dev/null
+++ b/examples/Code-Documentation-QA-Bot/lancedb_cloud/main.ipynb
@@ -0,0 +1,484 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "id": "13cb272e",
+ "metadata": {},
+ "source": [
+ "# Code documentation Q&A bot example with LangChain\n",
+ "![picture](https://lancedb.github.io/lancedb/assets/ecosystem-illustration.png)\n",
+ "\n",
+ "This Q&A bot will allow you to query your own documentation easily using questions. We'll also demonstrate the use of LangChain and LanceDB using the OpenAI API.\n",
+ "\n",
+ "In this example we'll **Numpy 1.26** documentation, but, this could be replaced for your own docs as well"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "9a0e829a",
+ "metadata": {
+ "id": "wgPbKbpumkhH"
+ },
+ "source": [
+ "### Credentials\n",
+ "\n",
+ "Copy and paste the project name and the api key from your project page.\n",
+ "These will be used later to [connect to LanceDB Cloud](#scroll-to=5q8m6GMD7sGu)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "id": "6553603f",
+ "metadata": {
+ "id": "rqEXT5-fmofw"
+ },
+ "outputs": [],
+ "source": [
+ "project_slug = \"your-project-slug\" # @param {type:\"string\"}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "id": "36ef9c45",
+ "metadata": {
+ "id": "5LYmBomPmswi"
+ },
+ "outputs": [],
+ "source": [
+ "api_key = \"sk_...\" # @param {type:\"string\"}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "33ba6af1",
+ "metadata": {
+ "id": "Xs6tr6CMnBrr"
+ },
+ "source": [
+ "You can also set the LANCEDB_API_KEY as an environment variable. More details can be found **here**."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "Le27BWs2vDbB"
+ },
+ "source": [
+ "Since we will be using OPENAI API, let us set the OPENAI API KEY as well."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "-2-fyVPKu9fl"
+ },
+ "outputs": [],
+ "source": [
+ "openai_api_key = \"sk-...\" # @param {type:\"string\"}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "1991331f-4316-417a-b693-e2f27cbe9ea7",
+ "metadata": {},
+ "source": [
+ "### Installing dependencies"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "e8a49c31",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "! pip install -U langchain langchain-openai langchain-community"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "66638d6c",
+ "metadata": {
+ "id": "QR9W53zStdlz"
+ },
+ "outputs": [],
+ "source": [
+ "! pip install -qq tiktoken unstructured pandas lancedb"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "0QQL4lm8lTzg"
+ },
+ "source": [
+ "### Importing libraries"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "vP6d6JUShgqo"
+ },
+ "outputs": [],
+ "source": [
+ "import openai\n",
+ "import os\n",
+ "import re\n",
+ "import pickle\n",
+ "import requests\n",
+ "import zipfile\n",
+ "from pathlib import Path\n",
+ "\n",
+ "from langchain.document_loaders import UnstructuredHTMLLoader\n",
+ "from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
+ "from langchain.vectorstores import LanceDB\n",
+ "from langchain_openai import OpenAI, OpenAIEmbeddings\n",
+ "from langchain.chains import RetrievalQA\n",
+ "\n",
+ "os.environ[\"OPENAI_API_KEY\"] = openai_api_key\n",
+ "assert openai.models.list() is not None"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "8eKRYd2F7v5n"
+ },
+ "source": [
+ "### Get the data\n",
+ "To make this easier, we've downloaded Numpy documentation and stored the raw HTML files for you to download. Once the docs are downloaded, we then use LangChain's HTML document readers to parse them and store them in LanceDB as a vector store, along with relevant metadata.\n",
+ "By default we use numpy docs, but you can replace this with your own docs as well."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "l0ezDr7suAf_"
+ },
+ "outputs": [],
+ "source": [
+ "numpy_docs = requests.get(\"https://numpy.org/doc/1.26/numpy-html.zip\")\n",
+ "with open(\"numpy-html.zip\", \"wb\") as f:\n",
+ " f.write(numpy_docs.content)\n",
+ "\n",
+ "file = zipfile.ZipFile(\"numpy-html.zip\")\n",
+ "file = file.extractall(path=\"numpy_docs\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "HJf8xZmX8VJC"
+ },
+ "source": [
+ "We'll create a simple **helper function** that can help to extract metadata, so it can used later when querying with filters. In this case, we want to keep the lineage of the uri or path for each document that has been processed:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "5aljyqpUiViE"
+ },
+ "outputs": [],
+ "source": [
+ "# Pre-processing and loading the documentation\n",
+ "\n",
+ "# Next, let's pre-process and load the documentation. To make sure we don't need to do this repeatedly if we were updating code,\n",
+ "# we're caching it using pickle so we can retrieve it again (this could take a few minutes to run the first time you do it).\n",
+ "# We'll also add some more metadata to the docs here such as the title and version of the code:\n",
+ "\n",
+ "\n",
+ "def get_document_title(document_list):\n",
+ " titles = []\n",
+ " for doc in document_list:\n",
+ " if \"metadata\" in doc and \"source\" in doc[\"metadata\"]:\n",
+ " m = str(doc[\"metadata\"][\"source\"])\n",
+ " title = re.findall(\"numpy_docs(.*).html\", m)\n",
+ " print(title)\n",
+ " if title:\n",
+ " titles.append(title[0])\n",
+ " else:\n",
+ " titles.append(\"\")\n",
+ " else:\n",
+ " titles.append(\"\")\n",
+ " return titles"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "PCufm9Xr8eWp"
+ },
+ "source": [
+ "### Pre-processing and loading the documents\n",
+ "\n",
+ "Next, let's pre-process and load the documents. To make sure we don't need to do this repeatedly while updating code, we're caching it using pickle so it can be retrieved again (this could take a few minutes to run the first time you do it). We'll also add extra metadata to the docs here such as the title and version of the code:\n",
+ "\n",
+ "*Note*: This step might take up to 10 minutes to run!\n",
+ "*Note*: If there is some issue with nltk package, kindly try using\n",
+ "```\n",
+ "import nltk\n",
+ "nltk.download('punkt')\n",
+ "```\n",
+ "or try to manually install the [nltk_data](https://github.com/nltk/nltk_data/tree/gh-pages) package and unzip the **punkt tokenizer** zip and the **averaged_perceptron_tagger** zip file in the packages folder."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 443
+ },
+ "id": "964Z2sZA247g",
+ "outputId": "236df468-a630-4691-85a4-886835cfc02d"
+ },
+ "outputs": [],
+ "source": [
+ "from tqdm import tqdm\n",
+ "\n",
+ "docs = []\n",
+ "docs_path = Path(\"docs.pkl\")\n",
+ "for p in tqdm(Path(\"numpy_docs\").rglob(\"*.html\")):\n",
+ " if p.is_dir():\n",
+ " continue\n",
+ " loader = UnstructuredHTMLLoader(p)\n",
+ " raw_document = loader.load()\n",
+ " # docs.append(raw_document)\n",
+ " title = get_document_title(raw_document)\n",
+ " m = {\"title\": title}\n",
+ " if raw_document:\n",
+ " raw_document[0].metadata.update(m)\n",
+ " raw_document[0].metadata[\"source\"] = str(raw_document[0].metadata[\"source\"])\n",
+ " docs.extend(raw_document)\n",
+ "\n",
+ "\n",
+ "if docs:\n",
+ " with open(docs_path, \"wb\") as fh:\n",
+ " pickle.dump(docs, fh)\n",
+ "else:\n",
+ " with open(docs_path, \"rb\") as fh:\n",
+ " docs = pickle.load(fh)\n",
+ "\n",
+ "len(docs)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "cntAuaUU_TER"
+ },
+ "source": [
+ "### Generating emebeddings from our docs\n",
+ "\n",
+ "Now that we have our raw documents loaded, we need to pre-process them to generate embeddings:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "dHw2DSAj3u9B"
+ },
+ "outputs": [],
+ "source": [
+ "text_splitter = RecursiveCharacterTextSplitter(\n",
+ " chunk_size=1000,\n",
+ " chunk_overlap=200,\n",
+ ")\n",
+ "documents = text_splitter.split_documents(docs)\n",
+ "embeddings = OpenAIEmbeddings()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "IiM4DJvC_2dV"
+ },
+ "source": [
+ "### Store data in LanceDB Cloud\n",
+ "\n",
+ "Let's connect to LanceDB so we can store our documents, It requires 0 setup !"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "GV77SSi-AK0v"
+ },
+ "outputs": [],
+ "source": [
+ "uri = \"db://\" + project_slug\n",
+ "table_name = \"langchain_vectorstore\"\n",
+ "\n",
+ "vectorstore = LanceDB(\n",
+ " embedding=embeddings,\n",
+ " uri=uri, # your remote database URI\n",
+ " api_key=api_key,\n",
+ " region=\"us-east-1\",\n",
+ " table_name=table_name, # Optional, defaults to \"vectors\"\n",
+ " mode=\"overwrite\", # Optional, defaults to \"overwrite\"\n",
+ ")\n",
+ "\n",
+ "doc_ids = vectorstore.add_documents(documents=documents)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "sZOUxfqzXr1m"
+ },
+ "source": [
+ "Now let's create our RetrievalQA chain using the LanceDB vector store:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "4nDltKClAhhU"
+ },
+ "outputs": [],
+ "source": [
+ "qa = RetrievalQA.from_chain_type(\n",
+ " llm=OpenAI(), chain_type=\"stuff\", retriever=vectorstore.as_retriever()\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "xoS-WKXMXvvR"
+ },
+ "source": [
+ "And thats it! We're all setup. The next step is to run some queries, let's try a few:"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "7SKSlyq2iwpK"
+ },
+ "source": [
+ "### Query"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "6aSZr8fCXx9s",
+ "outputId": "ac5b5663-d45f-48c0-9f0a-f272e1a3ec2d"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'query': 'tell me about the numpy library?',\n",
+ " 'result': ' The NumPy library is an open source Python library that is used for working with numerical data in Python. It contains multidimensional array and matrix data structures, and provides methods for efficient operations on these arrays. It is widely used in various fields of science and engineering and is a core component of the scientific Python and PyData ecosystems. It also offers a large library of high-level mathematical functions for working with arrays and matrices. '}"
+ ]
+ },
+ "execution_count": 14,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query = \"tell me about the numpy library?\"\n",
+ "qa.invoke(query)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "EtBw5EH7lv9_",
+ "outputId": "1745f881-fa15-44b5-e692-b702babce734"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'query': \"What's the current version of numpy?\",\n",
+ " 'result': '\\nThe current version of numpy is 1.16.4.'}"
+ ]
+ },
+ "execution_count": 15,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query = \"What's the current version of numpy?\"\n",
+ "qa.invoke(query)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "fR4CmF9ylvzw",
+ "outputId": "1b33bb78-4b3f-4dea-addd-75f56eb4e5e6"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "{'query': 'What kind of linear algebra related operations can be done in numpy?',\n",
+ " 'result': ' The numpy package provides various operations related to linear algebra, such as decompositions, matrix eigenvalues, norms, solving equations and inverting matrices, and performing linear algebra on several matrices at once. It also has support for logic functions, masked array operations, mathematical functions, matrix library, miscellaneous routines, padding arrays, polynomials, random sampling, set routines, sorting, searching, counting, statistics, and window functions.'}"
+ ]
+ },
+ "execution_count": 16,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query = \"What kind of linear algebra related operations can be done in numpy?\"\n",
+ "qa.invoke(query)"
+ ]
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "provenance": []
+ },
+ "kernelspec": {
+ "display_name": "Python 3",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.1"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
diff --git a/examples/Code-Documentation-QA-Bot/lancedb_cloud/main.py b/examples/Code-Documentation-QA-Bot/lancedb_cloud/main.py
new file mode 100644
index 00000000..e9eb9605
--- /dev/null
+++ b/examples/Code-Documentation-QA-Bot/lancedb_cloud/main.py
@@ -0,0 +1,136 @@
+# %% [markdown]
+# # Code documentation Q&A bot example with LangChain
+#
+# This Q&A bot will allow you to query your own documentation easily using questions. We'll also demonstrate the use of LangChain and LanceDB using the OpenAI API.
+#
+# In this example we'll use the NumPy 1.26 documentation, but this could be replaced with your own docs as well
+
+import argparse
+import os
+import pickle
+import re
+import zipfile
+from pathlib import Path
+
+import openai
+import requests
+from langchain.chains import RetrievalQA
+from langchain.text_splitter import RecursiveCharacterTextSplitter
+from langchain_community.document_loaders import BSHTMLLoader
+from langchain_community.vectorstores import LanceDB
+from langchain_openai import OpenAI, OpenAIEmbeddings
+
+
+def get_document_title(document_list):
+ titles = []
+ for doc in document_list:
+ if "metadata" in doc and "source" in doc["metadata"]:
+ m = str(doc["metadata"]["source"])
+ title = re.findall("numpy_docs(.*).html", m)
+ if title:
+ titles.append(title[0])
+ else:
+ titles.append("")
+ else:
+ titles.append("")
+ return titles
+
+
+def arg_parse():
+ default_query = "tell me about the numpy library?"
+ # default_query = "What's the current version of numpy?"
+
+ parser = argparse.ArgumentParser(description="Code Documentation QA Bot")
+ parser.add_argument(
+ "--query", type=str, default=default_query, help="query to search"
+ )
+ parser.add_argument("--openai-key", type=str, help="OpenAI API Key")
+ args = parser.parse_args()
+
+ if not args.openai_key:
+ if "OPENAI_API_KEY" not in os.environ:
+ raise ValueError(
+ "OPENAI_API_KEY environment variable not set. Please set it or pass --openai_key"
+ )
+ else:
+        os.environ["OPENAI_API_KEY"] = openai.api_key = args.openai_key  # make the key visible to langchain_openai too
+
+ return args
+
+
+def pre_process():
+ from tqdm import tqdm
+
+ docs = []
+ docs_path = Path("docs.pkl")
+ for p in tqdm(Path("numpy_docs").rglob("*.html")):
+ if p.is_dir():
+ continue
+ # loader = UnstructuredHTMLLoader(p)
+ loader = BSHTMLLoader(p, open_encoding="utf8")
+ raw_document = loader.load()
+ # docs.append(raw_document)
+ title = get_document_title(raw_document)
+ m = {"title": title}
+ if raw_document:
+ raw_document[0].metadata.update(m)
+ raw_document[0].metadata["source"] = str(raw_document[0].metadata["source"])
+ docs.extend(raw_document)
+
+ if docs:
+ with open(docs_path, "wb") as fh:
+ pickle.dump(docs, fh)
+ else:
+ with open(docs_path, "rb") as fh:
+ docs = pickle.load(fh)
+
+ return docs
+
+
+if __name__ == "__main__":
+ args = arg_parse()
+
+ numpy_docs = requests.get("https://numpy.org/doc/1.26/numpy-html.zip")
+ with open("numpy-html.zip", "wb") as f:
+ f.write(numpy_docs.content)
+
+    with zipfile.ZipFile("numpy-html.zip") as zf:
+        zf.extractall(path="numpy_docs")
+
+ docs = pre_process()
+
+ print("Loaded {} documents".format(len(docs)))
+ text_splitter = RecursiveCharacterTextSplitter(
+ chunk_size=1000,
+ chunk_overlap=200,
+ )
+ documents = text_splitter.split_documents(docs)
+ embeddings = OpenAIEmbeddings()
+
+ db_url = "db://your-project-slug"
+ api_key = "sk_..."
+ region = "us-east-1"
+ table_name = "langchain_vectorstore"
+
+ vectorstore = LanceDB(
+ embedding=embeddings,
+ uri=db_url, # your remote database URI
+ api_key=api_key,
+ region="us-east-1",
+ table_name=table_name, # Optional, defaults to "vectors"
+ mode="overwrite", # Optional, defaults to "overwrite"
+ )
+
+ # insert documents in batches
+ batch_size = 10000
+ for i in range(0, len(documents), batch_size):
+ print(f"ingesting batch of {i} : {i+batch_size}")
+ batch = documents[i : i + batch_size]
+ vectorstore.add_documents(batch)
+
+ qa = RetrievalQA.from_chain_type(
+ llm=OpenAI(), chain_type="stuff", retriever=vectorstore.as_retriever()
+ )
+
+ result = qa.run(args.query)
+ print(result)
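
Once `main.py` has run, the remote table can also be sanity-checked outside of the `RetrievalQA` chain. The snippet below is a minimal sketch, not part of this PR: it reuses the placeholder `db://your-project-slug` URI, `sk_...` API key, and `langchain_vectorstore` table name from the script above, and assumes the LangChain vectorstore's default `text` column for page content.

```python
# Sketch: query the LanceDB Cloud table created by main.py with the plain client.
# The URI, API key, and table name are the same placeholders used in the script above.
import lancedb
from langchain_openai import OpenAIEmbeddings

db = lancedb.connect("db://your-project-slug", api_key="sk_...", region="us-east-1")
table = db.open_table("langchain_vectorstore")

# Embed the question with the same model used at ingestion time, then vector-search.
query_vector = OpenAIEmbeddings().embed_query("How do I compute eigenvalues in numpy?")
hits = table.search(query_vector).limit(3).to_pandas()
print(hits[["text", "_distance"]])
```
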
diff --git a/examples/Code-Documentation-QA-Bot/lancedb_cloud/requirements.txt b/examples/Code-Documentation-QA-Bot/lancedb_cloud/requirements.txt
new file mode 100644
index 00000000..79d2c50b
--- /dev/null
+++ b/examples/Code-Documentation-QA-Bot/lancedb_cloud/requirements.txt
@@ -0,0 +1,11 @@
+openai
+langchain
+langchain-community
+langchain-openai
+lancedb
+requests
+beautifulsoup4
+tqdm
+unstructured
+tiktoken
+polars
diff --git a/examples/Evaluating_RAG_with_RAGAs/Evaluating_RAG_with_RAGAs.ipynb b/examples/Evaluating_RAG_with_RAGAs/Evaluating_RAG_with_RAGAs.ipynb
new file mode 100644
index 00000000..f69e55d3
--- /dev/null
+++ b/examples/Evaluating_RAG_with_RAGAs/Evaluating_RAG_with_RAGAs.ipynb
@@ -0,0 +1,1158 @@
+{
+ "nbformat": 4,
+ "nbformat_minor": 0,
+ "metadata": {
+ "colab": {
+ "provenance": []
+ },
+ "kernelspec": {
+ "name": "python3",
+ "display_name": "Python 3"
+ },
+ "language_info": {
+ "name": "python"
+ },
+ "widgets": {
+ "application/vnd.jupyter.widget-state+json": {
+ "91f4187ef74b4c0791fa9058899f7454": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "HBoxModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_d9fb5f1092e24ba59cb768842ff8f828",
+ "IPY_MODEL_6441a4ce0f644c51aa6c140de43ed31d",
+ "IPY_MODEL_a6b459a38e5c4386b85ee7ebc0e302a4"
+ ],
+ "layout": "IPY_MODEL_cf96076499974020b541a541648028f4"
+ }
+ },
+ "d9fb5f1092e24ba59cb768842ff8f828": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "HTMLModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_82cd48cdf6f144e19cb3ef0a0553b689",
+ "placeholder": "",
+ "style": "IPY_MODEL_8e537aa094004828b08a55a94cbd7dff",
+ "value": "Evaluating: 100%"
+ }
+ },
+ "6441a4ce0f644c51aa6c140de43ed31d": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "FloatProgressModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_55a45baafaad4a4ca2cad720da0a80aa",
+ "max": 27,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_7f2a940f7f114e439f63e5e900748511",
+ "value": 27
+ }
+ },
+ "a6b459a38e5c4386b85ee7ebc0e302a4": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "HTMLModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_50282721d29447c89bc05559a99183dd",
+ "placeholder": "",
+ "style": "IPY_MODEL_23c5a724d0fa40efac45f1fe8fc0b9c5",
+ "value": " 27/27 [00:23<00:00, 4.90s/it]"
+ }
+ },
+ "cf96076499974020b541a541648028f4": {
+ "model_module": "@jupyter-widgets/base",
+ "model_name": "LayoutModel",
+ "model_module_version": "1.2.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "82cd48cdf6f144e19cb3ef0a0553b689": {
+ "model_module": "@jupyter-widgets/base",
+ "model_name": "LayoutModel",
+ "model_module_version": "1.2.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "8e537aa094004828b08a55a94cbd7dff": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "DescriptionStyleModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "55a45baafaad4a4ca2cad720da0a80aa": {
+ "model_module": "@jupyter-widgets/base",
+ "model_name": "LayoutModel",
+ "model_module_version": "1.2.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "7f2a940f7f114e439f63e5e900748511": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "ProgressStyleModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "50282721d29447c89bc05559a99183dd": {
+ "model_module": "@jupyter-widgets/base",
+ "model_name": "LayoutModel",
+ "model_module_version": "1.2.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "23c5a724d0fa40efac45f1fe8fc0b9c5": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_name": "DescriptionStyleModel",
+ "model_module_version": "1.5.0",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ }
+ }
+ }
+ },
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "source": [
+ "## Evaluating RAG with RAGAs using GPT-4o\n",
+ "\n",
+ "Ragas is a **framework for evaluating Retrieval Augmented Generation (RAG) pipelines**.\n",
+ "\n",
+ "Ragas provides you with the tools/metrics based on the latest research for evaluating LLM-generated text to give you insights about your RAG pipeline. Ragas can be integrated with your CI/CD to provide continuous checks to ensure performance.\n",
+ "\n",
+ "GPT4-o is used as an LLM to generate responses out of semantically close context chunks.\n",
+ "\n",
+ "![flow.png](data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABSYAAAJUCAYAAAAIH/gPAAAABHNCSVQICAgIfAhkiAAAABl0RVh0U29mdHdhcmUAZ25vbWUtc2NyZWVuc2hvdO8Dvz4AAAAtdEVYdENyZWF0aW9uIFRpbWUAVHVlc2RheSAyOCBNYXkgMjAyNCAwMjozODoxOCBQTS8GbXgAACAASURBVHic7N13XJV1/8fx12FvkD0OIO69TcgsUyutrDRNK73bZcv0ltavvNvz1rRdNtSs1HCUd8OtWSSO3HsLB0EERPY65/z+QI4SuBUQ3s/Ho0eHc13X9/pcB/Tx4O3n+/0arFarFREREREREREREZFqZFfTBYiIiIiIiIiIiEj9o2BSREREREREREREqp2CSREREREREREREal2CiZFRERERERERESk2imYFBERERERERERkWqnYFJERERERERERESqnYJJERERERERERERqXYKJkVERERERERERKTaKZgUERERERERERGRaqdgUkRERERERERERKqdgkkRERERERERERGpdgomRUREREREREREpNopmBQREREREREREZFqp2BSREREREREREREqp2CSREREREREREREal2CiZFRERERERERESk2imYFBERERERERERkWqnYFJERERERERERESqnYJJERERERERERERqXYKJkVERERERERERKTaKZgUERERERERERGRaqdgUkRERERERERERKqdgkkRERERERERERGpdg41XYCIiIiIiIiIiNROJpOJuLg4AJKTk0lKSqpwzGQy1VRp1cpoNFZ4HR4ebvu6W7duGI1GYmJiaqK0y5rBarVaa7oIERERERERERGpWSaTiaSkJBISEmz/nYlvqH81VFbzMg+ln/Eco9FoCy1vv/12BZVnQcGkiIiIiIiIiEg9Vt4VOXHixErHfEP9ib7tSnxD/fEL8z/+np/tdX2TkVwWUGYeyrB9nXkond1rdrJ7zc4K5xqNRgYNGsTgwYMrdFzKCQomRURERERERETqoaoCyfIgsmnXFjTt2rwGq7s8ZSSfCClX/RQPYJvmrS7KyhRMioiIiIiIiIjUM7Gxsba1IwFufOwWut3avd52Ql4KGcnprPopnl8/mWd7Lzo6mpkzZ9ZgVbWLgkkRERERERERkXrCZDIxZswYEhISbN2RNz52a02XVaeVB5QJP/5F5qF0jEYjM2fO1PRuFEyKiIiIiIiIiNQLK1euZOjQoUDZlO2nJj+tDslqdHIHZfn6k6NHj67psmqU/csvv/xyTRchIiIiIiIiIiKXTlxcHA8//DAA3W7tzqgpz+Dm5VbDVdUvbl5uNO3aAoC/l6yx7Xpen9edVMekiIiIiIiIiEgddnKn5I2P3aKp27VARnI679/3XzIPpTNq1Kh62zmpYFJEREREREREpI4ymUx0794dgKcmP6OdtmuRjOR0XrrhWYxGI+PGjauXnZN2NV2AiIiIiIiIiIhcGmPGjAGgadfmCiVrGb8wf4a9fj8mk4nY2FhMJlNNl1TtFEyKiIiIiIiIiNRBQ4YMISEhgaZdm/PU5GdquhypQvRt3el2a3fbbun1jYJJEREREREREZE6ZuXKlbbNVbSmZO1242O30LRrcxISEli5cmVNl1OtFEyKiIiIiIiIiNQxs2fPBsp24NYU7trNL8zfFh5PnDixhqupXgomRURERERERETqmLi4OKCsG09qP99Qv3rZNalgUkRERERERESkDomNjQXKuiX9wvxruBo5G35h/nS7tWz39PJu1/pAwaSIiIiIiIiISB2ibsnLU/mU+5UrV9abrkkFkyIiIiIiIiIidUR5KKluyctP2VqTt2AymWwbF9V1CiZFREREREREROoIk8kEoA1vLlNNu7YAUDApIiIiIiIiIiKXl/JAS8Hk5ck31A84ETDXdQomRURERERERETqiPJgUtO4L09+Yf407dock8lUL8JJBZMiIiIiIiIiInVA+YYp5bs7y+UtKSmppku45BRMioiIiIiIiIjUAfWhw64+uPGxW4H6sc6kgkkRERERERERkTpg1apVwNmvL2lnLsU5PxPHwmN4p2y+lKVVq64z7iFg7+81XcYFqw/BpENNFyAiIiIiIiIiIhfP2awvGbR7MY1WfoZj4TEASly8SRg2E6ud/aUu75Jzzk/HNzGBI42vqelSzkv5Bjj1gYJJEREREREREZF6pFHC5xg3zSLXvwmHm11PwJ4lOOdnErRzAaktb6zp8i6cxYLrMROeR3bgkp2KXWkRpc6eZDSMAQw1Xd1Zqw9T8xVMioiIiIiIiIjUAeWbpZyu4y5ozxKMm2aRHtWD7X1exGqwI61JLzrNeZSwbT9dlsGkXWkRwbsW4pJ9CNfsQ9hZSvFK20HHuU/azinyCOJYSDtKnT1qsNKzU592VFcwKSIiIiIiIiJSB5ypw85gMRO5Zip5/o3Zce1zWA1lW4/k+jehwDsM16zz3wXa70A8/gf+wrEomxz/ZqQ37E6eX6PjR62EbZ6Dd+oWLPaO5AQ040iT3hS7NqgwhktOCkG7loDVzNGIbmQHtqh0H6eCo3imbqPUzYdjQa0AA2FbfiRq9ZcVzitx8eZQ61vJCWpBrm9jit18T/u5nGoK+5lqcs0+hP++FfglrcYtcz9mR3c23PIeRR6BtnPszKU0SFpNwP4/8TiyA5fcNExtB3Kg6/2n+UTrBwWTIiIiIiIiIiJ1yKk67gL2rcAlJ4UtV76GxcGpwrGDne/B8/DWStd4H95KxNppeB/egsXeiVz/xqS07M+RRlefuN/+eFotfgWD1QqA78EEItZ/x8FOw0nsdBeNV35O2ObZtvMD9ywjYv10dvV8moyIaAAi1n9PxLpp2JlLAYhc9y27eoyu0MEZtGcJTf74APuSfACyQ9qyue/rZER0wyN9F3l+TcgObEGL5e+Q7xPOwc7DKz2PwWohfMNMUptdT7G7H41XfkrY5jkkdryzUlB4upqc8jNpteR1vI5vGmR2cCXPvzElzp5Y7U7EbWFbfyJy7VQcinIAyPcJ51hQG0r+Ecr+k2+ov6Zyi4iIiIiIiIjI5eFMQZbX4S1Y7BzIMnapdCytybWkNbnW9rXBaqHpigkE71wAWMkJaE5uQFO8Dm+j5eLXaNDyRvbEPI7FwZFGq7/EarBnS783yAptj2NRNsE7fiN02zzMTm6EbplLoWcIm296mwKvENyyTBg3/kDzZe/y9+2fEbBvBQ3XTCYrrAP7r3gQi50jbRa8SPimH2zBZNjWH2kc/zElLl4kt7kL94y9+CWuImj3Eg616s/2PmNttRd6BGFfXFDlZ2Bfkk/DNV+XTem2lnVyAkSsn8Hh5n0p8AoFwLhp1mlr8k1abQslkzoM5WDne7DYV47ZjBtn4lCUQ4mLNxtu+8A2/pn4hfmReSgdk8mE0Wg8q2suRwomRURERERERETqCN/Q069PaLBawGo54zh2pYUE75wPgKn9Hey/4kGshrKNY/wSV9FiyZsYLGaS2g
/B9ZiJ5DYDOGrsBECxawMSO95FYse7CNn+Mwarhb3Rj9hCuXyfcHZdM4Zd14zBKS+dhmsnU+rsSXLrARS5+eOVtg2HojwKvL0BcM49QqO/PqHQM4QNt31AsasPduYSOs96GLODc6XarfaOOBTlVv1gZU2dBO5ahOeRHWQHtiClVX+aL/8vvomrSG4z4KxqSmvSG88jOwne/gvGTXE4FmVzsOPdFaZwA+y49nmaL38Xl5xU2vz2Agc7/4sjjXvaPsv6TsGkiIiIiIiIiEg9kN6wB6Fb5xGx/tszrm9odnQDINe/Kfu6PVThWEZENzIiu+G3/09Sm10PQKFXSJXjOOUfPe3xiA3TMZhLKPYKpfXCl2zvW+wd2H/FAwD4HVyJwWplb8wIil19jh93ZM2QyVWOabBaygLYk3gd3kZ2UEucCsrq8UrbTqFnMNtueI1iV28arv4Kl5zDZ12Txd6R3Vc9RUqLGwnb8iOBu5cSuGshaU36YGo/mHyfCACOhbRl7eCvCNnxC8E759Ni6ZtErJuGqf0dpDXpU2WXZX1Sv59eRERERERERKSeyArrQEbD7kSsn45jYQ77r3jgtLtUWxycMDu5VXrf7egBfBNXkx3YyrZ2ol1pcZVjlB+3N1d93H/fH2Q07M723i8SsHcpPoc2UezuR1rjnuQ3aAhg25jGK20bGQ2vPONzFrs2wCXrxLR294x9dPjpKdYP/Bj7ovzjz+bMln5v2ILO3MCWuB09eNY1lcv1b8rOnk+zN+ZRwrb8SOjWuQTvXICp3e3si37E9jkmtxlAcpsB+BzaQPj6GTT7fTwR675j6w2vkucbdcZnqqsUTIqIiIiIiMhFZzKZiIuLIyEhwfZ1fdjIoa45eW07o9FIdHQ00dHRxMTE1GBVciG29RlLo4TPCd36E0G7FpAZEY3Z3gmXvDQcC3M40vgaDnYaBsDhZtcTsu1nWix5k6zQDthZSvE6vAX//X+Q3yCK7de9iG/SWgCcCjKrvJ/BagbAMb/q4w7FOdiVFmO1syet6XWkNb2u0jnpUT2IXP8d4Rtm4nY0kSONryUruA3FHgFVjlnkHohTQRZOBVmUuHgRtvVHLA7O5HsbaWBaB0BS+ztsXY1QtjO5cWMcdubSs6oJIGzzLHxSNrM3ZgSFniEc7DyMpPaDiFozBeOmWeT6NSI96ho6/vQkKS1uIqVVf7JCO5AV2gGvtB20XPwqrRe8xOo7v6ly/PpAwaSIiIiIiIhcFOVh5MSJE095Tl3exKEuOjlMNplMtqDZaDQSExPDqFGj9D29zFjt7Nl75WMktx1A2KbZeKXtwC0rCbOzO9kBzTlq7Gw7d2/0I1gM9vgfTCBw7zIsDk7k+DVl9zWxHGl0DRY7B3L9GmNxcMJi71Tl/XICW8LWeZV2AbcdD2pNA9NavFM2cSykXZXnlLh4se7WD2j2+3v4HVyJ38GVABR5BJId2IK0Jr3IaNjddv6x4FaEbZ5F57gHMVgtOBTlcKDr/Zgd3choeCX7oh/hUKv+Fe6R0rwfQTsXYF+Sf1Y1AZS4+eF7cCUNEleTHdKGEhdvDBYLnkd2AcensdvZAdAk/kOMm+LI82+Mxc4Rx8JjOOYfxeLoil1p8Sk/n7rOYLUe38tdRERERERE5DxNmDDBFkg6ODjQrVs323/h4eGEhYXVcIVyIZKTkwFISEhg9erVzJgxAygLKAcNGsTo0aNrsjw5LjIyEt9Qf15d+M4lGL08Pqq8aYvnkZ0UeIdR6lT1tPAGpnUcNXas8lqvtO20nzcas4Mr+6If5kjja7DaOeF+dB+Bu5fhdyCew017c7DLPQC4ZSXhfWgjXmnbcc/ch0tOKoeb3cDemBG2MQ1WK60WvoTPoY1khnchtcVNto15zsa51OR+9CDG9dPxStuOc94RLPZOFHkFk9rsBlJa9sdi74CduZSgXfMJ3rEAl5xD2JcUUOLsSVZoR5LbDSLXv0mlGt6/7112r9lJfHx8nQ7/FUyKiIiIiIjIBRkyZIitk27UqFEKqeqBf3bHGo1GZs6cWacDlMvBpQ0mL50GprW0WPoWjoXZx98xUB6EFnkEsvOaWLLCOtarmhRMioiIiIiIiJxBeShpNBoZN26c1h6sZ0wmE0OGDMFkMmE0GomPj6/pkuq1yzWYBLArLSJg3++4ZSVhsJgp8A7lWEh78n3C62VN9SWY1BqTIiIiIiIicl4mTJhgCyUVSNVP5Z2SY8aMISEhgdjYWMaNG1fTZcllyOLgzOFm19d0GRXUxprqGruaLkBEREREREQuT+XTeBVE1W9Go5Hx48cDEBcXx8qVK2u4IhG5XCiYFBERERERkXM2YcIEAAYPHqzp22Kbyg+cdld2EZGTKZgUERERERGRc1YePo0aNaqGK5HaojygNplM6pqUWmfhF7+wb/2emi5D/kHBpIiIiIiIiJyTuLg4oKxbsi5vyiDnxmg0MnjwYEwmU02XInVIUX4h+dn5FzzOuoVrWfvrqotQkVxMCiZFRERERETkvHTr1q2mS5Ba5vbbbwdg9uzZNVyJ1AWm7Yn8X88xPHPlk3z170/Jzcw577GsFiuFuQUXsTq5GLQrt4iIiIiIiJyTVavKuo4upFty8+bNtG3b9mKVJLVEeHg4gKZy13KWUjMHtx5g34Y9HNi0jyMHD1OYV0TDtlEMe/1+HJxqR1w0f9LPFOUXEtbcyKZl60nansiTX47BL8z/nMeymC0Y7NSfV9voOyIiIiIiIiLnJCkpCTgRQp2rwsJCnnziKWbMmHExy5JaQFP7a6/9G/ey8Itf+Pjh93g65knG3/0mP46L48CmfTi6OOPp58Xm5Rs5tPvCp+IfPpDKs91HkpmScUHjHNy8n1ZXteX52a/w9PSxAEy89x3Sk9LOeSxzqRlnN+cLqkcuvtoRgYuIiIiIiMhlo3wNwQsJoXy97Zk1cyrFRcX8655/XazSpBYwGo1aZ7KWWfTVb/w0YRYALh6utOvVkVY92tLyytZ4+HrazivKL8TZzeWsx92xciupe1O45u7eGAwG2/sF2fnkHctjV8J2ogdcVem6zcs34hfqR2iz0/8dkpOZzTXdegNgbBFO7Pf/xwcPjOP9+/7L09NfxCvA+6xrtZSacXJVMFnbKJgUERERERGRamexWJnzgS9DxnyPFSv33HNPTZckUmcFNwoBoH3vjtz77iM4OjtWed65hJIACyb9wu41O8nPzuPGx261vW+1WABI3HYQP2MAWYePUlpSSnCjEKLaN2b/hj2s/XUV97378CnHLsovpLS4FDcvd9t7Hg08eeLzf/P24FdYPHk+A58Zcta1mkvNuKhjstZRMCkiIiIiIiLnxGQyXZwpu1aYOd6PobFlU7oVTtYN5R2TF+3nRC5Y067NAQhrHn7KUPJ83PvOwyydupBdq3dwRf8YNi3dQLrpCIlbDwCwYvpSVkxfCoDBYKBVj7Y8+slTODg5cnh/ylnd46RGTAC8ArzpcF0nXDxcz6nWkuISHF2dzukaufQUTIqIiIiIiEjNscKM8b70GDYVs9nC/fffV9MVidRZph1JJ
G0/SFFeEQe37idx60GSdyaRk5FN535XcMcLdwNlG8XEz1rB1hWbsJSa8QluQP+nbsfzpGnfAN6BPgx4+g4A/v5tNUnbDxLVvjHNu7Xki1Efc/MTt9F3RP9KdXg08MC0PZHCvEJc3Mu6NAvzCnF0csTe0Z5vx04mYe6fACz88ldMO5IIbhRCcONQQhqHcscLwwDYuHgdpp2J9BtxC3b2FbdRyc3MsU1Tt1qt5B3NxSfAx/Z8i7/+jW63dsc70IfZ78xg2bRFXP/QTdzy1MAK48z//Gd++2we5hKz7b07X76H7oOuJjv9GF+P+Yw9f+8CyjpOjS3CcfN2586X/oWXf9lU8xXTl/LLRz+SdywPgMCGwfiG+NHmmnb0HNbnnL6HdY2CSREREREREalR//kwn/69fFi2cCaAwkmRS2TT0vVsWrre9rVXgDctY1oT1iKc1j3aAVBSVMKUZyaxael6IttGEdI4lIxDGXw6YgKPfDQS70CfCmN+O3Yy/Ub0p3O/K+jc7woAW4hXmFdYZR1OLmWdiyWFxbi4u/DHjGXMens6PkENeH7OK1z/4I2Yi0tZ80sCRxLT+P37JRWub3VVW25/bii/fvoTyTtNXHNXbzwanAhNczKyeaHXGEZ/8xxR7RtTXFiM1Wq1dVkW5Rcy7/05uHq6YbVaWTZtEQCLvvyVKwdehX94IABLpy7k5w/n0qxbC24dPQgHR0c+e+J9lkyeT/dBV7Ptj822UPL6B2/kpsdvw97RvtLzLvr6N/KO5eHp68mY7/7PNr4omBQREREREZGaZICIEEceHOgIBleGPzeT4uIiRowYUdOVidQ57j4eBEQE0umGrjTr1pKw5sYKm9YATH76c7as2MSIT56idY+2APzwxnesmL6UeRNnM/zNByqcv37BWvzD/Ct0Rto72mMwGCgqKKqyDsfyYLKohLnjfmDJlAW4uLuQkZzOrx//xMBnhnDPOw9xcMt+vPy9uX/cCHKP5nD4QCqpe1PIO5aLnZ0dKXtTCGoYXCGUBNi4ZB0Ws4X84x2KxflldZRvfmO1lp23at5fHNyyn4btGtFjSE+mvfA1W1Zsoufdfcg6fJT/fTAHd293et7dB5/ABuzfuJf8Y/kERJYFi11uiubglv38Gfc7S6YsIC8rl74j+tMg2LdCPfe89RDTXviKjOR0Pn30fW587BY69b0Cg90/5qnXQ3ZnPkVERERERETkErFSFkoefz3tbT/+9+NsPvvss0tyuxUrVtC6dWtat24NQH5+PlFRURgMBgwGA717967yunnz5tGiRQvbeeX/ubu789577531/Z9//vlKY0SER7Bs2TIA2rVrV6G+quTm5nLvvfdWGsdgMNC3b18yMzPPWMc999xT6dqBAweSl5d31s8iInKhFEyKiIiIiIhIrTF9fjEhQU4krPiRTz/99KKPn5+fz7Zt29i2bRtWqxV3d3cOHDiAn68fAEuXLuXjjz+ucE1kZCS33norJpOJFStWYLVasVqtJCUl4e/vz5gxYwgKCsJa3oZVhdzcXOzt7Xn77bcB6N27N+PHj2fSpEkEBQfRq1cvYmNj2bx5s62+qkyaNAlPT0+mTp3KY489RnZ2NlarFbPZzPvvv8+CBQvw8/Nj/vz5VV5fWlqKvb0933zzDa+++io5OTkUFxezcOFCfvvtNzw8PIiNjT2fj1YuAx36dCL2+xfodc/1GFuEV+qWPLhlP5uWrmdg7B22bsmi/EI2LP4bgNU/ryR5Z1KFa9y83EjakVjhPYvZUuH/AMWFxZh2HL/2+J+VWW9PZ8mUBUQPuIo3lo3H2CKchB//tP1ZcnB2BEPZlPPQZkY6Xt+Ffo/2Z9Bzd+Ib4oel1ExAZFCFexcVFLHgi18AaNy5me0ZANvGPzkZ2QAc2LQP3xA/Rnw0kituuRKvAG8ykzOAsrUtS0tK8Q70YdLIj3ih1xi+HP0JpSUl3DLq9rL6nBwYMnY4z8wcS5eboln7yype7vcc346dTOq+E5v7NOnSjBfnvc6g5+/EwdmByc9M4vVbXyRh7p8V1q6sjxRMioiIiIiISK2xP8nMl6/4MOVNPzau+YVPPrn44WS5//73vzz99NNYrVbSM9LJzc3Fx8eHxx9/3HaOr68viYmJNGrUiNzcXHr06GE7ZjQaOXjwIABpaWmMGjXqlPfy8vLCYikLaQoKCli8eDH//ve/eeihh1izZg2///4748ePx8/P75RjrF69mkceeQSAv/76i48//hhPz7IprHZ2dowcOZKxY8cC0K9fP0pLSyuNERgYiMViITk5mbFjx+Lh4YGjoyPXXXcdBQUFdOrUifHjx3Ps2LGz/RjlMlJaXPln4mRHU8q6bRt1agqUrRH5dezn5KRn0+3W7mAtCxNPHsfL35vkHRXDSjt7O9x9PCjMLbC999fsFUz4V1kwX1xYDJStedl90NUMe+0+nN1c6NT3CvKz80nZcwgA3xA/jqVlVVmrvaM9nn5eHNi0j9yjOWX15hbw1ahPbM9Rcvw+5f9mkJeVC2Ab08nFiUc/HYWHrycGg4Godo1J3Vt27/UL19KuV0eejXuJ4W8+QPSAq+j7yM08F/cSrbq3qVBLeMtIhr9xP68tGUffh/uzedkG3rh1LHP+O9N2jqOzIz3v7sPzs1/hqa+fpkGwH9+OncyrN/8fh3aZTvt9qcsUTIqIiIiIiEit8X8PuoIVsMJn//Fm5R8/88H7H1ySe40fP553333X9rW7uztHjx6tcM7hw4cB2LFjxynH6du3LwAffFB1nSEhIbYOsLy8PFxcXCqdc/XVV/Pwww+TkZFxyvtcccUVvPnmmzzzzDPExMRUec6rr75qe71o4aJKx8ufLzQ0tMrrV6xYAcBzzz13yjrkMnS8MzInM+e0p0W2jcLByYFJT37I509+yEvXP8OOv7Yy7I37Gf7G/QyIvYPda3by0UPjyUwp+1n1DvShtKRy4Nkg2JdDu5OxWq0U5hawcs4fGFuEA1BwPLAMaRLGoOfutF3TvncnDAYDu9eU/XkztggnIzmdkqKSKuu98dFbyD2aw7g732Dqs1/w5oCX2LFqO11vjgZgzc8JwIlOyfIuxoKcfAD63N+X4EYhtvHCW0awd/0ezCVm8rPzKC0qwd7Bnm63XMmw1+7j5icHENIkrEINS6YuYNLIj0g3HcHNy41+j/bn1UXvcu3w61g6dSGr5v1FSVEJb93+EiumL8VqsdL0ihY88cW/if3+BSwWC5NGfnTa70tdps1vREREREREpFbavBsyjpaya+sSJkywMHr0qTsSz8e33357xnMcHR1PO0Ub4JFHHrFNnS4tLcXBoeKv2qmpqQA0bNgQNze3U47zwQcfMGnSpNPe6/nnnz9jzeUWLlpIvxv7VXnMYrFgZ1e5V8nd3f2MzyuXHxd3F6I6NKFl91OvXQplYeK97zzMnP/OZOfKbRhbRtB/5ECadm0OQK97ric/J58Fk35m1lvTefiDJ+hwXWdb8HeyqA6NWTF9Kf+57hlyj+ZgNlt48osxQNkmPPYO9gx/437bRjgAQVHBdOrb1dZB
2OWmaDYtXU9xYXGV9+gx9FrcfTyYO/4HNi5ZR6sebbnvnuuJaBOF1WIl5Xj3o09QA/yNAaxfuJbe995A22s7MPDpIVw1pGeF8WIG9mDl3D8pzCugUYcmbP9rK7vX7LQ9f1W8A3zYvHwDW1Zsoknnpng08MRitpC45QBQNm3c3t4OMPDDG9+xZMoCjC3CcXByJPdoDseOHMPV3YWSopIqn7GuM1j1N46IiIiIiIicg8jISIxGI/Hx8ed1fWFhIUMH3cyPHwad9ryn3s7lvec8sTdYefyNXJq06sno0aPP657l5s+fT79+ZWHdufw6XFpayn/+8x82bNiAwWAgJiaGZ599FkdHR3bv3k2zZmVr2eXk5ODh4WG77tNPP+Oxxx4FID09/bRTtYEKa/6drr74+HjGjx9Pfn4+np6ePPfcc3Tu3LnCGDfddBM///xzleMHBgayf//+0wal52vIkCEkJCQQHx+P0Wi86OPLqUVGRuIb6s+r33VpMwAAIABJREFUC9+5pPcpyMnHxd31tLtKZx0+yocPjqMgt4AOfTpzzV29CYoKth3Py8rF3cej0nXZ6cfIPJRBw3aNLmrNRxLTyE4/RuPj09TP5MCmfUz419s4uToz8Ok76HhDVxydHEnelcTfv65m45J1dLk5mpufuI3UvYdY8MUv7N+4j6zDmTg4OeJv9Cf6tqvoMeRa7B3tMZeYWTn3DxLmxnMkKY2i/ELcvN1p3q0lve+5AWPLiAr3f/++d9m9Zmed/3OkjkkRERERERGpld5/zgOwghU+fsGDx99YzttvFfHc89U7zXj48OEVuiu7dOnC559/ztixY+nRowdTpkyxHfvnuo5vvfWm7bWXl9cF15KYmEjLli3Jzy+biurn50ezZs3o0qULTk5O7Ny5E19fXzIzM8lIr7w794ABA5g7dy5paWm4u7vj6+vLyJEjeemlly64tvrCZDIxZswYoqOjiY6OPuW0+rrM1fPMgbZPUAPG/u+NUx6vKpSEsjUrvfy9z7u2UwmICCQgIvCsz2/YrhGPfDySqc9+wXf/mcJ3/5mCwWCw/YNBg2BfmnYp66QMbhzKPW8/dNrx7B3tueqOnlx1R8/zfoa6SMGkiIiIiIiI1Hp5Rfas3ZyNnd1KXnvtdcaOfbFa7nvDDTewcOFCAKZPn87QoUNtx4qLi2nevDlXX3217b3yDW7KlZScWBuvqqnT5yI1NZXIyEigbOOdXbt24erqajs+Z84coqKicHIqmxqbnVN5A5s5c+YwYMAAfvzxRwAyMzN5+eWXefnll3F1deX111/n3//+9wXVWdclJSWRkJBAQkLZ+oVGo5FBgwYxePDgOt3ZVh+16t6G1xb/l/UL1nB4fyrmUjMBEYE0u6IFgQ2DzzyAnJE2vxEREREREZFa4+bHj3Aos/Kvqo+9ls2SyUY+fN6DI6Y1vPLKq1VcfXHt2LHDFkq++uqrFUJJACcnJ/bv309mZuXOxHIXc/W0li1b2l4nJiZWCCUBBg4cyJdffklxcdlOxGazucpx5s6dS1ZWFsOHD6/wfkFBAWPGjLEFm1K1mJgY4uPjGTVqFEajEZPJxMSJE+nevTtDhgwhLi6upkuUi8jJxYlut3bnllG3MyD2Dq66o6dCyYtIwaSIiIiIiIjUGgVFBu4YmUjK0Yq/rk59zR0PFwtY4YPnPchKXXvJpx+//PIrttdjx4495Xnl4WVVTg75/tlNeS4KCwvJysoCYOTIkRXWojzZAw88UGnznap4e3vzzTffYLVaSUtLq7B2Z0lJCc7Ozudda31gNBoZPXo08fHxFULKhIQEYmNj6d69O7GxsQopRc5AwaSIiIiIiIjUGmazmYWLfmPAY/tJzjjFxhp2sG5bHmlJ63ju2Uu33uSsWWcXKoWGhp7y2HPPnagvPT39vGvZtm277fU/Ox3/6VzXsgwICOC9997DYrHQvHnZmnnFxcUUFRWde6H1UHlIOXPmTGbMmMHgwYMxmUzExcUppPwHrz+34vvz6pouQ2oRBZMiIiIiIiJSq7i5ubF8+WIGPXGAQ5mVw8k7YzOZ8k4wn/7Hi2PpO3n66acvSR0n7659vh599FHb69atW5/23NMFgW5uJ6Ztn7xu5bn48ssv+eijj055H4PBwPbtJwLQrVu3ntd96iuj0UhMTAzjxo0jPj6ecePGnTKkXLlyZU2XWyM8NuzDf/afNV2G1CIKJkVERERERKTWcXFxYfnvS7j98QOkHK0YTk5/15eoIMAKn451ozRvD0899dRFr+Ghh07ssvvP3bZP9swzz5zymMFgwNfXF4CjR4+edjr33XcNO+Wx8k5GgIkTJ57yPOCUa16+++67PPnkkxV2GK+q3qruKefGaDQyePDgU4aUQ4cOrZ8hpcWKfW5BTVchtYiCSREREREREamVnJ2dWfHHMgY+foDE9JPCyZP3k7HCll05HMs4wJNPPnlR7//aa6/ZXoeHh1d5zoIFC5k9e/Zpxzl5Cre9vX2V53z//ffMnjPLFmL+k8Fg4JZbbgHghx9+YOfOnVWe5+7ufso6Nm3aBMCDDz5IXl5eledkZ2ef1Vhy9qoKKaOjoyuFlBMmTKg1IaVz4hFCPpyHU9KRizquwWIGg6IoOeHMK+KKiIiIiIiI1BBHR0f++GM5V/foycwPIgn3r3j8zmeP8sKIIHp2gSfeNPHYY4/zyScfX5R7u7i4MH/+fPr27Utqairu7u5s2bKFqKgoiouL6dKlC5s3b2bu3LkMGDDglOMYDAZycnLw9PS0fe3n58f1119PcXExv/32G/n5+fz444/cdtttpxxn7ty5tmCzRYsWDB8+nG+++QaA2bNnM3ToUPz8/HBzcyc9vXKg5OLiwpo1a+jatSseHh707duXX3/91dYl+fnnnzNixAgADh8+fH4f2kVmMpnOeE5SUtIFjXE290hOTr6gGs50n/KdvaEsyIyPjz/jeJdSyKRf8fpjKwGz/qTE34uiiECyr25DVu8OlHq5nfe4BrMVs6t2fZcTFEyKiIiIiIhIrebg4MCf8Su45upezHjfSJjviZbJ6e80KHthhY+ed+OxNw4xYsQIPvvss4ty7xtuuIG8vDwCAwPJy8ujUaNGtmMvvfQS69ev5+DBg2ccx8PDA4vFwosvjuXNN98gIyOD6dOn257v8OHDBAYGnnYMOzs7rFYr/333HZ559jmmTZvGtGnTAGjQoAEHDx4kNDQUb29vAAoKKk+Z7dKlCyUlJTz44INMnToVO7uK3WsRERHs3r27wm7il9qQIUNISEiotvvVdjExMTVdAgDFIb4URgVhcXXGOfEIoe//RMhH/yMnpgUZt8WQ06XZuQ9aasbqqh3f5QQFkyIiIiIiIlLr2dnZ8fuKpfS8phffTwjB6Ff1dNA9ibn4NTAzfPhwW2h3sr59+2K1Wqu48tTc3NzIzc0FwGKxYDAYKqzF2KhRo7Ma02Aw8MYbr/PGG69jtVqxWq2VgsFyp9tZ++lnnuXpZ5613fPkWgCOHTt22jocHByYMmUKU6ZMsdXxz2eqTuHh4WfsWjQajWcc43T
CwsLOWMeZ7nGm42dTh9FotE3hPnmtUKPRyKBBgxg8ePBZ3edSCpi5Aq8/tnLg7fvIjmlpe9858Qh+c+LxXbAOrz+2kvav3qQ+cMM5jW0oKcXs6nixS5bLmIJJERERERERqTVKSix0v/KqUx53cHBk2JjDLPgiDGfHihvJjJ9WytWdG/Diw4489U4ed911F99///1Fre9UQeK5OlMQGBAQcFZjXOo6qsO4ceNq9P7VoTyMnDVrVoUQ1mg0MmrUKAYPHlyD1VVkdSj7GfdYt7dCMFkUEcChUbeR+nA/GsV+SeA3S8hvGUH2lS1PNVQlhlIzFpdz78Y1lJixOla9Pqtc3hRMioiIiIiISK0R/23IWZ5ZeXfrMcNP/Ir7/rOuPPlWPoMHDyYuLu4iVVd9arprTi6O03VHjh49ugYrO7VS37JuXbO7M+7r9+Kxfi8Zt8VQ6nt8jVSrFbOHKwB2+YUVrjVYrHgv3YDrnhSKQn3Jur5ThSDSrrAEi7tLlfd1SsmkwYJ1YLGQE9OS/JZl3afuG/bRaMwXJL14J1nXtqt0XeA3S7C4OZM+6NT/oHE2DGYLVvvK//BgV1SC+4a9AOR2bILV6cTfM27bk7DLLSC3a8Vp7Y5HjmGfV0hhw6ALqqk+UDApIiIiIiIidZKHuwNpOzMZOGAQc+bOqtFazGYzEydOZMyYMac8p7S01Pb6u+++q46y5BKo7VO1z+j4z6HF2ZGAH1bg9dd2Amb8TrHRH0OpGaeUTAzFpWT078axXh1slzmlZNLw/6bgsi/V9l5A3B/s/vxJLG5lYaR9QRGlAd6Vbhk4bSlBUxdhKDEDEDR1MabY28ns3w2n1EwMpWY8V+2oFEw6J6YRNHkhmTddYXvPc+0uGvy8GofsfEp9PDgy+CoKWkZUuM5gsRLw/TKO9u1Cib8XoR/Ow3/Wn6QN60XqQ31t57nuSibile9wNqUDUOLvxf7xD9kCx4iXv8Vghe0/PH9ibLOFJo99jGNaFlt/fRXzKYJYKaM92kVERERERKTO2bATVm04yu9TggkPMjN40JAaq8VqteLg4EBsbCx5eXmnPG/kyJEABAcHn9WaiFJ7mEwmJkyYwJAhQ+jevXuFHbZHjRpFfHw8o0ePrv2hJGB3PBwEyLg1BouzI3ZFJbjsTcHqaE/Wte3ZN+FhkmNvx2pXtgyAfVYejUZPwtmUzqGnbmX7nLGkDeuFc+IRvP7abhvPUFCM2a3i5jcBM1cQ/OV88to2ZM/nT7Lrq9GUBHoTMON3AI71bEeprycuB9Iq1Ro0ZTFWFyfS/tUbgMDvlxM15kvc9qRQ6uOB1dGBqOen4L5xf8VnzC8k+Iv5eP25Fb+5f+E/68+y679bhvOhDADctiXS+MlPcUrJJPPWaDIGxOCYnk3glEVlz1JcimNaVqWw1H3dHhzTsshrH6VQ8iyoY1JERERERETqnA7NYenXZV1N7z/rxjPj8xg65C5mzLy4a06eDYPBQPv27dm4cSMeHh6sW7eOjh072o6bzWaeffZZPv30U+zt7UlJSan2GuX8XPbdkVUwFB/v3LVYyYluwZ5Pn6DhC1NxSskk65p2thDwZEFTF+OUksnRfl3I7dgYu8JinBPLgsSTd+G2LyiqMLXb8cgxgr+cj9nLjfTbr6LE3xu3rQexzy2i1OhRVoaLE0f7dcF3zl9gtcLxNVE91u3FZ+lGkscMpCTQB/+4Pwj+/FfShvci9cGyrkfv3zfTYP5ajG//wK5psVgdjq9TeXyvqgYL/sZ1RxL5LcPJuC2G8Ld+wHPldopuv4rw16djsFjY9/4I8to2BMAutxCrU9nmPQ5ZeRgsVopC/Sp8Fr6/rQUg5ZEbz/dbUK8omBQREREREZGaUY17rrwb687zE3K5+67hfPd95d26L7UNGzaQnp5OkyZN6NSpU6XjBoOBzp07s3bt2mqvTc7dhAkTKm1kM2rUKIxGY63ayOZ8GI7v9m5fWAxAYeMQdn/xFJGvfEfwVwtwNqVjir3dttaiY1oWfj+vorBhED4L19HgtxM/w7mdGlfYQMeusATLSWs0Bn63DErNlPh70fCFqbb3rY72pDzcz/Z1drcWBHy3DPcN+8jr2BiHY3mEv/0DuR0bk3HzFRgsVoK/Xkh2j9a2UBKw1eJ8KAO//60ifcCVADgczQXKuiKLQ3w58NZ9lPq4EzzpN5xSjuKyPxXn5AzSB11lCyUBkl6880SNxzcJsrOcWO/WbVsiPks3cqxXe/JbR57T515fKZgUERERERGRamUwGMjOg+53H6r2ezs75fHll1/x4IMPVPu9/f39ycrKAmDr1q1kZWXh5+dHixYtqr0WOX8mk6nCVO3avJHN+Sj1KetUdEw9anvP7OnK/ncfIGjSbwROX45TSib737kPi5sLnit3YCguJemluzE72uP76xocjuWT17YhWdd1tE33BsBqxSE73/al9++byb6qNYkvD8Nn8XrcN+yj1N+LrF7tKYwKtp2X3yaSUl9Pgr9eSNL/DSH8nTjssvMxvf8IGAzY5+Rjl19EXtsooGwNycCpi/FauZ3sHq1xX7+PoCmLONa9FSWBPjimZwNl62juf+d+ShuUPXNBq0hcDhzGcryz0nVvCobi0gob3tg+Ew9XzK7O+M5LIK91JA7ZeYS/PgOsVtKGV+4qlaopmBQREREREZFq5ezszNJli2u6jBrVunXrmi7hkjm5i7AuMhqNjBs3jpiYmMtyqvaZlASU7cptX1Bc4X2rnYHUETdS2DiY8Ld+IOLV6Rx4+z4ccsqCRkNxCcWNgkk9zRRmi5MDzgdPrBVpn5OPXXEpVns7jt7QmaM3dK7yOqu9HYfvv56wcbNpMfRtMBhIHHsnxSG+AJR6ulLYMIigrxbgvvkALnsO2daGTB41ALctB2g05guaPP4xiS/ciX1uAQBH7upJUWSg7T4FTUMJmLGCkmBfcq5ojufqnTQd8SHpt8WQ37Zh2aY3x6eSW50cOHL3tQR/OZ+mD71vG6OwYRAFjYKRs6PNb0REREREROScGI3GOh8+yYWri6Fduct1/cizURTqh9nVmZzOTas8nnVdJ5LG3oVzctkmMXltGgIQ/OUCDBbracfObxeF59pd2OUXln3dpiEeq3fivmHfGevK6N+NjAFXUurrSdILQ8nqfWJHcAwGkl66m8KoYDxW78Tq6EBy7O2Y/j0Qq52BvHZRJL48DPvcQiJf+Y7cK5qR8tjNHLmzZ4V7ZN50BaXebtgXFHHgzXtJH3AlzklHMI6fQ7N736N1/5eJevorAqcvByBteC+SnruDnC5Nye3UBIBjPdue8VnkBIPVaj39T42IiIiIiIjISYYMGUJCQgLx8fF1NpyR89e9e3dMJhMHDx6s6VLqncjISHxD/Xl14TsXNI7BYq04BbsqJ21E0/C5yWXTpmNakjriRooiArE/lofnqh00WLgOl32p7Jz+LJRacN98gJzoFljtDGU7Xz/xCRZXZ1Ieu4msa9tjdXTAdV8KPos34PXHFo5e34nD919/Qc9Tzq6oBKvBUOXU7FNxOJqLx9+7cduWhO
tOE04pGZQE+rD348ex2p/o92vw21rC3/6BHd8+TXF4wAXX+v5977J7zc46//espnKLiIiIiIjIeUlKSqrTvzDL+TGZTPq5uMydMZQEWygJkPjKMIxvzcRn2Sa8Vm4vO3a8D87qaM+xq9ticXTA6mpP9pUnNsPJbxXB/nfuJ+LV7zG+Owvju7MqXFsS6ENe+0YX7bkszo7nfE1pAw+y+nQkq0/H057nHb+V/FYRFyWUrE8UTIqIiIiIiMg5iY6OJiEhoabLkFooLi4OgJiYmBquRKqTxdmRxJeHkTYsBa+E7dgfzcXs40FB01DyOjTC4uJ0ymtzuzZjx6wX8F62EZfEI2C2UBTmR16HRhRFBJ7yutrEPisPz4QdHHrilpou5bKjYFJERERERETOSXR0NACzZ89WACVVCgsLq+kSpAYUNgmhsEnIOV9ncXbkaN8ul6Ci6tFg0ToAsnq1r+FKLj/a/EZERERERETOSXh4OAArV66s4Uqktpk1axZQtze+EfmnBr+uIadLM8xebjVdymVHwaSIiIiIiIicE6PRSHR0NCaTyTZ1V2TlypW2Kf6DBw+u4WpEqofrrmRc96WS17ZhTZdyWVIwKSIiIiIiIuds1KhRAEycOLGGK5HaovxnofxnQy5/VquVLb9v5EhiWk2XUmvZH8sDoDjMr4YruTwpmBQREREREZFzFh4ebuuajI2NrelypIZNmDDB1i05evToGq5GLpak7Yl89vgHvHbLi3w15lNW/RRP9pFjNV1WrZLfLorUh/qSfWWrmi7lsqTNb0REREREROScGY1Gxo8fT/fu3YmLiyMsLEyBVD0VFxdn65acMWNGDVcjF1NIoxCCooI5vD+V9QvWsn7BWgBCmxnp0KcTVw/thYevZ7XXtWPlVlL3pnDN3b0xGAxndU1xYTFLJi+gw3WdCGly8TZnsjg7kjas10Ubr75Rx6SIiIiIiIicF6PRyLhx44CyabxDhgzBZDLVcFVSXcq7Zcs7ZkeNGqVd2usYRxcnet/bF4DON3bjoYmPc/WdvSgpLObXT+bxYp9Yfv5wLlartVrrWjDpF2a9PZ3fPp131tcc3LyfXz7+kY8efo9jaVmXsDo5F+qYFBERERERkfM2ePBgjEYjsbGxJCQkMGTIEGJiYrj99tsVUtVR5Zsenby+6Lhx47ThTR1VkJMPwHX39cXYMoL2fToBcDQ1kz9mLGPBF7+QnnSE4W8+gL2DfbXUdO87D7N06kJ2rd7BDQ/ffFb3bdq1OQ+Mf5TVP69k8/INXHVHz0tfqJyRgkkRERERERG5IDExMcycOdMWVsXFxREXF4fRaMRoNBIeHk5YWBhGo7GmS5VzdHIHbEJCgm0dyXLR0dHqlKzjDu1OxsnFidBmFf/8Ngj2pcfQa9m/cS9rf11Ft9u60/LK1uRn57Poq185tMuET1ADoto3puvNMdg7VgwP96zdxbb4Lbh6uNLlxm40CPGtdO/D+1M5tMtEWHMjgQ2Dbe97B/ow4Ok7Kp2/f+NeNiz8m52rt3PkwGEi2jTkqcnP2I53vKELHW/ocqEfCeZSc7WFsHWdgkkRERERERG5YEajkdGjRzN48GDi4uJsIZbJZKoUZsnlzWg0MmjQIFu3rNRth3aZsHOwZ867M3H38SAn8xiH96eSui/FNiW6c78raH5FC8wlZj4ZMYEDm/bZro+ftYKl0xbxwHuPEtQwmJKiEqY+9wUbFv1tO2f+Z/9j9LTnMbYIB8BSambe+3NYMmWBbZr4tcOv4/Znh1ZZ4561u/juP5Ntu4d7BXgT0TYKf2PAGZ9v/4Y9/PLxPPat342jsyNhLcLpcce1tgDTYraw+Ovf6HZrd7wDfZj9zgyWTVvE9Q/dxC1PDTyPT1ROpmBSRERERERELprygBLKuu2SkpIqdN1pDcrLT3n4aDQa1RlZz1jMFlL3HaKkqITl3y0GwMXdhYCIIJp0bkZEm4Y07tiUyDZRGOwMrP11FQc27aN9n04M/c+/cPd2Z++63cz//H98NfoTnp/zCl+P+ZTNyzdy9Z3X0vveviTvTGLSyI/4Y8ZS7nz5HqwWK5OfncT6BWsJaRJG575dWbdgDcumLaL74GsIbhQCwLdjJ9NvRH/8wvz5a/YKWyj5yEcjaXNNuyo3xdm1agf7Nuyh7yM3YzFb+P7lqaz6MR6r1UpkmygiWkeyb8NevhrzKd1XXs2g5++itLiEee/PwdXTDavVyrJpiwBY9OWvXDnwKvzDA6vpu1E3KZgUERERERGRS6J8KreIXJ6OJKZRUlRC6x5tueqOnoS3isQnqMEpz9+8fCP2jvYMeXEYnsd3627atTlNuzYHYOPidWxevpHINlF0vSkaewd7Du0q+8cKZzcXANb+uor1C9bSvncn7h8/AnsHe9pe24GPHhqPnd2JsHH9grX4h/nTd0R/+j81kKOpmexes5Np//clfe7ryzXD+uDs6lyhvqQdB5n/+f+4/oF+FBeVkDD3TwD63NeXW0cPwnB8/K0rNvF17OdYzBYGPD0EgFXz/uLglv00bNeIHkN6Mu2Fr9myYhM97+5zMT7qekvBpIiIiIiIiIiIVJJ28DAA19zdm1ZXtT3j+dlHsnB2dcbL37vSMavVyi8f/4hXgDcZh9IZP+wt2zHvQB963XM9AJuWrgdg2Ov32dZxDGsezlsrJlYYz83LjaQdiUDZepdPTX6GjUvWE//Dcv734VyWTF3INXf15uo7r8Wjgefxa9wpLS4ldV+Kbc3M8JaR3Dam4sZNra9uR5ue7Vm/6G/63N8PgAOb9uEX5s+Ij0bi3sCDnybOJjM544yfiZyegkkREREREREREakkNzMbgNLi0rM6Pz87j5KikiqPpR04zKHdyQz+v7voenMMf81eQcqeQ4Q1M9K1f4ytw7I8jNy/ce9pw1Avf2+SdyRVeK997460792RtIOHWfTVbyz44meWTJ7P3a/dR6e+XW2BqWlnEqHNjDg6O+Li4VJp7JQ9yWxdsYlGHZrY1tF0cnHi0U9H4XG8zqh2jUnde+isPhc5NQWTIiIiIiIiIiJSSU5GWTCZnnTkrM43l1ooKSqhMLcAFw/XCsfys/MAKCkqwc3LjT739a1yjKvu6MmGxX/z2eMfcPXQXrTq0YZGHZpUGs870IdjR7JsX09++nM8/bzoP3IAgZFB3P3qvdz46C1Me/Erpj7/BeGtIvEO8AFOBK3Rt3Xnj5nLmfLMJJoe37xn7/rdbFi0ltAmRu4fP4KdK7cB0Of+vrb1LQHCW0aw6Ov5mEvMlXYcl7OnYFJERERERERERCrx9PMCwM/of1bnN2zXiAzTEdtO2icLaRyKi4cry79dTLdbrrSN/U9NujTjiUljmPn6tyz/bjHLv1uMwc5ASONQIts2oufdvQlrHk6H6zrj6Oxou87Ny43l3y5m9by/iOrQBGc3Z0oKi0naloi5xEzesVzCW0ZibBmBsWUEALfF3oHBzo4tyzey9tdVODo7YmwZybDX7qdz367YHV/fcuDTQ7hqSM8KdcYM7MHKuX9SmFeAu4/HWX0+UpnBWtVPi
4iIiIiIiIiIXFYiIyPxDfXn1YXvXJTxLGYLOxK20ap7m7M6v6igiEM7k4jq0KTK40unLmTOf2cSFBXMHS8Mo3GnpphLS9m7bjer561k1+od/OutB2gR0xqr1cr+DXvZt343+zft4/C+FLLTj3HnS/fQ8YYuVY6/LX4Ly6YuJGXvIbIzsnHzdCOsRThXD/1/9u48LKry/eP4e4ABBBFBFGVR3BVXzAXcVzT3Bco0Nf2mmUthufRLM5f65pqaaRl+1dRyQTO1TExcUgTDNTVTc4NRURZBAYFhZn5/DHMEwV0ZlPt1XV7MOXPmnIdhkfnM/Tx3W+q183ng2E3xWH7dvM1hweBZnIs6Q3h4+EvdREwqJoUQQgghhBBCCCFEHhaWFo8cSgLYFLO5bygJ0HaQP1mZWn5ZtJmFb89BpVLlCgQr1KmIa8VyynYlnypU8rn/+e7l3az2Y403p8ISSBY1EkwKIYQQQgghhBBmtGPHDhITE+nbt6+5hyLEc+c/tAu+PZtzZEcUN68morZVU7ayGzV8vZXGMqLokGBSCCGEEEIIIYQwo4MHD7Jv3z4JJkWRUaK0I637tzf3MEQhYGHuAQghhBBCCCGEEDklJyebewgFSqfTcfv2bXMPQwghCpwEk0IIIYQQQgghnsr58+eZOnUqFy5ceOpzzZo1i/r163Py5MlnMLIXg06nk/XthBBFkgSTQgghhBBCCCGeysyZM1m2bBlt2rShcePG9O3bl++//56bN28+1nkMBgNbtmxBr9c9KIrIAAAgAElEQVSTnp7+0OPXr1/PpEmTSEhIeNKhFwo6nQ47O7sCu17Lli355ZdfCux6QghxPxJMCiGEEEIIIYR4auXLl6d9+/b4+vpy+/ZtPv30Uxo2bMiwYcPYt2/fI50jPDycmJgYAJycnO573IkTJ+jTpw/jxo1j1apV7N2796nHn5CQwMqVKxkyZAhNmjShevXqdOrUidOnTz/1uR9Gq9Vib2//3K9jEhsbS1hYWIFdT4jCbPs+S85H343H9HpYvdWKWylmHFQRIs1vhBBCCCGEEEI8seDgYEJDQ1m2bBnt2rVT9p8/f57vv/+eDRs2EBoayujRoxk7duwDz7Vy5UrltqOjY577dTods2bN4rvvvkOv1wPQq1cvunXr9kRjnzBhAsePHyc1NZWYmBhUKhVeXl54e3tjaWlJREQEISEhTJ48+YnO/6i0Wm2BVkzqdDouXrzIsWPHiImJ4c6dO5QsWZIOHTrIlHJR5Bw6ZcnNW3oqlzf+TsnQqth3yBLvynoaeOvNPLqXnwSTQgghhBBCCCGemJWV8WVlRERErmCycuXKTJs2jfHjxzNgwAAWLlxI/fr1ad8+/068u3fvJjQ0FHt7e1JTU3FwcMh1f1JSEiNGjCA8PJyaNWsycuRIfHx88PDweOKxHzhwgOjoaMqVK8fUqVPp1q0bzs7Oyv0xMTFYWloq21qtFrVa/cTXu5/nHUymp6cTEhLC5cuXuXz5MllZWRw9epQePXoox7i7u9OkSZN8A2EhXmZ6A6Rn3N02GIwf09MlpC8IEkwKIYQQQgghhHhiZcqUAcDe3p7IyEjCw8MZOHAgpUuXBozrRpYoUQKA1NTUfM+RmprKxIkTqV27Nn5+fgQHB+cKAC9cuMDgwYOJjo7mk08+4T//+c8zqexr2rQpMTEx7Nq1K99g0NPTU7l9+/ZtWrZsSdeuXZk+fXqeY3fs2MHOnTuZNWvWY48jLS0tTxD7LC1fvpwZM2bk2ufs7MygQYOoX78+tWrVUr5ejys0NJQdO3aQlJREnTp16NixIzVr1nwWwxYvsNh4FXujLGndWIdrKYO5h/NABj25fp/o9dnjlcUPC4QEk0IIIYQQQgghnphWqwXAzs6O4OBgdu7cSXBwMF5eXmi1WmJiYsjIyKBfv373nXI9YcIEEhISWL58OVu2bAHAwsKYCty5c4e3334bjUbDkiVLyMzMZODAgfj7+9OuXTvc3NyeeOzOzs7Y29sroaRWq+XPP//k2LFjnDlzhtu3b+Pg4MDgwYMpXbo0iYmJ7N69O895srKymDZtWp4wdcmSJcTExODo6Ej79u3p06dPvuNIS0ujXLlyyvb27dv5+++/ef/993NVbIJxLcxSpUrl2rdv3z7WrFlDUlISLi4uDBkyhPr16yv3t23blhMnTuDt7Y2Pjw9jxoyhcuXKBAUF5Tseg8HATz/9RHBwMOfPn6dkyZL4+vryzjvvULt2beW40NBQhg8frkyr37lzJwsXLiQoKIhRo0bJtPAi7Oedlhw9bcmuSEscHQyUdYEG3joa1dZhX3CrFjwSnR5srA25tgFsnn1xtMiHBJNPYcqUKbm2bUuX53qqnthUPbEpudchKFa6PACu9sb/XMsWtyA9Lvrutv39o/hLly7l2Td83OQHPkYIIYQQQgghCkJmZqZy+8033yQ8PJw7d+5w+vRpatasSdeuXQkMDMTPzy/fxy9fvpytW7cyc+ZMqlevjkqlUqaHg7Hj9/nz55k3bx7+/v589NFH/PHHH/zxxx9MmjQJLy8vKlWqhJeXFy1atKBp06bY2to+0tjj4+NJSUmhV69eyjqTaWlp2NvbU7FiRezt7UlOTiY0NJSPPvqI9u3bExYWRlpaWq4Ky/Xr1xMTE8PSpUsB2L9/P8OGDcPGxoZGjRrh5ubGJ598QkpKCoMGDcozDtM1TebPn8/p06d56623ck0tj4+Pp3HjxmzYsIEGDRoA8M033zBjxgwqVapEnTp1UKvVDBkyhG+++YYmTZoAUL16dRYvXqycx93d/b7Vq7GxsYwaNYqoqCgA/P39cXR0ZN++fWzfvp3JkyczYMAADAYDM2bMwMLCglWrVuHr60tSUhJr1qxh1apVeHp60rNnz0f6Oohnx8PDA41GY+5hAFDayUC5MnpsrOF6vAVrt1mxfrsVdavpaNVYR81KhaOSUq8Da+uc28ZAPWdYaQ4JVxIAnmq5iheBBJNPaPDgwaz9dRdlW7yRY+9Z0uOjcx2XHheNbXYoadq+V877lX0uefcBJJ3eT9LpcGYuXs6myLN0qmTzZJ+AEEIIIYQQQjwDGRnGxdl0Oh1t2rRh06ZNDBs2jOjoaLp06cLo0aPv+9jjx4/z2Wef0aNHD/r27UtKSgqXLl3CYDCwevVqunXrxrp166hVqxa9e/cGYMSIEdjb27N//37++ecfLl26pEzHXrZsGTVq1GDDhg2PNDU6Otr4+uzIkSO4urrSv39/2rdvT8OGDXOFoyYDBw5k586d/Pvvv9StWxcwVjDOnj2bHj160KFDB44ePcpbb72Fn58fS5Yswc7OjsTERJYuXcq0adNo0aIFlSpVynXe1NRUJZjMysri3LlzVKpUKVcoCcYKRZ1OR1JSEgBLly5lxowZuRoL/fbbb2zYsIFx48axa9eufD8Pa2tr4uPj831OTpw4QVRUFNbW1ixYsIDOnTsr4/ruu++YNGkSdnZ2+Pj4KFPsmzdvDoCLiwujR49+4NdcvPx+P2CslhzZX0vdaneLtmLjVew+aEnkMeP9XVrp
6N42y4wjNcrSq7BVG3JsGz+aO5gsKiSYfAIrVqxgxYoVtPkh0QxXn8DFjTO59NNMalhcA7zMMAYhhBBCCCGEMDJN401LSwOgZs2a/PLLL4waNYo5c+Zw8eJFZsyYgXXOkiSMgd7w4cPJysoiKioKPz8/YmNjlfNt3LiRBg0akJaWRvHixZXHlS9fnk8++QSdTsdrr73GoUOHOHv2LHFxccyYMYOff/6Z7du3ExgY+NCxJyQk0KhRI0aNGkXz5s3zDfFy8vX1xdbWlrCwMOrWrYvBYGDcuHGoVCplRt2CBQtwdXVl0aJFSlXlhg0bAGO4N2vWLL799ttc501LS1OeH4PBQFZWFhUrVsxzzNdffw1A48aN0el0zJs3j44dO+bqdh4SEgLA5cuX+eGHH/Kt0NTr9crzbHLkyBF8fHyUQHfQoEFKKAnGJkcjRoxg2bJlbN++HXd3dwAqVKjwwOdMFCxTxWTClXhKubuYZQym1QfOXrDIFUyWdTHwRpcserbX8dUqNb/utcTL3UDd6jqzjNMkK8uAtU3ObeNHG+v8jy8oiVfzf/PgZSPB5BPy6j3BbNcu1/INLv00Ey8vL7ONQQghhBBCCCEAZb3DK1euKPscHR1ZsWKFEsLFxMSwYsUKpSowOTmZIUOGEBsbi5WVFY6OjjRo0IDatWtz/Phx1q5dy+TJk/H29qZy5cpERUXx+++/06FDB+Ua6enp6PV6rK2tUalUlCtXjiZNmvDzzz8rzXYeRqPR0Lx5c5o2bcqtW7dITU3lxo0b3Lx5E71ej5OTE05OTlSsWBFLS0tsbGxo27YtK1eupHPnzmzYsIGwsDAWLVqkVDdevXqV6tWrK2MICwtj7ty5eHt7o1Kp+O2339i7dy+tWrVSxmEwGLh58yYAarUaFxcXjh49SmJiIs7Ozty+fZuRI0dy9epV5XPXarWkpKTQqFEjwFixunDhQsLCwujYsSMREREsWLAAf3//XOtXgrGy8cKFC8r26dOn6dWrF1u3bsXGxpjQ5AyDTTZt2kRcXBwNGjQgOTlZGYsofBKvJpgtmLQvZqw0dHAwsOhHK/46Y4m12kDpUsZp0/E3VWizoEVDHXVyBJdxN1V8s0bNlet31yYt62Lg/97JxDY7JLydqmL5T1ac+te4tJ2lpYEhvbNoWDt30H49XsWOA5bE31RhXwzqVNfjVy9vAGowQEqaCqfid6sjb6UYr+/oYP6KyZd9GjdIMPlE8lvzsSDlN/VbCCGEEEIIIczBFHrdu2ahpaUl//d//0fNmjX58MMPGT16NMuWLSMuLo7+/ftz6dIlli5dStu2bXM1SVGr1axdu5azZ8/i4+PDf//7X95++22GDh1K8+bN8fDw4MaNGxw4cID09HQmTJigNIgJDQ3FyspKWVvxQVJSUrhz5w7Lly9n+fLlDzx23rx5ylTyoKAgunTpgr+/PwADBgyga9euyrGNGzdm1apVDBgwgFu3bnHs2DF8fHxYsWIFWVlZ9O7dm7fffptPP/2U/v37o1KpsLW15d9//1XOERQUxKRJk+jRowcNGjQgKiqK69ev06tXLzZt2sSmTZt4++23qVq1KnPmzCEqKorTp08THR3Nm2++yfTp0zl06BBvvvkmvXv3Zt68efj6+irnd3d3JyEhgYSEBEqWLMmKFSuwtbWlcuXK2NnZUaVKFRYvXkxKSgrVq1dXmv5ERETQt29f3n33XX799VcA4uLiHvpci4Lj6elJZGSkWcegy7r789yykZ5/LliQqVVxJRY8yuppWEuPn4+O6hXvBn+3U1XMX6EmOUVF385ZNPDWs+ugJdv3WfLXGQsa19GTkgazl6m5Hq+ibjU9HmX17I2yZO02KxrWvrvW7T/nLVi8Vo3aCqpW0OFUAtb8YkVGBrRunDuczMwyhpM2OSomM439vLA148p5CVeM1ZISTAohhBBCCCGEEA9QoUIF7O3tlXUG79WzZ0+srKyYO3cuAHPnziUxMZFVq1blGyCaunHHxMQAxunTW7du5dNPPyUyMpJDhw5Rrlw5unXrxjvvvEOVKlUA+Ouvv9izZw+dOnWiZMmSDx23Wq3GycmJlJQUtFotbm5u9OrVC39/f7y9vUlLSyMrK4vixYvnaqZTvXp1Zs6cyeeff06vXr2YOHFirvOOGzeO+Ph4/vjjD4oXL86wYcP44IMPKFasGIASWk6dOpVatWrh4+ND48aN2bRpk7LW5IABA3BycuLzzz8nNDSU1q1bM3ToUOrVq4der+fcuXOoVCq+/vprxo8fz969e3Fzc+OLL76gX79+gDEgXbRokdIhe//+/crn8corrxAcHEz79u3R6/UkJSUxfvx4paI1ODiY2bNnExISQlJSEiVKlMDPz49169YpAae3tzc2NjZKhaUQJtrs7M+ghzpV9Xz0tpZv1qqJu6migbeeLq3yVi7+uteS+CQVTevrqF5RT6YWricYA05TteRPvxtDyT7+Wfg3M57DuSTsCL/buf6ixoKvfrCiupee4W9osVFDShrsjLBk/W9W1Kysx7XU3UA0e4ncXNO2MzJBpQJrScwKhDzNQgghhBBCCCGemIuLCydPnlQCxfx07dqVLl26ANCnTx/Gjx+fp7GLSZ8+fUhOTqZ169bKvooVK7Jy5coHjuPs2bMASmXjw9jY2BAeHk5mZiaZmZm4urrmuv/eNTHvHWOfPn3yvc/R0THPGpI5VahQgV27dpGenq5Ml540aRIdO3ZUwkswPmc5KzFNvvrqK+V2jRo12LJly32v1aFDBw4fPoxKpcoVIHbs2JEOHToQERFBq1at6NevX65guVKlSnzzzTeAcT3K/L62FStWJCQkJM9amMK8TGt/nov6h6qNqpt5NOJJnYs6AxgrYF92EkwKIYQQQgghhHgqDwolTUzTtU1rIj7IkCFDHnsMBoOBypUr07Zt20d+jL29vVIlWJCsrKxyreHo6OhI+/btn8u1clZ7mlhYWLB06dJHevyDvrb16tV74nGJ58NU0WoKtszBkF2QmKk1/sy7lzXw8XAtweut2LLLirgEFW92z8LUa+pmMuw7bEm50gYO/mXJgWN3KyBrVNJTu6px/cjj/1jgWU6vVEsCtHhFR4tX7m7/useSkg4w7DVjtSTAgaPG8+n08PNOK955Xascn5E9xpx9rzIyjQ18cqwwUeBMjW8eZVmKF50Ek4/AtKak6WNSUhKpmoukx0UDsuajEEIIIYQQQphbQEAAnTp1Qq1Wm3soQpiNqcIu4UqC2cZQ3M6YTCYk3d1nZ2tg9JtaNoVZsWO/JfFJFox609jU5q+zlmRlwdBALVZWEH7EgpQ0C6pU0NOkjg5TNm5paSDploqbySqcHPNvTJN4S4V7GQPFsvP4v85asGW3FZ7l9ICKI39bcOpfC2pVyW6Wk32a1LS75zAYVGRlQYYWJdwsaKavn5+fn3kGUIAkmMzh0qVLrFixgr179wKwZ8+eBx4f92feknkrO0fjR3tHbF08UTu4YGn79O/A2bp4otPq0KWnkHwmEvsKjQg9nULHmnk7pQkhhBBCCCFEUaNSqXB
wcDD3MIQwKw8PD3x9fYmMjCThSrxZOnM7Z4eG6Zm5Sw4tLKBPhyw8XPV8v0nN/0LUjOyv5U52Y3dtFri7GujdQQfkXYeyna+en363ZPq31vg3zaKal54KbsbA0qRKeWNDnAUr1dxJV3HxiopKHnpGvZmFTmdg1lJrFq9R81qnLFo21KHOTsWuxamoU814W602ni82zoIKbvp7h1EgDm4OB6T5TZEyZcoUpk6dmme/pX1Z5bYuNfaB9wNkZR+TlZZMelw0lnZO2Hn4UKL6o08nyCkz+QoAqfG3sveosPPyI+6Pb7hzR3v/BwohhBBCCCGEEKLISryaYJZg0sXJgI011KiYf6jXpK4eSwstW3YbI6lKnsYgcHOYFaPf1HK/1QM6Ns/C1sbA1t2WbNppfKzaCjzLGajorqdb2yx6ttdxO1XFqX8tsLUx4N9MR7c2Oqyzw8b3B2lZ8L2a9dutKF/OQEUPPaWdDBz920KZIl7J04ClBRz52zzBpGkafmBgYIFf2xxUBoMh//rXIsa03km7Ln05GG2NdZn6WLs+/noZutRYdCnXuXMxlMzrx9GlxqIu6Ua1UaHPdLynPqvDpsOJ9Gzg9EzPK4QQQgghhBBCiBdXSEgIY8eOpfOI7nQe0cMsYzAYVKhUD46bDIa76zgu+kHNX2ctqFNNTx//LMq6GLtpnzynIvK4JVeuq/g8KBMba2Nl5ekLFlyMUXFRY0FcogqdQcUHgzIpU+rhEZdOb1z/spiN8di4RBXJKSqqlL8bQl6IUWFvR64O3gUl8udwVk9aRmBgIHPmzCnw6xc0qZgEVqxYodx29B1PccebT3wuS/uyWNqXxdq1HpnXj5MY9gHapKukXo7CvsLDF3l+HO5O9+8SJ4QQQgghhBBCiKLHtC5h5M8HzBZMPiyUNB5z9/aw17Qs32TF4VOWnDhrjUp1t4mOpaWBBt56rLJ74qitoG41PXWrQX5Tvh/G0gIllAQo7WygtHPu8ZqqOM3BNI27KDS+AQkmAfDy8lI+Hjj95KHkvSyLuyq3tUlXocIzOzUAbiVlUWchhBBCCCGEEELclXOdycifw/Ht2czcQ3ootRqGvZaFJlbPyXPGCkYHe/Asp6eal95sTWgK2rmoM5yLOoOHh0eRmcp9n5n7RYspmIyJS3/ic1Rztyd8ri8rPqyr7LO0L4t1GeN0cHVJt6caoxBCCCGEEEIIIcSjCAoKAmDb4rxNewszj7J6OrXQ8fqrWXRumUWdqkUnlATjNG64+/UrCiSY5G4wmZNDMSsWj6zF0E6ej3SOrk3KUKmsHQ0ql8BCpXr4A54BmcothBBCCCGEEEKIe3l6euLr60vi1XilmYoo/EzTuItKtSRIMJlLzq7b/du40aupK2P7VHykx3ZsYOx09c2v0eiln5AQQgghhBBCPBcRERE0a9aMiIgIcw9FiELLw8ODgIAAALYt3mzm0YhHsWriMqBoVUuCBJN5mMJJe1vLR35Mo2qO1K3oQHJqFivDrtxzvuvPdHxCCCGEEEIIUZRt3LgRjUbD/PnzzT0UIQo1UxOcc1FnJJws5M5FnSmS1ZIgwaTCNJ3b0r4sADdTtI/8WNN075VhV0hNf/yOUEIIIYQQQgghHo2pUjIyMlKqJoV4AA8PD+bMmQMYO3TLlO7CKeFKPAsGzwJg7dq1eHh4mHlEBUu6ct/HjiPxfD6oWp79i0bWolLZYnSfegRtlh6HYlb4N3DBYIAfdl/Nc7ypAjN2xywsbR0eawxqR2mYI4QQQgghhBAmISEhaDQaZXv+/PlKVZgQIq/AwEClwnjVxGW8v3wcpdxdzD0skcPqScYp3IGBgUXy95kEk/dxLTEjz1qRVdzs6N3UFYBa5Ytz7MItuvuWwUZtwd4TiVy+cee+57NJv4ZaFw9AfHx8nvtdXPL5xZB6Mdf+nLebTZjwWJ+PEEIIIYQQQrxsTFWT5nox//rrr6PRaAgPDzfL9YV4FIGBgURGRhIZGcmCwbOZtmOmuYcksq2auIxzUWfw9fVVqluLGgkms126dCnXtk5vIPG2FodiVqhUYDDAW+3vltNm6Q1YWqgY0bU8AKt25V5bEnI300m3q0ZmMWcAipXIe/3U+4wrXe2q3NYk392/f+ZMZsyY8ZDPSgghhBBCCCFeHgcPHlRue3h4oNFo2Lhxo1mCyZCQECIjIwHMGo4K8TAeHh7MnTtXCdIn+0+QykkzS7gSz+pJy5Tp9XPnzjXziMxHgskHiEvOxKWENZ4uxShmY8HAdnenVqel6+jhW4ZKZe24kpDOjsN5qyBzsvd+A2vXes9sbCknVj6zcwkhhBBCCCHEiyAkJCTPvoiICLMEgzlD0sjISAkmRaHm4eHBunXrlHByweDZ+PZsSucRPcw9tCLHuKbkbBKvxivrgBa1dSVzkuY35K2WNEm4ZWyAU9XdjoXveqO2slCmd9/J1PFeDy8Avtp8Ga3OkOfxpkY6QgghhBBCCCGejimU9PX1VfYFBQUpVZPmGg/Ahg0bCvz6QjwuUzgZFBRE4tV4ti3eIt26C9i2xZv5tOMEEq/G4+vrS3h4eJF/U0OCyQeIv5UJwKIRtajj5cDCLZc5fO4WAF+P8Ka6hz2Xrt9hzd5r9z2HdZlnVyUphBBCCCGEEEWVqelNQECAsi8wMBAwVk3mbIrzvN1buanRaKRDuHgheHh4MGbMGIKCggDYtngLk/0nsG3xZhKuPHgmqHgyCVfi2bZ4c/bzvAUwvqmybt06M4+scJBgEvDy8gLyVjhaWxmfHkd7K/b8lciM9RfYccT4g9q0phMAH39/Fm2WvuAGK4QQQgghhBBFkGk9Rz8/P3x9fZUgMmfX4YKSX4WkOao2hXhSY8aMITw8nMDAQKV6csHg2UozFvH0TIHkpx2NgaRp6vbatWsZM2aMuYdXaMgakw9w7moqUJq/o1MY/vVJ9AYDS0NjaFTNEf8GLiz+JZrdxxPMPUwhhBBCCCGEeOmZgsmca7HFxMQQFBRESEhIgVUsajQaZSw5ScWkeNGY1jc0/QzNnz+fg5vjObg5HGc3F6o2qk4p91JUbVSDqo2qm3u4hV7CFeNzByiVkSZBQUEEBgYW6bUk70eCyRx0qbFkXj+uNKmZveEiWyNvcO5aGplaY1VkeqaeQXP/wqWEtTLV+0EybxxXzg3PZlp3zm7fQgghhBBCCPGyM02dNk3d9vX1JTIyEo1Gg5+fH4GBgYSEhDB27FjmzJnzXMdiqsw0XdO05mVkZKR05xYvJNP0btP3dGRkJJGRkRzcbJravQVnN2MH71LupZTb+W2/7BKvGp+ThCsJubbzqzL18PAgICBAqiMfQoJJYM+ePcrt5MhZlO7xAwA6vYFT0Sn5PuZhoaQuNZbEnR8CYO1aj2KVOj6bwQohhBBCCCFEEWPqgN2kSZNc+03TuQuyatJ0jT59+iiBaUBAAJGRkWzcuFGCSfHCMgWUYPzZiomJUUJKU5WwMYiTqd75MQWRvr6+8nvgMUgwyd01JsEYKMZt7k+xSv4UrzPoic6XcuJ7Uk6sVLaL136y89yPdPsWQgghhB
BCFCWmMND0Yj9nlSIYAwFTFWVISIhSWfmshYSEoNFoCAwMxNPTU9lvGpdM5xYvCw8PDzw8PHIFbKY3AmJiYnJtF2TjqcLA9NwAeHp6yvTspyTBJMZgsnXr1krlpC41lpQTK7lzYQfF6wx8rCAwOXJWrqnWFrbOJEfOyvfYnMfldw1Le1f0mXcrNg1Z6Rgyb2NnLc12hBBCCCEKu4iICCU0yVltIp6O6QWg6cVykyZN8PPzkxeGLzlT8PGgr/PcuXNp1qwZ8+fPf27BpKnpzb2VmzmDUZnOLV5Wpp8/+X0rniUJJrMtX76cFStWsHfv3lwB5f1CxUdhWdwdy+L3/4G1dKiQa9vCxinPMRbFcty2dQagvL0Ga/2tJx6XEEII8Sg0Go2yzlBRfUdcFIycL3D8/Pxo0qTJcwsVCoJGo+HDDz/MN4gsSutwPS+m30OmKbQhISGyjtdL7t71JQGlWjHn/0sFUTVp+rk2dQLPSaZzCyHE45NgMpuXlxdTpkwB4NKlS8q/vXv3cunSJWV/zo/3srB1xtq1MQDWZZsoQeKzpkmvivP175/LuYUQQoiIiAjGjh2bbwjpUbq0GUYkXnY5v9dCQkKUzqABAQEvXAfLefPmKY0xnN1c8O3ZlKqNauDsVopS7hJKPisJV+42G0i8Gs+2xVuYP38+GzZsUDqfipdHfutLmn4v3Pt/VVBQEH379n0uVZP5BaQ5yXRuIYR4fCqDwWAw9yBeNFOmTGHmioPYer2KPj3xuQWQD5K0933kSyeEEOJZyxmqAAQFBuJZpgwepUvjW6uWGUcmXmaauDjjxxs30MTFEXHqFBuyZ7B4eHiwbt26FyKcHDt2rBJcdB7Rnc4jeph5REVHwpV4Dm4OZ9viLXh4eDBnzhypWHuJNGvWDI1GQ3h4eK7fBffb//rrrxMZGcnatWuf6ffBvdeLiIigb9+++Pr6sm7duud6bSGEeFlZmHsALzpzhJJCCCHE85AzlPStVYvLISGMee01Alq3llBSPFcepUsr4XdA69bMHTmSuSNH4lG6NLmYX6gAACAASURBVBqNhtdff73QLyMQERGhhJJTQ2dKKFnASrm70HlED978bAgajSbXGyzixXe/9SVN26ZGHCZBQUGA8c2CZ8XU9MbX1zfPOHI2wQkICACQ70EhhHhEEkwKIYQQgoiICOVF1LopU1iXvbyJEOYS0Lo166ZOxbdWLSWcLMxMPz+dR3SXKdtmVLVRdao2qq6sMShefA+aPp0zEMzJz88PX1/fZ1qxaJpObgoe78d0zcL+ZooQQhQWEkwKIYQQQglVpDpSFCYepUszd+RIwPgiv7Cu22ZqEuXs5iKVkmZmqpwEqVh7WZgCvj59+jzW49atW8ecOXOe2Tgetr6kiakBT2H+nSWEEIWJBJNPSJ+RaO4hCCGEEM9ERESE0mV0zGuvmXk0QuSWM5x8ltMyn4fOI7qbewgCY9Wks5uLVKy9JAIDA82+XuOjhpImpqnkEo4LIcTDSTD5BFq3bk1m7J+kX/qtwK+tT08k5fhCrEpWKfBrCyGEeDmZXrwHtG4tXbdFoVTYq3hNUzxF4VHKvRQg3ZFfBh4eHmZvIrNhwwbg0as2TVPMJRwXQoiHszL3AF5EXl5etG7dmj17tuepnNSn562kvHdffsfkx9RYJ+fHzNg/sa3QCVuvV4mOvUX5siWe5FMQQgghFKZQxa+Qhz+i6DI1xok8dYqIiAizhxT3MjXeqNqouplHIkyqNqrOuagz5h6GeAloNBplVsGj/u4xTeeOjIwslL+zhBCiMJFg8gl4eXmxfPly9uzZ88jHP6lLly7l2q5UIxhNkhXR1ySUFEII8WyYKooKe1WaEFA4K5BMY5KmN4VPZGSkhELiqZimY5umZ+f0oN9HQUFB9O3bl/nz58v3oBBCPIAEk0/Iy8uLt956y9zDEEIIIYQoEny9vYk8darQBpPObhJKFiby9SgaTNXK9+vO/SyY3rx71PUlTWQ6txBCPBpZY1IIIYQo4kwvmmR9SVGYeZYpY+4hFChdlo5Zr09ndt/PzD0UIQot5f8vD4/ncv6QkBA0Gg2BgYEPvIa7u3uefTm7c5ua5wghhMhLgkkhhBBCvHSh5I6oKNaGhZl7GEI8seiTF4k+dYnYC9fMPRSzSU1K4ecvN3ArLtncQxGFlEajeW6hZE6P2vTmXqbp39IgSxQVWVlZz+37vWXLlvzyyy/P5dzCvCSYFEIIIcRL5+Dff7Ns2zZzD0OIJxa+cR8AxZ2Km3kk5qHN0PL10C/Zuew3kuKSzD0cUQiZqhCf5/qNgYGBXL58+Ymv4efnx5w5c5gzZ84zHpkQhdPevXt57bXXcn3Pnzp1ipo1a7Jy5cqnOndsbCxhT/im8759+1i2bBkGg+GpxiCeDwkmhRBCCPHS0en13E5LM/cwhHgiGWnpHN0eBYBdCXszj8Y81k5bRczpy+YehijETFVZTZo0MfNIHuxx16YU4kV27NgxAL799ltu3boFwOHDh0lLS1MaST0pnU7HxYsXOXbsGFu3bmX9+vXs2LHjkcLGr7/+mqlTpz71GMTzIc1vzGTs2LFEREQQHh5u7qEIIYQQTy01PZ3UO3co4+Rk7qEAxj9eVeYehBBPaNviLWTcyQDA1qGYmUdT8A5tO8jBzfI3sniwJ21KI4R4fs6cOQOAVqslLCyMXr16cfLkSQASEhIe61zp6emEhIRw+fJlLl++TFZWFkePHqVHjx7KMe7u7jRp0gRHR8cHnuurr74iODiYiIgIRo8ejZWVRGGFiXw1zCQiIkI6tAkhhHihZWq1bP/zT9bt2kXEyZPoDQaWjh9P+4YNzT00dHo9dra2z/06izZtYvm2baitrAgKDOS1Nm1QqQpPJHrh6lV6TZzIttmzcXcpfF2K09LT+XTZMlr7+NDlOU7HfJFoTkeze9Xv1PCrxYWj57C2tX6i88TH3OCvXceIPnWJ24m3aPlGO+q183nGo30+7EsWp/OIHty4FMuhbbI2n8grZ1Mac5LXc0Lk9s8//yi3t2/fjr+/P6Ghoco+nU6HpaVlnsdFR0ezadMmsrKyaNu2LT4+PixfvpwZM2bkOs7Z2ZlBgwZRv359atWqRel71kgPDQ1lx44dJCUlUadOHTp27EjNmjVxdXVl0qRJea5rMBg4dOgQCQkJNGrUiFKlSuW6X6vVsnv3brZv387Ro0e5du0aQ4YMYfz48U/0/Ij8STAphBBCiMdyNT6eJVu28PO+fSSlpOBgZ8drbduiovB0TtZmZWFf7PlWmu08dIhZP/6obI//5hs27NnDl6NGFZrnITk1laSUFMJPnOC1Nm3y3L/vr784p9Ew+NVXzRKojl28mF8jIqjm6Vng1y6MtBlaVk36H7b2tgz4fAjTuk7E0irvCziTw7/9SfKNJFr1a4el2njcH2t2EblpP9F/554GXcLFMVcwqcvScTM2ERePBze+uvTXBb5772sCP+6Hj7/xTYerZzXsD9nLndtpVG5QFb9eLZTrA1w6cYHNX24g5nQ0JV2dqFi3Em0H+VOuSt7Oxfmp2
bQWNZvWYuPMtY90vCh6TNMxn7QpzbNWEA14hCjsLly4wOXLl6lZsya3b99m7969fPHFFyQl3V0n+NatWzjdM7tm4cKFLFiwAK1WCxirG2fMmEHbtm05ceIE3t7e+Pj4MGbMGCpXrqw0lbpXaGgow4cPR6/XA7Bz504WLlxIUFAQo0aNyvN3TkJCAh988AF79uwBQK1WM3/+fLp27QrA999/z5dffqmMv1KlSjRs2BCXQvhG74tOgkkzkXfXhBBCvGguXL1K8NatbNy7lwytlrLOzkwcOJD+HTpgXwDViY9Dq9NhZ2Pz3M5/4+ZNZvzwAwBThwyhW7NmhOzezew1a+g7ZQqbPv+8UExrN/1xfvLCBcqXKcO1hAS0Oh1V3N1pUK0aX2/cSOTff5OcksKY114r0LEFb93Kr9lTMYXR+s9/4MoZDUMXjMKxTElUFipUFndfSC0ePh/7kvYMmjGUPat3smHGGgCS45LoNfY1jmyPYv3nxu/LMl5ladWvLTWa1sKxdEls7HL/PHzeczJx0df5v41Tcauaf2CYfCOJb0YswKDXU6FORQwGAxtnrmXvj2EY9MY1vaJ+ieT4ziOMWDIGlUpF4tUEvhk+n9TkVNS21qTcvE3Epv38+UsEfca/Qcs38gbk95Oemg6A2lpesoi75s2bp1RLPs/GN0KIx7N9+3bAuLzCzZs3WbhwIatWrUKlUtGmTRt27dpFXFxcrmAyODiYOXPm0LRpUz766COsra0ZMmQIS5YsYc+ePSxevFg51t3dndTU1HyvbTAYmDFjBhYWFqxatQpfX1+SkpJYs2YNq1atwtPTk/DwcN577z08PT1JTEwkICCACxcu0K5dO7y9vVm1ahWTJ09Wgslvv/2WpKQkSpUqxaZNm6hQocJzfPaKNvlf3ozknTUhhBAvkgGffYYmLo7yrq58+PrrdG3aFKt8puMUBtqsrOc2lVsTF0fg5MlcjY+nibc33Zs1w7lECYb36IFH6dKMnDePGT/8wJejRj2X6z/Mlfh4tkVEEH3jBifOnwfg++3b+T77BYNKpaKNjw/L/+//+CooiOCtW4k4dYrROl2BfT1v3r7N7DVrKOXoSEJycoFcs7A7uDmciJ/20WZAB6Wy0aA35KqYzNJquXziIn/vP8FPc9ZT3MmBtFup7F+3h66je+Fe3RPb4sVIT7kDgGvFsrh6lc1zrYy0dG5cigUgJfF2vuMx6A2smriMtFupvLv4fZzLlSLkvz+y98cwKjeoStfRvXB2K8XqScs4feAU185dwa2aB7tW7iA1OZUWfdsQ+NEbWFhZcjM2kR1LtxEa/AuVG1TFvfqj/Q2caVpns3jRW2dT5C8iIkKplrxf1ZQQouDp9XrWrl2LlZUVnTt3BmDp0qXcuXMHf39/evbsya5du7h48SLVqlUDjF22Z8+eTcmSJRk8eDBly5blyJEjJCcn4+Xlleca1tbWxMfH53v9ixcvcuHCBQYPHkzz5s0BcHFxYfTo0YwePRqAjz/+GE9PT9577z1mzJjBhQsXmDhxIsOGDQOMweeSJUuUc86fP58PP/yQmJgY3nrrLcaMGUPXrl2xsJAe0s+aBJNCCCGEeCR1K1dGExdHbGIiKXfumH0txbT0dKzV6nzDtLT0dBzs7PJ93MG//2bvsWM42NnRo3lz3B5jSk6mVsuAzz7janw8/x02jP4dOuS6v2vTpvx28CCb9u3j4wEDcHnIYuz3E339Opv27SNLp6Ntgwb4VK2a6/7LsbH8GhHB7qNHORMdTfFixdgwfTpuLi58/N137Dl6NNfxNStUILBNG+pWrkzNChUonj3N3dXJiUkDBz7RGHMydcR81O8JJwcHjvzvf8TcuEGnsWOf+vovutjzV1k3fTXlqrjTdpA/V85ouPTXeTLS0vnnwCnWfbaanh8GkpGaQfKNJL5772usba15f/k4joRG8ds3W7l84iJVGlbj401T2ThjLcfDjvD10C+p2rgGXUf1pHKDu99DNna2ytqV5Wt7cSsumYNbDlCrRR3cqhlDw18Xb+afiFN0f7833s3rcPH4efb+GIaDswNtB3WkdPkyxMfEcfNaInA3PDy59y/sHe3p+UEAFtmhqlNZZ16f9CavT3rzsZ6XjDRjMGllrX66J1i8FCIiIujbty9gDCWlyEOIwiMsLIzLly/Ts2dPypUrB8D06dOZPn06o0ePxi77b7KzZ8/SsWNHABYtWoRWq8XLy4uhQ4cq51Kr1UyYMCHPNfR6vTITxOTIkSP4+Phw48YNgAdWNTo6OnLq1CkAfv/9d7y9vZVQEuCNN97gjTfeULabNGlCWFgYP/74I+vWrWP06NHMnz+f4cOH06tXL9Rq+b/pWZFgUgghhBCPZN7o0XiULs3K0FAmBgfz7ebNDOvWjdfbtcMmnz/OLly9ypItW4i5cQNHe3vaN2xIn1atnnoct1JT+WL1atbu2kUJOzsCWrcmKDAwVxCZlpFBuXsWMM/Qanl/wQJ+O3i3mcZXGzeycfp0vPN5Zz4/a3ft4sLVqwzs1ClPKGnyapMm/HLgAAdOnqR7s2aciY5m3a5dDOvenbLOzrmOTU5Nxd7WNle4unDjRhZs2IA2K8s4xg0bmPHOO7zRvj1xSUmM+PJL/jx9GgB7W1u8vbwoWbw46uwOkwGtWuHk4IBP1aq4Ojnxzpw5dPb15T9dujzS53g1Pp4v16/n96goMrOyqFiuHN2aNmVot27KOCP//pu4mzfp1qwZ/165Qr+pU9Hp9WydMeORg97ixYqResdY2ZffQvhFRdqtNJaMXkhmeibX/r3CJ+3H5bo/404GZSuVQ22tJjkuiYw7GVhaWTJ03gjKVXGnXpae377ZSvKNmwA4lyvF0AUjiT1/lQMb9xG+4Q/mDZxBlVeq8eq73anuWxMAA+BR3RNbe1tWjP+Ok3uP83f4Sd5fNo6oXyIJXfIL9do3oMPbxsqXn+eGoLZRY+/kQPD7X+caY4f/vIqzm/Hn7VZcEs5upbCxe/qK5cz07GBSXXS/P4RRzlAyMDCQMWPGmHlEQgiThIQEpkyZApAr6AsMDKRPnz5KhaGrqyuHDh1S7t+2bRv+/v4sWrSIzZs3ExkZiaurK927d1eqKnNycXHhwoULyvbp06fp1asXW7duJTl79kV6evp9x1m6dGn+/vtvAKysrLh+/TpXr17Fzc3tvo+xsbFh8ODBDB48mIiICBYtWsS4ceP46quvWLp0KTVq1HiEZ0g8jNSgCiGEEOKR2FpbM3HgQCK//Zbx/fqRqdXyyf/+h9/w4eyIisp17P6//qLrhAnsiIqieLFiuLm48MnSpcp04pxuZTdoAQg7fJgPvv6aq9lTdX7cuRO/d9/l70uXlOM/Dg7mx5070ev1WKvVfL99OwGffMKNmzeVY9LS03Ote2kwGBjx5Zf8dvAgAzt2ZP+iRQSPH09aejorc3SLfJi9x45RsnhxJg8alGt/cmoq89avJzU9/W71YPZ9a8PC+N+vv3L03Lk85wucPJklW7Yo28FbtzJn7Voa1ajBli++YPucObi5uCjH7D56VAkl
R/TqxfHly9kwfTpLJ0ygdMmSAHRr1oz5o0czqFMn2r3yCgAp2QFgfsYtXkxMdqXByu3baTdmDCG7d2Nna0ufVq1wtLdn1po1BE6ezLWEBABCDx7k02XLSE5NZeisWVy/eZP45GRmr1nzyM8lQGr2C4jCtkZpQUm7lcY3784nLvoGTuWc8apbiXaDOjJ41jA+/mkqrl5lsba1oVW/dhgMBm7FG1949Rr7GtX9vAEoV8UNtY2aK2eN65drM7RoM7SUrexG7/GvM+33Wbz6bjc0Z2L4euhc9qzeCYCFhQXWdjZE/RrJyb3HAYg+eYldK3ewetIyPGqUZ+AXb6NSqUi6fpPzR87h17sFH2+cwsAv3savdwvaDvLnvf+No8eYAOO10zPJTM9Em6F9Js+PNt14HosHNAASLzeNRsO8efNyVUrOmTPHzKMSQuT0448/otFoaN68ObVq1cp1X85pzy1btuTAgQOkpaUBkJycTEZGBlZWVvTp04fZs2czduzYfENJME61TkhIICEhAZ1Ox4oVK7C1taVy5cpkZb+ZGxcXd99xurq6kpmZCcCQIUNISEigc+fOLF68mMOHDyvNd0wyMjLo1KkTK1euRK/X4+fnx+rVq/n555/R6XS5qjzF05GKSTMwNb6R6QdCCCFeRE4ODozs1Yt3unfnlwMH+O+qVbwzZw5TBw9mYKdOHD13jre++AK/WrVYMnYsdra2JN66xdJffmHaihW0qFuXSjnenV7x22/siIpizaefMv6bb4hPTqZ2pUqUKlGCj7/7DoPBwJItW1jw3ntcjo1la3g4AD5Vq7Lm00859u+/9J82jZHz5hEybRpgDLxyduUO/fNPdh46RL0qVejZogVqKyv+uWzsWvw4odiVuDhKlyypVCearN6xg/khIVyKjUWlUmFhYUHdypUBOJ19nVfu+UP7wtWrnImOJum2cY2/2MREZq9ZQ8nixRncuTNlS5XiyNmzJKek4JU9Lapn8+Yc//dffty5k++2bOHm7du816fPfasU1VZWqFQq0jIy7vs5/RoRgWeZMrwXEMDaXbtIS0+nWZ06fPvhh5Swtwfg/JUrDJs9m75TprBrwQIMwO07d+g3dSoXr11j7siRrNqxg7DDhx/5uQS4kz2uYs+xUVFhdTvhFgvfnsO1f6/y2sT+tHyjbZ5jijnYcf1SLDdjEylWvBgqlYqyld1yHWtpZYl7NU/O/vkPAPvW7Wbbos30GBNA04CW2Dva02VkT1q/2YG5/f/LpjnradCpEQ6lSnD5xEXORZ1BbWtNMYdi3IpL5qdZ6yjp6sTwRe9hU8z4dUlNMjYb0GZosbCypHE3Pxp3y9t0RJelA+BWwq1n8hyZKibVannJUhTNmzdPWU8SYM6cOQQGBppxRHlduXLF3EMQwuzat2/PmjVr+PDDDx94XLt27QgJCWHHjh307NmTV155hT/++IPIyEh8fX0fep1XXnmF4OBg2rdvj16vJykpifHjx2Nvb4+3tzc2NjbYPODviVdffVW5/91338XBwYEvv/ySmTNnAsbqyNq1a1O/fn3GjBlDsey/Iz/55BO+++47vL29sba2JjExkRs3blC8eHEyMjIeeE3xaOR/eSGEEEI8EStLS3q2aIFf7dr0/PhjZvzwA/06dGBBSAiuTk4sGjNGaUCzYe9eALJ0Omb9+CPf5lhXUG8w8PelS4ycN4/47Kk4kadOsfvoUepVqYLBYODQP8bQ5eK1a8rjhvfoQTEbG/xq1WJgp04s37aNq/HxuLm4GNefzA4PDQYDX65fTxknJzQ3btB70iTlHK5OTgzt1u2RP+cqHh5sP3iQawkJuaaKe5U1Nhj5ed8+ADo0bEiF7H0ZWi32trZ5unSbuno39jZWvi3atAmtToeXszNDZ81SjlNbWTGhXz8ArNVqPh86lDfat2f5tm1s3r+fDXv20KtFC97p0YMq7rm7K+uy12LS6XTKvjsZGVy8dk2Zvu5YvDinsitSHYoVw9LCIlcoCVDZ3Z0hXbrw8XffcSY6mvjkZDK1Wk5evMjEgQMJaN2a9MxMJgYHk3LnjrKG5cOYgklrq6L3J+lv324h4UoCg2YOpWHnJvkeY6q+TbgST5VXqvH2/JE4lXXGwjL3pCf/oZ3ZsXQbABXrVcZggHWfreaXhZso5e6ChZUlt+KSSbyWgLWtNQa9gbpt6rNr5Q6KOzvwzsLRGPQGvh25gJKuTgyaMRTHMiWV85euUIYSLo4c2naQFq+3pnwtr3zHa2lliV0JOzLSMtBn6Z660jErIwsLK0upmCwiNBoNMTExREZG5gokg4KCZOq2EIVYzZo1OXDgwEOPa9euHfXq1ePrr7+mZ8+efPTRRwQEBDB06FAmTZpEly5dsLGx4fTp02zZsoXQ0FB69uypBJ4dO3akQ4cORERE0KpVK/r166c0uqlYsSIhISFUrFjxvtfv3bs3vXv3VrbffPNNAgMD2b9/P0ePHuXYsWNcunSJbdu2MWDAACpWrMjWrVtZv34969ev588//yQ1NZWSJUvSrVs3hg4dKqHkM1L0/goUQgghxGP7aMkS/rl8mbkjR1L5nvCrVIkSuDo5EZeURKZWy9WEBKqXL68EW2GHDzN37Vq8vbxQqVT8dvAge48do1X9+so5dHo9e48d401/f1bv2EHon3/SsEYNVk+axHdbtyrTpC9fv648Judt9+yKwdvZ04MMBgM3sysRTZWJU4cMoVfLlqwNC+NsTAw1ypend8uWlHqMBjXtGjRga3g4/5k5k8+HDlWa0jSsUYPSJUsSl5QEwKBXX1Ue4+biwpGzZ4k8dQrfWrUwGAzMWbuW0D//BCA9O5zbFhGBf6NGLBozhs379xN56hSuzs50b9aMap6eucZRu2JF5o4cyeS33mLFb7+xfNs2Qvbs4e2uXXM1s7G0sMDJwYHbOaZyrw0LY/aaNfy9ahUApUuWVKbK29rYYK1WY33PmqFJKSms27ULVycn3EuX5nqiseHJa23aMCw72H2lenUAzmk0eZr13E969pSqwtrd/Xmq29aHtgP9cfEsc99j3po1jPCQvUoQWKd1vfueq25bYyfvivUqM+33WexeuYOTf/zFjegb6LJ02Dva06R7U1r2a4tjmZJ0e783nrUqULVhdUq6GkPzGX/MzzcEtLa1pseYPqyauIxFw76kxweB1O/wCrb2tiRciefE7mMcCY0i8VoCA794m+sXY59JmPjaxP7cvH7z4QeKF5JGoyEiIgKNRkNkZCSRkZG57vf19WXu3LkvxCyzF2GMQpibtbU1a9as4XT2kjQ+Pj4sW7aM999/n/HjxzN+/HhUKpXyppybm1uuSkoLCwuWLl163/PXq5f//5EPYmNjQ7t27WjXrl2+96vVavr370///v0f+9zi0UkwaQYxMTHmHoIQQgjxWCxUKo6eO0f7Dz6gbuXKlHN2xgDcuHmT05cvcycjg0kDB2Jna0vjmjVZFRrKgM8+41ZaGsfOncOnalVWfPwxWTodvSdN4u1Zs/j0rbdyNZBp4u3N8B49WL1jB+VKlWLJ2LEUs7GhUY0
aGAwGzsbEcCUuDrWVFW4uLsz68UfikpIoVaIE327ejJODA2Wzqxhtra35N3uKXXKqcRpqhlaLo70973Tv/sTPQ6+WLfl5/372HD1Kz48/xtXJCWu1mmsJCWTlqEo0BXdgrOwMO3yYobNn07JePWJu3OD4v//Su2VLtoSHs2nfPro1a0ZyaioZmZlYWVrSp1Wr+zYKCt66lT9Pn+aTQYMo7+rK+wEBDO3albnr1hG8dSs1K1TI9Vi3UqU4Ex2NwWAg5c4d1u3alavZj6uTkzLegFat2HP0KH2nTKFH8+YUs7Hh1MWL/BoZSVZWFuunTcPR3p5bqak42tszMUcIWtXDA2u1moiTJyWYfAQ1/Go99JjS5cvQ88PHn7pq72hP19G96Dq6132PUduoadQl99S5B4WJTXo0Q6/Ts/7zH/jx0xX8+OmKXC8gLSwteOXVxtRqUZfarR7/xWF+TOtoiheTqQJSo9EoS1lduXJFqYq8l4eHBwEBxvVKpUJSiJeTvb09DRs2VLZbtWpFREQEv/76K+fPnycrKwsvLy/8/PyoVKmSGUcqCpIEk2bkeU/1gxBCCFFYTR0yhNqVKvFbZCRHz53j5IUL6PR6bK2tqVu5Mu907640Whn3xhvEJyXxx/HjFC9WjGHduvHB668r6wiumjiRAZ9/ztQVK6hVsSJeZctSwt6eL0eNwq1UKd4LCCCwdWtcsisZfb29aVSjhhL+2dnYsOnzzxn55ZcEb90KQOOaNZn+9ts4ZldpNq5Zk0379pGank5VDw8c7OxY9uuv9GnVSjnvk/rfhAms3rGDVaGhRN+4gbODA/WqVKGzry/N69Zl6KxZfLF6NY1r1qS8qyt1KlXi5//+l3GLF7P94EEaVKvGvNGj6d2yJd5eXmzYswcwrkH5x/HjSmXl/ZRxcmLn4cPsOnKExjVr4lyiBDqdjr/OnwdQpsObvFK9Ot9v307TESNISE5Gp9fzwyefKPe/6uuLjbU1YGyccyMpiZDdu5n2/fcAVCpXjv906UL/Dh0oWbw4ADOGDyc5JUXZBmO4+J8uXR64nuW9MrIXmnfMcR5RePn1bkHt1vU5uDmca/9eJTM9A+dypajkU5XqTWpgW/zRpvCLwssUIOZ0b1FFzqAR7oaN+d33IB4eHvj5+eHu7o6vry9+fnnXLRVCvPyKFSumvCkhiiaVwfQ2pygwERER9O3bl8DAQOkqJ4QQwuwqVKiAR+nShC9eXGDXzNLpSM/MVNYiNBgMqFSqhzwKpq1Yweb9+zmcPZUnOTUVW2trbO6ZepycmkrU6dO0bdAACwsLgrdu5bOVlQw3AgAAIABJREFUK6ns7s70//yHRjVqoNXpiDp9mp/++IMDJ08yb/RoWtSt+9Sf2/krV+g1cSIlHRzYOH260i37YY6eO0fAJ59gZ2vLpIED6eLnh41azenLl9kSHk7on3/Ss0ULPnz9dc5pNCzatIkjZ89yLSEBG7UazzJlCGzThgH+/rma88QmJtJv2jRup6XxapMmDOrUKc90/PwYDAYMBkOujprPWlxSElvCwxn86qsPvc6GPXv48P/Zu/OwKMu2j+NfVlHc2EThRtyX9FFzyUHKJbVFc0sI60mz0hbNwiTrMSsreytDwUotLTO1UgdzKU1NzY1mlMwVzciVAZVNUNlkmfePcSZGUJFtZuD8HIeHAjNzX4wMM/O7z/M6582zyv3m/P39cffx5L0tH1t6KeI67dpolk9fXOqfl5CQkJtW8VW0irzNkydPkpuby4MPPljsayUFirf6uKIZv09FUfDz8zMFkH5+ftWm9TksLAy1Ws2KFSskWBVCiDKSikkhhBBCVDlHBwezASmlCSWNigZYDYoMaCmqgasrA4q0Co0fMoRr+fnMWbmSJ957z6wF1c7Oji6tWtGyyKTw8mjp68vC115j7IcfMnrmTH766KNiU7xLcnfr1ix+4w1e+fRTpi5YwNQFC8z3WvL0RHV9UE5rRSFy0qRSraexuzvbiwySKC07O7s7+n8pC6+GDXl28OBKPYYQpeHn51ditV9lhHeVcZtqtbrCb9OoaIho/Lex88u3yEkO415w1Sl4FEIIUfkkmLSAyj47KYQQQlRXbvXqkXH1apmuO3HECIL79mWDRkNCSgouzs60VhTu7dQJj/r1K3Sdqg4d+G3uXDZoNHcU7vXp0gXNggVs0Gg4mZho2GupSRMCOnSgRQUFp0KI4iqri+lOX/ff6V70L774IpcuXTJb/52GgiVtLyXBYunI7AAhhCg/CSYtyLcUbVRCCCFEeeh0OkJCQlAUhZUrV1p6OeXm7e5Obl4eKRkZZdorspGbG08PGlQJKyuuiYcH4x555I6vV7tWLYL69q34BYlKo1Kp0Gq1pCak4OHraenlCCAtMQWwfMB2p8e/08u3bdsWrVZLQECAxb/XmkxmBwghRNlV3qZBQgghhLAYnU5HWFgYgYGB1apSv1ubNgCs2bXLwisRori0xFRLL0Fcl5pg+L+Qff+EEEII6ybBZCWrTm8GhRBCWD+dTkdERASBgYGo1WoURSE0NPSW1ZKKoqBLTq7CVZZdS19f2vj58cvevZZeihAmxr31UhNSLLwSYRQXc8LSS6hS0lIshBDCVkkwWYmMbww1Gk2pLm9st6vMzauFEEJUT0UDycjrg05CQ0OJjo4u9QRjbWxsZS6xwnz43HM8/fDDll6GqGKa6z+f1tiualxTTQvDbIE1/rxUJGkhFkIIYeskmKxExrPnkTdMwjRWUd74QikyMhKtVls1ixNCCFEtVEQgaWtv3Lu3a8eQwEBLL0NUMWuu6jW2C8fFnJBw0gose3MxaYkpBAcHW3opQgghhLgNCSYrkfEMZmnbuY2VkrIXjhBCiNu5XSB5J2Gj8USaxkYqJkXNpEtKAqzzdZJxy4S0xBQ2zl9n6eXUaKkJKexdFw0YfidWd8ZhmrJ9lGXI/S6EEOUnwWQlUhQFlUqFTqe7bTu3MZQMDg62ucoVIYQQVaciA0kjYzCpPXasQtcqREXRxsaaKiat9XWS8TVcXMwJCSctJDUhheXTFwMQHh5utT8rlUECMiGEELZKgslKZjxTW7SdOyEhodjloqKiABg5cmTVLEwIIYRNqYxA0sjPzw9FUdDGxhK1Y0cFrViIihNR5ASutVIUhfDwcAA2zl/P2w+8LgFlFUlNMFSqvvPg68TFnEClUln1z0pFqknhqzWT/wchhCg7R0svoLorTTu3RqMx7S1pje1JQgghLCsiIsLsBFdoaGiFVtgb21DDwsKIWLWKoL59K+R2hagIUTt2mAYzWXtrbkBAANHR0ajVaiIjI9k4fz3atb8D0LpHWwA8fD0sucRqJS7mBKkJqaQlGqahK4pCUFBQqffXrU5KKnwQQgghbIEEk5XM2M6t1WrRaDRmwaPxDeXq1asBTGfZhRBCCKj8QLKogIAA0/NV4IQJzJ44EVWHDhV+HCHuRMSqVURer5YMDQ21iaokRVFMVcxRUVGmk89716VYeGXVV00OJI2Pifj4eAuvRAghhCgbCSarQFBQEFqtltWrV5dYES
lDb4QQQhR1YyCpUqmYPXt2pYYyiqIwe/ZsQkJC0Ol0TJk3j6C+fZn82GOVdkwhbkaXnEzIO++Y9pVUqVQ2FzoFBweb2ol1Oh3x8fHodDrZC7CCqFQq0zYUQgghhLBdEkxWAWPgWNIAHBl6I4QQwsgSgWRRiqKwcuVKUxtqpFpNpFqN4uWF0qiR4W8vrypZi6h5dMnJ6JKTTW3bRqGhoTYXSt5IURR5nScqRWm2jRJCCCGsmQSTVeDGdm5jq4Wfn5/pDagMvRFCiJpLo9EQFhZmemNZ1YFkUcY21ODgYCIjI1Gr1abASIiqUpNbc4UQQgghahIJJqtI0XZuo/j4eBl6I4QQNZhOp2PKlCmm5wJLBpI3Mk4YDg8PN2tDBanMEZXDWFUor4mEELZCng+FEKL8JJisIkXbuY1vOI3VkjL0RgghahadTmdql4Z/Q0BrDWSkDVUIIayT8XezBGSWI8+PQghRPhJMVpGi7dzGFw5SLSmEEDVPSZO2pV1VCCFEWSmKIsGkEEIIm2Vv6QXUJEFBQcU+J0NvhBCiZtDpdISEhJhCSZVKRXR0tISSQgghKoSEk0IIIWyRVExWoZIqI2XojRBCVH9FqyQVRSE0NJTg4GALr6r0jK3nRav+5Q2wqAxFT9YGBATQs2dPm3qsCCGEEEKIOyPBZBUq2s5tJG3cQghRfd043MbW2rZvnBZelOLlZYEViequ6M+aWq027cUaFBRk010mGo3G9HtAq9WavRYUZWf8eTC+nu7ZsycBAQE2+3MihBBC1EQSTFYx43RuMLxBFUIIUT1pNBpGjRoFGN48r1y50qbeLBfbCzM4GL9GjVC8vFB16GDBlYnqTJecbPg7KQldcjKa2FiiduwgMjKSqKgom3sc3Xhyoih3H08LrKh6MQbZarXa9LeiKAQFBdnUSaDykgp2IYQQtsxOr9frLb2ImkSn0xEYGAhAdHS0Tb24FkIIUTphYWGmN8q2ViUJ5qGkqkMHVs6YYdkFiRotascOIlatQpecbFMhf9HHkbuPJ6rhvWjdox3uPh54+EooWVFSE1IAiIs5QVpiChvnrwdsc9uMsvL39wfkvUVVM76vUxSF6OhoSy9HCCFsllRMVjHjiyRFUeSFgxBCVDPGATc6nQ5FUQgPD7e5LTs0Go0pTFk5Y4ZURwqLC+rbF1WHDkyZNw9tbCwhISFWHwIUPTkxaMJQBk0YZuEVVV/GkNf4d89hgexdF83G+euJjIxEURSb+z0sbIu8pxNCiPKRqdwWMHny5Bpx9lYIIWqSiIgIAgMD0el0ponbtvhm2BhKGsMgIayB4uXF7IkTAcMJAI1GY+EV3ZxGozGFku9u/lhCySrm4evJoAnDeHLmM+h0OrMtKaqjom3cEpAJIYSwRRJMCiGEEOUUEhJievMbGhrKypUrLbyisik6oGPyY49ZeDVCmCsaToaFhVl4NTdn/F0waMJQadm2oNY92tK6R1u0Wq0pKK6O4uPjAQklhRBC2C4JJoUQQogyMrZua7VaFEVhxYoVNrefZFHGypugvn1l6rawStZexatWq9Fqtbj7eEqlpIUZKyeBal01afy9LcGkEEIIWyV7TAohhBBlUHSYmUqlstkqyaL27t0LQICVhz+i5jJOhdfGxqLRaKx2u4RBE4ZaegkCQ9Wku49ntZ5abfy97efnZ+GVCCGEEGUjwaQQokQ6nc5U+WF8QV+dX9iL0jFWZCiKgp+fHyNHjrTaYKAyaTQaRo0aBVSfUBIw7dtn7VVpQoB1PicZQyJhPTx8PUhLTLHqILsi9OzZ09JLEEIIIcpEgkkhhBmNRkNYWFiJb/jcvKVNqCa7dFFnFlIb9+1SFIWgoCCbbmG+ExEREWb7SdaU71sIa6G66y60sbFWGUwa9/tr3aOthVcijFr3aEtczAlLL6PSGPfPrM6hq7UyPt6lWlUIIcpHgkkhhEnRwAWg/+hQ3Lz9cPNWaNFZZcGVCWtx6aIhCDh1yDAgZduyCNPUU61WW20qB28mLCzM9CZwxYoV1e6NoGmvMtlfUlgxv0aNLL2EmzI+hmTojfXRarXV7nd20aE+ssekEEIIW2U1waS0jYqSSNto1SkaSrbopGJ8ePUOmETZGKtmuz0QBECLzir2b1GzbZkhmAwJCam24WTRITfh4eHV9ndRdQglt8TEkHb5MqP697f0UkQNo9PpcPeRUNKaVOf/j6ioKMBQvS+EEELYKosHk9I2Km5G2karjkajMYWS4z9ZKdWRotTcvBUGjJ5MtweCmTU6EK1WS0RERLV7fBYNJaOjoy29HHEbe48dY/fhwxJMCiGqLY1Gg1Zr6F6obs+5QgghahaLBpPSNipup6a3jVYV4+Ow2wNB8tgTZeLmrRAUNpuo8ClERUVVqzdJEkranoLCQq5kZVl6GUKUWUF+AbP/+3/Y2dnx2orpll6OsDLG18Ig1ZJCCCFsn8WCSWkbFaVRk9tGq0rRM+79R1efMElUvRadVbTopOLU4epTNSmhZOlk5uSQmZ1NIzc3Sy8FgIKCAuwsvQghyuHc0dOciz1DrToull6KsDJFu81UKpXVPdcW7YKLj4+/7RZdCQkJZh8bB8rc7DZvt5dmaQbR+Pr63vRrpdmr03iZ2NjY215WCCHE7VkkmJS2UVFWNaVttCoZX+x1eyBItk8Q5eLmrdB/9GROvRZiCrttlU6nY8qUKWi1WlQqlZwAKcG1vDw27dvHyu3b0Rw9SqFez1dTpzKge3dLL42CwkLquFR+oDNvzRq+2bgRJ0dHQoODeaxfP+zsrCcSPZWYyIg332TjJ5/g62m9++xdSEtj3Z49xCcl4ezoSOdWrRgcEICjg4Oll2Yx0at3A1DXra6FV2JZuVk5bFuymf2/7CP9YjrOtZ1p1qkFD78whKYdmlXosUJCQv4dAHY9eDKGXL6+vmb7nhu/VlUDZ4x78UdFRZnWaKnnJp1OR3x8vOl5PiEhwRRAVsV8gNsdw9ZffwghRE1kkWBS2kZFeVXnttGqtnfvXgCad6qegzxE1XJrbHiTZsvDyySUvLXElBS+XL+etbt3k371KvXq1OGx++/HDuuZlpyXn49r7dqVeoytf/zBrO+/N308dcEConbsYM5LL1nN/ZCRmUn61atEHznCY/36Ffv67sOHidPpePrhhy0WqK7bs4fX5s8nNy/P7PPfbNzIihkzcHF2tsi6LCk3K4cDm2IAqFPf1cKrsZwrqZf5/LnZJJzQYe/ogKevJzmZORz57SB//R7L5KVvVGg46efnZwq1jM9hpQ25bhzWaNSzZ887un7RcM8Y+N24BkVRCA0NJTg4uFS3XV7Gzhrjn1sp6X4oWp14Y5B7s2D3xqpHRVFK9bqipGrLokpzG7e6zI3VnWfOnGHfvn23vU0hhBC3VuXBpLSNiopSHdtGLUGj0QDISQJRIYxVt7YaTEooeXOnEhNZ9NNPrN65k9y8PBq7u/PmmDH8d+BAXKugOvFO5BUUUKdWrUq7/aRLl/jou+8AePeZZxgSGIj6t9/45IcfGDVjBms++MAq2toLCwsBOHrqFE0bNeJ8a
ip5BQW08vWla5s2fL56Ndpjx8i4epXJjz1W5es7lZhI6GefAfDxCy9wf9euXM3OZtrChWhiY1m9cyf/HTiwytdlaRvnryc3OxcAl3qVG7Bbsy1fbSThhA7v5o155Zup1PdsQF5uHj9/uoZt325m+7dbGDvruQo7Xnh4OOHh4bdsRTYGU8YAzBgk3jis0UitVlfI2oyDH1UqFQEBlX8i2VihWXQWQNG1BAQEmKpIjR9Xtjtpsa4qarWamJiYW7aGCyGEuL0qDyalbVRUlOrUNipEdWI8YaDT6ar8TUJ5SSh5c6NnzkSXnExTb2+mhITwSK9eVttqm5efX2mt3LrkZILffpvElBR63nUXQwMDca9fnxeGDUPx8mJiRAQfffcdc156qVKOfzsJKSls1Gg4l5TEkZMnAfh20ya+3bQJADs7O/rdfTff/O9/fBoayqKffkITG8ukgoIq//90cnTE3s6Oh1Uq0wT1Rm5ufDB+PPeHhnLm/PkqXY810B0/x2/LfqVdQAdOHYjD2aVsFaMp8Ukc3n6Qc7FnuJJ2md6P96dz/7sreLWVy7eN4fnj4ReHUt+zAQBOtZwY8OzDbPt2M1fSLlfKcYs+b5X2Ocz4/qZoYGn8u2iV3Y0VfTc+T97YOl5VoV/R9RifB41UKpXpT1WuRQghRM1R5cGktI2KilQd2kYtzXjfyYkCUdHi4+NtKpg0DrqRULJknVq2RJeczIW0NK5mZ1t8L8WsnBycnZxKDNOycnKoV6dOidfbe+wYOw8epF6dOgy791587mDvxWt5eYyeOZPElBT+77nnilXzPdKrF7/s3cua3buZNno0ng0a3Nk3dd25ixdZs3s3+QUF3N+1K3e3bm329bMXLrBBo+G3Awc4ce4cdWvXJur99/Hx9GTawoXsOHDA7PLt/f0J7tePTi1b0t7fn7rX29y93dyYPmZMmdZYlF6vB7jjnwm/Ro3YFB5uWo/RmQsXAHB0tNiMRovIy81j2fSvcXF1YfQHz/DeI2/i4HjzsHj/L/vISEqnzxP9cXAyXG7XD9vRrtnDuWNnzS5b37OBWTBZkF/ApQtpeCpet1zTmcOnWPjy5wRPe4K7HzDsH5v4t4496p1kX8miZdfWBIy4z3R8gDNHTrFuThTxx8/R0NuN5p1acP9TD9Ck1Z1VlalG3EvHvp2LtbOfPXIKAOfalVcVfadu3H/SFhUdTGqs0JRuJCGEEFXBIq3cIG2jomLYetuotZBQUtR0RadvSyhZsohJk1C8vFi6eTNvLlrEF+vW8dyQIYT0708tJ6dilz+VmMiX69cTn5REA1dXBnTvzsg+fcq9jsuZmXy4fDkrtm+nfp06BPXtS2hwsFkQmZWbSxMPD7Pr5ebl8crcufxy/QQpwKerV7P6/fe5q1mzUh17xfbtnEpMZMxDD920xfjhnj35+fff+f3oUYYGBnLi3DlWbt/Oc0OH0tjd3eyyGZmZuLq4mIWrn61ezdyoKPLy8w1rjIrio+ef5/EBA0hOT2fCnDnsO34cAFcXF+5q1oyGdevidD3EC+rTB7d69bi7dWu83dx4PjycQSoVzw4eXKrvMTElhTmrVvFrTAzX8vNp3qQJQ3r1YvyQIaZ1ao8dI/nSJYYEBvJPQgJPvPsuBYWF/PTRR3cU9AK0LiHI2XT9/6h727Z3dFu2btUH35FwQsf4uS/RoFFD7OztsLP/N+yd/0Ikrg1deeqj8exYvpWoj34AICM5nRFhj/HnphhWfWDYYqBRs8b0eeJ+2vXqQAOvhtSqYx7ifTD8bZLPXeR/q9/Fp3XJgWFGUjoLJsxFX1iI/3+ao9frWf3xCnZ+vw19oSGMjvlZy6GtfzLhy8nY2dmRlpjKghciyczIxMnFmauXrqBZs4d9P2sYOfVxej9efK/TW6nrVq/Y5w5s2Q9A257t7+i2xM2FhYWZ2s5DQ0MlkBRCCFGl7C29ACHKq0UnQ8gt4aQQoizCwsJMoWR0dLSll2O1XJydeXPMGLRffMHUJ57gWl4eb339NQEvvMCWmBizy+45fJhHXn+dLTEx1K1dGx9PT9766itTO3FRl68PaAHYtn8/r37+OYkpKQB8v3UrAS++yLEzZ0yXn7ZoEd9v3UphYSHOTk58u2kTQW+9RdKlS6bLZOXkmO17qdfrmTBnDr/s3cuYBx9kz7x5LJo6laycHJZu3lzq+2DnwYM0rFuXt596yuzzGZmZRKxaRWZOzr/Vg9e/tmLbNr7esIEDcXHFbi/47bf5cv1608eLfvqJ8BUr6NGuHes//JBN4eH4eHqaLvPbgQOmUHLCiBEc+uYbot5/n69efx2vhg0BGBIYSOSkSTz10EP079YNgKvZ2Tf9nl6bP5/4pCQAlm7aRP/Jk1H/9ht1XFwY2acPDVxdmfXDDwS//TbnU1MB2Lx3L+8sXkxGZibjZ83i4qVLpGRk8MkPP5T6vryZ7Nxcfv3jD+q4uHBP+5oTPO1dF43mx930Gz3QVNmoL9SbVUzm5+Vx9shpju05wo/hq6jrVg97B3v2rNxBXm4evm39cKn7b/Wpd/PGeDdrjIuri1k1a25WDklnLqAv1HM17UqJ69EX6ln25mKyLmcydtZzuDfxIOrDH9ixfCsturTilW+m8u7mj2ndoy3Hf4/lfJyhXXn70i1kZmRy36h+zNZ+zke7Inl/6yf0GtmbzYt+JuFE+V6r5WTmcHj7Aewd7OnYp1O5bksYREREoFarURSFFStWSCgphBCiylV5MClto6Ky3G4SnzVRq9UVtiG6EKLsjG/IwDD4QNyeW716TBwxgt8XLGDuyy/j6ODA8+HhLL0eOh6Ii2Pshx/SrW1boufNY+FrrzFxxAgyc3J4b8kSTiUmmt3ekl9+4cn33+dKVhZTFyxg9c6dbNq3j3V79jBt4ULTFHAwtDD/dD08vrt1a3Z99hnLpk8nTqdjYkSE6TYzc3LMpnJv3rePrX/8QedWrRh+3304OTry11lDq+udDO5JSE7Gq2FDU3Wi0fItW4hUq5m2cCHb/vwTe3t7OrVsCcDx68fp1qaN2XVOJSZy4tw50q8YgqELaWl88sMPNKxbl6cHDaKxhwdnLlwg4+pV036Zw++9lycfeAB7e3sWrl/PW19/bQpxS+Lk6IidnR1Zubk3vcwGjYY1u3YBhorQrJwcAv/zHzaHhzNz3Dh+eOcdts6Zw+XMTEbNmEFBYSF64Ep2Nk+8+y6nz59n9sSJdGndmm3795f6vryRcVDPvDVruHTlCs8NGXLTdvzq5sLJRFa+v5wmrXy5/6kHSDihI1q9k9ysHP76PZaVM5eTm51LbmYuGUnpLHz5c5xdnHnlm9d48LnB5GbncvbIabybN2bamnfp3L8rSWcu8Pn4Ocx95hNO/mkeiteq44KzizPOLs407diMy8kZ/Pr1LyT+/W9ouGH+Ov7SxDJk0gjuuvc/nD50kp3fb6Oeez3uf+pBvJo24tL5NC6dTwMwBaJHdx7GtYErw18Nwv56qOrW2J2Q6U/ywfbZ+LYt2+t/faEevV7Pz5+tIftKFgGP3oenX6My3Zb4l0aj
MbVvh4eHyx6SQgghLMIim/dIKClqurCwMAAiIyMJCAhg5MiR8mKwBAX5eWRdvoS9gyNJZ+No3qmnRdahLyxk74bvuGfQ49g71Kw9z6qzom/IVqxYIY/BO+To4MDw++4joGNHhk+bxkfffccTAwcyV63G282NeZMnmwK1qJ07AcgvKGDW99/zxfXfgQCFej3HzpxhYkQEKRkZAGhjY/ntwAE6t2qFXq/nj7/+AuB0kWEoLwwbRu1atQjo0IExDz3ENxs3kpiSgo+np2H/yevhoV6vZ86qVTRyc0OXlMSj06ebbsPbzY3xQ4aU+ntupShs2ruX86mpZq3izRo3BmDt7t0ADOzeHf/rn8vNy8PVxaXYlG7jVO977roLMARyeQUFNHN3Z/ysWabLOTk68voTTwDg7OTEB+PH8/iAAXyzcSPr9uwhascORtx3H88PG0arGybDFlwP+woKCkyfy87N5fT586b29QZ16xJ7vSK1Xu3aONjb88WUKdR3/Xdfv5a+vjwzeDDTFi7kxLlzpGRkcC0vj6OnT/PmmDEE9e1LzrVrvLloEVezs4vtGXkzefn5vLdkCeujo01Vs0bX8vO5eOkS3lYw3bwyZV3O4stJn3Et5xrn/0ngrQGvmX09NzuXxi2a4OTsREZyOrnZuTg4OjA+YgJNWvnSOb+QXxb8REaSoWLYvYkH4+dO5MLJRH5fvZvoqF1EjPmIVt3a8PCLQ2mrMlSh6gGlrR8uri4smbqQozsPcSz6KK8sfo2Yn7Vs/vJnOg/oysBxgwBYO1uNUy0nXN3qseiVz83WOPDZh3H3MTweLien4+7jQa065R8+laJLZvXHK/hLc4y8nGtmX3Oq5cSl82m4NXG/ybVFaRhfj8pzYNlIt5YQQlQMaeUWwgKio6MJDg5Gp9OhVqsZNWoUgYGBREREyIuc6w5s/ZEPH7+H/xvVg5nBd/Pd+y9QUJBv+nphQQE7VsyvsvWs//wt/tq7/aZfT0k4zbblc8m+mlFlaxJlp9FoGDVqFCBvyErjjS+/ZPi0aZwsMl3WyKN+fbzd3LiWn8+1vDwSU1Np27SpKdjatn8/s1es4K5mzejQvDm/7N3LzoMHzW6joLCQnQcP8uQDDwCGCsdOLVuy4p136N+tGwkpKWTm5HD24kXTdYr+2/f6voZXsrIAQxh56XolorEyceKIEfz26adMGz2aoL59mT5mDL988kmxfR9vpX/XruTl5/Psxx+btWZ3b9fO1EoN8NTDD5v+7ePpSWZODtrYWNPaPvnhBzbv2wdAzvVqxo0aDQ/06MHGWbOY89JLPNavH5NGjmTjrFn06dLFbB0dmzdn9sSJaL/4gkkjR/LrH38wYPJkZi5danY5B3t73OrV40qRVu4V27YR9NZbpo+9GjY0tcq71KqFs5MTzjfsGZp+9Sort2/H280NXy8vLqYZquQe69eP564Hu92u7wcZdwfPYdO/+oqlmzfj37ixKWg2mr9mDf1efpnvfv211Ldna7IuZ7HgxUiSzyXh1sSdZp1a0P+pB3l61nNM+/FdvJs1xtmlFn2e6I9er+dyiuH5ZUTYY7QNMATaTVr54FTLiYR5E/koAAAgAElEQVTr1Y55uXnk5ebRuKUPj04N4b1fZ/Hwi0PQnYjn8/Gz2bF8KwD29vY416lFzAYtR3ceAuDc0TNsX7qF5dMXo7RrypgPx2FnZ0f6xUuc/DOOgEfvY9rqGYz5cBwBj97H/U89wMtfv8awyUGGY+dc41rONfJy88p93+RczWbu2Fkc3XGIFl1a0m1QT7P9Nncs38qMQW+wYd5a8q/l3+KWxM2o1Wp0Oh3BwcHyHFhOtjz0SAghrIGU/ghhAYqiEB4eTmhoKBqNhqioKLRaLZGRkURFRREQEEDPnj0JDg629FItYsOXM9mzehE+rTrQdWAQB7ev5UpaEn9uVtNj0OMAFBYWsHnxx3S89yE8lRaVuh47e3uca7ty8ezf3NXrgRIvc2j7OrYui+DM0X2M/eBbHKSy0mrpdDoJJe+QvZ0dB+LiGPDqq3Rq2ZIm7u7ogaRLlzh+9izZublMHzPGtC/gss2bGT1zJpezsjgYF8fdrVuzZNo08gsKeHT6dMbNmsU7Y8eaDZDpedddvDBsGMu3bKGJhwdfhoVRu1YterRrh16v5+/4eBKSk3FydMTH05NZ339Pcno6HvXr88W6dbjVq0fj61WMLs7O/HM9RM3IzAQMlYsNXF15fujQMt8PI3r3Zu2ePew4cIDh06bh7eaGs5MT51NTyS9SlWgM7sBQ2blt/37Gf/IJvTt3Jj4piUP//MOjvXuzPjqaNbt3MyQwkIzMTHKvXcPRwYGRffrcdFDQop9+Yt/x47z11FM09fbmlaAgxj/yCLNXrmTRTz/R3t/f7Lo+Hh6cOHcOvV7P1exsVm7fbjbsx9vNzbTeoD592HHgAKNmzGDYvfdSu1YtYk+fZoNWS35+Pqvee48Grq5czsykgasrbxaZ6N1aUXB2ckJz9GixKeI38/f1LVjc6tVjfXQ02bm5BPXty4Thwzl6+jSf//gj0xYuRJecbKoarS6upF7ms3HhnP8nkcfe/C+9H7+/2GVq16vDxTMXuHQhjdp1a2NnZ0fjlj5ml3VwdMC3jR9/7zNUFe9e+Rsb561j2OQgegX1xrWBK4MnDqfvkwOZ/d//Y034Kro+1IN6HvU5e+Q0cTEncHJxpna92lxOzuDHWStp6O3GC/Neptb1qdeZ6YbHUF5uHvaODtwzJIB7hhT/vVmQb3gMXE69XO77Jz0pnUsX0rB3sKcgv4DD2w+gL9Qz4JmH6ffkAJLOXmTH8l/Z9MXPHNr6J2E/TMfZxbncx61JjB0DI0eOtPBKhBBC1HTyzlkIC1IUheDgYLPqycjISNMelJGRkQQFBaFSqWpMeHJg2xr2rF5Ex3sf5vE352Hv4ECX+4fz2YRBaNZ/awomHRydsLOzIzn+ZKUHkwCOTs5cunjzSqDej72Ac+06nIjZwfmTx1DayKb81mrKlCmAYfJoTXlclde7zzxDxxYt+EWr5UBcHEdPnaKgsBAXZ2c6tWzJ80OHmgatvPb446Skp7Pr0CHq1q7Nc0OG8GpIiKkabtmbbzL6gw94d8kSOjRvTrPGjanv6sqcl17Cx8ODl4OCCO7bF88GDQBQ3XUXPdq1M4V/dWrVYs0HHzBxzhwW/fQTAPe0b8/748bR4HqV5j3t27Nm924yc3JorSjUq1OHxRs2MLJPH9PtltXXr7/O8i1bWLZ5M+eSknCvV4/OrVoxSKXi3k6dGD9rFh8uX8497dvT1Nub/7Rowdr/+z9emz+fTXv30rVNGyImTeLR3r25q1kzonbsAAx7UO46dAhtbCyqDh1uevxGbm5s3b+f7X/+yT3t2+Nevz4FBQUcPnkSwNQOb9StbVu+3bSJXhMmkJqRQUFhId8VqZh8WKWilrMh0BkSGEhSejrq337jvW+/BaBFkyY8O3gw/x04kIZ16wLw0QsvkHH1quljMLT2Pzt48C33s7zR7IkTeXfJElOr/hMDBvD+uHE4OjjQ0te
XwQEBvDZ/PrsPHap2weQvX6wnNSGVpz4eT/dBJW9TYhyklJqQQqtubRgXORG3xu7YO5g3PD0wfhBbvtoIQPPOLdHrYeXM5fz82Ro8fD2xd3TgcnIGaedTcXZxRl+op1O/LmxfuoW67vV4/rNJ6Av1fDFxLg293Xjqo/E0aPRvBbCXfyPqezbgj417uS+kL007NCtxvQ6ODtSpX4fcrFwK8wtMe0yWReMWTXh29otsX7qFM4dP4eJamwFPP8jgicMBaNCoIa17tOX31bv54d1vuZJ6GQ/fO5sIX5NJtaQQQghrYqc3vuqpIv7+/rh5K0xdZpnJp9u//wz/9l1peXegRY5vdFy7ldSEM9w7clyF3WbWlXTq1Gt4+wuWwcoPX2bwi29Tt2Hlvegr6zEWhYVw6rC22lQ+6XQ6sypKI0VRCAoKqvBpiZZ+TBZVUJDPnGf6UatOXSbMXYuj879tfbOf7kt6UgLvb/i3ffLtIW0Z+NQU7gt6rlLWUrTqcWbw3bTp3ofHXo80fe7c8T/5+4+ddOr9CI38b18hVFhQgL1D2d+o2QprfkyGhISg1WpRqVSsXLnS0suxOv7+/iheXkTPr9xtEvILCsi5ds20F6FerzebGnwz7y1Zwro9e9j/1VeAoRrSxdmZWje0HmdkZhJz/Dj3d+2Kvb09i376iZlLl9LS15f3n32WHu3akVdQQMzx4/y4axe/Hz1KxKRJ3Nep/CcUTiYkMOLNN2lYrx6r33/frMX7Vg7ExRH01lvUcXFh+pgxDA4IoJaTE8fPnmV9dDSb9+1j+H33MSUkhDidjnlr1vDn339zPjWVWk5O+DVqRHC/fox+4AGz4TwX0tJ44r33uJKVxcM9e/LUQw/R8oa9KEui1xuGjdjbW3bXn8LCQrKvXTMbUhS1YwdT5s0jNDTU6iYI+/v74+7jyXtbPr7l5f7SxOKpeN1ygEvyuSSi1TsZNHHYHVUDZmZk8tvSLRzddZiU+GQK8gtwbeBK257t6f3E/fh3bE5ebh4Ht+6ndfe2NPQ27ON5qzBx77polr25GNcGrgx7NZguA7vh4upCakIKR347yJ+bY0g7n8oTM57i4ukL9B/7YKnXW16Z6VdxbVj3pl/Xro1m+fTFVvnzYilhYWGo1WrCw8NrbHdORYiIiCAyMlLuRyGEKCerr5jU6/WkJpzmyK4N/L1/FxdPnwCgU98hDH/5gzu+vbg/dnLh1HGLB5Pxfx0kZuP35QomC/Lz+O37zzj023pys66Sk3WF15drcG1QsRuBFxYUcPC3dTTt0J2AoWNufwUrPYatKKmKUqvVFmv1ro4Dc47u2kja+XOMee9rs1ASYMCYVzkb+4fZ55xq1eZy6r/7zOn1erKvZpgF9LHRmzh/8hj3//eVYqFgZnoqrg09KCwoYJf6C7oODKK+hzc/f/Ee0T9+Td9RE3nwmalm18nJvMLBbWuI2bSCxH9iqVOvIQ08GhcLJjOSEzmyawMBw8aSn3eNJdPHcu7Yn4yesYh2Pc1b9o7u3siJfb/RoJEPvYOfx9mlZkyirWoRERFotVoURZFQ0sIcHRzMBqSUJpQ0KhqUNSgyoKWoBq6uDOje3fTx+CFDuJafz5yVK3nivfews7MzVaPZ2dnRpVUrWvr43Om3UaKWvr4sfO01xn74IaNnzuSnjz4qNsW7JHe3bs3iN97glU8/ZeqCBUxdsMBsnT6enqiuD8pprShETppUqvU0dndne2Tk7S94Azs7uzv6f6ks9vb2dzQ53Va0C7h5VayRV9NGDJ9y52GHawNXHpk0gkcmjbjpZZxqOdFjsMrsc7eqcOw5LJDCgkJWffAd37+zhO/fWWL282nvYE+3h++hw32d6Nin8x2vuTxuFUqKkmk0GgAJ04QQQlgFqw0mj2t+JSo8jIL8PHKzM/HwaUbTu7rSvOM95GReJmbTyjIFk4UFBVw8+zcnD/5ORnIiBXl5+LXrQuMW7Svhu7g5fWEhmRlpxP25m6yMNK5lZ9GgkQ9tupe8p9SN8nJz+PqN/5IQd4QeD4WQrDvFP3/uIXrNYh4YG3b7G7iTteoNUz3jj/+Je2M/rl5KprCwkPYBAyqsgrIqjmGLFEUxnd3X6XRmbd5qtdpURRkcHFwtNt4+GxuDvYNjiY+Dzv2G0rmf+d5wzi61ycvNAeBy6kWWvTMO3d+HGTLxXXoNGwvA1mWRXDh1nIBhY81C+6vpKXz4+D08PyeKRn6t2Lx4Fi6u9dHr9UT/+DUAO1fOp/tDIXj4+ANwfO82ju75hbzcHOq5eRH82my63D/CFHhGr1mMe2M/2gcMRPf3ETZ8OZPW3fuwfflczhwxDLpY++mbvPGd4Q1B1uVL/BjxOrHRm03rys26yuDn3yIvNxunWqWbbCtur+gE7vDwcAuvRpSVW716ZNwwvbm0Jo4YQXDfvmzQaEhIScHF2ZnWisK9nTrhUb9+ha5T1aEDv82dywaN5o7CvT5duqBZsIANGg0nExPJz8+nWZMmBHToQIsKCk6FKKuAR++jY98u7F0Xzfl/ErmWk4t7Ew9a3N2atj3b4VJXnrNsgUajMbVxi/JJKGEgnBBCiDtntcFk5uVLZF1Jp0UnFQ+NewO/dnebfX3wC2+X+raO7vmFv2N2cOmijsSTseRfy+WrqYZ96ty8FQKGja2SYDLxn1j2b15F2oV44k8cRK/Xs/iNJwFwbeBOq673lTqY3PDl+yTEHWHcx9/j36E7hQUFRIzrT8zGHyokmNQXFrLnx684f/IYKQlnAMPefwe2rcHBwRFPv5Z4+PiXKzSsimPcyNjGWZoQ71aX8fPzu+31fW/RKne749/s6yNHjqRnz57odDqioqJMYWVkZCQqlcoUUto2PfrCwlJd0qmWIZhM0Z1i0WujuJqeglOt2mxZPIuOgQ/h6uZJ0tk4vJQWxSqJY6M3U1hQQNbldPQYKj7+/HU1ur8P0bT93fR85EnUn0zhxL7t9Br+NAA5Vw0b+nd78DGGTny3WGXjyQPRHM/Jon3AQIxRxJqINzh7bD+9ho2lUF+Idv1Szp86TuNmbVkyfSzxfx1ENWQ0vYOfp3bdBuTn57FlSTi7oxbyxnfaCq+Arolk2E314e3uTm5eHikZGWXaK7KRmxtPDxpUCSsrromHB+MeeeSOr1e7Vi2C+vat+AUJUQHquddjwNMPWXoZohx0OsN+2bd6nSqEEEJUJasNJpt17AHAg8++bhZKFhYUXA8fXHCu7WraAy4vN4fc7EzqNvQodlvrP3uLK5eScXBwRA80bOTDo6/Owq9NZ1zq/lslodfr+WXR/3Fk188U5OfTtP3d9B01EaWteUtKQUE+9vYOt6yCyLl6mX8ORpNz9TLtVP2p29CT3eovOfjbOgCcahnakkLemEvz//SkgVeTUt836UkJxPyygsHPT8e/g6FVzd7BgfYBA9mzehH5eddwdPp3L6LCggIunP6L9Is6vJq2wsuv5U1v+8yRfXj6tST7agYbF35wfa2GM+Ctu/VmwJhX8Wl5V7E2W6PEf45y4fQJGjdvh0+rktuUynuM8vDz80Or1Z
pelN3KrS5TdO9Ha1G03Ts62vL7RZZFh3sfRrN+KduWzy3WQl0SvV5P0rk4FoaFgF7Psx9/T1ZGGt+9/yJHdm9ENWQ0hQX5ePg2N7vetZwsdvzwOQDN/3OPqR08/q8DuDX2Y8y7X1OngTubvv7YNPDGxbU+13Kyyb+WQ9r5cyW2W9eu18DUbn7lUjIAZ4/tp33AQB558R1SEk6jXb+U9Is60OuJ/+sgzTv1ZNikmabbyExP5fe13wDgVAmPgZpIht1UH93atAFgza5djB8yxMKrEUII27N3717g9ifJRenJfSmEEOVjtcGksWIqIe4I6RcTSEk4zd8xO0g8GWtq3fTw8SdsyS4Afv02nEM7fmLip+uo79kYgBTdKbYtiyQzI41m/7mHZ/5vGT9GvM6F03/Ruut9xY75+5rF7I5aiP9d3Wjcoj0Xz5zgi9BHGfTCW/QaNpaLZ//mxzmvE//XAezsHWjTvQ//ffsLsxAQDPtHfj/zRdKTEgFwbejBhLlreWjc/+jSfwRK284c+30LP0a8Tpsefe94YM0fm1bi6ORMtwfMK+P6jppAh8AHzdZz6Lf1bPjyfa6kJZk+16xjD0b97zMaeDXhr73b+GPTKp5850s2fDmTPasX4e3fhkkLNvL0B99S38ObRv5t+Hh0L7yUFjRtb165anQtJ4uo8DCO7Npg+tz9T0xi4NiwCjtGRQgPD79tG+ftQsv4+PhyXf9WX79dS0jRY+t0uhJvy5ZfHLXs0osOgQ+yY8U8sq+k8+Czr1O77s2rovJys0nRnaJWbVdemvcznkoLCvLzcK7typkj+wgc8Qx1G3oS/9cBMjPScG3gTk7mFX74YKLp8Zl3LccUTDrVqs3YmUtwvX6Co2n7u0k6G0f21QwyM1Jp1rEH9dwb8eevUexYMY++oyaaraduQ0+yrqSTnpRouk3vZm0Z9b9PsbO3x8uvJS5163PxbBwtuvTCtYE7Z47G8Mui/6Nx83YkxB3hyK4N5GZd5Z5BT+Bcu+T980TpFR12I0MPbk9RlFKduLGUlr6+tPHz45e9eyWYFFZJpVKh1WpJTUiRKdFWIi0xBbDt10eVQe4PIYQQ1sJqg8nUxDMArP/c0LJd38Ob9qoB9A5+nvqejXGu7Ur2lXTT5Vt370P0msV8+/azjP9kBZp1S9i2fC6Nm7fjyRkLaa8aAGBolbyWW+Ixf1+3BJ9WHXl+ThR21zfXP3tsP5cuxKP7+zDf/G80jfxbM+a9xXgqzfn69SdY+tYzPPPRctNtpOhOsXjaaGrXbcCYd7/CsZYLS99+liO7NtAn5EVTZWTteoaw5WZruZWUhDM0aXkXteqYb/Zdp76bqYIS4O8/drLiw0nY2dvz4DOvEzB0DBfOnGDnivl8NmEQ4z7+nqzL6ZyI+Y19G79nz+pF+LTqQOI/scTt30W7nv1Nt1W7bgPy8kpeq76wkKVvP0v8iUMEhYXT9p5+/LzgXXZFLeT+0aEVcoyqVNY266pgHIZjbOMuuqbKmNhtCY9Pn8/GL2ei/Wkp+7eoadezP061XEhPTiQr4xKd+g7h/v++DBgG0QAMfel9PJUWADg4OtFe1Z+4/bvR6/X0Hx3Kus+mM//lYTRt35UzR2O4nHqRu/uP4MC2NRzctga3xobW/N6PPU+jpq1Ma/Fp1ZGdq74gZuMP5GRewadVB/o/GUrS2b/ZvHgWB7atob1qAA89+wZg+D0FUJB/zdT2fWPLt2+rjpw+rKXvqAmMnbmEpW8/yy71l8Xuh8ARz1T0XVvjyLCbstPGxqLqcPvhHJbw4XPPcT411dLLEBakiY0FrDtYSUtMlWDSSqQmGH5fSMW8gfEkd2m2JRJCCCGqgtUGk5kZadjZ26N6ZDRtevShTfe+xSbqFtW6630Mnfgeaz99kw9CuqEvKOChcf8rNvVaj940QbCogvw80s6f46Fn3zCFkgD+d3WjafuuhI/tjV+7Ljz1/jfY2duTmniW9KRE0pMSORGzg7Y9+gIQvfYb9AUFPPvRcjx8mgGGdm3vZm3ND2hcQwlruR0vv5ac2Ledq+mpJbauGxUWFADwwNgw+o6aYPp+Rr/7FQteGc7OlQtQ2nUh/1oua+dOo+vAIIJfm83HT/bi3PEDZqEhev1N13pw+1pOHvwdn1YduJKWhPanZRzXbMXDpxkODo5kX80o9zFqspoQRhbl4ODIkAkzCBzxDHtWf0X8iYMkx5/ExbUeSpvOtO5mqHbW6/XUbehBq7sD6TpwpNlt9HtiEod3/syVtCRUQ0ZTp74bvyz6gNjozbTt0Zd7g8ajtO1MYWEhF8/G0Wv4Mwx+fjo9H3nS7Ha6PxTCH5tX0cDLh7oNPQgc8QxOtVx4auYS1s6dRmz0Jmq7/rsdROvufWjUtBUNG/ly/39fxsuvJS06m78Rui/oOXauWgCA0rYzry3dw6UL8WRfzaC+Z2OiPplC0rl/ik35FndGht2UjbVXTAJ0b9fO0ksQFqZLTrb0Em6qaMVk6x5tb38FUeniYk5YeglWxdp/xwshhKh5rDaYvHRRh5u3wtCX3gOgsCCfK5eSycpIw97RiTr1GlK7bgOzsLKumyf2Dg7kX8ul5+D/FgslAVzq1OVaTpbp44Pb13Jcu5XBz78F/DsduqjLKedJO3/O1I6Zn3eNtZ9Oo3bdBuj1hWz6+iNad70XewdHUnSnqOvmiXsTf9P1O977cLHbNLZo5l5fS9r5cyz+35O8+vV27B1u/d+iGjKaPT9+xTfTxjB6xiIaNip5UqdxD8wbBwfl5WSRnpRI4+btTO2mzf7Tk5GvfgxAi84BpCeZtxQ713YlLyfb9PHaudNo3LwdqqFj2P79Z/yn92A8fPzZtepLrmVn4t+hOyNCPzTcfxVwjJrGGEYa9400qm5TuG/FvUlT0+O/JHZ2drz8xSbs7YufsPD2b8PTH3xr2iahU59H6NSn+BCKUf/71PTve0eOL/b1+h7eTF26B4COvQeZ9rSt29CDJ9/5klOHNLg3aWq6fKOmrZj81TbAsIVDST+7be/pR9t7+pk+dqrlYhZCpl2Ip+ldXW/6fYvSCQszDAGTfSXvjDFU0VhxxaQQuiTD9jTW+Ng2PjfHxZxANTzQwqsRRVX31013Su4PIYQQ1sJqg8krqUmkJyXwflAXrmVnkp93rdhl2vXsz1PvL6awoIC9G77j5wXv0rR9V+p7eLN3w3c4udQhcMTTNGz079S5ho18ycxI5VpOFk7OLvyxaSX1PRtTkJ8HwNVLKcWOU6e+Oy516/Pzgvdo0TmAI7t+JifzCs98tJy8nGwWTxvNkjfHEvLGXFp2CWTz4o9Z9s44HnxmKo3825Q4JMfN2/BiIO38ObyUFvy5dTUF+Xm3DSXBMMH7hYjVLHtnHHOe7cddvR7Er21nMlIukJmRSvuAgXS892H8O3THubYrW775hL6jJmBn70D8Xwc4sPVH6rp5Mvj5t1g/7x0cnZwJfm226dhK207sWf2V+XobK6RdOAcYqlljozfh166LaSJywLCn6DVsLA8+8zqFBflm38fVSynlOkZNIWHknStpA
I1R6269K/RYDiU8Nm+shiyvgvw8MlLOF6vcFHcmJCQEnU4n+0qWgUqlAkB77JiFVyJEybSxsaaKSWt8TjSGpXExJ4iLOSFVkxa27M3FpCWmEBwcfPsLCyGEEMIirDaYdG3ogZ2dPVmXL1HPzYvO9w+n+0OP4eHTjJzMyzg4OuFSpx4Ay2aM56+92+g6cCSPTv4YB0cnmrTswM6V86nb0IM+IS+abldp25nCggLmvTSE3Owssi5f4pUvN+HawJ0GXk1wuGGQDRgqmh6bGsG6z97i97Xf0LxTTx558R08r0/6HfW/z1j1cSgrPw7l6f9bSm7WVXZHLeS4dit16jWk6V3daNr+bjreN8g0Edvdxx+XuvWJ+mQKdd29uHj6Lx57Y26p7x9v/za8+vV2Dmz9kT+3/sjOVV+Qfy0Xn9YdTcNCXFzr8eQ7X/Lrktksf+8FQI+3fxvuDRpPtweCqVXbld7Bz9GyS4ApKAX4T+/B/L7mGwry83BwdALAr21nDu/4iQWvDCfp3D808PKhS/8RODg64eXXkt/XfEPnPkNwbehRLFwt7zGqM2MYmZCQgFqtNn3eGEaqVCqrrAgRlSPt/Dn0hYX4S8Vkmcm+kuXj5+eHoihoY2OJ2rGDoL59Lb0kIcxEXH+utNagSVEUQkNDiYyMZOP8dbzyzVRLL6nGSk1IYe+6aMBQPS9ERZP9OoUQomLY6UvacLES+fv74+atMHVZ9C0vV1hQQH5eLpnpqabBFDezbMZ4atdtwMhXZ5ntD3kzP82fwd9/7KS9agABQ8eYbj89KYE69RqWaRJuXm42Do5OplDucsoFjmu3ojtxiMSTsVy6oKPfEy9xX9Bzpusc2LaGbcsi8O/QnXsGPWE2uKYy6AsLS3X/lORadiZLZ4wnN/MKnfsNpcfDj5uG78T/dZAl08dib+9A94dCaNjIh+yrGZw7fgDdXwfp+/hEeg1/ulzHuJVFYSGcOqxlxYoVNhPiGQNJ4x54YLkwsrSPSVHxEuKO8GPE6wSOeJauA0dyLHozy959jhnrjlHLhidyW+oxqdFoGDVqFIBN/T6oKsbQFrhlaKtWqwkLC0Px8iJ6/vyqWp4QtxW1YwdT5s0DIDo62iorJsHwHG+s3B40YSiDJgyz9JJqnNSEFJZPX0xczAnCw8OtNsi2hMDAQHQ6HWfPnrX0UmxeSEgIWq3Wqn8fCSGELbDaikl7BwecHerg3PjmrZpGo2csuqPbHjJhRomfL9ryfaecatU2+7i+Z2N6PvLkLVsy7+4/grursCKwrKEkGPZ/HPfx9yV+za9dFybN38iOFfPYv0VtqHJ198Kv3d0MHBtWbDBJWY5RnRQNT4xhpKIo8qK5Bko69w+J/8Ty45yppJ0/y5mjMdT38LbpUNKSZF/J4ko6CXK73zUBAQGmvSYDJ0xg9sSJst+ksLiIVauIvF4tGRoaatUhgKIohIeHM2rUKDbOX4927e+ohveSgLIKGKskN85fDxi2p5DXV+aMQ850Op1VP46EEELUHFYbTArb0rCRD8Nf/oDhL39g6aVYPT8/P8LDw9HpdLL/XQ3X5f7hZGWk8evSOWxbbtjKodewsZZdlI2SfSXN3awqOzw8/LahraIozJ4923SfTpk3j6C+fZn82GOVvWwhitElJxPyzjumfSVt5TEeEBBAdHS06XFoDCgB076THr4ellxitRIXc4LUhFTSEg17xRtP/NrCz4oQQghR00kwKUQVs7bqSOOZc1H17OzsCHz0Wf7T5xHWf/422VczGDg2zNLLKrdLFw0/T1W155JxaFRN31fSGEZGRUWZHtNlHZ5lvC+NoUqkWk2kWo3i5YXSqJHhby+vyvpWRA2nS05Gl5yMNo3ZoX0AACAASURBVDbW7POhoaE2FTQpisLkyZNRFIWoqCjTVgp71xUftCgqhgSSpRcfHy8Vk0IIIayCBJNCCABOHdLSorPK0suokep7ePPkO19aehk2SafTmVq4w8PDLbwayzDuHWkMPaBi3pwbQ5Xg4GAiIyNRq9WmwEiIqlIdgqbg4GDTCUmdTkd8fLyplVaUn0qlMg3uErfn5+dn9nwhhBBCWJoEk0LUcFIxKSqasWKyKt4kTpkyBah5+0reaoDWnVZH3o6xBdy4BYUxVDGuQ4iKpigKiqJUy8e08XsTQtg+eQ4UQoiKUeXBpIQgoqJVddtodWMccnHqsEYqJkW5nTpkqMJQqSr/Z8lYKWgre86VV0mt2mC4r42BZGWTUEUIIWybr69h2Ke8HxNCCGEtLFYxKW2jQlgHY4B0+pAWRlt4McLmnTqsASr/RIFGozFVC4aGhlbqsSztVtWRNSGQFUIIIayZnLATQojykYpJYfOqsm20OjLuy3TqsJb9W6Lo9kCQpZckbNSlizq2LTOEZyNHjqzUYxXdV7I6tnverDoyNDQUlUpVLb9nIYQQVUfejwkhhLAW9lV9QGN1lrGqRojyqMq20epKURRTxdm2ZREWXo2wZVGfGPZ7rOzgLCQkBJ1Oh0qlsqoJ9xVBp9MRERFBYGAgkZGR6HQ602P07NmzTJ48WUJJIYQQZSYn8oUQQlibKq+YlLZRUZGqqm20ugsICDDtNTlrdCBBYbNlqwVRaoZKyQhOHTacKKjM1mqNRmOaJrpy5cpKO05VK6ldOzQ0tMIH2QghhBBCCCGENanyYFLaRkVFqcq20epOURRmz55tqkSLCp9C1weCGDBa9q8Tt7Z1WYTpcQiwYsWKSqvo0+l0jBo1ynSc6kACSSGEEFXJ+NySkJBg4ZUIIYQQBhbZYzI0NJSwsDC2LYuQYFKUWVW1jdYUiqKwcuVKU0iybZnhj5u3YvjTWKGhtwQlwlDxfumizrS/Kxh+fip7v8cpUwyP+eDgYJt/zN8YSMowGyGEEFUpPj7e0kuwebJPpxBCVAyLTOWWtlFRHlXZNlrTKIrC5MmTCQ4OJjIyErVa/W8AddjSqxPWqKoCtYiICLRarSkAtVUSSAohhLAGsg2SEEIIa2GRYFLaRkVZVWXbaE1mDH/Cw8PR6XTEx8ebzgrL2eGazdgCpihKlT32NBqNKciz5VAyIiJCAkkhhBCimqhuA/iEEMJS7PR6vd5SBy9pby1pGxU3slTbqBDCOoSEhKDVagkNDbXJIE+j0RAWFmYK9W31+xBCCGH7NBoNo0aNIjg42KZP9gkhhKg+LBpMGul0OlPbqBC3IlVGlhEWFoZGoyE6OtrSSxE1jLHKUKVS2dwUbp1Ox5QpU0xTxI37uMpQGyGEEJZiLAyRPdqFEEJYC6sIJouStlFRlCXaRoW5sLAw1Go1iqJIMCmqlLGqAyA6OtqmAr0b27aNk7aFEEIIIYQQQvzLIntM3oqiKDb15lOI6kytVpsqmaXdR1S1sLAwwLCXrK08L9xYJSlt20IIIYQQQghxc1YXTAohrINxXzyQIUOi6hmHo9lSq5larTY9ZmQfXCGEEEIIIYS4PQkmhRDF6HQ6UwuthJKiqqnVarRarWlPRmsnVZJCCCGEEEIIUTYSTAohzBhDFjAELBJKiqqk0+lMVYe2sH1A0X0w
pUpSCCGEEEIIIe6MvaUXIISwLsbKL5VKJVVfosrZUigeERFhCiWDg4OJjo62+jULIYQQQgghhDWRikkhhElISIgplLSFFlpRvURERNhEKH5j67ZsdyCEEEIIIYQQZSMVk0II4N9QyFb29RPVi0ajITIyEjBUS1ornU5nCvAVRZFQUgghhBBCCCHKwU6v1+stvQghhGUV3SdPghZhCYGBgeh0Oqv++Sv6OJGqYiGEqD50Oh3x8fFotVoSEhKIj483fV6n01l4dZahKIrZ335+fvTs2RNFUaz2eVoIIYRtkmBSiBpOQklhaWFhYajVaqsO+yIiIswqOq251VwIIUTpGLtFjFtz3Iy7j2cVrch6pCWm3PRrxnBy5MiR8rpRCCFEuUkwKUQNptPpCAwMBCRsEZZRNBg/e/ashVdTsqKt2zJ1WwghbJtOp0OtVptONoEhePTw9aB1j7a07tEOdx8PPHxrXhh5o9SEf8PJuJgTxMWcIC0xhbiYE6bPq1QqmxhYJ4QQwnpJMClEDSbDboSlWXMLd9EhNxJKCiGE7St6QhYMgeSgCUNRDQ+8xbXEjVITUti7Lhrt2t9NlZVyglsIIURZSTApRA0loaSwNGv+GbwxlIyOjrb0koQQQpSDWq0mLCwMgNY92vLkzGekKrKcUhMM1ZPLpy8GMA1QNO5LKYQQQpSGBJNC1EDG/fIkcBGWYs0t3EUraqwxNBVC/D97dx4XVb3+AfzDjuAKuMFBNEXczYUYIrumLWauyYSpbd6Wq2WioG32q2taN8OG0muLptc0Awc0tTS3shRnCM0lTcldDqgEKCiyM78/8BwHZXdmzszweb9evbIZzpzvcEJmPvM834eofoxDyeFTR2H41NEKr8i+ZKdn4ZPnPkJORhZfWxIRUb0xmCRqZDjshqyBtbZwc/I2EZF9Mf57ffqK2QgMDlJ4RfYpOz0Lm5dsRPKGJKjVasTExFjs3NK+oZyoXll1k9VDQ0NZ1UpEVoXBJFEjwlCSrIG1tnAb/3xY+k0VERGZh/RBGCslzc+4ctLce06KoojY2Fhotdoav64xTlSX1DZZPTw8HGq1miElESmOwSRRIyGKIiIiIiCKIjcoJ8VI4Z+1tXoZh5IxMTFQq9UKr4iIiO6UtHVNYHAQpq+YrfRyGoXs9Cy888hrZvs9L01Ul6ohOVG9ZrVNVhcEAaGhoYiMjGRASUSKYTBJ1EhYa5UaNS4BAQEArKtil6EkEZF9kqol2cJtWaveWm6Wlu7o6Gi5QtLL1weqMfeyCraBpMnqm5dsBFARUEZGRvI1EBEpwlnpBRCR+TGUJGsQEREBoKJNmqEkERGZk1arhSiKCBkdxlDSwoZPHYXkDUnQ6XQmeTxRFBEVFQW9Xg8vXx8MnzoKqjFhJnnsxsrbzwfDp45GyOgwOaCUKlHZVUVEluao9AKIyLyio6Oh1+shCAJDSVKMTqeT/z+0lr0bGUoSEdmv2NhYAGCApQBvPx8EBgdBFEWThJPGoeT0FbN4TU1ICiiHTx0l79up0WiUXhYRNTIMJonsmEajgVartaowiBon4wDQGjCUJCKyb9IehKyWVIbUYq3X6+/ocaSun8DgIMzd9iH3jzST4VNHy/uwJiQkmKzalYioLhhMEtkpaXNwoCJ4sZbWWWp8rK2Fm6EkEZF9k/YhDBnNyjqlePl6A6gIuRpKq9XKweakeZNNsi6qXmBwkFw5GR0drfRyiKgRYTBJZId0Op38gsKahoxQ42NtLdwMJYmI7B+rJZXn7ecDL18f+Vo0hBRqTl8xm5WSFjJ86mi5DV8K+ImIzI3BJJGdMQ5eGEqSkkRRtKoWbuP1cPIkEZH9Sk9PV3oJBMDbr6JqsiFtwVK1ZGBwEANmC5Pa8KXOKyIic2MwSWRHbq0GYyhJSoqKigJgHS3coigiLCxMXg8nThIR2a+0tDQArJhUmpdvw6scpWpJtuNbnhQGm2p4ERFRbRhMEtkJVoORNZEqHayhhVsURTkkValUiq+HiIjM607ah8l0pIrJhlwPaW9JTuBWhhQI3+nwIiKiumAwSWQHdDodq8HIahhvmm4NIWBUVBT0ej1UKhXi4+OVXg4REVkI9yW0DvUNJjm8SHlStTGDSSKyBAaTRDbOuH1brVZbRRBEjZs1tXBHREQwlCQiamRYMWkd7qSVG7hZcUmWJ4X6/FkiIktgMElkwxhKkrWxphZuKZQUBIGhJBFRI3OnoRgpJzk5GQCvodK4zyQRWYqz0gsgooa5ddAN95QkpVlTC3d0dLQcSiYlJSm6FiIiIqo7Di+yDhXBcKrSyyCiRoAVk0Q2SKvVMpQkqyO1cEdGRirawq3RaKDVaq2iapOIiIjqh+3D1uFOhhcREdUHKyaJbIxGo0FsbCwAhpJkPYxbuJUcvqTT6Sr9fCi9xyURERE1DIcXWQcGk0RkbgwmiWyItGceAMTFxTF0IatgLS3cxtsb8OeDiIislW79Hpw/ehaPTR2Npl7NlF6O1bGFIKy4sBjbv9oMBwdHPPLiY3BydlJ6SSbHPT6JyFIYTBLZAFEUERUVVWmoCEMXshbW0MItimKl7Q3480FERNbm/J/nkPDBGpw+cBIA0KlvZ9wzkr+vqmLNodjvP6bgu4Va5FzIBgCEjr0Prdp7KbwqIiLbxWCSyMoZV4GpVCpOFyarotFooNfroVKpFGvhloJ7oGI6Pbc3ICIia1JeVo6Nn6zDzv/9CEO5AQAQPEKFAcPuUXhlVB/5V65h5evL8OeePwAAzq7OGD0jnKEkEdEdYjBJZMWM95NkKEnWRhRF+f/PyMhIxdYhVROrVCoOuyEiIquSn5uP5VGfIVV/DH5B/nj4+eHo1LczvHy9lV4a1UN6ahq+mLYIORnZ6D24LwZPfBBC9w7wbNlU6aUREdk8BpNEVsi4dRuoCH2UHChCVBVraOGW9l1lcE9ERNbm0tmL+HzqJ8gS/8bjsyPwwFMPwcHBoc7HF+YXQr9+D65kXkGnvp3RZ8jd1R5fXlYORydHUy2djPzx80H8b/aXcHZ1xrRl0QhSda/X8fW5jmWlZXa5XyURUU0YTBJZGePWbe4nSdbKGlq4o6Oj5X1XGUoSEZE1KS4sxpfTFiE7IxsvxL6M0uJSLHlJgz5D+qHX4L5o1a7m9t+ju//A6re+wtWcq/JtA4eH4NkFLwIA9Ov3oH2gHwJ6dcL+Lb9h9ZzlCOjdCdOWRTPYMqGLpzKwfNYXaNqqKaYtjYJ+w178+u1O3DPyXgTe0w0ezT1qPL6m61heVo4dy7cgZHQYWrRpicQP4/Dzqu14+IXHMGr64+Z+akREVoPBJJGVuLVKUq1Wsy2VrJI1tHBrNBpotVo5vCciIrImGzWJuHTmIp7+4Hn0GdIP3767Esf2HsWxvUcRP281WndogzYd26F1hzbofm9PdL2nG1zcXQEApw+cxLLI/6KZV3O8+Okr8O8RgG/fWYl9m5MxNvoJtGjTEj+v3o62ndrjwcmPYvWc5SgpKsHJfX8hZZMOqrH3Kfzs7YPBYMD/XvsSTk6OmPpZJNp0bIeD2/Yh89wlHNp
5AI5OjvDtKsBHaI12nX3R6/4+COjVCQ6OFdWQtV1H1yau2PjJOjRp5gGDwYCfV20HAGxfthn3Pn4ffPzbKPn0iYgshvX+RFZAo9EgLCxMrv6Ki4tj2EJWS+kWbp1OJwejrCgmIiJrU1ZaBt263RC6d5Cnbj/0/HAMefph+HYVAAB/n8/EsaQj2LV6Bz6b+gk+mjAfhdcKYDAYED9vFUqKSjD4qQchdOuA3L+vIOdCNhwcHeTwEnCAeDwN/33xY7g2ccOUzyLh7umOP3YdVOhZ259zf5yBeDwN9467H+27+AEAnnz3GYSOvQ8t2rREeVk50lPTcHD7fvz4+SbETJiP1XOWA0CdrqOhYg4SkjfuhfaDNejY5y48NX8yDAYDjvx6WKmnTURkcayYJFIQ95IkW6N0C7fxVgdxcXEMJYmIyOqUFBajqKAI7p7u8m0+Qms8PjsC5WXliH32Q5w+cBKa/Z8jLysXGzWJSPlBj4M7fodbEzekp4rwCxKwbkE81i24uVXJo1NGyq3DeVm5uJqdJ+972Ll/IHre3weXzl60+PO1V3lZuQAAN083+bbA4CAEBgchLysX742cA78gAa8ui0bm+Ux8NfMzJG/ci8deGYNzf5yp9TpeOlNxrc4ePg1vPx/8a/Gr8GzVFBtiE5GTnm3ZJ0tEpCAGk0QKuDWQlPbIEwRB4ZURVU/pFm5RFOVQkpWSRERkrdybNkHbTu1w6vcT+OPng+j9wN3yfSVFxTCUG+Ds6gwHBwe0aueFLgO7IuUHPZo0a4Lff0xBU69mmB3/fzi1/wQO7tgPBwcH9BrcF93v7QkAKC8tw7XLFXsWPvnuM+jcPxAA0KlvZxzZdYgDVEzEv0cAnJydsGftLwgZdW+l1uqCawVwcACcXJzh6OyEdne1h19XARdPZcDN0x0Hd+yv9TrmZl4BALi6u2LKZ5Fo6tUMANCpT2dcPJVh+SdMRKQQBpNEFiS1oBoHkuHh4aySJJugZAu3KIqIiIiQz69Wqy16fiIiovoY/39P44tpi/Dlq4sRpOoBbz9v5P6di7+Sj6GkqASjIsfJU7QP7TwAR2cndBkYhD3xu1BeWgZDuQFdQ7qha0i32x674FoBYKio3gsZda98u9C9A4oKipD25zl07HOXxZ6rvWrVzgujZ4ZjfcxavD/2HfQa3BeuTdyQdT4Tpw6cgKu7G4a98BiAiirZY3uPQujWAZ4tPHE9N7/263j1OgDgwcnD0O6u9vLt/t07YPvyH1FWUgYnFwbMRGT/GEwSmZkoitBqtUhISIAoigAYSJLt0Wq1irZwR0VFQRRFRaeAExER1VVgcBBmx7+NhPfX4MS+VJw+UFEdOeDRezD0uZtB1PmjZ/Hnnj/Q98H+8GzhiU53d8GxvUexY8WPGPbSiCof27NlUzzz4Qvo0COg0u2d+wUiMDgI1/Pyzf78GoshTz+Mtp3aY2NsAg7u+B3uHm5o2c4Lj7w4AoMnDkXTVhVVjr/G/Yz8K9cwfOooAKjTdez9wN14fFYE7osYXOn20McHQbd+DwrzC+DZsqlZnx8RkTVgMElkJlIgKbW+AhWBJKu9yNaIoojo6GgAyrRwR0REyKFofHx87QcQERFZgTYBbTH1i5o/TLtwMh0A5CE5Q555GEkJv+CHxd+hML8QQ556CM28myM7/W8c3L4f+7b8Bo/mHnj1q1m3PZaDowOmr5ht+ifSyPUc1Bs9B/Wu8WsunEyHo5MjBjx6D4A7u44t2rTEv7d+aJbnQkRkjRhMEplQVdWRwM3WU+4hSbZIyRZuKZSU9mElIiKyJwYD0LZTO/S6vy8AwN3THTNXvYEvpi3CjuVbsGP5Fjg4OMBwY4Sze9MmePRfI5VcMlXBYABCRt0rV1DyOhIR1R2DSaI7JIWRer1e3jsSYLs22Qfp/21BECz+/7I0AZyhJBER2auQ0ffi7gf7V9pL0NvPB28kvotjSUdw9vBpFFwtQMt2rdC5XyA69Owo701J1uPJd56WQ0cJryMRUd0wmCRqII1Gg/T0dGi1Wvk2KYxkdSTZA+MW7piYGIueWxoUJZ2bP09ERGSPHBwc4N60SZW397ivN3rcV3MLMVkHZ9eq31bzOhIR1Y7BJFE9VLdvZHh4OFQqlcXbXInMSWrhVqvVFv1/W6fTYfz48QCAuLg4/lwRERERERHZKQaTRLWoKowEAJVKJVdHEtkb4xZuS1ZLiqIoh5IxMTEMJYmIiIiIiOwYg0myaqIoWryFUwoiAdwWRnLfSGoslGrhNq7SZOhPRERERERk3xhMklUSRRFRUVHQ6/UWaeWsboANwDCSGp+IiAgAlm/hliZwq1QqiweiREREREREZHkMJsnqGO8vJwgC/P39zXIeKYxMSEiAKIry7VIQKQgCK7ao0dHpdIq0cEdHR3MCNxERWbXredfh0dxD6WVQPfG6ERFZNwaTZDVu3csxMjLSpFWKoigiLS0Ner2+2hZtDrChxk6JFm6dTidvn8BKSSIiskYbP1mH7V9txuz4t+HfPUDp5VAd8boREVk/BpNkFXQ6HaKjo+U9JU019KIuLdpqtdri+1gSWaOIiAiIomjRFm5O4CYiImtnMBiwf3MyDOUGlBSWKL0ciygqKELC+2vQ477e6PfIQKWX0yCN8bpJrly6jH2bk5GdngVnF2cE9O6E/g8PhKOzk9JLIyK6DYNJUpxGo5ErGFUq1R21cYqiCJ1OB1EUq62KBMD9IoluIbVwA5arWjSewM1QkoiIrNVf+mPITs8CAHi2bKrwaixj9ZzlOLB1H9p38VN6KQ3WGK8bAOzbnIxv3l6BkqLKYeyu1TswffksuLi7KrQyIqKqMZgkxRgPuAEa3rotBSo1VUWyRZuoZlILd1xcnMXOKU3gjoyM5M8nERFZrV/jfpb/3Bj2Kty5cisObN2n9DLuWGO7bgBw6exFrHx9KQBgwr+fRa/7+6AwvwDfzl2FE78dR/LGvbjvicHKLpKI6BYMJkkRt1ZJLly4sM7t1HVpzwZYFUlUV9I2CpYM8I0ncPNnlYiIrNXR3X/g0M7f4ebhjqLrhXBv2kTpJZlV/pVr+P7T9Wjm1QxXc64qvZwGa2zXTeLs4gxHR0fc/dAA3DtuEACgeesWGP/2U3hv5Fv4+3ymwiskIrodg0myqIZUSUpBJIAq27MFQYBKpeJekUQNYDx4ZuHChRY5pxRKcgI3ERFZs6LrhYifuwr+3QPQ9Z5u2LlyK5xdqn77ZDAYAAAODg6Vbj9z6BQObtuP1N+O4e+zl9ChV0dMXzH75nHlBly/eh2eLTyrX0dBEb5+Yxk69w/EkKcfBgBkiX8jZZMeZWVl6HV/H3Tsc1eltfy8ajtO/X4Czq4u6NAzAPeMCEUz7+a1PmfPlk3xwa+xyBb/xgfj3q31661RY7xuEm8/H7yR+C7cPN0r3f73+UsAACfuMUlEVojBJFmM8ZCLmgbc1DQ9WzqW7dlEpmHcwm2JYF+j0TCUJCIim7DmnZW4evkq/rVkOvZvSQYAODhWBFgnUlKRl5WLAY/eg4unL2DR8zEoLyvH7Li30aq9F0
7u+wvf/N8KuUKteesW6NC7E3yE1vLjG8oNmDdqDnKzcvH+ro/hWs3ef1+/vhSHfzqIMPU/AAA/fvE9tny+EWUlZRX//fkmPPnuMwgLvx8AsP6jtfjp623y8fs3J2Pb0s14av5k9PpH31qft7unO4quFwEAHJ0c6/U9swaN9bpJ2nX2ve22gzt+BwDc1a9LnR+HiMhSGEyS2dWlSrIu7dkMIolMS6PRWLSFW6fTyR82xMTEsMKZiIis1q5vdmD/lt8w4d/PwjfQD7//6FBpovGhnb9j3+ZkdA/rhaWvLkZu5hUAwKZP1+HpD57H3sRf5XDrpcWvotc/+txWlVdwrQCXzl4EABRdL6oy4PplzU4c2nkAI14Zgx5hvfDTym34ftF6dA3phtEzwuHs4oLPX/kEO1f8iLDw+3H5Yg52fbMDPkJrvPzlTPj4t0bm2UvYvnwLVr35FV5PfBet2nnV+vwLrxcCANw83Br2DVRIY79uVSkuLMYfPx+EWxM3dBnQtUGPQURkTgwmyayqq5LkPpFEyjIOCS3Rws0J3EREZCvOHTmDdR+txcDhIbh33CAU5hdWhFUGA3bH78I9I1WAASi8VoBF/4xB5rlLeGr+ZOyO24UjvxwCAIyc/jguX8zBiZRUrHpzGR58bhj+MelBuDW5GfRJA1m82nujmVcz/LnnDxz+6SDGRqvh5uGOP/f8gcQP49BnSD888tIIXLl0GZs+XQfPFp4YPPFBtGzTCmcOncL13OtoHdAGAHD018MoLyvH2FkRaN2h4ra2ndph0nvPAe89V+fvQUlBMQBUWw1ojXjdKjOUG+Dg6IBtS39A/pVrGD51VKPZa5OIbAuDSTIb4wE3Utio1+vlcELCfSKJLM+SlYuiKCIiIgIAJ3ATEZF1u5ZzFcsil6C8tAynfj+Btx+ahSuXLsNQXrEX4W8b96L7vT1wNScPpcWlSDt2DmOjn0DI6DAUF5Yg/r1VKMwvRKt2Xpi+YjYO7TyApLW7sGnReuxcuQ3/mDAU9z/5AJq2aiafMzA4CPlXruGrmZ+j6HohOvbuhA49O+KrmZ+jTcd2eOY/z8PBwQHblm1GaUkpWndogy9fXSwf7+TihFGR4wAAeVl5AAAfweeOvg/FhRXBpLOryx09jqXwulUoKylD4oJvsX/zb8jPza90X2lxKXIzr6BFm5Z3dA4iIlNjMEkmd2vrtnSb8X6RbM8mUo60z6P0YYC5RUVFyS3jrIImIiJrdT3vOj5/+VNcybwMJ2cneLTwQKe+XeDfvQPOHTmDvYm7Me618fDxbyO3AIeOvQ9Dn30EAHDX3RX79108lSEPNek7tB/6Du2HzHOXsP2rLdi69HvsXPEjJr73HPoPC4azqzMMBgOWzfwMRTfap3d9swP5uflwcXfBlCXT4eZRMcjkwLZ96DOkHybH/Av7NifjREoqWrZpiQGP3oP2XfxuPIeKMKqkqOSOvhdSMOloA8NSeN1uip+3CnsTdyOgVyeEjuuGX9fslK/ltmWb8cuanzA2Wo37nhh8R+chIjIlBpNkUlqtVh6mcavIyEgGkUQKs3QLtzSBW6VScdgNERFZrbysXCx+YSH+Pp+JlxZNQ8/7K+8t6OTijL2Ju3HhZDo69rkL169eh0dzD4yJfkL+mvad28PZ1Rl//XYcHfvchRWzvkAz7+YY+epYtAloi4lzn8XwKaOwas5XWPnGUvj3CIC3rw9+26QDAAQ/psJx3VGIx9Pg4uaC6Stmw9vvZgXd9bx8lBaVwMnZCSGj7kXIqHtvex7lZeXy87kTJUUVYZaTs3UPv+F1q+zCyQwAFdPV929ORnFhMUJGh+Hh54cj7dg5bP3yB8TNXYWcjGy5WpOISGkMJsmkqpqibXyf1DIqCAL8/f0BAH5+fnI7NwD4+/uznZvITKSf0cjISLP/nHECNxER2YofFn+Ha5ev4eUvZqLLwNsHhEhTnbPTswAAE959Btfz8uHZwlP+GkdnJzzw1EMoLqiYaO3R3AO7Vu/Abxv3otPdXeDm4YaSwmKk/XkeZSVlyM+9hqHPDcO6j+Ix7MXH8ODkR3H28Gkkb0hC6Lj70aFHQKU13HV3FxzbexQnUlIRGBxU5fPo2Ocu/PrtT3B2u7MW7NIblXsezT1rZb+OgAAAIABJREFU+Upl8bpV9tT8yUj4TxxOHzgBAAhT/wMRb02Eo7MT2nZqh34PD8Q3b6/A8b1HGUwSkdVwMBgMBqUXQfZDFEXodDr5z+np6UhLS5P/WxTFOj9WVSFmSEgIQkNDGVwSNYC076slqheNB19x2A0REVlSWFgYRFHE4iNf1fmYU7+fQNtO7SrtIXirXat3oMd9vdCmY7s6P+6fSUfw88ptuHAqA3nZefBo5gG/bv64f/wQ9B3ar86PAwBnD5+G5un/wLWJGx6f9QT6PRIMF1cXpP+Vhv2bf8Ohnb9j4AgVAgd0RVdV99umSddHXlYu9m/5DYMnPiiHe/Wl/y4Jq+csR2RkZL22cgkICICXrw/mbvuw1q/ldas/Q7kBxYVFcqt5dRp6/YiI6ovBJFmcFE6mpaXJf5ZCTOPbqwsx+cuRqP5EUURYWBgA8weFDCWJiEhJDQkmbcWfSUew8rWlyL9yDQDg4OAA6e1cq3ZemDRvMoJU3ZVcomzzkg3YvGQjYmJi6rWndX2CSVthS9dNwmCSiCyFrdxkccaVkLWpKsS0xLAOInsTFRUFwPxTsUVRlPeZ5QRuIiJSgiAIEEUR2elZlfb6swc9wnrhvR0f4cDWFFw6cxFlpWVo3aENut7TrV4VgWRZvG5ERNVjMElWrT4hJhFVTavVygNozP2JNydwExGRtcjJyLa7YBIAXN1dETI6TOll1Co7PRtA/V/HS8GyvbGV60ZEZGnWPWaNiIjuyK0VjObECdxERGQNpL3JSVknUlIB8HrYqpyMioFBLBAhInNjMElEZMcs1cLNCdxERGQtxo2rmDas/y5J4ZU0bg0NtqSvlyZpExGRfWMwSURkp6QWbkEQzNpWrdPpEBsbCwCIiYkx23mIiIjqQqrQkyr2yPKkUPhO9obPycg21XKoARraik9EVF8MJomI7JBxC7c5w0JO4CYiImsjCAIEQUBORhar7hQWEhJS72NUKhUAVkwqja34RGQpDCaJiOyQ1MKtVqvNFhaKoiiHkpzATURE1kT6nbR5yUaFV9I4Sd/3hrw2kILJ5A1sxVcS95gkIkthMElEZGeMW7jNWS0phZ+cwE1ERNZGGvjGdm7L03+XhJyMLKjV6gaFWlKFntRKTJZnilZ8IqK6YjBJRGRnLNHCzQncRERkzQRBgEqlQk5GFjYv2aD0choVqdJRGkJUX8bXjsGyMqTve0Na8YmI6stZ6QWYglarhSiKAAC9Xg+9Xq/wiuhW0qelUjtHSEgIQkND2RpAZGIREREAzNvCzQncRERkCyIjIzF+/Hjov9uLwOBuCAwOUnpJdm/zkg04kZIKlUp1R69DwsPDK97XfZfE66YAKVzmNj1EZAkOBoPBoPQiGkqn0yE6OloOJY15+foosCKqjrRHiTFBEBAeH
s4WUCITkQbRCIKApCTz7MvEYTdERGRLNBoNYmNj4eXrg+krZsHbj+8RzOVESio+eW4BACApKemOChBEUURYWBivmwI2L9mAzUs2Qq1Wm7X7hohIYpPBpCiKiIqKkisjvXx9oBpzLwKDu8HL15u/uKxUdnoWcjKykZ1e0ZYhfRInVV2xepLozoSFhUEURbMFhtKbBIChJBER2Qbj9w0MuczHOJQ01WuE6OhoaLVahIwOw1PzJ9/x41HtstOz8M4jrwG483CZiKiubDKYlPY2kwLJ4VNHK70kaoDs9CysnrMcJ1JSzVrhRdQYSH8vmvPTbekckZGRrHQmIiKbIYoiIiIiIIoi3z+YgVRhB1Tsb22qgSnG12341FG8ZhbwyXMLcCIlla/1iMiibC6YNG4jXHzkK4VXQ3fKOJxkuwBRwxj/vXju3DmznIPDboiIyJaJogitVovY2FgAQMjoMKjGhHH/wjuweckG6L/bi5yMLAiCgJiYGJN3Uxi3dDNQNi8plORrPSKyNKd33333XaUXUR/jx49HXl4eJs2bDKFbB6WXQ3fIo7kHAoODsGv1DuTl5aFHjx7w9/dXellENkX6ezEuLs4sPz8ajQYJCQkQBAFbtmwx+eMTERGZW/PmzeXQTK/XIz01DckbkqD/bi+aNPNAwdUCtnjXIjs9S/6+ffLcRziRkoqCq9flbZl69uxp8nM2b94cAPDz9p/kSdGBwd1Mfp7GLDs9C/8J/zfSU9MgCAKWLl0qf9+JiCzBpiomtVotoqOjERgchOkrZiu9HDIhbrJM1DDS/kvm+nSbw26IiMjeSNWTCQkJlYZoevn6wNvPu9IQTW8/byWWaDWy07ORk5El/9uYWq1GZGSkRfYhlIYYARXXafjUUVCNCTP7ee1ZdnoWkjckyW34rJQkIqXYVDAp/UKaNG8yfxHZGWmjZf5CJKo749DQHBuUM5QkIiJ7Jooi0tLSoNfr5X+oeoIgIDw8HEBFKGnpwSjGe04ClQegsiW/bqQhpDkZWXIgCYB7ShKRopyVXkB9pKenK70EMhOpdcb4U2siqplUORAXF2fyNweiKCI6OhpAxYtVhpJERGRvBEGAIAjy7zjpdWhaWlql/26Mr0+l1xXSv63hdYA0LFOqdtXr9TfCtY1ylatxQNnYq10lUrWr1AovkYJmJUJmIiJjNhVMSi8S+ImYfQoMDrrtFyZRY6PT6ZCYmFhra5RGo5GH0ZjjzUJUVBREUYRKpeIn6ERE1CjcGsaRdVKr1VCr1RBFETqdTg4pASB5Q1YtRzduxlWvfH1HRNbCpoJJ6dNKbkxt33Q6nVV8KkukhMTERGi1WowbN67aN0Y6nU6ully4cKHJ18AJ3ERERGTtBEGQQ0rgZmu+9GfjfzdG1lj1SkRUFZsLJo03oyb7UnFtWTFJjZtOpwOAGqdrS6FkTEyMyas6pEpMQRDMEnoSERERmYPUmk9ERLbFUekFEBHRTVL7dHUvrI1buKUKAVMxrsQ0R+hJREREREREZIzBJBGRldBqtQCqr5Y0Zwu3KIqcwE1EREREREQWxWCSiMhKJCcnAwBCQkKqvF8KJWsbjNMQUVFR8mMzlCQiIiIiIiJLYDDZAFcuXbb4OU8fOImykrI7eoz01DTsWL7FRCsiIlOT9pesKhg0buE29RRF42E3nNBIRERERERElsJgsg5KCouxPmYt3hoShRkDp+CdR15DzoVs+f6LpzJw5tAps65h7fvfYN+W5AYff/lCDha/sBC/bdKZcFVEZErS5MhbqyFFUaxULWlKxsNuOIGbiIiIiIiILInBZC2KC4sR++wC7InfhQGP3oOeg3qjrLQMO1dslb/m4ukL2BibaNZ1NGnaBBdOZjTo2LLSMnw18zNczblq4lURkalI+0tWNdDGXG3Wtw67ISIiIiIiIrIkBpO1SPwwDhkn0zHtq2g8PisCz2umwi9IgH79HhgMBgCAs6szMs9dMus6HJ2dcOViToOOPZFyHAVXr8Pbz8fEqyIiU6luf0mtVltrm7UoitBoNNBoNHU+n06n47AbIiIiG6XRaBAWFqb0MuiGiIgIXg8iogZiMFmDnIxs6NbvwZiZ4ejY5y759j5D+qGooAjX864DAFzd3ZCbeQVF1wtNdm7x2Hmkp4qVbisrvbnH5NnDp7HqreX4+o1ltT5Wt9CeePv7+ejYt7PJ1kdEppWWlgag8v6SoigiOjoaQPUt3DqdDmFhYYiNjUV6enqdznXr4zKUJCIisi2xsbEQRVHen5qUo9PpoNfreT2IiBqIwWQN9q7bDRcXZ4SMrvzp19Bnh2Ha0ih4tvAEALi4uwCAHFTmXMjGL2t24mp2nnzMF698iv/N/vK2c6T8oEdeVi5yM6/gg3HvIC8rF4d2HsCHEXOxYPxcZPx1M5y8dvkqfl61HfNGzUHMhPnIychCp76dYSg31On5FF4rgLOLc/2+CURkEXq9HkDl/SVra+HWaDRy1aNKpapzO3ZUVBREUeSwGyIiIhskbf8CQN6ShZSTmJhY5Z+JiKhuGEzWIOt8JnyD/OHu6V7pdndPdwSF9pD/262JGwCgpKgEZw6exEcR70H7/hp8MW0RDAYDykvLcGzv0dsqKosLi/HtOytxaMfvKC8vR3qqiIPb92PVm8vg11WAg4MDdq/dJX/9iZRUJH4YB4PBgBdiX8b0FbMxaPwDcHB0qNPzKS4ogtstz4WIlFfV/pJSC7cgCLeFh6IoIiIiotJAnLoOrjGewM1hN0RERLYnISFB/rM0OI+UY1wlyYpJIqL6YzBZg3adfXHhhIi8rNwav87ZtaIK8VjSESx+4WP4+LfBsJdG4Ozh0/hzzxFczbmK0uJSdOxTuZU6ecNeFBcWw7ergPwr+QAA7ftr4OXrjciVryN07H04d/h0xTmcneTjhj47DH0f7F/v51NSWAIXN9d6H0dEluHn5wegcqv1rVWQUuu2FFrGxcXVuerROOxcuHChaRdPREREZie1DUtEUaxUQUmWpdVqK4XDbOcmIqo/BpM1GDT+ATg6OWLxix8jOz2r2q8rLigGUBEqBoX2wIyVr2H4lFFo0swDJ1JS4ebpDgcHB+Rk3HwM8dh5bPqkotTfo7kH8rKuAACaeTfHlM8i4e7pji4Dg5BzoWLgjaHcgA49O0Lo5o8tn2/ElUuX6/18iguL4OjES05kbaTBNyqVCsDNFm61Wl2phfvW1u2kpKQ67w+p0+kqhZ3GLeNERERkG6pqFTauoCTLkl7DGWN7PRFR/XDDwRp4tvDEzNVv4otXPsXcEW+iz5B+6NjnLuReuoK8rFz0GNQb94wMReGNFm2v9t54av5kON6obuw5qDdOpqTCfWY4+g8Lxt7E3XB0doK7pzt++WYnfAMFnDtyBkd/PYymXs0BAGNnPYFW7bwAAB16dsS1y1dxdPdh/Jl0BKqx9+GRFx7DR+PfQ8yE+eg5qDcee2UMmvu0qNPzKSkqgZOLU+1fSI2KKIpIS0uDXq9Henq6PIRFFEWbbw8yDt8EQYC/vz+AisnXxm3TSpM+WQ8NDa1U1ShVS4qiiKio
KLlCIjIysl57Q4qiyAncREREduDWajyVSgW9Xg+dTsff7wqoqlrV1l8/ExFZGoPJWrS7qz3mbJyHlO/1+G3TXvy0chuKC4ohdPNH01ZNAQCu7q5wdHbCU/Mno0kzD/nYh/75KGImzIfBYMDT7z+PFq1b4sD2fXBycsLQ5x7BsBdHYtOi9RCPp2H8O0/j8oUcDBweIh/fukMb9AjrhaL8Irg3bYIHJj2E1h3a4OUvZ2Ll68uQlPAr+gztj56DetfpuRQXFqNJ0yam/QaRTZLafvR6faV2oKrYcmXdra010nPVarWIjo6GIAgIDQ3FuHHjFH0xLw2iAXBbC7dOp5NDRSmsrO9aq6vAJCIiItshtQ2r1WrodDr59YP0eo6/4y3LeI9w6c/S9dBqtVb1ITgRkTVzMBgMdRvpbAUCAgLg5euDuds+VHoptym8VgD3KkK/k/v+QucBgXBwqNuAmuqUlZbByWifyZKiEqT/lYaOve+q82Mc1x1Fi9Yt0b6L3x2txVxWvbUcyRuSWNFlRqIoIjY2ttKnu4IgQBAEqFQqqFQquarQlgPJW0kBZVpaGkRRRHJyslwlKlGpVNVOvzYnKSRVq9XymtRqNWJiYqDRaOR2oIYOq+GwGyIiIvsQHR0NrVaLmJgYxMbGQhRFJCUlISwsDIIgICkpSeklNirSa6y4uLhKnSnjx4/n6y4ionpgxaSJVBVKAkCXgV1N8vjGoSQAuLi51CuUBIBuoT1NshayPbe2AguCgPDw8Hq1A9syKWSV/i19gi1VjiYkJECv12P8+PEQBAHx8fEWC2aN9yaSWrgjIyPlF7tA/Vu3JRqNRn5MvjgmIiKybcYVetIHl9KHy2zntjzpdZrx99zf35/Xg4ionjgJhcjOiaJYaYpzZGQkkpKSGk0oWRNBEDBjxgzEx8fLA2FEUURERITFJipK55HebISHhzd46vatjyu9abl1sjcRERHZFuNQ8lbSdjBVDcYh86juekhBMYBat0siIqIKDCaJ7JhOp0NYWBiAm1OcGUjeThAEqNVqxMfHIzIyEqIoIjo6GhqNxuznNt4HUxCESq3b9Zm6fetjctgNERGR/ZAmb48bN+62+6RwzFIfqtLNyduRkZG33SddD05LJyKqG7ZyE9kp46Ep0p6FVDOpghKoeMGZkJAAlUpltmDv1kmOUkjZ0NZtiTTsRok9M4mIiMj0pCq8qn6vSx+wpqWlWXpZjVZoaCjS0tKq3PpHuh5ERFQ3NjX8JiwsDKIoYvGRr5ReCpnBJ88twImUVCQlJdnV4BWlSD8vdxpyNVbGk7vNteektIm9pKFTt41x2A0REZH9k17nnTt3TumlECqGtALg9SAiagCbauWWgoETKakKr4TMITs9G4B9TYNWSkREBERRhEqlYijZQGq1Wm7rjoiIMMs5jFuuamrdDgsLu626sirR0dEcdkNEREREREQ2w6aCSamFgexTTkYWQ0kT0Ol08mbbCxcuVHg1tk2tVkOlUkEURbPs22Tcul1dkKjVaiGKYqXp3VXR6XRyeMm2fSIiIiIiIrIFNhVMSk6kHFd6CWRi2elZAKreN4fqx3gSM4PeOyNNMQcqqhFNLSkpqdaBRDVtdi8x3k+Uw26IiIiIiIjIVthUMClVTOq/26vwSsjUVs9ZrvQS7IIoinK1JDfdNo3Q0FCzVU0KglBreCxdz+rCRuMJ3He6PyURERERERGRJdlUMCkFBDkZWVj1FoMse3EiJVXeN1SqTqOGkaolGUqaVnh4OICb319LkVqza7qe0gRulUrF605EREREREQ2xaaCSeDmnnknUlKh/y5J4dXQncpOz8LmJRsAsPXYFKSKPnMFvGVlZSgtLUVJSQlKS0tRXl5ep+PKy8vl40pKSlBWVgaDwVDrcQaDAWVlZfJxpaWldTrO1KQqRGlPSEuRgtDqricncBMRERFZB76PISJqGJsLJgVBQExMDHIysrB6znJsXrJB3p+QbMvmJRvwziOv4URKKqu9TEQKzkz9wujy5ctYtmwZhg0bhrZt28LNzQ1+fn4YM2YM4uLicO3atSqPMxgM2Lp1K/75z3+iS5cucHd3R6tWrRAWFoaPPvoI58+fr/acR48exTvvvIMBAwagefPm8PT0RPfu3TF9+nTs3bu3zqGoKQiCYNYhONWp6XpqNBpO4CYiImqkLP1hKRERkbk4K72AhlCr1RBFEbGxsdi8ZCP03+1FYHAQACAwOAjefj7w8vVWeJV0qxMpqcjJqAiR9d/tlf+sVqs5RdgE6tL22xBnz57FAw88gLNnz1a6PTMzE5s2bcKmTZvQr18/bNmyBW3btpXvz8/Ph1qtxpYtWyodl5+fj+TkZCQnJ+O9997Dt99+ixEjRsj3l5WVYdWqVXjxxRdRUlJS6diTJ09i0aJFWLRoEWbMmIEPPvgAbm5uJn2+1VGpVNDr9dDr9RbZx1Gj0QCo+nrqdLpKQ46IiIiIiIiIbJFNBpMAMGPGDKjVami1WsTGxiJ5Q0XIlbyB7d22Qqp+5bAO00hOTgYAhISEmOwxz5w5g379+iE3N7fGrztw4AD69OmDP/74A23atMG1a9fQp08fnDlzpsbjrl27hnHjxmHFihWYMGECAGDBggV46623am3Zjo2NxdGjR7F582Y4OTnV74k1gDx868YwGnOTpnHf2sZtPOyGE7iJiIiIiIjIltlsMAlUBFtSQJmWlgZRFOVwJi0tTeHV0a2kYEelUsHf35/7sJiJqb6vOTk5GDJkSK2hpCQzMxMjR47Erl278MEHH9QaSkqKi4vxzDPPYODAgUhLS8Pbb79d5/0nd+7cieXLl+OFF16o07nuhL+/v9nPIdHpdBBFESqV6rbrKQ27iYyMZChJRERERERENs2mg0mJIAjym3fuU0iNlRTGmypAW79+/W3t27X57bff8Pnnn2PBggX1Oq60tBQfffQR0tPTUVZWVufjysrKMGXKFIwdOxY+Pj71Omd9SX/HWGJPp8TERAA3p4FLjIfdzJgxo9rjpf0nFy5cyA8AiIiI7JC09zUREZGts4tgkohMG5iVlZUhLi6uQcfOnj0bpaWl9T5uxYoV9QolJWVlZdi3bx+GDRtW72OtlTRgx7gisi7DbkRRRFRUlMXazYmIiEgZ/PDRuiQlcTsxIqKGsotgUqvVyqGMNJyCrIv0wkkKWkJCQhAaGsoXVGZgiu+pwWDAgQMHGnRsQ0JJAA0KJSXHjx+3SDApCILZqxOkv8/UarV8LWsbdiOKorzfrrTOmJgY/nwRERHZKf6Oty68HkREDWfTwaROp0N0dHSVQYGXr3nbOql+pGskTY7WarUQBAHh4eE1tqSSMgwGA3JycpReRp3Z0lprc+sQI51OV+OwG41GUymQ5M8UERERERER2QqbDCZvbVf08vWBasy9CAzuBi9fb3j7MZS0RtnpWcjJyEZ2ehZOpKQieUMSYmNjkZCQgPj4eH7SeIdEUTTZ99DBwQGtW7dGZmamSR7P3Nq0aWOR80gVk6b8Xt/KuI3beAL3rcNubv17MDIykoEkERERERER2RSbDCa
lN+NSIDl86mill0R14O3nA28/HwQGB0E1JgzDp47C6jnLcSIlFREREdybxYo4OjoiNDQUGzZsqPexTZo0QUFBQb2Pc3V1RXl5eYNawbt3717vY6zRrW3cERERAFBp2E11bduc0E1ERERERES2xlHpBdSXTqeTK4TmbvuQoaQN8/bzwaR5kxEYHARRFBEdHa30kugGR0fH2yZC19WSJUvg5uZW7+MiIyMxcuTIeh/XokULBAcH1/s4a2Tcxm08gVsadqPRaBAWFobY2FgIgoDIyEgkJSUxlCQiIiIiIiKbZHPBpBReTZo3WeGVkClI4SRQETpLbaykvDFjxqB37971OmbYsGF4+umnMXfu3Hod16xZM7zyyiuYNWsWXF1d63yci4sLvv76azRv3rxe57NW0h6soihWmsAtiiIiIiLkKkkpkGTrNhEREREREdkymwompTZHqRWY7IO3nw+GTx0FURSRmJio9HLohqZNm2L79u113r8xICAAa9asgaOjIyIjI3HPPffAwcGh1uM8PDzw448/wt/fH6GhoVi2bBmcnJxqPc7R0RFPPPEERowYUaf1WTsplBQEoVIAKVVJSkFlXFwcA0kiIiIiIiKyCza1x6Q02TlkNENJexMyOgybl2xEWlqa0kshI23btsXhw4cxevRouc24KsOHD8fq1avRqlUrABX7Re7evRtvvPEGPv7442qP8/Pzw4YNGzBgwAD5tokTJ8Lf3x+jR49GXl5etcfGxMRg2rRpcHS0qc9XqiV9f6W/51QqlVwhzmnbREREFb8j09LSoNfr5a2dpKF0RKQ8aTikIAjw9/eHn58fVCoVtx0iohrZVDCZnp6u9BLITKRJ6nxhaX3atm2LX3/9FZ999hlmzpyJ8vJy+T5XV1csXboUEyZMgLNz5b9OXF1dsXDhQkycOBFr167Fnj17kJmZCRcXF3Tr1g0jRozAqFGj4O3tXek4R0dHDB48GEePHsWMGTOQkJBQ6f6BAwdi6dKluPvuu6td88mTJ5GYmIjDhw+joKAAzZs3x8CBAzFhwgR4eXmZ4LtielLFpITTtomIiCqIoojY2NjbfldKBN8WFl4REVVFei8nbUsk4YfsRFQTmwompWq6wOAghVdC5hAYHIQTKalKL4NucfXqVbz//vtYsGBBpVASAIqLizF16lRkZmZi2rRpVQ696d+/P/r371+vcx47dgwvvvgi9uzZc9t9+/btw8SJE/Hf//4XgwcPrnRfXl4e3n33XSxevBglJSWV7lu5ciVmzpyJhQsX4vnnn0eTJk3qtSZzquqNFqdtExERQR4GJwkf2QeCbwuEDgiAamCAgisjouqIGbkQM65AvJAL7cbD0O8/h9jYWCQkJDCgJKLb2FQwKX0CI1XXkX3S6XQMY6zE9evX0bdvX5w5c6bar8nPz8esWbOg1Wqxe/fueg2vqcqhQ4cQHBx8W7Bo7M8//8QDDzyAJUuWYMqUKQAqQsmePXvWWHVbUlKCV199FZs3b8bmzZvrtAemJRhXhfITZSIioorX/VFRUTc7CF4ahBkv3a/wqoioLgTfFnIlc/jIPhAzcqHddAixX+yW91Lna10iktjU5myiKMLLl6GkveK1tS7l5eWYMGFCjaGksZSUFLz++ut3dM7Tp09jyJAhNYaSxqKionD06FEAwAsvvFDnrQC2bt2Kl156qcHrNDXjtm1O2yYiIoIcSgq+LZD0/SsMJYlsmODbAjNeuh/xX04CAMTGxkKj0Si8KiKyFjYVTBJZC1EUq93nyF7s2LEDGzZsqPPXGwwGaDQa7Nu3r8HnjI2NRU5OTp2/vqCgAKNHj4ZWq8XatWvrfJzBYMCqVausZk/TpKQkBpJEREQ3SO3bUijJPSSJ7INqYACSvn8FAGrcN5aIGhcGk0QNEBERgejoaISFhSE6Oho6nU7pJZlUeXk54uPjG3Tsd99916DjSkpK8PXXX9f7uFOnTuHll1+u93GFhYVYv359vY8zB0EQ5CmGREREjZlOp5M7CRa+O1Lh1RCRqQm+LbDw3xU/21JbNxE1bgwma3Hl0mWLn/P0gZMoKymz+Hmp7uLj46FWq+XKyfHjxyMsLMxuWhLKy8uxffv2Bh27a9euBh2n1+uRm5vboGP//vvvBh3X0BCViIiIzEMKKiJfGsThNlamhO9PyETCR/aBakAARFG0uwIPIqo/BpO3KCksxvqYtXhrSBRmDJyCdx55DTkXsuX7L57KwJlDp8y6hrXvf4N9W5Lv+HFOpKRi8QsfY/a90zArdBqW/CsW6alpJlghSROTk5KSEBkZCZVKBVEUERsbaxdVlAaDARkZGQ06tqEhYV33sjSl06dPW/ycREREVD2pWpJ7SlqXr9fuR7ewBdDtO6f0UshOzHhpEABWTRIRg8lKiguLEfvsAuznRCGiAAAgAElEQVSJ34UBj96DnoN6o6y0DDtXbJW/5uLpC9gYm2jWdTRp2gQXTjYsFJLs35yMRc/HID31PPo9PBB9H+yP1OQ/EfvsAhReKzDRSkkQBMyYMQPx8fFISkqqtorSWvYyrA93d/cGHefs7GzR890JT09Pi5+TiIiIqibtNxc+so/CK6FbfbvuAEpLy5Fz+brSSyE7Ifi2BACbfJ9ERKbFYNJI4odxyDiZjmlfRePxWRF4XjMVfkEC9Ov3wGAwAACcXZ2Ree6SWdfh6OyEKxfrPgCkKod3HYKh3IDpK2bjyXefwaT3nsPkmCkouHodJ/b9ZaKVkjHjKsqYmJjbqigjIiJsZoNnR0dH9OrVq0HHduvWrUHHDRgwAA4ODg061tGxYX+VBQcHN+g4IiIiMr3k5IqOoVC2cFuVI8cv4s+/Kt7/tGzRROHVkL0QfFuwnZuIADCYlOVkZEO3fg/GzAxHxz53ybf3GdIPRQVFuJ5X8emgq7sbcjOvoOh6ocnOLR47j/TUyp8UlZXe3MPl7OHTWPXWcnz9xrI6P+ao6Y9jwtxn0a6zr3ybs4sTAKC8jPvDmJMgCFCr1XIVZWRkJARBgF6vt5mBOY6Ojnj88ccbdOzIkQ3bqL5z584YMGBAg44dMWJEvY9xcHDA5MmTG3Q+IiIiMr20tIoth1QDGExakzWJB+Q/N2/mpuBKyN6oBnYAcHMLByJqnBhM3rB33W64uDgjZHRYpduHPjsM05ZGwbNFRcuni7sLAMhBZc6FbPyyZieuZufJx3zxyqf43+wvbztHyg965GXlIjfzCj4Y9w7ysnJxaOcBfBgxFwvGz0XGXzfDyWuXr+LnVdsxb9QcxEyYj5yMLHTq2xmGckOdno+3nw9Cx95X6bYD2/YBADr2vquqQ8gMjFu9Y2JibGZgjoODAyIiIurdXt22bVuMGjWqweedMmVKvY+Jjo7G4sWL4ePjU6/jOnTogP79+9f7fERERGQebOm0PsdOZCJ+w0G5q6V5M8tvvUP2y/9GOzcRNW4N2wzODmWdz4RvkD/cPSv/snX3dEdQaA/5v92aVHxKWFJUgjMHT+LLVxfjas5VpHyvR9Q3b8JQVo5je4+i+709Kz1OcWExvn1nJcZGP4Feg/
siPVXEwe37sTE2EX5dBVw8fQG71+5CxJxJACoG15xISUXbTu3wQuzL6PvgnQUohdcKcGjnAXTsfRdatLHdXwARERHQ6/UQBKHGr6vufn9//xqP8/Pzq9fj1Xaf8flCQ0MRGhqKcePGQa/Xy//ExsYiISFBvi80NLTGNVpKQEAAtFotxo0bh+Li4lq/3sPDAz/88AO8vLwafM5nnnkGW7ZsQWJiorx9Qk369u2LefPmwc3NDZ9++ikmTZqE8vLyWo9r0aIF1q1bxz0miYiIrJDg20LpJRCA8nIDXpv7A5o3c8ewB4KwZt0BuLlW/fbxwqU8XMy8in69K7+WLiwqxc5fT2Drz6k4/OcFXPr7Gl58OqTScKPsy9dhMBjg41X967LColL8KzoBTZu6YfEHYwEAFzOvYs26A7h85Tr69xEw6pEecHK6WXezJ/kMPv78V6SezESA0AoD7/bHtOfvQ2tvvv6zFkL7ip/19PR0hVdCREpiMHlDu86+OPLLIeRl5aK5T/Uvhpxv/DI+lnQEGzWJ8O0qIEz9D/z4xff4c88RCEH+KC0uRcc+nSsdl7xhL4oLi+HbVUD+lXwAgPb9NWjfxReRK1/Hho+1OHe4YkKws7OTfNzQZ4c1OJTMEv9G5tmLAICU7/UouHodo2eMa9BjWQt/f3/o9fpaP1Gv7n5rbROQqii1Wi0EQUBSUpLSSwJQ0SK9YsUKPPfcczWGk56enti2bVuDW7ElTk5O+Oabb3DlyhXs3LmzxnCyX79+0Ov1cHV1BQA8+eSTcHFxwcSJE2tcq6urK9atW8dqSSIiIivDiknr8snS3Th0NAP//c9YnDl/GQDg7FwR/K374Q/sOyji/bcexZHjFxHxwipcyy/G3NcewTMRA2EwGDBfsxNr1h1A/vWK12WCbwsEdWkNdzcX+RypJ//GiIlfQfBtiZ/X/6vatcz+9/f4OekUYt6t2L7n67X7MF+zE4VFpTf+ez+27DyOLxeGy4/7/AwtCgpL0LZ1M1zLL8bXa/djw49H8cm80Rgc1rnacxERkWUxmLxh0PgH8NPKrVj84sd4adE0ePtV3RZaXFDxi1X7/hr0GdIPz39c0Xr6y5qfcCIlFZ37B8LBwQE5GVnyMeKx89j0ScUkb4/mHrh8Y7BNM+/mmPJZJNw93dFlYBAObNsPADCUG9ChZ0eUl5Vhy+cb0eO+XmjZtlW9nk/CB99i1zc7brt9gyYRQ555GP2H2ebQj5iYGMTExFR7f00vaKV9i+p7XHX31fTJXl3OVds5a6sKtZQJEyYgJCQECxcuxGeffVbpPldXV7z66quYNm0aOnToUO1jlJSUYPfu3RBFEU2aNEFwcDA6duxY5de6urpi69at2LVrF+bMmXPbXpwBAQGYP38+Ro0aJYeSknHjxqFv37748MMPsXLlSpSWlsr3NW3aFFOmTMG0adNqrZwlIiIiZbBa0jr8qjuNT5fuwcRx/THi4R5YsmIvAMDxRkv3qbPZ+Cbxd7zyzzD8a1YirheUAAA0X/yKSeH9kXExD0tXVwwz6tq5Nb6KfQId/G7v2krLuILikjJcySuodi3x3x3Chh+P4uknBkI9qi/ivzuEt/+zFd0C2+CtyKHo3NEb0e9+j60/p+J8+hV08GuJL1fpUVBYgvdeH4ann6j44FzMyMXir5Kw6Ks9DCaJiKwIg8kbPFt4YubqN/HFK59i7og30WdIP3TscxdyL11BXlYuegzqjXtGhqLwxtAbr/beeGr+ZDjeqG7sOag3Tqakwn1mOPoPC8bexN1wdHaCu6c7fvlmJ3wDBZw7cgZHfz2Mpl7NAQBjZz2BVu0q2l479OyIa5ev4ujuw/gz6QhUY+/DIy88ho/Gv4eYCfPRc1BvPPbKmBqrOSUXTqZj1zc70KqdF3oM6o2D2/ahla83VGPCkH48DSvfWIqkxF/x4icvw83DvvaJaWjLtaVIlZHp6emVJnQLgoDw8HCoVCqraeU2dvLkSZw6dQqOjo6VWqXLy8tx5swZXLhwocpg8vTp04iJiUFcXBwuX74s3+7g4ID+/ftj+vTpmDBhApycnCodV15ejoMHD1Y6RpKfn4/jx49j6NChaNasWaX7HBwcEBgYiGXLluHjjz/GsWPHcO3aNbRs2RI9e/as956ZRERERI3NxcyrmD5nA4K6tME7sx4CABQXVwzPvHa9GF6tPFBUXPHh75MvfQMxIxefvj8ax/7KxJIVe3Hk+EX07emL+W8+iv98+hP+OvU3Xn3zO0x/cRAeuCUQ9GrpAQDo26M9AGDDj0ex76CI/4t6EC4uTtDtO4c339+Me/p1wDvRDyHn8nX8O2Yb3Fyd8cKkEHTp5IMLl/KQcTEPjo4O8GhSUY25J/kM7grwxqTwmx0ygm8L/Oft4eb95hERUb0xmDTS7q72mLNxHlK+1+O3TXvx08ptKC4ohtDNH01bNQUAuLq7wtHZCU/Nn4wmzTzkYx/656OImTAfBoMBT7//PFq0bokD2/fByckJQ597BMNeHIlNi9ZDPJ6G8e88jcsXcjBweIh8fOsObdAjrBeK8ovg3rQJHpj0EFp3aIOXv5yJla8vQ1LCr+gztD96Dupdh+fhi+FTR0O/fg+SNySh84CuePbDF9HMqyLEefCfj2Lx8wuRqj+GPkP6mfi7SFXR6XRITEy0qTASAMrKyrBgwQK89dZbVbZVl5aWIjExET/88AO+/PJLTJw4EY6OFS0+hw4dwtChQ5GdnX3bcQaDAfv378fTTz+NjRs34ptvvpGrH7OysjBs2DDs37+/yjVlZWVh3rx5+Pbbb7F161Z07lz1J97NmzdHSEhIlfcRERER0e1KSsow9bV1yLl8HQ8P7orP/6fDiTNZ2JN8BgBw/6gl+CJmHP7Oqtia6mxaDma9PBijHumJoM5tsGTFXly4dBV9ewKTwvtj+NBuWLl2H9b/cATPTotD965tMfnJYIx+tCfcXJ1RfmOwZ1hIJ5xNy0HknA0oLzcgdGAAugW2wUvRCWjt3RSfffQ4nJ0dsWTFXuRfL0bPoLaIemdTpbVPez4MPl6eMBgMyMrJR//ereDo6GDZbyAREdWbg6EuEyasREBAALx8fTB324eKrqPwWgHcmza57faT+/5C5wGB8tS6hiorLYOT0T6TJUUlSP8rzaTTtIsKiuDq5goHK/plveqt5UjekIS4uDirDerqQ6qOjI2NlW+TwkhBEKBWq016voCAAJPvT/m///0PkydPrtMgGicnJ+zYsQODBw/GsWPH0LdvX5SUlNR6nIODA4YOHYqtW7eivLwcKpWq2lDyVoIgIDU1FR4eHrV/sQlIw5eSkpKsogKXiIjIngQEBEDwbYGk719ReimNksFgwOy5P2DthkPybc7Ojujc0RtFRWU4m5aDeW8Mw1PqAYh4YTX0+89h1CM9seiDMQAqhuX0HBSDfz2jwvQXB932+LuSTmHp6mTsST6D9m2b4/OPxsHR0QEjJy3HG9OHYOOPR3E09RIAIKR/B1zMvIqsnHwkrngG3QPbAABUjy5CB7+WiF86Cdt2/YVdSafQpIkLh
tzXBfeFdAIA5F8vRo/7PkLPoP9n787Doi7XP46/h30REFFQGBUXwAUXcAPRcsmwxd0JTbM0tfJ0DI506nT6ZWW7mNhilppbizqoueG+lWzimguCiiIDriCg7Azz+4Mzkygq4DJA9+u6uoKZ73J/hxFmPvM8z+1C5K8TH/bDJu5D7P4Ugib/hEqluutyXUKIuk1GTFZDRaEkQOuung/k+DeHkgDmluYPNJSEv7qLiwerojASwM/PDz8/P0JCQoxUWdUlJCTw6quvViqUhLLRlcOGDePo0aO8/vrrlQoloexF8I4dO9i2bRtJSUmVDiWh7PEeM2YMq1evvu8PBIQQQggh/q602lLemhGJet0RBge2p2tnJe29GtOhXWMsLczYtieJiSFqQ9frG7mF2NpY8H/TnjAcw8REQTsvZ6LjU3hjcm+OnbzI/322mVfG+RPY15M+Aa3oE9CK44mXmPQvNRPeWMmGnycA8PnXuygt1fFkH0927j1N3MHzmJgoWDBbZQglAa5l5eHcsB4KhYLAvl4E9vW67VpKSsqWHbqSkfswHzIhhBAPiASTQtynO4WR+tGRtSmM1NPpdPz6668UFhZWab+srCyCg4PZuXNnlc/3wQcfVLim5L389ttvJCUl4eV1+wtTIYQQQghxb+9+uhn1uiNMHNuDd0P63/aBb0ZmHgAn/jei8fP3nuHCpRycG9Yrt91rL/U0NMpxcrQh/WIOr4RG4NrYnpbNnTA1NSEjM5cLl3KwtjLHxtqcAY97suOPU4ROeZx/TuzFmshjrNrwJ+NHd6N/b49yx+/u24zfY5KJ3H6Sp59oU+G12NpY0MrdiaJi7QN5bIQQQjxcEkwKUQ11MYy8WWlpKevWravWvqtWrarWfrd2366Ko0ePSjAphBBCCFFNxxMvMXvGYIY/U/F69o/3bImTow22NmVrgnu3aYx3m8a3bffEYx488VhZmNjExZ7dv73GMvUB1m0+wdGECxQUlFDfwZohA9vz0qhu1Hew5vuwEWTnFNDAsWxpnmFPezPsae8K63gnuD+x+1MIfnctqelZqAZ1xMHeCs2FbDbvTGT9lhMUl2hZ/PUosu/S6VsIIUTNIcGkENUQEBBg+LquhJE3Ky0tJTEx0dhlVNrJkyeNXYIQQgghRK21btn4u97fxMWeuM1TMTc3vet2t7K2MmfyC35MfsHvjtuYmpoYQsl7aevhzM/znue1N1fzSfgOPgnfgYmJwtBEx8LclBeDutLMrT641a9SrUIIIYxDgkkhqkij0RAcHAxQp8LIW5WUlBi7hErTamWqjhBCCCHEw1TVUPJh6e7TjOjI11m3+TgJpy6Tm1eEq4s9Ph3c6O7bDCtLeYsrhBC1Sa36ra1UKtFoNMYuQzwkmelXAWjatKmRK7k7pVJZpwNJKOuU3bJlS5KSkoxdSqW0aNHC2CUIIYQQQohHxNLCDNXgTsYuQwghxANgYuwCqkKpVAJwKr72TDEVlZeRlgH89XMWxmNiYsITTzxx7w0r0LFjx2rt17RpU5ydne+9YQU6dKh4PSQhhBBCCCGEEELUXLUqmPTzu/PaJKL2y0y/KqFkDWFiYsLo0aOrte9PP/2Ep6dnlfebNWsW48fffX2jivTo0UOCSSGEEEIIIYQQohaqVcGk3ql4aXRR12SklU3j9vf3N3IlQi8gIIDXX38dhUJRqe1NTEwIDw+nQ4cOfPvtt1U6V6dOnRg4cCDBwcG0bt260vs5OjoSERGBmVmtWpVCCCGEEEIIIYQQ1LJgUj9iMva3aCNXIh60n9790dgliFsoFArCwsJ48sknMTW9+2Ln5ubmjB49mn/84x8APPHEE8ydOxdra+t7nqdTp07s27cPOzs7GjduzObNm3Fzc7vnfvXr12fhwoUyylYIIYQQQgghhKilalUw6e/vj5+fH5npV1n2Xwmy6opT8YmGdUP13a5FzWBpacnGjRv54YcfMDc3r3AbGxsbfv31V5YuXVpu5OJrr73GkSNH7jqt+4033iAuLg4LCwvDba1atSIpKYmgoKA77ufr68uhQ4cYNmxYNa5KCCGEEEIIIYQQNUGtm/84a9YsAgICOBWfSOxvUfgNDTB2SeI+ZKRdJXLuWgDCwsJk9FsNZGpqyoQJExgyZAj79+/nxIkTXLt2jYYNG9K+fXu6du2Kg4NDhft6eHjw/vvvM23aNC5cuGC4XaFQMGTIEEJDQ7G0tLxtPxsbG5YvX86HH37IkSNHSExMRKvV0qpVK9q0aUOnTp3uGJTm5+cza9YsIiMjOX/+PMXFxdjY2NCiRQuef/55XnzxxTvuK4QQQgghhBBCiEen1gWTSqWSsLAwQkND+endH8lMv0qPIQE4uTU0dmmiiiLnriVy7jqgbJq+SqUyckXibpycnAgMDCQwMLBS2+fk5DBx4kTUavVt9+l0On777Tf27NnDt99+e8dGO56enlVqpBMdHc3YsWM5e/bsbfedO3eOXbt2sXjxYubNm4e3t3elj/so6UeKrlixwsiVCCGEEHWHVluKQqHAxKRya2fXVmfOZZBzvQCfDvdeFkc8XHd7zh1PvMR7n29mYL82PD/cB1sbiwqOIIQQfw+1LpgEUKlUaDQawsPDiZy7jtjfovHo5gWARzcvnNwa0sDVychViludik8kM72syU3sb9GGr1UqFWFhYcYsTTxgOTk5tG/fHo1Gc9ftrl27xtixYzE3N2fEiBGVbrRTkaioKHr16lWp7QIDAzl+/Dj169ev9vkeptjYWGJiYqQZlBBCCPEA7Nx7mv9+sonCwhLW/zQBtyYVz/SoC+YtiWHX3tPEbZ6KqWmtWrWrTrnXc25v3Fn2H9aw/7CGz7/eRbfOTent14LH/FvS3sulyq+JtdpSSrSlWFpU/+19bl4RVzNzaa50LHf77qgz7PjjFB++FXhfr9WFEOJOamUwCRASEoJKpUKtVhMeHk7c2rKQK25tlJErE5WlH/0q4Uvd8/HHH98zlNQrLS1lzJgxdOjQAS8vr2qdr6SkhDFjxlR6+/T0dHr37s2BAwfKrW9ZEwQHBzNq1ChCQ0OJipLfZ0IIIUR1nUvN5L3Pt7In+gwANtYWdX7EZGmpjisZueTmFWFvZ3Xb/Z99tZM90cksDH8O18b2Rqjwwdq8M5FP5+zkw7ee5PGerYxdTqWfc6OHdWbTjpMcOpqGTqdj/+FUouPP8fnXu2jmVp+JY3ugGtwRG+vyr1PTL+aQcOoSJ09d5njiJaLjz5GVXYBOpwPAwd6Kls2dMDczxdbGnGlTHqdD2yb3rDs5JYMxr/2CWxMHIhaOK3ff51/vIuHUZUJeeYwGjjbVfWiEEOKOam0wCWXBlj6gTE1NRaPREBcXB0BqaqqRqxO30ndV9/Pzo2nTprKeZB116tQp5syZU6V9ioqK+OKLL1i4cGG1zvnZZ5+RkpJSpX0SEhI4fvw4Pj4+1Trnw6Jv8iWjJoUQQojqW7flOG99GElefhG2NhZ0aNuE/4b0p4lL7Q/j7qaoSAvAF9/sprhEy7WsfEpLdQwKbMeQge3Jyi7gRNIlXg5eydpl
47EwNzVyxfcnv6CY82nX+Mfba1i3bAItmzcwWi1Vec7Z21nR3suFQ0fTmP+lioDu7sTuT+GPuLPsjT3L9C+2sujXeJZ+O5rCwhLWbj7G5p2JnEq+WuG5A/t64Vjfmgb1bTAzM0GhUKDT6UhOybxnMJl+MYdRk3+mpKSULz8cXO6+6zcKOXn6Mn16tpRQUgjx0NTqYFJPqVQaQi5Zp1AI41q9ejWFhYVV3m/p0qWEh4djZ2dX5X03bdpU5X20Wi0//fRTjQsmAUMwGRsbK8GkEEIIUQWlpTo+/3oX85bE4NrYnumhAxg5qCNmZnVzWnNuXhHvz9xKQtJlUjTXyLleAMAy9QEAnBxtaefljNn/pnV/9n9PM/Tp9uw/rKGgoLjWB5PDnvamU3tXdu09TX5BsVFqqO5z7uLl61hbmfOYX0vMzEx4vGcrw6jPQ0fTmPLWaoaOW0RDJ1sST1/Brp4lzwxoS0B3d1q5O+Ha2IGRE5biWN+aH2aNrFbtOp2Of76zhqzsfH79fgzN3MovcxQdf47SUh2jh9e818tCiLqjTgSTQoia4+jRo9Xar6SkhGPHjlUriDt//ny1zrlr165q7fewqVQqwsPDiY2NNXYpQgghRK2h1Zby6pur2Lo7id5+LfjuixHY1bOs0jHyC4r5dfUhklMyaa50JGhopwqnRANcuJTDxcvX79ho5vCxdDZsOwFA/94e+HdtXu7+vXFn+XLe7ySevkxzpSNdOzflnxN70cjJttL1Hjmezsq1Rwzfm5qaoNWWMj10AE/1b1PhaD2/Ls3x61K+lmtZ+azbcpzUtCycGtjSs1tzOrV3veN5dTodB46k0aJ5A5we4ki6q5m5bNpxku2/nyIh6TL5BcV88+nQctO2WzZvQMvm3atVZ0lJKdnXC6p9DffznEs4dZl2Xi7lAszSUh3pF3NIu5CNk6MtRxMuENjXi6f6t+EFVRcaNij/3LCrZ0k92+ovS7R1dxL7D2t4e2o/unS6fTbbxm0J2NWzpF+v1uVur+rzRQgh7kaCSSHEA5WXl/fI9y0qKqrWfhkZGVXavrLrZj4oj/p8QgghRG32zcIotu5OYlBgO8JnDGHwCz9yJSOX4c90IKC7Oz26NLtrc5D4Q6n84+01XLpy3XDb0pX72fDzyzjYW7F641H2H9bwyX+f4tjJiwRNWsaN3CI+fCuQF4O6GvYpKSnlPx9HlgsM5y+LY3roACY8XxagJZ6+wsQQNfkFxbg0suNGbhFLVx5g7ebjzPloCH0CWpFfUMzMb3ezbvNxcvOKcaxvzdtT+zI4sL3huH5dmvPhW4G4NLLDp4Mry9cc5st5v9Ovd+tKTVsvLdXxU8RBvvhmF9dvlJ/x0rWzkpnTn6VlcyeKi7W8ErqKKeN70rWzkpD/W8eayGM0davPlhWTqtRVespbq7G0MGP2jMG33bc76gw9u7uTl1fEtOnr2RV1Bq22FBMTBa3cnVC63r1xUVXqXLryAGFzd5OdU0AbD2eWfD2Kxs5Vm7lT3edcbl4R6RdzyMrOZ/ALi7C3s+RKRi5nUzIpLCoBykLmSS/04L/B/e/YdCY3r6hKQfatDv6ZBpStjZl+MQenBjbodJCVnU9qWhbb9pyih29TzP83srayzxchhKgKCSaFqCOUSmWNCLLup9O1o6PjvTeqgI1N9T7ldnOreITDvTzs9VGVSqWsMymEEEJUQdKZK8yZvxefDm58+eFgzMxMuHj5OhnX8pi3JIZ5S2IwMVHQ2NmOVu5OPObfkiED2+PSqCyIOpF0iZemLsfU1ISw95+lT0Brvl8aw/xlcWzZlchzQzpx5lwGP686yOsvB/Dqm6vIyy+bOjz7+98ZO9LX0AX7zQ82sHrjUZ7q34bgyb0pLtEy4Y2VzP8pzhBM/rAslvyCYma8PZBxz3UBQJOezTcLo/h6Ydl1vDR1OUcTLjLoyXa083LhRm4hIf+3jry8YkYN6wyAiYmiXCjqWN8agKzsAmha8WM1YsIS3pjUm8f8W/LlvD18vaCs2Z5/1+Z8Mf1ZzExNiNyewLeLohk5YSmrF7+IW2MHdvxxiraezsQfTmVN5DFMTU1ITcti+ZrDvDym4hGLt8rOKWDjtoQKQ9PzaVm8+M/lfPvZMPLyi9n++ymgbLr29NAnDdd2q0/Cd2BtbU7IK48B3LNOfXi5449TODesx/BnOpCZlcfVzFzOnMvgRm4hgX3v3ZDxfp5ziaevoNPpyM0r4sjxdBzrW9O6RUOGPNUe96aOdPZ2o7O36z0D3/yCYizu0Yk78fQVPpmzg6sZufQJaEVpqQ4zMxP6BrSmb69WLPg5juVrDrN8zeEK99+77xzpF3NwbWxf6eeLe1PjrfUphKh96kwwqVarDaGMfm02UbPowxx9yNKjRw/8/f2lCc4Dog8mNRqNUR9THx8fFi1aVOX9rK2tad++/b03rEDLli05d+5clfcbMGBAtc73KIwcOVLWmRRCCCEqKXL7SbTaUl5+vrth3cTlP4xl+++n2LwzkSPH0zE3M8XU1IS4A+f5I/Ysi36NJ/KXiTjWt+bdTzdzI7eIKeN70qWTkus3Ckk+lwlAPduyqbn6kWyjX/kZTXo2X30yhISky8xdFM2xkxfp1N6V32OSWb3xKM2Vjowd6UsDRxti9qdw/UYhTW9av29v3FlaNndi7Ehfw21KVwc++7dSxgkAACAASURBVL+nKS3VMWryTxxNuMjP3z1PD99mAPyy+hAlJaV8HL6Dp/q3wcH+9inmZmam5WqtyNmUTHZHn+Ex/5ZYWpgD0LObO0u/GWUYGTdxbA+e7OtF/+Hz+HpBFB++FQjAhq0JnEsta6jy3RfDGThqPn/EJlc6mDx9tqx5S3ef21PTTTtOAmBubsqwvl4cTbjAz6sOsXbzcUq0pbw+IYA2Hs637Zd2MYfkcxmEvPIYRcXae9b53eIYdvxxir4BrZgXNhIry7K3xDdyi+geOAcrS/NKBZP385xLPHMZgNApjzNyUMdqN2UyMVFwh8GUBr+uOURMfAqFRSUcO3nRcHthYQnvBPdn8/JJLF15gPNp18rtd+hoGtk5BVhamBrOUdnny6wPBlXreoQQf0+1PpiMiYkhNDS0wpFiDVwbGqEicSf6n5FarTb8X6lUMnLkSEJCQoxZmniAgoKCCAkJQavVVmm/0NBQLC2rtg6U3tChQ9m5c2eV9jE3N+fFF1+s0j6PMvTVh5ERERHy70MIIYS4h9y8smVdikv+ev3h2aoRnq0a8dKorvQeNBenBjZsXTmZkpJSPgjbxtKV+9n+exINHG04cERDey8X5i6KZu6iaMMx/Lo0I7CvJwBXruYCZdNe3/xHHwYHtserlTNzF0Vz4dJ1OrUv64btYG+FtrSUMa/9YjiOubkp7wT3B8rWPbyamYtvB0dMTG5PlQ7+mUbcwfN88O9AQyip1Zby0/8a2uRcL2Duomj+80a/2/bV6XRA2XRyvdNnr3Ijt4jO3mVrADZwtOH02bLlbPTrG740qqshZNJTNnGgvoM1mvRsrmTcMFy7a2N7fpzzHM4N69HbryVnz2f
e7UdTjv7nc2sQl3Etl28Wlo3E6+ztirm5KTPeHsgLqi4sXr6fjdsT2LA1gT49W/LymB709mth2LdBfRt2pp6mpKS0UnWeSLoEwIA+noZQsqhYy6dzdpCbV1Tu2HdzP8+5lNSyEFA1uFOVp4/fzMzU5LYp1bd6/80nmR46gMtXb1DfwRoThYJLV24YpsV7tGzIjLcDb9vv2TE/ctU6l60rJxnWWa3s80UIIaqi1gaTGo2GadOmGUZGNnBtiN/Qnnh0a0MDVyec3CSUrIky0q6SmZ5BRtpVTsUnErc2ivDwcCIiIlixYoWMnnwAUlNTjfo4Ojs7M3PmTP71r39Veh9ra2teffXVap9z0qRJLF++nOjo6Htv/D9PPvkkzZo1q/Y5HzaZzi2EEEJUnn+35vywLJY5P+ylZzf3cmFP4ukrFBQWY21VNtrLzMyEhg3KloFp4mJPxPo/sbI0Y9WiFzl99iprNx/nxo0iuvs2ZXBge8MU7YuXy9aeHBzYntdfDgDKQh0bawsST1+mraczRxMu8Mbk3rz2kj8R6//kyPELuDWx59kB7fBoWfb+JC+/mJKSUnLzKg6UMq6VBaBN3cqCo9y8Iv77ySaOJ15izAhftu1JYvHyeAYFtsO7TeNy+zZyqgeUTZnW+/zrXWRm5bHqx7IPZBs2sGX/4VS02lLs7Sxv2x7KAs7wH/7g8tUb/HNiLy5eLgv8zM1N+TG8LOyDshBx5x+nKSrWVqrDd3OlIwqFgq27k5g6qRe2NhakX8zh5eCVho7iOdcLDVPsPVs14pP/PsX0NwewasNRFv68j7FTfqGbT1O+DxuJk6MNDRvYkJdfxLGTFykoLLlnnWNG+LJ1dxLvfb6FtZuOY2ZmwoEjGsO+r4yr3Guu+3nO7TuUCnDfncTNzEzIyLz3Gu0KhcLwmAL3XKsTIP1iDt5tG5dr/lTZ54sQQlRFrQ0m9aGkPpB8esoQY5ckKsHJrSFObg3x6OaF39AAnp4ymJ/e/ZFT8YkEBQURFRVl7BJrraZNm9aYJQxeeeUVtm7dypYtWwyf3N+Jg4MDq1evxtW1+p38rKysmD9/Pv379+fixYv33N7X15dVq1ZhZlb5X4ExMTEAjzQg1AeTMp1bCCGEuLt+vVozdqQvP0UcpN/wefTs5o6ZmQlnzmWQdOYKdvUseeuffQ3bR+44iY21BV06KZn/UxylpTqKirR0aNuEDm2bVHiOG7mF2NpY8H/TnjDcZmKioJ2XM9HxKYZO0QUFZYHUC6ouvKC6/Tj60YxXMnIrPI9vRyU21hb88z+/0aFtYxJOXSY7p4DgV3oT8spjvKDyRTVxGc9NXMasDwbxVP82hn3108WPnbzI00+04cjxdH6PSS63DmVjZzviDp5HoVDw7IB2fDpnJ+99voUTSZdo0awBFy7lsG3PKU6fvcqE57sz7rkuhvUeXwrqSltPF8OxvNs0prCohIN/puHX5d4f+DZxsefFoLJRkIFB82nT2pl9h85TUFBi+Pmt2vAnb0/tx6Jf49l/RMPkF3rQqb0rzw/3YfSwzmzZlcRr/17Fvz/YwMLw52jsXDb6sqCwmBu5Rfess7dfC34Mf45P5uzg2MmLmJqa4N6sASdPXcajZUN8O1ZuDfL7ec4t/HkfACmp12jRrPrrMSoUCjQXsigpKS3X3ftBKC7RcvmmRlBApZ8vQghRFbUymIyJiTEEMB9u/dzI1Yj74eTWkLEfTTCEk6GhoYSFhRm7rFqpR48eqNXqGhFi2djYEBkZyT//+U9+/PFH8vPzK9zOzc2NdevW4evrW+H9VdGuXTtOnjxJnz59OHbsGCUlt6+tZGZmxsCBA4mIiKjytHFjNBby8/MDqDGBsxBCCPEw3W8jv4/feYqO7Vz5bnE0u6JOY1fPkibO9oROeZwXg7oaRn5Fbj/JyVOXeW5IJ6ytzOnu05TdUWcIm7ubGW8PvOPxP3/vGS5cyjGMwtN77aWezF0UjUfLhjjWt+bXNYcZPdznjoGTrY0FrdydDOsh3qqRky1zvxjOh2HbOHwsndYtGvLZuwE8/URZANnW04UFs1W8+uYqpk1fTzsvF5oryxoIerRoiIO9Fd/+GMWayKOkX8zBpZEdk8f5GY7/7JPtuHTlBiYmChzsrVj6zWjCf/iDlWuPkJtXhJOjLb16uDPrg0GG6d/9e7dmeugAVIM7lavVv2tzuvk0vePoz4p88O9Amrk58s2PUcTsP8cTj3kyeZwfbT2cKSrWGqZct2jWgE/Cd7Bh6wnaeDjj3LAeCoWCc+czKS3VcS277PXl4z1b4t60AV6tnKnvYFWpOvsEtKJPQCvD/dk5BXTsM4uune7QMegOqvuca/i/Ttoujerd7fD31LCBLcXFWkq0Dz6Y7NXDnYKC8q+nK/t8qSzNBZn2LYQAhe5ew5lqoICAADQaDWM/moDf0ABjlyMegIy0q0wPfAulUklYWJjRg7XaSKPREBAQgJ+fHytWrDB2OUDZtI60tDTWrVuHWq0mJSUFGxsbunXrxsSJE+nUqRP16t3fC7JbFRUVceLECZYuXcru3bvJyMjAzc2NAQMGMG7cOJo3b16lkZJ6oaGhqNVqwsLCUKkqGP7wEOh/pkqlsk6OJs7Ly8PS0hKtVktpaSmWlpaUlpZianrvqWBCCCHqnqCgIGJjY4na8HqlpppWV/j3fzD7+99ZMX8sfl2aU1BYwhMjvyc1LYvnhnRi6sReKF0dyMzKZ3fUGX7bdIzjJy+xbtn4e9b1U8RB/vvJJpq42PP+m0/ymH8LzMxMOXnqMuu3nGDbniR6dnPn1Zf8yc7Jv+PozMooLCqhpKT0ts7NEev/5D8fReLWxIGhT3nzYlDXO3a0vpVWW2qYul4TXLiUw3eLY4iOP4cmPRsTEwVOjjY81b8N40d3q3bTmFsdOZ7O4BcWMXP6szw3pNO9d6iiW59zmdfyOHMug24VNAGqirPnM9FqS2ndwjjLmN3v8yVi/Z9Mm76e4OBgWVNdiL+xWjdiUt99Wz8VWNQNTm4NeXrKYCLnrmPVqlUSTFaDfl1JY4zsuxOFQoFSqWTKlClMmTLlkZzTwsKCzp0707lz5wd6XGNM5b55ncnZs2fXqRdsV69eZd26dWRkZGBhYYHif+0eW7duzfXr12ndujUeHh5YW1tjbm5u5GqFEEI8Spr0rIcaTJZoS2nv5UJ3n7Kpx1aWZqz6cRwTQ9SsXHuElWuPYGKioLS0bPyGQqHgMf+WlQr3xo70LeucPXs7r4RGoFAoyi1rU9/BGv+uzWnmVh9u6tJdHZYWZlha3H77yEEdGfqUd7VG0NWkUBLKpn7rO4I/TOf+14zGp0PlpnFX1a3PuQaONjRwtLnv497PNPAHoaY9X4QQtVOtCyb1oUuPIRJK1jU9hgQQOXcdqampxi6l1pJmKQ9HTEyM4XfPo24sNGvWLAICAoiIiEClUtWZBlHr16/nm2++wdTUFDMzM7KysqhXrx7Xrl3D3d2d7OxsvL29effdd2nVqtW9DyiEEKLW07+Oedj+OTGA8aO7leuI7dLIjnXLxvN7TDLR8S
lcy86jYYN6tPN0pmc39yqFSC+N6sqgwHas3njU0H25eVNHuvs0pUPbJhV24n7QHvS03rpIp9MZPhg9dz4Tu3qWtHJ3eijnqug5JyBmfwrw6F9fCyFqlloXTKalpRm7BPGQ6Dup16QRf7XNyJEjiY2NJTw8XILJB2jVqlUAj2wK982USiUqlQq1Wk14eHidWINVp9Ph6uqKiYkJRUVFjBs3jo4dO5Keno6zszNpaWloNBpOnTqFlZUVpaWlmJjIGywhhKjr9Gsrq9f/iV/X5g/tPGUjDW9/G6RQKHi8ZytDE5v74eRow6SxPe77OOLhmPrOb8TsT2HF/Bdo2bwBSclX8W7T+KEFh3d6zv3dxR4oCyblfYsQf2+17rejfjSdRzcvI1ciHgaPbl6cik80dhm1lr+/P0qlUkZNPmBqtRqA4OBgo5w/ODgYtVpNTExMnfi5KhQKPD098fHx4eDBg7i4uPDYY4+hUCjQarWYmJhQUFCARqOhUaNGEkoKIcTfRNOmZevt6cMKIR6W82lZXL56g5ETljB+dDd2/nGa4c94G7usv5XY/Slo0rNRKpUyYlKIv7la925PP5pOP7pO1E369fxE1SiVSkN4ph/lJ+5PaGgogFGnUeubQmk0GsLDw41Sw4PWokUL3nvvPTp37sxXX33Fhg0bKCwsxNTUFIVCgbW1NR4eHlhYVLB4lhBCiDpJv7ayJj2biPV/GrscUYfN/1LF0Ke8ybiWR9jcPeTlFzEosL2xy/pbUf/v37ixPvgXQtQctTKYbOAqoWRdJT/b+6cfTadWqw0j/UT1xMTEGH20pJ6/v3+5Rji10b59+0hPTzd836hRI5555hmys7PZuHEjmZmZRqxOCCFETaD/ezv7+9+NXImoyxo52TLn4yH8/N3zeLZqRMgrj+HXpZmxy/rbiN2fYvjwwRhLJQkhapZaF0wKIe5OP7oOIDw8XNbsrCaNRsOoUaOAsjdJxp5icvNo2IiIiFo3qjg1NZWVK1fy/vvvc+HCBXQ6HVZWVgwaNIgZM2Zw7tw5vvzyS7KysoxdqhBCCCPSfxCnSc9m2vT1xi5H1HG9erRgm3oywa/0NnYpfyvT3i/7t23sD/6FEDWDBJNC1EEqlYrg4GA0Gg1BQUESTlaRRqNh2rRpQNlC/CEhIUauqIy/v79hSndoaGitCifr1auHlZUVO3bsYNmyZdy4cQMAc3NzevToQYsWLdi/fz9JSUmGfXQ6HVqtluLiYmOVLYQQwghmzZqFUqkkYv2fMnJSiDpEk55N0KSf0KRn16jX2EII4zJ9//333zd2EVURHh6OtZ0NfV8YYJTzZ126hlU960d6zuRDp7Fv6ICJafVy5NzsXM4ePsOZg6e4npGDpbUlljZWD7jKB+PPnYdIS0xl5MiRhgXQRfU0bdqUEydOcOLECbZu3UpgYCD29vbGLqvG04e5J06cQKlUsmnTJmOXVE779mXrH23dupXY2Nha83O1sLDAxMSE9PR09u/fz5EjRwBo0qQJ1tbWFBQUsHv3bnx9ffH09ESn03HlyhV27drFkSNHiI6Oxt7eHicnJyNfiRBCiIfN3t6ewMBAtm7dytZdx4hY/ycOdla083IxdmlCiGqa/f3vTJ4WgeZCdo18jS2EMB6FTqfTGbuIqmjevDkNXBvy4dbPH8n5iguK2PDNb+yPjCMvJ4/SEi3TN31KgyZlb44vnkkn/0Y+LTq1emg1fKb6gL4vDKDH4J5V3jf7chafjnifG9eul7vdvWNL+ox9gi5PdUehUDyoUu/bsv/+SNzaKJYvX17rOw/XBBqNBrVaTXh4OEqlkpEjR8onk3ehVqsNzW6USiVRUVFGrujOZs+ebfi5rlixwuhTzSvr2LFjzJs3j927d9OiRQvs7Ozo2rUrCQkJJCcn89prr9GjRw/i4+NZunQply5dIjMzk1atWnH58mVef/11VCoVtra2xr4UIYQQD9mtMz+Urg6MHNQR/y7N8eva3MjVCSHuRpOeTeyBFFLTswj//g/D7SqVyrDslBBCgASTd1VUUMScl77g4pl0Ap57nMz0DA5vO8Djz/dH9c7zABzedoA9v+zgjUX/fmh1zBn/Bc07tGTov0ZWed/863nsWradtgHtsXGw5dqFDE7sPcbOJVtRKBR8Ef0V1nY2D6Hq6pFg8uHQh1iABJQVmD17NhEREYY3PrXhBdOtoXNtCicB9u/fz+LFizl27Bh2dnZkZmYSFBTExIkT+eijj4iPj+f8+fMEBgYyYcIEDhw4wMqVK3FycuL777/Hzs7O2JcghBDiEdBoNMTExBAREUFsbGy5+5SuDn993aT+oy5NCFEBzYUsNOnZt93u5+dnWKZBCCFuZmbsAmqyVZ8vJ/10Gm/8+CbuHVsC8OmI6cSu2cvI/4xGoVBgZmHG5ZRLD7UOEzNTsi5Wr1uttZ0NT08ZjLZEy9Fdhzmy/SB/7jwEQO9RfWtUKCkenpCQEPz8/AgPDyc2Npbw8HAiIiLw9/enR48eKJXKv1UQrH+To9FoDIEt/NVgpjZ0B1QqlYY6w8PDCQoKqlWBc9euXenatSuJiYnk5OTg7OyMvb0969evJzo6mu7duzN06FDWr1/Pb7/9ho+PD02aNCEvL48jR47Qq1cvdDpdjRrxLYQQ4sHT/71TqVSGv99xcXGGv+N6FQUhQgjj0IePI0eWDaxRqVQSSAoh7kiCyTvITM8gZs1eRvw7yBBKAnTs58Om79aTl5OHrYMtFlaWZF/OojCv4IGt26hJOI/CxAQ3r79+eWtLtIavz/2ZzB8rdqMrLWXcpxMrdcykfSdZEDLX8L3/8N48998xD6ReUTv4+/vj7+9vCOPUarXhPyh7AaFUKg1re7q5udX6FxA3v2FJS0sjNTX1ttEWULsCyZsplUpDEKkPm4FaE04CeHl5lft+586d2Nra8txzz2FjY0NiYiIHDhxg48aNFBYWMnz4cLy9vbl69SoKhQI7OzvMzMwwMZFebkIIUdfdHFLeTJr8CVEz1Pb3DkII45Bg8g6iV/+BubkZPYYElLu9/0sDae3ria1D2fpm5lbmAOTl5GFpY0XmhQyO7jqMb2A37JzKGlJ8//pXWNpY8dIXk8sdK35jLF492qIr1TH3tdn84/t/cfZIMgtCvsXE1IS3VryHq2fZL/cb166za9k2otR7uJh8AY9uXvgGdkNXqkNhcu8RQ829W9CuVwdST5zjeuZ1Ylb/wbk/kwl6dyytu3re9+Mlag+lUklYWBjBwcGGUQf6wE6j0VQY3NU1+unsUDc+wQ0JCUGpVBIaGmoYAVqbwkm9a9eu4e7uzvHjx2nUqBHNmjUjPDycyMhIsrOzadSoEZ6enmi1WjZs2MCmTZtYunQpJ06c4NKlS/Tu3RsLCwtjX4YQQohHrLb/HRdCCCH+ziSYvIOr5y/j6tUUK9vyoyCtbK3w8m9n+N7S2hKA4sJizh4+zQ9Tv+F65nXiN8Qy7ed30GlLSYg+Ttue7csdp6igiF+nL2FY6HN49+lEWqKGw9sOsC58FW6eSi4mX+CPlbsJencsAKfiEzkVn
4hLi8ZMCv8HnZ7wrdL12NjbMGVeMAA3Mq9z7Pc/iVsbxdcTwxg9fRx+w3pV+TEStdutow70ow1SU1PLT42q5aMQ9G9W9P+vq1PWVSoV/v7+BAUFGUZP1rZ1Jx0dHWnWrBmlpaWsW7eOF154gfr16/Pss8+W2+7PP/9kw4YNmJmZMWvWLKKiolAqlbRo0QJ3d3eKi4uxtLQ00lUIIYQQQgghhKgsCSbvoHErV47tOULO1WzsGzrccTszi7KHMCHqGOtmr8LVU0mA6nE2f7+BE3uPofRqSklRCe4dy3ftjlsbTVFBEa6eSnKzcgFQf/ILTVq7ErzkbdZ+qSblz+Syc5iZGvbr/9LAKoeSt6rXwA6/oQH4DQ0g9rcoVsxYRjPvFrh6uN3XcUXtdmuAJ2offRMcfQfT2rbuJICvry/16tVjx44deHt707NnTwoLC7G3LxuBnpGRwYEDB4iPj8fZ2ZnTp0+TkZGBo6Mjx44d49ChQyQkJODp6cnw4cMxM5M/c0IIIYQQQghRU8miXHfQe1RfTExN+Gbyl2SkXb3jdkX5RUBZqOjl346QJW/x9GuDsbaz4VR8Ipa2VigUCjLT/zqGJuE86+esAspGMuZczQLAzsme174LxsrWitZdvci8UNbwRleqo1l7d5RtmrJp3jqyLl17YNfpNzSAdr28iV2z94EdUwhhPPpwMjg42LCeqD6orA08PT0JCwujSZMmvPHGG7z66qssX76cr776ih07dpCfn8/27duxt7enZ8+eeHt7Y2FhQWZmJgsWLOCzzz5j8eLFJCUlkZOTA4BOp0On01FcXGzkqxNCCCGEEEIIcTMZSnIHtg62/Ound/j+9a/48Nl36NjPB/eOLcm+lEXO1Wza9e5A90H+FOQVANCgiRMvfDwBk/+NbmzfuwOn4xOx+tdIfAd2I3rVH5iYmWJla8Wen3fg6qEk5dhZjv/+J/UalI0EGvbmczg2bgBAs/bu3Lh2neN//MmJqGP4DetF4KRnmDlqBmHPf0z73h145vWhdx3NqXd09xFKtVraBXhjblV+/bXrGTlcOnsRB2fHB/nwCSGMSN8UR6VSERQURGxsbK0ZPWlqaoqXlxcvvPAC7du3JykpiYULF2Jtbc3Jkyd56aWXaNWqFWPGjMHPz4/x48ej0+nIz8/H398fpVLJmTNnuH79Oo6OjhQXF5OXl8fu3bsxMTHB1NSUTp064eYmI8SFEEIIIYQQwtgkmLyLxi2b8O66j4jfEMu+9dHsXLKVovwilG2aUs+xHgAWVhaYmJnywscTsLazMew74OWnCHv+Y3Q6HeM+mYhDo/oc2rYfU1NT+o8PZODkQaz/eg2ak6mMmj6Oaxcy6fp0D8P+jZo50y7Am8LcQqzqWdN37AAaNXPmHz/8iyVvLyAq4nc69velfe8O97yOQ1vi2bc+BnMrC1zcXXBSNsLG3pbsy1mcPXIGnU5Hr+f6PPDHTwhhXPrRk2q1mvDwcMLDw0lLSyM4OLhGT9m3tLSkZ8+e+Pj4kJqayvTp04mKiqJJkya0adOG7t27U1hYSHZ2NkOHDqVnz56MGjUKJycntm3bRkJCAj179iQhIYFdu3axefNmABISEvD39+fAgQMMHDgQHx8fmeothBBCCCGEEEak0Ol0OmMXURXNmzengWtDPtz6ubFLMSi4kY9VPevbbj+9P4lWXTxQKO7dNftutCVaTG9aZ7K4sJi0pFTcO7Ss1P6FeQWcPnCK9CQNF06ncePaDfKv52FtZ0PTds3oPrgnLu6N76vGB2XZf38kbm0Uy5cvr7NNSoQwhpiYGEJDQ9FoNIau5DV99KSeTqe74+/RGzduAGBjY0N+fj4rV65kyZIlLF++nCVLlrB69WpGjRqFt7c37u7uREdHs379enx9fQkJCcHa+vbf3UIIIYQQQgghHg0ZKvIAVBRKArTu6vlAjn9zKAlgbmle6VASwNLGiva9O1RqdKUQom7y9/e/bfRkREQEwcHBhs7sNdXdPtypV6+e4WsbGxs0Gg1DhgzhxIkTpKSk0K1bNyZNmoStrS0KhQIHBwc2bdrE0aNHSU1NxdPzr9/TpaWlmJjI0stCCCGEEEII8ajIOzAhhPib0K89GRUVhZ+fHxqNhtDQ0FrVHOduFAoFL730EgMGDKBHjx7k5eVx+vRpVq9ezYkTJ4iIiGDmzJns27ePU6dOYWVlRWlpqWH/48ePo9FoKCkpuee5kpOT+f3330lPT3+YlySEEEIIIYQQdZoEk0II8TejX3syLCwMpVJpaI4ze/ZsY5d235o2bYq3tzcKhYI+ffpw+fJlfv31V4YPH863337L8ePHadKkCY6OjiQlJRlGSGo0GqZNm8bvv/9OYWEhycnJJCcnG45bVFRETk4O+tVPNm3axHfffce1a9eMcp1CCCGEEEIIURfIVG4hRJ1WXFzM5cuXpQtzBVQqFf7+/rdN765N60/eiY2NDS+99BI9e/Zk69at2Nra0qBBA7p27cqWLVv47LPPsLe3R6vVYmpqyooVK7CwsODgwYOUlpayZcsWLly4QHBwMAMHDmTmzJk4ODjQsmVLsrOzOXjwIPn5+Xh4eBj7UoUQQgghhBCi1pIRk0KIalm1ahUDBw5Eq9Uau5QKxcbGEhwcTKdOnejZsycff/yx4b7k5GQ6depEWlqaESusGW6e3h0cHIxGoyE8PJyAgIA6MYLS09OT119/nTFjxvDss8/SpEkTmjVrhpOTE3v37qWgoIDk5GQWL17MmTNn2LFjB3PmzCEjI4MzZ85w+vRpTp48yZEjR5g/fz4fffQRCxYsIDY2FldXVzIzM8nOzq7U9G8hhBDiUdFoNMyePZvmzZvXib/nQggh6q5aF0wqlUoy068auwzxkOh/tk2bNjVyJbVHamoqW7ZsueP9U6dOxcfH54GvhXf58mUSEhJqVLiXl5fHvHnzePzxxwkKCmL9+vX0K3vSkQAAIABJREFU6tWL8ePH4+PjY9guOzubrKwsoqKijFhtzXJzQKlSqepcQGlhYYGpqSkKhQJfX1+cnJw4duwYqampzJs3j/z8fJo3b07fvn359NNPCQkJwcTEhAYNGtCiRQteffVVxowZQ5s2bbh06RJarZb4+HieeuopvvnmG86ePUtxcTElJSVkZGSQn59v7EsWQgjxNzR79myCgoIICAggPDwcoEa9VhNCCCFuVeumciuVSjQaDafiE/Ho5mXscsQDlpGWAZT9nEXlzJgxgy1btrBs2TIee+yxcvclJSWxdu1a+vXrR+PGjR/oefUjJePi4khOTiYjI4PS0lL8/PweebCcm5vLvHnzWL58OZcvX8bGxobx48czadKkCqdw6xueHDt2jGbNmnHhwgWKi4tp3bo1vr6+j7T2mkapVBIWFkZwcDDh4eGGad51ZYq3QqHA0dGRsLAwNm/eTG5uLteuXaNXr15MnjwZLy8v7O3tWb16Nd26dcPGxgZbW1v69etHv379WLBgAYcPH2bEiBHUr1+f3Nxcjh8/zltvvcXMmTNJTU1l1apVvP7663h4eEiXbyGEEA+dRqMx
/L2uSFhY2COuSAghhKi8WhdM+vn5ERsba+wyxEOSmX5VQskqatasGQBRUVG3BZMLFizA1taW2bNnP5CA5ODBg0RFRXH+/Hni4uIACA0NNdxvZmbGtGnTmDJlyn2fqyqmT5+OWq3G2tqat99+m9GjR1O/fv1y26SlpREZGcn58+c5evQoAEuWLGHJkiVAWWDVt29fFi1a9Ehrr6nqckCpUCjw8PDAw8ODgoIC/v3vf1OvXj2aNGkCYGh+Y25uTseOHQ37paSkcPHiRRo1asTo0aNp06YNFy9epKCgAChb17K4uJhz584RERHBf/7zH6NcnxBCiLpPH0ZGRESg0WjuuF1wcPAjrEoIIYSouloXTOqdij8pIybrmIy0smnc/v7+Rq6kdpkyZQqLFy/m4MGD5W6/cuUKa9asYdy4cbeFdHFxcezZswc7OzuGDBmCq6vrbcc9c+YMJ0+epG3btrRs2RKAUaNGUVhYWG67J554gsDAQLy9vfHw8MDc3LzK15Cdnc13333HyZMnady4Mb6+vgwbNqzSx+rYsSNqtZr8/HzS0tIq3O+dd95h9+7d5W5r27YtKpWKjh070rZtW+rVq1fl2nU6HQsXLiQ+Ph4LCws6duzIsGHDaNiwYbntzp8/z5o1aygpKaFfv37lppbXZHU1oDQ1NQXA1tYWDw8PQ7dtgMzMTDIzM3FycqJ169aG25OTk9mxYwfDhw+nQYMGALeNRO7cuTMBAQGYmZmRnZ2No6PjI7gaIYQQfxcVjY5UKpWMHDmS2NjYcgM49Mu0CCGEEDWZQnfzu7FaICYmhlGjRtHAtSEfbv3c2OWIB2jO+C84FZ+ISqWSKSdV9Pzzz7Nv3z6OHDmCra0tAF988QXz5s3jjz/+MExnLiws5I033mDTpk2GfW1sbFi1ahXt2rUDoKSkhC+++IIffvjBENa8/PLLvPfee8ycOZPs7Gx8fHxITU1l9uzZqNVqunfvXmFd6enpfPnll2zbto2ioiJatGjBoEGDmDRpEmZmZZ+LFBcXo1KpOHToULl927Rpw3fffWcIRe9l+fLlzJo1i8uXL2NnZ8eYMWOYOHEijRo1AmD9+vXs2LEDHx8fXFxceOWVV5g2bRpTp06t8Hj5+fnMmzcPtVrNlStXaNKkCX379mXq1Kk4OTkZtpsxYwYLFiwot6+joyOzZs2if//+AHz99dfMmTOH4uJiwzafffYZo0ePrtS11ST6tSfVajXw15shlUpVZ0Y7FxQU8PLLLzNhwgQee+wxzM3NuXTpEj///DMbN24kLCyswmBZ3+EbICcnB3t7+0dduhBCiDroTqMjg4OD8fPzw9/fn6CgIGJjYw3LXkHZFG6VSmWssoUQQohKqXWLX/n7++Pn50dm+lWW/fdHY5cjHpBT8Ymcik8EZMpJdfTr14/i4mL27t0LQFZWFosXLyYwMNAQSup0OqZMmcKmTZsYN24ce/fuZf78+eTl5bF06VKgbO3FqVOn8v333+Ph4cG0adPw8vJi4cKFnD59mjfffJOPPvqIESNGGNZizM3NrbCmpUuX0r9/f9RqNTY2NowYMQIHBwe++OILVCoVFy5cACAyMpJDhw4xcOBADh48SHJyMitWrMDJyYlXX3210o/BqFGjiI6OZvbs2TRr1ox58+bRs2dPZs+ejU6nY9CgQYSHh/Piiy8aAsMbN25UeKy4uDj69+9PeHg4mZmZDB06lDZt2qBWq3nqqafYt28fUBa8Llq0iGbNmrFnzx7OnTvHzp07GTBgAP/6179IT09n/vz5hIWF0a1bN9atW8fmzZtxdXXl+++/r/S11ST6EZS3dvEOCgoiNDSUmJgYY5d436ysrJg9ezbdu3c3jL4tLS1l27Zt9O3b1xB2FxYWGqZxa7Vajh49yrFjxygpKZFQUgghxH3Td9bWN7LRaDQolUqCg4NJSUkhJCSEpk2blgsl9a+j/fz8JJQUQghRK9S6YBJg1qxZQFmYFfubdNWt7TLSrhI5dy1Q9sluXRl19Sj169cPKAv5oGxtydzcXCZMmGDYZsuWLWzfvp1OnToxdOhQzM3NOXnyJIBhlOXatWvZuHEjgYGBbNq0ialTpzJnzhycnJxuW6PSwsICuHMwuXz5cvLy8ggICGDLli189NFH/Prrr2zfvp2cnBxGjRqFVqtl+/btmJub89FHH+Hk5ISpqSl+fn788ssvbN26tUqPg7m5OcOHDycyMpKVK1fi7e1NeHg4b7zxRrmpuubm5igUCvLy8io8ztatW0lLS8PV1ZV169Yxc+ZMfvjhB6Kjo+nSpQujR48mJSWFXbt2odVqeffdd3F3d0ehUNCqVStmzpzJkSNHMDExYebMmdSvX5/x48fTuHFjzp07R3Z2NjY2NlW6tprm5i7e+oBSrVYzatSoOhFQOjs7Y2dnZ/j+xIkTXLlyBRcXF5RKJfn5+cydO5fExES0Wi1Hjhxh2bJl7N27FzMzM7RaLbm5uSQnJ3P06FGysrJuWwZBCCGEuNXNYeTNnbWDg4OJiooiKirKMD1bo9EQEBBAbGwsfn5+REVFldteCCGEqA1q5RqT+hE7oaGh/PTuj2SmX6XHkACc3Bree2dRo0TOXUvk3HWAfLJ7P1q2bEnbtm3ZtGkTkyZNYuHChbRr145u3boBZaMlv/zyS5ydndFoNAwfPtywr4uLC5MmTQIwBIFhYWGGqdZt27a9bf1K+Kuztb47N8DFixfRarW4ublhZ2eHqakp8+bNKzd6rFWrVkyYMIF33nmHxMREQxdt/Si0B6VHjx6o1WomTJjA2rVrGT9+vGH6rb7mm2vPz8/n7NmztGvXzhBIzZgxA09PT8M29evX58033yQyMpLt27dz/fp14K8GRLf69ttvKS4uxt3d3fAYQ1kw+tZbbz3Q6zUWfUCpUqkM08zUajVqtdowcqMu/LsuKSnB2tqaxYsXc+3aNYqLi4mJicHZ2Rlvb29SUlKIiorimWee4fr162zfvp3o6GiSk5NJSUmhdevWTJ48mb59+6JQKIx9OUIIIWqYu60dWdE6kfrlraDsNfSKFStQq9VoNBrD9G4hhBCiNqiVwSSASqUyTCGMnLuO2N+iDc1wPLp54eTWkAauTvc4injUTsUnkple1uQm9rdow9eyruT9U6lUfPjhhzz33HPk5eXx2muvGe5LTk4mMTGRDz74gGHDhrF8+XKSkpJo06YNw4cPN6yZqJ+2evDgQfr06XPX8+mbu+jDOYD33nuP4uJiFi1ahJWVFRYWFoaRlXpZWVmsWLECFxcX3NzcyM7ONkyHrY6IiAjCw8OZPn06AwYMKHefmZkZnp6e7Nmzp9y0bVNTUxwdHcvVvnz5cmbOnMmJEyewtLQEuK0ZjlarZcGCBZiZmdGxY0fDCNU71R8ZGcmTTz7Jt99+y9q1a4mNjcXFxYXBgweXCzzrgpsDypiYGMOUs9DQUMLDw2t1oxyADh060KVLF86ePcu+ffsoLCzkxRdfxN/fn4sXL3L06FEaNWqEs7Mz69evZ8aMGdSrV4+ePXtiY2NDWloaX331Ffb29nTt2tX
YlyOEEKIG0Y98hMqt3XxzKHnza2gZLSmEEKI2qrXBJFBulE54eDhxa8tCrri1Mr27ttCPfpVPde/fsGHD+OSTT8jNzaVZs2Y888wzhvuys7OBsjXxHBwceOWVVyo8xpgxY9i0aRMTJkxg3Lhx9OnThy5dupSb0qrn6uqKQqEgMbFsbdCkpCT27t1rmD4+cuRIdu/ezahRoxgyZAjW1tYcP36cjRs3UlJSwsqVK3FwcKCkpITCwkKuX79e4XnuxcLCgtTUVCZOnEjLli3x8PDAxMSEnJwcEhISyMzMZMCAAfTu3fu2+hMTE9HpdNy4cYMVK1YYGgANHjyYL7/8kuDgYMaOHYuLiwspKSlERkZy9uxZZs2aZVgzEso6oFckOzubwsJCzMzMGDFiBCNGjKjy9dU2SqUSlUqFv78/MTExREREEBsbW66Td21slOPq6kp4eDjJycnk5eXh7OxMo0aNUCgUXLhwgV27dvHMM89w9uxZtmzZgo+PD5MnT8bX1xdzc3Pmz5/PV199xY0bN8o1yRFCCCHgrzDxXh/iqdVqQkNDgfLNbWbPno1GozH8DRZCCCFqi1rXlftONBoNqampaDQa4uLiAEhNTTVyVeJWfn5+hv83bdq01oUTNd2LL77I7t27mTBhAtOnTzfcfv36dfz9/bG1tWXjxo2G0Y4ViYuL49133yUpKQkAExMTPDw86Ny5M+PHj6dt27aGbZ988kkSExNxd3cnJSUFFxcXIiMjDSMwFy5ciFqtNoSXLVu2ZPjw4YwZM4b69esD8Oabb7J27Vr2799f7YYhu3btYs2aNURFRZGdnU1xcTFmZmZ4eHgwYsQIxo8fb5iarvfee++xZMkSXF1dycjIQKvV8vPPPxueo3v27GHu3LkcOnSIwsJCmjRpwoABA5gwYQItWrQAYPXq1YSEhLB06VIef/zx2+oKCgoiPj6eX375xXDcv6OKOnn7+/szYsSIWvvmSafTGaZknzhxgrlz5zJp0iTq16/P8OHDGTBgACEhIbi4uKDVatm1axc//PADH3zwQbl/Q9U5nxBCiL+n0NBQw9/S5cuXl/sb2rx5cwCioqLk9bUQQohapc4Ek0IIOHr0KCtXruStt966bRry/Pnz+eijj2jVqhUzZsygW7duFBcXEx8fz+rVqw0drXv37o1Op+PAgQPs37+fQ4f+v717D4iyzvc4/h6ugoKoIIpD3k3Ba1oOUR3bSsvSLB3RLN2OWe1u5RC0dcpObrV2EYWycsvKyjpBg10sLbEyTQS3zNLCtNSUB29oCiqIXOb8wc4EigoKDJfP6x91eOZ5vvOICJ/5/n7fDfz666/k5uby1FNPVerEXLNmDX/7299o3749o0eP5rbbbnMFjhU5HA4cDscpA3QACgoK2Lx5M4MHD679G3IGe/fu5ZZbbuHIkSNcd911TJkyhe7du1d5bFlZWZW1Q/k9iI6OrjI02rBhA+PGjcPf358ZM2Zw/fXX4+vry+bNm1myZAnLly9nzJgxxMXF1epra6hOt39WU9iH8tixY3h6enL06FEee+wxNmzYwMMPP8yVV15Jy5YtOXHiBAUFBXh5eZ3yb/N09u3bR25uLrt27aJjx47k5uYSHBxMUFAQx44dIzIykqKiIlq0aKHQUkSkias4efvk1UbOwFJbI4mISGOkYFKkGXnxxReZO3cuJSUlmEwm16Rqk8nEwIEDeemllwgLC3NzlU3LqlWrmD59OocOHQKodN/DwsJISEhw7SvVXBiGUWmZN5x5g//GpLi4mMWLF/PPf/6TwYMHM336dPr163dKx+7ZlJSUsHLlSv73f/8Xb29vjhw5Qps2bSgsLCQoKIiAgAAKCwu5/vrrmTZtmpaGi4g0UYZhEBcXd9pQsuL+lOqWFBGRxkjBpEgzs3//fpYuXUpOTg4tWrSgZ8+eXHbZZa7l11L7CgsLWbp0Kdu2baOkpIQuXboQFRVFt27d3F2a2zkneVcMKKOiorDZbI32h6vCwkJWrFjBG2+8wU8//UR8fDwTJkyo0R6qRUVF2O12Hn30UVq3bs3EiRMpKyvjwIEDtGvXjuzsbI4dO4aXlxe33HILV199dR2+IhERcYeTQ8n09FP30Xd2Utpstkb/5p6IiDRPCiZFRMTtqlrmbbVaG+0+lAcPHuTLL78kNTWVwsJCrrzySm699VZCQkKqfY6srCxmzJhBXl4e9957L1dddRX+/v4UFBQQEBDA999/z6FDhxgwYABt27atw1cjIiL1rWInpMViISUl5ZRjKk7n3rlzZ73WJyIiUls8Z86cOdPdRYiISPMWGBhIVFQUVquVwMBADMMgMzOT1NRUUlNTyc/Pb1QBpb+/P5GRkfTr14/s7Gw8PT3p27dvjQY8BQcH06JFC77++muys7MZMGAAoaGh+Pr6AtChQwe6du2Kn59fXb0MERFxg4yMDEaOHAmUv0m3YMGCKo+Lj4/HMAwSEhKIjIyszxJFRERqjTomRUSkwTnTPpRWq7XRLfP+/fffz9rVWFJSQmFhYaUl37m5uSxatIilS5cyYsQI/vKXv9RoSbiIiDQudrud+Ph4gDMuz3Z2S55uibeIiEhjoWBSREQaNMMwSEpKwm63A3/sQ9lYl3mfzoEDB3jnnXcwmUxMnDjRtey7rKyMWbNm8e677xIbG8vEiRNp2bKlm6sVEZHalpiY6NrSJCEhAavVetpjnXtLJicnN6n/C0VEpPnxcHcBIiIiZ+KcQpqeno7NZnPtRzlhwgSio6NdgWVT8OGHH7Js2TJ++OEH1/R2h8PByJEj6dy5M1999RUlJSVurlJERGpbfHy8K5RMTk4+Yyhpt9vJzMzEYrEolBQRkUZPe0yKiEijUNU+lIZhkJaW1ij3oTxZWVkZoaGhrF+/nmXLllFWVkZwcDBt27YlNDSUffv2sXHjRoKDg+nduzcAxcXF5Ofnk5uby65duwgNDXXzqxARkZqKiYkhLS0Ns9nMK6+8ctb/y+68807y8/NJSEggPDy8nqoUERGpGwomRYSCggKeeeYZPDw8MJvNmEwmd5ckclrOgHLEiBFERESQn59PVlaWa1hOfn4+4eHhNRo00xD4+PhgNptxOBykp6ezfft2Nm7cyG+//caJEyfYsmULn3/+ORMnTsRsNpObm8sHH3zAO++8Q2pqKsuXL+fTTz8lJCSEzp07u/vliIjIWRiGwbRp08jMzMRsNpOSknLWITZ2u53U1FQsFstp958UERFpTLTHpIiwcOFCnO9RBAQE0L17dy6//HJuvvlmunXrdl7n3r59OzfddBPLli2jU6dOtVCtyKma2j6Uubm5LFiwgIULF2I2m8nPz6ddu3b069ePSZMm4eHhwb/+9S9WrFhBZGQkl156KYZh8OuvvxIeHk5cXJyrq1JERBoewzCIiYnBMAwsFgspKSnVfl5cXBw2m61R/v8mIiJyMgWTIsLChQuZN28evXv3JjQ0FMMw+OGHHygqKuKiiy5i/PjxxMTE4OFR821pN2zYwJgxY5g9ezbjx4+v8fPffvttOnbsyFVXXVXj50rz49x/Mj
U1FcMwgD+meTfGzpKDBw+ybNkyiouLad++PZdeein79u1j1qxZlJWVcezYMfbv3899993HDTfcwKxZs0hOTmbRokUMHToULy8vd78EERE5iXOiNlCjUFJERKQpUjAp0sxt2LCBcePGcfvttzNjxgzX47///rsr4Ni9ezfDhg1j/vz5+Pv71+j869ev5+abb2bKlCmMHDmSPXv2UFxcTI8ePbjooovO+NwTJ07Qt29fSktLefPNN7nsssvO6TVK82MYBhkZGaSmppKZmQk07i7KkpISPDw8KCkpYfny5Tz66KM89NBDdO3alddff519+/Zx9OhRfHx88PX15e9//ztDhw6lrKyMvLw8jh8/TlFR0Xl3QIuIyPmpGEparVYSEhLcXJGIiIh7KZgUaeY2b97MtddeS9++fVm6dOkpHy8tLeWJJ55g4cKFTJs2rVJ4eTo5OTksW7aMXbt2sWnTJjZs2FDp4yaTiSuvvJKFCxee9VwbN27kjTfeoEWLFsyaNavar6u4uBhvb+9qHy9N18nLvKE8pLTZbGecetoQORwO5s6dy+LFi3nwwQcZPXo069ev57333uPAgQN4enoyadIk+vXrx7Zt21i3bh1btmzhwIEDhIeH06tXL4YNG0bPnj3d/VJERJodu91OfHw8ADabrVF28ouIiNQ2rfESaebat28PQKtWrTAMg0WLFjF69OhKm687h4gcPXr0lOevW7eOVatWERAQwI033khYWBgPP/wwX331VaXj+vTpg9VqpX///vTp04dWrVq5PuZwOHjttdf45ptv8PHxoX///tx0000EBwfTv39/5s6dW2Xtu3bt4oMPPqCkpIQ//elPDBo0CIDMzEwmTZrEc889xw033HDK8+bNm4e/vz9Tp04967mcli9fTlpaGocPH6Zfv36MGDGCPn36nOnWSgNhNptJSEjAZrNVWuYdHx9PUlIS48aNw2q1Yjab3V3qWZWWltKlSxf27dvHgQMHKC0tZciQIQwZMoTs7GxatmyJn58fv/76K7Gxsfj4+DB06FB69erF+vXrOXz4MC1atMBsNuPn5+fulyMi0mwkJiaSlJQEQHJycqPr3BcREakr6pgUaeb27t3L0KFDufLKK4mOjubJJ5/EZDLRq1cvAPbs2UN+fj6DBw/mlVdeITg4GICioiKmT5/Op59+6jqXv78/ixcvZtu2bXzxxRcMGjSI0NBQ7rrrLuLi4rjvvvuqrOGJJ57g1VdfrfRYmzZtmDNnzmn3lpw3bx7PPfccxcXFrseefvppJk6cSGpqKnFxcYwfP57Zs2dXet727du5+uqriYmJ4amnnjrruaA8lLz77rspKytzfdzLywubzcY999yjKeaNTGNf5v3DDz/wz3/+Ex8fHx5//HE6d+6Mh4eH6/OwuLiYhx56iIMHDzJ27FiuueYafHx8OH78OE899RSbNm1i9uzZdO/e/Zz2jRURkZqJiYlx/X+jUFJERKQy/UQi0sydOHHC9ftrr72W8PBwHA4HW7ZsoaCggEsuuYQ5c+aQmprqCiUdDgd//etf+fTTT5k8eTJr1qxhwYIFFBQU8NZbbzFq1CiSkpKYMmWKK1isqtsSYPfu3SxcuJALLriAVatW8dtvv/Hll19yzTXXcP/99/Pjjz8SFxdHSUmJ6zkLFiwgISGBiy++mCVLlvDZZ58RFhbGyy+/DMDIkSMJCQlh69atp1wvKSkJPz8/V0h6tnM5HA6efvppPDw8eOedd9i2bRvr16/HZrOxaNEiPvroo1r4W5D6ZDabsVqtpKSkkJ6ejs1mcw3NmTBhAtHR0SQmJrq7zNMaMGAAkydPJjc3l7vuuouUlBSysrL45ptvOHHiBHl5eWzfvp28vDzCw8Px8fHBMAwWL17Mpk2b2L9/Pzt27DgllDx48CBbt27l+++/rxTCi4jIuXOGkmazWaGkiIhIFbSUW6SZcwaTZWVlhIeH88knn3DvvfeyevVqLrzwQl544YVTlnwuX76czz//nAEDBjBmzBi8vb35+eefAWjZsmWlY729vTGZTBQUFFR5/ZUrV1JaWsqMGTPo0qULAN27d2f27NnMnj2bH3/8kdTUVGJiYrjkkkvYu3cvs2fPJigoiNtvv50OHTrw3XffkZeX53q+v78/VquVN998E4fD4eokW7t2LUuWLGHWrFl07NixWufasWMH27dv5/bbb3cN3wkODubee+/l3nvvPe/7L+5lNpuJjY3FarVW6qJMSkoiNTW1wS7zjoqKoqioiNWrV7NgwQKOHz9Ohw4dKC4u5pFHHuG//uu/+PDDD3n66adp164dJ06cYM2aNQQFBXHgwAEcDofr30ZpaSlbtmzhgw8+YPny5ZjNZrp168aYMWMYPHiwOoJFRM6BYRjExcW5Qsn09HR3lyQiItIgKZgUaeZKS0sBXMFhUFAQb775Js8++yzz58/HarXy6quv0qFDB+CP4Rvt27fHMAxuvvlm17lCQ0OZNm1aled3/gpQWFjIjh07iIiIIDc3F4ALLrigyvpat24NQFZWFpdccgkvvvgixcXFdOnSpdK1vL29efDBB11/HjZsGC+99BLr1q3DYrHw+++/Ex8fj8VicS3Rrs659u/fD0Dnzp2rd0OlUXJ2UVqt1krDcpKSkkhKSsJisbhCyoagXbt2jB07lj59+mAYBvv27WPdunUEBQUxcOBA+vXrh4eHB8uWLSMwMBB/f39iY2PZunUrK1eupKSkxBU4ZmdnM3fuXDZt2kTv3r3Jy8tj6dKllJaW0rlzZ0JCQtz8akVEGhfDMIiOjgbAYrGQkpLi5opEREQaLgWTIs2cc3l2Tk6O6zEPDw8eeughIiMjeeCBBxg1ahTvvvsuPXr0YPv27WzZsoV//OMf3HTTTSQnJ7N161Z69+7NzTffTLt27Sqd39PTkzZt2nDkyBHXY8nJycyePZusrCzy8vIAOH78+Bnr++mnnwBYtmwZw4cP58UXX+Sjjz4iMzOT0NBQRo8e7doXE2Dw4MGEhIQwZ84c5s6dS3x8PIcPHyY5OdkVyFTnXGerT5qeqoblZGZmujopx40b12AmqUZERBAREUFRURGTJk0iLy8PX19fPDw8uO+++5g6dSp79+6lW7dulJSUkJGRwU8//UR2djZlZWXk5uayYsUKvvvuO26//XbGjBlDSEgIn376KS+//DILFy5k2rRptGnT5ox1lJSU4OWlbylERDIyMpgwYQKgUFJERKTuADrKAAAeI0lEQVQ69FOESDPXtm1bvL29q9wDctSoUfTo0YNbb72VSZMm8cUXX7iCuqKiIlq3bs1dd9111muEhYWxZcsWHA4HR48eJSUlhYiICADX3pHOzsmT+fn5ERAQ4FpynpeXR1FREV5eXowdO5axY8dW+TwvLy/uv/9+/ud//ofLLrsMk8nE888/X6kzszrnOlt90nSdbZl3QxqW4+vrC3BKgNiyZUu6d+8OlHcC9+rVi/bt27NixQpuuukmtm7dypIlS7jiiisYNmwY4eHhQHnH8a5du1i3bh2FhYW0atUKwzDo2LEjDoej0vYOD
oeDvXv34unpSUhIiAJKEWm2KoaSNputwbyJJSIi0pBp+I1IM2cymejduzdXXHFFlR/v06cPycnJlJaWkp+fT8+ePQkICOD111/nwIED1brG4MGD2bp1K5deeimDBw/ml19+IT4+HoBBgwYBfwQrVRkzZgz9+/d3nWv16tWu6ZZncssttzB58mRCQkJITExk9OjRp9R1tnNFRETg6+t7xvqkaavOsBy73e7uMqslODiY4cOHc/z4cdauXcv3339PdnY2/fv358ILL3Qd16ZNG6ZPn87o0aNZvnw5Dz30EI899hg33ngj7777bqVhVNnZ2XzwwQd88sknCiVFpNlKTEx0hZIJCQkKJUVERKrJ5HA4HO4uQkTcq+KAmOocs2DBAp588km6d+/OE088wcUXX0xxcTHffPMN77//PmvXriUxMZHLL78cgL1793LLLbdw5MgRrrvuOqZMmeLq4gJYs2YN0dHR1RqysWHDBsaNG4e/vz8zZszg+uuvx9fXl82bN7NkyRKWL1/OmDFjiIuLq7VzXX311XTt2pXAwMCznlOaB2cwmZqaimEYQHmA2VCH5VR0/Phx1xsNs2bNori4mNtuu821H1pFsbGxZGRk0LFjR0pLSykqKuLw4cPcfPPN3Hvvvfj5+bF+/XoefPBBZs6c6fo3LyLSnMTHx7veoNLkbRERkZpRMCki5+TFF19k7ty5riEazi8lJpOJgQMH8tJLLxEWFlYn1161ahXTp0/n0KFDrms6rx8WFkZCQkKVIUtdn0uap4yMDBYvXuz6odRsNjeoZd6nU1JSwt69e8nNzSUyMhIfH59KH7fb7cyZM4chQ4YwefJkhgwZwtdff81jjz1Gnz59mDFjBj4+PrzzzjusWrWKt956i4CAADe9GhER94iJiXFN3k5ISGjQX/dFREQaIgWTInLO9u/fz9KlS8nJyaFFixb07NmTyy677JQBOHWhsLCQpUuXsm3bNkpKSujSpQtRUVF069bNreeS5svZRZmUlOR6zNlF2VCX9JWVlWEymarsVp41axYff/wxDz/8MCNHjsTT05N9+/YxZ84cCgsLmTdvHjt27OCOO+7AarUyYcIEgoKC3PAqRETqn2EYxMXFKZQUERE5TwomRUREapFhGJWG5UDj6aKs6OOPP+bxxx9n2LBh3H///a7BN0ePHmX37t106tSJlJQU3nvvPdeWDtXZjkFEpLGrGEpq8raIiMj5UTApIiJSRwzDICkpqdJwHIvF4tqLsiHbsmULTz75JLt37+bPf/4zo0aNIigoyLXf7P79+7ntttu44ooruOaaa2jZsiU7d+6kffv2DBkyxN3li4jUCcMwXFu8KJQUERE5f54zZ86c6e4iREREmqLAwECGDx+O1WolMDAQwzDIysoiLS2N1NRU8vPzCQ8Pb5CDldq2bUtUVBRbtmxh4cKFHDx4kO7du9OmTRsOHTrEhx9+yPLlywkKCuLHH39k3rx5rFmzhrVr19KrVy86duyoDkoRaVIyMjIYOXIkAFarlQULFri5IhERkcZPwaSIiEgdCwwMJCoqihEjRhAREUF+fj5ZWVlkZmaSlpbG5s2bCQgIIDw83N2luphMJnx9fQkJCaGkpISNGzeyceNGWrVqhclkYvbs2eTm5nL8+HGCgoKIiooiODiYn376iUGDBtG7d28FkyLSZNjtdu68804AbDYbjz32mJsrEhERaRq0lFtERMQNGtOwnNzcXJKTk/nwww+ZOnUqpaWlzJ8/n65duzJt2jQGDx6Ml5cXb775Jm+99RbTp08nJibG3WWLiNSKxMRE19fqhISEBr8Vh4iISGOiYFJERMSNGtOwnL179+Ln54fD4WDjxo1ccMEFdOrUCW9vb44dO8bixYt5+umnWbZsGZ07d1bHpIg0ejExMa6vzcnJyQ3qa7KIiEhToGBSRESkgWgsXZRlZWV4eHhUeqy4uJgnnniCnJwcHnjgAXr37u2m6kREzl/Fydtms5mEhASFkiIiInVAwaSIiEgD05i6KJ2OHTvGCy+8wOHDh4mPj6ddu3buLklE5JwYhkFMTAyGYWA2m0lPT3d3SSIiIk2WgkkREZEGzDAMkpKSsNvtrsfMZjM2m61B7XNWUlKCYRjs2bOnQQanIiLVkZGRwYQJEwCwWCykpKS4uSIREZGmTcGkiIhII+Bc5p2amophGMAfy7ytVitms9nNFZYv5zaZTHh5ebm7FBGRGrPb7cTHxwNgtVpJSEhwc0UiIiJNn4JJERGRRiYjI4PFixdX6qK0WCyukFJERGpGk7dFRETcQ8GkiIhII9UYuihFRBo6Td4WERFxHwWTIiIijVxjHJYjIuJumrwtIiLifgomRUREmhBnF6VzSSL80UUZGxvrxspERBqOipO3NeRGRETEfRRMioiINEHqohQRqZomb4uIiDQcCiZFRESaOHVRioiUqzh522az6WugiIiImymYFBERaSbURSkizZkmb4uIiDQ8CiZFRESaIcMwSEpKwm63ux4zm83YbDb9sC4iTY5z8raG3IiIiDQsCiZFRESaMecy79TUVAzDAP5Y5m2xWPTDu4g0apq8LSIi0rApmBQRERGgfCDE4sWL1UUpIk2CYRhER0cDGnIjIiLSUCmYFBERkUrO1EVptVoxm81urlBE5Mw0eVtERKRxUDApIiIiVdKwHBFpjDR5W0REpPFQMCkiIiJn5eyidE60BXVRikjDo8nbIiIijYuCSREREam2+uyizMjIwDCMBhcs2O121xL3zMxM130QqchsNmM2mwkPD2fo0KGufydSNzTkRkREpHFSMCkiIiLn5ExdlLWxdDI6OhrDMBrMUsyMjAzi4+NdoWRF5rDWbqhIGjJjd94pj1mtVmw2mzqMa5lhGMTExGAYhkJJERGRRkbBpIiIiJyXuuqirDhR153hZMVOLCgPIceN6k/U4M6Yw4IUSkqVnMGksfswxp48El9ejbE7r1bDe9GQGxERkcZOwaSIiIjUGsMwSEpKwm63ux4zm83YbLZzWpJdsRPKarWSkJBQm+VWS0xMTPny0P8EkrF3XVHvNUjjZ+zOw/7xDyS9/DWg/Q9rQ8X9JBtKZ7WIiIjUjIJJERERqXXOZd4V92A81y7Kk5dppqen11XZp6jYjbXzu0fq7brSdBm784i+4YV6/1xuapxvGAAkJydr6baIiEgjpWBSRERE6lRtdFGeHE6mpKTUyz59zn0u5/xjFONG9a/z60nzEPfYx6R+vNFtXcCNmYbciIiINC0KJkVERKReOLsoU1NTXQNknPvtWa3WswaNFQPO+ggn7XY78fHxWAZ3JmXBrXV2HWl+jN15xNy5CGN3Hunp6RqGU03aT1JERKTp8Zw5c+ZMdxchIiIiTV9gYCBRUVGMGDGCiIgIWrdu7VrqnZaWxubNm8nLyyMyMvK0z4+IiCAwMJC0tDTS0tKIiIggPDy8TupNS0sjMzOT2LuvIOLC0Dq5hjRPgQEtSFu5FWNPHsOHD6+zz+GmJDExkfj4eKB8P8k5c+a4uSIRERGpDQomRUREpF4FBgYSGRnJ8OHD
sVqtBAYGkpWV5QooU1NTyc/PJzw8nMDAwFOe63zcGRzm5+fXyVLOxYsXk5WVxYgrL1QwKbXO2HOYzPW7sFgspw3jpVxMTAypqalA+X6SGhokIiLSdCiYFBEREbc5uYsyPz+/Uki5efNmAgICKnWUOZ8D5V2NhmHUSTj52muvYRgGj8UPJzCgRa2eW8QEpH68kYiICO2ReBqGYTBt2jTXfpKvvPKK7pWIiEgT4+HuAkRERETMZjNWq5WUlBTS09Ox2WyuPSknTJhAdHQ0iYmJlZ4TGxvrOi4pKemUj58v1z6YYa3P6zxlZQ4W2ddTXFxaG2VJE+OcLC2VZWRkEB0dTWZmJhaLhfT0dIWSIiIiTZCCSREREWlQzGYzsbGxpKenk5CQgMVicYWP0dHRxMfHk5GRAZSHk86pxrUdThqGcd6hJMChvAJmPPUZ6f/+7fyLkibDHBbk7hIaLOcbEoDrDQsRERFpmrzcXYCIiIhIVZxdlFar1dU96ZzK7ZzMPW7cOGJjY4mKiiI6OpqkpCSgPLBsKByO8l8LCovdW4hIIxAfH4/dbgcgISFB+0mKiIg0ceqYFBERkQbvbF2USUlJJCQkYDabSUpKIiYmxt0lu5SWllX6VUROZRgGMTExrjcdNORGRESkeVAwKSIiIo3GyXtRVuymjI+Pd+0LmZmZSXR0tOvP9aHoRAm79+af8nhpaXnLpL+/d73Vcr5ipr3Nhk05tX7e092j2lZX9Z/JPxO/4I3kb+v1mk2F9pMUERFpvrSUW0RERNyq4r6QZrO5yt8DrsnczsfNZjMJCQnYbDbsdjupqamVgkhnB1ZKSsop56otq9Zu46WFGWzM2oO3twedzW34+O3/rnRMyX86JVv6+9ZJDXUhc/1OPlj2I4P6dTrvc1XnHtW22qy/ujZm7WZVxjb+PGFIvV2zKUhMTHRtwWCz2RrUNgwiIiJS9xRMioiIiFs5Q4lzcbbAsS7DyflvZPDMvJVc0CmIq67owYqvtrIxaw/rfzAYPOCPa5WUlAeTfi0ax7ddZWXlHZ45e/LYu/8Ixu48Co8X0zE0gB5dg2t0rureo9pUm/XX7LqQe/AYuQePkbMnj6PHivD382FQvzBMJlOdXbexMgyDuLg4MjMzXW8yqEtSRESk+Wkc3yGLiIhIk5WcnOzqdHT+mpNTeRludna26/cnd0WejXMvSuf07tqQ9tVWnn7+S66M7s7Lc8bh6+PF0hWb+euD7/PKokxeHjDOdWxxcSkA/n4+tXb92lZaWsbnq39hV85hftv1OwCfr/6Fz1f/4jqmXRt/1n9uq3bIVpN71BDrr45/b9jFz7/kkp1zmF9/O8DvhwoYcs0fQbuXlwcfv/3fRPQKrbVrNgXOrRcALBaLpm6LiIg0YwomRURExK1qq0vq5JDSGWYahlGrQzTKyhw8M28lF/YI4aVnb8bXp/zbqRFXXkhLfx+2/Jpb6fiCwhMAtPSveTB5oriUlWt+ZdnnP/P9j7vZf+Ao9999BdNuG1qr53rrvfXMnJ1W6fg2QX7ceG0kAyLDiOgVSo+uwdUO9Wp6j85XbddfHZu37sM6dVGlx0wmE2Nv6MeAyI707d2RPr3a49ei8ewtWtcqdkmClm6LiIiIgkkRERFpIk5eql1X+0qmfbWVX3cc4NVEa6UuSC8vD+6ZGs2+3COVji8oLAYqB5PHCk4w/40MUj/eyO+HCujUsTVXXd6D+6ZdRmBACwAWLFrHcwu+5sjRIkwmExG92mMZfAHt2vpXWdcv2w/w7AsrWbPuN9q28eO/J17C1EmXVOtcA/uGETWkM337dGBgZBgPz/qUiF6h/OPvI+rlHjkcDuxLNvLqO+vYsfN32rbxJ2pIZ/7239H07Fa+/Dr34DFWZ2znppF9OV5UzF/+/j5bt+XyauL4c67/q/RtzJm/iq3bcunepR0P3fcnrojq5vr49z/u5p+JXzB/9s0Et23peryszMEPP+2hfXArPD1NxN71X7y/dBPffJ/NnH+MOqd71tRlZGQwYcIEAC3dFhERERcFkyIiIiI1sPab3/Dx9mTYpd1P+dhfb7/0lMeOFVTumMxcvwvbjI/Ysy+fgFa+jL9xAPsPHOXt1O9YvnIL82ePpW/vDix4ex1HjhbRPrgVSxbdTsfQwNPWtGrtNu6ItePl5cmw6G5kbdnP43NWENYhkOuu6n3Wcw3q14nkV251/fmF19dy4j9L0M9FTe7R7r35/O2h9/luYw4eHiZuuCYCX18vVq3dxvKVW3lqxnWMua4v336fzf3/u4RLBoUz+8Wv+Cp9GwCPPv0Z7y+cUuP6E19eTdLLX9M+uBXDh13I6szt3BFrZ/WSv9KhfQAAO3b9zr837OKOWDspC27F18eLtd/8xmPPprF1Wy4XDwrngb8NY+hFF7D5l31krt9JaWkZnp4e53zvmpqTuyStVmutbqsgIiIijZuCSREREZEacA5Xqe6y4ILCE/j6eOHlVR5WrVi1lT378ulsbsOilybS2dwGgAO/HyP20SVYpy5i7dJ7mPfUGGIfXULOnjwm/eX/iP/rMK676sJTrrsr5zB3xqXSq3sIrz83ntCQADZsymHMlDfYtHkv113Vu9rncvLx9nQN7TkXNblHP23Zy3cbc2jh68W/EsZxZXR5mHmiuJQ5L61i+iMf0aa1Hx4e5ed66MllrFm3gykxQ9h/4CiffbmFg4cKaNfmj07Ss9X/0Wc/kfTy14y9oR/PPHo93t6evLRwLc/MW8nWbbmuYHLMdZF8vvoXPknLYvojH9GujT9vp35Hn57t+b9/TSL6ki6VrglQWubA07Nm96upqjhxW12SIiIiUhW9nSsiIiJSA1df0ZMTxaUseHtdtY4/VnACH58/kqqAlr4APPk/17pCSYDgti25Z2o0BYUnWPvNbwy96AK+fP9uZtx/NQB/+ftirpv4Gh999hOlpX+Ebs/MW0lxSRk//7qfyfckc8vd7xAz7W28vDwYeXVvgGqfy6mszOEKF6F8qfXmX/ZX9xbV6B4FtCq/H1MnDXWFklAe9D3wt2F4e3vy+epfyT14DIA163YwfFgvZj4wnNsnXozD4WBn9qFq1190ooSnn/8ST08Pln3+MzF3vs1Nf36DZ1/4ik4dW1eaFm4ymZj7+Cj69u7Ap1/8zNup3xEzZgAfLbq9UijpvCZQ6X5u+TX3vALexiojI4Po6GhXKGmz2UhPT1coKSIiIqdQMCkiIiJSA8Oiu3O5pStPP/8lTz33pWu4zek4HHD02AlXQOXrW75gpdV/Akqn4uJS3kz5Fr8W3vTu2R6AFr5eTLt1KF++fzcLn4/B38+b+x7+kJG3vMbuvfnsyz3CJ2lZ3DM1mtcSxxMWGsieffn86fIevLfgNvr27uA6/9nOVVG7tv7kHznu+nP6v3/j2pgFrnCwNu+RczBOS/9Th8S8mfItxcWlDOwbxp595TX26RXK87PG4OFhYmDfMLy8PPh1x4Fq1//Zl1vYvTefN+dN4JHYqzCZyv9+Jt40kPcXTjllSFHmtzvZuu2PYT3XXdXHVXNFwe3K96B0Xrf
weDHXT3qNZV9sPuO9akoMwyAmJoYJEyZgGAZms5nk5GQNuBEREZHT0lJuERERkRpa+FwMM57+jFcWZbLIvp7hw3rh4WEiZ08+xwpOMHn8YMbfOAAoDwQdDgfbdx6kV/cQRg2PYM78Vdz3yIdMHj+Y4LYt2b7zIJ+kbSZ792FeTRxP9y7tKDxejHXqIiaPH4x1dH/+dFkP/nRZD1ZnbOdvD32AbcZH/P2eKwHoGBrAsOjuDIs+dU9HoFrneu/V21zHh4cFsWbdDo4VlC9Dfzv1O0JDAgg+zeCd87lHA/uG0a1zW154bS0FhcX06BrM/gNH+XzVL/x7wy7umRrN2Bv68cisTwF4/MHhrknXvj5e9OoWwr83ZLvu99nq32WUd1eaw1pzuaUrt1kHV1l/SUkZC5O/4dl5K+nWpR3Tp13Osy+s5M777UyOGcKdtw0lNCTAdby5Y2sAfv5lP6EhAfzf4g2UlJQReWGHKs/flBiGgd1ur7Rs22azYbVa3VyZiIiINHQKJkVERERqyNvbk2cevZ47b7Pw2jv/ZmPWHrbvPEhQaz+GDDAzqF8n17EX9e+Ep6cHy1duoVf3EMxhrXllzjhefD2d2S98xYniUjp1bM3Iq3pz+8SL6fSfgMvT04NjBUU88I9PmP/GWiIu7ICXpwf7Dxzl6LEi9uUepXuXdrT092HBonWMGh5Jq5Y+VdZbnXNVdFH/Tiyyr2fYmPmcKC7lcF4h858dW+19NWtyj0wmEwvmWnn2ha9467315B85Tts2/lx2SVc+fPPPruOm33k5F/ZozyWDLqh0nTtuHcqKVVurXf+AyDAAEv+1mudnjTlt/Xc/sJgVq7YyIDKMd+bfQkArX6Iv6cL0Rz7kbft6BkaGMWpEhOv4gX074eFh4i9/f5+g1n7k7MljSswQundpV+171thkZGSQlJTkGmxjNpsZN26cOiRFRESk2kwOh8Nx9sNEREREmpfOnTtjDmtN+if3nPe5vv3eoH1IKy7oFHTKx8rKHK7BLicrOlHCu+9/z+JPNrLTOMTx4yW0CfJjWHR37pg0lJ7dgl1DW/r16cjMB4YzsG8Yx4tKSP/3DhZ/solvv8/G/tpkzGGtz3oup5KSMv58XzI//LSHP13WgykxQ7iof6cqa6xtZ7of1XW2+qfcm8xX6dsYPSKSuL9ewQWd2nDg92OsWLWV95du4uDvBfTsFsxO4xApr9xGmyC/al034aWveO2dbxg8oBPjbxzAqOERpw1zjd15RN/wAhaLhZSUlPN6vfXJ2R2ZmpqKYRiAAkkRERE5dwomRURERKpQm8FkXZv3ajpJr6ympKQMk8mE89s7T08PLrd0Zd6sMQQGtHBzlQ1H4fFiYh9dwqdf/AxQ6Z75tfDmlrGDeMR2FVB+D+tCYwomnWFkZmamqzsStGRbREREzp+WcouIiIg0cvfeEc34G/vz6Rdb2Jd7hJb+PvTsFkzUkM4KJKvg18Kbf80eS9bWfazO2M6hw4UEt2tJ394dGDLAjLe359lP4gaJiYnk5ORUeqxTp06YzWbM5vJp4uHh4a7fnytnEAlU6oyEP7ojrVbreV9HRERERMGkiIiISBMQGhLAnycMcXcZjUpEr1AieoW6u4xqOzkkPBNnaGg2mwkPDwdg6NChpzy/YtCZnZ1dqSOy4rnGjRuHxWIhKirqXMsXEREROYWCSRERERGRRiAlJYXs7OxKjxmGgWEY5OTkuD7mfMz5e2fY6OyCPBNn96XFYsFsNhMVFaXOSBEREakzCiZFREREqmA2m6vdnSZSU5nrdwK4uhmro+KS7epwfv5mZ2dXCiqd56r4K6BuSBEREal3CiZFREREquAMJjO/3YllSGd3lyNNVKdOdTftvKrwUURERKQhqZsxgyIiIiKNnMVicXcJ0oRlfFveManQUERERJozBZMiIiIiZ5DxnyW3InVBy6dFRESkOVMwKSIiIlIFZ8dk6scb3VyJNDXG7jx9XomIiIigYFJERESkSlFRUVgsFozdecQ99rG7y5EmxPn5ZLPZtJRbREREmjUFkyIiIiKnMWfOHKB8grI63KQ2ZH67k8z1OzGbzcTGxrq7HBERERG3MjkcDoe7ixARERFpqOx2O/Hx8QDY7roc66gBmMNau7kqaWycnbeZ/9mzNDk5WftLioiISLOnYFJERETkLBITE0lKSgLAHNYay+DOAEQN6Yy5Y2vMYUHuLE8aIGP3YdfgpIp7SprNZhISEhRKioiIiKBgUkRERKRaDMPAbre7AkqRmrLZbFq+LSIiIlKBgkkRERGRGjAMg+zsbAzDYN26dQBkZ2e7uSppaMLDw+nUqRNmsxmz2awOSREREZEqKJgUERERERERERGReqep3CIiIiIiIiIiIlLvFEyKiIiIiIiIiIhIvVMwKSIiIiIiIiIiIvVOwaSIiIiIiIiIiIjUOwWTIiIiIiIiIiIiUu8UTIqIiIiIiIiIiEi9UzApIiIiIiIiIiIi9U7BpIiIiIiIiIiIiNQ7BZMiIiIiIiIiIiJS7xRMioiIiIiIiIiISL1TMCkiIiIiIiIiIiL1TsGkiIiIiIiIiIiI1DsFkyIiIiIiIiIiIlLvFEyKiIiIiIiIiIhIvVMwKSIiIiIiIiIiIvVOwaSIiIiIiIiIiIjUOwWTIiIiIiIiIiIiUu8UTIqIiIiIiIiIiEi9UzApIiIiIiIiIiIi9U7BpIiIiIiIiIiIiNQ7BZMiIiIiIiIiIiJS7xRMioiIiIiIiIiISL1TMCkiIiIiIiIiIiL1TsGkiIiIiIiIiIiI1DsFkyIiIiIiIiIiIlLvFEyKiIiIiIiIiIhIvVMwKSIiIiIiIiIiIvVOwaSIiIiIiIiIiIjUu/8HP+5u075hZbwAAAAASUVORK5CYII=)"
+ ],
+ "metadata": {
+ "id": "AwhxwHTf4VZp"
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "1mr4fXemsYci",
+ "outputId": "dd51d890-7da2-45ec-b14f-d96169bb8bdf"
+ },
+ "outputs": [
+ {
+ "output_type": "stream",
+ "name": "stdout",
+ "text": [
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m973.5/973.5 kB\u001b[0m \u001b[31m10.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m320.6/320.6 kB\u001b[0m \u001b[31m38.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m18.9/18.9 MB\u001b[0m \u001b[31m71.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m84.1/84.1 kB\u001b[0m \u001b[31m11.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m308.5/308.5 kB\u001b[0m \u001b[31m38.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m122.8/122.8 kB\u001b[0m \u001b[31m15.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.6/75.6 kB\u001b[0m \u001b[31m11.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m22.8/22.8 MB\u001b[0m \u001b[31m58.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m542.0/542.0 kB\u001b[0m \u001b[31m46.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.1/1.1 MB\u001b[0m \u001b[31m51.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.1/2.1 MB\u001b[0m \u001b[31m48.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m71.1/71.1 kB\u001b[0m \u001b[31m8.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m77.9/77.9 kB\u001b[0m \u001b[31m10.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m58.3/58.3 kB\u001b[0m \u001b[31m6.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m53.0/53.0 kB\u001b[0m \u001b[31m8.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m142.5/142.5 kB\u001b[0m \u001b[31m20.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m98.7/98.7 kB\u001b[0m \u001b[31m13.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m116.3/116.3 kB\u001b[0m \u001b[31m18.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m194.1/194.1 kB\u001b[0m \u001b[31m22.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m134.8/134.8 kB\u001b[0m \u001b[31m21.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m49.3/49.3 kB\u001b[0m \u001b[31m8.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25h"
+ ]
+ }
+ ],
+ "source": [
+ "!pip install langchain openai lancedb ragas -q"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "### Setup `OPENAI_API_KEY` as an environment variable"
+ ],
+ "metadata": {
+ "id": "z8hT0Jn74ZmT"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "import os\n",
+ "\n",
+ "os.environ[\"OPENAI_API_KEY\"] = \"sk-proj-...\""
+ ],
+ "metadata": {
+ "id": "YHgQd_1rI04R"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "### Load .txt file and convert them into chunks"
+ ],
+ "metadata": {
+ "id": "mM0tf_vo6GbI"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "import requests\n",
+ "from langchain.document_loaders import TextLoader\n",
+ "from langchain.text_splitter import CharacterTextSplitter\n",
+ "\n",
+ "url = \"https://raw.githubusercontent.com/hwchase17/chroma-langchain/master/state_of_the_union.txt\"\n",
+ "res = requests.get(url)\n",
+ "with open(\"state_of_the_union.txt\", \"w\") as f:\n",
+ " f.write(res.text)\n",
+ "\n",
+ "# Load the data\n",
+ "loader = TextLoader(\"./state_of_the_union.txt\")\n",
+ "documents = loader.load()\n",
+ "\n",
+ "# Chunk the data\n",
+ "text_splitter = CharacterTextSplitter(chunk_size=200, chunk_overlap=10)\n",
+ "chunks = text_splitter.split_documents(documents)"
+ ],
+ "metadata": {
+ "id": "IkLbg-_1I3Rt",
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "outputId": "4248c952-c719-4a30-ee7e-06d2f1b17449"
+ },
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "stream",
+ "name": "stderr",
+ "text": [
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 215, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 232, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 242, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 219, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 304, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 205, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 332, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 215, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 203, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 281, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 201, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 250, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 325, which is longer than the specified 200\n",
+ "WARNING:langchain_text_splitters.base:Created a chunk of size 242, which is longer than the specified 200\n"
+ ]
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "### Setup Retriever\n",
+ "\n",
+ "Retriever utilizes **LanceDB** for scalable vector search and advanced retrieval in RAG, delivering blazing fast performance for searching large sets of embeddings."
+ ],
+ "metadata": {
+ "id": "pgetSLZXEJ2Q"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "from langchain.embeddings import OpenAIEmbeddings\n",
+ "from langchain.vectorstores import LanceDB\n",
+ "import lancedb\n",
+ "\n",
+ "openai_embed = OpenAIEmbeddings()\n",
+ "\n",
+ "# Setup lancedb\n",
+ "db = lancedb.connect(\"/tmp/lancedb\")\n",
+ "table = db.create_table(\n",
+ " \"raga_eval\",\n",
+ " data=[{\"vector\": openai_embed.embed_query(\"Hello World\"), \"text\": \"Hello World\"}],\n",
+ " mode=\"overwrite\",\n",
+ ")\n",
+ "\n",
+ "# Populate vector database\n",
+ "vectorstore = LanceDB.from_documents(\n",
+ " client=table, documents=chunks, embedding=openai_embed, by_text=False\n",
+ ")\n",
+ "\n",
+ "# Define vectorstore as retriever to enable semantic search\n",
+ "retriever = vectorstore.as_retriever()"
+ ],
+ "metadata": {
+ "id": "2PYhU_vvJC0P"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "### Setup RAG Pipeline with Prompt template"
+ ],
+ "metadata": {
+ "id": "9CFnNEfuExj7"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "from langchain.chat_models import ChatOpenAI\n",
+ "from langchain.prompts import ChatPromptTemplate\n",
+ "from langchain.schema.runnable import RunnablePassthrough\n",
+ "from langchain.schema.output_parser import StrOutputParser\n",
+ "\n",
+ "# Define LLM\n",
+ "llm = ChatOpenAI(model_name=\"gpt-4o\", temperature=0)\n",
+ "\n",
+ "# Define Prompt template\n",
+ "template = \"\"\"You are an assistant for question-answering tasks.\n",
+ "Use the following pieces of retrieved context to answer the question.\n",
+ "If you don't know the answer, just say that you don't know.\n",
+ "Use two sentences maximum and keep the answer concise.\n",
+ "Question: {question}\n",
+ "Context: {context}\n",
+ "Answer:\n",
+ "\"\"\"\n",
+ "\n",
+ "prompt = ChatPromptTemplate.from_template(template)\n",
+ "\n",
+ "# Setup RAG pipeline\n",
+ "rag_chain = (\n",
+ " {\"context\": retriever, \"question\": RunnablePassthrough()}\n",
+ " | prompt\n",
+ " | llm\n",
+ " | StrOutputParser()\n",
+ ")"
+ ],
+ "metadata": {
+ "id": "-TiQhbNyLSKv"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "### Sample Questions with their Expected Answers\n",
+ "\n",
+ "Define a set of questions with their answers for creating dataset including ground truth, generated answers with their context using which they are generated."
+ ],
+ "metadata": {
+ "id": "Ge8JtkNXFXhI"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "from datasets import Dataset\n",
+ "\n",
+ "questions = [\n",
+ " \"What did the president say about Justice Breyer?\",\n",
+ " \"What did the president say about Intel's CEO?\",\n",
+ " \"What did the president say about gun violence?\",\n",
+ "]\n",
+ "ground_truth = [\n",
+ " \"The president said that Justice Breyer has dedicated his life to serve the country and thanked him for his service.\",\n",
+ " \"The president said that Pat Gelsinger is ready to increase Intel's investment to $100 billion.\",\n",
+ " \"The president asked Congress to pass proven measures to reduce gun violence.\",\n",
+ "]\n",
+ "answers = []\n",
+ "contexts = []\n",
+ "\n",
+ "# Inference\n",
+ "for query in questions:\n",
+ " answers.append(rag_chain.invoke(query))\n",
+ " contexts.append(\n",
+ " [docs.page_content for docs in retriever.get_relevant_documents(query)]\n",
+ " )\n",
+ "\n",
+ "# To dict\n",
+ "data = {\n",
+ " \"question\": questions,\n",
+ " \"answer\": answers,\n",
+ " \"contexts\": contexts,\n",
+ " \"ground_truth\": ground_truth,\n",
+ "}\n",
+ "\n",
+ "# Convert dict to dataset\n",
+ "dataset = Dataset.from_dict(data)"
+ ],
+ "metadata": {
+ "id": "PGiU57QJMP0J"
+ },
+ "execution_count": null,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "### RAGA Evaluation Pipeline\n",
+ "\n",
+ "Simple pipeline of RAGA for evaluation with the listed metrics to understand and evaluate the RAG system.\n",
+ "\n",
+ "**Metrics** on which we will evaulate are answer_correctness,\n",
+ "faithfulness,\n",
+ "answer_similarity,\n",
+ "context_precision,\n",
+ "context_utilization,\n",
+ "context_recall,\n",
+ "context_relevancy,\n",
+ "answer_relevancy, and\n",
+ "context_entity_recall"
+ ],
+ "metadata": {
+ "id": "szBZ1nwkFruF"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "from ragas import evaluate\n",
+ "from ragas.metrics import (\n",
+ " answer_correctness,\n",
+ " faithfulness,\n",
+ " answer_similarity,\n",
+ " context_precision,\n",
+ " context_utilization,\n",
+ " context_recall,\n",
+ " context_relevancy,\n",
+ " answer_relevancy,\n",
+ " context_entity_recall,\n",
+ ")\n",
+ "\n",
+ "\n",
+ "# evaluating dataest on listed metrics\n",
+ "result = evaluate(\n",
+ " dataset=dataset,\n",
+ " metrics=[\n",
+ " answer_correctness,\n",
+ " faithfulness,\n",
+ " answer_similarity,\n",
+ " context_precision,\n",
+ " context_utilization,\n",
+ " context_recall,\n",
+ " context_relevancy,\n",
+ " answer_relevancy,\n",
+ " context_entity_recall,\n",
+ " ],\n",
+ ")\n",
+ "\n",
+ "\n",
+ "df = result.to_pandas()"
+ ],
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 49,
+ "referenced_widgets": [
+ "91f4187ef74b4c0791fa9058899f7454",
+ "d9fb5f1092e24ba59cb768842ff8f828",
+ "6441a4ce0f644c51aa6c140de43ed31d",
+ "a6b459a38e5c4386b85ee7ebc0e302a4",
+ "cf96076499974020b541a541648028f4",
+ "82cd48cdf6f144e19cb3ef0a0553b689",
+ "8e537aa094004828b08a55a94cbd7dff",
+ "55a45baafaad4a4ca2cad720da0a80aa",
+ "7f2a940f7f114e439f63e5e900748511",
+ "50282721d29447c89bc05559a99183dd",
+ "23c5a724d0fa40efac45f1fe8fc0b9c5"
+ ]
+ },
+ "id": "Samkm2TnMUQA",
+ "outputId": "f50e72d4-55fd-4f74-f0f6-334af8353ae2"
+ },
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "display_data",
+ "data": {
+ "text/plain": [
+ "Evaluating: 0%| | 0/27 [00:00, ?it/s]"
+ ],
+ "application/vnd.jupyter.widget-view+json": {
+ "version_major": 2,
+ "version_minor": 0,
+ "model_id": "91f4187ef74b4c0791fa9058899f7454"
+ }
+ },
+ "metadata": {}
+ }
+ ]
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "# evaluation metrics\n",
+ "df"
+ ],
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 441
+ },
+ "id": "77QVRXtVMbCG",
+ "outputId": "593cb344-b7dc-4ddc-b00b-559d62a3cec6"
+ },
+ "execution_count": null,
+ "outputs": [
+ {
+ "output_type": "execute_result",
+ "data": {
+ "text/plain": [
+ " question \\\n",
+ "0 What did the president say about Justice Breyer? \n",
+ "1 What did the president say about Intel's CEO? \n",
+ "2 What did the president say about gun violence? \n",
+ "\n",
+ " answer \\\n",
+ "0 The president honored Justice Stephen Breyer a... \n",
+ "1 The president said that Intel’s CEO, Pat Gelsi... \n",
+ "2 The president called on Congress to pass prove... \n",
+ "\n",
+ " contexts \\\n",
+ "0 [And I did that 4 days ago, when I nominated C... \n",
+ "1 [Intel’s CEO, Pat Gelsinger, who is here tonig... \n",
+ "2 [And I ask Congress to pass proven measures to... \n",
+ "\n",
+ " ground_truth answer_correctness \\\n",
+ "0 The president said that Justice Breyer has ded... 0.415487 \n",
+ "1 The president said that Pat Gelsinger is ready... 0.619998 \n",
+ "2 The president asked Congress to pass proven me... 0.606230 \n",
+ "\n",
+ " faithfulness answer_similarity context_precision context_utilization \\\n",
+ "0 1.0 0.911948 1.0 1.0 \n",
+ "1 0.0 0.980103 1.0 1.0 \n",
+ "2 1.0 0.924895 1.0 1.0 \n",
+ "\n",
+ " context_recall context_relevancy answer_relevancy context_entity_recall \n",
+ "0 1.0 0.200000 0.841589 0.500000 \n",
+ "1 1.0 0.090909 0.897084 0.750000 \n",
+ "2 1.0 0.250000 0.914888 0.666667 "
+ ],
+ "text/html": [
+ "\n",
+ "
\n",
+ "
\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " question | \n",
+ " answer | \n",
+ " contexts | \n",
+ " ground_truth | \n",
+ " answer_correctness | \n",
+ " faithfulness | \n",
+ " answer_similarity | \n",
+ " context_precision | \n",
+ " context_utilization | \n",
+ " context_recall | \n",
+ " context_relevancy | \n",
+ " answer_relevancy | \n",
+ " context_entity_recall | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " 0 | \n",
+ " What did the president say about Justice Breyer? | \n",
+ " The president honored Justice Stephen Breyer a... | \n",
+ " [And I did that 4 days ago, when I nominated C... | \n",
+ " The president said that Justice Breyer has ded... | \n",
+ " 0.415487 | \n",
+ " 1.0 | \n",
+ " 0.911948 | \n",
+ " 1.0 | \n",
+ " 1.0 | \n",
+ " 1.0 | \n",
+ " 0.200000 | \n",
+ " 0.841589 | \n",
+ " 0.500000 | \n",
+ "
\n",
+ " \n",
+ " 1 | \n",
+ " What did the president say about Intel's CEO? | \n",
+ " The president said that Intel’s CEO, Pat Gelsi... | \n",
+ " [Intel’s CEO, Pat Gelsinger, who is here tonig... | \n",
+ " The president said that Pat Gelsinger is ready... | \n",
+ " 0.619998 | \n",
+ " 0.0 | \n",
+ " 0.980103 | \n",
+ " 1.0 | \n",
+ " 1.0 | \n",
+ " 1.0 | \n",
+ " 0.090909 | \n",
+ " 0.897084 | \n",
+ " 0.750000 | \n",
+ "
\n",
+ " \n",
+ " 2 | \n",
+ " What did the president say about gun violence? | \n",
+ " The president called on Congress to pass prove... | \n",
+ " [And I ask Congress to pass proven measures to... | \n",
+ " The president asked Congress to pass proven me... | \n",
+ " 0.606230 | \n",
+ " 1.0 | \n",
+ " 0.924895 | \n",
+ " 1.0 | \n",
+ " 1.0 | \n",
+ " 1.0 | \n",
+ " 0.250000 | \n",
+ " 0.914888 | \n",
+ " 0.666667 | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
\n",
+ "
\n",
+ "
\n"
+ ],
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "dataframe",
+ "variable_name": "df",
+ "summary": "{\n \"name\": \"df\",\n \"rows\": 3,\n \"fields\": [\n {\n \"column\": \"question\",\n \"properties\": {\n \"dtype\": \"string\",\n \"num_unique_values\": 3,\n \"samples\": [\n \"What did the president say about Justice Breyer?\",\n \"What did the president say about Intel's CEO?\",\n \"What did the president say about gun violence?\"\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"answer\",\n \"properties\": {\n \"dtype\": \"string\",\n \"num_unique_values\": 3,\n \"samples\": [\n \"The president honored Justice Stephen Breyer as an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court, and thanked him for his service. He also mentioned that Circuit Court of Appeals Judge Ketanji Brown Jackson, who he nominated, will continue Justice Breyer\\u2019s legacy of excellence.\",\n \"The president said that Intel\\u2019s CEO, Pat Gelsinger, told him they are ready to increase their investment from $20 billion to $100 billion.\",\n \"The president called on Congress to pass proven measures to reduce gun violence, including universal background checks and banning assault weapons and high-capacity magazines. He also questioned why individuals on a terrorist list should be able to purchase a weapon and advocated for repealing the liability shield that protects gun manufacturers from being sued.\"\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"contexts\",\n \"properties\": {\n \"dtype\": \"object\",\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"ground_truth\",\n \"properties\": {\n \"dtype\": \"string\",\n \"num_unique_values\": 3,\n \"samples\": [\n \"The president said that Justice Breyer has dedicated his life to serve the country and thanked him for his service.\",\n \"The president said that Pat Gelsinger is ready to increase Intel's investment to $100 billion.\",\n \"The president asked Congress to pass proven measures to reduce gun violence.\"\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"answer_correctness\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.11430741128276067,\n \"min\": 0.4154869951100285,\n \"max\": 0.6199979663207625,\n \"num_unique_values\": 3,\n \"samples\": [\n 0.4154869951100285,\n 0.6199979663207625,\n 0.6062297668831289\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"faithfulness\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.5773502691896258,\n \"min\": 0.0,\n \"max\": 1.0,\n \"num_unique_values\": 2,\n \"samples\": [\n 0.0,\n 1.0\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"answer_similarity\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.03619563595630723,\n \"min\": 0.911947980440114,\n \"max\": 0.9801032234987175,\n \"num_unique_values\": 3,\n \"samples\": [\n 0.911947980440114,\n 0.9801032234987175\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"context_precision\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 2.8867515847990063e-11,\n \"min\": 0.9999999999,\n \"max\": 0.99999999995,\n \"num_unique_values\": 2,\n \"samples\": [\n 0.9999999999,\n 0.99999999995\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"context_utilization\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 2.8867515847990063e-11,\n \"min\": 0.9999999999,\n \"max\": 0.99999999995,\n 
\"num_unique_values\": 2,\n \"samples\": [\n 0.9999999999,\n 0.99999999995\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"context_recall\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.0,\n \"min\": 1.0,\n \"max\": 1.0,\n \"num_unique_values\": 1,\n \"samples\": [\n 1.0\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"context_relevancy\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.08135390156762909,\n \"min\": 0.09090909090909091,\n \"max\": 0.25,\n \"num_unique_values\": 3,\n \"samples\": [\n 0.2\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"answer_relevancy\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.038230490911004,\n \"min\": 0.8415890798965432,\n \"max\": 0.9148879952296768,\n \"num_unique_values\": 3,\n \"samples\": [\n 0.8415890798965432\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n },\n {\n \"column\": \"context_entity_recall\",\n \"properties\": {\n \"dtype\": \"number\",\n \"std\": 0.12729376960740932,\n \"min\": 0.4999999975,\n \"max\": 0.7499999981250001,\n \"num_unique_values\": 3,\n \"samples\": [\n 0.4999999975\n ],\n \"semantic_type\": \"\",\n \"description\": \"\"\n }\n }\n ]\n}"
+ }
+ },
+ "metadata": {},
+ "execution_count": 22
+ }
+ ]
+ }
+ ]
+}
\ No newline at end of file
diff --git a/examples/Evaluating_RAG_with_RAGAs/README.md b/examples/Evaluating_RAG_with_RAGAs/README.md
new file mode 100644
index 00000000..98c70693
--- /dev/null
+++ b/examples/Evaluating_RAG_with_RAGAs/README.md
@@ -0,0 +1,13 @@
+# Evaluating RAG with RAGAs and GPT-4o
+
+
+
+Ragas is a **framework for evaluating Retrieval Augmented Generation (RAG) pipelines**.
+
+Ragas provides tools and metrics, based on the latest research, for evaluating LLM-generated text and giving you insight into your RAG pipeline. Ragas can also be integrated into your CI/CD pipeline to run continuous performance checks.
+
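+A minimal sketch of what an evaluation run looks like (the tiny example rows below are illustrative; the full, runnable pipeline is in the notebook):
+
+```python
+from datasets import Dataset
+from ragas import evaluate
+from ragas.metrics import faithfulness, answer_relevancy
+
+# Ragas scores these metrics with an LLM (OpenAI by default), so OPENAI_API_KEY must be set.
+# In practice, the lists below come from your own RAG pipeline.
+data = {
+    "question": ["What did the president say about gun violence?"],
+    "answer": ["He asked Congress to pass proven measures to reduce gun violence."],
+    "contexts": [["And I ask Congress to pass proven measures to reduce gun violence."]],
+    "ground_truth": ["The president asked Congress to pass proven measures to reduce gun violence."],
+}
+
+result = evaluate(dataset=Dataset.from_dict(data), metrics=[faithfulness, answer_relevancy])
+print(result.to_pandas())
+```
+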
+GPT-4o is used as the LLM to generate responses from semantically similar context chunks.
+
+![flow](../../assets/rag_evaluation_flow.png)
+
+Try it out on Colab -
\ No newline at end of file
diff --git a/examples/LlamaIndex-demo/lancedb_cloud/README.md b/examples/LlamaIndex-demo/lancedb_cloud/README.md
new file mode 100644
index 00000000..636ab133
--- /dev/null
+++ b/examples/LlamaIndex-demo/lancedb_cloud/README.md
@@ -0,0 +1,29 @@
+# LlamaIndex and LanceDB Cloud Demo
+
+In this demo, we show how to use LanceDB Cloud to perform vector search from LlamaIndex.
+
+
+### Set credentials
+If you would like to set the API key through an environment variable:
+```
+export LANCEDB_API_KEY="sk_..."
+```
+or
+```
+import os
+import getpass
+
+os.environ["LANCEDB_API_KEY"] = getpass.getpass("Enter Your LANCEDB API Key:")
+```
+
+Replace the following lines in main.py with your project slug and API key:
+```
+db_url="db://your-project-slug-name"
+api_key="sk_..."
+region="us-east-1"
+```
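+
+These values are ultimately handed to LlamaIndex's `LanceDBVectorStore`, roughly as in the sketch below (a sketch, not the exact contents of main.py; the table name is just an example):
+
+```python
+from llama_index.vector_stores.lancedb import LanceDBVectorStore
+
+vector_store = LanceDBVectorStore(
+    uri=db_url,      # your remote DB URI, e.g. "db://your-project-slug-name"
+    api_key=api_key, # LanceDB Cloud API key ("sk_...")
+    region=region,   # the region you configured, e.g. "us-east-1"
+    table_name="llamaindex_vectorstore",  # optional; the default table name is "vectors"
+)
+```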
+
+### Run the script
+```bash
+OPENAI_API_KEY=... python main.py
+```
\ No newline at end of file
diff --git a/examples/LlamaIndex-demo/lancedb_cloud/main.ipynb b/examples/LlamaIndex-demo/lancedb_cloud/main.ipynb
new file mode 100644
index 00000000..6a36bd18
--- /dev/null
+++ b/examples/LlamaIndex-demo/lancedb_cloud/main.ipynb
@@ -0,0 +1,391 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "id": "13cb272e",
+ "metadata": {},
+ "source": [
+ "# Vector search with LanceDB Cloud and LlamaIndex \n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "9a0e829a",
+ "metadata": {
+ "id": "wgPbKbpumkhH"
+ },
+ "source": [
+ "### Credentials\n",
+ "\n",
+ "Copy and paste the project name and the api key from your project page.\n",
+ "These will be used later to [connect to LanceDB Cloud](#scroll-to=5q8m6GMD7sGu)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "id": "6553603f",
+ "metadata": {
+ "id": "rqEXT5-fmofw"
+ },
+ "outputs": [],
+ "source": [
+ "project_slug = \"your-project-slug\" # @param {type:\"string\"}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "id": "36ef9c45",
+ "metadata": {
+ "id": "5LYmBomPmswi"
+ },
+ "outputs": [],
+ "source": [
+ "api_key = \"sk_...\" # @param {type:\"string\"}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "33ba6af1",
+ "metadata": {
+ "id": "Xs6tr6CMnBrr"
+ },
+ "source": [
+ "You can also set the LANCEDB_API_KEY as an environment variable. More details can be found **here**."
+ ]
+ },
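+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Optional minimal sketch (an assumption, not required by the rest of this notebook,\n",
+ "# which passes the key explicitly): expose the key via an environment variable.\n",
+ "import os\n",
+ "\n",
+ "os.environ[\"LANCEDB_API_KEY\"] = api_key"
+ ]
+ },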
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "Le27BWs2vDbB"
+ },
+ "source": [
+ "Since we will be using OPENAI API, let us set the OPENAI API KEY as well."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "-2-fyVPKu9fl"
+ },
+ "outputs": [],
+ "source": [
+ "openai_api_key = \"sk-...\" # @param {type:\"string\"}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "1991331f-4316-417a-b693-e2f27cbe9ea7",
+ "metadata": {},
+ "source": [
+ "### Installing dependencies"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "e8a49c31",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "! pip install llama-index-vector-stores-lancedb llama-index-readers-file llama-index-embeddings-openai llama-index-llms-openai"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "0QQL4lm8lTzg"
+ },
+ "source": [
+ "### Importing libraries"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "vP6d6JUShgqo"
+ },
+ "outputs": [],
+ "source": [
+ "import openai\n",
+ "import logging\n",
+ "import sys\n",
+ "\n",
+ "# Uncomment to see debug logs\n",
+ "# logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)\n",
+ "# logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))\n",
+ "\n",
+ "from llama_index.core import SimpleDirectoryReader, Document, StorageContext\n",
+ "from llama_index.core import VectorStoreIndex\n",
+ "from llama_index.vector_stores.lancedb import LanceDBVectorStore\n",
+ "import textwrap\n",
+ "\n",
+ "openai.api_key = openai_api_key\n",
+ "assert openai.models.list() is not None"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "8eKRYd2F7v5n"
+ },
+ "source": [
+ "### Download the data\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "l0ezDr7suAf_"
+ },
+ "outputs": [],
+ "source": [
+ "! mkdir -p 'data/paul_graham/'\n",
+ "! wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'\n",
+ "! ls 'data/paul_graham/'"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "HJf8xZmX8VJC"
+ },
+ "source": [
+ "Load the documents stored in the data/paul_graham/ using the SimpleDirectoryReader:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "5aljyqpUiViE"
+ },
+ "outputs": [],
+ "source": [
+ "documents = SimpleDirectoryReader(\"data/paul_graham/\").load_data()\n",
+ "print(\"Document ID:\", documents[0].doc_id, \"Document Hash:\", documents[0].hash)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "IiM4DJvC_2dV"
+ },
+ "source": [
+ "### Store data in LanceDB Cloud\n",
+ "\n",
+ "Let's connect to LanceDB so we can store our documents, It requires 0 setup !"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "GV77SSi-AK0v"
+ },
+ "outputs": [],
+ "source": [
+ "uri = \"db://\" + project_slug\n",
+ "table_name = \"llamaindex_vectorstore\" #optional, default table name is \"vectors\" \n",
+ "\n",
+ "vector_store = LanceDBVectorStore( \n",
+ " uri=uri, # your remote DB URI\n",
+ " api_key=\"sk_..\", # lancedb cloud api key\n",
+ " region=\"your-region\" # the region you configured\n",
+ " ...\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "sZOUxfqzXr1m"
+ },
+ "source": [
+ "### Create an index"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "4nDltKClAhhU"
+ },
+ "outputs": [],
+ "source": [
+ "storage_context = StorageContext.from_defaults(vector_store=vector_store)\n",
+ "\n",
+ "index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "xoS-WKXMXvvR"
+ },
+ "source": [
+ "And thats it! We're all setup. The next step is to run some queries, let's try a few:"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "7SKSlyq2iwpK"
+ },
+ "source": [
+ "### Query the index\n",
+ "We can now ask questions using the created index. Filtering can be enabled via `MetadataFilters` or use native lance `where` clause."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "5eb6419b",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from datetime import datetime\n",
+ "from llama_index.core.vector_stores import (\n",
+ " MetadataFilters,\n",
+ " FilterOperator,\n",
+ " FilterCondition,\n",
+ " MetadataFilter,\n",
+ ")\n",
+ "\n",
+ "date = datetime.today().strftime(\"%Y-%m-%d\")\n",
+ "query_filters = MetadataFilters(\n",
+ " filters=[\n",
+ " MetadataFilter(\n",
+ " key=\"creation_date\",\n",
+ " operator=FilterOperator.EQ,\n",
+ " value=date, # using current date as the latest data is scraped\n",
+ " ),\n",
+ " MetadataFilter(key=\"file_size\", value=75040, operator=FilterOperator.GT),\n",
+ " ],\n",
+ " condition=FilterCondition.AND,\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "Viaweb charged $100 a month for a small store and $300 a month for a big one.\n",
+ "metadata - ..."
+ ]
+ },
+ "execution_count": 15,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query_engine = index.as_query_engine(\n",
+ " filters=query_filters,\n",
+ ")\n",
+ "\n",
+ "response = query_engine.query(\"How much did Viaweb charge per month?\")\n",
+ "print(response)\n",
+ "print(\"metadata -\", response.metadata)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "0c1c6c73",
+ "metadata": {},
+ "source": [
+ "Let's use LanceDB filters(SQL like) directly via the `where` clause :"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "0a2bcc07",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "lance_filter = \"metadata.file_name = 'paul_graham_essay.txt' \"\n",
+ "retriever = index.as_retriever(vector_store_kwargs={\"where\": lance_filter})\n",
+ "response = retriever.retrieve(\"What did the author do growing up?\")\n",
+ "print(response[0].get_content())\n",
+ "print(\"metadata -\", response[0].metadata)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "sZOUxfqzXr1m"
+ },
+ "source": [
+ "### Append data to the index \n",
+ "You can also add data to an existing index"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "069fc099",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "del index\n",
+ "\n",
+ "index = VectorStoreIndex.from_documents(\n",
+ " [Document(text=\"The sky is purple in Portland, Maine\")],\n",
+ " uri=\"/tmp/new_dataset\",\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "b5cffcfe",
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Portland, Maine\n"
+ ]
+ }
+ ],
+ "source": [
+ "query_engine = index.as_query_engine()\n",
+ "response = query_engine.query(\"Where is the sky purple?\")\n",
+ "print(textwrap.fill(str(response), 100))"
+ ]
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "provenance": []
+ },
+ "kernelspec": {
+ "display_name": "Python 3",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.1"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
diff --git a/examples/LlamaIndex-demo/lancedb_cloud/main.py b/examples/LlamaIndex-demo/lancedb_cloud/main.py
new file mode 100644
index 00000000..ef4e168d
--- /dev/null
+++ b/examples/LlamaIndex-demo/lancedb_cloud/main.py
@@ -0,0 +1,89 @@
+import os
+import textwrap
+from datetime import datetime
+
+import openai
+import requests
+from llama_index.core import (
+ Document,
+ SimpleDirectoryReader,
+ StorageContext,
+ VectorStoreIndex,
+)
+from llama_index.vector_stores.lancedb import LanceDBVectorStore
+
+if __name__ == "__main__":
+ if "OPENAI_API_KEY" not in os.environ:
+ raise ValueError("OPENAI_API_KEY environment variable not set. Please set it")
+ else:
+ openai.api_key = os.environ["OPENAI_API_KEY"]
+
+ # Download the document
+ data_path = r"data/paul_graham/"
+ if not os.path.exists(data_path):
+ os.makedirs(data_path)
+ url = "https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt"
+ r = requests.get(url)
+ with open(data_path + "/paul_graham_essay.txt", "wb") as f:
+ f.write(r.content)
+
+ # Load the document
+ documents = SimpleDirectoryReader(data_path).load_data()
+ print("Document ID:", documents[0].doc_id, "Document Hash:", documents[0].hash)
+
+ # Create a LanceDBVectorStore and create an index
+ vector_store = LanceDBVectorStore(
+ uri="db://your-project-slug", # your remote DB URI
+ api_key="sk_...", # lancedb cloud api key
+ region="us-east-1", # the region you configured
+ )
+
+ storage_context = StorageContext.from_defaults(vector_store=vector_store)
+
+ index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
+
+ # Query via MetadataFilters
+ from llama_index.core.vector_stores import (
+ FilterCondition,
+ FilterOperator,
+ MetadataFilter,
+ MetadataFilters,
+ )
+
+ date = datetime.today().strftime("%Y-%m-%d")
+ query_filters = MetadataFilters(
+ filters=[
+ MetadataFilter(key="creation_date", operator=FilterOperator.EQ, value=date),
+ MetadataFilter(key="file_size", value=75040, operator=FilterOperator.GT),
+ ],
+ condition=FilterCondition.AND,
+ )
+
+ query_engine = index.as_query_engine(
+ filters=query_filters,
+ )
+
+ response = query_engine.query("How much did Viaweb charge per month?")
+ print("==== query via MetadataFilters")
+ print(response)
+ print("metadata -", response.metadata)
+
+ # Query via LanceDB where clause
+ lance_filter = "metadata.file_name = 'paul_graham_essay.txt' "
+ retriever = index.as_retriever(vector_store_kwargs={"where": lance_filter})
+ response = retriever.retrieve("What did the author do growing up?")
+ print("==== query via LanceDB where clause")
+ print(response[0].get_content())
+ print("metadata -", response[0].metadata)
+
+ # add data to an existing index and query with the new data
+ del index
+
+ index = VectorStoreIndex.from_documents(
+ [Document(text="The sky is purple in Portland, Maine")],
+ uri="/tmp/new_dataset",
+ )
+ query_engine = index.as_query_engine()
+ response = query_engine.query("Where is the sky purple?")
+ print("==== query with new data")
+ print(textwrap.fill(str(response), 100))
diff --git a/examples/LlamaIndex-demo/lancedb_cloud/requirements.txt b/examples/LlamaIndex-demo/lancedb_cloud/requirements.txt
new file mode 100644
index 00000000..8272f875
--- /dev/null
+++ b/examples/LlamaIndex-demo/lancedb_cloud/requirements.txt
@@ -0,0 +1,5 @@
+llama-index-vector-stores-lancedb
+llama-index-readers-file
+llama-index-embeddings-openai
+llama-index-llms-openai
+lancedb
\ No newline at end of file
diff --git a/examples/QueryExpansion&Reranker/README.md b/examples/QueryExpansion&Reranker/README.md
index b89b35d8..3daf63c6 100644
--- a/examples/QueryExpansion&Reranker/README.md
+++ b/examples/QueryExpansion&Reranker/README.md
@@ -19,4 +19,4 @@ Our focus is on improving the precision and recall of document retrieval process
For a detailed exploration of the concepts and methodologies discussed in this project,
visit our blog
-[Read the Blog Post](https://blog.lancedb.com/improving-rag-with-query-expansion-reranking-models/)
+[Read the Blog Post](https://aksdesai1998.medium.com/improving-rag-with-query-expansion-reranking-models-31d252856580)
diff --git a/examples/SuperAgent_Autogen/main.ipynb b/examples/SuperAgent_Autogen/main.ipynb
index 9278654b..13047b03 100644
--- a/examples/SuperAgent_Autogen/main.ipynb
+++ b/examples/SuperAgent_Autogen/main.ipynb
@@ -11,55 +11,53 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 1,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "3poVgyh-bZJ-",
- "outputId": "ad799a6e-7eec-4e14-dae3-f7e86c9e67cc"
+ "outputId": "7244e7cf-8eca-481d-fd05-d21a82460d47"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m88.8/88.8 kB\u001b[0m \u001b[31m2.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m811.8/811.8 kB\u001b[0m \u001b[31m9.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m226.7/226.7 kB\u001b[0m \u001b[31m9.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.8/1.8 MB\u001b[0m \u001b[31m14.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m111.9/111.9 kB\u001b[0m \u001b[31m12.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m284.0/284.0 kB\u001b[0m \u001b[31m13.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m295.2/295.2 kB\u001b[0m \u001b[31m12.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m77.0/77.0 kB\u001b[0m \u001b[31m3.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.6/1.6 MB\u001b[0m \u001b[31m21.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m239.4/239.4 kB\u001b[0m \u001b[31m23.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m55.7/55.7 kB\u001b[0m \u001b[31m5.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m21.6/21.6 MB\u001b[0m \u001b[31m34.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m38.3/38.3 MB\u001b[0m \u001b[31m12.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m49.4/49.4 kB\u001b[0m \u001b[31m3.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m55.4/55.4 kB\u001b[0m \u001b[31m4.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m98.7/98.7 kB\u001b[0m \u001b[31m8.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
- "\u001b[?25h\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n",
- "llmx 0.0.15a0 requires cohere, which is not installed.\n",
- "ibis-framework 7.1.0 requires pyarrow<15,>=2, but you have pyarrow 15.0.0 which is incompatible.\u001b[0m\u001b[31m\n",
- "\u001b[0m"
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m88.8/88.8 kB\u001b[0m \u001b[31m1.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m973.5/973.5 kB\u001b[0m \u001b[31m6.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.1/2.1 MB\u001b[0m \u001b[31m12.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m320.6/320.6 kB\u001b[0m \u001b[31m14.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.1/1.1 MB\u001b[0m \u001b[31m19.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m18.9/18.9 MB\u001b[0m \u001b[31m36.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m290.4/290.4 kB\u001b[0m \u001b[31m22.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m45.5/45.5 kB\u001b[0m \u001b[31m3.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m296.7/296.7 kB\u001b[0m \u001b[31m20.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m77.0/77.0 kB\u001b[0m \u001b[31m2.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m308.5/308.5 kB\u001b[0m \u001b[31m24.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m122.8/122.8 kB\u001b[0m \u001b[31m10.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m22.8/22.8 MB\u001b[0m \u001b[31m30.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m49.3/49.3 kB\u001b[0m \u001b[31m3.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m53.0/53.0 kB\u001b[0m \u001b[31m5.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m142.5/142.5 kB\u001b[0m \u001b[31m1.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m98.7/98.7 kB\u001b[0m \u001b[31m9.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25h"
]
}
],
"source": [
- "%pip install pyautogen~=0.1.0 langchain openai tiktoken lancedb pypdf -q -U"
+ "%pip install pyautogen~=0.1.0 langchain langchain_community openai tiktoken lancedb pypdf -q -U"
]
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 2,
"metadata": {
"id": "0tLTTT9ucFEb"
},
"outputs": [],
"source": [
- "from langchain.embeddings import OpenAIEmbeddings\n",
+ "from langchain_community.embeddings import OpenAIEmbeddings\n",
"from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
"from langchain.document_loaders import PyPDFLoader\n",
"from langchain.memory import ConversationBufferMemory\n",
@@ -80,60 +78,88 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 3,
"metadata": {
- "id": "6RuVu12whCG0"
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "6RuVu12whCG0",
+ "outputId": "1300d37d-65ae-4e24-cbfb-d1cc701db28d"
},
- "outputs": [],
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Requirement already satisfied: pyautogen in /usr/local/lib/python3.10/dist-packages (0.1.14)\n",
+ "Requirement already satisfied: diskcache in /usr/local/lib/python3.10/dist-packages (from pyautogen) (5.6.3)\n",
+ "Requirement already satisfied: flaml in /usr/local/lib/python3.10/dist-packages (from pyautogen) (2.1.2)\n",
+ "Requirement already satisfied: openai<1 in /usr/local/lib/python3.10/dist-packages (from pyautogen) (0.28.1)\n",
+ "Requirement already satisfied: python-dotenv in /usr/local/lib/python3.10/dist-packages (from pyautogen) (1.0.1)\n",
+ "Requirement already satisfied: termcolor in /usr/local/lib/python3.10/dist-packages (from pyautogen) (2.4.0)\n",
+ "Requirement already satisfied: requests>=2.20 in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (2.31.0)\n",
+ "Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (4.66.4)\n",
+ "Requirement already satisfied: aiohttp in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (3.9.5)\n",
+ "Requirement already satisfied: NumPy>=1.17 in /usr/local/lib/python3.10/dist-packages (from flaml->pyautogen) (1.25.2)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (3.3.2)\n",
+ "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (3.7)\n",
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (2.0.7)\n",
+ "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (2024.2.2)\n",
+ "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (1.3.1)\n",
+ "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (23.2.0)\n",
+ "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (1.4.1)\n",
+ "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (6.0.5)\n",
+ "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (1.9.4)\n",
+ "Requirement already satisfied: async-timeout<5.0,>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (4.0.3)\n"
+ ]
+ }
+ ],
"source": [
"!pip install pyautogen"
]
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 38,
"metadata": {
"id": "sUFdvTyVh8xF"
},
"outputs": [],
"source": [
"import lancedb\n",
+ "import os\n",
+ "\n",
+ "# setup OPENAI API KEY\n",
+ "os.environ[\"OPENAI_API_KEY\"] = \"sk-....\"\n",
"\n",
- "embeddings = OpenAIEmbeddings(openai_api_key=\"sk-yourapikey\")"
+ "embeddings = OpenAIEmbeddings()"
]
},
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "kztlyFIXU8m-"
- },
- "source": []
- },
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 5,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "ODKg12trdhX-",
- "outputId": "a7041322-f633-496c-a8e8-126a81cbb5d9"
+ "outputId": "4736d14d-454c-4458-98c4-d2d0d2edceb9"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
- "--2024-02-11 04:40:16-- https://pdf.usaid.gov/pdf_docs/PA00TBCT.pdf\n",
- "Resolving pdf.usaid.gov (pdf.usaid.gov)... 23.7.61.67, 2600:1408:ec00:380::1923, 2600:1408:ec00:38f::1923\n",
- "Connecting to pdf.usaid.gov (pdf.usaid.gov)|23.7.61.67|:443... connected.\n",
+ "--2024-05-28 05:17:50-- https://pdf.usaid.gov/pdf_docs/PA00TBCT.pdf\n",
+ "Resolving pdf.usaid.gov (pdf.usaid.gov)... 23.4.180.157, 2600:1408:5400:197::1923, 2600:1408:5400:183::1923\n",
+ "Connecting to pdf.usaid.gov (pdf.usaid.gov)|23.4.180.157|:443... connected.\n",
"HTTP request sent, awaiting response... 200 OK\n",
"Length: 6419525 (6.1M) [application/pdf]\n",
"Saving to: ‘food.pdf’\n",
"\n",
- "food.pdf 100%[===================>] 6.12M --.-KB/s in 0.1s \n",
+ "food.pdf 100%[===================>] 6.12M 27.0MB/s in 0.2s \n",
"\n",
- "2024-02-11 04:40:16 (42.7 MB/s) - ‘food.pdf’ saved [6419525/6419525]\n",
+ "2024-05-28 05:17:51 (27.0 MB/s) - ‘food.pdf’ saved [6419525/6419525]\n",
"\n"
]
}
@@ -143,14 +169,40 @@
"!wget -O food.pdf https://pdf.usaid.gov/pdf_docs/PA00TBCT.pdf"
]
},
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "yV0pNPiRPy8h"
+ },
+ "source": [
+ "# create file name with OAI_CONFIG_LIT.json"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 53,
+ "metadata": {
+ "id": "yWDhjTDMcFBi"
+ },
+ "outputs": [],
+ "source": [
+ "# create file name with OAI_CONFIG_LIT.\n",
+ "import json\n",
+ "\n",
+ "config = [{\"model\": \"gpt-4\", \"api_key\": os.environ[\"OPENAI_API_KEY\"]}]\n",
+ "\n",
+ "with open(\"OAI_CONFIG_LIT.json\", \"w\") as fp:\n",
+ " json.dump(config, fp)"
+ ]
+ },
{
"cell_type": "markdown",
"metadata": {
"id": "1oC3NAFyd4Kb"
},
"source": [
- "create OAI_CONFIG_LIST.json file in pwd & upload\n",
- "in it\n",
+ "**create OAI_CONFIG_LIST.json file in pwd & upload\n",
+ "in it**\n",
"\n",
"\n",
"[\n",
@@ -163,7 +215,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 41,
"metadata": {
"id": "H1bRXWu-cE_C"
},
@@ -179,30 +231,9 @@
")"
]
},
- {
- "cell_type": "markdown",
- "metadata": {
- "id": "yV0pNPiRPy8h"
- },
- "source": [
- "# create file name with OAI_CONFIG_LIT.json & put below authentications code"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "id": "yWDhjTDMcFBi"
- },
- "outputs": [],
- "source": [
- "# create file name with OAI_CONFIG_LIT.\n",
- "[{\"model\": \"gpt-4\", \"api_key\": \"sk-yourapikey\"}]"
- ]
- },
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 42,
"metadata": {
"id": "5gapqmsscFG-"
},
@@ -218,70 +249,30 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 43,
"metadata": {
- "colab": {
- "base_uri": "https://localhost:8080/"
- },
- "id": "5dLkCqa0dLXV",
- "outputId": "28ab5984-c875-4281-95e5-d48bfdd12e99"
+ "id": "5dLkCqa0dLXV"
},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/usr/local/lib/python3.10/dist-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The class `langchain_community.embeddings.openai.OpenAIEmbeddings` was deprecated in langchain-community 0.1.0 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run `pip install -U langchain-openai` and import as `from langchain_openai import OpenAIEmbeddings`.\n",
- " warn_deprecated(\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"import lancedb\n",
"\n",
- "embeddings = OpenAIEmbeddings(openai_api_key=\"sk-yourapikey\")\n",
- "\n",
- "db = lancedb.connect(\"/tmp/lancedb\")\n",
- "table = db.create_table(\n",
- " \"my_table\",\n",
- " data=[\n",
- " {\n",
- " \"vector\": embeddings.embed_query(\"Hello food\"),\n",
- " \"text\": \"Hello food\",\n",
- " \"id\": \"1\",\n",
- " }\n",
- " ],\n",
- " mode=\"overwrite\",\n",
- ")\n",
+ "embeddings = OpenAIEmbeddings()\n",
"\n",
- "vectorstore = LanceDB.from_documents(docs, embeddings, connection=table)"
+ "vectorstore = LanceDB.from_documents(documents=docs, embedding=embeddings)"
]
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 45,
"metadata": {
- "colab": {
- "base_uri": "https://localhost:8080/"
- },
- "id": "YMBoF5kucFMJ",
- "outputId": "13d7edab-5f3d-4698-fe6f-40f33dcd865a"
+ "id": "YMBoF5kucFMJ"
},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/usr/local/lib/python3.10/dist-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The class `langchain_community.llms.openai.OpenAI` was deprecated in langchain-community 0.0.10 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run `pip install -U langchain-openai` and import as `from langchain_openai import OpenAI`.\n",
- " warn_deprecated(\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"qa = ConversationalRetrievalChain.from_llm(\n",
" OpenAI(\n",
" temperature=0,\n",
- " openai_api_key=\"sk-yourapikey\",\n",
" ),\n",
" vectorstore.as_retriever(),\n",
" memory=ConversationBufferMemory(memory_key=\"chat_history\", return_messages=True),\n",
@@ -290,7 +281,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 46,
"metadata": {
"id": "HjSVygLIcSEX"
},
@@ -303,34 +294,26 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 47,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
- "height": 160
+ "height": 90
},
"id": "XCqxSaQSepsW",
- "outputId": "c1fc1bdc-9f2e-467b-cb51-fdde3fc964ae"
+ "outputId": "309a74fe-05c6-4d78-ad3e-e720b77d6e3c"
},
"outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/usr/local/lib/python3.10/dist-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
- " warn_deprecated(\n"
- ]
- },
{
"data": {
"application/vnd.google.colaboratory.intrinsic+json": {
"type": "string"
},
"text/plain": [
- "' Good food is food that provides the recommended amounts of nutrients for the body to perform all its physiological activities. It is important to eat the right food, at the right time, in the right amounts, and prepared correctly in order to maintain a balanced diet and promote good nutrition. Good food is essential for physical and cognitive development and can improve overall health and quality of life.'"
+ "' Good food is any type of food that provides the recommended amounts of nutrients for the body to perform its physiological activities. It should be eaten at the right time, in the right amounts, and prepared correctly. Good food is important for physical and cognitive development, and can help prevent health problems. Foods can also be classified according to their functions in the body, such as energy-giving foods, body-building foods, and protective foods.'"
]
},
- "execution_count": 18,
+ "execution_count": 47,
"metadata": {},
"output_type": "execute_result"
}
@@ -342,13 +325,13 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 18,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "BY4Fz-l7cUCA",
- "outputId": "3a10b926-3659-46d3-d76b-7e083daf8fca"
+ "outputId": "e90c9b52-ab8e-4176-e249-fba0f20cae68"
},
"outputs": [
{
@@ -357,16 +340,16 @@
"text": [
"Requirement already satisfied: pyautogen in /usr/local/lib/python3.10/dist-packages (0.1.14)\n",
"Requirement already satisfied: diskcache in /usr/local/lib/python3.10/dist-packages (from pyautogen) (5.6.3)\n",
- "Requirement already satisfied: flaml in /usr/local/lib/python3.10/dist-packages (from pyautogen) (2.1.1)\n",
+ "Requirement already satisfied: flaml in /usr/local/lib/python3.10/dist-packages (from pyautogen) (2.1.2)\n",
"Requirement already satisfied: openai<1 in /usr/local/lib/python3.10/dist-packages (from pyautogen) (0.28.1)\n",
"Requirement already satisfied: python-dotenv in /usr/local/lib/python3.10/dist-packages (from pyautogen) (1.0.1)\n",
"Requirement already satisfied: termcolor in /usr/local/lib/python3.10/dist-packages (from pyautogen) (2.4.0)\n",
"Requirement already satisfied: requests>=2.20 in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (2.31.0)\n",
- "Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (4.66.1)\n",
- "Requirement already satisfied: aiohttp in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (3.9.3)\n",
- "Requirement already satisfied: NumPy>=1.17.0rc1 in /usr/local/lib/python3.10/dist-packages (from flaml->pyautogen) (1.23.5)\n",
+ "Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (4.66.4)\n",
+ "Requirement already satisfied: aiohttp in /usr/local/lib/python3.10/dist-packages (from openai<1->pyautogen) (3.9.5)\n",
+ "Requirement already satisfied: NumPy>=1.17 in /usr/local/lib/python3.10/dist-packages (from flaml->pyautogen) (1.25.2)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (3.3.2)\n",
- "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (3.6)\n",
+ "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (3.7)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (2.0.7)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests>=2.20->openai<1->pyautogen) (2024.2.2)\n",
"Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp->openai<1->pyautogen) (1.3.1)\n",
@@ -393,7 +376,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 48,
"metadata": {
"id": "Vca8Y_khcUID"
},
@@ -425,7 +408,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 49,
"metadata": {
"id": "1XHjzIYAcfE7"
},
@@ -451,13 +434,13 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 50,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "hOZKxakHchZ4",
- "outputId": "7ca36fe0-a211-409a-abb0-57e3cd05e429"
+ "outputId": "20693c38-773a-47d2-bfc4-8311c7ef4d1d"
},
"outputs": [
{
@@ -475,7 +458,6 @@
"\n",
"***** Suggested function Call: answer_food_question *****\n",
"Arguments: \n",
- "\n",
"{\n",
" \"question\": \"what is good food?\"\n",
"}\n",
@@ -487,13 +469,13 @@
"user_proxy (to assistant):\n",
"\n",
"***** Response from calling function \"answer_food_question\" *****\n",
- " Good food is food that is able to provide the recommended amounts of nutrients for the body to perform all its physiological activities. It is important for our health and well-being because it helps us maintain a balanced diet, promotes physical and cognitive development, and protects us from foodborne illnesses. Good food also ensures that we have enough energy for physical activity and basic body functions, and it helps us maintain a healthy weight. Additionally, good food can improve our overall quality of life and productivity.\n",
+ " Good food is food that is able to provide the recommended amounts of nutrients for the body to perform all its physiological activities. It is important because it helps with physical and cognitive development, promotes good health, and improves the quality of life. Good food should be eaten at the right time, in the right amounts, and prepared correctly. It can also be classified into different categories based on its function in the body, such as energy-giving foods, body-building foods, and protective foods.\n",
"*****************************************************************\n",
"\n",
"--------------------------------------------------------------------------------\n",
"assistant (to user_proxy):\n",
"\n",
- "Good food is food that is able to provide the recommended amounts of nutrients for the body to perform all its physiological activities. It is important for our health and well-being because it helps us maintain a balanced diet, promotes physical and cognitive development, and protects us from foodborne illnesses. Good food also ensures that we have enough energy for physical activity and basic body functions, and it helps us maintain a healthy weight. Additionally, good food can improve our overall quality of life and productivity.\n",
+ "Good food is food that provides the recommended amounts of nutrients for the body to perform all its physiological activities. It is important because it helps with physical and cognitive development, promotes good health, and improves the quality of life. Good food should be eaten at the right time, in the right amounts, and prepared correctly. It can also be classified into different categories based on its function in the body, such as energy-giving foods, body-building foods, and protective foods.\n",
"\n",
"TERMINATE\n",
"\n",
@@ -518,13 +500,13 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 51,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "UDXo2V06fNjz",
- "outputId": "37cf6766-9b68-4a81-e0c6-245d2af28a30"
+ "outputId": "057425b5-f0c1-4132-cdc6-fd03197e95c3"
},
"outputs": [
{
@@ -549,26 +531,26 @@
" - Deficiency Symptoms: Osteoporosis, rickets in children, muscle cramps, dental problems.\n",
"\n",
"2. Iron:\n",
- " - Sources: Red meat, poultry, eggs, fruits, green vegetables, fortified bread.\n",
- " - Functions: Essential for the production of red blood cells, helps in oxygen transport.\n",
- " - Deficiency Symptoms: Anemia, fatigue, weakness, immune system problems.\n",
+ " - Sources: Red meat, poultry, fish, legumes, fortified cereals.\n",
+ " - Functions: Essential for the production of red blood cells, carries oxygen in the blood.\n",
+ " - Deficiency Symptoms: Anemia, fatigue, weakness, pale skin, shortness of breath.\n",
"\n",
- "3. Magnesium:\n",
- " - Sources: Nuts, seeds, whole grains, green leafy vegetables, fish, beans, yogurt.\n",
- " - Functions: Helps in over 300 enzyme reactions, including regulation of blood pressure, supports immune system.\n",
- " - Deficiency Symptoms: Loss of appetite, nausea, fatigue, weakness, muscle cramps, numbness.\n",
+ "3. Potassium:\n",
+ " - Sources: Bananas, oranges, cantaloupes, raisins, nuts, fish, chicken, beef, and pork.\n",
+ " - Functions: Helps maintain fluid balance, nerve transmission, muscle contractions.\n",
+ " - Deficiency Symptoms: Weakness, fatigue, muscle cramps, constipation.\n",
"\n",
- "4. Potassium:\n",
- " - Sources: Bananas, oranges, cantaloupe, honeydew, apricots, grapefruit, cooked spinach, cooked broccoli, potatoes, sweet potatoes, mushrooms, peas, cucumbers, zucchini, eggplant, pumpkins, leafy greens.\n",
- " - Functions: Maintains fluid balance, helps in nerve transmission and muscle contraction.\n",
- " - Deficiency Symptoms: Fatigue, weakness, constipation, muscle cramps.\n",
+ "4. Magnesium:\n",
+ " - Sources: Green leafy vegetables, nuts, seeds, whole grains, fish.\n",
+ " - Functions: Involved in over 300 enzymatic reactions in the body including energy production, protein synthesis, muscle and nerve function.\n",
+ " - Deficiency Symptoms: Loss of appetite, nausea, fatigue, weakness, muscle cramps, numbness and tingling.\n",
"\n",
"5. Zinc:\n",
- " - Sources: Meat, shellfish, legumes, seeds, nuts, dairy, eggs, whole grains.\n",
- " - Functions: Necessary for immune function, protein synthesis, DNA synthesis, cell division, wound healing.\n",
- " - Deficiency Symptoms: Growth retardation, loss of appetite, impaired immune function, hair loss, diarrhea, delayed sexual maturation.\n",
+ " - Sources: Meat, shellfish, legumes, seeds, nuts, dairy, eggs.\n",
+ " - Functions: Supports immune function, protein synthesis, wound healing, DNA synthesis, and cell division.\n",
+ " - Deficiency Symptoms: Loss of appetite, impaired immune function, hair loss, diarrhea, delayed sexual maturation.\n",
"\n",
- "Please note that this is not an exhaustive list and there are other essential minerals as well. Also, the symptoms of deficiency can vary from person to person and can often be symptoms of other conditions as well. Always consult with a healthcare provider for accurate information.\n",
+ "Please note that this is not an exhaustive list and there are many other essential minerals that the body needs. It's also important to remember that while these minerals are essential for health, they should be consumed in moderation as too much can also lead to health problems. Always consult with a healthcare provider or a registered dietitian for personalized advice.\n",
"\n",
"--------------------------------------------------------------------------------\n",
"user_proxy (to assistant):\n",
@@ -592,13 +574,13 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 52,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "UrlFGYW0g0sJ",
- "outputId": "e82396e5-3dca-402c-889c-394557aeea0d"
+ "outputId": "0fbf3ed8-0103-4904-9737-2a41e6e1cf0d"
},
"outputs": [
{
@@ -628,22 +610,13 @@
"user_proxy (to assistant):\n",
"\n",
"***** Response from calling function \"answer_food_question\" *****\n",
- " Foods that are rich in Vitamin A, such as yellow/orange fruits and vegetables, dark green and deep yellow fruits and vegetables, liver, egg yolk, dairy products, and margarine can help maintain healthy eyes.\n",
+ " Fruits and vegetables, particularly dark green leafy vegetables and yellow fruits, are considered protective and can help keep eyes healthy.\n",
"*****************************************************************\n",
"\n",
"--------------------------------------------------------------------------------\n",
"assistant (to user_proxy):\n",
"\n",
- "Foods that are rich in Vitamin A can help maintain healthy eyes. These include:\n",
- "\n",
- "1. Yellow/orange fruits and vegetables: These include carrots, sweet potatoes, pumpkins, and apricots.\n",
- "2. Dark green and deep yellow fruits and vegetables: These include spinach, kale, and other leafy greens.\n",
- "3. Liver: This is a great source of Vitamin A.\n",
- "4. Egg yolk: This is another good source of Vitamin A.\n",
- "5. Dairy products: These include milk, cheese, and yogurt.\n",
- "6. Margarine: This is also a good source of Vitamin A.\n",
- "\n",
- "Including these foods in your diet can help keep your eyes healthy.\n",
+ "Fruits and vegetables, particularly dark green leafy vegetables and yellow fruits, are considered protective and can help keep eyes healthy. These foods are rich in vitamins A, C, E, and minerals like Copper and Zinc which are essential for eye health. Foods like carrots, sweet potatoes, spinach, kale, and other dark green leafy vegetables; and fish like salmon and tuna are good for eye health.\n",
"\n",
"TERMINATE\n",
"\n",
@@ -665,15 +638,6 @@
"\"\"\",\n",
")"
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "id": "En6-kvjcjaid"
- },
- "outputs": [],
- "source": []
}
],
"metadata": {
diff --git a/tutorials/better-rag-FLAIR/README.md b/examples/better-rag-FLAIR/README.md
similarity index 100%
rename from tutorials/better-rag-FLAIR/README.md
rename to examples/better-rag-FLAIR/README.md
diff --git a/tutorials/better-rag-FLAIR/app.py b/examples/better-rag-FLAIR/app.py
similarity index 100%
rename from tutorials/better-rag-FLAIR/app.py
rename to examples/better-rag-FLAIR/app.py
diff --git a/tutorials/better-rag-FLAIR/main.ipynb b/examples/better-rag-FLAIR/main.ipynb
similarity index 100%
rename from tutorials/better-rag-FLAIR/main.ipynb
rename to examples/better-rag-FLAIR/main.ipynb
diff --git a/tutorials/better-rag-FLAIR/requirements.txt b/examples/better-rag-FLAIR/requirements.txt
similarity index 100%
rename from tutorials/better-rag-FLAIR/requirements.txt
rename to examples/better-rag-FLAIR/requirements.txt
diff --git a/examples/databricks_DBRX_website_bot/README.md b/examples/databricks_DBRX_website_bot/README.md
index e99db29e..18d3883e 100644
--- a/examples/databricks_DBRX_website_bot/README.md
+++ b/examples/databricks_DBRX_website_bot/README.md
@@ -15,7 +15,7 @@ export DATABRICKS_TOKEN=
DATABRICKS_SERVING_ENDPOINT=
```
-3. Run the application
+3. Run the application in CLI mode
```
python main.py
```
@@ -25,3 +25,12 @@ Accepted arguments:
- `embed_model`: Huggingface model to use for embeddings. Default is `mixedbread-ai/mxbai-embed-large-v1`.
- `uri`: URI of the vector store. Default is `~/tmp/lancedb_hogwarts`.
- `force_create_embeddings`: Whether to force create embeddings. Default is `False`.
+- `illustrate`: Whether to illustrate responses with generated images (see the example invocation below). Default is `True`.
+
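+Example invocation (a sketch, assuming argparse-style `--flag` names for the arguments above; check the script's help output for the exact interface):
+```
+python main.py --embed_model mixedbread-ai/mxbai-embed-large-v1 --uri ~/tmp/lancedb_hogwarts --illustrate False
+```
+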
+4. Run the application in GUI mode
+```
+streamlit run gui.py
+```
+
+## MLX SDXL
+The MLX SDXL implementation is taken from the MLX [examples repo](https://github.com/ml-explore/mlx-examples/tree/main/stable_diffusion). It has been modified slightly to run faster with this application.
\ No newline at end of file
diff --git a/examples/databricks_DBRX_website_bot/__init__.py b/examples/databricks_DBRX_website_bot/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/__init__.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/__init__.py
new file mode 100644
index 00000000..f266816f
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/__init__.py
@@ -0,0 +1,306 @@
+# Copyright © 2023-2024 Apple Inc.
+
+import time
+from typing import Optional, Tuple
+
+import mlx.core as mx
+
+from .model_io import (
+ _DEFAULT_MODEL,
+ load_autoencoder,
+ load_diffusion_config,
+ load_text_encoder,
+ load_tokenizer,
+ load_unet,
+)
+from .sampler import SimpleEulerAncestralSampler, SimpleEulerSampler
+
+
+class StableDiffusion:
+ def __init__(self, model: str = _DEFAULT_MODEL, float16: bool = True):
+ self.dtype = mx.float16 if float16 else mx.float32
+ self.diffusion_config = load_diffusion_config(model)
+ self.unet = load_unet(model, float16)
+ self.text_encoder = load_text_encoder(model, float16)
+ self.autoencoder = load_autoencoder(model, False)
+ self.sampler = SimpleEulerSampler(self.diffusion_config)
+ self.tokenizer = load_tokenizer(model)
+
+ def ensure_models_are_loaded(self):
+ mx.eval(self.unet.parameters())
+ mx.eval(self.text_encoder.parameters())
+ mx.eval(self.autoencoder.parameters())
+
+ def _tokenize(self, tokenizer, text: str, negative_text: Optional[str] = None):
+ # Tokenize the text
+ tokens = [tokenizer.tokenize(text)]
+ if negative_text is not None:
+ tokens += [tokenizer.tokenize(negative_text)]
+ lengths = [len(t) for t in tokens]
+ N = max(lengths)
+ tokens = [t + [0] * (N - len(t)) for t in tokens]
+ tokens = mx.array(tokens)
+
+ return tokens
+
+ def _get_text_conditioning(
+ self,
+ text: str,
+ n_images: int = 1,
+ cfg_weight: float = 7.5,
+ negative_text: str = "",
+ ):
+ # Tokenize the text
+ tokens = self._tokenize(
+ self.tokenizer, text, (negative_text if cfg_weight > 1 else None)
+ )
+
+ # Compute the features
+ conditioning = self.text_encoder(tokens).last_hidden_state
+
+ # Repeat the conditioning for each of the generated images
+ if n_images > 1:
+ conditioning = mx.repeat(conditioning, n_images, axis=0)
+
+ return conditioning
+
+ def _denoising_step(
+ self, x_t, t, t_prev, conditioning, cfg_weight: float = 7.5, text_time=None
+ ):
+ x_t_unet = mx.concatenate([x_t] * 2, axis=0) if cfg_weight > 1 else x_t
+ t_unet = mx.broadcast_to(t, [len(x_t_unet)])
+ eps_pred = self.unet(
+ x_t_unet, t_unet, encoder_x=conditioning, text_time=text_time
+ )
+
+ if cfg_weight > 1:
+ eps_text, eps_neg = eps_pred.split(2)
+ eps_pred = eps_neg + cfg_weight * (eps_text - eps_neg)
+
+ x_t_prev = self.sampler.step(eps_pred, x_t, t, t_prev)
+
+ return x_t_prev
+
+ def _denoising_loop(
+ self,
+ x_T,
+ T,
+ conditioning,
+ num_steps: int = 50,
+ cfg_weight: float = 7.5,
+ text_time=None,
+ ):
+ x_t = x_T
+ for t, t_prev in self.sampler.timesteps(
+ num_steps, start_time=T, dtype=self.dtype
+ ):
+ x_t = self._denoising_step(
+ x_t, t, t_prev, conditioning, cfg_weight, text_time
+ )
+ yield x_t
+
+ def generate_latents(
+ self,
+ text: str,
+ n_images: int = 1,
+ num_steps: int = 50,
+ cfg_weight: float = 7.5,
+ negative_text: str = "",
+ latent_size: Tuple[int] = (64, 64),
+ seed=None,
+ ):
+ # Set the PRNG state
+ seed = int(time.time()) if seed is None else seed
+ mx.random.seed(seed)
+
+ # Get the text conditioning
+ conditioning = self._get_text_conditioning(
+ text, n_images, cfg_weight, negative_text
+ )
+
+ # Create the latent variables
+ x_T = self.sampler.sample_prior(
+ (n_images, *latent_size, self.autoencoder.latent_channels), dtype=self.dtype
+ )
+
+ # Perform the denoising loop
+ yield from self._denoising_loop(
+ x_T, self.sampler.max_time, conditioning, num_steps, cfg_weight
+ )
+
+ def generate_latents_from_image(
+ self,
+ image,
+ text: str,
+ n_images: int = 1,
+ strength: float = 0.8,
+ num_steps: int = 50,
+ cfg_weight: float = 7.5,
+ negative_text: str = "",
+ seed=None,
+ ):
+ # Set the PRNG state
+ seed = int(time.time()) if seed is None else seed
+ mx.random.seed(seed)
+
+ # Define the num steps and start step
+ start_step = self.sampler.max_time * strength
+ num_steps = int(num_steps * strength)
+
+ # Get the text conditioning
+ conditioning = self._get_text_conditioning(
+ text, n_images, cfg_weight, negative_text
+ )
+
+ # Get the latents from the input image and add noise according to the
+ # start time.
+ x_0, _ = self.autoencoder.encode(image[None])
+ x_0 = mx.broadcast_to(x_0, (n_images,) + x_0.shape[1:])
+ x_T = self.sampler.add_noise(x_0, mx.array(start_step))
+
+ # Perform the denoising loop
+ yield from self._denoising_loop(
+ x_T, start_step, conditioning, num_steps, cfg_weight
+ )
+
+ def decode(self, x_t):
+ x = self.autoencoder.decode(x_t)
+ x = mx.clip(x / 2 + 0.5, 0, 1)
+ return x
+
+
+class StableDiffusionXL(StableDiffusion):
+ def __init__(self, model: str = _DEFAULT_MODEL, float16: bool = False):
+ super().__init__(model, float16)
+
+ self.sampler = SimpleEulerAncestralSampler(self.diffusion_config)
+
+ self.text_encoder_1 = self.text_encoder
+ self.tokenizer_1 = self.tokenizer
+ del self.tokenizer, self.text_encoder
+
+ self.text_encoder_2 = load_text_encoder(
+ model,
+ float16,
+ model_key="text_encoder_2",
+ )
+ self.tokenizer_2 = load_tokenizer(
+ model,
+ merges_key="tokenizer_2_merges",
+ vocab_key="tokenizer_2_vocab",
+ )
+
+ def ensure_models_are_loaded(self):
+ mx.eval(self.unet.parameters())
+ mx.eval(self.text_encoder_1.parameters())
+ mx.eval(self.text_encoder_2.parameters())
+ mx.eval(self.autoencoder.parameters())
+
+ def _get_text_conditioning(
+ self,
+ text: str,
+ n_images: int = 1,
+ cfg_weight: float = 7.5,
+ negative_text: str = "",
+ ):
+ tokens_1 = self._tokenize(
+ self.tokenizer_1,
+ text,
+ (negative_text if cfg_weight > 1 else None),
+ )
+ tokens_2 = self._tokenize(
+ self.tokenizer_2,
+ text,
+ (negative_text if cfg_weight > 1 else None),
+ )
+
+ conditioning_1 = self.text_encoder_1(tokens_1)
+ conditioning_2 = self.text_encoder_2(tokens_2)
+ conditioning = mx.concatenate(
+ [conditioning_1.hidden_states[-2], conditioning_2.hidden_states[-2]],
+ axis=-1,
+ )
+ pooled_conditioning = conditioning_2.pooled_output
+
+ if n_images > 1:
+ conditioning = mx.repeat(conditioning, n_images, axis=0)
+ pooled_conditioning = mx.repeat(pooled_conditioning, n_images, axis=0)
+
+ return conditioning, pooled_conditioning
+
+ def generate_latents(
+ self,
+ text: str,
+ n_images: int = 1,
+ num_steps: int = 2,
+ cfg_weight: float = 0.0,
+ negative_text: str = "",
+ latent_size: Tuple[int] = (64, 64),
+ seed=None,
+ ):
+ # Set the PRNG state
+ seed = int(time.time()) if seed is None else seed
+ mx.random.seed(seed)
+
+ # Get the text conditioning
+ conditioning, pooled_conditioning = self._get_text_conditioning(
+ text, n_images, cfg_weight, negative_text
+ )
+ text_time = (
+ pooled_conditioning,
+ mx.array([[512, 512, 0, 0, 512, 512.0]] * len(pooled_conditioning)),
+ )
+
+ # Create the latent variables
+ x_T = self.sampler.sample_prior(
+ (n_images, *latent_size, self.autoencoder.latent_channels), dtype=self.dtype
+ )
+
+ # Perform the denoising loop
+ yield from self._denoising_loop(
+ x_T,
+ self.sampler.max_time,
+ conditioning,
+ num_steps,
+ cfg_weight,
+ text_time=text_time,
+ )
+
+ def generate_latents_from_image(
+ self,
+ image,
+ text: str,
+ n_images: int = 1,
+ strength: float = 0.8,
+ num_steps: int = 2,
+ cfg_weight: float = 0.0,
+ negative_text: str = "",
+ seed=None,
+ ):
+ # Set the PRNG state
+ seed = seed or int(time.time())
+ mx.random.seed(seed)
+
+ # Define the num steps and start step
+ start_step = self.sampler.max_time * strength
+ num_steps = int(num_steps * strength)
+
+ # Get the text conditioning
+ conditioning, pooled_conditioning = self._get_text_conditioning(
+ text, n_images, cfg_weight, negative_text
+ )
+ text_time = (
+ pooled_conditioning,
+ mx.array([[512, 512, 0, 0, 512, 512.0]] * len(pooled_conditioning)),
+ )
+
+ # Get the latents from the input image and add noise according to the
+ # start time.
+ x_0, _ = self.autoencoder.encode(image[None])
+ x_0 = mx.broadcast_to(x_0, (n_images,) + x_0.shape[1:])
+ x_T = self.sampler.add_noise(x_0, mx.array(start_step))
+
+ # Perform the denoising loop
+ yield from self._denoising_loop(
+ x_T, start_step, conditioning, num_steps, cfg_weight, text_time=text_time
+ )
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/clip.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/clip.py
new file mode 100644
index 00000000..b5e11fde
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/clip.py
@@ -0,0 +1,116 @@
+# Copyright © 2023-2024 Apple Inc.
+
+from dataclasses import dataclass
+from typing import List, Optional
+
+import mlx.core as mx
+import mlx.nn as nn
+
+from .config import CLIPTextModelConfig
+
+_ACTIVATIONS = {"quick_gelu": nn.gelu_fast_approx, "gelu": nn.gelu}
+
+
+@dataclass
+class CLIPOutput:
+ # The last_hidden_state indexed at the EOS token and possibly projected if
+ # the model has a projection layer
+ pooled_output: Optional[mx.array] = None
+
+ # The full sequence output of the transformer after the final layernorm
+ last_hidden_state: Optional[mx.array] = None
+
+ # A list of hidden states corresponding to the outputs of the transformer layers
+ hidden_states: Optional[List[mx.array]] = None
+
+
+class CLIPEncoderLayer(nn.Module):
+ """The transformer encoder layer from CLIP."""
+
+ def __init__(self, model_dims: int, num_heads: int, activation: str):
+ super().__init__()
+
+ self.layer_norm1 = nn.LayerNorm(model_dims)
+ self.layer_norm2 = nn.LayerNorm(model_dims)
+
+ self.attention = nn.MultiHeadAttention(model_dims, num_heads)
+ # Add biases to the attention projections to match CLIP
+ self.attention.query_proj.bias = mx.zeros(model_dims)
+ self.attention.key_proj.bias = mx.zeros(model_dims)
+ self.attention.value_proj.bias = mx.zeros(model_dims)
+ self.attention.out_proj.bias = mx.zeros(model_dims)
+
+ self.linear1 = nn.Linear(model_dims, 4 * model_dims)
+ self.linear2 = nn.Linear(4 * model_dims, model_dims)
+
+ self.act = _ACTIVATIONS[activation]
+
+ def __call__(self, x, attn_mask=None):
+ y = self.layer_norm1(x)
+ y = self.attention(y, y, y, attn_mask)
+ x = y + x
+
+ y = self.layer_norm2(x)
+ y = self.linear1(y)
+ y = self.act(y)
+ y = self.linear2(y)
+ x = y + x
+
+ return x
+
+
+class CLIPTextModel(nn.Module):
+ """Implements the text encoder transformer from CLIP."""
+
+ def __init__(self, config: CLIPTextModelConfig):
+ super().__init__()
+
+ self.token_embedding = nn.Embedding(config.vocab_size, config.model_dims)
+ self.position_embedding = nn.Embedding(config.max_length, config.model_dims)
+ self.layers = [
+ CLIPEncoderLayer(config.model_dims, config.num_heads, config.hidden_act)
+ for i in range(config.num_layers)
+ ]
+ self.final_layer_norm = nn.LayerNorm(config.model_dims)
+
+ if config.projection_dim is not None:
+ self.text_projection = nn.Linear(
+ config.model_dims, config.projection_dim, bias=False
+ )
+
+ def _get_mask(self, N, dtype):
+ indices = mx.arange(N)
+ mask = indices[:, None] < indices[None]
+ mask = mask.astype(dtype) * (-6e4 if dtype == mx.float16 else -1e9)
+ return mask
+
+ def __call__(self, x):
+ # Extract some shapes
+ B, N = x.shape
+ eos_tokens = x.argmax(-1)
+
+ # Compute the embeddings
+ x = self.token_embedding(x)
+ x = x + self.position_embedding.weight[:N]
+
+ # Compute the features from the transformer
+ mask = self._get_mask(N, x.dtype)
+ hidden_states = []
+ for l in self.layers:
+ x = l(x, mask)
+ hidden_states.append(x)
+
+ # Apply the final layernorm and return
+ x = self.final_layer_norm(x)
+ last_hidden_state = x
+
+ # Select the EOS token
+ pooled_output = x[mx.arange(len(x)), eos_tokens]
+ if "text_projection" in self:
+ pooled_output = self.text_projection(pooled_output)
+
+ return CLIPOutput(
+ pooled_output=pooled_output,
+ last_hidden_state=last_hidden_state,
+ hidden_states=hidden_states,
+ )
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/config.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/config.py
new file mode 100644
index 00000000..6715757a
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/config.py
@@ -0,0 +1,65 @@
+# Copyright © 2023-2024 Apple Inc.
+
+from dataclasses import dataclass
+from typing import Optional, Tuple
+
+
+@dataclass
+class AutoencoderConfig:
+ in_channels: int = 3
+ out_channels: int = 3
+ latent_channels_out: int = 8
+ latent_channels_in: int = 4
+ block_out_channels: Tuple[int] = (128, 256, 512, 512)
+ layers_per_block: int = 2
+ norm_num_groups: int = 32
+ scaling_factor: float = 0.18215
+
+
+@dataclass
+class CLIPTextModelConfig:
+ num_layers: int = 23
+ model_dims: int = 1024
+ num_heads: int = 16
+ max_length: int = 77
+ vocab_size: int = 49408
+ projection_dim: Optional[int] = None
+ hidden_act: str = "quick_gelu"
+
+
+@dataclass
+class UNetConfig:
+ in_channels: int = 4
+ out_channels: int = 4
+ conv_in_kernel: int = 3
+ conv_out_kernel: int = 3
+ block_out_channels: Tuple[int] = (320, 640, 1280, 1280)
+ layers_per_block: Tuple[int] = (2, 2, 2, 2)
+ mid_block_layers: int = 2
+ transformer_layers_per_block: Tuple[int] = (1, 1, 1, 1)
+ num_attention_heads: Tuple[int] = (5, 10, 20, 20)
+ cross_attention_dim: Tuple[int] = (1024,) * 4
+ norm_num_groups: int = 32
+ down_block_types: Tuple[str] = (
+ "CrossAttnDownBlock2D",
+ "CrossAttnDownBlock2D",
+ "CrossAttnDownBlock2D",
+ "DownBlock2D",
+ )
+ up_block_types: Tuple[str] = (
+ "UpBlock2D",
+ "CrossAttnUpBlock2D",
+ "CrossAttnUpBlock2D",
+ "CrossAttnUpBlock2D",
+ )
+ addition_embed_type: Optional[str] = None
+ addition_time_embed_dim: Optional[int] = None
+ projection_class_embeddings_input_dim: Optional[int] = None
+
+
+@dataclass
+class DiffusionConfig:
+ beta_schedule: str = "scaled_linear"
+ beta_start: float = 0.00085
+ beta_end: float = 0.012
+ num_train_steps: int = 1000
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/model_io.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/model_io.py
new file mode 100644
index 00000000..2c2227db
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/model_io.py
@@ -0,0 +1,330 @@
+# Copyright © 2023-2024 Apple Inc.
+
+import json
+from typing import Optional
+
+import mlx.core as mx
+from huggingface_hub import hf_hub_download
+from mlx.utils import tree_unflatten
+
+from .clip import CLIPTextModel
+from .config import AutoencoderConfig, CLIPTextModelConfig, DiffusionConfig, UNetConfig
+from .tokenizer import Tokenizer
+from .unet import UNetModel
+from .vae import Autoencoder
+
+_DEFAULT_MODEL = "stabilityai/stable-diffusion-2-1-base"
+_MODELS = {
+ # See https://huggingface.co/stabilityai/sdxl-turbo for the model details and license
+ "stabilityai/sdxl-turbo": {
+ "unet_config": "unet/config.json",
+ "unet": "unet/diffusion_pytorch_model.safetensors",
+ "text_encoder_config": "text_encoder/config.json",
+ "text_encoder": "text_encoder/model.safetensors",
+ "text_encoder_2_config": "text_encoder_2/config.json",
+ "text_encoder_2": "text_encoder_2/model.safetensors",
+ "vae_config": "vae/config.json",
+ "vae": "vae/diffusion_pytorch_model.safetensors",
+ "diffusion_config": "scheduler/scheduler_config.json",
+ "tokenizer_vocab": "tokenizer/vocab.json",
+ "tokenizer_merges": "tokenizer/merges.txt",
+ "tokenizer_2_vocab": "tokenizer_2/vocab.json",
+ "tokenizer_2_merges": "tokenizer_2/merges.txt",
+ },
+ # See https://huggingface.co/stabilityai/stable-diffusion-2-1-base for the model details and license
+ "stabilityai/stable-diffusion-2-1-base": {
+ "unet_config": "unet/config.json",
+ "unet": "unet/diffusion_pytorch_model.safetensors",
+ "text_encoder_config": "text_encoder/config.json",
+ "text_encoder": "text_encoder/model.safetensors",
+ "vae_config": "vae/config.json",
+ "vae": "vae/diffusion_pytorch_model.safetensors",
+ "diffusion_config": "scheduler/scheduler_config.json",
+ "tokenizer_vocab": "tokenizer/vocab.json",
+ "tokenizer_merges": "tokenizer/merges.txt",
+ },
+}
+
+
+def map_unet_weights(key, value):
+ # Map up/downsampling
+ if "downsamplers" in key:
+ key = key.replace("downsamplers.0.conv", "downsample")
+ if "upsamplers" in key:
+ key = key.replace("upsamplers.0.conv", "upsample")
+
+ # Map the mid block
+ if "mid_block.resnets.0" in key:
+ key = key.replace("mid_block.resnets.0", "mid_blocks.0")
+ if "mid_block.attentions.0" in key:
+ key = key.replace("mid_block.attentions.0", "mid_blocks.1")
+ if "mid_block.resnets.1" in key:
+ key = key.replace("mid_block.resnets.1", "mid_blocks.2")
+
+ # Map attention layers
+ if "to_k" in key:
+ key = key.replace("to_k", "key_proj")
+ if "to_out.0" in key:
+ key = key.replace("to_out.0", "out_proj")
+ if "to_q" in key:
+ key = key.replace("to_q", "query_proj")
+ if "to_v" in key:
+ key = key.replace("to_v", "value_proj")
+
+ # Map transformer ffn
+ if "ff.net.2" in key:
+ key = key.replace("ff.net.2", "linear3")
+ if "ff.net.0" in key:
+ k1 = key.replace("ff.net.0.proj", "linear1")
+ k2 = key.replace("ff.net.0.proj", "linear2")
+ v1, v2 = mx.split(value, 2)
+
+ return [(k1, v1), (k2, v2)]
+
+ if "conv_shortcut.weight" in key:
+ value = value.squeeze()
+
+ # Transform the weights from 1x1 convs to linear
+ if len(value.shape) == 4 and ("proj_in" in key or "proj_out" in key):
+ value = value.squeeze()
+
+ if len(value.shape) == 4:
+ value = value.transpose(0, 2, 3, 1)
+ value = value.reshape(-1).reshape(value.shape)
+
+ return [(key, value)]
+
+
+def map_clip_text_encoder_weights(key, value):
+ # Remove prefixes
+ if key.startswith("text_model."):
+ key = key[11:]
+ if key.startswith("embeddings."):
+ key = key[11:]
+ if key.startswith("encoder."):
+ key = key[8:]
+
+ # Map attention layers
+ if "self_attn." in key:
+ key = key.replace("self_attn.", "attention.")
+ if "q_proj." in key:
+ key = key.replace("q_proj.", "query_proj.")
+ if "k_proj." in key:
+ key = key.replace("k_proj.", "key_proj.")
+ if "v_proj." in key:
+ key = key.replace("v_proj.", "value_proj.")
+
+ # Map ffn layers
+ if "mlp.fc1" in key:
+ key = key.replace("mlp.fc1", "linear1")
+ if "mlp.fc2" in key:
+ key = key.replace("mlp.fc2", "linear2")
+
+ return [(key, value)]
+
+
+def map_vae_weights(key, value):
+ # Map up/downsampling
+ if "downsamplers" in key:
+ key = key.replace("downsamplers.0.conv", "downsample")
+ if "upsamplers" in key:
+ key = key.replace("upsamplers.0.conv", "upsample")
+
+ # Map attention layers
+ if "to_k" in key:
+ key = key.replace("to_k", "key_proj")
+ if "to_out.0" in key:
+ key = key.replace("to_out.0", "out_proj")
+ if "to_q" in key:
+ key = key.replace("to_q", "query_proj")
+ if "to_v" in key:
+ key = key.replace("to_v", "value_proj")
+
+ # Map the mid block
+ if "mid_block.resnets.0" in key:
+ key = key.replace("mid_block.resnets.0", "mid_blocks.0")
+ if "mid_block.attentions.0" in key:
+ key = key.replace("mid_block.attentions.0", "mid_blocks.1")
+ if "mid_block.resnets.1" in key:
+ key = key.replace("mid_block.resnets.1", "mid_blocks.2")
+
+ # Map the quant/post_quant layers
+ if "quant_conv" in key:
+ key = key.replace("quant_conv", "quant_proj")
+ value = value.squeeze()
+
+ # Map the conv_shortcut to linear
+ if "conv_shortcut.weight" in key:
+ value = value.squeeze()
+
+ if len(value.shape) == 4:
+ value = value.transpose(0, 2, 3, 1)
+ value = value.reshape(-1).reshape(value.shape)
+
+ return [(key, value)]
+
+
+def _flatten(params):
+ return [(k, v) for p in params for (k, v) in p]
+
+
+def _load_safetensor_weights(mapper, model, weight_file, float16: bool = False):
+ dtype = mx.float16 if float16 else mx.float32
+ weights = mx.load(weight_file)
+ weights = _flatten([mapper(k, v.astype(dtype)) for k, v in weights.items()])
+ model.update(tree_unflatten(weights))
+
+
+def _check_key(key: str, part: str):
+ if key not in _MODELS:
+ raise ValueError(
+ f"[{part}] '{key}' model not found, choose one of {{{','.join(_MODELS.keys())}}}"
+ )
+
+
+def load_unet(key: str = _DEFAULT_MODEL, float16: bool = False):
+ """Load the stable diffusion UNet from Hugging Face Hub."""
+ _check_key(key, "load_unet")
+
+ # Download the config and create the model
+ unet_config = _MODELS[key]["unet_config"]
+ with open(hf_hub_download(key, unet_config)) as f:
+ config = json.load(f)
+
+ n_blocks = len(config["block_out_channels"])
+ model = UNetModel(
+ UNetConfig(
+ in_channels=config["in_channels"],
+ out_channels=config["out_channels"],
+ block_out_channels=config["block_out_channels"],
+ layers_per_block=[config["layers_per_block"]] * n_blocks,
+ transformer_layers_per_block=config.get(
+ "transformer_layers_per_block", (1,) * 4
+ ),
+ num_attention_heads=(
+ [config["attention_head_dim"]] * n_blocks
+ if isinstance(config["attention_head_dim"], int)
+ else config["attention_head_dim"]
+ ),
+ cross_attention_dim=[config["cross_attention_dim"]] * n_blocks,
+ norm_num_groups=config["norm_num_groups"],
+ down_block_types=config["down_block_types"],
+ up_block_types=config["up_block_types"][::-1],
+ addition_embed_type=config.get("addition_embed_type", None),
+ addition_time_embed_dim=config.get("addition_time_embed_dim", None),
+ projection_class_embeddings_input_dim=config.get(
+ "projection_class_embeddings_input_dim", None
+ ),
+ )
+ )
+
+ # Download the weights and map them into the model
+ unet_weights = _MODELS[key]["unet"]
+ weight_file = hf_hub_download(key, unet_weights)
+ _load_safetensor_weights(map_unet_weights, model, weight_file, float16)
+
+ return model
+
+
+def load_text_encoder(
+ key: str = _DEFAULT_MODEL,
+ float16: bool = False,
+ model_key: str = "text_encoder",
+ config_key: Optional[str] = None,
+):
+ """Load the stable diffusion text encoder from Hugging Face Hub."""
+ _check_key(key, "load_text_encoder")
+
+ config_key = config_key or (model_key + "_config")
+
+ # Download the config and create the model
+ text_encoder_config = _MODELS[key][config_key]
+ with open(hf_hub_download(key, text_encoder_config)) as f:
+ config = json.load(f)
+
+ with_projection = "WithProjection" in config["architectures"][0]
+
+ model = CLIPTextModel(
+ CLIPTextModelConfig(
+ num_layers=config["num_hidden_layers"],
+ model_dims=config["hidden_size"],
+ num_heads=config["num_attention_heads"],
+ max_length=config["max_position_embeddings"],
+ vocab_size=config["vocab_size"],
+ projection_dim=config["projection_dim"] if with_projection else None,
+ hidden_act=config.get("hidden_act", "quick_gelu"),
+ )
+ )
+
+ # Download the weights and map them into the model
+ text_encoder_weights = _MODELS[key][model_key]
+ weight_file = hf_hub_download(key, text_encoder_weights)
+ _load_safetensor_weights(map_clip_text_encoder_weights, model, weight_file, float16)
+
+ return model
+
+
+def load_autoencoder(key: str = _DEFAULT_MODEL, float16: bool = False):
+ """Load the stable diffusion autoencoder from Hugging Face Hub."""
+ _check_key(key, "load_autoencoder")
+
+ # Download the config and create the model
+ vae_config = _MODELS[key]["vae_config"]
+ with open(hf_hub_download(key, vae_config)) as f:
+ config = json.load(f)
+
+ model = Autoencoder(
+ AutoencoderConfig(
+ in_channels=config["in_channels"],
+ out_channels=config["out_channels"],
+ latent_channels_out=2 * config["latent_channels"],
+ latent_channels_in=config["latent_channels"],
+ block_out_channels=config["block_out_channels"],
+ layers_per_block=config["layers_per_block"],
+ norm_num_groups=config["norm_num_groups"],
+ scaling_factor=config.get("scaling_factor", 0.18215),
+ )
+ )
+
+ # Download the weights and map them into the model
+ vae_weights = _MODELS[key]["vae"]
+ weight_file = hf_hub_download(key, vae_weights)
+ _load_safetensor_weights(map_vae_weights, model, weight_file, float16)
+
+ return model
+
+
+def load_diffusion_config(key: str = _DEFAULT_MODEL):
+ """Load the stable diffusion config from Hugging Face Hub."""
+ _check_key(key, "load_diffusion_config")
+
+ diffusion_config = _MODELS[key]["diffusion_config"]
+ with open(hf_hub_download(key, diffusion_config)) as f:
+ config = json.load(f)
+
+ return DiffusionConfig(
+ beta_start=config["beta_start"],
+ beta_end=config["beta_end"],
+ beta_schedule=config["beta_schedule"],
+ num_train_steps=config["num_train_timesteps"],
+ )
+
+
+def load_tokenizer(
+ key: str = _DEFAULT_MODEL,
+ vocab_key: str = "tokenizer_vocab",
+ merges_key: str = "tokenizer_merges",
+):
+ _check_key(key, "load_tokenizer")
+
+ vocab_file = hf_hub_download(key, _MODELS[key][vocab_key])
+ with open(vocab_file, encoding="utf-8") as f:
+ vocab = json.load(f)
+
+ merges_file = hf_hub_download(key, _MODELS[key][merges_key])
+ with open(merges_file, encoding="utf-8") as f:
+ bpe_merges = f.read().strip().split("\n")[1 : 49152 - 256 - 2 + 1]
+ bpe_merges = [tuple(m.split()) for m in bpe_merges]
+ bpe_ranks = dict(map(reversed, enumerate(bpe_merges)))
+
+ return Tokenizer(bpe_ranks, vocab)
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/sampler.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/sampler.py
new file mode 100644
index 00000000..ff4433d0
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/sampler.py
@@ -0,0 +1,105 @@
+# Copyright © 2023 Apple Inc.
+
+import mlx.core as mx
+
+from .config import DiffusionConfig
+
+
+def _linspace(a, b, num):
+ x = mx.arange(0, num) / (num - 1)
+ return (b - a) * x + a
+
+
+def _interp(y, x_new):
+ """Interpolate the function defined by (arange(0, len(y)), y) at positions x_new."""
+ x_low = x_new.astype(mx.int32)
+ x_high = mx.minimum(x_low + 1, len(y) - 1)
+
+ y_low = y[x_low]
+ y_high = y[x_high]
+ delta_x = x_new - x_low
+ y_new = y_low * (1 - delta_x) + delta_x * y_high
+
+ return y_new
+
+
+class SimpleEulerSampler:
+ """A simple Euler integrator that can be used to sample from our diffusion models.
+
+ The method ``step()`` performs one Euler step from x_t to x_t_prev.
+ """
+
+ def __init__(self, config: DiffusionConfig):
+ # Compute the noise schedule
+ if config.beta_schedule == "linear":
+ betas = _linspace(
+ config.beta_start, config.beta_end, config.num_train_steps
+ )
+ elif config.beta_schedule == "scaled_linear":
+ betas = _linspace(
+ config.beta_start**0.5, config.beta_end**0.5, config.num_train_steps
+ ).square()
+ else:
+ raise NotImplementedError(f"{config.beta_schedule} is not implemented.")
+
+ alphas = 1 - betas
+ alphas_cumprod = mx.cumprod(alphas)
+
+ self._sigmas = mx.concatenate(
+ [mx.zeros(1), ((1 - alphas_cumprod) / alphas_cumprod).sqrt()]
+ )
+
+ @property
+ def max_time(self):
+ return len(self._sigmas) - 1
+
+ def sample_prior(self, shape, dtype=mx.float32, key=None):
+ noise = mx.random.normal(shape, key=key)
+ return (
+ noise * self._sigmas[-1] * (self._sigmas[-1].square() + 1).rsqrt()
+ ).astype(dtype)
+
+ def add_noise(self, x, t, key=None):
+ noise = mx.random.normal(x.shape, key=key)
+ s = self.sigmas(t)
+ return (x + noise * s) * (s.square() + 1).rsqrt()
+
+ def sigmas(self, t):
+ return _interp(self._sigmas, t)
+
+ def timesteps(self, num_steps: int, start_time=None, dtype=mx.float32):
+ start_time = start_time or (len(self._sigmas) - 1)
+ assert 0 < start_time <= (len(self._sigmas) - 1)
+ steps = _linspace(start_time, 0, num_steps + 1).astype(dtype)
+ return list(zip(steps, steps[1:]))
+
+ def step(self, eps_pred, x_t, t, t_prev):
+ sigma = self.sigmas(t).astype(eps_pred.dtype)
+ sigma_prev = self.sigmas(t_prev).astype(eps_pred.dtype)
+
+ dt = sigma_prev - sigma
+ x_t_prev = (sigma.square() + 1).sqrt() * x_t + eps_pred * dt
+
+ x_t_prev = x_t_prev * (sigma_prev.square() + 1).rsqrt()
+
+ return x_t_prev
+
+
+class SimpleEulerAncestralSampler(SimpleEulerSampler):
+ def step(self, eps_pred, x_t, t, t_prev):
+ sigma = self.sigmas(t).astype(eps_pred.dtype)
+ sigma_prev = self.sigmas(t_prev).astype(eps_pred.dtype)
+
+ sigma2 = sigma.square()
+ sigma_prev2 = sigma_prev.square()
+ sigma_up = (sigma_prev2 * (sigma2 - sigma_prev2) / sigma2).sqrt()
+ sigma_down = (sigma_prev2 - sigma_up**2).sqrt()
+
+ dt = sigma_down - sigma
+ x_t_prev = (sigma2 + 1).sqrt() * x_t + eps_pred * dt
+ noise = mx.random.normal(x_t_prev.shape).astype(x_t_prev.dtype)
+ x_t_prev = x_t_prev + noise * sigma_up
+
+ x_t_prev = x_t_prev * (sigma_prev2 + 1).rsqrt()
+
+ return x_t_prev
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/tokenizer.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/tokenizer.py
new file mode 100644
index 00000000..ae9b967a
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/tokenizer.py
@@ -0,0 +1,100 @@
+# Copyright © 2023 Apple Inc.
+
+import regex
+
+
+class Tokenizer:
+ """A simple port of CLIPTokenizer from https://github.com/huggingface/transformers/ ."""
+
+ def __init__(self, bpe_ranks, vocab):
+ self.bpe_ranks = bpe_ranks
+ self.vocab = vocab
+ self.pat = regex.compile(
+ r"""<\|startoftext\|>|<\|endoftext\|>|'s|'t|'re|'ve|'m|'ll|'d|[\p{L}]+|[\p{N}]|[^\s\p{L}\p{N}]+""",
+ regex.IGNORECASE,
+ )
+
+ self._cache = {self.bos: self.bos, self.eos: self.eos}
+
+ @property
+ def bos(self):
+ return "<|startoftext|>"
+
+ @property
+ def bos_token(self):
+ return self.vocab[self.bos]
+
+ @property
+ def eos(self):
+ return "<|endoftext|>"
+
+ @property
+ def eos_token(self):
+ return self.vocab[self.eos]
+
+ def bpe(self, text):
+ if text in self._cache:
+ return self._cache[text]
+
+        unigrams = list(text[:-1]) + [text[-1] + "</w>"]
+ unique_bigrams = set(zip(unigrams, unigrams[1:]))
+
+ if not unique_bigrams:
+ return unigrams
+
+ # In every iteration try to merge the two most likely bigrams. If none
+ # was merged we are done.
+ #
+ # Ported from https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip.py
+ while unique_bigrams:
+ bigram = min(
+ unique_bigrams, key=lambda pair: self.bpe_ranks.get(pair, float("inf"))
+ )
+ if bigram not in self.bpe_ranks:
+ break
+
+ new_unigrams = []
+ skip = False
+ for a, b in zip(unigrams, unigrams[1:]):
+ if skip:
+ skip = False
+ continue
+
+ if (a, b) == bigram:
+ new_unigrams.append(a + b)
+ skip = True
+
+ else:
+ new_unigrams.append(a)
+
+ if not skip:
+ new_unigrams.append(b)
+
+ unigrams = new_unigrams
+ unique_bigrams = set(zip(unigrams, unigrams[1:]))
+
+ self._cache[text] = unigrams
+
+ return unigrams
+
+ def tokenize(self, text, prepend_bos=True, append_eos=True):
+ if isinstance(text, list):
+ return [self.tokenize(t, prepend_bos, append_eos) for t in text]
+
+ # Lower case cleanup and split according to self.pat. Hugging Face does
+ # a much more thorough job here but this should suffice for 95% of
+ # cases.
+ clean_text = regex.sub(r"\s+", " ", text.lower())
+ tokens = regex.findall(self.pat, clean_text)
+
+ # Split the tokens according to the byte-pair merge file
+ bpe_tokens = [ti for t in tokens for ti in self.bpe(t)]
+
+ # Map to token ids and return
+ tokens = [self.vocab[t] for t in bpe_tokens]
+ if prepend_bos:
+ tokens = [self.bos_token] + tokens
+ if append_eos:
+ tokens.append(self.eos_token)
+
+ return tokens
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/unet.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/unet.py
new file mode 100644
index 00000000..ec2915e5
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/unet.py
@@ -0,0 +1,461 @@
+# Copyright © 2023 Apple Inc.
+
+import math
+from typing import Optional
+
+import mlx.core as mx
+import mlx.nn as nn
+
+from .config import UNetConfig
+
+
+def upsample_nearest(x, scale: int = 2):
+ B, H, W, C = x.shape
+ x = mx.broadcast_to(x[:, :, None, :, None, :], (B, H, scale, W, scale, C))
+ x = x.reshape(B, H * scale, W * scale, C)
+
+ return x
+
+
+class TimestepEmbedding(nn.Module):
+ def __init__(self, in_channels: int, time_embed_dim: int):
+ super().__init__()
+
+ self.linear_1 = nn.Linear(in_channels, time_embed_dim)
+ self.linear_2 = nn.Linear(time_embed_dim, time_embed_dim)
+
+ def __call__(self, x):
+ x = self.linear_1(x)
+ x = nn.silu(x)
+ x = self.linear_2(x)
+
+ return x
+
+
+class TransformerBlock(nn.Module):
+ def __init__(
+ self,
+ model_dims: int,
+ num_heads: int,
+ hidden_dims: Optional[int] = None,
+ memory_dims: Optional[int] = None,
+ ):
+ super().__init__()
+
+ self.norm1 = nn.LayerNorm(model_dims)
+ self.attn1 = nn.MultiHeadAttention(model_dims, num_heads)
+ self.attn1.out_proj.bias = mx.zeros(model_dims)
+
+ memory_dims = memory_dims or model_dims
+ self.norm2 = nn.LayerNorm(model_dims)
+ self.attn2 = nn.MultiHeadAttention(
+ model_dims, num_heads, key_input_dims=memory_dims
+ )
+ self.attn2.out_proj.bias = mx.zeros(model_dims)
+
+ hidden_dims = hidden_dims or 4 * model_dims
+ self.norm3 = nn.LayerNorm(model_dims)
+ self.linear1 = nn.Linear(model_dims, hidden_dims)
+ self.linear2 = nn.Linear(model_dims, hidden_dims)
+ self.linear3 = nn.Linear(hidden_dims, model_dims)
+
+ def __call__(self, x, memory, attn_mask, memory_mask):
+ # Self attention
+ y = self.norm1(x)
+ y = self.attn1(y, y, y, attn_mask)
+ x = x + y
+
+ # Cross attention
+ y = self.norm2(x)
+ y = self.attn2(y, memory, memory, memory_mask)
+ x = x + y
+
+ # FFN
+ y = self.norm3(x)
+ y_a = self.linear1(y)
+ y_b = self.linear2(y)
+ y = y_a * nn.gelu(y_b)
+ y = self.linear3(y)
+ x = x + y
+
+ return x
+
+
+class Transformer2D(nn.Module):
+ """A transformer model for inputs with 2 spatial dimensions."""
+
+ def __init__(
+ self,
+ in_channels: int,
+ model_dims: int,
+ encoder_dims: int,
+ num_heads: int,
+ num_layers: int = 1,
+ norm_num_groups: int = 32,
+ ):
+ super().__init__()
+
+ self.norm = nn.GroupNorm(norm_num_groups, in_channels, pytorch_compatible=True)
+ self.proj_in = nn.Linear(in_channels, model_dims)
+ self.transformer_blocks = [
+ TransformerBlock(model_dims, num_heads, memory_dims=encoder_dims)
+ for i in range(num_layers)
+ ]
+ self.proj_out = nn.Linear(model_dims, in_channels)
+
+ def __call__(self, x, encoder_x, attn_mask, encoder_attn_mask):
+ # Save the input to add to the output
+ input_x = x
+ dtype = x.dtype
+
+ # Perform the input norm and projection
+ B, H, W, C = x.shape
+ x = self.norm(x.astype(mx.float32)).astype(dtype).reshape(B, -1, C)
+ x = self.proj_in(x)
+
+ # Apply the transformer
+ for block in self.transformer_blocks:
+ x = block(x, encoder_x, attn_mask, encoder_attn_mask)
+
+ # Apply the output projection and reshape
+ x = self.proj_out(x)
+ x = x.reshape(B, H, W, C)
+
+ return x + input_x
+
+
+class ResnetBlock2D(nn.Module):
+ def __init__(
+ self,
+ in_channels: int,
+ out_channels: Optional[int] = None,
+ groups: int = 32,
+ temb_channels: Optional[int] = None,
+ ):
+ super().__init__()
+
+ out_channels = out_channels or in_channels
+
+ self.norm1 = nn.GroupNorm(groups, in_channels, pytorch_compatible=True)
+ self.conv1 = nn.Conv2d(
+ in_channels, out_channels, kernel_size=3, stride=1, padding=1
+ )
+ if temb_channels is not None:
+ self.time_emb_proj = nn.Linear(temb_channels, out_channels)
+ self.norm2 = nn.GroupNorm(groups, out_channels, pytorch_compatible=True)
+ self.conv2 = nn.Conv2d(
+ out_channels, out_channels, kernel_size=3, stride=1, padding=1
+ )
+
+ if in_channels != out_channels:
+ self.conv_shortcut = nn.Linear(in_channels, out_channels)
+
+ def __call__(self, x, temb=None):
+ dtype = x.dtype
+
+ if temb is not None:
+ temb = self.time_emb_proj(nn.silu(temb))
+
+ y = self.norm1(x.astype(mx.float32)).astype(dtype)
+ y = nn.silu(y)
+ y = self.conv1(y)
+ if temb is not None:
+ y = y + temb[:, None, None, :]
+ y = self.norm2(y.astype(mx.float32)).astype(dtype)
+ y = nn.silu(y)
+ y = self.conv2(y)
+
+ x = y + (x if "conv_shortcut" not in self else self.conv_shortcut(x))
+
+ return x
+
+
+class UNetBlock2D(nn.Module):
+ def __init__(
+ self,
+ in_channels: int,
+ out_channels: int,
+ temb_channels: int,
+ prev_out_channels: Optional[int] = None,
+ num_layers: int = 1,
+ transformer_layers_per_block: int = 1,
+ num_attention_heads: int = 8,
+ cross_attention_dim=1280,
+ resnet_groups: int = 32,
+ add_downsample=True,
+ add_upsample=True,
+ add_cross_attention=True,
+ ):
+ super().__init__()
+
+ # Prepare the in channels list for the resnets
+ if prev_out_channels is None:
+ in_channels_list = [in_channels] + [out_channels] * (num_layers - 1)
+ else:
+ in_channels_list = [prev_out_channels] + [out_channels] * (num_layers - 1)
+ res_channels_list = [out_channels] * (num_layers - 1) + [in_channels]
+ in_channels_list = [
+ a + b for a, b in zip(in_channels_list, res_channels_list)
+ ]
+
+ # Add resnet blocks that also process the time embedding
+ self.resnets = [
+ ResnetBlock2D(
+ in_channels=ic,
+ out_channels=out_channels,
+ temb_channels=temb_channels,
+ groups=resnet_groups,
+ )
+ for ic in in_channels_list
+ ]
+
+ # Add optional cross attention layers
+ if add_cross_attention:
+ self.attentions = [
+ Transformer2D(
+ in_channels=out_channels,
+ model_dims=out_channels,
+ num_heads=num_attention_heads,
+ num_layers=transformer_layers_per_block,
+ encoder_dims=cross_attention_dim,
+ )
+ for i in range(num_layers)
+ ]
+
+ # Add an optional downsampling layer
+ if add_downsample:
+ self.downsample = nn.Conv2d(
+ out_channels, out_channels, kernel_size=3, stride=2, padding=1
+ )
+
+ # or upsampling layer
+ if add_upsample:
+ self.upsample = nn.Conv2d(
+ out_channels, out_channels, kernel_size=3, stride=1, padding=1
+ )
+
+ def __call__(
+ self,
+ x,
+ encoder_x=None,
+ temb=None,
+ attn_mask=None,
+ encoder_attn_mask=None,
+ residual_hidden_states=None,
+ ):
+ output_states = []
+
+ for i in range(len(self.resnets)):
+ if residual_hidden_states is not None:
+ x = mx.concatenate([x, residual_hidden_states.pop()], axis=-1)
+
+ x = self.resnets[i](x, temb)
+
+ if "attentions" in self:
+ x = self.attentions[i](x, encoder_x, attn_mask, encoder_attn_mask)
+
+ output_states.append(x)
+
+ if "downsample" in self:
+ x = self.downsample(x)
+ output_states.append(x)
+
+ if "upsample" in self:
+ x = self.upsample(upsample_nearest(x))
+ output_states.append(x)
+
+ return x, output_states
+
+
+class UNetModel(nn.Module):
+ """The conditional 2D UNet model that actually performs the denoising."""
+
+ def __init__(self, config: UNetConfig):
+ super().__init__()
+
+ self.conv_in = nn.Conv2d(
+ config.in_channels,
+ config.block_out_channels[0],
+ config.conv_in_kernel,
+ padding=(config.conv_in_kernel - 1) // 2,
+ )
+
+ self.timesteps = nn.SinusoidalPositionalEncoding(
+ config.block_out_channels[0],
+ max_freq=1,
+ min_freq=math.exp(
+ -math.log(10000) + 2 * math.log(10000) / config.block_out_channels[0]
+ ),
+ scale=1.0,
+ cos_first=True,
+ full_turns=False,
+ )
+ self.time_embedding = TimestepEmbedding(
+ config.block_out_channels[0],
+ config.block_out_channels[0] * 4,
+ )
+
+ if config.addition_embed_type == "text_time":
+ self.add_time_proj = nn.SinusoidalPositionalEncoding(
+ config.addition_time_embed_dim,
+ max_freq=1,
+ min_freq=math.exp(
+ -math.log(10000)
+ + 2 * math.log(10000) / config.addition_time_embed_dim
+ ),
+ scale=1.0,
+ cos_first=True,
+ full_turns=False,
+ )
+ self.add_embedding = TimestepEmbedding(
+ config.projection_class_embeddings_input_dim,
+ config.block_out_channels[0] * 4,
+ )
+
+ # Make the downsampling blocks
+ block_channels = [config.block_out_channels[0]] + list(
+ config.block_out_channels
+ )
+ self.down_blocks = [
+ UNetBlock2D(
+ in_channels=in_channels,
+ out_channels=out_channels,
+ temb_channels=config.block_out_channels[0] * 4,
+ num_layers=config.layers_per_block[i],
+ transformer_layers_per_block=config.transformer_layers_per_block[i],
+ num_attention_heads=config.num_attention_heads[i],
+ cross_attention_dim=config.cross_attention_dim[i],
+ resnet_groups=config.norm_num_groups,
+ add_downsample=(i < len(config.block_out_channels) - 1),
+ add_upsample=False,
+ add_cross_attention="CrossAttn" in config.down_block_types[i],
+ )
+ for i, (in_channels, out_channels) in enumerate(
+ zip(block_channels, block_channels[1:])
+ )
+ ]
+
+ # Make the middle block
+ self.mid_blocks = [
+ ResnetBlock2D(
+ in_channels=config.block_out_channels[-1],
+ out_channels=config.block_out_channels[-1],
+ temb_channels=config.block_out_channels[0] * 4,
+ groups=config.norm_num_groups,
+ ),
+ Transformer2D(
+ in_channels=config.block_out_channels[-1],
+ model_dims=config.block_out_channels[-1],
+ num_heads=config.num_attention_heads[-1],
+ num_layers=config.transformer_layers_per_block[-1],
+ encoder_dims=config.cross_attention_dim[-1],
+ ),
+ ResnetBlock2D(
+ in_channels=config.block_out_channels[-1],
+ out_channels=config.block_out_channels[-1],
+ temb_channels=config.block_out_channels[0] * 4,
+ groups=config.norm_num_groups,
+ ),
+ ]
+
+ # Make the upsampling blocks
+ block_channels = (
+ [config.block_out_channels[0]]
+ + list(config.block_out_channels)
+ + [config.block_out_channels[-1]]
+ )
+ self.up_blocks = [
+ UNetBlock2D(
+ in_channels=in_channels,
+ out_channels=out_channels,
+ temb_channels=config.block_out_channels[0] * 4,
+ prev_out_channels=prev_out_channels,
+ num_layers=config.layers_per_block[i] + 1,
+ transformer_layers_per_block=config.transformer_layers_per_block[i],
+ num_attention_heads=config.num_attention_heads[i],
+ cross_attention_dim=config.cross_attention_dim[i],
+ resnet_groups=config.norm_num_groups,
+ add_downsample=False,
+ add_upsample=(i > 0),
+ add_cross_attention="CrossAttn" in config.up_block_types[i],
+ )
+ for i, (in_channels, out_channels, prev_out_channels) in reversed(
+ list(
+ enumerate(
+ zip(block_channels, block_channels[1:], block_channels[2:])
+ )
+ )
+ )
+ ]
+
+ self.conv_norm_out = nn.GroupNorm(
+ config.norm_num_groups,
+ config.block_out_channels[0],
+ pytorch_compatible=True,
+ )
+ self.conv_out = nn.Conv2d(
+ config.block_out_channels[0],
+ config.out_channels,
+ config.conv_out_kernel,
+ padding=(config.conv_out_kernel - 1) // 2,
+ )
+
+ def __call__(
+ self,
+ x,
+ timestep,
+ encoder_x,
+ attn_mask=None,
+ encoder_attn_mask=None,
+ text_time=None,
+ ):
+ # Compute the time embeddings
+ temb = self.timesteps(timestep).astype(x.dtype)
+ temb = self.time_embedding(temb)
+
+ # Add the extra text_time conditioning
+ if text_time is not None:
+ text_emb, time_ids = text_time
+ emb = self.add_time_proj(time_ids).flatten(1).astype(x.dtype)
+ emb = mx.concatenate([text_emb, emb], axis=-1)
+ emb = self.add_embedding(emb)
+ temb = temb + emb
+
+ # Preprocess the input
+ x = self.conv_in(x)
+
+ # Run the downsampling part of the unet
+ residuals = [x]
+ for block in self.down_blocks:
+ x, res = block(
+ x,
+ encoder_x=encoder_x,
+ temb=temb,
+ attn_mask=attn_mask,
+ encoder_attn_mask=encoder_attn_mask,
+ )
+ residuals.extend(res)
+
+ # Run the middle part of the unet
+ x = self.mid_blocks[0](x, temb)
+ x = self.mid_blocks[1](x, encoder_x, attn_mask, encoder_attn_mask)
+ x = self.mid_blocks[2](x, temb)
+
+ # Run the upsampling part of the unet
+ for block in self.up_blocks:
+ x, _ = block(
+ x,
+ encoder_x=encoder_x,
+ temb=temb,
+ attn_mask=attn_mask,
+ encoder_attn_mask=encoder_attn_mask,
+ residual_hidden_states=residuals,
+ )
+
+ # Postprocess the output
+ dtype = x.dtype
+ x = self.conv_norm_out(x.astype(mx.float32)).astype(dtype)
+ x = nn.silu(x)
+ x = self.conv_out(x)
+
+ return x
diff --git a/examples/databricks_DBRX_website_bot/diffusion_mlx/vae.py b/examples/databricks_DBRX_website_bot/diffusion_mlx/vae.py
new file mode 100644
index 00000000..5fd47f13
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/diffusion_mlx/vae.py
@@ -0,0 +1,274 @@
+# Copyright © 2023 Apple Inc.
+
+import math
+from typing import List
+
+import mlx.core as mx
+import mlx.nn as nn
+
+from .config import AutoencoderConfig
+from .unet import ResnetBlock2D, upsample_nearest
+
+
+class Attention(nn.Module):
+ """A single head unmasked attention for use with the VAE."""
+
+ def __init__(self, dims: int, norm_groups: int = 32):
+ super().__init__()
+
+ self.group_norm = nn.GroupNorm(norm_groups, dims, pytorch_compatible=True)
+ self.query_proj = nn.Linear(dims, dims)
+ self.key_proj = nn.Linear(dims, dims)
+ self.value_proj = nn.Linear(dims, dims)
+ self.out_proj = nn.Linear(dims, dims)
+
+ def __call__(self, x):
+ B, H, W, C = x.shape
+
+ y = self.group_norm(x)
+
+ queries = self.query_proj(y).reshape(B, H * W, C)
+ keys = self.key_proj(y).reshape(B, H * W, C)
+ values = self.value_proj(y).reshape(B, H * W, C)
+
+ scale = 1 / math.sqrt(queries.shape[-1])
+ scores = (queries * scale) @ keys.transpose(0, 2, 1)
+ attn = mx.softmax(scores, axis=-1)
+ y = (attn @ values).reshape(B, H, W, C)
+
+ y = self.out_proj(y)
+ x = x + y
+
+ return x
+
+
+class EncoderDecoderBlock2D(nn.Module):
+ def __init__(
+ self,
+ in_channels: int,
+ out_channels: int,
+ num_layers: int = 1,
+ resnet_groups: int = 32,
+ add_downsample=True,
+ add_upsample=True,
+ ):
+ super().__init__()
+
+ # Add the resnet blocks
+ self.resnets = [
+ ResnetBlock2D(
+ in_channels=in_channels if i == 0 else out_channels,
+ out_channels=out_channels,
+ groups=resnet_groups,
+ )
+ for i in range(num_layers)
+ ]
+
+ # Add an optional downsampling layer
+ if add_downsample:
+ self.downsample = nn.Conv2d(
+ out_channels, out_channels, kernel_size=3, stride=2, padding=0
+ )
+
+ # or upsampling layer
+ if add_upsample:
+ self.upsample = nn.Conv2d(
+ out_channels, out_channels, kernel_size=3, stride=1, padding=1
+ )
+
+ def __call__(self, x):
+ for resnet in self.resnets:
+ x = resnet(x)
+
+ if "downsample" in self:
+ x = mx.pad(x, [(0, 0), (0, 1), (0, 1), (0, 0)])
+ x = self.downsample(x)
+
+ if "upsample" in self:
+ x = self.upsample(upsample_nearest(x))
+
+ return x
+
+
+class Encoder(nn.Module):
+ """Implements the encoder side of the Autoencoder."""
+
+ def __init__(
+ self,
+ in_channels: int,
+ out_channels: int,
+ block_out_channels: List[int] = [64],
+ layers_per_block: int = 2,
+ resnet_groups: int = 32,
+ ):
+ super().__init__()
+
+ self.conv_in = nn.Conv2d(
+ in_channels, block_out_channels[0], kernel_size=3, stride=1, padding=1
+ )
+
+ channels = [block_out_channels[0]] + list(block_out_channels)
+ self.down_blocks = [
+ EncoderDecoderBlock2D(
+ in_channels,
+ out_channels,
+ num_layers=layers_per_block,
+ resnet_groups=resnet_groups,
+ add_downsample=i < len(block_out_channels) - 1,
+ add_upsample=False,
+ )
+ for i, (in_channels, out_channels) in enumerate(zip(channels, channels[1:]))
+ ]
+
+ self.mid_blocks = [
+ ResnetBlock2D(
+ in_channels=block_out_channels[-1],
+ out_channels=block_out_channels[-1],
+ groups=resnet_groups,
+ ),
+ Attention(block_out_channels[-1], resnet_groups),
+ ResnetBlock2D(
+ in_channels=block_out_channels[-1],
+ out_channels=block_out_channels[-1],
+ groups=resnet_groups,
+ ),
+ ]
+
+ self.conv_norm_out = nn.GroupNorm(
+ resnet_groups, block_out_channels[-1], pytorch_compatible=True
+ )
+ self.conv_out = nn.Conv2d(block_out_channels[-1], out_channels, 3, padding=1)
+
+ def __call__(self, x):
+ x = self.conv_in(x)
+
+ for l in self.down_blocks:
+ x = l(x)
+
+ x = self.mid_blocks[0](x)
+ x = self.mid_blocks[1](x)
+ x = self.mid_blocks[2](x)
+
+ x = self.conv_norm_out(x)
+ x = nn.silu(x)
+ x = self.conv_out(x)
+
+ return x
+
+
+class Decoder(nn.Module):
+ """Implements the decoder side of the Autoencoder."""
+
+ def __init__(
+ self,
+ in_channels: int,
+ out_channels: int,
+ block_out_channels: List[int] = [64],
+ layers_per_block: int = 2,
+ resnet_groups: int = 32,
+ ):
+ super().__init__()
+
+ self.conv_in = nn.Conv2d(
+ in_channels, block_out_channels[-1], kernel_size=3, stride=1, padding=1
+ )
+
+ self.mid_blocks = [
+ ResnetBlock2D(
+ in_channels=block_out_channels[-1],
+ out_channels=block_out_channels[-1],
+ groups=resnet_groups,
+ ),
+ Attention(block_out_channels[-1], resnet_groups),
+ ResnetBlock2D(
+ in_channels=block_out_channels[-1],
+ out_channels=block_out_channels[-1],
+ groups=resnet_groups,
+ ),
+ ]
+
+ channels = list(reversed(block_out_channels))
+ channels = [channels[0]] + channels
+ self.up_blocks = [
+ EncoderDecoderBlock2D(
+ in_channels,
+ out_channels,
+ num_layers=layers_per_block,
+ resnet_groups=resnet_groups,
+ add_downsample=False,
+ add_upsample=i < len(block_out_channels) - 1,
+ )
+ for i, (in_channels, out_channels) in enumerate(zip(channels, channels[1:]))
+ ]
+
+ self.conv_norm_out = nn.GroupNorm(
+ resnet_groups, block_out_channels[0], pytorch_compatible=True
+ )
+ self.conv_out = nn.Conv2d(block_out_channels[0], out_channels, 3, padding=1)
+
+ def __call__(self, x):
+ x = self.conv_in(x)
+
+ x = self.mid_blocks[0](x)
+ x = self.mid_blocks[1](x)
+ x = self.mid_blocks[2](x)
+
+ for l in self.up_blocks:
+ x = l(x)
+
+ x = self.conv_norm_out(x)
+ x = nn.silu(x)
+ x = self.conv_out(x)
+
+ return x
+
+
+class Autoencoder(nn.Module):
+ """The autoencoder that allows us to perform diffusion in the latent space."""
+
+ def __init__(self, config: AutoencoderConfig):
+ super().__init__()
+
+ self.latent_channels = config.latent_channels_in
+ self.scaling_factor = config.scaling_factor
+ self.encoder = Encoder(
+ config.in_channels,
+ config.latent_channels_out,
+ config.block_out_channels,
+ config.layers_per_block,
+ resnet_groups=config.norm_num_groups,
+ )
+ self.decoder = Decoder(
+ config.latent_channels_in,
+ config.out_channels,
+ config.block_out_channels,
+ config.layers_per_block + 1,
+ resnet_groups=config.norm_num_groups,
+ )
+
+ self.quant_proj = nn.Linear(
+ config.latent_channels_out, config.latent_channels_out
+ )
+ self.post_quant_proj = nn.Linear(
+ config.latent_channels_in, config.latent_channels_in
+ )
+
+ def decode(self, z):
+ z = z / self.scaling_factor
+ return self.decoder(self.post_quant_proj(z))
+
+ def encode(self, x):
+ x = self.encoder(x)
+ x = self.quant_proj(x)
+ mean, logvar = x.split(2, axis=-1)
+ mean = mean * self.scaling_factor
+ logvar = logvar + 2 * math.log(self.scaling_factor)
+
+ return mean, logvar
+
+ def __call__(self, x, key=None):
+ mean, logvar = self.encode(x)
+ z = mx.random.normal(mean.shape, key=key) * mx.exp(0.5 * logvar) + mean
+ x_hat = self.decode(z)
+
+ return dict(x_hat=x_hat, z=z, mean=mean, logvar=logvar)
diff --git a/examples/databricks_DBRX_website_bot/gen_image.py b/examples/databricks_DBRX_website_bot/gen_image.py
new file mode 100644
index 00000000..0e80f3af
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/gen_image.py
@@ -0,0 +1,142 @@
+from diffusers import (
+ StableDiffusionPipeline,
+ StableDiffusionXLPipeline,
+ AutoPipelineForText2Image,
+)
+import mlx.core as mx
+from diffusion_mlx import StableDiffusion, StableDiffusionXL
+import torch
+from tqdm import tqdm
+from PIL import Image
+import numpy as np
+import time
+
+SUPPORTS_NEGATIVE_PROMPT = False
+GLOBAL_NEGATIVE_PROMPT = (
+ "3d, cartoon, anime, (deformed eyes, nose, ears, nose), bad anatomy, ugly, text"
+)
+RESPONSE_TO_DIFFUSER_PROMPT = "Get minimal text (no longer than 70 tokens) describing the response and use it as a prompt for a diffuser: {} | avoid adding text to the image |"
+
+"""
+MODEL_MAP = {
+ "runway_diffusion_v1": "runwayml/stable-diffusion-v1-5",
+ "sdxl": "stabilityai/stable-diffusion-xl-base-1.0",
+}
+
+def load_model(model_id="runway_diffusion_v1"):
+ global MODEL_PIPE, SUPPORTS_NEGATIVE_PROMPT
+ if model_id == "runway_diffusion_v1":
+ MODEL_PIPE = StableDiffusionPipeline.from_pretrained(MODEL_MAP[model_id])
+ elif model_id == "sdxl":
+ MODEL_PIPE = StableDiffusionXLPipeline.from_pretrained(
+ "stabilityai/stable-diffusion-xl-base-1.0", variant="fp16", use_safetensors=True
+ )
+ SUPPORTS_NEGATIVE_PROMPT = True
+ elif model_id == "sdxl-turbo":
+ MODEL_PIPE = AutoPipelineForText2Image.from_pretrained("stabilityai/sdxl-turbo", variant="fp16")
+
+
+
+def generate_image(prompt, model_id="runway_diffusion_v1"):
+ prompt += " | avoid adding text to the image |"
+ image = MODEL_PIPE(prompt).images[0] if not SUPPORTS_NEGATIVE_PROMPT else MODEL_PIPE(prompt, negative_prompt=GLOBAL_NEGATIVE_PROMPT).images[0]
+ return image
+"""
+
+### MLX version
+import mlx.core as mx
+import mlx.nn as nn
+
+
+def load_models(model="sdxl", float16=True, quantize=True, preload_models=True):
+ # Load the models
+ if model == "sdxl":
+ model = StableDiffusionXL("stabilityai/sdxl-turbo", float16=float16)
+ if quantize:
+ nn.quantize(
+ model.text_encoder_1,
+ class_predicate=lambda _, m: isinstance(m, nn.Linear),
+ )
+ nn.quantize(
+ model.text_encoder_2,
+ class_predicate=lambda _, m: isinstance(m, nn.Linear),
+ )
+ nn.quantize(model.unet, group_size=32, bits=8)
+ steps = 2
+ else:
+ model = StableDiffusion(
+ "stabilityai/stable-diffusion-2-1-base", float16=float16
+ )
+ if quantize:
+ nn.quantize(
+ model.text_encoder,
+ class_predicate=lambda _, m: isinstance(m, nn.Linear),
+ )
+ nn.quantize(model.unet, group_size=32, bits=8)
+ steps = 50
+
+ # Ensure that models are read in memory if needed
+ if preload_models:
+ model.ensure_models_are_loaded()
+
+ return model, steps
+
+
+def generate_image(model, steps, prompt, verbose=True):
+ # Generate the latent vectors using diffusion
+ time1 = time.time()
+ latents = model.generate_latents(
+ prompt,
+ n_images=1,
+ num_steps=steps,
+ negative_text=GLOBAL_NEGATIVE_PROMPT,
+ )
+ for x_t in tqdm(latents, total=steps):
+ mx.eval(x_t)
+
+    # The following is not necessary, but it may help on memory constrained
+    # systems by releasing the memory held by the unet and the text encoders
+    # once the latents have been computed, e.g.:
+
+    # if isinstance(model, StableDiffusionXL):
+    #     del model.text_encoder_1
+    #     del model.text_encoder_2
+    # else:
+    #     del model.text_encoder
+    # del model.unet
+    # del model.sampler
+ peak_mem_unet = mx.metal.get_peak_memory() / 1024**3
+
+ # Decode them into images
+ decoded = []
+ for i in tqdm(range(0, 1, 1)):
+ decoded.append(model.decode(x_t[i : i + 1]))
+ mx.eval(decoded[-1])
+ peak_mem_overall = mx.metal.get_peak_memory() / 1024**3
+
+ # Arrange them on a grid
+ x = mx.concatenate(decoded, axis=0)
+ x = mx.pad(x, [(0, 0), (8, 8), (8, 8), (0, 0)])
+ B, H, W, C = x.shape
+ x = x.reshape(1, B, H, W, C).transpose(0, 2, 1, 3, 4)
+ x = x.reshape(1 * H, B * W, C)
+ x = (x * 255).astype(mx.uint8)
+
+ time2 = time.time()
+ if verbose:
+ print(f"Time taken to generate the image: {time2 - time1:.3f}s")
+ # Save them to disc
+ im = Image.fromarray(np.array(x))
+
+ # Report the peak memory used during generation
+ if verbose:
+ print(f"Peak memory used for the unet: {peak_mem_unet:.3f}GB")
+ print(f"Peak memory used overall: {peak_mem_overall:.3f}GB")
+
+ return im
+
+
+if __name__ == "__main__":
+    model, steps = load_models()
+    generate_image(model, steps, "A cartoon of a cute cat", verbose=True)
+    generate_image(model, steps, "Hogwarts school of witchcraft and wizardry", verbose=True)
diff --git a/examples/databricks_DBRX_website_bot/gui.py b/examples/databricks_DBRX_website_bot/gui.py
new file mode 100644
index 00000000..95360a09
--- /dev/null
+++ b/examples/databricks_DBRX_website_bot/gui.py
@@ -0,0 +1,76 @@
+import streamlit as st
+from main import build_RAG
+from gen_image import generate_image, RESPONSE_TO_DIFFUSER_PROMPT
+from llama_index.core import Settings
+
+
+def add_to_session(key, value):
+ st.session_state[key] = value
+
+
+def main():
+ st.title("Databricks DBRX Website Bot")
+ if st.session_state.get("query_engine") is None:
+ context = st.text_area(
+ "Enter the link to the context",
+ value="https://harrypotter.fandom.com/wiki/Hogwarts_School_of_Witchcraft_and_Wizardry",
+ )
+ illustrate = st.checkbox("Illustrate")
+ steps = st.selectbox("Select the number of steps for diffusion", (1, 2))
+ build_rag = st.button("Build RAG")
+ query_engine, model = None, None
+ if build_rag:
+ query_engine, model, _ = build_RAG(
+ context,
+ "mixedbread-ai/mxbai-embed-large-v1",
+ "~/tmp/lancedb_hogwarts_12",
+ False,
+ illustrate,
+ "sdxl",
+ )
+ add_to_session("query_engine", query_engine)
+ add_to_session("model", model)
+ add_to_session("steps", steps or 1)
+ add_to_session("illustrate", illustrate)
+ print("steps", steps)
+            st.rerun()
+ else:
+ query_engine = st.session_state["query_engine"]
+ model = st.session_state["model"]
+ steps = st.session_state["steps"]
+ illustrate = st.session_state["illustrate"]
+ col1, col2 = st.columns(2)
+ with col1:
+ query = st.text_input(
+ "Enter a question",
+ value="What is Hogwarts?",
+ label_visibility="collapsed",
+ )
+ with col2:
+ enter = st.button("Enter")
+ if enter:
+ response = query_engine.chat(query)
+ if illustrate:
+ with col1:
+ st.write("Response")
+ st.write(response.response)
+ with col2:
+ st.write("Illustration")
+ with st.spinner("waiting"):
+ image = generate_image(
+ model,
+ steps,
+ Settings.llm.complete(
+ RESPONSE_TO_DIFFUSER_PROMPT.format(
+ str(response.response)
+ )
+ ).text,
+ )
+ st.image(image)
+ else:
+ st.write("Response")
+ st.write(response)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/examples/databricks_DBRX_website_bot/main.py b/examples/databricks_DBRX_website_bot/main.py
index cef72f9b..9dad226c 100644
--- a/examples/databricks_DBRX_website_bot/main.py
+++ b/examples/databricks_DBRX_website_bot/main.py
@@ -5,6 +5,9 @@
from llama_index.vector_stores.lancedb import LanceDBVectorStore
from llama_index.llms.databricks import Databricks
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
+from gen_image import load_models, generate_image, RESPONSE_TO_DIFFUSER_PROMPT
+
+MODEL, STEPS = None, None
def get_doc_from_url(url):
@@ -17,22 +20,24 @@ def build_RAG(
embed_model="mixedbread-ai/mxbai-embed-large-v1",
uri="~/tmp/lancedb_hogwart",
force_create_embeddings=False,
+ illustrate=True,
+ diffuser_model="sdxl",
):
Settings.embed_model = HuggingFaceEmbedding(model_name=embed_model)
Settings.llm = Databricks(model="databricks-dbrx-instruct")
-
+    model, steps = None, None
+    if illustrate:
+        print(f"Loading {diffuser_model} model")
+        model, steps = load_models(diffuser_model)
+        # Hack: force a single diffusion step to trade quality for speed
+        steps = 1
+ print("Model loaded")
documents = get_doc_from_url(url)
vector_store = LanceDBVectorStore(uri=uri)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
query_engine = index.as_chat_engine()
- print("Ask a question relevant to the given context:")
- while True:
- query = input()
- response = query_engine.chat(query)
- print(response)
- print("\n")
+ return query_engine, model, steps
if __name__ == "__main__":
@@ -61,5 +66,42 @@ def build_RAG(
default=False,
help="Force create embeddings",
)
+ parser.add_argument(
+ "--diffuser_model",
+ type=str,
+ default="sdxl",
+ help="Model ID",
+ )
+
+ parser.add_argument(
+ "--illustrate",
+ type=bool,
+ default=True,
+ help="Annotate",
+ )
args = parser.parse_args()
- build_RAG(args.url, args.embed_model, args.uri, args.force_create_embeddings)
+ # hardcode model because no one should use sd
+ args.diffuser_model = "sdxl"
+ query_engine, model, steps = build_RAG(
+ args.url,
+ args.embed_model,
+ args.uri,
+ args.force_create_embeddings,
+ args.illustrate,
+ args.diffuser_model,
+ )
+
+ print("Ask a question relevant to the given context:")
+ while True:
+ query = input()
+ response = query_engine.chat(query)
+ print(response)
+ print("\n Illustrating the response...:")
+ image = generate_image(
+ model,
+ steps,
+ Settings.llm.complete(
+ RESPONSE_TO_DIFFUSER_PROMPT.format(str(response.response))
+ ).text,
+ )
+ image.show()
diff --git a/examples/databricks_DBRX_website_bot/requirements.txt b/examples/databricks_DBRX_website_bot/requirements.txt
index a25f1f64..f5719f61 100644
--- a/examples/databricks_DBRX_website_bot/requirements.txt
+++ b/examples/databricks_DBRX_website_bot/requirements.txt
@@ -2,4 +2,12 @@ llama-index
llama-index-llms-databricks
llama-index-embeddings-huggingface
llama-index-readers-web
-llama-index-vector-stores-lancedb
\ No newline at end of file
+llama-index-vector-stores-lancedb
+diffusers
+mlx>=0.11
+huggingface-hub
+regex
+numpy
+tqdm
+Pillow
+streamlit
\ No newline at end of file
diff --git a/examples/imagebind_demo/README.md b/examples/imagebind_demo/README.md
index 2fec470d..1c925f14 100644
--- a/examples/imagebind_demo/README.md
+++ b/examples/imagebind_demo/README.md
@@ -2,6 +2,8 @@
A gradio app showcasing multi-modal capabilities of Imagebind supported via lanceDB API
+![alt text](<../../assets/imagebind-demo.png>)
+
## Usage
you can run it locally by cloning the project as mentioned below, or access via Spaces:
diff --git a/examples/movie-recommendation-with-genres/README.md b/examples/movie-recommendation-with-genres/README.md
new file mode 100644
index 00000000..f79e4e4a
--- /dev/null
+++ b/examples/movie-recommendation-with-genres/README.md
@@ -0,0 +1,9 @@
+# Movie Recommendation using Embeddings and VectorDB
+
+![alt text](../../assets/movie-recommendation-with-genre.png)
+
+This example shows how to build a movie recommendation system using Doc2Vec embeddings and a VectorDB, and how combining the two techniques addresses key challenges faced by traditional recommender systems.
+
+Colab walkthrough -
+
+[Read the Blog Post](https://blog.lancedb.com/movie-recommendation-system-using-lancedb-and-doc2vec/)
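+
+The sketch below shows the core idea: store the Doc2Vec movie vectors in LanceDB and query them for nearest-neighbour recommendations. It assumes a Doc2Vec model trained as in the accompanying notebook (document tags are the dataframe row indices) and the Kaggle `movies_metadata.csv` downloaded into `data/`; the model path, table location and query text are placeholders, not part of the walkthrough itself.
+
+```python
+import lancedb
+import pandas as pd
+from gensim.models.doc2vec import Doc2Vec
+from nltk.tokenize import word_tokenize
+
+# Placeholders: a Doc2Vec model trained as in the notebook (tags = row indices)
+# and the Kaggle movies_metadata.csv downloaded into ./data
+model = Doc2Vec.load("movie_doc2vec.model")
+movies = pd.read_csv("data/movies_metadata.csv", low_memory=False)
+
+# Store one row per embedded movie: its title plus its Doc2Vec vector
+db = lancedb.connect("~/tmp/lancedb_movies")
+table = db.create_table(
+    "movies",
+    data=[
+        {"title": movies.loc[int(tag), "title"], "vector": model.dv[tag].tolist()}
+        for tag in model.dv.index_to_key
+    ],
+)
+
+# Recommend: embed a free-text query and pull the closest movies from LanceDB
+query_vec = model.infer_vector(word_tokenize("wizard school adventure"))
+print(table.search(query_vec).limit(5).to_pandas()[["title", "_distance"]])
+```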
\ No newline at end of file
diff --git a/examples/movie-recommendation-with-genres/movie_recommendation_with_doc2vec_and_lancedb.ipynb b/examples/movie-recommendation-with-genres/movie_recommendation_with_doc2vec_and_lancedb.ipynb
new file mode 100644
index 00000000..2a75b19a
--- /dev/null
+++ b/examples/movie-recommendation-with-genres/movie_recommendation_with_doc2vec_and_lancedb.ipynb
@@ -0,0 +1,614 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "K45xhdPRsZJV"
+ },
+ "source": [
+ "# Movie Recommendation System using Doc2vec Embeddings and Vector DB"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "XUj6NXD0sdgf"
+ },
+ "source": [
+ "This Colab notebook aims to illustrate the process of creating a recommendation system using embeddings and a Vector DB.\n",
+ "\n",
+ "This approach involves combining the various movie genres or characteristics of a movie to form Doc2Vec embeddings, which offer a comprehensive portrayal of the movie content.\n",
+ "\n",
+ "These embeddings serve dual purposes: they can either be directly inputted into a classification model for genre classification or stored in a VectorDB. By storing embeddings in a VectorDB, efficient retrieval and query search for recommendations become possible at a later stage.\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "qEa74a_Wtpc7"
+ },
+ "source": [
+ "### Installing the relevant dependencies\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "hyde90IntuFi"
+ },
+ "outputs": [],
+ "source": [
+ "!pip install torch scikit-learn lancedb nltk gensim lancedb scipy==1.12 kaggle"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "shPjHTZbtxTh"
+ },
+ "source": [
+ "## Kaggle Configuration and Data Needs\n",
+ "\n",
+ "We are using a movies metadata data which is being uploaded on the Kaggle. To download the dataset and use it for our recommendation system, we will need a `kaggle.json` file containing our creds.\n",
+ "\n",
+ "You can download the `kaggle.json` file from your Kaggle account settings. Follow these steps and make your life easy.\n",
+ "\n",
+ "1. Go to Kaggle and log in to your account.\n",
+ "2. Navigate to Your Account Settings and click on your profile picture in the top right corner of the page, Now From the dropdown menu, select `Account`.\n",
+ "3. Scroll down to the `API` section, Click on `Create New API Token`. This will download a file named kaggle.json to your computer.\n",
+ "\n",
+ "Once you have the `kaggle.json` file, you need to upload it here on colab data space. After uploading the `kaggle.json` file, run the following code to set up the credentials and download the dataset in `data` directory"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "6Tl2qzgKsWtF"
+ },
+ "outputs": [],
+ "source": [
+ "import json\n",
+ "import os\n",
+ "\n",
+ "# Assuming kaggle.json is uploaded to the current directory\n",
+ "with open(\"kaggle.json\") as f:\n",
+ " kaggle_credentials = json.load(f)\n",
+ "\n",
+ "os.environ[\"KAGGLE_USERNAME\"] = kaggle_credentials[\"username\"]\n",
+ "os.environ[\"KAGGLE_KEY\"] = kaggle_credentials[\"key\"]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "8va-0of3sU0x"
+ },
+ "outputs": [],
+ "source": [
+ "from kaggle.api.kaggle_api_extended import KaggleApi\n",
+ "\n",
+ "# Initialize the Kaggle API\n",
+ "api = KaggleApi()\n",
+ "api.authenticate()\n",
+ "\n",
+ "# Specify the dataset you want to download\n",
+ "dataset = \"rounakbanik/the-movies-dataset\"\n",
+ "destination = \"data/\"\n",
+ "\n",
+ "# Create the destination directory if it doesn't exist\n",
+ "if not os.path.exists(destination):\n",
+ " os.makedirs(destination)\n",
+ "\n",
+ "# Download the dataset\n",
+ "api.dataset_download_files(dataset, path=destination, unzip=True)\n",
+ "\n",
+ "print(f\"Dataset {dataset} downloaded to {destination}\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "hBYzad3lrY4e",
+ "outputId": "5a8f7983-80be-47e0-aa9c-ae4e10495c1e"
+ },
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "100%|██████████| 1000/1000 [00:00<00:00, 5050.83it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5161.29it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5006.18it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5222.83it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5216.24it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5171.35it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5109.78it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5222.42it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5133.39it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5024.74it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5117.18it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 4963.78it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5405.55it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5369.51it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5349.33it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5374.53it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5194.32it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5296.75it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5204.32it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5309.43it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5333.12it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5289.35it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5317.42it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5322.46it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5378.43it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5488.32it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5546.43it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 2502.38it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5369.91it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 4354.99it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5193.60it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5536.27it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 3476.56it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 4819.07it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 4500.37it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5184.11it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5098.14it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5523.73it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 4655.12it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5113.63it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5336.63it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5564.83it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5310.91it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 5533.46it/s]\n",
+ "100%|██████████| 1000/1000 [00:00<00:00, 4255.41it/s]\n",
+ "100%|██████████| 466/466 [00:00<00:00, 5617.03it/s]\n",
+ "Building Vocabulary: 100%|██████████| 44506/44506 [00:00<00:00, 104121.48it/s]\n",
+ "Epoch 1: 100%|██████████| 44506/44506 [00:02<00:00, 20444.80it/s]\n",
+ "Epoch 2: 100%|██████████| 44506/44506 [00:02<00:00, 20700.43it/s]\n",
+ "Epoch 3: 100%|██████████| 44506/44506 [00:02<00:00, 20831.06it/s]\n",
+ "Epoch 4: 100%|██████████| 44506/44506 [00:02<00:00, 20885.78it/s]\n",
+ "Epoch 5: 100%|██████████| 44506/44506 [00:02<00:00, 19616.38it/s]\n",
+ "Epoch 6: 100%|██████████| 44506/44506 [00:02<00:00, 19634.24it/s]\n",
+ "Epoch 7: 100%|██████████| 44506/44506 [00:02<00:00, 20579.08it/s]\n",
+ "Epoch 8: 100%|██████████| 44506/44506 [00:02<00:00, 20727.00it/s]\n",
+ "Epoch 9: 100%|██████████| 44506/44506 [00:02<00:00, 21242.19it/s]\n",
+ "Epoch 10: 100%|██████████| 44506/44506 [00:02<00:00, 18476.39it/s]\n",
+ "Epoch 11: 100%|██████████| 44506/44506 [00:02<00:00, 21169.07it/s]\n",
+ "Epoch 12: 100%|██████████| 44506/44506 [00:02<00:00, 20967.64it/s]\n",
+ "Epoch 13: 100%|██████████| 44506/44506 [00:02<00:00, 20192.34it/s]\n",
+ "Epoch 14: 100%|██████████| 44506/44506 [00:02<00:00, 18910.62it/s]\n",
+ "Epoch 15: 100%|██████████| 44506/44506 [00:02<00:00, 20810.41it/s]\n",
+ "Epoch 16: 100%|██████████| 44506/44506 [00:02<00:00, 21361.88it/s]\n",
+ "Epoch 17: 100%|██████████| 44506/44506 [00:02<00:00, 18440.51it/s]\n",
+ "Epoch 18: 100%|██████████| 44506/44506 [00:02<00:00, 21206.01it/s]\n",
+ "Epoch 19: 100%|██████████| 44506/44506 [00:02<00:00, 20086.00it/s]\n",
+ "Epoch 20: 100%|██████████| 44506/44506 [00:02<00:00, 20943.08it/s]\n"
+ ]
+ }
+ ],
+ "source": [
+ "import pandas as pd\n",
+ "import numpy as np\n",
+ "import torch\n",
+ "import torch.nn as nn\n",
+ "import torch.optim as optim\n",
+ "from torch.utils.data import DataLoader, TensorDataset\n",
+ "from gensim.models.doc2vec import Doc2Vec, TaggedDocument\n",
+ "from nltk.tokenize import word_tokenize\n",
+ "from sklearn.preprocessing import MultiLabelBinarizer\n",
+ "from sklearn.model_selection import train_test_split\n",
+ "from tqdm import tqdm\n",
+ "\n",
+ "# Read data from CSV file\n",
+ "movie_data = pd.read_csv(\n",
+ " \"/Users/vipul/Nova/Projects/genre_spectrum/movies_metadata.csv\", low_memory=False\n",
+ ")\n",
+ "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
+ "\n",
+ "\n",
+ "def preprocess_data(movie_data_chunk):\n",
+ " tagged_docs = []\n",
+ " valid_indices = []\n",
+ " movie_info = []\n",
+ "\n",
+ " # Wrap your loop with tqdm\n",
+ " for i, row in tqdm(movie_data_chunk.iterrows(), total=len(movie_data_chunk)):\n",
+ " try:\n",
+ " # Constructing movie text\n",
+ " movies_text = \"\"\n",
+ " movies_text += \"Overview: \" + row[\"overview\"] + \"\\n\"\n",
+ " genres = \", \".join([genre[\"name\"] for genre in eval(row[\"genres\"])])\n",
+ " movies_text += \"Overview: \" + row[\"overview\"] + \"\\n\"\n",
+ " movies_text += \"Genres: \" + genres + \"\\n\"\n",
+ " movies_text += \"Title: \" + row[\"title\"] + \"\\n\"\n",
+ " tagged_docs.append(\n",
+ " TaggedDocument(words=word_tokenize(movies_text.lower()), tags=[str(i)])\n",
+ " )\n",
+ " valid_indices.append(i)\n",
+ " movie_info.append((row[\"title\"], genres))\n",
+ " except Exception as e:\n",
+ " continue\n",
+ "\n",
+ " return tagged_docs, valid_indices, movie_info\n",
+ "\n",
+ "\n",
+ "def train_doc2vec_model(tagged_data, num_epochs=20):\n",
+ " # Initialize Doc2Vec model\n",
+ " doc2vec_model = Doc2Vec(vector_size=100, min_count=2, epochs=num_epochs)\n",
+ " doc2vec_model.build_vocab(tqdm(tagged_data, desc=\"Building Vocabulary\"))\n",
+ " for epoch in range(num_epochs):\n",
+ " doc2vec_model.train(\n",
+ " tqdm(tagged_data, desc=f\"Epoch {epoch+1}\"),\n",
+ " total_examples=doc2vec_model.corpus_count,\n",
+ " epochs=doc2vec_model.epochs,\n",
+ " )\n",
+ "\n",
+ " return doc2vec_model\n",
+ "\n",
+ "\n",
+ "# Preprocess data and extract genres for the first 1000 movies\n",
+ "chunk_size = 1000\n",
+ "tagged_data = []\n",
+ "valid_indices = []\n",
+ "movie_info = []\n",
+ "for chunk_start in range(0, len(movie_data), chunk_size):\n",
+ " movie_data_chunk = movie_data.iloc[chunk_start : chunk_start + chunk_size]\n",
+ " chunk_tagged_data, chunk_valid_indices, chunk_movie_info = preprocess_data(\n",
+ " movie_data_chunk\n",
+ " )\n",
+ " tagged_data.extend(chunk_tagged_data)\n",
+ " valid_indices.extend(chunk_valid_indices)\n",
+ " movie_info.extend(chunk_movie_info)\n",
+ "\n",
+ "doc2vec_model = train_doc2vec_model(tagged_data)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "VryHT1zVuEp0"
+ },
+ "source": [
+ "### Training a Neural Network for the Genre Classification Task"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "3pVNy2UKt5lu"
+ },
+ "outputs": [],
+ "source": [
+ "# Extract genre labels for the valid indices\n",
+ "genres_list = []\n",
+ "for i in valid_indices:\n",
+ " row = movie_data.loc[i]\n",
+ " genres = [genre[\"name\"] for genre in eval(row[\"genres\"])]\n",
+ " genres_list.append(genres)\n",
+ "\n",
+ "mlb = MultiLabelBinarizer()\n",
+ "genre_labels = mlb.fit_transform(genres_list)\n",
+ "\n",
+ "embeddings = []\n",
+ "for i in valid_indices:\n",
+ " embeddings.append(doc2vec_model.dv[str(i)])\n",
+ "X_train, X_test, y_train, y_test = train_test_split(\n",
+ " embeddings, genre_labels, test_size=0.2, random_state=42\n",
+ ")\n",
+ "\n",
+ "X_train_np = np.array(X_train, dtype=np.float32)\n",
+ "y_train_np = np.array(y_train, dtype=np.float32)\n",
+ "X_test_np = np.array(X_test, dtype=np.float32)\n",
+ "y_test_np = np.array(y_test, dtype=np.float32)\n",
+ "\n",
+ "X_train_tensor = torch.tensor(X_train_np)\n",
+ "y_train_tensor = torch.tensor(y_train_np)\n",
+ "X_test_tensor = torch.tensor(X_test_np)\n",
+ "y_test_tensor = torch.tensor(y_test_np)\n",
+ "\n",
+ "\n",
+ "class GenreClassifier(nn.Module):\n",
+ " def __init__(self, input_size, output_size):\n",
+ " super(GenreClassifier, self).__init__()\n",
+ " self.fc1 = nn.Linear(input_size, 512)\n",
+ " self.bn1 = nn.BatchNorm1d(512)\n",
+ " self.fc2 = nn.Linear(512, 256)\n",
+ " self.bn2 = nn.BatchNorm1d(256)\n",
+ " self.fc3 = nn.Linear(256, 128)\n",
+ " self.bn3 = nn.BatchNorm1d(128)\n",
+ " self.fc4 = nn.Linear(128, output_size)\n",
+ " self.relu = nn.ReLU()\n",
+ " self.dropout = nn.Dropout(p=0.2) # Adjust the dropout rate as needed\n",
+ "\n",
+ " def forward(self, x):\n",
+ " x = self.fc1(x)\n",
+ " x = self.bn1(x)\n",
+ " x = self.relu(x)\n",
+ " x = self.dropout(x)\n",
+ " x = self.fc2(x)\n",
+ " x = self.bn2(x)\n",
+ " x = self.relu(x)\n",
+ " x = self.dropout(x)\n",
+ " x = self.fc3(x)\n",
+ " x = self.bn3(x)\n",
+ " x = self.relu(x)\n",
+ " x = self.dropout(x)\n",
+ " x = self.fc4(x)\n",
+ " return x\n",
+ "\n",
+ "\n",
+ "# Move model to the selected device\n",
+ "model = GenreClassifier(input_size=100, output_size=len(mlb.classes_)).to(device)\n",
+ "\n",
+ "# Define loss function and optimizer\n",
+ "criterion = nn.BCEWithLogitsLoss()\n",
+ "optimizer = optim.Adam(model.parameters(), lr=0.001)\n",
+ "\n",
+ "# Training loop\n",
+ "epochs = 50\n",
+ "batch_size = 64\n",
+ "\n",
+ "train_dataset = TensorDataset(X_train_tensor.to(device), y_train_tensor.to(device))\n",
+ "train_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True)\n",
+ "\n",
+ "for epoch in range(epochs):\n",
+ " model.train()\n",
+ " running_loss = 0.0\n",
+ " for inputs, labels in train_loader:\n",
+ " inputs, labels = inputs.to(device), labels.to(device) # Move data to device\n",
+ " optimizer.zero_grad()\n",
+ " outputs = model(inputs)\n",
+ " loss = criterion(outputs, labels)\n",
+ " loss.backward()\n",
+ " optimizer.step()\n",
+ " running_loss += loss.item() * inputs.size(0)\n",
+ " epoch_loss = running_loss / len(train_loader.dataset)\n",
+ " print(f\"Epoch [{epoch + 1}/{epochs}], Loss: {epoch_loss:.4f}\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "yV8lTDYIubEQ"
+ },
+ "source": [
+ "### Testing the `model` to see if our model is able to predict the genres for the movies from the test dataset"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "73D3aqdJuct8"
+ },
+ "outputs": [],
+ "source": [
+ "from sklearn.metrics import f1_score\n",
+ "\n",
+ "model.eval()\n",
+ "with torch.no_grad():\n",
+ " X_test_tensor, y_test_tensor = X_test_tensor.to(device), y_test_tensor.to(\n",
+ " device\n",
+ " ) # Move test data to device\n",
+ " outputs = model(X_test_tensor)\n",
+ " test_loss = criterion(outputs, y_test_tensor)\n",
+ " print(f\"Test Loss: {test_loss.item():.4f}\")\n",
+ "\n",
+ "\n",
+ "thresholds = [0.1] * len(mlb.classes_)\n",
+ "thresholds_tensor = torch.tensor(thresholds, device=device).unsqueeze(0)\n",
+ "\n",
+ "# Convert the outputs to binary predictions using varying thresholds\n",
+ "predicted_labels = (outputs > thresholds_tensor).cpu().numpy()\n",
+ "\n",
+ "# Convert binary predictions and actual labels to multi-label format\n",
+ "predicted_multilabels = mlb.inverse_transform(predicted_labels)\n",
+ "actual_multilabels = mlb.inverse_transform(y_test_np)\n",
+ "\n",
+ "# Print the Predicted and Actual Labels for each movie\n",
+ "for i, (predicted, actual) in enumerate(zip(predicted_multilabels, actual_multilabels)):\n",
+ " print(f\"Movie {i+1}:\")\n",
+ " print(f\" Predicted Labels: {predicted}\")\n",
+ " print(f\" Actual Labels: {actual}\")\n",
+ "\n",
+ "\n",
+ "# Compute F1-score\n",
+ "f1 = f1_score(y_test_np, predicted_labels, average=\"micro\")\n",
+ "print(f\"F1-score: {f1:.4f}\")\n",
+ "\n",
+ "# Saving the trained model\n",
+ "torch.save(model.state_dict(), \"trained_model.pth\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "kZrHpMm4un0G"
+ },
+ "source": [
+ "### Storing the Doc2Vec Embeddings into LanceDB VectorDatabase"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "BTTNb9irrY4h"
+ },
+ "outputs": [],
+ "source": [
+ "import lancedb\n",
+ "import numpy as np\n",
+ "import pandas as pd\n",
+ "\n",
+ "data = []\n",
+ "\n",
+ "for i in valid_indices:\n",
+ " embedding = doc2vec_model.dv[str(i)]\n",
+ " title, genres = movie_info[valid_indices.index(i)]\n",
+ " data.append({\"title\": title, \"genres\": genres, \"vector\": embedding.tolist()})\n",
+ "\n",
+ "db = lancedb.connect(\".db\")\n",
+ "tbl = db.create_table(\"doc2vec_embeddings\", data, mode=\"Overwrite\")\n",
+ "db[\"doc2vec_embeddings\"].head()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "ciUFn7uQrY4i"
+ },
+ "outputs": [],
+ "source": [
+ "def get_recommendations(title):\n",
+ " pd_data = pd.DataFrame(data)\n",
+ " result = (\n",
+ " tbl.search(pd_data[pd_data[\"title\"] == title][\"vector\"].values[0])\n",
+ " .metric(\"cosine\")\n",
+ " .limit(10)\n",
+ " .to_pandas()\n",
+ " )\n",
+ " return result[[\"title\"]]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "8Kz-JGsTuwmk"
+ },
+ "source": [
+ "### D-Day : Let's generate some recommendations"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "uw_El12JrY4j",
+ "outputId": "c245bab5-7966-4fd1-ec72-37f708c3b570"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/html": [
+ "\n",
+ "\n",
+ "
\n",
+ " \n",
+ " \n",
+ " | \n",
+ " title | \n",
+ "
\n",
+ " \n",
+ " \n",
+ " \n",
+ " 0 | \n",
+ " Vertical Limit | \n",
+ "
\n",
+ " \n",
+ " 1 | \n",
+ " Demons of War | \n",
+ "
\n",
+ " \n",
+ " 2 | \n",
+ " Fear and Desire | \n",
+ "
\n",
+ " \n",
+ " 3 | \n",
+ " Escape from Sobibor | \n",
+ "
\n",
+ " \n",
+ " 4 | \n",
+ " Last Girl Standing | \n",
+ "
\n",
+ " \n",
+ " 5 | \n",
+ " K2: Siren of the Himalayas | \n",
+ "
\n",
+ " \n",
+ " 6 | \n",
+ " Ghost Ship | \n",
+ "
\n",
+ " \n",
+ " 7 | \n",
+ " Camp Massacre | \n",
+ "
\n",
+ " \n",
+ " 8 | \n",
+ " Captain Nemo and the Underwater City | \n",
+ "
\n",
+ " \n",
+ " 9 | \n",
+ " Seas Beneath | \n",
+ "
\n",
+ " \n",
+ "
\n",
+ "
"
+ ],
+ "text/plain": [
+ " title\n",
+ "0 Vertical Limit\n",
+ "1 Demons of War\n",
+ "2 Fear and Desire\n",
+ "3 Escape from Sobibor\n",
+ "4 Last Girl Standing\n",
+ "5 K2: Siren of the Himalayas\n",
+ "6 Ghost Ship\n",
+ "7 Camp Massacre\n",
+ "8 Captain Nemo and the Underwater City\n",
+ "9 Seas Beneath"
+ ]
+ },
+ "execution_count": 20,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "get_recommendations(\"Vertical Limit\")"
+ ]
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "provenance": []
+ },
+ "kernelspec": {
+ "display_name": "env",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
diff --git a/examples/parent_document_retriever/main.ipynb b/examples/parent_document_retriever/main.ipynb
index eead31ea..9b7e51a3 100644
--- a/examples/parent_document_retriever/main.ipynb
+++ b/examples/parent_document_retriever/main.ipynb
@@ -109,9 +109,9 @@
"metadata": {},
"outputs": [],
"source": [
- "os.environ[\"OPENAI_API_KEY\"] = (\n",
- " \"YOUR_API_KEY_HERE\" # NEEDED if you run LLM Experiment below\n",
- ")"
+ "os.environ[\n",
+ " \"OPENAI_API_KEY\"\n",
+ "] = \"YOUR_API_KEY_HERE\" # NEEDED if you run LLM Experiment below"
]
},
{
diff --git a/tutorials/Advace_RAG_LlamaParser/README.md b/tutorials/Advace_RAG_LlamaParser/README.md
new file mode 100644
index 00000000..50a5d258
--- /dev/null
+++ b/tutorials/Advace_RAG_LlamaParser/README.md
@@ -0,0 +1,18 @@
+## Advanced RAG: Extracting Complex PDFs Containing Tables & Text Using LlamaParse
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Advace_RAG_LlamaParser/main.ipynb)
+
+This example contains code for comparing LangChain, LlamaIndex, and LlamaIndex with LlamaParse when extracting data from PDFs, especially those with complex tables and text.
+
+### Overview
+In this project, we explore:
+
+* Q&A on PDF data using LangChain
+
+* Q&A on PDF data using LlamaIndex
+
+* Q&A on PDF data using LlamaIndex with LlamaParse
+
+The results of each method are compared in the Colab notebook below. A rough sketch of the LlamaIndex + LlamaParse flow is also included at the end of this README.
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Advace_RAG_LlamaParser/main.ipynb)
+
+
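+### Quick sketch
+
+As a rough illustration (not the notebook's exact code) of the LlamaIndex + LlamaParse path, assuming `LLAMA_CLOUD_API_KEY` and `OPENAI_API_KEY` are set; the file path and query below are placeholders:
+
+```python
+from llama_parse import LlamaParse
+from llama_index.core import VectorStoreIndex
+
+# Parse the PDF (tables included) into markdown-aware documents
+documents = LlamaParse(result_type="markdown").load_data("sample.pdf")  # placeholder path
+
+# Index the parsed documents and ask a question over them
+index = VectorStoreIndex.from_documents(documents)
+print(index.as_query_engine().query("Summarize the key figures from the main table."))  # placeholder query
+```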
diff --git a/tutorials/Advace_RAG_LlamaParser/main.ipynb b/tutorials/Advace_RAG_LlamaParser/main.ipynb
new file mode 100644
index 00000000..c303d0f9
--- /dev/null
+++ b/tutorials/Advace_RAG_LlamaParser/main.ipynb
@@ -0,0 +1,3529 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "7wD8dJo-WZH7"
+ },
+ "source": [
+ "This notebook compares Langchain & Llamaindex for understand which method is best extraction of table & text from PDF in the following\n",
+ "\n",
+ "\n",
+ "Here we have covered\n",
+ "\n",
+ "1. Langchain RAG\n",
+ "2. Llamaindex RAG\n",
+ "3. Langchain wiht llamaparser\n",
+ "4. Llamaindex with llamaparser\n",
+ "\n",
+ "\n",
+ "from above this method will get idea about which is best method for table extraction for the following data used\n"
+ ]
+ },
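+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "A rough, illustrative sketch of the plain LangChain RAG path compared in this notebook, using LanceDB as the vector store (an assumption based on this repo); it assumes the required LangChain packages are installed and an OpenAI API key is set, and the PDF path and question are placeholders:\n",
+ "\n",
+ "```python\n",
+ "from langchain_community.document_loaders import PyPDFLoader\n",
+ "from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
+ "from langchain_community.vectorstores import LanceDB\n",
+ "from langchain_openai import OpenAIEmbeddings, ChatOpenAI\n",
+ "from langchain.chains import RetrievalQA\n",
+ "\n",
+ "# Load and chunk the PDF\n",
+ "pages = PyPDFLoader(\"sample.pdf\").load()  # placeholder path\n",
+ "chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(pages)\n",
+ "\n",
+ "# Embed the chunks into a LanceDB vector store and build a retrieval QA chain\n",
+ "retriever = LanceDB.from_documents(chunks, OpenAIEmbeddings()).as_retriever()\n",
+ "qa = RetrievalQA.from_chain_type(llm=ChatOpenAI(), retriever=retriever)\n",
+ "print(qa.invoke({\"query\": \"What does the main table report?\"}))  # placeholder question\n",
+ "```\n"
+ ]
+ },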
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "HGcMcXLF7zoM",
+ "outputId": "b4f6876b-0531-4bce-c1a6-1615c77322a2"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Collecting llama-index\n",
+ " Downloading llama_index-0.10.37-py3-none-any.whl (6.8 kB)\n",
+ "Collecting llama-index-core\n",
+ " Downloading llama_index_core-0.10.37.post1-py3-none-any.whl (15.4 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m15.4/15.4 MB\u001b[0m \u001b[31m40.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting llama-index-embeddings-openai\n",
+ " Downloading llama_index_embeddings_openai-0.1.9-py3-none-any.whl (6.0 kB)\n",
+ "Collecting llama-parse\n",
+ " Downloading llama_parse-0.4.3-py3-none-any.whl (7.7 kB)\n",
+ "Collecting llama-index-agent-openai<0.3.0,>=0.1.4 (from llama-index)\n",
+ " Downloading llama_index_agent_openai-0.2.5-py3-none-any.whl (13 kB)\n",
+ "Collecting llama-index-cli<0.2.0,>=0.1.2 (from llama-index)\n",
+ " Downloading llama_index_cli-0.1.12-py3-none-any.whl (26 kB)\n",
+ "Collecting llama-index-indices-managed-llama-cloud<0.2.0,>=0.1.2 (from llama-index)\n",
+ " Downloading llama_index_indices_managed_llama_cloud-0.1.6-py3-none-any.whl (6.7 kB)\n",
+ "Collecting llama-index-legacy<0.10.0,>=0.9.48 (from llama-index)\n",
+ " Downloading llama_index_legacy-0.9.48-py3-none-any.whl (2.0 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.0/2.0 MB\u001b[0m \u001b[31m52.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting llama-index-llms-openai<0.2.0,>=0.1.13 (from llama-index)\n",
+ " Downloading llama_index_llms_openai-0.1.19-py3-none-any.whl (11 kB)\n",
+ "Collecting llama-index-multi-modal-llms-openai<0.2.0,>=0.1.3 (from llama-index)\n",
+ " Downloading llama_index_multi_modal_llms_openai-0.1.6-py3-none-any.whl (5.8 kB)\n",
+ "Collecting llama-index-program-openai<0.2.0,>=0.1.3 (from llama-index)\n",
+ " Downloading llama_index_program_openai-0.1.6-py3-none-any.whl (5.2 kB)\n",
+ "Collecting llama-index-question-gen-openai<0.2.0,>=0.1.2 (from llama-index)\n",
+ " Downloading llama_index_question_gen_openai-0.1.3-py3-none-any.whl (2.9 kB)\n",
+ "Collecting llama-index-readers-file<0.2.0,>=0.1.4 (from llama-index)\n",
+ " Downloading llama_index_readers_file-0.1.22-py3-none-any.whl (36 kB)\n",
+ "Collecting llama-index-readers-llama-parse<0.2.0,>=0.1.2 (from llama-index)\n",
+ " Downloading llama_index_readers_llama_parse-0.1.4-py3-none-any.whl (2.5 kB)\n",
+ "Requirement already satisfied: PyYAML>=6.0.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (6.0.1)\n",
+ "Requirement already satisfied: SQLAlchemy[asyncio]>=1.4.49 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (2.0.30)\n",
+ "Requirement already satisfied: aiohttp<4.0.0,>=3.8.6 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (3.9.5)\n",
+ "Collecting dataclasses-json (from llama-index-core)\n",
+ " Downloading dataclasses_json-0.6.6-py3-none-any.whl (28 kB)\n",
+ "Collecting deprecated>=1.2.9.3 (from llama-index-core)\n",
+ " Downloading Deprecated-1.2.14-py2.py3-none-any.whl (9.6 kB)\n",
+ "Collecting dirtyjson<2.0.0,>=1.0.8 (from llama-index-core)\n",
+ " Downloading dirtyjson-1.0.8-py3-none-any.whl (25 kB)\n",
+ "Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (2023.6.0)\n",
+ "Collecting httpx (from llama-index-core)\n",
+ " Downloading httpx-0.27.0-py3-none-any.whl (75 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.6/75.6 kB\u001b[0m \u001b[31m9.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting jsonpath-ng<2.0.0,>=1.6.0 (from llama-index-core)\n",
+ " Downloading jsonpath_ng-1.6.1-py3-none-any.whl (29 kB)\n",
+ "Collecting llamaindex-py-client<0.2.0,>=0.1.18 (from llama-index-core)\n",
+ " Downloading llamaindex_py_client-0.1.19-py3-none-any.whl (141 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m141.9/141.9 kB\u001b[0m \u001b[31m14.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: nest-asyncio<2.0.0,>=1.5.8 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (1.6.0)\n",
+ "Requirement already satisfied: networkx>=3.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (3.3)\n",
+ "Requirement already satisfied: nltk<4.0.0,>=3.8.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (3.8.1)\n",
+ "Requirement already satisfied: numpy in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (1.25.2)\n",
+ "Collecting openai>=1.1.0 (from llama-index-core)\n",
+ " Downloading openai-1.30.1-py3-none-any.whl (320 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m320.6/320.6 kB\u001b[0m \u001b[31m26.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (2.0.3)\n",
+ "Requirement already satisfied: pillow>=9.0.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (9.4.0)\n",
+ "Requirement already satisfied: requests>=2.31.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (2.31.0)\n",
+ "Requirement already satisfied: spacy<4.0.0,>=3.7.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (3.7.4)\n",
+ "Requirement already satisfied: tenacity<9.0.0,>=8.2.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (8.3.0)\n",
+ "Collecting tiktoken>=0.3.3 (from llama-index-core)\n",
+ " Downloading tiktoken-0.7.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.1/1.1 MB\u001b[0m \u001b[31m54.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: tqdm<5.0.0,>=4.66.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (4.66.4)\n",
+ "Requirement already satisfied: typing-extensions>=4.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (4.11.0)\n",
+ "Collecting typing-inspect>=0.8.0 (from llama-index-core)\n",
+ " Downloading typing_inspect-0.9.0-py3-none-any.whl (8.8 kB)\n",
+ "Requirement already satisfied: wrapt in /usr/local/lib/python3.10/dist-packages (from llama-index-core) (1.14.1)\n",
+ "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core) (1.3.1)\n",
+ "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core) (23.2.0)\n",
+ "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core) (1.4.1)\n",
+ "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core) (6.0.5)\n",
+ "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core) (1.9.4)\n",
+ "Requirement already satisfied: async-timeout<5.0,>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core) (4.0.3)\n",
+ "Collecting ply (from jsonpath-ng<2.0.0,>=1.6.0->llama-index-core)\n",
+ " Downloading ply-3.11-py2.py3-none-any.whl (49 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m49.6/49.6 kB\u001b[0m \u001b[31m5.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: beautifulsoup4<5.0.0,>=4.12.3 in /usr/local/lib/python3.10/dist-packages (from llama-index-readers-file<0.2.0,>=0.1.4->llama-index) (4.12.3)\n",
+ "Collecting pypdf<5.0.0,>=4.0.1 (from llama-index-readers-file<0.2.0,>=0.1.4->llama-index)\n",
+ " Downloading pypdf-4.2.0-py3-none-any.whl (290 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m290.4/290.4 kB\u001b[0m \u001b[31m27.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting striprtf<0.0.27,>=0.0.26 (from llama-index-readers-file<0.2.0,>=0.1.4->llama-index)\n",
+ " Downloading striprtf-0.0.26-py3-none-any.whl (6.9 kB)\n",
+ "Requirement already satisfied: pydantic>=1.10 in /usr/local/lib/python3.10/dist-packages (from llamaindex-py-client<0.2.0,>=0.1.18->llama-index-core) (2.7.1)\n",
+ "Requirement already satisfied: anyio in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core) (3.7.1)\n",
+ "Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core) (2024.2.2)\n",
+ "Collecting httpcore==1.* (from httpx->llama-index-core)\n",
+ " Downloading httpcore-1.0.5-py3-none-any.whl (77 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m77.9/77.9 kB\u001b[0m \u001b[31m10.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: idna in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core) (3.7)\n",
+ "Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core) (1.3.1)\n",
+ "Collecting h11<0.15,>=0.13 (from httpcore==1.*->httpx->llama-index-core)\n",
+ " Downloading h11-0.14.0-py3-none-any.whl (58 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m58.3/58.3 kB\u001b[0m \u001b[31m6.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: click in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core) (8.1.7)\n",
+ "Requirement already satisfied: joblib in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core) (1.4.2)\n",
+ "Requirement already satisfied: regex>=2021.8.3 in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core) (2023.12.25)\n",
+ "Requirement already satisfied: distro<2,>=1.7.0 in /usr/lib/python3/dist-packages (from openai>=1.1.0->llama-index-core) (1.7.0)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests>=2.31.0->llama-index-core) (3.3.2)\n",
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests>=2.31.0->llama-index-core) (2.0.7)\n",
+ "Requirement already satisfied: spacy-legacy<3.1.0,>=3.0.11 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (3.0.12)\n",
+ "Requirement already satisfied: spacy-loggers<2.0.0,>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (1.0.5)\n",
+ "Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (1.0.10)\n",
+ "Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (2.0.8)\n",
+ "Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (3.0.9)\n",
+ "Requirement already satisfied: thinc<8.3.0,>=8.2.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (8.2.3)\n",
+ "Requirement already satisfied: wasabi<1.2.0,>=0.9.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (1.1.2)\n",
+ "Requirement already satisfied: srsly<3.0.0,>=2.4.3 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (2.4.8)\n",
+ "Requirement already satisfied: catalogue<2.1.0,>=2.0.6 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (2.0.10)\n",
+ "Requirement already satisfied: weasel<0.4.0,>=0.1.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (0.3.4)\n",
+ "Requirement already satisfied: typer<0.10.0,>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (0.9.4)\n",
+ "Requirement already satisfied: smart-open<7.0.0,>=5.2.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (6.4.0)\n",
+ "Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (3.1.4)\n",
+ "Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (67.7.2)\n",
+ "Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (24.0)\n",
+ "Requirement already satisfied: langcodes<4.0.0,>=3.2.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core) (3.4.0)\n",
+ "Requirement already satisfied: greenlet!=0.4.17 in /usr/local/lib/python3.10/dist-packages (from SQLAlchemy[asyncio]>=1.4.49->llama-index-core) (3.0.3)\n",
+ "Collecting mypy-extensions>=0.3.0 (from typing-inspect>=0.8.0->llama-index-core)\n",
+ " Downloading mypy_extensions-1.0.0-py3-none-any.whl (4.7 kB)\n",
+ "Collecting marshmallow<4.0.0,>=3.18.0 (from dataclasses-json->llama-index-core)\n",
+ " Downloading marshmallow-3.21.2-py3-none-any.whl (49 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m49.3/49.3 kB\u001b[0m \u001b[31m4.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core) (2.8.2)\n",
+ "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core) (2023.4)\n",
+ "Requirement already satisfied: tzdata>=2022.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core) (2024.1)\n",
+ "Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio->httpx->llama-index-core) (1.2.1)\n",
+ "Requirement already satisfied: soupsieve>1.2 in /usr/local/lib/python3.10/dist-packages (from beautifulsoup4<5.0.0,>=4.12.3->llama-index-readers-file<0.2.0,>=0.1.4->llama-index) (2.5)\n",
+ "Requirement already satisfied: language-data>=1.2 in /usr/local/lib/python3.10/dist-packages (from langcodes<4.0.0,>=3.2.0->spacy<4.0.0,>=3.7.1->llama-index-core) (1.2.0)\n",
+ "Requirement already satisfied: annotated-types>=0.4.0 in /usr/local/lib/python3.10/dist-packages (from pydantic>=1.10->llamaindex-py-client<0.2.0,>=0.1.18->llama-index-core) (0.6.0)\n",
+ "Requirement already satisfied: pydantic-core==2.18.2 in /usr/local/lib/python3.10/dist-packages (from pydantic>=1.10->llamaindex-py-client<0.2.0,>=0.1.18->llama-index-core) (2.18.2)\n",
+ "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->llama-index-core) (1.16.0)\n",
+ "Requirement already satisfied: blis<0.8.0,>=0.7.8 in /usr/local/lib/python3.10/dist-packages (from thinc<8.3.0,>=8.2.2->spacy<4.0.0,>=3.7.1->llama-index-core) (0.7.11)\n",
+ "Requirement already satisfied: confection<1.0.0,>=0.0.1 in /usr/local/lib/python3.10/dist-packages (from thinc<8.3.0,>=8.2.2->spacy<4.0.0,>=3.7.1->llama-index-core) (0.1.4)\n",
+ "Requirement already satisfied: cloudpathlib<0.17.0,>=0.7.0 in /usr/local/lib/python3.10/dist-packages (from weasel<0.4.0,>=0.1.0->spacy<4.0.0,>=3.7.1->llama-index-core) (0.16.0)\n",
+ "Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->spacy<4.0.0,>=3.7.1->llama-index-core) (2.1.5)\n",
+ "Requirement already satisfied: marisa-trie>=0.7.7 in /usr/local/lib/python3.10/dist-packages (from language-data>=1.2->langcodes<4.0.0,>=3.2.0->spacy<4.0.0,>=3.7.1->llama-index-core) (1.1.1)\n",
+ "Installing collected packages: striprtf, ply, dirtyjson, pypdf, mypy-extensions, marshmallow, jsonpath-ng, h11, deprecated, typing-inspect, tiktoken, httpcore, httpx, dataclasses-json, openai, llamaindex-py-client, llama-index-legacy, llama-index-core, llama-parse, llama-index-readers-file, llama-index-llms-openai, llama-index-indices-managed-llama-cloud, llama-index-embeddings-openai, llama-index-readers-llama-parse, llama-index-multi-modal-llms-openai, llama-index-cli, llama-index-agent-openai, llama-index-program-openai, llama-index-question-gen-openai, llama-index\n",
+ "Successfully installed dataclasses-json-0.6.6 deprecated-1.2.14 dirtyjson-1.0.8 h11-0.14.0 httpcore-1.0.5 httpx-0.27.0 jsonpath-ng-1.6.1 llama-index-0.10.37 llama-index-agent-openai-0.2.5 llama-index-cli-0.1.12 llama-index-core-0.10.37.post1 llama-index-embeddings-openai-0.1.9 llama-index-indices-managed-llama-cloud-0.1.6 llama-index-legacy-0.9.48 llama-index-llms-openai-0.1.19 llama-index-multi-modal-llms-openai-0.1.6 llama-index-program-openai-0.1.6 llama-index-question-gen-openai-0.1.3 llama-index-readers-file-0.1.22 llama-index-readers-llama-parse-0.1.4 llama-parse-0.4.3 llamaindex-py-client-0.1.19 marshmallow-3.21.2 mypy-extensions-1.0.0 openai-1.30.1 ply-3.11 pypdf-4.2.0 striprtf-0.0.26 tiktoken-0.7.0 typing-inspect-0.9.0\n",
+ "Collecting llama-index-postprocessor-flag-embedding-reranker\n",
+ " Downloading llama_index_postprocessor_flag_embedding_reranker-0.1.3-py3-none-any.whl (3.0 kB)\n",
+ "Requirement already satisfied: llama-index-core<0.11.0,>=0.10.35 in /usr/local/lib/python3.10/dist-packages (from llama-index-postprocessor-flag-embedding-reranker) (0.10.37.post1)\n",
+ "Requirement already satisfied: PyYAML>=6.0.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (6.0.1)\n",
+ "Requirement already satisfied: SQLAlchemy[asyncio]>=1.4.49 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.0.30)\n",
+ "Requirement already satisfied: aiohttp<4.0.0,>=3.8.6 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.9.5)\n",
+ "Requirement already satisfied: dataclasses-json in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.6.6)\n",
+ "Requirement already satisfied: deprecated>=1.2.9.3 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.2.14)\n",
+ "Requirement already satisfied: dirtyjson<2.0.0,>=1.0.8 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.0.8)\n",
+ "Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2023.6.0)\n",
+ "Requirement already satisfied: httpx in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.27.0)\n",
+ "Requirement already satisfied: jsonpath-ng<2.0.0,>=1.6.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.6.1)\n",
+ "Requirement already satisfied: llamaindex-py-client<0.2.0,>=0.1.18 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.1.19)\n",
+ "Requirement already satisfied: nest-asyncio<2.0.0,>=1.5.8 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.6.0)\n",
+ "Requirement already satisfied: networkx>=3.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.3)\n",
+ "Requirement already satisfied: nltk<4.0.0,>=3.8.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.8.1)\n",
+ "Requirement already satisfied: numpy in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.25.2)\n",
+ "Requirement already satisfied: openai>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.30.1)\n",
+ "Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.0.3)\n",
+ "Requirement already satisfied: pillow>=9.0.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (9.4.0)\n",
+ "Requirement already satisfied: requests>=2.31.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.31.0)\n",
+ "Requirement already satisfied: spacy<4.0.0,>=3.7.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.7.4)\n",
+ "Requirement already satisfied: tenacity<9.0.0,>=8.2.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (8.3.0)\n",
+ "Requirement already satisfied: tiktoken>=0.3.3 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.7.0)\n",
+ "Requirement already satisfied: tqdm<5.0.0,>=4.66.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (4.66.4)\n",
+ "Requirement already satisfied: typing-extensions>=4.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (4.11.0)\n",
+ "Requirement already satisfied: typing-inspect>=0.8.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.9.0)\n",
+ "Requirement already satisfied: wrapt in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.14.1)\n",
+ "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.3.1)\n",
+ "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (23.2.0)\n",
+ "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.4.1)\n",
+ "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (6.0.5)\n",
+ "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.9.4)\n",
+ "Requirement already satisfied: async-timeout<5.0,>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (4.0.3)\n",
+ "Requirement already satisfied: ply in /usr/local/lib/python3.10/dist-packages (from jsonpath-ng<2.0.0,>=1.6.0->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.11)\n",
+ "Requirement already satisfied: pydantic>=1.10 in /usr/local/lib/python3.10/dist-packages (from llamaindex-py-client<0.2.0,>=0.1.18->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.7.1)\n",
+ "Requirement already satisfied: anyio in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.7.1)\n",
+ "Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2024.2.2)\n",
+ "Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.0.5)\n",
+ "Requirement already satisfied: idna in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.7)\n",
+ "Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.3.1)\n",
+ "Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/lib/python3.10/dist-packages (from httpcore==1.*->httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.14.0)\n",
+ "Requirement already satisfied: click in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (8.1.7)\n",
+ "Requirement already satisfied: joblib in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.4.2)\n",
+ "Requirement already satisfied: regex>=2021.8.3 in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2023.12.25)\n",
+ "Requirement already satisfied: distro<2,>=1.7.0 in /usr/lib/python3/dist-packages (from openai>=1.1.0->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.7.0)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests>=2.31.0->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.3.2)\n",
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests>=2.31.0->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.0.7)\n",
+ "Requirement already satisfied: spacy-legacy<3.1.0,>=3.0.11 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.0.12)\n",
+ "Requirement already satisfied: spacy-loggers<2.0.0,>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.0.5)\n",
+ "Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.0.10)\n",
+ "Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.0.8)\n",
+ "Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.0.9)\n",
+ "Requirement already satisfied: thinc<8.3.0,>=8.2.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (8.2.3)\n",
+ "Requirement already satisfied: wasabi<1.2.0,>=0.9.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.1.2)\n",
+ "Requirement already satisfied: srsly<3.0.0,>=2.4.3 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.4.8)\n",
+ "Requirement already satisfied: catalogue<2.1.0,>=2.0.6 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.0.10)\n",
+ "Requirement already satisfied: weasel<0.4.0,>=0.1.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.3.4)\n",
+ "Requirement already satisfied: typer<0.10.0,>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.9.4)\n",
+ "Requirement already satisfied: smart-open<7.0.0,>=5.2.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (6.4.0)\n",
+ "Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.1.4)\n",
+ "Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (67.7.2)\n",
+ "Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (24.0)\n",
+ "Requirement already satisfied: langcodes<4.0.0,>=3.2.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.4.0)\n",
+ "Requirement already satisfied: greenlet!=0.4.17 in /usr/local/lib/python3.10/dist-packages (from SQLAlchemy[asyncio]>=1.4.49->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.0.3)\n",
+ "Requirement already satisfied: mypy-extensions>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from typing-inspect>=0.8.0->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.0.0)\n",
+ "Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /usr/local/lib/python3.10/dist-packages (from dataclasses-json->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (3.21.2)\n",
+ "Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.8.2)\n",
+ "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2023.4)\n",
+ "Requirement already satisfied: tzdata>=2022.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2024.1)\n",
+ "Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio->httpx->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.2.1)\n",
+ "Requirement already satisfied: language-data>=1.2 in /usr/local/lib/python3.10/dist-packages (from langcodes<4.0.0,>=3.2.0->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.2.0)\n",
+ "Requirement already satisfied: annotated-types>=0.4.0 in /usr/local/lib/python3.10/dist-packages (from pydantic>=1.10->llamaindex-py-client<0.2.0,>=0.1.18->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.6.0)\n",
+ "Requirement already satisfied: pydantic-core==2.18.2 in /usr/local/lib/python3.10/dist-packages (from pydantic>=1.10->llamaindex-py-client<0.2.0,>=0.1.18->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.18.2)\n",
+ "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.16.0)\n",
+ "Requirement already satisfied: blis<0.8.0,>=0.7.8 in /usr/local/lib/python3.10/dist-packages (from thinc<8.3.0,>=8.2.2->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.7.11)\n",
+ "Requirement already satisfied: confection<1.0.0,>=0.0.1 in /usr/local/lib/python3.10/dist-packages (from thinc<8.3.0,>=8.2.2->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.1.4)\n",
+ "Requirement already satisfied: cloudpathlib<0.17.0,>=0.7.0 in /usr/local/lib/python3.10/dist-packages (from weasel<0.4.0,>=0.1.0->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (0.16.0)\n",
+ "Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (2.1.5)\n",
+ "Requirement already satisfied: marisa-trie>=0.7.7 in /usr/local/lib/python3.10/dist-packages (from language-data>=1.2->langcodes<4.0.0,>=3.2.0->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.35->llama-index-postprocessor-flag-embedding-reranker) (1.1.1)\n",
+ "Installing collected packages: llama-index-postprocessor-flag-embedding-reranker\n",
+ "Successfully installed llama-index-postprocessor-flag-embedding-reranker-0.1.3\n",
+ "Collecting git+https://github.com/FlagOpen/FlagEmbedding.git\n",
+ " Cloning https://github.com/FlagOpen/FlagEmbedding.git to /tmp/pip-req-build-wmws0zv2\n",
+ " Running command git clone --filter=blob:none --quiet https://github.com/FlagOpen/FlagEmbedding.git /tmp/pip-req-build-wmws0zv2\n",
+ " Resolved https://github.com/FlagOpen/FlagEmbedding.git to commit 95b873d9ac923bca47436efeae39ca4559970210\n",
+ " Preparing metadata (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ "Requirement already satisfied: torch>=1.6.0 in /usr/local/lib/python3.10/dist-packages (from FlagEmbedding==1.2.9) (2.2.1+cu121)\n",
+ "Requirement already satisfied: transformers>=4.33.0 in /usr/local/lib/python3.10/dist-packages (from FlagEmbedding==1.2.9) (4.40.2)\n",
+ "Collecting datasets (from FlagEmbedding==1.2.9)\n",
+ " Downloading datasets-2.19.1-py3-none-any.whl (542 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m542.0/542.0 kB\u001b[0m \u001b[31m6.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting accelerate>=0.20.1 (from FlagEmbedding==1.2.9)\n",
+ " Downloading accelerate-0.30.1-py3-none-any.whl (302 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m302.6/302.6 kB\u001b[0m \u001b[31m10.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting sentence_transformers (from FlagEmbedding==1.2.9)\n",
+ " Downloading sentence_transformers-2.7.0-py3-none-any.whl (171 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m171.5/171.5 kB\u001b[0m \u001b[31m9.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.10/dist-packages (from accelerate>=0.20.1->FlagEmbedding==1.2.9) (1.25.2)\n",
+ "Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from accelerate>=0.20.1->FlagEmbedding==1.2.9) (24.0)\n",
+ "Requirement already satisfied: psutil in /usr/local/lib/python3.10/dist-packages (from accelerate>=0.20.1->FlagEmbedding==1.2.9) (5.9.5)\n",
+ "Requirement already satisfied: pyyaml in /usr/local/lib/python3.10/dist-packages (from accelerate>=0.20.1->FlagEmbedding==1.2.9) (6.0.1)\n",
+ "Requirement already satisfied: huggingface-hub in /usr/local/lib/python3.10/dist-packages (from accelerate>=0.20.1->FlagEmbedding==1.2.9) (0.20.3)\n",
+ "Requirement already satisfied: safetensors>=0.3.1 in /usr/local/lib/python3.10/dist-packages (from accelerate>=0.20.1->FlagEmbedding==1.2.9) (0.4.3)\n",
+ "Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (3.14.0)\n",
+ "Requirement already satisfied: typing-extensions>=4.8.0 in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (4.11.0)\n",
+ "Requirement already satisfied: sympy in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (1.12)\n",
+ "Requirement already satisfied: networkx in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (3.3)\n",
+ "Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (3.1.4)\n",
+ "Requirement already satisfied: fsspec in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (2023.6.0)\n",
+ "Collecting nvidia-cuda-nvrtc-cu12==12.1.105 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (23.7 MB)\n",
+ "Collecting nvidia-cuda-runtime-cu12==12.1.105 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (823 kB)\n",
+ "Collecting nvidia-cuda-cupti-cu12==12.1.105 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (14.1 MB)\n",
+ "Collecting nvidia-cudnn-cu12==8.9.2.26 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl (731.7 MB)\n",
+ "Collecting nvidia-cublas-cu12==12.1.3.1 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl (410.6 MB)\n",
+ "Collecting nvidia-cufft-cu12==11.0.2.54 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl (121.6 MB)\n",
+ "Collecting nvidia-curand-cu12==10.3.2.106 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl (56.5 MB)\n",
+ "Collecting nvidia-cusolver-cu12==11.4.5.107 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl (124.2 MB)\n",
+ "Collecting nvidia-cusparse-cu12==12.1.0.106 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl (196.0 MB)\n",
+ "Collecting nvidia-nccl-cu12==2.19.3 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_nccl_cu12-2.19.3-py3-none-manylinux1_x86_64.whl (166.0 MB)\n",
+ "Collecting nvidia-nvtx-cu12==12.1.105 (from torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (99 kB)\n",
+ "Requirement already satisfied: triton==2.2.0 in /usr/local/lib/python3.10/dist-packages (from torch>=1.6.0->FlagEmbedding==1.2.9) (2.2.0)\n",
+ "Collecting nvidia-nvjitlink-cu12 (from nvidia-cusolver-cu12==11.4.5.107->torch>=1.6.0->FlagEmbedding==1.2.9)\n",
+ " Using cached nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl (21.1 MB)\n",
+ "Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.33.0->FlagEmbedding==1.2.9) (2023.12.25)\n",
+ "Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from transformers>=4.33.0->FlagEmbedding==1.2.9) (2.31.0)\n",
+ "Requirement already satisfied: tokenizers<0.20,>=0.19 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.33.0->FlagEmbedding==1.2.9) (0.19.1)\n",
+ "Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.33.0->FlagEmbedding==1.2.9) (4.66.4)\n",
+ "Requirement already satisfied: pyarrow>=12.0.0 in /usr/local/lib/python3.10/dist-packages (from datasets->FlagEmbedding==1.2.9) (14.0.2)\n",
+ "Requirement already satisfied: pyarrow-hotfix in /usr/local/lib/python3.10/dist-packages (from datasets->FlagEmbedding==1.2.9) (0.6)\n",
+ "Collecting dill<0.3.9,>=0.3.0 (from datasets->FlagEmbedding==1.2.9)\n",
+ " Downloading dill-0.3.8-py3-none-any.whl (116 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m116.3/116.3 kB\u001b[0m \u001b[31m15.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from datasets->FlagEmbedding==1.2.9) (2.0.3)\n",
+ "Collecting xxhash (from datasets->FlagEmbedding==1.2.9)\n",
+ " Downloading xxhash-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (194 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m194.1/194.1 kB\u001b[0m \u001b[31m11.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting multiprocess (from datasets->FlagEmbedding==1.2.9)\n",
+ " Downloading multiprocess-0.70.16-py310-none-any.whl (134 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m134.8/134.8 kB\u001b[0m \u001b[31m16.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: aiohttp in /usr/local/lib/python3.10/dist-packages (from datasets->FlagEmbedding==1.2.9) (3.9.5)\n",
+ "Collecting huggingface-hub (from accelerate>=0.20.1->FlagEmbedding==1.2.9)\n",
+ " Downloading huggingface_hub-0.23.0-py3-none-any.whl (401 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m401.2/401.2 kB\u001b[0m \u001b[31m14.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: scikit-learn in /usr/local/lib/python3.10/dist-packages (from sentence_transformers->FlagEmbedding==1.2.9) (1.2.2)\n",
+ "Requirement already satisfied: scipy in /usr/local/lib/python3.10/dist-packages (from sentence_transformers->FlagEmbedding==1.2.9) (1.11.4)\n",
+ "Requirement already satisfied: Pillow in /usr/local/lib/python3.10/dist-packages (from sentence_transformers->FlagEmbedding==1.2.9) (9.4.0)\n",
+ "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets->FlagEmbedding==1.2.9) (1.3.1)\n",
+ "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets->FlagEmbedding==1.2.9) (23.2.0)\n",
+ "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets->FlagEmbedding==1.2.9) (1.4.1)\n",
+ "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets->FlagEmbedding==1.2.9) (6.0.5)\n",
+ "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets->FlagEmbedding==1.2.9) (1.9.4)\n",
+ "Requirement already satisfied: async-timeout<5.0,>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets->FlagEmbedding==1.2.9) (4.0.3)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests->transformers>=4.33.0->FlagEmbedding==1.2.9) (3.3.2)\n",
+ "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests->transformers>=4.33.0->FlagEmbedding==1.2.9) (3.7)\n",
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests->transformers>=4.33.0->FlagEmbedding==1.2.9) (2.0.7)\n",
+ "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests->transformers>=4.33.0->FlagEmbedding==1.2.9) (2024.2.2)\n",
+ "Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->torch>=1.6.0->FlagEmbedding==1.2.9) (2.1.5)\n",
+ "Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->datasets->FlagEmbedding==1.2.9) (2.8.2)\n",
+ "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->datasets->FlagEmbedding==1.2.9) (2023.4)\n",
+ "Requirement already satisfied: tzdata>=2022.1 in /usr/local/lib/python3.10/dist-packages (from pandas->datasets->FlagEmbedding==1.2.9) (2024.1)\n",
+ "Requirement already satisfied: joblib>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from scikit-learn->sentence_transformers->FlagEmbedding==1.2.9) (1.4.2)\n",
+ "Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from scikit-learn->sentence_transformers->FlagEmbedding==1.2.9) (3.5.0)\n",
+ "Requirement already satisfied: mpmath>=0.19 in /usr/local/lib/python3.10/dist-packages (from sympy->torch>=1.6.0->FlagEmbedding==1.2.9) (1.3.0)\n",
+ "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->datasets->FlagEmbedding==1.2.9) (1.16.0)\n",
+ "Building wheels for collected packages: FlagEmbedding\n",
+ " Building wheel for FlagEmbedding (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+ " Created wheel for FlagEmbedding: filename=FlagEmbedding-1.2.9-py3-none-any.whl size=165917 sha256=24688c17b3bc6214be93c7bef77d4b9baacde749336b83f600188a704c6d8cad\n",
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-45wml86h/wheels/41/cf/a5/5dee96ed64e5aaffe5aa3d583828258fdefed9a305db6e7f48\n",
+ "Successfully built FlagEmbedding\n",
+ "Installing collected packages: xxhash, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, dill, nvidia-cusparse-cu12, nvidia-cudnn-cu12, multiprocess, huggingface-hub, nvidia-cusolver-cu12, datasets, sentence_transformers, accelerate, FlagEmbedding\n",
+ " Attempting uninstall: huggingface-hub\n",
+ " Found existing installation: huggingface-hub 0.20.3\n",
+ " Uninstalling huggingface-hub-0.20.3:\n",
+ " Successfully uninstalled huggingface-hub-0.20.3\n",
+ "Successfully installed FlagEmbedding-1.2.9 accelerate-0.30.1 datasets-2.19.1 dill-0.3.8 huggingface-hub-0.23.0 multiprocess-0.70.16 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-8.9.2.26 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.19.3 nvidia-nvjitlink-cu12-12.4.127 nvidia-nvtx-cu12-12.1.105 sentence_transformers-2.7.0 xxhash-3.4.1\n",
+ "Collecting llama-index-vector-stores-lancedb\n",
+ " Downloading llama_index_vector_stores_lancedb-0.1.3-py3-none-any.whl (4.1 kB)\n",
+ "Collecting lancedb<0.6.0,>=0.5.1 (from llama-index-vector-stores-lancedb)\n",
+ " Downloading lancedb-0.5.7-py3-none-any.whl (115 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m115.1/115.1 kB\u001b[0m \u001b[31m3.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: llama-index-core<0.11.0,>=0.10.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-vector-stores-lancedb) (0.10.37.post1)\n",
+ "Collecting deprecation (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading deprecation-2.1.0-py2.py3-none-any.whl (11 kB)\n",
+ "Collecting pylance==0.9.18 (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading pylance-0.9.18-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (21.6 MB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m21.6/21.6 MB\u001b[0m \u001b[31m14.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hCollecting ratelimiter~=1.0 (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading ratelimiter-1.2.0.post0-py3-none-any.whl (6.6 kB)\n",
+ "Collecting retry>=0.9.2 (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading retry-0.9.2-py2.py3-none-any.whl (8.0 kB)\n",
+ "Requirement already satisfied: tqdm>=4.27.0 in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (4.66.4)\n",
+ "Requirement already satisfied: pydantic>=1.10 in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (2.7.1)\n",
+ "Requirement already satisfied: attrs>=21.3.0 in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (23.2.0)\n",
+ "Collecting semver>=3.0 (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading semver-3.0.2-py3-none-any.whl (17 kB)\n",
+ "Requirement already satisfied: cachetools in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (5.3.3)\n",
+ "Requirement already satisfied: pyyaml>=6.0 in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (6.0.1)\n",
+ "Requirement already satisfied: click>=8.1.7 in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (8.1.7)\n",
+ "Requirement already satisfied: requests>=2.31.0 in /usr/local/lib/python3.10/dist-packages (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (2.31.0)\n",
+ "Collecting overrides>=0.7 (from lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading overrides-7.7.0-py3-none-any.whl (17 kB)\n",
+ "Requirement already satisfied: pyarrow>=12 in /usr/local/lib/python3.10/dist-packages (from pylance==0.9.18->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (14.0.2)\n",
+ "Requirement already satisfied: numpy>=1.22 in /usr/local/lib/python3.10/dist-packages (from pylance==0.9.18->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (1.25.2)\n",
+ "Requirement already satisfied: SQLAlchemy[asyncio]>=1.4.49 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.0.30)\n",
+ "Requirement already satisfied: aiohttp<4.0.0,>=3.8.6 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.9.5)\n",
+ "Requirement already satisfied: dataclasses-json in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.6.6)\n",
+ "Requirement already satisfied: deprecated>=1.2.9.3 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.2.14)\n",
+ "Requirement already satisfied: dirtyjson<2.0.0,>=1.0.8 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.0.8)\n",
+ "Requirement already satisfied: fsspec>=2023.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2023.6.0)\n",
+ "Requirement already satisfied: httpx in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.27.0)\n",
+ "Requirement already satisfied: jsonpath-ng<2.0.0,>=1.6.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.6.1)\n",
+ "Requirement already satisfied: llamaindex-py-client<0.2.0,>=0.1.18 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.1.19)\n",
+ "Requirement already satisfied: nest-asyncio<2.0.0,>=1.5.8 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.6.0)\n",
+ "Requirement already satisfied: networkx>=3.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.3)\n",
+ "Requirement already satisfied: nltk<4.0.0,>=3.8.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.8.1)\n",
+ "Requirement already satisfied: openai>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.30.1)\n",
+ "Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.0.3)\n",
+ "Requirement already satisfied: pillow>=9.0.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (9.4.0)\n",
+ "Requirement already satisfied: spacy<4.0.0,>=3.7.1 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.7.4)\n",
+ "Requirement already satisfied: tenacity<9.0.0,>=8.2.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (8.3.0)\n",
+ "Requirement already satisfied: tiktoken>=0.3.3 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.7.0)\n",
+ "Requirement already satisfied: typing-extensions>=4.5.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (4.11.0)\n",
+ "Requirement already satisfied: typing-inspect>=0.8.0 in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.9.0)\n",
+ "Requirement already satisfied: wrapt in /usr/local/lib/python3.10/dist-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.14.1)\n",
+ "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.3.1)\n",
+ "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.4.1)\n",
+ "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (6.0.5)\n",
+ "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.9.4)\n",
+ "Requirement already satisfied: async-timeout<5.0,>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp<4.0.0,>=3.8.6->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (4.0.3)\n",
+ "Requirement already satisfied: ply in /usr/local/lib/python3.10/dist-packages (from jsonpath-ng<2.0.0,>=1.6.0->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.11)\n",
+ "Requirement already satisfied: anyio in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.7.1)\n",
+ "Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2024.2.2)\n",
+ "Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.0.5)\n",
+ "Requirement already satisfied: idna in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.7)\n",
+ "Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.3.1)\n",
+ "Requirement already satisfied: h11<0.15,>=0.13 in /usr/local/lib/python3.10/dist-packages (from httpcore==1.*->httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.14.0)\n",
+ "Requirement already satisfied: joblib in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.4.2)\n",
+ "Requirement already satisfied: regex>=2021.8.3 in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.8.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2023.12.25)\n",
+ "Requirement already satisfied: distro<2,>=1.7.0 in /usr/lib/python3/dist-packages (from openai>=1.1.0->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.7.0)\n",
+ "Requirement already satisfied: annotated-types>=0.4.0 in /usr/local/lib/python3.10/dist-packages (from pydantic>=1.10->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (0.6.0)\n",
+ "Requirement already satisfied: pydantic-core==2.18.2 in /usr/local/lib/python3.10/dist-packages (from pydantic>=1.10->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (2.18.2)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests>=2.31.0->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (3.3.2)\n",
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests>=2.31.0->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (2.0.7)\n",
+ "Requirement already satisfied: decorator>=3.4.2 in /usr/local/lib/python3.10/dist-packages (from retry>=0.9.2->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb) (4.4.2)\n",
+ "Collecting py<2.0.0,>=1.4.26 (from retry>=0.9.2->lancedb<0.6.0,>=0.5.1->llama-index-vector-stores-lancedb)\n",
+ " Downloading py-1.11.0-py2.py3-none-any.whl (98 kB)\n",
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m98.7/98.7 kB\u001b[0m \u001b[31m11.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: spacy-legacy<3.1.0,>=3.0.11 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.0.12)\n",
+ "Requirement already satisfied: spacy-loggers<2.0.0,>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.0.5)\n",
+ "Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.0.10)\n",
+ "Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.0.8)\n",
+ "Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.0.9)\n",
+ "Requirement already satisfied: thinc<8.3.0,>=8.2.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (8.2.3)\n",
+ "Requirement already satisfied: wasabi<1.2.0,>=0.9.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.1.2)\n",
+ "Requirement already satisfied: srsly<3.0.0,>=2.4.3 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.4.8)\n",
+ "Requirement already satisfied: catalogue<2.1.0,>=2.0.6 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.0.10)\n",
+ "Requirement already satisfied: weasel<0.4.0,>=0.1.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.3.4)\n",
+ "Requirement already satisfied: typer<0.10.0,>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.9.4)\n",
+ "Requirement already satisfied: smart-open<7.0.0,>=5.2.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (6.4.0)\n",
+ "Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.1.4)\n",
+ "Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (67.7.2)\n",
+ "Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (24.0)\n",
+ "Requirement already satisfied: langcodes<4.0.0,>=3.2.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.4.0)\n",
+ "Requirement already satisfied: greenlet!=0.4.17 in /usr/local/lib/python3.10/dist-packages (from SQLAlchemy[asyncio]>=1.4.49->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.0.3)\n",
+ "Requirement already satisfied: mypy-extensions>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from typing-inspect>=0.8.0->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.0.0)\n",
+ "Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /usr/local/lib/python3.10/dist-packages (from dataclasses-json->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (3.21.2)\n",
+ "Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.8.2)\n",
+ "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2023.4)\n",
+ "Requirement already satisfied: tzdata>=2022.1 in /usr/local/lib/python3.10/dist-packages (from pandas->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2024.1)\n",
+ "Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio->httpx->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.2.1)\n",
+ "Requirement already satisfied: language-data>=1.2 in /usr/local/lib/python3.10/dist-packages (from langcodes<4.0.0,>=3.2.0->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.2.0)\n",
+ "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.16.0)\n",
+ "Requirement already satisfied: blis<0.8.0,>=0.7.8 in /usr/local/lib/python3.10/dist-packages (from thinc<8.3.0,>=8.2.2->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.7.11)\n",
+ "Requirement already satisfied: confection<1.0.0,>=0.0.1 in /usr/local/lib/python3.10/dist-packages (from thinc<8.3.0,>=8.2.2->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.1.4)\n",
+ "Requirement already satisfied: cloudpathlib<0.17.0,>=0.7.0 in /usr/local/lib/python3.10/dist-packages (from weasel<0.4.0,>=0.1.0->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (0.16.0)\n",
+ "Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (2.1.5)\n",
+ "Requirement already satisfied: marisa-trie>=0.7.7 in /usr/local/lib/python3.10/dist-packages (from language-data>=1.2->langcodes<4.0.0,>=3.2.0->spacy<4.0.0,>=3.7.1->llama-index-core<0.11.0,>=0.10.1->llama-index-vector-stores-lancedb) (1.1.1)\n",
+ "Installing collected packages: ratelimiter, semver, py, overrides, deprecation, retry, pylance, lancedb, llama-index-vector-stores-lancedb\n",
+ "Successfully installed deprecation-2.1.0 lancedb-0.5.7 llama-index-vector-stores-lancedb-0.1.3 overrides-7.7.0 py-1.11.0 pylance-0.9.18 ratelimiter-1.2.0.post0 retry-0.9.2 semver-3.0.2\n"
+ ]
+ }
+ ],
+ "source": [
+ "# install dependencies\n",
+ "%pip install llama-index llama-index-core llama-index-embeddings-openai llama-parse\n",
+ "%pip install llama-index-postprocessor-flag-embedding-reranker\n",
+ "%pip install git+https://github.com/FlagOpen/FlagEmbedding.git\n",
+ "%pip install llama-index-vector-stores-lancedb\n",
+ "%pip install --upgrade --quiet langchain langchain-community langchainhub langchain-openai langchain-chroma bs4 lancedb\n",
+ "%pip install unstructured"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "NqK1g8Bg7zlZ"
+ },
+ "outputs": [],
+ "source": [
+ "# llama-parse is async-first, running the async code in a notebook requires the use of nest_asyncio\n",
+ "import os\n",
+ "import nest_asyncio\n",
+ "\n",
+ "nest_asyncio.apply()\n",
+ "\n",
+ "# API access to llama-cloud\n",
+ "os.environ[\"LLAMA_CLOUD_API_KEY\"] = \"llx-...\"\n",
+ "# Using OpenAI API for embeddings/llms\n",
+ "os.environ[\"OPENAI_API_KEY\"] = \"sk-proj-...\""
+ ]
+ },
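+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Optionally (this cell is an addition, not part of the original run), the keys can be read from Colab's Secrets tab instead of being hardcoded in the notebook."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Optional sketch: pull the API keys from Colab secrets (add LLAMA_CLOUD_API_KEY and\n",
+ "# OPENAI_API_KEY under the key icon in the left sidebar first).\n",
+ "try:\n",
+ " from google.colab import userdata\n",
+ "\n",
+ " os.environ[\"LLAMA_CLOUD_API_KEY\"] = userdata.get(\"LLAMA_CLOUD_API_KEY\")\n",
+ " os.environ[\"OPENAI_API_KEY\"] = userdata.get(\"OPENAI_API_KEY\")\n",
+ "except Exception:\n",
+ " pass # fall back to the keys set above"
+ ]
+ },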
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "4OmWRDtAKONC"
+ },
+ "source": [
+ "### Download the PDF (contains both tables & text)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "smCjT2FIj9Fo"
+ },
+ "outputs": [],
+ "source": [
+ "!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/10q/uber_10q_march_2022.pdf' -O './uber_10q_march_2022.pdf'"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "1I--ouSiTGvj"
+ },
+ "source": [
+ "# 1. Langchain with Q&A on PDF"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "4ysMDhHiR2bG"
+ },
+ "outputs": [],
+ "source": [
+ "import bs4\n",
+ "from langchain import hub\n",
+ "from langchain_community.document_loaders import WebBaseLoader\n",
+ "from langchain_openai import ChatOpenAI\n",
+ "from langchain_community.document_loaders import PyPDFLoader\n",
+ "from langchain.vectorstores import LanceDB\n",
+ "from langchain_core.output_parsers import StrOutputParser\n",
+ "from langchain_core.runnables import RunnablePassthrough\n",
+ "from langchain_openai import OpenAIEmbeddings\n",
+ "from langchain_text_splitters import RecursiveCharacterTextSplitter"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 54
+ },
+ "id": "4emkzsCqSMTe",
+ "outputId": "d122f7f4-9072-4a4f-acf8-3d3b39755328"
+ },
+ "outputs": [
+ {
+ "data": {
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ },
+ "text/plain": [
+ "'The net loss value attributable to Uber Technologies, Inc. for the period was $5.9 billion, compared to $108 million in the same period the previous year. This represents a significant increase in net loss year-over-year.'"
+ ]
+ },
+ "execution_count": 9,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "llm = ChatOpenAI(model=\"gpt-3.5-turbo-0125\")\n",
+ "\n",
+ "loader = PyPDFLoader(\"/content/uber_10q_march_2022.pdf\")\n",
+ "docs = loader.load()\n",
+ "text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)\n",
+ "splits = text_splitter.split_documents(docs)\n",
+ "vectorstore = LanceDB.from_documents(documents=splits, embedding=OpenAIEmbeddings())\n",
+ "\n",
+ "# Retrieve and generate using the relevant snippets of the blog.\n",
+ "retriever = vectorstore.as_retriever()\n",
+ "prompt = hub.pull(\"rlm/rag-prompt\")\n",
+ "\n",
+ "\n",
+ "def format_docs(docs):\n",
+ " return \"\\n\\n\".join(doc.page_content for doc in docs)\n",
+ "\n",
+ "\n",
+ "rag_chain = (\n",
+ " {\"context\": retriever | format_docs, \"question\": RunnablePassthrough()}\n",
+ " | prompt\n",
+ " | llm\n",
+ " | StrOutputParser()\n",
+ ")\n",
+ "\n",
+ "qa_langchain_query1 = (\n",
+ " \" what is the net loss value attributable to Uber compared to last year?\"\n",
+ ")\n",
+ "rag_chain.invoke(qa_langchain_query1)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 36
+ },
+ "id": "G4qR6GxTSMWO",
+ "outputId": "38fff0fe-377f-4b10-aa4f-362712e539d9"
+ },
+ "outputs": [
+ {
+ "data": {
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ },
+ "text/plain": [
+ "\"I don't know.\""
+ ]
+ },
+ "execution_count": 10,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "qa_langchain_query2 = \"how is the Cash paid for Income taxes, net of refunds from Supplemental disclosures of cash flow information?\"\n",
+ "rag_chain.invoke(qa_langchain_query2)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 36
+ },
+ "id": "b3DM-lCnSMZG",
+ "outputId": "4c97e6f3-d610-447f-ba4f-62715ae274a1"
+ },
+ "outputs": [
+ {
+ "data": {
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ },
+ "text/plain": [
+ "\"I don't have detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022.\""
+ ]
+ },
+ "execution_count": 11,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "qa_langchain_query3 = \"give me detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022\"\n",
+ "rag_chain.invoke(qa_langchain_query3)"
+ ]
+ },
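+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Before switching frameworks, a quick sanity check (this cell is an addition, not part of the original run): look at which chunks the LanceDB retriever returns for the failing income-tax question. If the relevant table rows never reach the context, the LLM cannot answer from it."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Added check: print the first few hundred characters of each retrieved chunk\n",
+ "# for the income-tax question.\n",
+ "retrieved = retriever.invoke(qa_langchain_query2)\n",
+ "for i, d in enumerate(retrieved):\n",
+ " print(f\"--- chunk {i} ---\")\n",
+ " print(d.page_content[:300], \"\\n\")"
+ ]
+ },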
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "pwZzEShGwwvT"
+ },
+ "source": [
+ "FOR QUERY 2 & QUERY 3 we are not getting the answer\n",
+ "\n",
+ "**LETS TRY LLAMAINDEX**"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "GVD2sPBEcRE3"
+ },
+ "source": [
+ "# 2 . Llamaindex with Q&A on PDF"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "H__qJIWYdmgY"
+ },
+ "outputs": [],
+ "source": [
+ "import textwrap\n",
+ "from llama_index.vector_stores.lancedb import LanceDBVectorStore\n",
+ "from llama_index.core import SimpleDirectoryReader, Document, StorageContext\n",
+ "from llama_index.core import VectorStoreIndex\n",
+ "from llama_index.llms.openai import OpenAI\n",
+ "from llama_index.embeddings.openai import OpenAIEmbedding\n",
+ "from llama_index.core import VectorStoreIndex\n",
+ "from llama_index.core import SimpleDirectoryReader\n",
+ "from llama_index.postprocessor.flag_embedding_reranker import (\n",
+ " FlagEmbeddingReranker,\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "BCNNrAw9Dklk"
+ },
+ "outputs": [],
+ "source": [
+ "from llama_index.vector_stores.lancedb import LanceDBVectorStore\n",
+ "\n",
+ "vector_store_pdf = LanceDBVectorStore(uri=\"/tmp/lancedb_lamaindex\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 393,
+ "referenced_widgets": [
+ "0ab0af38e5c54405a48fd40de7bfe606",
+ "17300f8bea7644f09907376fab719f92",
+ "f2bbc601684446b5be521833592a319f",
+ "326bb35286ba4d23a5e3b8b364c7a44f",
+ "d19be0dd23344eb2887605d88f80c142",
+ "2e3f44b38e6842a9bd9db4c5e6cc6bcf",
+ "738a133fb9204208913431c2a35fef5c",
+ "d407aa9ec48347d9b5536a69fe9f5dc7",
+ "d779e07ce42c4f088dea05f8fd22922f",
+ "1700dc0afbb14e4cb85da9eedfca8605",
+ "498b64a4f33046c9a65b5a5486376359",
+ "f7de0a6aa31c41de90f45e862fe4ce15",
+ "a034635cd3d94f8b89b56e6b972daa63",
+ "5bec1c5f11434bf481862e67f73403f0",
+ "bd254eea803c4e44b60783f05c5c1ca1",
+ "1bbdd18be2e74c7f8916c95a1e5cf511",
+ "0f2a76e74d74459182402f92fbb946c9",
+ "d3d91529e30a4d67881a782d7eb54060",
+ "b69deb8741ad47229438548949e2da42",
+ "f894d72c0e19494abb732b2db46ced98",
+ "6fe509c6157e4ddbb1cfcf319cf04fac",
+ "de859b9a94684d39ac8a7aef81058a3b",
+ "09f4df92faa844b28fef90c3620b09e1",
+ "d0a4ce2181c84af8b277058689a35641",
+ "faaff37bf60e4870a9cbebd20d3afc6e",
+ "f55e2cf0eb34482da1902adb652fca5a",
+ "55ee9cd902c147ae97c4deea101f25e8",
+ "f3f8c3dc8a404bd4af55b105f98267d7",
+ "e13aaa64b9c848f9a67f61295fb546fe",
+ "2a6b2dacce79419b9a2469b6362abca8",
+ "3c63c0a7eac54b0fb72775e3fb6ba891",
+ "61b79e900bfb4b849cfa740236cf6e44",
+ "57ae83a64b224117a6d01573d3437540",
+ "e9394568881e456c99758e52813428f6",
+ "8779121983404d48ab560c1e0d2300f9",
+ "48550e72c11c4da7985434d6df76df28",
+ "62c949dbc1714614b00254b3b3ee80c1",
+ "1c7cdfc1b5ec488dbb08e75b5c89dc0e",
+ "12eb26078d544f3abe72e91835cc14fc",
+ "77e3dff7c22343848b9ad097e47b16df",
+ "cb5a5f4ddfe14e37a6807a74005b01f3",
+ "9492f5081dd841099c26140b03ef791a",
+ "86d744b64d604e388d3aadfbbf84eb1c",
+ "5eda98429678436daa918640f2193bd9",
+ "2ff3c80449634437b0d88f9a87932b14",
+ "69a388720e4346e6a98e77c06acb1089",
+ "22407e9553af4c00a93aa9085ca10b67",
+ "edd1f2ad4cca4cb093f3d2ab09ebbe85",
+ "bfb0b78895c34af1abf9a8d669c24aeb",
+ "356acddfa87f4c1b808245a7880a2ff4",
+ "f8502cc7258f4efeb43d9e2b0b5e91b2",
+ "1d645f1f896140158cc30061c0e67080",
+ "14467a4613e74d819502803825959e5e",
+ "cdc69d3551bd4bb183d30361b164d0ac",
+ "17173ad4ef464e9fb4611c2c9c611736",
+ "edb9212d6a3540b19de36e895b6179e7",
+ "edada2634ba54b9290fe904edc9905f0",
+ "1447b1135b994f14866cd17a34124ff6",
+ "9b8c980b418146e992ec2e5cfde8bf6b",
+ "45793d371304439db504cb04d4913911",
+ "2b190420fb764d06bd468c2914773fa8",
+ "c4829c7eed1541129e4944ad7784b43d",
+ "74e2c433490840a0a4caa296bd3521f0",
+ "7d690fb215c64c7a81cdce7156456ca2",
+ "cd48bd077b9c4a7b822a79fa66b65031",
+ "b246fdb13dc84c02b266565bb75e621f"
+ ]
+ },
+ "id": "1BLK8QPhcyMh",
+ "outputId": "fd0df920-77f7-453f-8b06-3cb35e054467"
+ },
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.\n",
+ " warnings.warn(\n",
+ "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py:89: UserWarning: \n",
+ "The secret `HF_TOKEN` does not exist in your Colab secrets.\n",
+ "To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.\n",
+ "You will be able to reuse this secret in all of your notebooks.\n",
+ "Please note that authentication is recommended but still optional to access public models or datasets.\n",
+ " warnings.warn(\n"
+ ]
+ },
+ {
+ "data": {
+ "application/vnd.jupyter.widget-view+json": {
+ "model_id": "0ab0af38e5c54405a48fd40de7bfe606",
+ "version_major": 2,
+ "version_minor": 0
+ },
+ "text/plain": [
+ "tokenizer_config.json: 0%| | 0.00/443 [00:00, ?B/s]"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "application/vnd.jupyter.widget-view+json": {
+ "model_id": "f7de0a6aa31c41de90f45e862fe4ce15",
+ "version_major": 2,
+ "version_minor": 0
+ },
+ "text/plain": [
+ "sentencepiece.bpe.model: 0%| | 0.00/5.07M [00:00, ?B/s]"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "application/vnd.jupyter.widget-view+json": {
+ "model_id": "09f4df92faa844b28fef90c3620b09e1",
+ "version_major": 2,
+ "version_minor": 0
+ },
+ "text/plain": [
+ "tokenizer.json: 0%| | 0.00/17.1M [00:00, ?B/s]"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "application/vnd.jupyter.widget-view+json": {
+ "model_id": "e9394568881e456c99758e52813428f6",
+ "version_major": 2,
+ "version_minor": 0
+ },
+ "text/plain": [
+ "special_tokens_map.json: 0%| | 0.00/279 [00:00, ?B/s]"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "application/vnd.jupyter.widget-view+json": {
+ "model_id": "2ff3c80449634437b0d88f9a87932b14",
+ "version_major": 2,
+ "version_minor": 0
+ },
+ "text/plain": [
+ "config.json: 0%| | 0.00/801 [00:00, ?B/s]"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "data": {
+ "application/vnd.jupyter.widget-view+json": {
+ "model_id": "edb9212d6a3540b19de36e895b6179e7",
+ "version_major": 2,
+ "version_minor": 0
+ },
+ "text/plain": [
+ "model.safetensors: 0%| | 0.00/2.24G [00:00, ?B/s]"
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "$22\n"
+ ]
+ }
+ ],
+ "source": [
+ "from llama_index.core import SimpleDirectoryReader\n",
+ "from llama_index.postprocessor.flag_embedding_reranker import (\n",
+ " FlagEmbeddingReranker,\n",
+ ")\n",
+ "\n",
+ "reader = SimpleDirectoryReader(input_dir=\"/content/data_pdf/\")\n",
+ "\n",
+ "documents_pdf_loader = reader.load_data()\n",
+ "\n",
+ "from llama_index.vector_stores.lancedb import LanceDBVectorStore\n",
+ "\n",
+ "vector_store_pdf = LanceDBVectorStore(uri=\"/tmp/lancedb_lamaindex\")\n",
+ "storage_context_pdf = StorageContext.from_defaults(vector_store=vector_store_pdf)\n",
+ "lance_index_pdf = VectorStoreIndex.from_documents(\n",
+ " documents_pdf_loader, storage_context=storage_context_pdf\n",
+ ")\n",
+ "\n",
+ "\n",
+ "reranker = FlagEmbeddingReranker(\n",
+ " top_n=5,\n",
+ " model=\"BAAI/bge-reranker-large\",\n",
+ ")\n",
+ "\n",
+ "Lance_index_query_pdf = lance_index_pdf.as_query_engine(\n",
+ " similarity_top_k=10, node_postprocessors=[reranker]\n",
+ ")\n",
+ "\n",
+ "qa_lama_query1 = \"how is the Cash paid for Income taxes, net of refunds from Supplemental disclosures of cash flow information?\"\n",
+ "output1 = Lance_index_query_pdf.query(qa_lama_query1)\n",
+ "print(output1.response)"
+ ]
+ },
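+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As an added aside (not part of the original run), you can inspect the reranked source nodes behind a response to see which chunks the `BAAI/bge-reranker-large` reranker kept and how it scored them."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Added: peek at the reranked source nodes behind output1.\n",
+ "for nws in output1.source_nodes:\n",
+ " score = 0.0 if nws.score is None else nws.score\n",
+ " print(round(score, 3), nws.node.get_content()[:120].replace(\"\\n\", \" \"))"
+ ]
+ },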
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "oCt6P6WCfTVV",
+ "outputId": "99bf8c85-56d8-4df5-d2b8-bf2b4439eeb9"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "The net loss attributable to Uber Technologies, Inc. was $5.9 billion in the current period, compared to a net loss of $108 million in the same period last year.\n"
+ ]
+ }
+ ],
+ "source": [
+ "# query\n",
+ "qa_lama_query2 = (\n",
+ " \" what is the net loss value attributable to Uber compared to last year?\"\n",
+ ")\n",
+ "output2 = Lance_index_query_pdf.query(qa_lama_query2)\n",
+ "print(output2.response)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "M5RqjSoVcyVh",
+ "outputId": "1b6bc4d9-1dd2-4f34-a147-33323a2906a5"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "The detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022 are as follows:\n",
+ "\n",
+ "**As of December 31, 2021:**\n",
+ "- Consumer, Merchant and other relationships: $1,574 million\n",
+ "- Developed technology: $653 million\n",
+ "- Trade names and trademarks: $175 million\n",
+ "- Patents: $8 million\n",
+ "- Other: $2 million\n",
+ "- Total Intangible assets: $2,412 million\n",
+ "\n",
+ "**As of March 31, 2022:**\n",
+ "- Consumer, Merchant and other relationships: $1,494 million\n",
+ "- Developed technology: $599 million\n",
+ "- Trade names and trademarks: $167 million\n",
+ "- Patents: $7 million\n",
+ "- Other: $2 million\n",
+ "- Total Intangible assets: $2,269 million\n"
+ ]
+ }
+ ],
+ "source": [
+ "# query\n",
+ "qa_lama_query3 = \"give me detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022\"\n",
+ "output3 = Lance_index_query_pdf.query(qa_lama_query3)\n",
+ "print(output3.response)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "-wbk2rFy0iOZ",
+ "outputId": "f5b59d01-e78f-477c-bbb7-f4210c6b7232"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Adjusted EBITDA for 2021 was a loss of $359 million, while for 2022 it improved to $168 million. Interest expense for the period increased from $115 million in 2021 to $129 million in 2022.\n"
+ ]
+ }
+ ],
+ "source": [
+ "qa_lama_query4 = \"what is Adjusted EBITDA 2021 vs 2022 ? what is intreset expense\"\n",
+ "output2 = Lance_index_query_pdf.query(qa_lama_query4)\n",
+ "print(output2.response)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "zFgLYsCkcmUH"
+ },
+ "source": [
+ "# 3 Llamaparser with Langchain on PDF\n",
+ "\n",
+ "we are simply saving all llamaparser output in .md file & based on that we are doing Q& A. there are better methods also to add llamaparser with Langchain **but lets do this experiment**"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "QRne5LQsKugl",
+ "outputId": "6ffe8b7e-41ac-4d9c-8993-b23db079b3d5"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Started parsing the file under job_id 6aef2258-0e5c-4796-8ea1-f82e817cb542\n"
+ ]
+ }
+ ],
+ "source": [
+ "import os\n",
+ "from llama_parse import LlamaParse\n",
+ "\n",
+ "# Ensure the data folder exists\n",
+ "if not os.path.exists(\"data\"):\n",
+ " os.makedirs(\"data\")\n",
+ "\n",
+ "# Load data using LlamaParse\n",
+ "documents_LlamaParse = LlamaParse(result_type=\"markdown\").load_data(\n",
+ " \"/content/uber_10q_march_2022.pdf\"\n",
+ ")\n",
+ "\n",
+ "# Open the file in append mode ('a') and write the content\n",
+ "with open(\"data/output.md\", \"a\") as f: # Open the file in append mode ('a')\n",
+ " for doc in documents_LlamaParse:\n",
+ " f.write(doc.text + \"\\n\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "Jg1y4lO58FDB"
+ },
+ "outputs": [],
+ "source": [
+ "import bs4\n",
+ "from langchain import hub\n",
+ "from langchain_community.document_loaders import WebBaseLoader\n",
+ "from langchain_chroma import Chroma\n",
+ "from langchain_core.output_parsers import StrOutputParser\n",
+ "from langchain_core.runnables import RunnablePassthrough\n",
+ "from langchain_openai import OpenAIEmbeddings\n",
+ "from langchain_text_splitters import RecursiveCharacterTextSplitter"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "QENCaglMjher",
+ "outputId": "bb1513e4-33c0-46cb-937b-c2700a796fd3"
+ },
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "\n",
+ " 0%| | 0/1 [00:00, ?it/s]\u001b[A[nltk_data] Downloading package averaged_perceptron_tagger to\n",
+ "[nltk_data] /root/nltk_data...\n",
+ "[nltk_data] Unzipping taggers/averaged_perceptron_tagger.zip.\n",
+ "\n",
+ "100%|██████████| 1/1 [00:12<00:00, 12.93s/it]\n"
+ ]
+ }
+ ],
+ "source": [
+ "#\n",
+ "from langchain_openai import ChatOpenAI\n",
+ "from langchain_community.document_loaders import DirectoryLoader\n",
+ "\n",
+ "llm = ChatOpenAI(model=\"gpt-3.5-turbo-0125\")\n",
+ "\n",
+ "loader = DirectoryLoader(\"/content/data\", glob=\"**/*.md\", show_progress=True)\n",
+ "documents = loader.load()\n",
+ "\n",
+ "text_splitter = RecursiveCharacterTextSplitter(chunk_size=2000, chunk_overlap=300)\n",
+ "docs = text_splitter.split_documents(documents)\n",
+ "\n",
+ "\n",
+ "# print(docs[])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "-SoJIIOvoDzr",
+ "outputId": "030722f4-b1ff-4cba-8827-4cb2d09ffb83"
+ },
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "Document(page_content='Document\\n\\nUNITED STATES SECURITIES AND EXCHANGE COMMISSION Washington, D.C. 20549\\n\\nFORM 10-Q\\n\\n(Mark One)\\n\\nQUARTERLY REPORT PURSUANT TO SECTION 13 OR 15(d) OF THE SECURITIES EXCHANGE ACT OF 1934 For the quarterly period ended March 31, 2022\\n\\nCommission File Number: 001-38902\\n\\nUBER TECHNOLOGIES, INC.\\n\\n(Exact name of registrant as specified in its charter)\\n\\nDelaware 45-2647441\\n\\n(State or other jurisdiction of incorporation or organization) 1515 3rd Street (I.R.S. Employer Identification No.)\\n\\nSan Francisco, California 94158\\n\\n(Address of principal executive offices, including zip code) (415) 612-8582\\n\\n(Registrant’s telephone number, including area code)\\n\\nSecurities registered pursuant to Section 12(b) of the Act:\\n\\nTitle of each class Trading Symbol(s) Name of each exchange on which registered Common Stock, par value $0.00001 per share UBER New York Stock Exchange\\n\\nIndicate by check mark whether the registrant (1) has filed all reports required to be filed by Section 13 or 15(d) of the Securities Exchange Act of 1934 during the preceding 12 months (or for such shorter period that the registrant was required to file such reports), and (2) has been subject to such filing requirements for the past 90 days. Yes ☒ No ☐\\n\\nIndicate by check mark whether the registrant has submitted electronically every Interactive Data File required to be submitted pursuant to Rule 405 of Regulation S-T (§232.405 of this chapter) during the preceding 12 months (or for such shorter period that the registrant was required to submit such files). Yes ☒ No ☐\\n\\nIndicate by check mark whether the registrant is a large accelerated filer, an accelerated filer, a non-accelerated filer, a smaller reporting company, or an emerging growth company. See the definitions of “large accelerated filer,” “accelerated filer,” “smaller reporting company,” and “emerging growth company” in Rule 12b-2 of the Exchange Act.', metadata={'source': '/content/data/output.md'})"
+ ]
+ },
+ "execution_count": 13,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "docs[0]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "gIOg35WpjEm2"
+ },
+ "outputs": [],
+ "source": [
+ "from langchain.vectorstores import LanceDB\n",
+ "\n",
+ "\n",
+ "vectorstore = LanceDB.from_documents(documents=docs, embedding=OpenAIEmbeddings())\n",
+ "\n",
+ "# Retrieve and generate using the relevant snippets of the blog.\n",
+ "retriever = vectorstore.as_retriever()\n",
+ "prompt = hub.pull(\"rlm/rag-prompt\")\n",
+ "\n",
+ "\n",
+ "def format_docs(docs):\n",
+ " return \"\\n\\n\".join(doc.page_content for doc in docs)\n",
+ "\n",
+ "\n",
+ "rag_chain_lama = (\n",
+ " {\"context\": retriever | format_docs, \"question\": RunnablePassthrough()}\n",
+ " | prompt\n",
+ " | llm\n",
+ " | StrOutputParser()\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 54
+ },
+ "id": "_W6S8msJjEq5",
+ "outputId": "1e9ce002-3278-4c5f-b0be-6fe592004b58"
+ },
+ "outputs": [
+ {
+ "data": {
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ },
+ "text/plain": [
+ "'The net loss attributable to Uber Technologies, Inc. for the first quarter of 2022 was $5.9 billion, compared to a net loss of $108 million in the same period in 2021. This represents a significant increase in net loss compared to the previous year.'"
+ ]
+ },
+ "execution_count": 16,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "query1 = \" what is the net loss value attributable to Uber compared to last year?\"\n",
+ "rag_chain_lama.invoke(query1)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 36
+ },
+ "id": "_2Q_n_-wjEtV",
+ "outputId": "a0e8db08-a688-4c8d-9025-e8808cf80d7a"
+ },
+ "outputs": [
+ {
+ "data": {
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ },
+ "text/plain": [
+ "'The Cash paid for Income taxes, net of refunds is not specifically mentioned in the provided context.'"
+ ]
+ },
+ "execution_count": 17,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "# query\n",
+ "query2 = \"how is the Cash paid for Income taxes, net of refunds from Supplemental disclosures of cash flow information?\"\n",
+ "rag_chain_lama.invoke(query2)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/",
+ "height": 90
+ },
+ "id": "0z8sapoziX7f",
+ "outputId": "ae0e0d32-330d-4aa3-f274-a4ad455db430"
+ },
+ "outputs": [
+ {
+ "data": {
+ "application/vnd.google.colaboratory.intrinsic+json": {
+ "type": "string"
+ },
+ "text/plain": [
+ "'Detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022, are as follows:\\n\\nAs of December 31, 2021:\\n- Consumer, Merchant and other relationships: $1,494 million\\n- Developed technology: $599 million\\n- Trade names and trademarks: $167 million\\n- Patents: $7 million\\n- Other: $2 million\\n\\nAs of March 31, 2022:\\n- Consumer, Merchant and other relationships: $1,574 million\\n- Developed technology: $653 million\\n- Trade names and trademarks: $175 million\\n- Patents: $8 million\\n- Other: $2 million'"
+ ]
+ },
+ "execution_count": 18,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "# query\n",
+ "query3 = \"give me detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022\"\n",
+ "rag_chain_lama.invoke(query3)"
+ ]
+ },
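+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As mentioned at the start of this section, writing the parsed output to a `.md` file is only the quickest hack. A cleaner alternative, shown as a sketch below (not executed above), is to wrap the LlamaParse documents directly as LangChain `Document` objects and skip the intermediate file."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Sketch: feed LlamaParse output straight into LangChain without the .md detour.\n",
+ "from langchain_core.documents import Document as LCDocument\n",
+ "\n",
+ "lc_docs = [\n",
+ " LCDocument(page_content=d.text, metadata={\"source\": \"uber_10q_march_2022.pdf\"})\n",
+ " for d in documents_LlamaParse\n",
+ "]\n",
+ "lc_splits = text_splitter.split_documents(lc_docs)\n",
+ "# lc_splits can then go into LanceDB.from_documents(...) exactly as above."
+ ]
+ },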
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "LTGUyJwxeF4Y"
+ },
+ "source": [
+ "Till now, We are not getting all answers but now lets try the llamaparser with Llamaindex to see the results"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "MqzD4Rh4eCrE"
+ },
+ "source": [
+ "# 4. Llamaparse with Llamaindex on PDF"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "h2lnfckunHLc",
+ "outputId": "a248807f-aaa3-4d45-afd9-e9365b9eb6ac"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Started parsing the file under job_id 69ea99de-dd19-46db-86d5-e03d2027a7ce\n"
+ ]
+ },
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.\n",
+ " warnings.warn(\n"
+ ]
+ },
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Adjusted EBITDA improved by $527 million from a loss of $359 million in 2021 to $168 million in 2022. Interest expense increased by an immaterial amount.\n"
+ ]
+ }
+ ],
+ "source": [
+ "# using llamaparser with LlamaIndex\n",
+ "\n",
+ "from llama_index.postprocessor.flag_embedding_reranker import (\n",
+ " FlagEmbeddingReranker,\n",
+ ")\n",
+ "\n",
+ "from llama_parse import LlamaParse\n",
+ "\n",
+ "pdf_table_LlamaParse = LlamaParse(result_type=\"markdown\").load_data(\n",
+ " \"/content/data_pdf/uber_10q_march_2022.pdf\"\n",
+ ")\n",
+ "\n",
+ "vector_store_lamaparser = LanceDBVectorStore(uri=\"/tmp/lancedb_parser\")\n",
+ "storage_context_lamaparser = StorageContext.from_defaults(\n",
+ " vector_store=vector_store_lamaparser\n",
+ ")\n",
+ "lance_index_lamaparser = VectorStoreIndex.from_documents(\n",
+ " pdf_table_LlamaParse, storage_context=storage_context_lamaparser\n",
+ ")\n",
+ "\n",
+ "reranker = FlagEmbeddingReranker(\n",
+ " top_n=5,\n",
+ " model=\"BAAI/bge-reranker-large\",\n",
+ ")\n",
+ "\n",
+ "Lance_index_query_lamaparser = lance_index_lamaparser.as_query_engine(\n",
+ " similarity_top_k=10, node_postprocessors=[reranker]\n",
+ ")\n",
+ "\n",
+ "\n",
+ "query_parser1 = \"what is Adjusted EBITDA 2021 vs 2022 ? what is intreset expense\"\n",
+ "\n",
+ "response_1 = Lance_index_query_lamaparser.query(query_parser1)\n",
+ "print(response_1)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "-ET0OUt1y_hj",
+ "outputId": "b8347392-e54e-400f-af35-363a5eb87b6a"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "| |Gross Carrying Value|Accumulated Amortization|Net Carrying Value|Useful Life - Years|\n",
+ "|---|---|---|---|---|\n",
+ "|Consumer, Merchant and other relationships|$1,868|$(294)|$1,574|9|\n",
+ "|Developed technology|$922|$(269)|$653|5|\n",
+ "|Trade names and trademarks|$222|$(47)|$175|6|\n",
+ "|Patents|$15|$(7)|$8|7|\n",
+ "|Other|$5|$(3)|$2|0|\n",
+ "\n",
+ "| |Gross Carrying Value|Accumulated Amortization|Net Carrying Value|Useful Life - Years|\n",
+ "|---|---|---|---|---|\n",
+ "|Consumer, Merchant and other relationships|$1,856|$(362)|$1,494|9|\n",
+ "|Developed technology|$924|$(325)|$599|5|\n",
+ "|Trade names and trademarks|$222|$(55)|$167|6|\n",
+ "|Patents|$15|$(8)|$7|6|\n",
+ "|Other|$5|$(3)|$2|0|\n"
+ ]
+ }
+ ],
+ "source": [
+ "query_parser2 = \"give me detailed charts of intangible assets, net as of December 31, 2021 and March 31, 2022\"\n",
+ "\n",
+ "response_1 = Lance_index_query_lamaparser.query(query_parser2)\n",
+ "print(response_1)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "a1zhm2CxRykD",
+ "outputId": "3243c11c-d789-41f8-ed3f-7c02ea5edae0"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "\n",
+ "***********New pdf+ lancedb ***********\n",
+ "The cash paid for income taxes, net of refunds, was $22 million for the three months ended March 31, 2021, and $41 million for the three months ended March 31, 2022.\n"
+ ]
+ }
+ ],
+ "source": [
+ "qa_lama_query4"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "rXTsFl7_efUJ"
+ },
+ "source": [
+ "**WOW 🤩 the answers are more clear & better compaired to other methods .thats power of Llamaparser .this Llamaparser with LlamaIndex is doing quite well on table & text data from PDF.**\n"
+ ]
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "provenance": []
+ },
+ "kernelspec": {
+ "display_name": "Python 3",
+ "name": "python3"
+ },
+ "language_info": {
+ "name": "python"
+ },
+ "widgets": {
+ "application/vnd.jupyter.widget-state+json": {
+ "09f4df92faa844b28fef90c3620b09e1": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HBoxModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_d0a4ce2181c84af8b277058689a35641",
+ "IPY_MODEL_faaff37bf60e4870a9cbebd20d3afc6e",
+ "IPY_MODEL_f55e2cf0eb34482da1902adb652fca5a"
+ ],
+ "layout": "IPY_MODEL_55ee9cd902c147ae97c4deea101f25e8"
+ }
+ },
+ "0ab0af38e5c54405a48fd40de7bfe606": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HBoxModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_17300f8bea7644f09907376fab719f92",
+ "IPY_MODEL_f2bbc601684446b5be521833592a319f",
+ "IPY_MODEL_326bb35286ba4d23a5e3b8b364c7a44f"
+ ],
+ "layout": "IPY_MODEL_d19be0dd23344eb2887605d88f80c142"
+ }
+ },
+ "0f2a76e74d74459182402f92fbb946c9": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "12eb26078d544f3abe72e91835cc14fc": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "14467a4613e74d819502803825959e5e": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "ProgressStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "1447b1135b994f14866cd17a34124ff6": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "FloatProgressModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_74e2c433490840a0a4caa296bd3521f0",
+ "max": 2239618772,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_7d690fb215c64c7a81cdce7156456ca2",
+ "value": 2239618772
+ }
+ },
+ "1700dc0afbb14e4cb85da9eedfca8605": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "17173ad4ef464e9fb4611c2c9c611736": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "17300f8bea7644f09907376fab719f92": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_2e3f44b38e6842a9bd9db4c5e6cc6bcf",
+ "placeholder": "",
+ "style": "IPY_MODEL_738a133fb9204208913431c2a35fef5c",
+ "value": "tokenizer_config.json: 100%"
+ }
+ },
+ "1bbdd18be2e74c7f8916c95a1e5cf511": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "1c7cdfc1b5ec488dbb08e75b5c89dc0e": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "1d645f1f896140158cc30061c0e67080": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "22407e9553af4c00a93aa9085ca10b67": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "FloatProgressModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_1d645f1f896140158cc30061c0e67080",
+ "max": 801,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_14467a4613e74d819502803825959e5e",
+ "value": 801
+ }
+ },
+ "2a6b2dacce79419b9a2469b6362abca8": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "2b190420fb764d06bd468c2914773fa8": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "2e3f44b38e6842a9bd9db4c5e6cc6bcf": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "2ff3c80449634437b0d88f9a87932b14": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HBoxModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_69a388720e4346e6a98e77c06acb1089",
+ "IPY_MODEL_22407e9553af4c00a93aa9085ca10b67",
+ "IPY_MODEL_edd1f2ad4cca4cb093f3d2ab09ebbe85"
+ ],
+ "layout": "IPY_MODEL_bfb0b78895c34af1abf9a8d669c24aeb"
+ }
+ },
+ "326bb35286ba4d23a5e3b8b364c7a44f": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_1700dc0afbb14e4cb85da9eedfca8605",
+ "placeholder": "",
+ "style": "IPY_MODEL_498b64a4f33046c9a65b5a5486376359",
+ "value": " 443/443 [00:00<00:00, 11.3kB/s]"
+ }
+ },
+ "356acddfa87f4c1b808245a7880a2ff4": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "3c63c0a7eac54b0fb72775e3fb6ba891": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "ProgressStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "45793d371304439db504cb04d4913911": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "48550e72c11c4da7985434d6df76df28": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "FloatProgressModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_cb5a5f4ddfe14e37a6807a74005b01f3",
+ "max": 279,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_9492f5081dd841099c26140b03ef791a",
+ "value": 279
+ }
+ },
+ "498b64a4f33046c9a65b5a5486376359": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "55ee9cd902c147ae97c4deea101f25e8": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "57ae83a64b224117a6d01573d3437540": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "5bec1c5f11434bf481862e67f73403f0": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "FloatProgressModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_b69deb8741ad47229438548949e2da42",
+ "max": 5069051,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_f894d72c0e19494abb732b2db46ced98",
+ "value": 5069051
+ }
+ },
+ "5eda98429678436daa918640f2193bd9": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "61b79e900bfb4b849cfa740236cf6e44": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "62c949dbc1714614b00254b3b3ee80c1": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_86d744b64d604e388d3aadfbbf84eb1c",
+ "placeholder": "",
+ "style": "IPY_MODEL_5eda98429678436daa918640f2193bd9",
+ "value": " 279/279 [00:00<00:00, 4.85kB/s]"
+ }
+ },
+ "69a388720e4346e6a98e77c06acb1089": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_356acddfa87f4c1b808245a7880a2ff4",
+ "placeholder": "",
+ "style": "IPY_MODEL_f8502cc7258f4efeb43d9e2b0b5e91b2",
+ "value": "config.json: 100%"
+ }
+ },
+ "6fe509c6157e4ddbb1cfcf319cf04fac": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "738a133fb9204208913431c2a35fef5c": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "74e2c433490840a0a4caa296bd3521f0": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "77e3dff7c22343848b9ad097e47b16df": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "7d690fb215c64c7a81cdce7156456ca2": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "ProgressStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "86d744b64d604e388d3aadfbbf84eb1c": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "8779121983404d48ab560c1e0d2300f9": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_12eb26078d544f3abe72e91835cc14fc",
+ "placeholder": "",
+ "style": "IPY_MODEL_77e3dff7c22343848b9ad097e47b16df",
+ "value": "special_tokens_map.json: 100%"
+ }
+ },
+ "9492f5081dd841099c26140b03ef791a": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "ProgressStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "9b8c980b418146e992ec2e5cfde8bf6b": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_cd48bd077b9c4a7b822a79fa66b65031",
+ "placeholder": "",
+ "style": "IPY_MODEL_b246fdb13dc84c02b266565bb75e621f",
+ "value": " 2.24G/2.24G [00:25<00:00, 78.3MB/s]"
+ }
+ },
+ "a034635cd3d94f8b89b56e6b972daa63": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_0f2a76e74d74459182402f92fbb946c9",
+ "placeholder": "",
+ "style": "IPY_MODEL_d3d91529e30a4d67881a782d7eb54060",
+ "value": "sentencepiece.bpe.model: 100%"
+ }
+ },
+ "b246fdb13dc84c02b266565bb75e621f": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "b69deb8741ad47229438548949e2da42": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "bd254eea803c4e44b60783f05c5c1ca1": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_6fe509c6157e4ddbb1cfcf319cf04fac",
+ "placeholder": "",
+ "style": "IPY_MODEL_de859b9a94684d39ac8a7aef81058a3b",
+ "value": " 5.07M/5.07M [00:00<00:00, 18.0MB/s]"
+ }
+ },
+ "bfb0b78895c34af1abf9a8d669c24aeb": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "c4829c7eed1541129e4944ad7784b43d": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "cb5a5f4ddfe14e37a6807a74005b01f3": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "cd48bd077b9c4a7b822a79fa66b65031": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "cdc69d3551bd4bb183d30361b164d0ac": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "d0a4ce2181c84af8b277058689a35641": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_f3f8c3dc8a404bd4af55b105f98267d7",
+ "placeholder": "",
+ "style": "IPY_MODEL_e13aaa64b9c848f9a67f61295fb546fe",
+ "value": "tokenizer.json: 100%"
+ }
+ },
+ "d19be0dd23344eb2887605d88f80c142": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "d3d91529e30a4d67881a782d7eb54060": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "d407aa9ec48347d9b5536a69fe9f5dc7": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "d779e07ce42c4f088dea05f8fd22922f": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "ProgressStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "de859b9a94684d39ac8a7aef81058a3b": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "e13aaa64b9c848f9a67f61295fb546fe": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "e9394568881e456c99758e52813428f6": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HBoxModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_8779121983404d48ab560c1e0d2300f9",
+ "IPY_MODEL_48550e72c11c4da7985434d6df76df28",
+ "IPY_MODEL_62c949dbc1714614b00254b3b3ee80c1"
+ ],
+ "layout": "IPY_MODEL_1c7cdfc1b5ec488dbb08e75b5c89dc0e"
+ }
+ },
+ "edada2634ba54b9290fe904edc9905f0": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_2b190420fb764d06bd468c2914773fa8",
+ "placeholder": "",
+ "style": "IPY_MODEL_c4829c7eed1541129e4944ad7784b43d",
+ "value": "model.safetensors: 100%"
+ }
+ },
+ "edb9212d6a3540b19de36e895b6179e7": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HBoxModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_edada2634ba54b9290fe904edc9905f0",
+ "IPY_MODEL_1447b1135b994f14866cd17a34124ff6",
+ "IPY_MODEL_9b8c980b418146e992ec2e5cfde8bf6b"
+ ],
+ "layout": "IPY_MODEL_45793d371304439db504cb04d4913911"
+ }
+ },
+ "edd1f2ad4cca4cb093f3d2ab09ebbe85": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_cdc69d3551bd4bb183d30361b164d0ac",
+ "placeholder": "",
+ "style": "IPY_MODEL_17173ad4ef464e9fb4611c2c9c611736",
+ "value": " 801/801 [00:00<00:00, 25.5kB/s]"
+ }
+ },
+ "f2bbc601684446b5be521833592a319f": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "FloatProgressModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_d407aa9ec48347d9b5536a69fe9f5dc7",
+ "max": 443,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_d779e07ce42c4f088dea05f8fd22922f",
+ "value": 443
+ }
+ },
+ "f3f8c3dc8a404bd4af55b105f98267d7": {
+ "model_module": "@jupyter-widgets/base",
+ "model_module_version": "1.2.0",
+ "model_name": "LayoutModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/base",
+ "_model_module_version": "1.2.0",
+ "_model_name": "LayoutModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "LayoutView",
+ "align_content": null,
+ "align_items": null,
+ "align_self": null,
+ "border": null,
+ "bottom": null,
+ "display": null,
+ "flex": null,
+ "flex_flow": null,
+ "grid_area": null,
+ "grid_auto_columns": null,
+ "grid_auto_flow": null,
+ "grid_auto_rows": null,
+ "grid_column": null,
+ "grid_gap": null,
+ "grid_row": null,
+ "grid_template_areas": null,
+ "grid_template_columns": null,
+ "grid_template_rows": null,
+ "height": null,
+ "justify_content": null,
+ "justify_items": null,
+ "left": null,
+ "margin": null,
+ "max_height": null,
+ "max_width": null,
+ "min_height": null,
+ "min_width": null,
+ "object_fit": null,
+ "object_position": null,
+ "order": null,
+ "overflow": null,
+ "overflow_x": null,
+ "overflow_y": null,
+ "padding": null,
+ "right": null,
+ "top": null,
+ "visibility": null,
+ "width": null
+ }
+ },
+ "f55e2cf0eb34482da1902adb652fca5a": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HTMLModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HTMLModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HTMLView",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_61b79e900bfb4b849cfa740236cf6e44",
+ "placeholder": "",
+ "style": "IPY_MODEL_57ae83a64b224117a6d01573d3437540",
+ "value": " 17.1M/17.1M [00:00<00:00, 47.6MB/s]"
+ }
+ },
+ "f7de0a6aa31c41de90f45e862fe4ce15": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "HBoxModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "HBoxModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "HBoxView",
+ "box_style": "",
+ "children": [
+ "IPY_MODEL_a034635cd3d94f8b89b56e6b972daa63",
+ "IPY_MODEL_5bec1c5f11434bf481862e67f73403f0",
+ "IPY_MODEL_bd254eea803c4e44b60783f05c5c1ca1"
+ ],
+ "layout": "IPY_MODEL_1bbdd18be2e74c7f8916c95a1e5cf511"
+ }
+ },
+ "f8502cc7258f4efeb43d9e2b0b5e91b2": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "DescriptionStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "DescriptionStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "description_width": ""
+ }
+ },
+ "f894d72c0e19494abb732b2db46ced98": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "ProgressStyleModel",
+ "state": {
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "ProgressStyleModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/base",
+ "_view_module_version": "1.2.0",
+ "_view_name": "StyleView",
+ "bar_color": null,
+ "description_width": ""
+ }
+ },
+ "faaff37bf60e4870a9cbebd20d3afc6e": {
+ "model_module": "@jupyter-widgets/controls",
+ "model_module_version": "1.5.0",
+ "model_name": "FloatProgressModel",
+ "state": {
+ "_dom_classes": [],
+ "_model_module": "@jupyter-widgets/controls",
+ "_model_module_version": "1.5.0",
+ "_model_name": "FloatProgressModel",
+ "_view_count": null,
+ "_view_module": "@jupyter-widgets/controls",
+ "_view_module_version": "1.5.0",
+ "_view_name": "ProgressView",
+ "bar_style": "success",
+ "description": "",
+ "description_tooltip": null,
+ "layout": "IPY_MODEL_2a6b2dacce79419b9a2469b6362abca8",
+ "max": 17098107,
+ "min": 0,
+ "orientation": "horizontal",
+ "style": "IPY_MODEL_3c63c0a7eac54b0fb72775e3fb6ba891",
+ "value": 17098107
+ }
+ }
+ }
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
diff --git a/tutorials/Agentic_RAG/README.md b/tutorials/Agentic_RAG/README.md
new file mode 100644
index 00000000..ca60dad6
--- /dev/null
+++ b/tutorials/Agentic_RAG/README.md
@@ -0,0 +1,28 @@
+## Agentic RAG
+Agentic RAG is an advanced framework designed to handle complex information retrieval tasks using a network of intelligent agents.
+These agents collaborate to perform nuanced tasks such as synthesizing information from multiple documents,
+summarizing content, and comparing data points across various sources. Agentic RAG infuses autonomy and
+intelligence into traditional retrieval systems, enabling them to act as proactive entities that
+understand context, evaluate data quality, and make informed decisions.
+
+
+## Implementation Example with LangGraph and LanceDB
+
+
+Install necessary packages:
+```
+pip install langchain-community tiktoken langchain-openai langchainhub lancedb langchain langgraph langchain-text-splitters gradio
+```
+
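+Under the hood, the notebook indexes the source documents into a LanceDB table and exposes it to the agent as a retriever tool.
+The snippet below is a minimal sketch of that setup, not the notebook's exact code: the loader URL and splitter settings are
+illustrative assumptions, while the tool name matches the `retrieve_blog_posts` tool the agent calls in the notebook.
+
+```
+# Minimal sketch: build a LanceDB-backed retriever tool for the agent.
+# The source URL and chunking parameters below are placeholders.
+from langchain_community.document_loaders import WebBaseLoader
+from langchain_text_splitters import RecursiveCharacterTextSplitter
+from langchain_community.vectorstores import LanceDB
+from langchain_openai import OpenAIEmbeddings
+from langchain.tools.retriever import create_retriever_tool
+
+# Load and chunk the documents to be indexed
+docs = WebBaseLoader("https://example.com/exim-procedures").load()  # hypothetical URL
+chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)
+
+# Embed the chunks into LanceDB and wrap the retriever as a tool the agent can call
+vectorstore = LanceDB.from_documents(chunks, OpenAIEmbeddings())
+retriever_tool = create_retriever_tool(
+    vectorstore.as_retriever(),
+    "retrieve_blog_posts",
+    "Search the indexed documents for export/import information.",
+)
+```
+
+In the LangGraph workflow, an agent node decides when to call this tool, a retrieve node runs it, and a generate node answers from the retrieved context; the notebook wires this routing up with LangGraph's `ToolNode` and `tools_condition` helpers.
+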
+## Google Colab Walkthrough
+For a detailed, interactive walkthrough of this implementation, explore the Google Colab notebook below.
+The notebook includes a ***Gradio*** interface, so you can interact with the agent through a simple,
+user-friendly UI.
+
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/lancedb/vectordb-recipes/blob/main/tutorials/Agentic_RAG/main.ipynb)
+
+
+## Blog
+
+For a detailed explanation of Agentic RAG, check out the [blog post on Medium](https://aksdesai1998.medium.com/662bac582da9).
diff --git a/tutorials/Agentic_RAG/data/CIEP.pdf b/tutorials/Agentic_RAG/data/CIEP.pdf
new file mode 100644
index 00000000..1840f216
Binary files /dev/null and b/tutorials/Agentic_RAG/data/CIEP.pdf differ
diff --git a/tutorials/Agentic_RAG/data/GAE.pdf b/tutorials/Agentic_RAG/data/GAE.pdf
new file mode 100644
index 00000000..7b3a4553
Binary files /dev/null and b/tutorials/Agentic_RAG/data/GAE.pdf differ
diff --git a/tutorials/Agentic_RAG/data/info.txt b/tutorials/Agentic_RAG/data/info.txt
new file mode 100644
index 00000000..036f3759
--- /dev/null
+++ b/tutorials/Agentic_RAG/data/info.txt
@@ -0,0 +1 @@
+Data taken from the EXIM website.
diff --git a/tutorials/Agentic_RAG/main.ipynb b/tutorials/Agentic_RAG/main.ipynb
new file mode 100644
index 00000000..9a1d1486
--- /dev/null
+++ b/tutorials/Agentic_RAG/main.ipynb
@@ -0,0 +1,379 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "source": [
+ "#Agentic RAG\n",
+ "Agentic Retrieval-Augmented Generation (RAG) is an advanced framework designed to handle complex information retrieval tasks using a network of intelligent agents. These agents collaborate to perform nuanced tasks such as synthesizing information from multiple documents, summarizing content, and comparing data points across various sources. Agentic RAG infuses autonomy and intelligence into traditional retrieval systems, enabling them to act as passive tools and proactive entities that understand context, evaluate data quality, and make informed decisions."
+ ],
+ "metadata": {
+ "id": "MT4Z9xRQVUM2"
+ }
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "## Blog\n",
+ "\n",
+ "For a detailed explanation of agentic rag, check out [blog post on Medium](https://aksdesai1998.medium.com/662bac582da9).\n"
+ ],
+ "metadata": {
+ "id": "dbzbP8U3WDov"
+ }
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "id": "mstdPht-oONc"
+ },
+ "outputs": [],
+ "source": [
+ "#install the required dependencies\n",
+ "%%capture --no-stderr\n",
+ "%pip install -U --quiet langchain-community tiktoken langchain-openai langchainhub lancedb langchain langgraph langchain-text-splitters langchain_openai gradio"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "base_uri": "https://localhost:8080/"
+ },
+ "id": "Rieiaeu9posq",
+ "outputId": "c6806878-ccda-484a-9820-aef37c4bd700"
+ },
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "WARNING:root:USER_AGENT environment variable not set, consider setting it to identify your requests.\n"
+ ]
+ }
+ ],
+ "source": [
+ "import os\n",
+ "import getpass\n",
+ "import gradio as gr\n",
+ "from typing import Annotated, Literal, Sequence, TypedDict\n",
+ "from langchain import hub\n",
+ "from langchain_community.document_loaders import WebBaseLoader\n",
+ "from langchain_community.vectorstores import LanceDB\n",
+ "from langchain_core.messages import BaseMessage, HumanMessage\n",
+ "from langchain_core.output_parsers import StrOutputParser\n",
+ "from langchain_core.prompts import PromptTemplate\n",
+ "from langchain_core.pydantic_v1 import BaseModel, Field\n",
+ "from langchain_openai import OpenAIEmbeddings, ChatOpenAI\n",
+ "from langchain_text_splitters import RecursiveCharacterTextSplitter\n",
+ "from langchain.tools.retriever import create_retriever_tool\n",
+ "from langgraph.graph.message import add_messages\n",
+ "from langgraph.graph import END, StateGraph\n",
+ "from langgraph.graph.message import add_messages\n",
+ "from langgraph.prebuilt import ToolExecutor, ToolNode, tools_condition\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "colab": {
+ "background_save": true,
+ "base_uri": "https://localhost:8080/",
+ "height": 1000
+ },
+ "id": "wpwx86tgorLG",
+ "outputId": "b2216382-c340-44f0-d70e-0c8e980fac1a"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Setting queue=True in a Colab notebook requires sharing enabled. Setting `share=True` (you can turn this off by setting `share=False` in `launch()` explicitly).\n",
+ "\n",
+ "Colab notebook detected. This cell will run indefinitely so that you can see errors and logs. To turn off, set debug=False in launch().\n",
+ "Running on public URL: https://1300bf30884d92867e.gradio.live\n",
+ "\n",
+ "This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)\n"
+ ]
+ },
+ {
+ "data": {
+ "text/html": [
+ ""
+ ],
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ },
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Debug output: {'agent': {'messages': [AIMessage(content='The PM Gati Shakti National Master Plan (NMP) is an ambitious initiative launched by the Government of India aimed at improving infrastructure development across the country. Launched by Prime Minister Narendra Modi, the plan seeks to integrate the planning and coordination of various infrastructure projects across different sectors and ministries.\\n\\nThe core objective of the Gati Shakti NMP is to enhance multi-modal connectivity and reduce logistics costs by bringing together rail, road, air, and waterways projects under a single, unified framework. This holistic approach is intended to boost economic growth, create jobs, and promote regional connectivity.\\n\\nThe plan utilizes digital technology to map and synchronize projects, ensuring that all related departments and stakeholders are aligned, which helps in eliminating bottlenecks, improving project execution speed, and enhancing overall efficiency. The Gati Shakti NMP is seen as a transformative step towards making India a global manufacturing hub and improving the ease of doing business.', response_metadata={'finish_reason': 'stop'}, id='run-8bc90fb4-007b-4384-9320-35c3621eb9b8-0')]}}\n",
+ "Extracted content: The PM Gati Shakti National Master Plan (NMP) is an ambitious initiative launched by the Government of India aimed at improving infrastructure development across the country. Launched by Prime Minister Narendra Modi, the plan seeks to integrate the planning and coordination of various infrastructure projects across different sectors and ministries.\n",
+ "\n",
+ "The core objective of the Gati Shakti NMP is to enhance multi-modal connectivity and reduce logistics costs by bringing together rail, road, air, and waterways projects under a single, unified framework. This holistic approach is intended to boost economic growth, create jobs, and promote regional connectivity.\n",
+ "\n",
+ "The plan utilizes digital technology to map and synchronize projects, ensuring that all related departments and stakeholders are aligned, which helps in eliminating bottlenecks, improving project execution speed, and enhancing overall efficiency. The Gati Shakti NMP is seen as a transformative step towards making India a global manufacturing hub and improving the ease of doing business.\n",
+ "Debug output: {'agent': {'messages': [AIMessage(content='', additional_kwargs={'tool_calls': [{'index': 0, 'id': 'call_AE3GmC26FUSF63Nb4NhJCSMb', 'function': {'arguments': '{\"query\": \"steps for export import procedure\"}', 'name': 'retrieve_blog_posts'}, 'type': 'function'}, {'index': 1, 'id': 'call_B56lmGd2JIHRDCdxf9Um9Ml7', 'function': {'arguments': '{\"query\": \"customs import export procedure\"}', 'name': 'retrieve_blog_posts'}, 'type': 'function'}]}, response_metadata={'finish_reason': 'tool_calls'}, id='run-2c42f15d-390b-4509-8b23-fbc6d1fb203d-0', tool_calls=[{'name': 'retrieve_blog_posts', 'args': {'query': 'steps for export import procedure'}, 'id': 'call_AE3GmC26FUSF63Nb4NhJCSMb'}, {'name': 'retrieve_blog_posts', 'args': {'query': 'customs import export procedure'}, 'id': 'call_B56lmGd2JIHRDCdxf9Um9Ml7'}])]}}\n",
+ "Extracted content: \n",
+ "Debug output: {'retrieve': {'messages': [ToolMessage(content='<Microsoft Word - CUSTOMS IMPORT EXPORT PROCEDURES _final_Admin\\n\\nendobj\\r\\n118 0 obj\\r\\n<>/F 4/A<>/StructParent 32>>\\r\\nendobj\\r\\n119 0 obj\\n\\nEP\\x12t1Ԗ�m�l1�PI�����ٲW$��`�B[C��\\x1e�2�R�ν7ȗ���C�3�a���\\x1fZ��U\\x7f��\\x10�5y<:', name='retrieve_blog_posts', id='8714e8e7-ed08-4452-8d1f-621f4f25af81', tool_call_id='call_AE3GmC26FUSF63Nb4NhJCSMb'), ToolMessage(content='<Microsoft Word - CUSTOMS IMPORT EXPORT PROCEDURES _final_Admin\\n\\nCdTa�x\\t&��A\\x0eb�1����\\x0errMM�q�\\x7f1Q2cҩ0de�˂���>�\\x1b\\x12�\\x19?�\\x0fa�L���Q\\n\\nt\\'�P��h\\x13��u\\u05eex����|^���(I�$�%\\x1f�Q%�.\\x19U�B��jY��\\x16�?\\x19\\x19�\\x0e�$��W�BԸ���\\x1ck��\\x19��l!K�\\x05�!z�\\x1dq\\x12�\\x1d�v��]�P\"\\x11ROC��\\x14', name='retrieve_blog_posts', id='8a56ed96-cd2a-4846-a60d-258e6430aa79', tool_call_id='call_B56lmGd2JIHRDCdxf9Um9Ml7')]}}\n",
+ "Debug output: {'generate': {'messages': [\"I don't know.\"]}}\n",
+ "Debug output: {'agent': {'messages': [AIMessage(content='The term \"RCMC\" stands for Registration Cum Membership Certificate. It is a certificate that is provided by the Export Promotion Councils (EPCs) or commodity boards in India. An RCMC is issued to exporters dealing in products registered with these agencies. Holding an RCMC is mandatory for exporters to avail benefits under the Foreign Trade Policy like duty drawback, concessions, and other support.\\n\\nHere are some key points about RCMC:\\n\\n1. **Purpose**: The RCMC is used to certify that an exporter is registered with the respective EPC and is eligible for various benefits under the export-import policy.\\n\\n2. **Validity**: Typically, an RCMC is valid for five years.\\n\\n3. **Application**: Exporters must apply for an RCMC with the relevant EPC that pertains to their main line of business. If the exporter wishes to export items that are not covered by any EPC, they can obtain an RCMC from the Federation of Indian Export Organisations (FIEO).\\n\\n4. **Benefits**: With an RCMC, exporters can participate in international trade fairs, get sponsorship for trade delegations, and access market development assistance among other benefits.\\n\\n5. **Renewal and Cancellation**: The certificate needs to be renewed upon expiry. It can also be cancelled or suspended if the holder fails to abide by the regulatory requirements.\\n\\nIf you need detailed information or specific guidance related to obtaining an RCMC, please let me know!', response_metadata={'finish_reason': 'stop'}, id='run-4fa5e544-510a-4140-984c-89dedd855e71-0')]}}\n",
+ "Extracted content: The term \"RCMC\" stands for Registration Cum Membership Certificate. It is a certificate that is provided by the Export Promotion Councils (EPCs) or commodity boards in India. An RCMC is issued to exporters dealing in products registered with these agencies. Holding an RCMC is mandatory for exporters to avail benefits under the Foreign Trade Policy like duty drawback, concessions, and other support.\n",
+ "\n",
+ "Here are some key points about RCMC:\n",
+ "\n",
+ "1. **Purpose**: The RCMC is used to certify that an exporter is registered with the respective EPC and is eligible for various benefits under the export-import policy.\n",
+ "\n",
+ "2. **Validity**: Typically, an RCMC is valid for five years.\n",
+ "\n",
+ "3. **Application**: Exporters must apply for an RCMC with the relevant EPC that pertains to their main line of business. If the exporter wishes to export items that are not covered by any EPC, they can obtain an RCMC from the Federation of Indian Export Organisations (FIEO).\n",
+ "\n",
+ "4. **Benefits**: With an RCMC, exporters can participate in international trade fairs, get sponsorship for trade delegations, and access market development assistance among other benefits.\n",
+ "\n",
+ "5. **Renewal and Cancellation**: The certificate needs to be renewed upon expiry. It can also be cancelled or suspended if the holder fails to abide by the regulatory requirements.\n",
+ "\n",
+ "If you need detailed information or specific guidance related to obtaining an RCMC, please let me know!\n"
+ ]
+ }
+ ],
+ "source": [
+ "\n",
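+    "# NOTE: this cell assumes os, getpass, and the LangChain / LangGraph imports\n",
+    "# (WebBaseLoader, LanceDB, ChatOpenAI, StateGraph, etc.) from the setup cells above\n",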
+ "# Function to set environment variables securely\n",
+ "def _set_env(key: str):\n",
+ " if key not in os.environ:\n",
+ " os.environ[key] = getpass.getpass(f\"{key}:\")\n",
+ "\n",
+ "_set_env(\"OPENAI_API_KEY\")\n",
+ "\n",
+    "# (Optional) LangSmith tracing is disabled here; set LANGCHAIN_TRACING_V2 to \"true\" to enable it\n",
+ "os.environ[\"LANGCHAIN_TRACING_V2\"] = \"False\"\n",
+ "_set_env(\"LANGCHAIN_API_KEY\")\n",
+ "\n",
+ "\n",
+    "# Load the source documents; replace these URLs with data for your own use case\n",
+ "\n",
+ "urls = [\n",
+ " 'https://content.dgft.gov.in/Website/CIEP.pdf',\n",
+ " 'https://content.dgft.gov.in/Website/GAE.pdf',\n",
+ " 'https://content.dgft.gov.in/Website/HTE.pdf',\n",
+ "]\n",
+ "\n",
+ "\n",
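+    "# These URLs point to PDFs; WebBaseLoader fetches raw page content, which is why\n",
+    "# retrieved chunks can look garbled. A PDF loader (e.g. PyPDFLoader) may extract cleaner text.\n",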
+ "docs = [WebBaseLoader(url).load() for url in urls]\n",
+ "docs_list = [item for sublist in docs for item in sublist]\n",
+ "\n",
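+    "# Split into small overlapping chunks; sizes are measured in tokens via tiktoken\n",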
+ "text_splitter = RecursiveCharacterTextSplitter.from_tiktoken_encoder(\n",
+ " chunk_size=100, chunk_overlap=50\n",
+ ")\n",
+ "doc_splits = text_splitter.split_documents(docs_list)\n",
+ "\n",
+    "# Add the chunks to LanceDB as the vector store\n",
+ "\n",
+ "vectorstore = LanceDB.from_documents(\n",
+ " documents=doc_splits,\n",
+ " embedding=OpenAIEmbeddings(),\n",
+ ")\n",
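+    "# Expose similarity search over the LanceDB table as a retriever\n",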
+ "retriever = vectorstore.as_retriever()\n",
+ "\n",
+ "\n",
+    "# Create the retriever tool for the agent\n",
+ "retriever_tool = create_retriever_tool(\n",
+ " retriever,\n",
+ " \"retrieve_blog_posts\",\n",
+    "    \"Search and return information about customs import/export procedures, GST and exports, and how to export\",\n",
+ ")\n",
+ "\n",
+ "tools = [retriever_tool]\n",
+ "tool_executor = ToolExecutor(tools)\n",
+ "\n",
+ "\n",
+ "\n",
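+    "# Shared graph state: the message list; the add_messages reducer appends each node's output\n",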
+ "class AgentState(TypedDict):\n",
+ " messages: Annotated[Sequence[BaseMessage], add_messages]\n",
+ "\n",
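+    "# Grade retrieved documents for relevance: route to 'generate' if relevant, else 'rewrite'\n",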
+ "def grade_documents(state) -> Literal[\"generate\", \"rewrite\"]:\n",
+ " class grade(BaseModel):\n",
+ " binary_score: str = Field(description=\"Relevance score 'yes' or 'no'\")\n",
+ "\n",
+ " model = ChatOpenAI(temperature=0, model=\"gpt-4-0125-preview\", streaming=True)\n",
+ " llm_with_tool = model.with_structured_output(grade)\n",
+ " prompt = PromptTemplate(\n",
+ " template=\"\"\"You are a grader assessing relevance of a retrieved document to a user question. \\n\n",
+ " Here is the retrieved document: \\n\\n {context} \\n\\n\n",
+ " Here is the user question: {question} \\n\n",
+ " If the document contains keyword(s) or semantic meaning related to the user question, grade it as relevant. \\n\n",
+    "    Give a binary score of 'yes' or 'no' to indicate whether the document is relevant to the question.\"\"\",\n",
+ " input_variables=[\"context\", \"question\"],\n",
+ " )\n",
+ " chain = prompt | llm_with_tool\n",
+ "\n",
+ " messages = state[\"messages\"]\n",
+ " last_message = messages[-1]\n",
+ " question = messages[0].content\n",
+ " docs = last_message.content\n",
+ "\n",
+ " scored_result = chain.invoke({\"question\": question, \"context\": docs})\n",
+ " score = scored_result.binary_score\n",
+ "\n",
+ " return \"generate\" if score == \"yes\" else \"rewrite\"\n",
+ "\n",
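+    "# Agent node: the model either answers directly or emits a retriever tool call\n",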
+ "def agent(state):\n",
+ " messages = state[\"messages\"]\n",
+ " model = ChatOpenAI(temperature=0, streaming=True, model=\"gpt-4-turbo\")\n",
+ " model = model.bind_tools(tools)\n",
+ " response = model.invoke(messages)\n",
+ " return {\"messages\": [response]}\n",
+ "\n",
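+    "# Rewrite node: reformulate the question when retrieval was judged irrelevant\n",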
+ "def rewrite(state):\n",
+ " messages = state[\"messages\"]\n",
+ " question = messages[0].content\n",
+ " msg = [\n",
+ " HumanMessage(\n",
+ " content=f\"\"\" \\n\n",
+ " Look at the input and try to reason about the underlying semantic intent / meaning. \\n\n",
+ " Here is the initial question:\n",
+ " \\n ------- \\n\n",
+ " {question}\n",
+ " \\n ------- \\n\n",
+ " Formulate an improved question: \"\"\",\n",
+ " )\n",
+ " ]\n",
+ " model = ChatOpenAI(temperature=0, model=\"gpt-4-0125-preview\", streaming=True)\n",
+ " response = model.invoke(msg)\n",
+ " return {\"messages\": [response]}\n",
+ "\n",
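+    "# Generate node: answer from the retrieved context using the rlm/rag-prompt template from LangChain Hub\n",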
+    "def generate(state):\n",
+    "    messages = state[\"messages\"]\n",
+    "    question = messages[0].content\n",
+    "    docs = messages[-1].content\n",
+    "\n",
+    "    prompt = hub.pull(\"rlm/rag-prompt\")\n",
+    "    llm = ChatOpenAI(model_name=\"gpt-3.5-turbo\", temperature=0, streaming=True)\n",
+    "\n",
+    "    rag_chain = prompt | llm | StrOutputParser()\n",
+    "    response = rag_chain.invoke({\"context\": docs, \"question\": question})\n",
+    "    return {\"messages\": [response]}\n",
+ "\n",
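+    "# Assemble the LangGraph state machine:\n",
+    "# agent -> retrieve (on tool call) -> grade -> generate (relevant) or rewrite -> agent\n",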
+ "workflow = StateGraph(AgentState)\n",
+ "workflow.add_node(\"agent\", agent)\n",
+ "retrieve = ToolNode([retriever_tool])\n",
+ "workflow.add_node(\"retrieve\", retrieve)\n",
+ "workflow.add_node(\"rewrite\", rewrite)\n",
+ "workflow.add_node(\"generate\", generate)\n",
+ "workflow.set_entry_point(\"agent\")\n",
+ "workflow.add_conditional_edges(\"agent\", tools_condition, {\"tools\": \"retrieve\", END: END})\n",
+ "workflow.add_conditional_edges(\"retrieve\", grade_documents)\n",
+ "workflow.add_edge(\"generate\", END)\n",
+ "workflow.add_edge(\"rewrite\", \"agent\")\n",
+ "graph = workflow.compile()\n",
+ "\n",
+ "\n",
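+    "# Stream the graph for a user message and surface the final answer for the Gradio UI\n",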
+    "def process_message(user_message):\n",
+    "    inputs = {\n",
+    "        \"messages\": [(\"user\", user_message)]\n",
+    "    }\n",
+    "    content_output = None\n",
+    "    for output in graph.stream(inputs):\n",
+    "        print(f\"Debug output: {output}\")  # Debugging line to print each node's output\n",
+    "        # The final answer can come from the 'agent' node (direct answer)\n",
+    "        # or from the 'generate' node (RAG answer), so check both\n",
+    "        for node in (\"agent\", \"generate\"):\n",
+    "            if node in output and \"messages\" in output[node]:\n",
+    "                messages = output[node][\"messages\"]\n",
+    "                if messages:\n",
+    "                    # Agent messages are AIMessage objects; generate returns plain strings\n",
+    "                    content = getattr(messages[0], \"content\", messages[0])\n",
+    "                    if content:\n",
+    "                        content_output = content\n",
+    "                        print(f\"Extracted content: {content_output}\")  # Print extracted content\n",
+    "    return content_output if content_output else \"No relevant output found.\"\n",
+ "\n",
+ "\n",
+ "# Define example questions to guide the user\n",
+    "example_questions = [\n",
+    "    \"Explain briefly: what is the PM Gati Shakti National Master Plan (NMP)?\",\n",
+    "]\n",
+ "\n",
+ "# Create a Gradio interface\n",
+ "iface = gr.Interface(\n",
+ " fn=process_message,\n",
+ " inputs=\"text\",\n",
+ " outputs=\"text\",\n",
+    "    title=\"Agentic RAG\",\n",
+    "    description=\"Ask a question related to export-import procedures.\",\n",
+ " examples=example_questions,\n",
+ ")\n",
+ "\n",
+ "# Launch the Gradio app\n",
+ "iface.launch(debug=True)\n",
+ "\n",
+ "\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+    "# Some questions for testing\n",
+    "Explain briefly: what is the PM Gati Shakti National Master Plan (NMP)?\n",
+    "\n",
+    "What is Zero Rating of Exports?\n",
+    "\n",
+    "What is the Export Inspection Council of India?\n",
+    "\n",
+    "Please give details of some of the major initiatives/schemes."
+ ],
+ "metadata": {
+ "id": "c921cw61mPdh"
+ }
+ }
+ ],
+ "metadata": {
+ "colab": {
+ "provenance": []
+ },
+ "kernelspec": {
+ "display_name": "Python 3",
+ "name": "python3"
+ },
+ "language_info": {
+ "name": "python"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
\ No newline at end of file
diff --git a/tutorials/cohere-reranker/README.md b/tutorials/cohere-reranker/README.md
index 6f5802bf..c2c8fe75 100644
--- a/tutorials/cohere-reranker/README.md
+++ b/tutorials/cohere-reranker/README.md
@@ -1,8 +1,8 @@
-Code for "Benchmarking Cohere Rerankers with LanceDB"
+# Benchmarking Cohere Rerankers with LanceDB
-### [Read the blog](blog.lancedb.com)
+### [Read the blog](https://blog.lancedb.com/benchmarking-cohere-reranker-with-lancedb/)
## Setup
```