
Commit

deploy: 19080f1
github-actions[bot] committed Jul 1, 2024
1 parent 3971442 commit 18d6845
Showing 9 changed files with 29 additions and 26 deletions.
Binary file modified .doctrees/developer_notes/prompt.doctree
Binary file modified .doctrees/environment.pickle
Binary file added _images/dataclass.png
27 changes: 15 additions & 12 deletions _sources/developer_notes/prompt.rst.txt
@@ -19,11 +19,12 @@ Design
----------------

`LightRAG` seeks to maximize developers' control over the prompt.
Thus, in most cases we help developers gather together different sections and form them into one prompt.
Thus, in most cases, we help developers gather different sections and form them into one prompt.
This prompt will then be sent to the LLM as a single message.
The default role of the message we use is `system`.
Though it is not a special token, we use ``<SYS></SYS>`` to represent the system message in the prompt, which works quite well.


.. code-block:: python

    simple_prompt = r"""<SYS> You are a helpful assistant. </SYS> User: What can you help me with?"""
@@ -53,7 +54,8 @@ Data Flow in LLM applications

Consider the most complicated case: we may have a user query, retrieved context, a task description, tool definitions, few-shot examples, past conversation history, step history from the agent, and the output format specification.
All these different parts need to be formatted into a single prompt.
We have to do all this with flexiblity and also easy for developers to read.
We have to do all this with flexibility and also make it easy for developers to read.
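To make this concrete, here is a minimal sketch of merging several optional sections into one prompt with Jinja2. The template and variable names are illustrative, not LightRAG's actual default template:

```python
# Minimal sketch: merge optional prompt sections into one string with Jinja2.
# The template and variable names here are illustrative, not LightRAG's defaults.
from jinja2 import Template

template = Template(
    r"""<SYS>{{ task_desc_str }}</SYS>
{% if context_str %}Context: {{ context_str }}
{% endif %}{% if tools %}Tools:
{% for tool in tools %}- {{ tool }}
{% endfor %}{% endif %}User: {{ input_str }}"""
)

# Sections we do not pass (here, context_str) simply do not render.
prompt = template.render(
    task_desc_str="You are a helpful assistant.",
    tools=["google", "wikipedia"],
    input_str="What is the capital of France?",
)
print(prompt)
```

Each section renders only when its variable is provided, which is exactly the flexibility a single static f-string cannot give.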



Why Jinja2?
@@ -85,10 +87,11 @@ To format the prompt, you can use any of Python's native string formatting.
We opted for `Jinja2` [1]_ as the templating engine for the prompt.
Besides of the placeholders using ``{{}}`` for key-word arguments, Jinja2 also allow users to write code similar to Python syntax.
This includes conditionals, loops, filters, and even comments that is lacked from Python's native string formatting.
Besides the placeholders using ``{{}}`` for keyword arguments, Jinja2 also allows users to write code with syntax similar to Python's.
This includes conditionals, loops, filters, and even comments, which are lacking in Python's native string formatting.
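As a quick illustration of those extras (a minimal snippet, not from the library): a comment, the ``upper`` filter, and the ``default`` filter, none of which plain ``str.format``-style templating provides:

```python
from jinja2 import Template

# Illustrative only: a Jinja2 comment, the `upper` filter, and the `default`
# filter for a variable that was never passed in.
t = Template(
    "{# this comment never appears in the output #}"
    "Hello {{ name | upper }}! You have {{ count | default(0) }} messages."
)
print(t.render(name="ada"))  # → Hello ADA! You have 0 messages.
```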
Here is one example of using `Jinja2` to format the prompt:


.. code-block:: python

    def jinja2_template_example(**kwargs):
@@ -144,8 +147,7 @@ Prompt class


We created our :class:`Prompt Component<core.prompt_builder.Prompt>` to render the prompt with the string ``template`` and ``prompt_kwargs``.
It is a rather simple component, but it is rather handy.

It is a simple component, but it is quite handy.
Let's use the same template as above:

.. code-block:: python
@@ -163,7 +165,7 @@ Let's use the same template as above:
    print(prompt(input_str=input_str))  # takes the remaining arguments as keyword arguments
The ``Prompt`` class allows us to preset some of the prompt arguments at initialization, and then we can call the prompt with the rest of the arguments.
Also, by subclassing ``Component``, we get to easily visualize this component with ``print``.
Also, by subclassing ``Component``, we can easily visualize this component with ``print``.
Here is the output:

.. code-block::
@@ -181,12 +183,13 @@ Here is the output:
User: {{ input_str }}, prompt_kwargs: {'task_desc_str': 'You are a helpful assitant', 'tools': ['google', 'wikipedia', 'wikidata']}, prompt_variables: ['input_str', 'tools', 'task_desc_str']
)
As all components, you can use ``to_dict`` and ``from_dict`` to serialize and deserialize the component.
As with all components, you can use ``to_dict`` and ``from_dict`` to serialize and deserialize the component.
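The preset-then-call pattern and the dict round-trip can be sketched with a toy stand-in. This is NOT lightrag's implementation; the names merely mimic the API for illustration:

```python
# Toy stand-in (NOT lightrag's Prompt) illustrating preset kwargs at init,
# remaining kwargs at call time, and a to_dict / from_dict round-trip.
from jinja2 import Template


class ToyPrompt:
    def __init__(self, template, prompt_kwargs=None):
        self.template = template
        self.prompt_kwargs = dict(prompt_kwargs or {})

    def __call__(self, **kwargs):
        # Preset kwargs are merged with (and overridden by) call-time kwargs.
        return Template(self.template).render({**self.prompt_kwargs, **kwargs})

    def to_dict(self):
        return {"template": self.template, "prompt_kwargs": self.prompt_kwargs}

    @classmethod
    def from_dict(cls, d):
        return cls(d["template"], d["prompt_kwargs"])


p = ToyPrompt(
    "<SYS>{{ task_desc_str }}</SYS> User: {{ input_str }}",
    {"task_desc_str": "You are a helpful assistant"},
)
restored = ToyPrompt.from_dict(p.to_dict())
print(restored(input_str="Hi"))  # → <SYS>You are a helpful assistant</SYS> User: Hi
```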

Default Prompt Template
-------------------------
In default, ``Prompt`` class uses the :const:`DEFAULT_LIGHTRAG_SYSTEM_PROMPT<core.default_prompt_template.DEFAULT_LIGHTRAG_SYSTEM_PROMPT>` as its string template if no template is provided.
This default template will allow you conditionally passing seven important variables designed from the data flow diagram above.

By default, the ``Prompt`` class uses the :const:`DEFAULT_LIGHTRAG_SYSTEM_PROMPT<core.default_prompt_template.DEFAULT_LIGHTRAG_SYSTEM_PROMPT>` as its string template if no template is provided.
This default template allows you to conditionally pass seven important variables derived from the data flow diagram above.
These variables are:

.. code-block:: python
@@ -210,7 +213,7 @@ Now, let's see the minimum case where we only have the user query:
    output = prompt(input_str=input_str)
    print(output)
The output will be bare minimum with only the user query and a prefix for assistant to respond:
The output will be the bare minimum with only the user query and a prefix for the assistant to respond:

.. code-block::
@@ -221,7 +224,7 @@ The output will be bare minimum with only the user query and a prefix for assist
.. note::

We barely need to use the raw ``Prompt`` class directly as it is orchestrated by the ``Generator`` component.
In reality, we barely need to use the raw ``Prompt`` class directly as it is orchestrated by the ``Generator`` component together with the ``ModelClient`` that we will introduce next.



Binary file added _static/images/database.png
Binary file added _static/images/dataclass.png
2 changes: 1 addition & 1 deletion developer_notes/base_data_class.html
@@ -471,7 +471,7 @@
In LLM applications, data constantly needs to interact with LLMs in the form of strings via prompt and be parsed back to structured data from LLMs’ text prediction.
<a class="reference internal" href="../apis/core/core.base_data_class.html#core.base_data_class.DataClass" title="core.base_data_class.DataClass"><code class="xref py py-class docutils literal notranslate"><span class="pre">core.base_data_class.DataClass</span></code></a> is designed to ease the data interaction with LLMs via prompt(input) and text prediction(output).</p>
<figure class="align-center" id="id1">
<a class="reference internal image-reference" href="_static/images/dataclass.png"><img alt="DataClass" src="_static/images/dataclass.png" style="width: 680px;" />
<a class="reference internal image-reference" href="../_images/dataclass.png"><img alt="DataClass" src="../_images/dataclass.png" style="width: 680px;" />
</a>
<figcaption>
<p><span class="caption-text">DataClass is to ease the data interaction with LLMs via prompt(input) and text prediction(output).</span><a class="headerlink" href="#id1" title="Link to this image">#</a></p>
24 changes: 12 additions & 12 deletions developer_notes/prompt.html
@@ -478,7 +478,7 @@ <h2>Context<a class="headerlink" href="#context" title="Link to this heading">#<
<section id="design">
<h2>Design<a class="headerlink" href="#design" title="Link to this heading">#</a></h2>
<p><cite>LightRAG</cite> seeks to maximize developers’ control over the prompt.
Thus, in most cases we help developers gather together different sections and form them into one prompt.
Thus, in most cases, we help developers gather different sections and form them into one prompt.
This prompt will then be sent to the LLM as a single message.
The default role of the message we use is <cite>system</cite>.
Though it is not a special token, we use <code class="docutils literal notranslate"><span class="pre">&lt;SYS&gt;&lt;/SYS&gt;</span></code> to represent the system message in the prompt, which works quite well.</p>
@@ -505,7 +505,7 @@ <h3>Data Flow in LLM applications<a class="headerlink" href="#data-flow-in-llm-a
</figure>
<p>Consider the most complicated case: we may have a user query, retrieved context, a task description, tool definitions, few-shot examples, past conversation history, step history from the agent, and the output format specification.
All these different parts need to be formatted into a single prompt.
We have to do all this with flexiblity and also easy for developers to read.</p>
We have to do all this with flexibility and also make it easy for developers to read.</p>
</section>
<section id="why-jinja2">
<h3>Why Jinja2?<a class="headerlink" href="#why-jinja2" title="Link to this heading">#</a></h3>
@@ -531,8 +531,8 @@ <h3>Why Jinja2?<a class="headerlink" href="#why-jinja2" title="Link to this head
</pre></div>
</div>
<p>We opted for <cite>Jinja2</cite> <a class="footnote-reference brackets" href="#id3" id="id2" role="doc-noteref"><span class="fn-bracket">[</span>1<span class="fn-bracket">]</span></a> as the templating engine for the prompt.
Besides of the placeholders using <code class="docutils literal notranslate"><span class="pre">{{}}</span></code> for key-word arguments, Jinja2 also allow users to write code similar to Python syntax.
This includes conditionals, loops, filters, and even comments that is lacked from Python’s native string formatting.
Besides the placeholders using <code class="docutils literal notranslate"><span class="pre">{{}}</span></code> for keyword arguments, Jinja2 also allows users to write code with syntax similar to Python&#8217;s.
This includes conditionals, loops, filters, and even comments, which are lacking in Python’s native string formatting.
Here is one example of using <cite>Jinja2</cite> to format the prompt:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">def</span> <span class="nf">jinja2_template_example</span><span class="p">(</span><span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
<span class="kn">from</span> <span class="nn">jinja2</span> <span class="kn">import</span> <span class="n">Template</span>
@@ -579,8 +579,8 @@ <h3>Why Jinja2?<a class="headerlink" href="#why-jinja2" title="Link to this head
<section id="prompt-class">
<h2>Prompt class<a class="headerlink" href="#prompt-class" title="Link to this heading">#</a></h2>
<p>We created our <a class="reference internal" href="../apis/core/core.prompt_builder.html#core.prompt_builder.Prompt" title="core.prompt_builder.Prompt"><code class="xref py py-class docutils literal notranslate"><span class="pre">Prompt</span> <span class="pre">Component</span></code></a> to render the prompt with the string <code class="docutils literal notranslate"><span class="pre">template</span></code> and <code class="docutils literal notranslate"><span class="pre">prompt_kwargs</span></code>.
It is a rather simple component, but it is rather handy.</p>
<p>Let’s use the same template as above:</p>
It is a simple component, but it is quite handy.
Let’s use the same template as above:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">lightrag.core.prompt_builder</span> <span class="kn">import</span> <span class="n">Prompt</span>

<span class="n">prompt</span> <span class="o">=</span> <span class="n">Prompt</span><span class="p">(</span>
@@ -595,7 +595,7 @@ <h2>Prompt class<a class="headerlink" href="#prompt-class" title="Link to this h
</pre></div>
</div>
<p>The <code class="docutils literal notranslate"><span class="pre">Prompt</span></code> class allows us to preset some of the prompt arguments at initialization, and then we can call the prompt with the rest of the arguments.
Also, by subclassing <code class="docutils literal notranslate"><span class="pre">Component</span></code>, we get to easily visualize this component with <code class="docutils literal notranslate"><span class="pre">print</span></code>.
Also, by subclassing <code class="docutils literal notranslate"><span class="pre">Component</span></code>, we can easily visualize this component with <code class="docutils literal notranslate"><span class="pre">print</span></code>.
Here is the output:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span><span class="n">Prompt</span><span class="p">(</span>
<span class="n">template</span><span class="p">:</span> <span class="o">&lt;</span><span class="n">SYS</span><span class="o">&gt;</span><span class="p">{{</span> <span class="n">task_desc_str</span> <span class="p">}}</span><span class="o">&lt;/</span><span class="n">SYS</span><span class="o">&gt;</span>
@@ -611,12 +611,12 @@ <h2>Prompt class<a class="headerlink" href="#prompt-class" title="Link to this h
<span class="p">)</span>
</pre></div>
</div>
<p>As all components, you can use <code class="docutils literal notranslate"><span class="pre">to_dict</span></code> and <code class="docutils literal notranslate"><span class="pre">from_dict</span></code> to serialize and deserialize the component.</p>
<p>As with all components, you can use <code class="docutils literal notranslate"><span class="pre">to_dict</span></code> and <code class="docutils literal notranslate"><span class="pre">from_dict</span></code> to serialize and deserialize the component.</p>
</section>
<section id="default-prompt-template">
<h2>Default Prompt Template<a class="headerlink" href="#default-prompt-template" title="Link to this heading">#</a></h2>
<p>In default, <code class="docutils literal notranslate"><span class="pre">Prompt</span></code> class uses the <a class="reference internal" href="../apis/core/core.default_prompt_template.html#core.default_prompt_template.DEFAULT_LIGHTRAG_SYSTEM_PROMPT" title="core.default_prompt_template.DEFAULT_LIGHTRAG_SYSTEM_PROMPT"><code class="xref py py-const docutils literal notranslate"><span class="pre">DEFAULT_LIGHTRAG_SYSTEM_PROMPT</span></code></a> as its string template if no template is provided.
This default template will allow you conditionally passing seven important variables designed from the data flow diagram above.
<p>By default, the <code class="docutils literal notranslate"><span class="pre">Prompt</span></code> class uses the <a class="reference internal" href="../apis/core/core.default_prompt_template.html#core.default_prompt_template.DEFAULT_LIGHTRAG_SYSTEM_PROMPT" title="core.default_prompt_template.DEFAULT_LIGHTRAG_SYSTEM_PROMPT"><code class="xref py py-const docutils literal notranslate"><span class="pre">DEFAULT_LIGHTRAG_SYSTEM_PROMPT</span></code></a> as its string template if no template is provided.
This default template allows you to conditionally pass seven important variables derived from the data flow diagram above.
These variables are:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">LIGHTRAG_DEFAULT_PROMPT_ARGS</span> <span class="o">=</span> <span class="p">[</span>
<span class="s2">&quot;task_desc_str&quot;</span><span class="p">,</span> <span class="c1"># task description</span>
@@ -636,7 +636,7 @@ <h2>Default Prompt Template<a class="headerlink" href="#default-prompt-template"
<span class="nb">print</span><span class="p">(</span><span class="n">output</span><span class="p">)</span>
</pre></div>
</div>
<p>The output will be bare minimum with only the user query and a prefix for assistant to respond:</p>
<p>The output will be the bare minimum with only the user query and a prefix for the assistant to respond:</p>
<div class="highlight-default notranslate"><div class="highlight"><pre><span></span>&lt;User&gt;
What is the capital of France?
&lt;/User&gt;
Expand All @@ -645,7 +645,7 @@ <h2>Default Prompt Template<a class="headerlink" href="#default-prompt-template"
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>We barely need to use the raw <code class="docutils literal notranslate"><span class="pre">Prompt</span></code> class directly as it is orchestrated by the <code class="docutils literal notranslate"><span class="pre">Generator</span></code> component.</p>
<p>In reality, we barely need to use the raw <code class="docutils literal notranslate"><span class="pre">Prompt</span></code> class directly as it is orchestrated by the <code class="docutils literal notranslate"><span class="pre">Generator</span></code> component together with the <code class="docutils literal notranslate"><span class="pre">ModelClient</span></code> that we will introduce next.</p>
</div>
<div class="highlight admonition">
<p class="admonition-title">References</p>
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.
