chore: Merge for release #690

Merged
merged 11 commits into from
Dec 11, 2024
87 changes: 83 additions & 4 deletions docs/framework/ai-module.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -138,7 +138,7 @@ graph.add_file(file)

# Remove the file from the graph
graph.remove_file(file)
````
```

#### Applying Graphs to Conversation completion

Expand Down Expand Up @@ -228,10 +228,10 @@ response = conversation.complete(tools=tool, max_tool_depth=7)
You can pass either a single tool or a list of tools to the `complete()` or `stream_complete()` methods. The tools can be a combination of FunctionTool, Graph, or JSON-defined tools.

```python
from writer.ai import FunctionTool, retrieve_graph
from writer.ai import create_function_tool, retrieve_graph

# Define a function tool
tool1 = FunctionTool(
tool1 = create_function_tool(
name="get_data",
callable=lambda x: f"Data for {x}",
parameters={"x": {"type": "string", "description": "Input value"}}
Expand All @@ -245,7 +245,9 @@ response = conversation.complete(tools=[tool1, graph])
```

## Text generation without a conversation state
These `complete` and `stream_complete` methods are designed for one-off text generation without the need to manage a conversation state. They return the model's response as a string. Each function accepts a `config` dictionary allowing call-specific configurations.

### Text generation against a string prompt
`complete` and `stream_complete` methods are designed for one-off text generation without the need to manage a conversation state. They return the model's response as a string. Each function accepts a `config` dictionary allowing call-specific configurations.

<CodeGroup>
```python complete
Expand All @@ -260,4 +262,81 @@ for text_chunk in stream_complete("Explore the benefits of AI.", config={'temper
```
</CodeGroup>

### Text generation against graphs
The `ask` and `stream_ask` methods allow you to query one or more graphs to generate responses from the information stored within them.

#### Two approaches to questioning graphs

There are two ways to query graphs, depending on your needs:
1. **Graph-Level Methods** (`Graph.ask`, `Graph.stream_ask`): Used when working with a single graph instance. These methods are tied directly to the Graph object, encapsulating operations within that instance.
2. **Module-Level Methods** (`writer.ai.ask`, `writer.ai.stream_ask`): Designed for querying multiple graphs simultaneously. These methods operate on a broader scale, allowing mixed inputs of graph objects and IDs.

<Note>
- Use graph-level methods when working with a single graph instance.
- Use module-level methods when querying multiple graphs or when graph IDs are your primary input.
</Note>

#### Parameters

Both methods include:
- `question: str`: The main query for the LLM.
- *Optional* `subqueries: bool` (default: `False`): Allows the LLM to generate additional questions during response preparation for more detailed answers. Enabling this might increase response time.

Module-level methods require:
- `graphs_or_graph_ids: list[Graph | str]`: A list of graphs to use for the question. You can pass `Graph` objects directly into the list, use graph IDs in string form, or a mix of both.
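The `subqueries` flag works the same way at both levels. A minimal sketch of enabling it on a module-level call is shown below; the graph ID is a hypothetical placeholder, and the call assumes a configured `writer.ai` environment.

```python
from writer.ai import ask

# Hypothetical graph ID used for illustration only
GRAPH_ID = "550e8400-e29b-41d4-a716-446655440000"

# subqueries=True lets the model generate and answer its own
# follow-up questions before composing the final response,
# trading latency for a more detailed answer
response = ask(
    question="How do wind and solar generation costs compare?",
    graphs_or_graph_ids=[GRAPH_ID],
    subqueries=True,
)
print(response)
```

Because subquery generation adds model round trips, leave it at the default `False` for latency-sensitive paths and enable it only when answer depth matters more than response time.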

#### Graph-level methods

The graph-level methods, `Graph.ask` and `Graph.stream_ask`, are designed for interacting with a single graph. By calling these methods on a specific `Graph` instance, you can easily pose questions and retrieve answers tailored to that graph’s content.

<CodeGroup>
```python ask
# Retrieve a specific graph
graph = retrieve_graph("f47ac10b-58cc-4372-a567-0e02b2c3d479")

# Pose a question to the graph and get a complete response
response = graph.ask("What are the benefits of renewable energy?")
print(response)
```
```python stream_ask
# Retrieve a specific graph
graph = retrieve_graph("f47ac10b-58cc-4372-a567-0e02b2c3d479")

# Pose a question and stream the response in chunks
for chunk in graph.stream_ask("Explain the history of solar energy."):
print(chunk)
```
</CodeGroup>

#### Module-level methods
The module-level methods, `writer.ai.ask` and `writer.ai.stream_ask`, are designed for querying multiple graphs simultaneously. They are useful when you need to aggregate or compare data across multiple graphs.

<CodeGroup>
```python ask
from writer.ai import ask

# Pose a question to multiple graphs
response = ask(
question="What are the latest advancements in AI?",
graphs_or_graph_ids=[
"550e8400-e29b-41d4-a716-446655440000",
"123e4567-e89b-12d3-a456-426614174000"
]
)
print(response)
```
```python stream_ask
from writer.ai import stream_ask

# Stream responses from multiple graphs
for chunk in stream_ask(
question="Describe the key features of renewable energy sources.",
graphs_or_graph_ids=[
"550e8400-e29b-41d4-a716-446655440000",
"123e4567-e89b-12d3-a456-426614174000"
]
):
print(chunk)
```
</CodeGroup>

Binary file added docs/framework/public/components/progressbar.png
6 changes: 5 additions & 1 deletion docs/framework/seo.mdx
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
---
title: "SEO"
title: "SEO and social sharing"
mode: "wide"
---

Expand Down Expand Up @@ -41,6 +41,7 @@ writer.serve.configure_webpage_metadata(

You can also use a function to generate the meta tags dynamically.

*./server_setup.py*
```python
def _meta():
last_news = db.get_last_news()
Expand All @@ -57,6 +58,7 @@ writer.serve.configure_webpage_metadata(meta=_meta)

When you share a link on social networks, they will try to fetch the metadata of the page to display a preview.

*./server_setup.py*
```python
writer.serve.configure_webpage_metadata(
opengraph_tags= {
Expand All @@ -66,9 +68,11 @@ writer.serve.configure_webpage_metadata(
"og:url": "https://myapp.com"
}
)
```

You can also use a function to generate the opengraph tags dynamically.

*./server_setup.py*
```python
def _opengraph_tags():
last_news = db.get_last_news()
Expand Down
31 changes: 16 additions & 15 deletions docs/mint.json
Original file line number Diff line number Diff line change
Expand Up @@ -32,23 +32,20 @@
"framework/introduction",
"framework/quickstart",
"framework/ai-module",
"framework/cloud-deploy",
"framework/sample-apps"
"framework/sample-apps",
"framework/component-list-link"
]
},
{
"group": "Guides",
"pages": [
"framework/application-state",
"framework/event-handlers",
"framework/backend-initiated-actions",
"framework/builder-basics",
"framework/event-handlers",
"framework/handling-inputs",
"framework/dataframe",
"framework/backend-driven-ui",
"framework/stylesheets",
"framework/frontend-scripts",
"framework/custom-components",
"framework/authentication"
"framework/dataframe",
"framework/repeater"
]
},
{
Expand All @@ -64,19 +61,23 @@
"group": "Deployment",
"pages": [
"framework/cloud-deploy",
"framework/deploy-with-docker",
"framework/testing"
"framework/deploy-with-docker"
]
},
{
"group": "Advanced",
"pages": [
"framework/repeater",
"framework/backend-initiated-actions",
"framework/authentication",
"framework/backend-driven-ui",
"framework/custom-components",
"framework/custom-server",
"framework/frontend-scripts",
"framework/page-routes",
"framework/sessions",
"framework/custom-server",
"framework/state-schema"
"framework/state-schema",
"framework/stylesheets",
"framework/testing",
"framework/seo"
]
},
{
Expand Down
2 changes: 1 addition & 1 deletion pyproject.toml
Original file line number Diff line number Diff line change
Expand Up @@ -4,7 +4,7 @@ build-backend = "poetry.core.masonry.api"

[tool.poetry]
name = "writer"
version = "0.8.3rc1"
version = "0.8.3rc2"
description = "An open-source, Python framework for building feature-rich apps that are fully integrated with the Writer platform."
authors = ["Writer, Inc."]
readme = "README.md"
Expand Down
10 changes: 5 additions & 5 deletions src/ui/src/components/core/base/BaseInputWrapper.vue
Original file line number Diff line number Diff line change
Expand Up @@ -4,16 +4,16 @@
class="BaseInputWrapper"
:class="{ horizontal: isHorizontal }"
>
<label v-if="props.label">{{ props.label }}</label>
<label v-if="label">{{ label }}</label>
<slot></slot>
</div>
</template>

<script setup lang="ts">
const props = defineProps<{
label: string;
isHorizontal?: boolean;
}>();
defineProps({
label: { type: String, required: false, default: undefined },
isHorizontal: { type: Boolean, required: false },
});
</script>

<style scoped>
Expand Down