
Updating docs #166

Merged
merged 7 commits into from
Sep 12, 2024

Conversation

cesr
Contributor

@cesr cesr commented Sep 12, 2024

No description provided.

@cesr cesr added the 🚧 wip Work in progress label Sep 12, 2024
@cesr cesr removed the 🚧 wip Work in progress label Sep 12, 2024

## How do they work?

A Latitude project can have any number of evaluations available to connect to prompts. You can create evaluations in the **Evaluations** tab of your project. Latitude also ships with a set of built-in evaluations to get you started; simply import them into your project.
Contributor

The "Evaluations" tab is not in the project, but in the workspace.

Comment on lines 1 to 23
---
title: Logs
description: Learn how to use the logs page to monitor your prompts and evaluate their performance.
---

## Overview

Latitude stores all the logs generated by your prompts in a database. You can use the logs page to monitor your prompts and evaluate their performance.

## How it works

Every time you run a prompt, whether from the API or from the UI, a new log is created.

To access the logs page, navigate to a prompt and click on the "Logs" tab. You'll see a table with all the logs generated by the prompt, along with metadata such as the timestamp, the prompt version used, latency, tokens used, and cost.

Clicking on a log will display a side panel with the full details of the log, including the list of messages.
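As an illustration only (this is not Latitude's actual schema, and the field names below are assumptions), a log entry with the table metadata and the message list shown in the side panel could be modeled as:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Message:
    # One chat message from the log's message list.
    role: str      # e.g. "system", "user", "assistant"
    content: str

@dataclass
class PromptLog:
    # Metadata columns shown in the logs table (names are illustrative).
    timestamp: datetime
    prompt_version: str
    latency_ms: float
    tokens_used: int
    cost_usd: float
    # Full message list displayed in the side panel.
    messages: list[Message] = field(default_factory=list)

# Example record a logs table row might correspond to.
log = PromptLog(
    timestamp=datetime.now(timezone.utc),
    prompt_version="v3",
    latency_ms=842.0,
    tokens_used=356,
    cost_usd=0.0021,
    messages=[Message("user", "Summarize this article")],
)
print(log.tokens_used)
```

This is just a sketch of the information the page surfaces; the real storage format and field names may differ.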

## Coming soon

- Filtering and sorting
- Exporting logs to a CSV file
- Deleting logs
- Visualizations for certain metrics like latency, tokens used, and cost
Contributor

Maybe add that you can manually generate new logs from a Dataset?

@cesr cesr merged commit 8001d36 into main Sep 12, 2024
3 checks passed
@cesr cesr deleted the update-docs branch September 12, 2024 15:38