Renaming live context to long term memory
jimbobbennett committed Oct 26, 2024
1 parent 77e7bcd commit 20dfee1
Showing 11 changed files with 72 additions and 72 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -21,7 +21,7 @@ The Pieces OS Client SDK is a set of powerful code engine packages designed for
This SDK has 2 packages:

- [Pieces.OS.Client](https://www.nuget.org/packages/Pieces.OS.Client/) - this is the core SDK package providing access to the features of Pieces from your C# application
- [Pieces.Extensions.AI](https://www.nuget.org/packages/Pieces.Extensions.AI/) - this is an implementation of [Microsoft.Extensions.AI](https://www.nuget.org/packages/Microsoft.Extensions.AI/) using Pieces to provide support for multiple LLMs, as well as adding context such as snippets, files, folders, and live context to your AI conversation.
- [Pieces.Extensions.AI](https://www.nuget.org/packages/Pieces.Extensions.AI/) - this is an implementation of [Microsoft.Extensions.AI](https://www.nuget.org/packages/Microsoft.Extensions.AI/) using Pieces to provide support for multiple LLMs, as well as adding context such as snippets, files, folders, and Pieces Long-Term Memory to your AI conversation.

## Features

@@ -31,7 +31,7 @@ The Pieces SDK offers the following key features:
1. Asset Management: Save and manage assets and formats efficiently.
1. Local Server Interaction: Interact with a locally hosted server for a range of functionality.
1. Multi-LLM support: Use any Pieces supported LLM to power your app.
1. File, folder, and live context in copilot chats
1. File, folder, and Pieces Long-Term Memory in copilot chats

## Installation

@@ -67,7 +67,7 @@ This repo contains the following projects:

### Sample apps

- [Remind Me](./src/SampleApps/RemindMe/) - an app that reminds you about what you have been working on over the last few hours using live context.
- [Remind Me](./src/SampleApps/RemindMe/) - an app that reminds you about what you have been working on over the last few hours using Pieces Long-Term Memory.

## Pieces.OS.Client Examples
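Before the per-file diffs below, here is a minimal sketch of a copilot chat with Pieces Long-Term Memory enabled, using the renamed properties. Note that `PiecesClient`, `GetCopilotAsync`, and `AskQuestionAsync` are assumed entry points that do not appear in this diff, so treat them as illustrative placeholders rather than the definitive API.

```csharp
using System;
using Pieces.OS.Client;
using Pieces.OS.Client.Copilot;

// Hypothetical entry points (not part of this diff) - adjust to the real SDK surface.
var client = new PiecesClient();
var copilot = await client.GetCopilotAsync();

// ChatContext and CreateChatAsync are shown in this diff, with the renamed properties.
var chatContext = new ChatContext
{
    LongTermMemory = true,
    LongTermMemoryTimeSpan = TimeSpan.FromHours(1)
};

var chat = await copilot.CreateChatAsync("1 hour context window", chatContext: chatContext).ConfigureAwait(false);

// Asking the question is assumed to follow the pattern in src/Client.Example/Program.cs.
var answer = await chat.AskQuestionAsync("What was I working on over the last hour?");
Console.WriteLine(answer);
```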

28 changes: 14 additions & 14 deletions src/Client.Example/Program.cs
@@ -155,19 +155,19 @@

#endregion Stream the response

#region Use live context
#region Use Pieces Long-Term Memory

// Use live context
// Use Pieces Long-Term Memory
//
// This will create a new copilot chat called 1 hour context window, with live context turned on that you will be able to see in other Pieces applications,
// such as Pieces Desktop, or Pieces for Visual Studio Code. You will also be able to see live context turned on against the chat
// This will create a new copilot chat called 1 hour context window, with Pieces Long-Term Memory turned on that you will be able to see in other Pieces applications,
// such as Pieces Desktop, or Pieces for Visual Studio Code. You will also be able to see Pieces Long-Term Memory turned on against the chat
// The chat will ask a question related to this code file using a 1 hour context window, then stream the response back token by token

// {
// var chatContext = new ChatContext
// {
// LiveContext = true,
// LiveContextTimeSpan = TimeSpan.FromHours(1)
// LongTermMemory = true,
// LongTermMemoryTimeSpan = TimeSpan.FromHours(1)
// };
// var chat = await copilot.CreateChatAsync("1 hour context window", chatContext: chatContext).ConfigureAwait(false);

@@ -183,15 +183,15 @@
// Console.WriteLine();
// }

#endregion Use live context
#endregion Use Pieces Long-Term Memory

#region Use live context turned on later in the chat
#region Use Pieces Long-Term Memory turned on later in the chat

// Use live context
// Use Pieces Long-Term Memory
//
// This will create a new copilot chat called 1 hour context window. After asking a first question, live context is turned on
// This will create a new copilot chat called 1 hour context window. After asking a first question, Pieces Long-Term Memory is turned on
// that you will be able to see in other Pieces applications,
// such as Pieces Desktop, or Pieces for Visual Studio Code. You will also be able to see live context turned on against the chat
// such as Pieces Desktop, or Pieces for Visual Studio Code. You will also be able to see Pieces Long-Term Memory turned on against the chat
// The chat will ask a question related to this code file using a 1 hour context window, then stream the response back token by token

// {
@@ -209,8 +209,8 @@
// // Update the context
// chat.ChatContext = new ChatContext
// {
// LiveContext = true,
// LiveContextTimeSpan = TimeSpan.FromHours(1)
// LongTermMemory = true,
// LongTermMemoryTimeSpan = TimeSpan.FromHours(1)
// };

// question = "Describe the Program.cs file I was just reading in my IDE";
@@ -225,7 +225,7 @@
// Console.WriteLine();
// }

#endregion Use live context turned on later in the chat
#endregion Use Pieces Long-Term Memory turned on later in the chat

#region Load assets

8 changes: 4 additions & 4 deletions src/Client/Copilot/ChatContext.cs
@@ -6,14 +6,14 @@ namespace Pieces.OS.Client.Copilot;
public record ChatContext
{
/// <summary>
/// Should this conversation use live context?
/// Should this conversation use Pieces Long-Term Memory?
/// </summary>
public bool LiveContext { get; set; }
public bool LongTermMemory { get; set; }

/// <summary>
/// If this conversation uses live context, what is the size of the context window time
/// If this conversation uses Pieces Long-Term Memory, what is the size of the context window time
/// </summary>
public TimeSpan? LiveContextTimeSpan { get; set; } = TimeSpan.FromMinutes(15);
public TimeSpan? LongTermMemoryTimeSpan { get; set; } = TimeSpan.FromMinutes(15);

/// <summary>
/// A list of asset Ids to use as context
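With the rename in place, enabling Long-Term Memory on a chat is just a matter of setting these two properties. A minimal sketch using only the types shown in this file:

```csharp
using System;
using Pieces.OS.Client.Copilot;

// Enable Pieces Long-Term Memory with an explicit 2-hour window.
var twoHourContext = new ChatContext
{
    LongTermMemory = true,
    LongTermMemoryTimeSpan = TimeSpan.FromHours(2)
};

// Leaving LongTermMemoryTimeSpan at its default gives the 15-minute window
// declared on the property above.
var defaultWindowContext = new ChatContext { LongTermMemory = true };
```

Either instance can then be passed to `CreateChatAsync` via the `chatContext` parameter, as in the example program above.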
16 changes: 8 additions & 8 deletions src/Client/Copilot/CopilotChat.cs
@@ -406,10 +406,10 @@ private async Task UpdateConversationAssetsAsync(CancellationToken cancellationT
}

/// <summary>
/// Create the temporal grounding. This is only relevant for live context, and creates a grounding based
/// Create the temporal grounding. This is only relevant for Pieces Long-Term Memory, and creates a grounding based
/// off the time span specified in the chat context, defaulting to 15 minutes if this is not set.
///
/// As part of this, the pipeline is checked. For live context, this pipeline should contain a
/// As part of this, the pipeline is checked. For Pieces Long-Term Memory, this pipeline should contain a
/// <see cref="QGPTConversationPipelineForContextualizedCodeWorkstreamDialog"/>. If not using live
/// context, it should contain a <see cref="QGPTConversationPipelineForGeneralizedCodeDialog"/>.
///
@@ -422,10 +422,10 @@ private async Task UpdateConversationAssetsAsync(CancellationToken cancellationT
{
TemporalRangeGrounding? temporalRangeGrounding = default;

if (ChatContext?.LiveContext == true)
if (ChatContext?.LongTermMemory == true)
{
var span = ChatContext?.LiveContextTimeSpan ?? TimeSpan.FromMinutes(15);
logger?.LogInformation("Using live context with a time span of: {span}.", span);
var span = ChatContext?.LongTermMemoryTimeSpan ?? TimeSpan.FromMinutes(15);
logger?.LogInformation("Using Pieces Long-Term Memory with a time span of: {span}.", span);
// Create a temporal range from the provided time span ago to now
// If the provided time span is null, use 15 minutes ago
var to = new GroupedTimestamp(value: DateTime.UtcNow);
@@ -437,7 +437,7 @@ private async Task UpdateConversationAssetsAsync(CancellationToken cancellationT
var flattenedRanges = new FlattenedRanges(iterable: [referencedRange]);
temporalRangeGrounding = new TemporalRangeGrounding(workstreams: flattenedRanges);

// If the conversation wasn't set up for live context, set this up now
// If the conversation wasn't set up for Pieces Long-Term Memory, set this up now
if (conversation.Pipeline.Conversation.ContextualizedCodeWorkstreamDialog is null)
{
var dialog = new QGPTConversationPipelineForContextualizedCodeWorkstreamDialog();
@@ -447,9 +447,9 @@ private async Task UpdateConversationAssetsAsync(CancellationToken cancellationT
}
else
{
logger?.LogInformation("Not using live context");
logger?.LogInformation("Not using Pieces Long-Term Memory");

// If the conversation was set up for live context, disable this if we are not using live context now
// If the conversation was set up for Pieces Long-Term Memory, disable this if we are not using Pieces Long-Term Memory now
if (conversation.Pipeline.Conversation.ContextualizedCodeWorkstreamDialog is null)
{
var dialog = new QGPTConversationPipelineForGeneralizedCodeDialog();
6 changes: 3 additions & 3 deletions src/Client/Copilot/PiecesCopilot.cs
@@ -77,16 +77,16 @@ public async Task<ICopilotChat> CreateSeededChatAsync(string chatName = "",

QGPTPromptPipeline? pipeline;

if (chatContext?.LiveContext == true)
if (chatContext?.LongTermMemory == true)
{
logger?.LogDebug("Creating copilot chat with live context");
logger?.LogDebug("Creating copilot chat with Pieces Long-Term Memory");
var dialog = new QGPTConversationPipelineForContextualizedCodeWorkstreamDialog();
var conversationPipeline = new QGPTConversationPipeline(contextualizedCodeWorkstreamDialog: dialog);
pipeline = new QGPTPromptPipeline(conversation: conversationPipeline);
}
else
{
logger?.LogDebug("Creating copilot chat without live context");
logger?.LogDebug("Creating copilot chat without Pieces Long-Term Memory");
var dialog = new QGPTConversationPipelineForGeneralizedCodeDialog();
var conversationPipeline = new QGPTConversationPipeline(generalizedCodeDialog: dialog);
pipeline = new QGPTPromptPipeline(conversation: conversationPipeline);
4 changes: 2 additions & 2 deletions src/Core/src/Pieces.Os.Core/SdkModel/ModelCapabilities.cs
@@ -62,9 +62,9 @@ public partial class ModelCapabilities : IValidatableObject
public EmbeddedModelSchema Schema { get; set; }

/// <summary>
/// True if model is able to support live context and any other temporally powered RAG Capabilities i.e. \&quot;What did I do yesterday?\&quot;
/// True if model is able to support Pieces Long-Term Memory and any other temporally powered RAG Capabilities i.e. \&quot;What did I do yesterday?\&quot;
/// </summary>
/// <value>True if model is able to support live context and any other temporally powered RAG Capabilities i.e. \&quot;What did I do yesterday?\&quot;</value>
/// <value>True if model is able to support Pieces Long-Term Memory and any other temporally powered RAG Capabilities i.e. \&quot;What did I do yesterday?\&quot;</value>
[DataMember(Name = "temporal", EmitDefaultValue = true)]
public bool Temporal { get; set; }
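Because Long-Term Memory is a temporally grounded feature, it makes sense to gate it on this flag. A small sketch, assuming you already have the `ModelCapabilities` for the model you intend to use (how to obtain it is outside this diff, and the namespace is inferred from the file path):

```csharp
using System;
using Pieces.Os.Core.SdkModel;
using Pieces.OS.Client.Copilot;

// Only enable Pieces Long-Term Memory when the chosen model supports temporal RAG.
static ChatContext BuildChatContext(ModelCapabilities capabilities) => new()
{
    LongTermMemory = capabilities.Temporal,
    LongTermMemoryTimeSpan = capabilities.Temporal ? TimeSpan.FromHours(1) : null
};
```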

24 changes: 12 additions & 12 deletions src/Extensions.Example/Program.cs
@@ -237,14 +237,14 @@

#endregion A continuous streaming conversation

#region Live context
#region Pieces Long-Term Memory

// This example shows how to use live context in a chat completion via the Additional Properties dictionary.
// This example shows how to use Pieces Long-Term Memory in a chat completion via the Additional Properties dictionary.
// Before running this, open this GitHub issue in your browser: https://github.com/pieces-app/pieces-os-client-sdk-for-csharp/issues/8

// {
// // Create a Chat completion
// IChatClient chatClient = new PiecesChatClient(client, chatName: $"Live context chat - {DateTime.Now.ToShortTimeString()}", logger: logger);
// IChatClient chatClient = new PiecesChatClient(client, chatName: $"Pieces Long-Term Memory chat - {DateTime.Now.ToShortTimeString()}", logger: logger);

// var chatMessages = new List<ChatMessage>{
// new(ChatRole.User, "Describe the Add support for Microsoft.Extensions.AI github issue I was just reading about in my browser")
@@ -253,8 +253,8 @@
// var options = new ChatOptions()
// {
// AdditionalProperties = new AdditionalPropertiesDictionary{
// { "LiveContext", true },
// { "LiveContextTimeSpan", TimeSpan.FromHours(1) }
// { "LongTermMemory", true },
// { "LongTermMemoryTimeSpan", TimeSpan.FromHours(1) }
// }
// };

@@ -266,16 +266,16 @@
// Console.WriteLine();
// }

#endregion Live context
#endregion Pieces Long-Term Memory

#region Live context turned on after a question
#region Pieces Long-Term Memory turned on after a question

// This example shows how to use live context in a chat completion via the Additional Properties dictionary.
// This example shows how to use Pieces Long-Term Memory in a chat completion via the Additional Properties dictionary.
// Before running this, open this GitHub issue in your browser: https://github.com/pieces-app/pieces-os-client-sdk-for-csharp/issues/8

// {
// // Create a Chat completion
// IChatClient chatClient = new PiecesChatClient(client, chatName: $"Live context chat - {DateTime.Now.ToShortTimeString()}", logger: logger);
// IChatClient chatClient = new PiecesChatClient(client, chatName: $"Pieces Long-Term Memory chat - {DateTime.Now.ToShortTimeString()}", logger: logger);

// var chatMessages = new List<ChatMessage>{
// new(ChatRole.User, "Hello")
@@ -296,8 +296,8 @@
// var options = new ChatOptions()
// {
// AdditionalProperties = new AdditionalPropertiesDictionary{
// { "LiveContext", true },
// { "LiveContextTimeSpan", TimeSpan.FromHours(1) }
// { "LongTermMemory", true },
// { "LongTermMemoryTimeSpan", TimeSpan.FromHours(1) }
// }
// };

@@ -309,7 +309,7 @@
// Console.WriteLine();
// }

#endregion Live context turned on after a question
#endregion Pieces Long-Term Memory turned on after a question

#region Create an asset and use it in a chat

20 changes: 10 additions & 10 deletions src/Extensions/PiecesChatClient.cs
@@ -22,13 +22,13 @@ public class PiecesChatClient(IPiecesClient piecesClient, string chatName = "",
/// </summary>
public const string FoldersPropertyName = "Folders";
/// <summary>
/// A constant for the name of the LiveContext property in the <see cref="ChatOptions"/> AdditionalProperties dictionary
/// A constant for the name of the LongTermMemory property in the <see cref="ChatOptions"/> AdditionalProperties dictionary
/// </summary>
public const string LiveContextPropertyName = "LiveContext";
public const string LongTermMemoryPropertyName = "LongTermMemory";
/// <summary>
/// A constant for the name of the LiveContextTimeSpan property in the <see cref="ChatOptions"/> AdditionalProperties dictionary
/// A constant for the name of the LongTermMemoryTimeSpan property in the <see cref="ChatOptions"/> AdditionalProperties dictionary
/// </summary>
public const string LiveContextTimeSpanPropertyName = "LiveContextTimeSpan";
public const string LongTermMemoryTimeSpanPropertyName = "LongTermMemoryTimeSpan";
/// <summary>
/// A constant for the name of the PersistChat property in the <see cref="ChatOptions"/> AdditionalProperties dictionary
/// </summary>
@@ -61,8 +61,8 @@ public class PiecesChatClient(IPiecesClient piecesClient, string chatName = "",
// The chat options to configure the request. To use Pieces specific features, set the following
// in the AdditionalProperties collection:
//
// ["LiveContext"] = true/false; // set to true to use live context. Default to false.
// ["LiveContextTimeSpan"] = TimeSpan?; // The timespan to use for live context. Defaults to 15 minutes if not set.
// ["LongTermMemory"] = true/false; // set to true to use Pieces Long-Term Memory. Default to false.
// ["LongTermMemoryTimeSpan"] = TimeSpan?; // The timespan to use for Pieces Long-Term Memory. Defaults to 15 minutes if not set.
// ["AssetIds"] = []; // Set to an enumerable of asset ids to use saved assets in the chat. Default to none.
// ["PersistChat] = true/false; // By defaults these chats are persisted in Pieces. If this is set to false,
// the chat is deleted after the response is returned
@@ -125,8 +125,8 @@ public async Task<ChatCompletion> CompleteAsync(IList<ChatMessage> chatMessages,
// The chat options to configure the request. To use Pieces specific features, set the following
// in the AdditionalProperties collection:
//
// ["LiveContext"] = true/false; // set to true to use live context. Default to false.
// ["LiveContextTimeSpan"] = TimeSpan?; // The timespan to use for live context. Defaults to 15 minutes if not set.
// ["LongTermMemory"] = true/false; // set to true to use Pieces Long-Term Memory. Default to false.
// ["LongTermMemoryTimeSpan"] = TimeSpan?; // The timespan to use for Pieces Long-Term Memory. Defaults to 15 minutes if not set.
// ["AssetIds"] = []; // Set to an enumerable of asset ids to use saved assets in the chat. Default to none.
// ["PersistChat] = true/false; // By defaults these chats are persisted in Pieces. If this is set to false,
// the chat is deleted after the response is returned
@@ -271,8 +271,8 @@ private static ChatContext CreateChatContextFromOptions(ChatOptions? options)
return new ChatContext
{
AssetIds = GetValueFromOptions<IEnumerable<string>>(options, AssetIdsPropertyName),
LiveContext = GetBoolValueFromOptions(options, LiveContextPropertyName),
LiveContextTimeSpan = GetValueFromOptions<TimeSpan?>(options, LiveContextTimeSpanPropertyName, null),
LongTermMemory = GetBoolValueFromOptions(options, LongTermMemoryPropertyName),
LongTermMemoryTimeSpan = GetValueFromOptions<TimeSpan?>(options, LongTermMemoryTimeSpanPropertyName, null),
Files = GetValueFromOptions<IEnumerable<string>>(options, FilesPropertyName),
Folders = GetValueFromOptions<IEnumerable<string>>(options, FoldersPropertyName),
};
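From the caller's side, these constants are the keys to put into `ChatOptions.AdditionalProperties`. A hedged sketch of a non-streaming completion using the renamed keys — it mirrors the commented-out examples in `src/Extensions.Example/Program.cs`; `client` is assumed to be an `IPiecesClient` created elsewhere, and the namespaces are inferred from the package names:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Extensions.AI;
using Pieces.Extensions.AI;

// client is an IPiecesClient obtained elsewhere (see the Extensions.Example project).
IChatClient chatClient = new PiecesChatClient(client, chatName: "Long-Term Memory chat");

var chatMessages = new List<ChatMessage>
{
    new(ChatRole.User, "Describe the file I was just reading in my IDE")
};

// Using the exposed constants avoids hard-coding the renamed property names.
var options = new ChatOptions
{
    AdditionalProperties = new AdditionalPropertiesDictionary
    {
        { PiecesChatClient.LongTermMemoryPropertyName, true },
        { PiecesChatClient.LongTermMemoryTimeSpanPropertyName, TimeSpan.FromHours(1) }
    }
};

var completion = await chatClient.CompleteAsync(chatMessages, options);
Console.WriteLine(completion.Message.Text);
```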