741 fix backend chat history message types #803

Merged: 90 commits, Feb 6, 2024

Changes from 86 commits

Commits
46c6f33
combines two separate bits of logic for winning level in processChatR…
pmarsh-scottlogic Jan 15, 2024
ac5130c
renames increamentNumCompletedLevels to updateNumCompletedLevels
pmarsh-scottlogic Jan 15, 2024
59cba81
renames ChatHistoryMessage to ChatMessageDTO
pmarsh-scottlogic Jan 15, 2024
293bd8d
refactors getChatHistory
pmarsh-scottlogic Jan 15, 2024
2c34238
further refactors getChatHistory to remove immutability
pmarsh-scottlogic Jan 15, 2024
a47bcab
refactors makeChatMessageFromDTO
pmarsh-scottlogic Jan 15, 2024
d663d97
removed some outdated comments
pmarsh-scottlogic Jan 17, 2024
8dd475f
adds reminder comment and transformedMessage as propery to chatHistor…
pmarsh-scottlogic Jan 17, 2024
7b34914
Merge branch 'dev' into 741-transformed-messages-not-showing-correctly
pmarsh-scottlogic Jan 23, 2024
a2a9aca
sets transformed message inn chat history
pmarsh-scottlogic Jan 23, 2024
0f2ff35
adds ability to retrieve transformed message from the dto
pmarsh-scottlogic Jan 23, 2024
b6116fa
add the transformed info message in backend rather than frontend
pmarsh-scottlogic Jan 25, 2024
edc5d42
Merge branch 'dev' into 741-transformed-messages-not-showing-correctly
pmarsh-scottlogic Jan 25, 2024
816f394
adds the transformedMessageInfo to the chat response so it can be sho…
pmarsh-scottlogic Jan 25, 2024
0a9dacf
combine message transformation objects into one object
pmarsh-scottlogic Jan 25, 2024
527cde6
tidies random sequence transformation test
pmarsh-scottlogic Jan 25, 2024
b506fc6
finalise random sequence transformation test
pmarsh-scottlogic Jan 25, 2024
2feaa6a
tidy up xml tagging transformation test
pmarsh-scottlogic Jan 25, 2024
8304931
tidy up xml tagging transformation test with escaping
pmarsh-scottlogic Jan 25, 2024
1452f13
removes unnecessary test and reorders
pmarsh-scottlogic Jan 25, 2024
2aa5512
moves no transformation into transformation test block and removes un…
pmarsh-scottlogic Jan 25, 2024
53b471f
moves transform message tests into separate test file
pmarsh-scottlogic Jan 25, 2024
10d5304
remove isTriggered from defence object
pmarsh-scottlogic Jan 25, 2024
f22ffda
complete message transformation test
pmarsh-scottlogic Jan 25, 2024
98b0b0a
renames chatHisotryMessage to just chatMEssage
pmarsh-scottlogic Jan 30, 2024
b141f5e
adds type ChatMessageUserTransformed
pmarsh-scottlogic Jan 30, 2024
8c92c8e
incorporates type ChatMessageUserTransformed
pmarsh-scottlogic Jan 30, 2024
59acc03
adds comment to track progress
pmarsh-scottlogic Jan 30, 2024
8503e6a
adds chatInfoMessage
pmarsh-scottlogic Jan 30, 2024
fbfd014
renames chatUserTransformedMessage and chatGenericMessage
pmarsh-scottlogic Jan 30, 2024
8b61f37
incorporates chatInfoMessage
pmarsh-scottlogic Jan 30, 2024
6287f23
adds chatUserMessage
pmarsh-scottlogic Jan 30, 2024
56d7c04
adds chatUserMessage
pmarsh-scottlogic Jan 30, 2024
27885ff
incorporates chatUserMessage
pmarsh-scottlogic Jan 30, 2024
0197422
changes ChatGPTReply type to chatCompletionAssistantMessageParam
pmarsh-scottlogic Jan 30, 2024
30ca67d
adds and incorporates chatBotMessage
pmarsh-scottlogic Jan 30, 2024
abf0121
removes useless comment
pmarsh-scottlogic Jan 30, 2024
000e3a2
moves chat types to separate folder
pmarsh-scottlogic Jan 30, 2024
c29f7fc
fix imports for chatMessage
pmarsh-scottlogic Jan 30, 2024
2535968
improve check for system role
pmarsh-scottlogic Jan 30, 2024
3463cb9
adds ChatSystemMessage
pmarsh-scottlogic Jan 30, 2024
9681001
remove the generic message type
pmarsh-scottlogic Jan 30, 2024
3628697
adds function call chat message type
pmarsh-scottlogic Jan 30, 2024
0eef139
hack it together
pmarsh-scottlogic Jan 30, 2024
aec1ca0
fix chatController integration test according to new types
pmarsh-scottlogic Jan 30, 2024
d3b8428
fix controller unit tests
pmarsh-scottlogic Jan 30, 2024
106e517
fix remaining tests
pmarsh-scottlogic Jan 30, 2024
7dd6c8d
checks chatMessageTpye rather than existence of property
pmarsh-scottlogic Jan 31, 2024
3f9a78e
unscuffs openai integration tests
pmarsh-scottlogic Jan 31, 2024
21ed43c
use undefined instead of null for transofrmed messages
pmarsh-scottlogic Jan 31, 2024
4fa8eb0
updates test
pmarsh-scottlogic Jan 31, 2024
f4cd714
Merge branch 'dev' into 741-transformed-messages-not-showing-correctly
pmarsh-scottlogic Jan 31, 2024
1d77ddc
removes isTriggered from test to make it pass
pmarsh-scottlogic Jan 31, 2024
423f6fb
merge dev
pmarsh-scottlogic Jan 31, 2024
72b8e3f
fix some type errors
pmarsh-scottlogic Jan 31, 2024
2cecc72
merge 741-transformed-messages-not-showing-correctly into this branch
pmarsh-scottlogic Jan 31, 2024
8f42a46
fix some types
pmarsh-scottlogic Jan 31, 2024
632bf9b
move test names to single lines
pmarsh-scottlogic Jan 31, 2024
267c698
fixes a bug which was making a test fail
pmarsh-scottlogic Jan 31, 2024
2c11259
adds unknown message whne blockedReason is missing.
pmarsh-scottlogic Jan 31, 2024
3750a18
replace map with reduce for shortening
pmarsh-scottlogic Jan 31, 2024
f8712b2
refactors chat.ts puch Message to history
pmarsh-scottlogic Jan 31, 2024
8d620f5
implements undefined tricks
pmarsh-scottlogic Feb 2, 2024
99db244
merge 741-transformed
pmarsh-scottlogic Feb 2, 2024
5a6cdb6
simplifies pushMessageToHistory
pmarsh-scottlogic Feb 2, 2024
4bac47e
merge dev
pmarsh-scottlogic Feb 5, 2024
2592a12
remove broken, duplicate test
pmarsh-scottlogic Feb 5, 2024
5a3b289
cleanup diff
pmarsh-scottlogic Feb 5, 2024
6461c5b
remove comment
pmarsh-scottlogic Feb 5, 2024
726e974
rename message to infoMessage in OpenAiAddHistoryRequest
pmarsh-scottlogic Feb 5, 2024
b58bb5c
start converting message type enum to string
pmarsh-scottlogic Feb 5, 2024
7af7dd7
fix up failing tests
pmarsh-scottlogic Feb 5, 2024
f083431
replace enums in frontend
pmarsh-scottlogic Feb 5, 2024
facaaaa
merge dev
pmarsh-scottlogic Feb 5, 2024
b85744c
rename endpoint
pmarsh-scottlogic Feb 5, 2024
0b451c5
fix frontend request
pmarsh-scottlogic Feb 5, 2024
1a2756d
fixes bug where transformed message and info would disappear on refre…
pmarsh-scottlogic Feb 5, 2024
5c8ca5c
adds unit test to catch that bug I just fixed
pmarsh-scottlogic Feb 5, 2024
5574f1b
separate out chat message types that only have info associated with
pmarsh-scottlogic Feb 5, 2024
719ae0a
move chatMessageTypesAsInfo into string literal array for api runtime…
pmarsh-scottlogic Feb 5, 2024
6121cf5
adds type check to handleAddToChatHistoryAsInfo
pmarsh-scottlogic Feb 5, 2024
1cdc881
removes superefluous strings from type union
pmarsh-scottlogic Feb 5, 2024
0f56376
tidying
pmarsh-scottlogic Feb 6, 2024
0080627
remanes INFO to GENERIC_INFO
pmarsh-scottlogic Feb 6, 2024
32e2346
renames all asInfo things to addInfoMessage
pmarsh-scottlogic Feb 6, 2024
84cb2c1
use IsSystemMessage for splicing
pmarsh-scottlogic Feb 6, 2024
8c406c8
shortens a decleration with destructuring
pmarsh-scottlogic Feb 6, 2024
98e79eb
pluralises chat info message type
pmarsh-scottlogic Feb 6, 2024
562c6ec
sorts out adding blocked message as info message
pmarsh-scottlogic Feb 6, 2024
d6622ae
fix tests according to new type change
pmarsh-scottlogic Feb 6, 2024
61 changes: 30 additions & 31 deletions backend/src/controller/chatController.ts
@@ -5,20 +5,23 @@ import {
detectTriggeredInputDefences,
detectTriggeredOutputDefences,
} from '@src/defence';
import { OpenAiAddHistoryRequest } from '@src/models/api/OpenAiAddHistoryRequest';
import { OpenAiAddInfoToChatHistoryRequest } from '@src/models/api/OpenAiAddInfoToChatHistoryRequest';
import { OpenAiChatRequest } from '@src/models/api/OpenAiChatRequest';
import { OpenAiClearRequest } from '@src/models/api/OpenAiClearRequest';
import { OpenAiGetHistoryRequest } from '@src/models/api/OpenAiGetHistoryRequest';
import {
CHAT_MESSAGE_TYPE,
ChatDefenceReport,
ChatHistoryMessage,
ChatHttpResponse,
ChatModel,
LevelHandlerResponse,
MessageTransformation,
defaultChatModel,
} from '@src/models/chat';
import {
ChatMessage,
ChatInfoMessage,
chatInfoMessageType,
} from '@src/models/chatMessage';
import { Defence } from '@src/models/defence';
import { EmailInfo } from '@src/models/email';
import { LEVEL_NAMES } from '@src/models/level';
@@ -47,25 +50,23 @@ function combineChatDefenceReports(
function createNewUserMessages(
message: string,
messageTransformation?: MessageTransformation
): ChatHistoryMessage[] {
): ChatMessage[] {
if (messageTransformation) {
return [
{
completion: null,
chatMessageType: CHAT_MESSAGE_TYPE.USER,
chatMessageType: 'USER',
infoMessage: message,
},
{
completion: null,
chatMessageType: CHAT_MESSAGE_TYPE.INFO,
chatMessageType: 'GENERIC_INFO',
infoMessage: messageTransformation.transformedMessageInfo,
},
{
completion: {
role: 'user',
content: messageTransformation.transformedMessageCombined,
},
chatMessageType: CHAT_MESSAGE_TYPE.USER_TRANSFORMED,
chatMessageType: 'USER_TRANSFORMED',
transformedMessage: messageTransformation.transformedMessage,
},
];
@@ -76,7 +77,7 @@
role: 'user',
content: message,
},
chatMessageType: CHAT_MESSAGE_TYPE.USER,
chatMessageType: 'USER',
},
];
}
@@ -87,7 +88,7 @@ async function handleChatWithoutDefenceDetection(
chatResponse: ChatHttpResponse,
currentLevel: LEVEL_NAMES,
chatModel: ChatModel,
chatHistory: ChatHistoryMessage[],
chatHistory: ChatMessage[],
defences: Defence[]
): Promise<LevelHandlerResponse> {
const updatedChatHistory = createNewUserMessages(message).reduce(
@@ -122,7 +123,7 @@ async function handleChatWithDefenceDetection(
chatResponse: ChatHttpResponse,
currentLevel: LEVEL_NAMES,
chatModel: ChatModel,
chatHistory: ChatHistoryMessage[],
chatHistory: ChatMessage[],
defences: Defence[]
): Promise<LevelHandlerResponse> {
const messageTransformation = transformMessage(message, defences);
@@ -162,11 +163,7 @@ async function handleChatWithDefenceDetection(

// if blocked, restore original chat history and add user message to chat history without completion
const updatedChatHistory = combinedDefenceReport.isBlocked
? pushMessageToHistory(chatHistory, {
completion: null,
chatMessageType: CHAT_MESSAGE_TYPE.USER,
infoMessage: message,
})
? chatHistoryWithNewUserMessages
Member: I've lost track of this PR a touch! Looks like you're keeping the transformed message now even if it was blocked, so I'm assuming that's a deliberate change.

Contributor Author: Ah, it was not so deliberate. I want the new messages in the history, but I do not want the completion there. I shall change that now.

Contributor Author: So now we need a version of type ChatUserTransformedMessage which doesn't include a completion. I was in two minds about whether I should just make the completion optional, or to make a whole new type that doesn't have the completion. I went with the former for lightweightness, but very happy to try the latter.

: openAiReply.chatHistory;

const updatedChatResponse: ChatHttpResponse = {
@@ -284,9 +281,10 @@ async function handleChatToGPT(req: OpenAiChatRequest, res: Response) {
if (updatedChatResponse.defenceReport.isBlocked) {
// chatReponse.reply is empty if blocked
updatedChatHistory = pushMessageToHistory(updatedChatHistory, {
completion: null,
chatMessageType: CHAT_MESSAGE_TYPE.BOT_BLOCKED,
infoMessage: updatedChatResponse.defenceReport.blockedReason,
chatMessageType: 'BOT_BLOCKED',
infoMessage:
updatedChatResponse.defenceReport.blockedReason ??
'block reason unknown',
});
} else if (updatedChatResponse.openAIErrorMessage) {
const errorMsg = simplifyOpenAIErrorMessage(
@@ -307,13 +305,12 @@ async function handleChatToGPT(req: OpenAiChatRequest, res: Response) {
handleChatError(res, updatedChatResponse, errorMsg, 500);
return;
} else {
// add bot message to chat history
updatedChatHistory = pushMessageToHistory(updatedChatHistory, {
completion: {
role: 'assistant',
content: updatedChatResponse.reply,
},
chatMessageType: CHAT_MESSAGE_TYPE.BOT,
chatMessageType: 'BOT',
});
}

@@ -339,13 +336,12 @@ function simplifyOpenAIErrorMessage(openAIErrorMessage: string) {
}

function addErrorToChatHistory(
chatHistory: ChatHistoryMessage[],
chatHistory: ChatMessage[],
errorMessage: string
): ChatHistoryMessage[] {
): ChatMessage[] {
console.error(errorMessage);
return pushMessageToHistory(chatHistory, {
completion: null,
chatMessageType: CHAT_MESSAGE_TYPE.ERROR_MSG,
chatMessageType: 'ERROR_MSG',
infoMessage: errorMessage,
});
}
@@ -360,23 +356,26 @@ function handleGetChatHistory(req: OpenAiGetHistoryRequest, res: Response) {
}
}

function handleAddToChatHistory(req: OpenAiAddHistoryRequest, res: Response) {
const infoMessage = req.body.message;
function handleAddInfoToChatHistory(
req: OpenAiAddInfoToChatHistoryRequest,
res: Response
) {
Member: The more I think about this separate API call, the weirder it seems. In theory it allows us to inject an arbitrary message into the chat, of any chat message type. OK, we can't add a completion, but we could add a "BOT" message with an infoMessage property. Will the ChatGPT call simply ignore that message, as it has no completion property?

Anyhow, nothing to do here, just blurting out my thoughts as they form 😅

Contributor Author: Well, not any more! They can only inject one that makes sense with the new checks I've added, namely:

if (... &&
  chatInfoMessageType.includes(chatMessageType) &&
  ...
) {
  // do the stuff
}

const infoMessage = req.body.infoMessage;
const chatMessageType = req.body.chatMessageType;
const level = req.body.level;
if (
infoMessage &&
chatMessageType &&
chatInfoMessageType.includes(chatMessageType) &&
level !== undefined &&
level >= LEVEL_NAMES.LEVEL_1
) {
req.session.levelState[level].chatHistory = pushMessageToHistory(
req.session.levelState[level].chatHistory,
{
completion: null,
chatMessageType,
infoMessage,
}
} as ChatInfoMessage
);
res.send();
} else {
@@ -400,6 +399,6 @@ function handleClearChatHistory(req: OpenAiClearRequest, res: Response) {
export {
handleChatToGPT,
handleGetChatHistory,
handleAddToChatHistory,
handleAddInfoToChatHistory,
handleClearChatHistory,
};
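
A note on the new runtime check: handleAddInfoToChatHistory now validates the incoming chatMessageType against the chatInfoMessageType array before anything is pushed into the session history, which addresses the reviewer's concern above about injecting arbitrary message types. Below is a minimal standalone sketch of that kind of guard, assuming only the exports from chatMessage.ts shown later in this diff; the helper names are illustrative rather than taken from the repo.

import {
  chatInfoMessageType,
  CHAT_INFO_MESSAGE_TYPE,
  ChatInfoMessage,
} from '@src/models/chatMessage';

// Narrow an arbitrary string coming off the wire to the CHAT_INFO_MESSAGE_TYPE union.
function isChatInfoMessageType(value: string): value is CHAT_INFO_MESSAGE_TYPE {
  return (chatInfoMessageType as readonly string[]).includes(value);
}

// Build a ChatInfoMessage only when the type passes the runtime check;
// anything else (e.g. 'BOT' or 'SYSTEM') is rejected rather than injected into history.
function buildInfoMessage(
  chatMessageType: string,
  infoMessage: string
): ChatInfoMessage | undefined {
  return isChatInfoMessageType(chatMessageType)
    ? { chatMessageType, infoMessage }
    : undefined;
}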
16 changes: 0 additions & 16 deletions backend/src/models/api/OpenAiAddHistoryRequest.ts

This file was deleted.

16 changes: 16 additions & 0 deletions backend/src/models/api/OpenAiAddInfoToChatHistoryRequest.ts
@@ -0,0 +1,16 @@
import { Request } from 'express';

import { CHAT_INFO_MESSAGE_TYPE } from '@src/models/chatMessage';
import { LEVEL_NAMES } from '@src/models/level';

export type OpenAiAddInfoToChatHistoryRequest = Request<
never,
never,
{
chatMessageType?: CHAT_INFO_MESSAGE_TYPE;
infoMessage?: string;
level?: LEVEL_NAMES;
},
never,
never
>;
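
For readers less familiar with Express typings: the five generic parameters on Request are route params, response body, request body, query, and locals, so this type only constrains the request body and leaves every field optional for the handler to validate. A hedged sketch of how a handler might be wired to this type follows; the route path and function name are made up for illustration and are not taken from this PR.

import { Response, Router } from 'express';

import { OpenAiAddInfoToChatHistoryRequest } from '@src/models/api/OpenAiAddInfoToChatHistoryRequest';

const router = Router();

function handleAddInfoExample(
  req: OpenAiAddInfoToChatHistoryRequest,
  res: Response
) {
  const { chatMessageType, infoMessage, level } = req.body;
  // Every body field is optional in the type, so reject incomplete requests up front.
  if (!chatMessageType || !infoMessage || level === undefined) {
    res.status(400).send();
    return;
  }
  // ...push the info message into the session's chat history, as handleAddInfoToChatHistory does above.
  res.send();
}

// Illustrative path only; the real route registration lives elsewhere in the backend.
router.post('/openai/addInfoToHistory', handleAddInfoExample);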
4 changes: 2 additions & 2 deletions backend/src/models/api/OpenAiGetHistoryRequest.ts
@@ -1,10 +1,10 @@
import { Request } from 'express';

import { ChatHistoryMessage } from '@src/models/chat';
import { ChatMessage } from '@src/models/chatMessage';

export type OpenAiGetHistoryRequest = Request<
never,
ChatHistoryMessage[] | string,
ChatMessage[] | string,
never,
{
level?: string;
39 changes: 8 additions & 31 deletions backend/src/models/chat.ts
@@ -1,8 +1,9 @@
import {
ChatCompletionMessage,
ChatCompletionAssistantMessageParam,
ChatCompletionMessageParam,
} from 'openai/resources/chat/completions';

import { ChatMessage } from './chatMessage';
import { DEFENCE_ID } from './defence';
import { EmailInfo } from './email';

@@ -16,21 +17,6 @@ enum CHAT_MODELS {
GPT_3_5_TURBO_16K_0613 = 'gpt-3.5-turbo-16k-0613',
}

enum CHAT_MESSAGE_TYPE {
BOT,
BOT_BLOCKED,
INFO,
USER,
USER_TRANSFORMED,
LEVEL_INFO,
DEFENCE_ALERTED,
DEFENCE_TRIGGERED,
SYSTEM,
FUNCTION_CALL,
ERROR_MSG,
RESET_LEVEL,
}

enum MODEL_CONFIG {
TEMPERATURE = 'temperature',
TOP_P = 'topP',
@@ -72,7 +58,7 @@ interface FunctionCallResponse {
interface ToolCallResponse {
functionCallReply?: FunctionCallResponse;
chatResponse?: ChatResponse;
chatHistory: ChatHistoryMessage[];
chatHistory: ChatMessage[];
}

interface ChatAnswer {
@@ -92,8 +78,8 @@ interface ChatResponse {
}

interface ChatGptReply {
chatHistory: ChatHistoryMessage[];
completion: ChatCompletionMessage | null;
chatHistory: ChatMessage[];
completion: ChatCompletionAssistantMessageParam | null;
openAIErrorMessage: string | null;
}

@@ -123,17 +109,9 @@ interface ChatHttpResponse {

interface LevelHandlerResponse {
chatResponse: ChatHttpResponse;
chatHistory: ChatHistoryMessage[];
}

interface ChatHistoryMessage {
completion: ChatCompletionMessageParam | null;
chatMessageType: CHAT_MESSAGE_TYPE;
infoMessage?: string | null;
transformedMessage?: TransformedChatMessage;
chatHistory: ChatMessage[];
}

// default settings for chat model
const defaultChatModel: ChatModel = {
id: CHAT_MODELS.GPT_3_5_TURBO,
configuration: {
@@ -154,11 +132,10 @@ export type {
ChatResponse,
LevelHandlerResponse,
ChatHttpResponse,
ChatHistoryMessage,
SingleDefenceReport,
TransformedChatMessage,
FunctionCallResponse,
ToolCallResponse,
MessageTransformation,
SingleDefenceReport,
};
export { CHAT_MODELS, CHAT_MESSAGE_TYPE, MODEL_CONFIG, defaultChatModel };
export { CHAT_MODELS, MODEL_CONFIG, defaultChatModel };
70 changes: 70 additions & 0 deletions backend/src/models/chatMessage.ts
@@ -0,0 +1,70 @@
import {
ChatCompletionAssistantMessageParam,
ChatCompletionMessageParam,
ChatCompletionSystemMessageParam,
ChatCompletionUserMessageParam,
} from 'openai/resources/chat/completions';

import { TransformedChatMessage } from './chat';

const chatInfoMessageType = [
'DEFENCE_ALERTED',
'DEFENCE_TRIGGERED',
'LEVEL_INFO',
'RESET_LEVEL',
'ERROR_MSG',
'BOT_BLOCKED',
'USER',
'GENERIC_INFO',
] as const;

type CHAT_INFO_MESSAGE_TYPE = (typeof chatInfoMessageType)[number];

type ChatInfoMessage = {
chatMessageType: CHAT_INFO_MESSAGE_TYPE;
infoMessage: string;
};

type ChatFunctionCallMessage = {
completion: ChatCompletionMessageParam;
chatMessageType: 'FUNCTION_CALL';
};

type ChatSystemMessage = {
completion: ChatCompletionSystemMessageParam;
chatMessageType: 'SYSTEM';
};

type ChatBotMessage = {
completion: ChatCompletionAssistantMessageParam;
chatMessageType: 'BOT';
};

type ChatUserMessageAsCompletion = {
completion: ChatCompletionUserMessageParam;
chatMessageType: 'USER';
};

type ChatUserTransformedMessage = {
completion: ChatCompletionUserMessageParam;
chatMessageType: 'USER_TRANSFORMED';
transformedMessage: TransformedChatMessage;
};

type ChatCompletionMessage =
| ChatFunctionCallMessage
| ChatSystemMessage
| ChatBotMessage
| ChatUserMessageAsCompletion
| ChatUserTransformedMessage;

type ChatMessage = ChatInfoMessage | ChatCompletionMessage;

export type {
ChatMessage,
ChatSystemMessage,
ChatInfoMessage,
CHAT_INFO_MESSAGE_TYPE,
};

export { chatInfoMessageType };
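
Because every variant of ChatMessage discriminates on chatMessageType, consumers can switch on that field instead of checking for the existence of properties, which matches the intent of the commit above about checking chatMessageType rather than property existence. The helper below is a minimal sketch, not a function from the repo; note that 'USER' appears in both halves of the union, so that one case still needs a property check.

import { ChatCompletionMessageParam } from 'openai/resources/chat/completions';

import { ChatMessage } from '@src/models/chatMessage';

// Collect the OpenAI completions out of a mixed history; info-only messages carry no completion.
function getCompletionsFromHistory(history: ChatMessage[]): ChatCompletionMessageParam[] {
  return history.reduce<ChatCompletionMessageParam[]>((completions, message) => {
    switch (message.chatMessageType) {
      case 'SYSTEM':
      case 'BOT':
      case 'USER_TRANSFORMED':
      case 'FUNCTION_CALL':
        // These variants always carry a completion, so the compiler allows direct access.
        return [...completions, message.completion];
      case 'USER':
        // 'USER' is both an info type and a completion type, so disambiguate by property.
        return 'completion' in message
          ? [...completions, message.completion]
          : completions;
      default:
        // Remaining ChatInfoMessage variants ('GENERIC_INFO', 'DEFENCE_TRIGGERED', ...) are skipped.
        return completions;
    }
  }, []);
}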