
PR #858 (merged): 839 optionally return chat model in start controller and level controller
4 changes: 3 additions & 1 deletion — backend/src/controller/levelController.ts

```diff
@@ -1,7 +1,7 @@
 import { Response } from 'express';

 import { LevelGetRequest } from '@src/models/api/LevelGetRequest';
-import { isValidLevel } from '@src/models/level';
+import { LEVEL_NAMES, isValidLevel } from '@src/models/level';

 function handleLoadLevel(req: LevelGetRequest, res: Response) {
 	const { level } = req.query;
@@ -20,6 +20,8 @@ function handleLoadLevel(req: LevelGetRequest, res: Response) {
 		emails: req.session.levelState[level].sentEmails,
 		chatHistory: req.session.levelState[level].chatHistory,
 		defences: req.session.levelState[level].defences,
+		chatModel:
+			level === LEVEL_NAMES.SANDBOX ? req.session.chatModel : undefined,
 	});
 }
```
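The conditional in this diff is the heart of the PR: the session's chat model is exposed only for the sandbox level. A minimal standalone sketch of the same pattern (the enum and types here are trimmed-down stand-ins, not the real `@src/models` definitions):

```typescript
// Trimmed-down stand-ins for the real LEVEL_NAMES enum and ChatModel type.
enum LEVEL_NAMES {
	LEVEL_1 = 0,
	LEVEL_2,
	LEVEL_3,
	SANDBOX,
}

interface ChatModel {
	id: string;
}

// Mirrors the controller logic: only the sandbox level exposes the
// session's chat model; every other level gets undefined.
function chatModelForLevel(
	level: LEVEL_NAMES,
	sessionChatModel: ChatModel
): ChatModel | undefined {
	return level === LEVEL_NAMES.SANDBOX ? sessionChatModel : undefined;
}
```

Keeping the field `undefined` outside sandbox means JSON serialization simply omits it, so non-sandbox clients see no change in the response shape.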
7 changes: 1 addition & 6 deletions — backend/src/controller/modelController.ts

```diff
@@ -1,6 +1,5 @@
 import { Response } from 'express';

-import { OpenAIGetModelRequest } from '@src/models/api/OpenAIGetModelRequest';
 import { OpenAiConfigureModelRequest } from '@src/models/api/OpenAiConfigureModelRequest';
 import { OpenAiSetModelRequest } from '@src/models/api/OpenAiSetModelRequest';
 import { MODEL_CONFIG } from '@src/models/chat';
@@ -32,8 +31,4 @@ function handleConfigureModel(req: OpenAiConfigureModelRequest, res: Response) {
 	}
 }

-function handleGetModel(req: OpenAIGetModelRequest, res: Response) {
-	res.send(req.session.chatModel);
-}
-
-export { handleSetModel, handleConfigureModel, handleGetModel };
+export { handleSetModel, handleConfigureModel };
```
2 changes: 2 additions & 0 deletions — backend/src/controller/startController.ts

```diff
@@ -39,6 +39,8 @@ function handleStart(req: StartGetRequest, res: Response) {
 		defences: req.session.levelState[level].defences,
 		availableModels: getValidOpenAIModels(),
 		systemRoles,
+		chatModel:
+			level === LEVEL_NAMES.SANDBOX ? req.session.chatModel : undefined,
 	} as StartResponse);
 }
```
2 changes: 2 additions & 0 deletions — backend/src/models/api/LevelGetRequest.ts

```diff
@@ -1,5 +1,6 @@
 import { Request } from 'express';

+import { ChatModel } from '@src/models/chat';
 import { ChatMessage } from '@src/models/chatMessage';
 import { Defence } from '@src/models/defence';
 import { EmailInfo } from '@src/models/email';
@@ -11,6 +12,7 @@ export type LevelGetRequest = Request<
 		emails: EmailInfo[];
 		chatHistory: ChatMessage[];
 		defences: Defence[];
+		chatModel?: ChatModel;
 	},
 	never,
 	{
```
13 changes: 0 additions & 13 deletions backend/src/models/api/OpenAIGetModelRequest.ts

This file was deleted.

2 changes: 2 additions & 0 deletions — backend/src/models/api/StartGetRequest.ts

```diff
@@ -1,5 +1,6 @@
 import { Request } from 'express';

+import { ChatModel } from '@src/models/chat';
 import { ChatMessage } from '@src/models/chatMessage';
 import { Defence } from '@src/models/defence';
 import { EmailInfo } from '@src/models/email';
@@ -14,6 +15,7 @@ export type StartResponse = {
 		level: LEVEL_NAMES;
 		systemRole: string;
 	}[];
+	chatModel?: ChatModel;
 };

 export type StartGetRequest = Request<
```
2 changes: 0 additions & 2 deletions — backend/src/sessionRoutes.ts

```diff
@@ -18,7 +18,6 @@ import { handleClearEmails } from './controller/emailController';
 import { handleLoadLevel } from './controller/levelController';
 import {
 	handleConfigureModel,
-	handleGetModel,
 	handleSetModel,
 } from './controller/modelController';
 import { handleResetProgress } from './controller/resetController';
@@ -102,7 +101,6 @@ router.post('/openai/addInfoToHistory', handleAddInfoToChatHistory);
 router.post('/openai/clear', handleClearChatHistory);

 // model configurations
-router.get('/openai/model', handleGetModel);
 router.post('/openai/model', handleSetModel);
 router.post('/openai/model/configure', handleConfigureModel);
```
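With `GET /openai/model` gone, a client derives the current model from the start (or level-load) response rather than making a separate request. An illustrative sketch, with the response shape reduced to the two fields this PR touches (names assumed from the diffs above):

```typescript
interface ChatModel {
	id: string;
}

// Reduced StartResponse: just the fields involved in this change.
interface StartResponse {
	availableModels: string[];
	chatModel?: ChatModel;
}

// The id to preselect in the model dropdown: the session's model when the
// backend sent one (sandbox), otherwise nothing is selected yet.
function selectedModelId(response: StartResponse): string | undefined {
	return response.chatModel?.id;
}
```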
7 changes: 7 additions & 0 deletions — frontend/src/components/ControlPanel/ControlPanel.tsx

```diff
@@ -2,6 +2,7 @@ import { DEFENCES_HIDDEN_LEVEL3_IDS, MODEL_DEFENCES } from '@src/Defences';
 import DefenceBox from '@src/components/DefenceBox/DefenceBox';
 import DocumentViewButton from '@src/components/DocumentViewer/DocumentViewButton';
 import ModelBox from '@src/components/ModelBox/ModelBox';
+import { ChatModel } from '@src/models/chat';
 import {
 	DEFENCE_ID,
 	DefenceConfigItem,
@@ -15,6 +16,8 @@ import './ControlPanel.css';
 function ControlPanel({
 	currentLevel,
 	defences,
+	chatModel,
+	setChatModelId,
 	chatModelOptions,
 	toggleDefence,
 	resetDefenceConfiguration,
@@ -24,6 +27,8 @@ }: {
 	currentLevel: LEVEL_NAMES;
 	defences: Defence[];
+	chatModel?: ChatModel;
+	setChatModelId: (modelId: string) => void;
 	chatModelOptions: string[];
 	toggleDefence: (defence: Defence) => void;
 	resetDefenceConfiguration: (
@@ -94,6 +99,8 @@
 			{/* only show model box in sandbox mode */}
 			{showConfigurations && (
 				<ModelBox
+					chatModel={chatModel}
+					setChatModelId={setChatModelId}
 					chatModelOptions={chatModelOptions}
 					addInfoMessage={addInfoMessage}
 				/>
```
8 changes: 7 additions & 1 deletion — frontend/src/components/MainComponent/MainBody.tsx

```diff
@@ -1,7 +1,7 @@
 import ChatBox from '@src/components/ChatBox/ChatBox';
 import ControlPanel from '@src/components/ControlPanel/ControlPanel';
 import EmailBox from '@src/components/EmailBox/EmailBox';
-import { ChatMessage } from '@src/models/chat';
+import { ChatMessage, ChatModel } from '@src/models/chat';
 import {
 	DEFENCE_ID,
 	DefenceConfigItem,
@@ -18,7 +18,9 @@ function MainBody({
 	defences,
 	emails,
 	messages,
+	chatModel,
 	chatModels,
+	setChatModelId,
 	addChatMessage,
 	addInfoMessage,
 	addSentEmails,
@@ -34,6 +36,8 @@
 	defences: Defence[];
 	emails: EmailInfo[];
 	messages: ChatMessage[];
+	chatModel?: ChatModel;
+	setChatModelId: (modelId: string) => void;
 	chatModels: string[];
 	addChatMessage: (message: ChatMessage) => void;
 	addInfoMessage: (message: string) => void;
@@ -59,6 +63,8 @@
 		<ControlPanel
 			currentLevel={currentLevel}
 			defences={defences}
+			chatModel={chatModel}
+			setChatModelId={setChatModelId}
 			chatModelOptions={chatModels}
 			toggleDefence={toggleDefence}
 			resetDefenceConfiguration={resetDefenceConfiguration}
```
46 changes: 37 additions & 9 deletions — frontend/src/components/MainComponent/MainComponent.tsx

```diff
@@ -4,7 +4,7 @@ import { DEFAULT_DEFENCES } from '@src/Defences';
 import HandbookOverlay from '@src/components/HandbookOverlay/HandbookOverlay';
 import LevelMissionInfoBanner from '@src/components/LevelMissionInfoBanner/LevelMissionInfoBanner';
 import ResetLevelOverlay from '@src/components/Overlay/ResetLevel';
-import { ChatMessage } from '@src/models/chat';
+import { ChatMessage, ChatModel } from '@src/models/chat';
 import {
 	DEFENCE_ID,
 	DefenceConfigItem,
@@ -58,6 +58,7 @@ function MainComponent({
 	const [emails, setEmails] = useState<EmailInfo[]>([]);
 	const [chatModels, setChatModels] = useState<string[]>([]);
 	const [systemRoles, setSystemRoles] = useState<LevelSystemRole[]>([]);
+	const [chatModel, setChatModel] = useState<ChatModel | undefined>(undefined);

 	const isFirstRender = useRef(true);
@@ -81,11 +82,23 @@
 	async function loadBackendData() {
 		try {
-			const { availableModels, defences, emails, chatHistory, systemRoles } =
-				await startService.start(currentLevel);
+			const {
+				availableModels,
+				defences,
+				emails,
+				chatHistory,
+				systemRoles,
+				chatModel,
+			} = await startService.start(currentLevel);
 			setChatModels(availableModels);
 			setSystemRoles(systemRoles);
-			processBackendLevelData(currentLevel, emails, chatHistory, defences);
+			processBackendLevelData(
+				currentLevel,
+				emails,
+				chatHistory,
+				defences,
+				chatModel
+			);
 		} catch (err) {
 			console.warn(err);
 			setMessages([
@@ -152,17 +165,17 @@
 	// for switching level without clearing progress
 	async function setNewLevel(newLevel: LEVEL_NAMES) {
-		const { emails, chatHistory, defences } = await levelService.loadLevel(
-			newLevel
-		);
-		processBackendLevelData(newLevel, emails, chatHistory, defences);
+		const { emails, chatHistory, defences, chatModel } =
+			await levelService.loadLevel(newLevel);
+		processBackendLevelData(newLevel, emails, chatHistory, defences, chatModel);
 	}

 	function processBackendLevelData(
 		level: LEVEL_NAMES,
 		emails: EmailInfo[],
 		chatHistory: ChatMessage[],
-		defences: Defence[]
+		defences: Defence[],
+		chatModel?: ChatModel
 	) {
 		setEmails(emails);
@@ -172,6 +185,9 @@
 			: setMessages(chatHistory);

 		setDefences(defences);
+
+		// we will only update the chatModel if it is defined in the backend
+		// response; it will only be defined for the sandbox level
+		setChatModel(chatModel);
 		setMainBodyKey(MainBodyKey + 1);
 	}
```
```diff
@@ -289,6 +305,16 @@
 		setMessages((messages: ChatMessage[]) => [resetMessage, ...messages]);
 	}

+	function setChatModelId(modelId: string) {
+		if (!chatModel) {
+			console.error(
+				'You are trying to change the id of the chatModel but it has not been loaded yet'
+			);
+			return;
+		}
+		setChatModel({ ...chatModel, id: modelId });
+	}
+
 	return (
 		<div className="main-component">
 			<MainHeader
```

Review thread on this change:

**chriswilty (Member), 11 Mar 2024:** Not convinced this error message will help a future developer work out what's going on here. Can you come up with something indicating we only expect the chat model to be changed on the Sandbox level? Or, if this genuinely can happen, then I don't think we want to render the dropdown model selector until we have the model response, but I was under the impression that chatModel and the available models now arrive in the same response.

**Author (Contributor):** The available models arrive in the start response, along with chatModel. So if the game is loaded or refreshed on sandbox, the available models and chatModel arrive at the same time. However, if the user switches from another level to sandbox, there is a (usually negligible) window where the model dropdown and choose button are rendered but the chatModel is not yet loaded.

**Author (Contributor):** Okay, now conditionally rendering the model selection depending on chatModel being loaded. Wondering whether, if we're doing this, we should also skip rendering the model configuration sliders... [screenshot: model configuration sliders]

**chriswilty (Member):** Right, that makes sense now 😅 If we're going down this route, then we should conditionally render the sliders as you suggest. Can you see how that looks in the code, and check it's not too gnarly?

**chriswilty (Member):** As discussed in the standup, we'll leave this for a future issue, as it has some (re)design implications.
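The guard in `setChatModelId` can be read as a pure state transition: updating the id is a no-op until the backend has supplied a chat model, which (per the thread above) only happens on sandbox. A sketch of that transition, detached from React state (function name is illustrative):

```typescript
interface ChatModel {
	id: string;
}

// Pure version of the setChatModelId guard: no current model, no update.
function nextChatModel(
	current: ChatModel | undefined,
	modelId: string
): ChatModel | undefined {
	if (!current) {
		// Outside sandbox, or before the start response has arrived,
		// there is nothing to update.
		return undefined;
	}
	// Replace only the id, preserving any other model fields.
	return { ...current, id: modelId };
}
```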

```diff
@@ -311,6 +337,8 @@
 			key={MainBodyKey}
 			currentLevel={currentLevel}
 			defences={defences}
+			chatModel={chatModel}
+			setChatModelId={setChatModelId}
 			chatModels={chatModels}
 			emails={emails}
 			messages={messages}
```
13 changes: 12 additions & 1 deletion — frontend/src/components/ModelBox/ModelBox.tsx

```diff
@@ -1,22 +1,33 @@
+import { ChatModel } from '@src/models/chat';
+
 import ModelConfiguration from './ModelConfiguration';
 import ModelSelection from './ModelSelection';

 import './ModelBox.css';

 function ModelBox({
+	chatModel,
+	setChatModelId,
 	chatModelOptions,
 	addInfoMessage,
 }: {
+	chatModel?: ChatModel;
+	setChatModelId: (modelId: string) => void;
 	chatModelOptions: string[];
 	addInfoMessage: (message: string) => void;
 }) {
 	return (
 		<div className="model-box">
 			<ModelSelection
+				chatModel={chatModel}
+				setChatModelId={setChatModelId}
 				chatModelOptions={chatModelOptions}
 				addInfoMessage={addInfoMessage}
 			/>
-			<ModelConfiguration addInfoMessage={addInfoMessage} />
+			<ModelConfiguration
+				chatModel={chatModel}
+				addInfoMessage={addInfoMessage}
+			/>
 		</div>
 	);
 }
```