A Node.js + React AI app for managing fictitious insurance claims.
- Node.js 18 or later -- get it from https://nodejs.org/en/download.
- npm 10 or later -- included with Node.js.
- An OpenAI-compatible LLM inference server -- for example, one served with InstructLab.
You can change the connection settings (host, port, and other options) for the LLM and the backend by creating a `.env` file in the root of this repo with the following contents:
```
OPEN_AI_API_KEY = 'EMPTY'
AI_MODEL_TEMPERATURE = 0.9
AI_MODEL_NAME = 'mistral'
AI_BASE_URL = 'http://localhost:8000/v1'
PORT = 8005
```
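For reference, here is a minimal sketch of how the backend might consume these values, assuming it uses the `dotenv` package and the official `openai` Node.js client (both assumptions; check `server.mjs` for the actual wiring):

```js
// config.mjs -- illustrative only; defaults mirror the values documented above
import 'dotenv/config';   // loads .env from the repo root
import OpenAI from 'openai';

export const config = {
  apiKey: process.env.OPEN_AI_API_KEY ?? 'EMPTY',
  temperature: Number(process.env.AI_MODEL_TEMPERATURE ?? 0.9),
  model: process.env.AI_MODEL_NAME ?? 'mistral',
  baseURL: process.env.AI_BASE_URL ?? 'http://localhost:8000/v1',
  port: Number(process.env.PORT ?? 8005),
};

// OpenAI-compatible client pointed at the local inference server
export const client = new OpenAI({
  apiKey: config.apiKey,
  baseURL: config.baseURL,
});
```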
First, get your inference server up and running. With InstructLab, for example, running `ilab serve` starts a server listening on `localhost:8000` by default, which is also the default this app expects.
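Before starting the app, you can confirm the inference server is reachable by querying the standard OpenAI-compatible `/models` endpoint. This small script is only a sanity check and is not part of the repo:

```js
// check-llm.mjs -- verify the inference server answers on its OpenAI-compatible API
const baseUrl = process.env.AI_BASE_URL ?? 'http://localhost:8000/v1';

const res = await fetch(`${baseUrl}/models`);
if (!res.ok) {
  console.error(`Inference server responded with HTTP ${res.status}`);
  process.exit(1);
}
const { data } = await res.json();
console.log('Models served:', data.map((m) => m.id).join(', '));
```

Run it with `node check-llm.mjs`; it should list whatever models the server is exposing.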
The frontend and backend live in separate repositories and should sit side by side, like this:

```
.
├── parasol-insurance
└── parasol-insurance-nodejs
```
Next, clone the web UI repository:

```
git clone https://github.com/rh-rad-ai-roadshow/parasol-insurance.git
```
Then, in the Node.js application repo (this repo), run the `buildui` npm script:

```
npm run buildui
```
Then install dependencies and start the Node.js app:

```
npm install
node server.mjs
```
The app will be available at http://0.0.0.0:8005.
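For orientation, `server.mjs` is what binds the backend to `PORT` and serves the built web UI. The following is only a sketch of that serving side, assuming Express and a `dist/` output directory from `npm run buildui` (both assumptions; the actual file may be organized differently):

```js
// serve-sketch.mjs -- illustrative only: serve the built UI and listen on PORT
import express from 'express';
import path from 'node:path';

const app = express();
const port = Number(process.env.PORT ?? 8005);

// static assets produced by `npm run buildui` (output directory is an assumption)
app.use(express.static(path.resolve('dist')));

app.listen(port, '0.0.0.0', () => {
  console.log(`App listening on http://0.0.0.0:${port}`);
});
```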
Open the app, click on a claim, open the chat, and start asking questions. The claim's context is sent to the LLM along with your query, and the response appears in the chat (this may take a while depending on your machine's performance).
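Under the hood, that chat interaction amounts to a single chat-completion call: the claim details go into the system portion of the prompt and your question is the user message. The handler below is a hedged sketch of that pattern; the endpoint path, request field names, and use of Express are assumptions, not the repo's actual API:

```js
// chat-route sketch: forward claim context plus the user's query to the LLM
import express from 'express';
import OpenAI from 'openai';

const router = express.Router();
const client = new OpenAI({
  apiKey: process.env.OPEN_AI_API_KEY ?? 'EMPTY',
  baseURL: process.env.AI_BASE_URL ?? 'http://localhost:8000/v1',
});

router.post('/api/chat', express.json(), async (req, res) => {
  const { claim, query } = req.body; // field names are illustrative

  const completion = await client.chat.completions.create({
    model: process.env.AI_MODEL_NAME ?? 'mistral',
    temperature: Number(process.env.AI_MODEL_TEMPERATURE ?? 0.9),
    messages: [
      { role: 'system', content: `You are an insurance claims assistant. Claim details:\n${claim}` },
      { role: 'user', content: query },
    ],
  });

  res.json({ answer: completion.choices[0].message.content });
});

export default router;
```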