Mistral Javascript Client

You can use the Mistral Javascript client to interact with the Mistral AI API. This client is inspired by cohere-typescript.

Installing

You can install the library in your project using:

npm install @mistralai/mistralai

Usage

Set up

import MistralClient from '@mistralai/mistralai';

const apiKey = process.env.MISTRAL_API_KEY || 'your_api_key';

const client = new MistralClient(apiKey);

List models

const listModelsResponse = await client.listModels();
const listModels = listModelsResponse.data;
listModels.forEach((model) => {
  console.log('Model:', model);
});
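
Each entry in data is a model object. If you only need the identifiers to pass to the chat or embeddings calls, the id field is what you want; a minimal sketch, assuming the OpenAI-style response shape this API follows:

// Print just the model identifiers, e.g. 'mistral-tiny'.
listModels.forEach((model) => {
  console.log('Model id:', model.id);
});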

Chat with streaming

const chatStreamResponse = await client.chatStream({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'What is the best French cheese?'}],
});

console.log('Chat Stream:');
for await (const chunk of chatStreamResponse) {
  if (chunk.choices[0].delta.content !== undefined) {
    const streamText = chunk.choices[0].delta.content;
    process.stdout.write(streamText);
  }
}
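
As an alternative to the loop above, you can accumulate the streamed deltas into a single string while printing them, for example to store the full answer afterwards. This sketch reuses only the fields shown in the streaming example:

// Collect the streamed deltas into one string while writing them to stdout.
let fullText = '';
for await (const chunk of chatStreamResponse) {
  const delta = chunk.choices[0].delta.content;
  if (delta !== undefined) {
    process.stdout.write(delta);
    fullText += delta;
  }
}
console.log('\nFull response:', fullText);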

Chat without streaming

const chatResponse = await client.chat({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'What is the best French cheese?'}],
});

console.log('Chat:', chatResponse.choices[0].message.content);
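
The chat endpoint also accepts the usual sampling options. The camelCase names below (temperature, maxTokens) follow this client's conventions but are an assumption here, so verify them against src/client.js before relying on them:

// Assumption: temperature and maxTokens are optional parameters of client.chat().
const tunedResponse = await client.chat({
  model: 'mistral-tiny',
  messages: [{role: 'user', content: 'Name three French cheeses.'}],
  temperature: 0.2, // lower temperature for more deterministic output
  maxTokens: 128,   // cap the completion length
});

console.log('Chat:', tunedResponse.choices[0].message.content);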

Embeddings

// A batch is simply an array of strings; add more entries to embed several inputs at once.
const input = ['What is the best French cheese?'];

const embeddingsBatchResponse = await client.embeddings({
  model: 'mistral-embed',
  input: input,
});

console.log('Embeddings Batch:', embeddingsBatchResponse.data);
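
Each item in data carries the vector for the corresponding input string. A short sketch of reading one vector, assuming the embedding field name used in the Mistral API response:

// Assumption: each entry in data has an `embedding` array of floats.
const firstEmbedding = embeddingsBatchResponse.data[0].embedding;
console.log('Vector length:', firstEmbedding.length);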

Run examples

You can run the examples in the examples directory after installing their dependencies:

cd examples
npm install .

API key setup

Running the examples requires a Mistral AI API key.

Get your own Mistral API Key: https://docs.mistral.ai/#api-access

Run the examples

MISTRAL_API_KEY='your_api_key' node chat_with_streaming.js

Persisting the API key in environment

Set your Mistral API Key as an environment variable. You only need to do this once.

# set Mistral API Key (using zsh for example)
$ echo 'export MISTRAL_API_KEY=[your_api_key]' >> ~/.zshenv

# reload the environment (or just quit and open a new terminal)
$ source ~/.zshenv

You can then run the examples without appending the API key:

node chat_with_streaming.js

Once the environment variable is set, the client picks up MISTRAL_API_KEY on its own, so you can construct it without arguments:

import MistralClient from '@mistralai/mistralai';

const client = new MistralClient();