`genkitx-github` is a community plugin for using GitHub Models APIs with Firebase Genkit. Built by Xavier Portilla Edo.
This Genkit plugin allows you to use GitHub models through their official APIs.
Install the plugin in your project with your favorite package manager:
```bash
npm install genkitx-github
```

```bash
pnpm add genkitx-github
```
If you are using Genkit version `<v0.9.0`, please use plugin version `v1.9.0`. If you are using Genkit `>=v0.9.0`, please use plugin version `>=v1.10.0`.
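For example, if you are on Genkit `>=v0.9.0`, you can pin the plugin to a compatible release when installing (the range below is just an illustration of the compatibility note above):

```bash
npm install genkitx-github@^1.10.0
```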
To use the plugin, you need to configure it with your GitHub token. You can do this by calling the `genkit` function:
```typescript
import { genkit, z } from 'genkit';
import { github, openAIGpt4o } from 'genkitx-github';

const ai = genkit({
  plugins: [
    github({
      githubToken: '<my-github-token>',
    }),
  ],
  model: openAIGpt4o,
});
```
You can also initialize the plugin in this way if you have set the `GITHUB_TOKEN` environment variable:
```typescript
import { genkit, z } from 'genkit';
import { github, openAIGpt4o } from 'genkitx-github';

const ai = genkit({
  plugins: [github()],
  model: openAIGpt4o,
});
```
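For example, on a Unix-like shell you can export the token before starting your app (the value is a placeholder):

```bash
export GITHUB_TOKEN=<my-github-token>
```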
The simplest way to call the text generation model is by using the helper function `generate`:
```typescript
import { genkit, z } from 'genkit';
import { github, openAIGpt4o } from 'genkitx-github';

// Basic usage of an LLM
const response = await ai.generate({
  prompt: 'Tell me a joke.',
});

console.log(response.text);
```
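If you want tokens as they arrive instead of a single final answer, Genkit also offers a streaming helper. The sketch below uses `ai.generateStream` and assumes Genkit `>=v0.9.0`, where chunks expose a `text` property:

```typescript
// Stream the response chunk by chunk instead of waiting for the full text.
const { stream, response } = await ai.generateStream({
  prompt: 'Tell me a long joke.',
});

for await (const chunk of stream) {
  process.stdout.write(chunk.text);
}

// The aggregated response is still available once the stream ends.
const full = await response;
console.log('\nTotal characters:', full.text.length);
```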
You can also use the model within a flow:

```typescript
// ...configure Genkit (as shown above)...

export const myFlow = ai.defineFlow(
  {
    name: 'menuSuggestionFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await ai.generate({
      prompt: `Suggest an item for the menu of a ${subject} themed restaurant`,
    });
    return llmResponse.text;
  }
);
```
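Once defined, the flow can be invoked like an ordinary async function (the subject below is just an example value):

```typescript
const suggestion = await myFlow('pirate');
console.log(suggestion);
```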
Tool calling is also supported:

```typescript
// ...configure Genkit (as shown above)...

const specialToolInputSchema = z.object({
  meal: z.enum(['breakfast', 'lunch', 'dinner']),
});

const specialTool = ai.defineTool(
  {
    name: 'specialTool',
    description: "Retrieves today's special for the given meal",
    inputSchema: specialToolInputSchema,
    outputSchema: z.string(),
  },
  async ({ meal }): Promise<string> => {
    // Retrieve up-to-date information and return it. Here, we just return a
    // fixed value.
    return 'Baked beans on toast';
  }
);

const result = await ai.generate({
  tools: [specialTool],
  prompt: "What's for breakfast?",
});

console.log(result.text);
```
For more detailed examples and explanations of other functionality, refer to the official Genkit documentation.
This plugin supports all currently available chat/completion and embeddings models from GitHub Models, as well as image input for multimodal models.
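As a sketch of image input with a multimodal model such as `openAIGpt4o`, you can pass media parts alongside text in the prompt (the image URL below is a placeholder):

```typescript
const visionResponse = await ai.generate({
  model: openAIGpt4o,
  prompt: [
    { media: { url: 'https://example.com/some-image.jpg' } },
    { text: 'Describe what you see in this picture.' },
  ],
});

console.log(visionResponse.text);
```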
You can find the full API reference in the API Reference Documentation.
- GPT `o1-preview` and `o1-mini` are still in beta. They do not support system roles or tools, and `temperature` and `topP` need to be set to `1` (see the sketch after this list). See the OpenAI announcement here.
- Cohere models only support text output for now. Issue opened here.
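The following is a minimal sketch of a call that satisfies those constraints; `openAIO1Mini` is an assumed export name, so check the plugin's model exports for the exact identifier:

```typescript
// Hypothetical model reference; verify the actual export name in genkitx-github.
import { openAIO1Mini } from 'genkitx-github';

const o1Response = await ai.generate({
  model: openAIO1Mini,
  // o1 models currently require temperature and topP to be set to 1.
  config: { temperature: 1, topP: 1 },
  // No system role is supported: keep all instructions in the user prompt.
  prompt: 'Summarize the rules of chess in three sentences.',
});

console.log(o1Response.text);
```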
Want to contribute to the project? That's awesome! Head over to our Contribution Guidelines.
Note
This repository depends on Google's Firebase Genkit. For issues and questions related to Genkit, please refer to the instructions in Genkit's repository.
Reach out by opening a discussion on GitHub Discussions.
This plugin is proudly maintained by Xavier Portilla Edo.
I got the inspiration, structure, and patterns to create this plugin from the Genkit Community Plugins repository built by The Fire Company, as well as from the ollama plugin.
This project is licensed under the Apache 2.0 License.