
Firebase Genkit <> GitHub Models Plugin

GitHub Models Community Plugin for Google Firebase Genkit


genkitx-github is a community plugin for using GitHub Models APIs with Firebase Genkit. Built by Xavier Portilla Edo.

This Genkit plugin allows you to use GitHub Models through their official APIs.

Installation

Install the plugin in your project with your favorite package manager:

  • npm install genkitx-github
  • pnpm add genkitx-github

Versions

If you are using Genkit <v0.9.0, use plugin version v1.9.0. If you are using Genkit >=v0.9.0, use plugin version >=v1.10.0.

Usage

Configuration

To use the plugin, you need to configure it with your GitHub token. You can do this by passing it to the genkit function:

import { genkit, z } from 'genkit';
import {github, openAIGpt4o} from "genkitx-github";

const ai = genkit({
  plugins: [
    github({
      githubToken: '<my-github-token>',
    }),
  ],
  model: openAIGpt4o,
});

You can also initialize the plugin this way if you have set the GITHUB_TOKEN environment variable:

import { genkit, z } from 'genkit';
import {github, openAIGpt4o} from "genkitx-github";

const ai = genkit({
  plugins: [github({})],
  model: openAIGpt4o,
});
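For the environment-variable approach, the token can be exported in your shell before starting the app (the value below is a placeholder for your personal access token):

```shell
# Export the token so the plugin can pick it up from the environment
export GITHUB_TOKEN="<my-github-token>"
```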

Basic examples

The simplest way to call the text generation model is with the generate helper:

// ...configure Genkit (as shown above)...

// Basic usage of an LLM
const response = await ai.generate({
  prompt: 'Tell me a joke.',
});

console.log(response.text);

Within a flow

// ...configure Genkit (as shown above)...

export const myFlow = ai.defineFlow(
  {
    name: 'menuSuggestionFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (subject) => {
    const llmResponse = await ai.generate({
      prompt: `Suggest an item for the menu of a ${subject} themed restaurant`,
    });

    return llmResponse.text;
  }
);

Tool use

// ...configure Genkit (as shown above)...

const specialToolInputSchema = z.object({ meal: z.enum(["breakfast", "lunch", "dinner"]) });
const specialTool = ai.defineTool(
  {
    name: "specialTool",
    description: "Retrieves today's special for the given meal",
    inputSchema: specialToolInputSchema,
    outputSchema: z.string(),
  },
  async ({ meal }): Promise<string> => {
    // Retrieve up-to-date information and return it. Here, we just return a
    // fixed value.
    return "Baked beans on toast";
  }
);

const result = await ai.generate({
  tools: [specialTool],
  prompt: "What's for breakfast?",
});

console.log(result.text);

For more detailed examples and the explanation of other functionalities, refer to the official Genkit documentation.

Supported models

This plugin supports all currently available Chat/Completion and Embeddings models from GitHub Models, including image input and multimodal models.
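As a sketch of multimodal input (the image URL below is a placeholder), Genkit accepts a prompt given as an array of media and text parts, which can be passed to a vision-capable model exactly like a plain string prompt:

```typescript
// A multimodal prompt is an array of parts mixing media and text.
// The image URL is a placeholder for illustration.
const multimodalPrompt = [
  { media: { url: 'https://example.com/menu-photo.jpg' } }, // image input
  { text: 'List the dishes visible in this photo.' },       // instruction
];

// Pass it to generate() just like a string prompt:
// const response = await ai.generate({ model: openAIGpt4o, prompt: multimodalPrompt });
```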

API Reference

You can find the full API reference in the API Reference Documentation.

Troubleshooting

  1. GPT o1-preview and o1-mini are still in beta. They do not support system roles or tools, and temperature and topP must be set to 1. See the OpenAI announcement here.
  2. Cohere models only support text output for now. Issue opened here.
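Given those constraints, a per-call config for the o1 models could look like the following sketch (the model import name is hypothetical; check the plugin's actual exports):

```typescript
// o1 models reject system roles and tools; temperature and topP must be 1.
const o1Config = { temperature: 1, topP: 1 };

// const response = await ai.generate({
//   model: openAIO1Mini, // hypothetical export name — check the plugin's exports
//   config: o1Config,
//   prompt: 'Tell me a joke.', // no system message, no tools
// });
```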

Contributing

Want to contribute to the project? That's awesome! Head over to our Contribution Guidelines.

Need support?

Note

This repository depends on Google's Firebase Genkit. For issues and questions related to Genkit, please refer to instructions available in Genkit's repository.

Reach out by opening a discussion on GitHub Discussions.

Credits

This plugin is proudly maintained by Xavier Portilla Edo.

I got the inspiration, structure, and patterns for this plugin from the Genkit Community Plugins repository built by The Fire Company, as well as the ollama plugin.

License

This project is licensed under the Apache 2.0 License.
