
Visual Studio Code (VSCode) Ollama Local Code Co-pilot

An example of running an instance of Ollama via docker-compose and connecting it to VSCode for code completion / generation.

Prerequisites

The project uses the following software:

  - Docker
  - docker-compose

Usage

  1. Navigate to the directory where the repository has been cloned and start the containers:

    docker-compose up -d --build
    
  2. Wait for the ollama-setup-1 service to finish downloading codellama[^1] and other models.

     2024-01-31 23:36:50 {"status":"verifying sha256 digest"}
     2024-01-31 23:36:50 {"status":"writing manifest"}
     2024-01-31 23:36:50 {"status":"removing any unused layers"}
     2024-01-31 23:36:50 {"status":"success"}
     100 1128k    0 1128k    0    21   2546      0 --:--:--  0:07:33 --:--:--    23
    
  3. Install the CodeGPT extension in VSCode and follow the instructions provided here.
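Once the setup service reports success, you can confirm which models are available. A quick check (assuming docker-compose publishes Ollama's default port, 11434):

```shell
# List the models the local Ollama instance has downloaded.
# Assumes Ollama's default port 11434 is reachable on localhost.
curl -s http://localhost:11434/api/tags
```

The response is a JSON object whose `models` array should include `codellama` once the download has completed.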

Note

Initial prompts to Ollama might be slow if the model / container was not already running or has not been interacted with recently.
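One way to mitigate this (a sketch, not part of the repository) is to send a warm-up request after the containers start: per the Ollama API, a generate request with an empty prompt loads the model into memory without generating any text, and `keep_alive` controls how long it stays resident.

```shell
# Warm-up request: an empty prompt loads the model into memory without
# generating text; keep_alive keeps it resident (here, for 10 minutes).
# Assumes Ollama's default port 11434 is reachable on localhost.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "codellama", "prompt": "", "keep_alive": "10m"}'
```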

Notes

Using different Large Language Models (LLM) with CodeGPT

You can edit docker-entrypoint.sh to pull any model available in the Ollama library; however, CodeGPT currently supports only a few models via its UI. For others, you will need to manually type the name of the model you have pulled into the 'Model' field for the Ollama provider.
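For example, a hypothetical extra line in docker-entrypoint.sh might mirror the existing pull call via Ollama's `/api/pull` endpoint (`deepseek-coder` is only an illustrative model name, and the hostname/port depend on your compose setup):

```shell
# Hypothetical addition to docker-entrypoint.sh: pull one more model
# from the Ollama library ("deepseek-coder" is an example name).
curl http://localhost:11434/api/pull -d '{"name": "deepseek-coder"}'
```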

What devices has this been tested on?

I have only tried the container on my MacBook Pro (M1 Pro, 16 GB RAM, 2021), so your mileage may vary. I am also unable to comment on models that require a graphics processing unit (GPU).

Footnotes

[^1]: CodeGPT also allows downloading model(s) via the extension and displays installed models in its UI.
