
tabAutocomplete doesn't work correctly since 0.9.224 / 0.8.56 #2954

Open
3 tasks done
vills opened this issue Nov 15, 2024 · 3 comments
Labels: area:autocomplete (relates to the autocomplete feature), ide:vscode (relates specifically to the VS Code extension), kind:bug (indicates an unexpected problem or unintended behavior), needs-triage

Comments

@vills

vills commented Nov 15, 2024

Before submitting your bug report

Relevant environment info

- OS: macOS 15.0.1
- Continue version: 0.9.224+
- IDE version: VSCode
- Model: deepseek-coder-v2
- config.json:
  
  "tabAutocompleteModel": {
    "title": "DeepSeeker Coder v2 16b",
    "provider": "ollama",
    "model": "deepseek-coder-v2:16b-lite-instruct-q6_K",
    "apiBase": "https://OPENWEBUI/ollama",
    "apiKey": "open-webui-token",
    "contextLength": 8096,
    "completionOptions": {
      "temperature": 0.1,
      "topP": 0.95,
      "maxTokens": 600
    }
  },

Description

I have autocompletion set up with Open WebUI involved: Open WebUI serves as a proxy to Ollama. It was working fine previously, but with the latest versions of the continue.dev plugin I get explanations of what my code does instead of autocompletions.

I checked previous versions, and it seems the problem starts with the 0.9.224 pre-release and 0.8.56 releases. Previous versions work as expected.

I believe it is a bug, but maybe you changed something in the config that I need to change too?

Thanks.

To reproduce

No response

Log output

No response

@dosubot bot added the area:autocomplete, ide:vscode, and kind:bug labels on Nov 15, 2024
@sestinj
Contributor

sestinj commented Nov 17, 2024

@vills my guess is that this is caused by using the "ollama" provider for a non-Ollama API. The change we made recently was to let Ollama handle FIM formatting for us, but when going through OpenWebUI my guess is that the FIM endpoint either doesn't exist or is routed to chat.
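To make the failure mode above concrete, here is a minimal sketch (not Continue's actual code) of the kind of request body implied when Ollama handles FIM formatting: the client sends the text before and after the cursor and relies on Ollama's `/api/generate` `suffix` support (an assumption about recent Ollama versions). A proxy that routes every request to a chat-style endpoint loses this split, so the model answers conversationally and explains the code instead of filling in the middle.

```python
import json

# Sketch of a fill-in-the-middle (FIM) request body for Ollama's
# /api/generate endpoint. The "suffix" field is the key part: Ollama
# uses it to apply the model's own FIM template. Field names here
# follow Ollama's generate API; the helper itself is hypothetical.
def fim_request_body(model: str, prefix: str, suffix: str) -> str:
    return json.dumps({
        "model": model,
        "prompt": prefix,   # code before the cursor
        "suffix": suffix,   # code after the cursor
        "stream": False,
    })

body = fim_request_body(
    "deepseek-coder-v2:16b-lite-instruct-q6_K",
    "def add(a, b):\n    return ",
    "\n",
)
```

If a proxy forwards this to a plain chat endpoint, the `suffix` is ignored or treated as ordinary text, which would produce exactly the "explanation instead of completion" behavior reported here.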

For my own understanding (you're not the only one to have this setup, and I haven't looked into it yet): what is the benefit of routing through OpenWebUI instead of directly to Ollama? If I had to point you to a solution, it would definitely be to just go direct, but I figure there's a reason you can't.
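As a hedged sketch of what "going direct" might look like in config.json, assuming Ollama is reachable on its default local port (the apiBase below is a placeholder assumption, not a value from this thread):

```json
"tabAutocompleteModel": {
  "title": "DeepSeek Coder v2 16b",
  "provider": "ollama",
  "model": "deepseek-coder-v2:16b-lite-instruct-q6_K",
  "apiBase": "http://localhost:11434"
}
```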

@vills
Author

vills commented Nov 17, 2024

Thanks for sharing your idea. I'll check it in a few days.

I use Ollama behind Open WebUI because it (obviously) provides a UI plus authorization/authentication (I use a remote GPU server that I share with my friends). It can also route to different paid platforms. It's a very simple way to set up and get these features.

@uganson

uganson commented Nov 18, 2024

I am also experiencing the same issue: since 0.8.56 the tab autocompletions show a description of the code before the cursor rather than a code suggestion, whereas prior versions work as expected. I am connecting directly to an Ollama server running locally.
