Replies: 3 comments 5 replies
-
Probably not. It's something I've thought about a lot, but I doubt I'll be able to come close to what Copilot.vim or copilot.lua do. Also, for anything other than Ollama models, it's too expensive.
-
I think it's worth reconsidering. It could be disabled by default, or hooked up to be manually prompted, like "finish what I'm trying to say here", but with a more immediate and polished UX than :cc. Maybe there's already a way to hook CodeCompanion up to "finish what I'm trying to do here"? There's also an expanding set of features this could lead to. I think Windsurf keeps track of all the edits made to the file so far and includes them in the context it sends to the model. It also seems to ask for edits in a wider window within the file, rather than just a continuation past the cursor; the result is a pretty slick experience that can often guess what you intend to do next. I'll try using this side by side with manual Copilot inline completion and report back :)
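The Windsurf-style context described above could be sketched roughly like this. All names here are hypothetical illustrations, not Windsurf's or CodeCompanion's actual API: the idea is just to keep a rolling log of recent edits and send a window of lines around the cursor, so the model can propose edits rather than only a continuation past the cursor.

```python
# Rough sketch of edit-aware completion context. Hypothetical names only;
# this is not CodeCompanion's or Windsurf's real implementation.
from dataclasses import dataclass, field


@dataclass
class CompletionContext:
    edit_log: list[str] = field(default_factory=list)  # recent edits, oldest first
    max_edits: int = 10

    def record_edit(self, description: str) -> None:
        """Remember an edit; drop the oldest once the log is full."""
        self.edit_log.append(description)
        if len(self.edit_log) > self.max_edits:
            self.edit_log.pop(0)

    def build_prompt(self, lines: list[str], cursor_row: int, window: int = 20) -> str:
        """Build a prompt from the edit history plus a window around the cursor."""
        lo = max(0, cursor_row - window)
        hi = min(len(lines), cursor_row + window + 1)
        snippet = "\n".join(lines[lo:hi])
        history = "\n".join(f"- {e}" for e in self.edit_log)
        return (
            "Recent edits:\n" + history
            + "\n\nEdit the following region (cursor at row "
            + f"{cursor_row - lo} in the snippet):\n" + snippet
        )
```

With this, the editor would call `record_edit` on each change and `build_prompt` whenever it requests a completion, so the model sees both what you've been doing and the code surrounding the cursor.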
-
FYI, if you want code completion with local LLMs, check out Tabby.
-
Will this plugin add code completion, like https://github.com/monkoose/neocodeium does?