Hi, I discovered your plugin yesterday and found it quite interesting.
I found out that I can start a local server with AI models using lmstudio.ai. After setting that up, I wanted to use it from my Android Studio IDE, and while searching I found your plugin. I thought it would be possible to configure it to connect to localhost instead of OpenAI/Google/etc.
[Question]
Is there a way to configure the plugin to use a local LLM running on localhost?
I tried overwriting some of the pre-configured service APIs (OpenAI, Google, etc.), but it seems everything is read-only. When I change the endpoint of a preset option, apply the changes or save, and reopen the plugin settings, the modifications are reverted.
I also tried all the settings in "developer mode", but it seems I'm doing something wrong, as nothing I configure seems to change the results.
I wasn't able to connect to the server.
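For context, LM Studio's local server speaks the OpenAI chat-completions protocol, by default at http://localhost:1234/v1 (if I understand its docs correctly). A minimal sketch like the one below can confirm the server itself is reachable before debugging the plugin; the port and the "local-model" name are assumptions to adjust for your own setup:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // LM Studio's default base URL is http://localhost:1234/v1 -- adjust the
    // port if you changed it, and the model name to whatever you have loaded.
    val body = """
        {
          "model": "local-model",
          "messages": [{"role": "user", "content": "Say hello"}],
          "temperature": 0.7
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:1234/v1/chat/completions"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    // A 200 response here proves the local server is reachable, so any
    // remaining connection failure is in the plugin configuration.
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println("${response.statusCode()}: ${response.body()}")
}
```

If this prints a 200 response with a completion, the problem is on the plugin side rather than the server side.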
[Possible bug] I also tried using the "EDIT REQUEST" option beforehand, but it doesn't retain the configuration I set in the "plugin config". For example, I set the model to "lalala1" in the plugin config, but when the "Edit request JSON" screen appears, all the JSON properties are loaded with different values (model, temperature, etc.).
[Request]
It would be great if the JSON configuration were visible and editable from the plugin config, allowing us to add our own "rules/requests" to the model up front, so we don't need to edit each request manually.
[Request]
Lastly, is it possible to have everything operate within the IDE, similar to the "Gemini" or "GitHub Copilot" plugins?
Thank you for reading and for sharing your plugin!
abruno- changed the title from "[Feature Request/Question] Allow to use localhost" to "[Feature Request/Question] Allow to use LLMs locally" on May 23, 2024.
The base URLs should be configurable. That is a rarely used feature, though, so I'll give it some testing.
The problem with using additional LLM providers is that the APIs differ in some minor aspects. I'd probably need to treat Ollama support as a separate feature; it's been on my todo list for a while.
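To illustrate the kind of divergence I mean, here's a rough side-by-side of the two request shapes. The field names follow each project's public docs, but treat the exact details as assumptions to verify against whatever versions you run:

```kotlin
// OpenAI-style chat completion -- this is also what LM Studio's local
// server accepts, at POST http://localhost:1234/v1/chat/completions:
val openAiBody = """
    {"model": "gpt-4", "messages": [{"role": "user", "content": "hi"}], "temperature": 0.7}
""".trimIndent()

// Ollama's native API -- different path (POST http://localhost:11434/api/chat),
// no API key, and streaming is on unless you explicitly disable it:
val ollamaBody = """
    {"model": "llama3", "messages": [{"role": "user", "content": "hi"}], "stream": false}
""".trimIndent()
```

Small deltas like these (paths, auth, streaming defaults, response envelopes) are why each provider tends to need its own adapter code rather than a single shared client.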
This plugin has actually evolved away from being overly integrated with the IDE. This is because the components are used in several other projects, e.g. apps.simiacrypt.us.