What happened?
When I try to test Ollama, it fails, and running with summarize enabled gives: "url not allowed on the configured scope: http://127.0.0.1:11434/api/generate"
Steps to reproduce
Install Ollama
Add a model
Run Vibe 3.0 (also reproduced on 2.6.7)
Set up Ollama as the summary platform; enter the URL and model name.
Run Check, or run a transcription.
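To rule out the Ollama server itself, it helps to confirm the endpoint answers outside of Vibe. A minimal sketch using only the Python standard library; the model name here is a placeholder, substitute one you have pulled:

```python
import json
import urllib.request

def build_generate_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build the same kind of POST that Vibe sends to Ollama's generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("http://127.0.0.1:11434", "llama3", "Say hello")
print(req.full_url)  # http://127.0.0.1:11434/api/generate

# Uncomment to actually hit the server; a valid JSON reply here means
# the problem is inside Vibe's URL scope check, not Ollama:
# with urllib.request.urlopen(req, timeout=30) as resp:
#     print(json.loads(resp.read())["response"])
```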
What OS are you seeing the problem on?
Windows
Relevant log output
App Version: vibe 3.0.0
Commit Hash: d6348c6
Arch: x86_64
Platform: windows
Kernel Version: 10.0.19045
OS: windows
OS Version: 10.0.19045
Cuda Version: n/a
Models: ggml-distil-large-v3.bin, ggml-medium.bin
Default Model: "C:\\Users\\Vive\\AppData\\Local\\github.com.thewh1teagle.vibe\\ggml-distil-large-v3.bin"
Cargo features: vulkan
{"avx":{"enabled":true,"support":true},"avx2":{"enabled":true,"support":true},"f16c":{"enabled":true,"support":true},"fma":{"enabled":true,"support":true}}
<details><summary>logs</summary>

```console
cmd: "l:\\vibe\\ffmpeg.exe" "-i" "L:\\phonerecordings\\recording-20241013-182351.mp3" "-ar" "16000" "-ac" "1" "-c:a" "pcm_s16le" "C:\\Users\\Vive\\AppData\\Local\\Temp\\vibe_temp_2024-12-10\\8814fbe9672f7004.wav" "-hide_banner" "-y" "-loglevel" "error"
```

</details>
I have the same issue. I have a central Ollama server on my network and get the same error:
url not allowed on the configured scope: https://ollama.int.<mydomain>.com/api/generate
On the server side, the API is never called. I've verified that other applications have no trouble talking to the Ollama server or the selected model.
If I set up an SSH tunnel so that localhost:11434 is piped through to the server, it works fine, but something in the application itself is blocking requests to anything other than localhost.
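For anyone else hitting this, the tunnel workaround looks roughly like the following. Hostnames are placeholders; substitute your own Ollama host and a machine you can SSH into that can reach it:

```shell
# Forward local port 11434 to the remote Ollama server, so that
# Vibe's localhost-only scope check passes while requests actually
# land on the remote machine. Runs until interrupted (Ctrl+C).
ssh -N -L 11434:ollama.example.com:11434 user@jumphost
```

With the tunnel up, pointing Vibe at http://127.0.0.1:11434 works as described above.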
Looking into it, this appears to be a security feature of Tauri: the HTTP scope defined at https://github.com/thewh1teagle/vibe/blob/main/desktop/src-tauri/capabilities/main.json#L59 is too restrictive. I'm not sure of the security impact, but since the request is made from the front end rather than the back end, it would be nice if all URLs were allowed when calling an LLM, not just localhost and the Anthropic URLs.
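For illustration, a loosened capability entry might look something like the sketch below. This assumes Tauri v2's HTTP plugin permission format; the exact identifiers and pattern syntax in Vibe's main.json may differ, so treat this as a starting point rather than a drop-in fix:

```json
{
  "identifier": "http:default",
  "allow": [
    { "url": "http://localhost:11434/*" },
    { "url": "http://*:11434/*" },
    { "url": "https://**" }
  ]
}
```

Whether to allow arbitrary URLs is a trade-off: a wide-open scope lets the webview reach any host, which is exactly what Tauri's scoping is designed to prevent. An alternative would be a user-configurable allowlist.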