Error: The specified module could not be found.
\\?\C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\node_modules\@picovoice\picollm-node\lib\windows\amd64\pv_picollm.node
at Module._extensions..node (node:internal/modules/cjs/loader:1454:18)
at Module.load (node:internal/modules/cjs/loader:1208:32)
at Module._load (node:internal/modules/cjs/loader:1024:12)
at Module.require (node:internal/modules/cjs/loader:1233:19)
at require (node:internal/modules/helpers:179:18)
at new PicoLLM (C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\node_modules\@picovoice\picollm-node\dist\picollm.js:61:27)
at completionDemo (C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\chat.js:153:19)
at Object.<anonymous> (C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\chat.js:207:1)
at Module._compile (node:internal/modules/cjs/loader:1358:14)
at Module._extensions..js (node:internal/modules/cjs/loader:1416:10) {
code: 'ERR_DLOPEN_FAILED'
}
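For context, `ERR_DLOPEN_FAILED` means Node located the `pv_picollm.node` addon file but the OS loader could not load it (or one of its DLL dependencies), which is a different failure from the demo's own missing-file error. A minimal sketch of a helper that tells the two modes apart; the name `classifyLoadError` is hypothetical and not part of the picoLLM API:

```javascript
// Hypothetical helper: distinguish the two failure modes discussed in
// this report. Not part of picoLLM or the demo.
function classifyLoadError(err) {
  if (err && err.code === "ERR_DLOPEN_FAILED") {
    // Node found the .node addon, but the OS loader could not resolve
    // the binary or one of its DLL dependencies.
    return "native-addon-load-failure";
  }
  if (err && /File not found at/.test(err.message || "")) {
    // The demo's own path check fired: the model file does not exist.
    return "missing-model-file";
  }
  return "unknown";
}
```

The distinction matters here because, as described below, an invalid model path produces the "File not found" message, while the valid paths hit the addon-load failure.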
I confirmed that the file path was correct using fs.existsSync and File Explorer, testing both llama-2-7b-chat-269.pllm and phi2-290.pllm; both resulted in the above exception. I also confirmed this by passing an invalid path, which produced a "File not found at 'modelPath'" exception instead. I'm fairly certain the demo sees the file but is unable to load it.
I did see a similar issue reported against the Android implementation of picoLLM, but it had not been resolved yet.
Steps To Reproduce
Install the picoLLM demo for Node.js: npm install -g @picovoice/picollm-node-demo
Download a chat-compatible .pllm file from Picovoice (I tested llama-2-7b-chat-269.pllm and phi2-290.pllm with the same result)
Run the picollm demo: picollm-chat-demo --access_key "${VALID_ACCESS_KEY}" --model_path "${VALID_MODEL_PATH}" --prompt "What is 5 * 6?"
Running step 3 should produce the exception above.
Expected Behavior
Expected to receive a response akin to "The answer is 30." back from the picollm node demo based on the prompt "What is 5 * 6?".
Thank you for reporting the issue. We’ve been able to recreate the problem and are currently working on a solution. We will notify you here as soon as the fix is implemented and the new package is available.
Thanks for the quick response. The issue is still occurring: I confirmed that the new demo package version 1.0.2 was installed and ran it again, with the same results as originally reported. Not sure if this helps, but since the stack trace points at the picollm-node package, and the version installed with the demo is still 1.0.1, the culprit may be there.
Confirmed to be functioning as expected now: I was able to load the model, submit the prompt, and receive a response back. Thanks for getting this resolved.
Have you checked the docs and existing issues?
SDK
Node.js
picoLLM package version
1.0.1
Framework version
Node.js v20.15.1
Platform
Windows (x86_64)
OS/Browser version
Windows 10
Describe the bug
In attempting to run the picollm-chat-demo for Node.js, an exception is thrown when attempting to open/load the downloaded .pllm model file (see the stack trace above).