
picoLLM Issue: Unable to load PLLM model in Picollm Node Demo #42

Closed
AlphaBlackout opened this issue Jul 22, 2024 · 5 comments

AlphaBlackout commented Jul 22, 2024

Have you checked the docs and existing issues?

  • I have read all of the relevant Picovoice picoLLM docs
  • I have searched the existing issues for picoLLM

SDK

Node.js

picoLLM package version

1.0.1

Framework version

Node.js v20.15.1

Platform

Windows (x86_64)

OS/Browser version

Windows 10

Describe the bug

When running the picollm-chat-demo for Node.js, an exception is thrown while attempting to open/load the downloaded .pllm model file:

node:internal/modules/cjs/loader:1454
  return process.dlopen(module, path.toNamespacedPath(filename));
                ^

Error: The specified module could not be found.
\\?\C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\node_modules\@picovoice\picollm-node\lib\windows\amd64\pv_picollm.node
    at Module._extensions..node (node:internal/modules/cjs/loader:1454:18)
    at Module.load (node:internal/modules/cjs/loader:1208:32)
    at Module._load (node:internal/modules/cjs/loader:1024:12)
    at Module.require (node:internal/modules/cjs/loader:1233:19)
    at require (node:internal/modules/helpers:179:18)
    at new PicoLLM (C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\node_modules\@picovoice\picollm-node\dist\picollm.js:61:27)
    at completionDemo (C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\chat.js:153:19)
    at Object.<anonymous> (C:\Users\*\AppData\Roaming\npm\node_modules\@picovoice\picollm-node-demo\chat.js:207:1)
    at Module._compile (node:internal/modules/cjs/loader:1358:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1416:10) {
  code: 'ERR_DLOPEN_FAILED'
}

I confirmed that the file path was correct using fs.existsSync and File Explorer, testing both llama-2-7b-chat-269.pllm and phi2-290.pllm; both resulted in the exception above. I also confirmed this by supplying an invalid path, which produced a "File not found at 'modelPath'" exception instead. I'm fairly certain the demo sees the file but is unable to load it.

I did see a similar issue reported for the Android implementation of picoLLM, but it had not been resolved yet.

Steps To Reproduce

  1. Install picollm demo for nodejs: npm install -g @picovoice/picollm-node-demo
  2. Download a chat-compatible .pllm file from Picovoice (I tested llama-2-7b-chat-269.pllm and phi2-290.pllm with the same result)
  3. Run the picollm demo: picollm-chat-demo --access_key "${VALID_ACCESS_KEY}" --model_path "${VALID_MODEL_PATH}" --prompt "What is 5 * 6?"
  4. Running Step 3 throws the exception shown above.

Expected Behavior

Expected to receive a response akin to "The answer is 30." back from the picollm node demo based on the prompt "What is 5 * 6?".

@AlphaBlackout AlphaBlackout added the bug Something isn't working label Jul 22, 2024
mrrostam (Contributor) commented:

Thank you for reporting the issue. We’ve been able to recreate the problem and are currently working on a solution. We will notify you here as soon as the fix is implemented and the new package is available.

mrrostam (Contributor) commented Jul 22, 2024:

@AlphaBlackout, could you try the newly released demo package and see if it resolves the issue you were encountering?

AlphaBlackout (Author) commented:

> @AlphaBlackout, could you try the newly released demo package and see if it resolves the issue you were encountering?

Thanks for the quick response. The issue is still occurring. I confirmed that the new demo package version 1.0.2 was installed and ran it with the same result as originally reported. Not sure if this helps, but since the stack trace points to the picollm-node package: the version installed with the demo is still 1.0.1, so the culprit may be there.
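One way to confirm which nested picollm-node version the demo actually resolves is to read the version field of that package's package.json (e.g. `require('@picovoice/picollm-node/package.json').version` from inside the demo directory). A minimal sketch of that check, using inline JSON so it runs without the package installed; the package.json contents shown are hypothetical:

```javascript
// Extract the version field from package.json text.
function packageVersion(pkgJsonText) {
  return JSON.parse(pkgJsonText).version;
}

// Hypothetical contents of the nested package.json reported in this thread:
const nested = '{"name": "@picovoice/picollm-node", "version": "1.0.1"}';
console.log(packageVersion(nested)); // → "1.0.1"
```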

matt200-ok (Contributor) commented:

@AlphaBlackout, the newly released demo package should resolve the issue. Please confirm whether the demo is now working for you.

AlphaBlackout (Author) commented:

> @AlphaBlackout, the newly released demo package should resolve the issue. Please confirm whether the demo is now working for you.

Confirmed to be functioning as expected now: I was able to load the model and submit the prompt, and received a response back. Thanks for getting this resolved.
