
Add API support #88

Open
Mekriz opened this issue Aug 27, 2024 · 3 comments
Labels
enhancement New feature or request

Comments


Mekriz commented Aug 27, 2024

Add API support so that models can be used from other devices.

guinmoon (Owner) commented

Hi. Judging by the statistics, my application is mostly run on iOS. What is the benefit of a server running on iOS? On PC and Mac there are ollama and llmstudio; it will not be easy to compete with them.


Mekriz commented Aug 28, 2024

I have a Snapdragon 865 phone and an M1 iPad. I would like to use AI from the phone, but it runs about 50 times slower there (it is effectively unusable), while on the iPad the AI works great.

@parthpat12

Big upvote here. Another use case: I have a Mac Studio at home running Ollama/llama.cpp, and I would like to connect to that instance from LLMFarm when I'm away. It is much more powerful than inferencing on an iPhone. There is only one app that does this (Enchanted for iOS), but it has limited model settings. Please!
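For context on what a remote-backend client would need to do, here is a minimal sketch of talking to a remote Ollama instance over its documented HTTP API (`POST /api/generate`). The host address and model name are hypothetical placeholders; the request is commented out since it assumes a reachable server on the LAN.

```python
import json

# Hypothetical LAN address of a machine running Ollama (default port 11434).
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> bytes:
    """Build the JSON request body for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": stream}
    return json.dumps(payload).encode("utf-8")

body = build_generate_payload("llama3", "Why is the sky blue?")

# Sending it requires a reachable Ollama server, e.g.:
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, data=body,
#     headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

A client app would only need to expose the host, model name, and sampling parameters as settings; the heavy lifting stays on the remote machine.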

@guinmoon guinmoon added the enhancement New feature or request label Oct 8, 2024
3 participants