Hi. Judging by the statistics, my application is mostly run on iOS. What is the benefit of a server running on iOS? On PC and Mac there are Ollama and LM Studio; it will not be easy to compete with them...
I have a Snapdragon 865 phone and an M1 iPad. I would like to use AI from the phone, but it runs about 50 times slower there (it's simply unusable), while on the iPad the AI works great.
Big upvote here. Another use case: I have a Mac Studio at home running Ollama/llama.cpp, and I would like to connect to that instance from LLMFarm when I'm away. It's much more powerful than inferencing on an iPhone. There is only one app that does that (Enchanted for iOS), but it has limited model settings. Please!
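For reference, connecting to a remote Ollama instance like the one above typically means talking to its HTTP API on port 11434. A minimal sketch of the request body for the `/api/generate` endpoint, assuming a model named "llama3" is installed on the server (the model name and prompt here are just placeholders):

```python
import json

# Request body for POST http://<server>:11434/api/generate on a remote
# Ollama instance. "llama3" is an assumed model name; "stream": False
# asks for a single JSON response instead of a streamed one.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,
}
body = json.dumps(payload)
print(body)
```

A client app would send this body as JSON to the server's address and read the generated text from the `response` field of the reply.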
Add API support so that you can use models from other devices