Call the application via API? #596
@gamercoder153 this is unclear: Big-AGI calls upstream APIs, but it isn't an API itself. What's the reason behind the request?
@enricoros Instead of using the OpenAI API directly in my application, I want to call Big-AGI so I can use its Beam functionality.
@gamercoder153 I'm afraid that won't be possible, unfortunately. Beam requires a human in the loop; it cannot be fully automated.
@enricoros I wish it could happen. It's okay, no worries. By the way, you're doing a great job. The only problem with using Big-AGI is that I can't use custom tools and RAG.
@gamercoder153 custom tools could happen soon. How would you see yourself using those tools? (The new version being worked on has great function-calling support, so I'm looking at the best way to connect those function calls to your tools.)
@enricoros While I can't answer for @gamercoder153, one interesting tool use would be the capability to:

1. identify code blocks in an LLM output,
2. connect to an online instance that can run code (e.g., Google Colab's REST API, or potentially the Repl.it API) and submit those blocks, and
3. return the output or error messages directly to the chat window, so I can easily add any custom message I'd want to send to the LLM.

This way, one could prompt it with something like "Write me a Python function that computes x^23 + 2*x^5 for x = 25". Suppose the code, when run on Repl.it, generates an error: I'd have that error pasted directly into the chat window, minimizing clicks, so I can simply add a custom message for the LLM or, with a single click, forward the error to it. This would make code tool use much more pleasant, I believe.
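The first and third steps of the pipeline above can be sketched without any remote executor at all. This is only a minimal illustration, using a local subprocess as a stand-in for the Colab/Repl.it sandbox the comment mentions (the function names and the use of a local interpreter are my assumptions, not part of any existing Big-AGI API):

```python
import re
import subprocess
import sys

TICKS = "`" * 3  # the Markdown fence, built programmatically so it survives rendering

# Matches one fenced block with an optional language tag, e.g. ```python ... ```
FENCE = re.compile(TICKS + r"(?:\w+)?\n(.*?)" + TICKS, re.DOTALL)

def extract_code_blocks(llm_output: str) -> list[str]:
    """Step 1: pull the body of every fenced code block out of an LLM response."""
    return [m.strip() for m in FENCE.findall(llm_output)]

def run_block(code: str, timeout: int = 10) -> str:
    """Steps 2-3 (stand-in): run a block in a local subprocess instead of a
    remote sandbox, and return stdout on success or the error text on failure,
    ready to be pasted back into the chat window."""
    proc = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    return proc.stdout if proc.returncode == 0 else proc.stderr
```

Swapping `run_block` for a call to a real remote execution service would complete the loop while keeping the extraction and error-forwarding logic unchanged.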
My use case: I want to say "send me a notification tomorrow at 7pm" and have it use NTFY to do that. I would use that all the time. Looking forward to custom tools in Big-AGI.
Not sure what you mean exactly, but I have submitted a roadmap request, if that is what you were after.
I just discovered that the Big-AGI UI can display HTML when the feature is toggled on. That's it for me: I can now do what I wanted and send notifications to an NTFY server. I wish there were an option to have it on by default, so that it would render HTML whenever a response contains valid HTML; that would make the UX much smoother. Still, an amazing feature.
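For readers unfamiliar with ntfy: under the hood a notification is just an HTTP POST of plain text to `https://ntfy.sh/<topic>`, which is what the rendered HTML ultimately triggers in the hack above. A minimal sketch of that call (the topic name and title below are hypothetical placeholders, and the request is built but not sent):

```python
import urllib.request

def build_ntfy_request(message: str,
                       topic: str = "my-big-agi-alerts",   # hypothetical topic name
                       server: str = "https://ntfy.sh") -> urllib.request.Request:
    """Build the POST request that delivers `message` to an ntfy topic;
    any phone or browser subscribed to the topic receives it as a push
    notification."""
    return urllib.request.Request(
        f"{server}/{topic}",
        data=message.encode("utf-8"),
        headers={"Title": "Big-AGI reminder"},  # optional notification title
        method="POST",
    )

# To actually send it:
#   urllib.request.urlopen(build_ntfy_request("Meeting tomorrow at 7pm"))
```

Because it is a single unauthenticated POST, the same call works equally well from a `curl` one-liner or from a snippet of HTML/JS that the chat UI renders.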
@AMOz1 this is really smart. Can you expand on how you did it? I love this hack: very unconventional and interesting. As for turning HTML rendering on by default, it's easy to do and should probably be there in Big-AGI 2 (we already have one advanced UI mode that turns it on by default).
@enricoros Is there a way to call the API to get responses?