
Call the application via API? #596

Open
gamercoder153 opened this issue Jul 15, 2024 · 11 comments

Comments

@gamercoder153

Is there a way to call the API to get responses?

@enricoros
Owner

@gamercoder153 this is unclear - Big-AGI is actually calling upstream APIs, but it's not an API itself. What's the reason behind the request?

@gamercoder153
Author

@enricoros Instead of using the OpenAI API in my application, I want to use Big AGI to utilize beam functionality.

@enricoros
Owner

enricoros commented Jul 15, 2024

@gamercoder153 I'm afraid that won't be possible: Beam requires a human in the loop, so it can't be fully automated.
It's something we can consider in the future, but we won't be able to do it soon.
What prevents you from using Big-AGI UI as your main application?

@gamercoder153
Author

@enricoros I wish it could happen. It's okay, no worries, BTW. You're doing a great job. The only problem with using big-AGI is that I can't use custom tools and RAG.

@enricoros
Owner

enricoros commented Jul 15, 2024

@gamercoder153 custom tools could happen soon. How would you see yourself using those tools? (The new version being worked on has great function-calling support, so I'm looking at the best way to connect those function calls to your tools.)

@friederrr

@enricoros While I can't answer for @gamercoder153, one interesting tool use would be the capability to 1) identify code blocks in an LLM output, 2) connect to some online instance that can run code (e.g., Google Colab's REST API or possibly the Repl.it API) and execute those blocks, and 3) return the output/error messages directly to the chat window, so I can easily append any custom message I'd want to send the LLM.

This way, one could prompt it with something like "Write me a Python function that computes x^23 + 2*x^5 for x = 25". Suppose the code, when run on Repl.it, generates an error: I'd have that error pasted directly into the chat window, minimizing clicks, so I could either add a custom message to the LLM or, with a single click, forward the error to it.

This would make code tool use much more pleasant, I believe.
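The three-step loop described above (find code blocks, run them, surface the output or error) can be sketched in Python. This is a hypothetical illustration, not Big-AGI code: it runs the block in a local subprocess where the proposal would call a remote sandbox such as a Colab or Repl.it instance, and the fence string is built programmatically only to keep this example self-contained.

```python
import re
import subprocess
import sys

FENCE = "`" * 3  # a markdown code fence, built here to avoid nesting issues

def extract_code_blocks(llm_output: str) -> list[str]:
    # Step 1: identify fenced code blocks in the LLM's reply.
    pattern = FENCE + r"(?:python)?\n(.*?)" + FENCE
    return re.findall(pattern, llm_output, re.DOTALL)

def run_block(code: str) -> tuple[str, str]:
    # Step 2: execute the block and capture stdout/stderr. Here it runs
    # locally in a subprocess; the actual proposal would send it to a
    # remote runner (e.g. a Colab or Repl.it instance) instead.
    proc = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=30,
    )
    return proc.stdout, proc.stderr

# Step 3: feed the result (or the error text) back into the chat window.
reply = (
    "Sure, here is the function:\n"
    f"{FENCE}python\n"
    "def f(x):\n"
    "    return x**23 + 2*x**5\n"
    "print(f(25))\n"
    f"{FENCE}\n"
)
out, err = run_block(extract_code_blocks(reply)[0])
```

On success `out` holds the printed value; on failure `err` holds the traceback, ready to be pasted into the chat or forwarded to the LLM with one click.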

@AMOz1

AMOz1 commented Aug 24, 2024

My use case is: I want to say "send me a notification tomorrow at 7pm" and have it use NTFY to do that. I would use that all the time. Looking forward to custom tools in Big AGI.

@friederrr

Not sure what you mean exactly, but I have submitted a roadmap request if that is what you were after.

#627

@AMOz1

AMOz1 commented Aug 26, 2024

I just discovered that the Big-AGI UI can display HTML when toggled on. That's it for me: I can now do what I wanted and send notifications to my NTFY server. I wish there were an option to have it on by default, so that it would render HTML whenever a response contains valid HTML. That would make the UX much smoother. Still, an amazing feature.

enricoros added a commit that referenced this issue Sep 16, 2024
@enricoros
Owner

@AMOz1 this is really smart. Can you expand more on how you did it? I love this hack, very unconventional and interesting.

Regarding turning HTML rendering on by default: it's easy to do, and it probably should be there in Big-AGI 2 (we already have one advanced UI mode which turns it on by default).

@AMOz1

AMOz1 commented Sep 16, 2024

@enricoros
sure...

  • on ntfy.sh, I create an account and "subscribe" to my arbitrary topic
  • on mobile, I install the NTFY app and log in
  • in Big-AGI, I use a custom persona with the following system prompt: https://gist.github.com/AMOz1/0f7d9e48c19a05059c66e2a2afb6cc05
  • then I just say something like "let's send the notification to buy milk tomorrow at 5pm", and Claude Sonnet 3.5 (in my case) constructs HTML with a button to send the notification
  • the NTFY server then sends the notification with the specified "Delay" header parameter, and it pops up as a notification in my NTFY mobile app
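Under the hood, the button the model generates just publishes an HTTP POST to `ntfy.sh/<topic>` with a `Delay` header, which is ntfy's documented scheduled-delivery mechanism. Here is a minimal Python sketch of that same request using only the standard library; the topic name and message are made up for illustration, and the request is built but not actually sent.

```python
import urllib.request

def delayed_notification(topic: str, message: str, delay: str) -> urllib.request.Request:
    # Build (but do not send) an ntfy.sh publish request. ntfy's
    # scheduled delivery uses the "Delay" header, which accepts values
    # like "tomorrow, 5pm", "30min", or a unix timestamp.
    return urllib.request.Request(
        url=f"https://ntfy.sh/{topic}",
        data=message.encode("utf-8"),
        headers={"Delay": delay},
        method="POST",
    )

# Hypothetical topic/message, mirroring the "buy milk" example above.
req = delayed_notification("my-topic", "Buy milk", "tomorrow, 5pm")
# urllib.request.urlopen(req) would actually publish the notification
```

The generated HTML button simply fires the equivalent `fetch()` call from the chat window, so no backend beyond ntfy itself is needed.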
