Merge pull request #84 from mlverse/updates
Documentation improvements
edgararuiz authored Apr 3, 2024
2 parents 2b471e5 + 150b05c commit b6a4cfc
Showing 6 changed files with 119 additions and 6 deletions.
46 changes: 45 additions & 1 deletion R/ch-submit.R
@@ -1,4 +1,4 @@
#' Method to easily integrate to new LLM's
#' Method to easily integrate with new LLM APIs
#' @param defaults Defaults object, generally pulled from `chattr_defaults()`
#' @param prompt The prompt to send to the LLM
#' @param stream To output the response from the LLM as it happens, or wait until
@@ -10,6 +10,50 @@
#' prompt (TRUE)
#' @param ... Optional arguments; currently unused.
#' @keywords internal
#' @details Use this function to integrate your own LLM API. It has a few
#' requirements to get it to work properly:
#' * The output of the function needs to be the parsed response from the LLM
#' * For those that support streaming, make sure to use the `cat()` function to
#' output the response of the LLM API as it is happening.
#' * If `preview` is set to TRUE, do not send to the LLM API. Simply return the
#' resulting prompt.
#'
#' The `defaults` argument controls which method to use. You can use the
#' `chattr_defaults()` function, and set the provider. The `provider` value
#' is what creates the R class name. It will prepend `cl_` to the class name.
#' See the examples for more clarity.
#' @examples
#' \dontrun{
#' library(chattr)
#' ch_submit.ch_my_llm <- function(defaults,
#' prompt = NULL,
#' stream = NULL,
#' prompt_build = TRUE,
#' preview = FALSE,
#' ...) {
#' # Use `prompt_build` to prepend the instructions you wish to add
#' if(prompt_build) prompt <- paste0("Use the tidyverse\n", prompt)
#' # If `preview` is TRUE, return the resulting prompt
#' if(preview) return(prompt)
#' llm_response <- paste0("You said this: \n", prompt)
#' if(stream) {
#' cat("streaming:\n")
#' for(i in seq_len(nchar(llm_response))) {
#' # If `stream` is true, make sure to `cat()` the current output
#' cat(substr(llm_response, i, i))
#' Sys.sleep(0.1)
#' }
#' }
#' # Make sure to return the entire output from the LLM at the end
#' llm_response
#' }
#'
#' chattr_defaults("console", provider = "my llm")
#' chattr("hello")
#' chattr("hello", stream = FALSE)
#' chattr("hello", prompt_build = FALSE)
#' chattr("hello", preview = TRUE)
#' }
#' @export
ch_submit <- function(defaults,
prompt = NULL,
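The dispatch mechanism the details section describes (provider label → R class → `ch_submit()` method) can be sketched standalone. This is an illustrative sketch only: the class name, backend message, and `defaults` object here are hypothetical, not chattr's actual internals.

```r
# Standalone sketch of the S3 dispatch the documentation describes: the
# provider label determines the class of the defaults object, and
# ch_submit() dispatches on that class. All names are illustrative.
ch_submit <- function(defaults, prompt = NULL, ...) UseMethod("ch_submit")

ch_submit.cl_my_llm <- function(defaults, prompt = NULL, ...) {
  # A real method would call the LLM API here; this one just echoes.
  paste0("my-llm backend received: ", prompt)
}

defaults <- structure(list(provider = "my llm"), class = "cl_my_llm")
ch_submit(defaults, prompt = "hello")
```

Because dispatch happens on the class of `defaults`, registering a new backend is just defining one more `ch_submit.<class>` method.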
10 changes: 8 additions & 2 deletions R/chattr-use.R
@@ -1,6 +1,7 @@
#' Sets the LLM model to use in your session
#' @param model_label The label of the LLM model to use. Valid values are
#' 'copilot', 'gpt4', 'gpt35', and 'llamagpt'.
#' 'copilot', 'gpt4', 'gpt35', and 'llamagpt'. The value 'test' is also
#' acceptable, but it is meant for package examples and internal testing.
#' @details
#' If the error "No model setup found" is returned, it is because none of the
#' expected setup for Copilot, OpenAI, or LLaMA was automatically detected. Here
@@ -27,7 +28,12 @@ chattr_use <- function(model_label = NULL) {
if (interactive_label) {
model_label <- ch_get_ymls()
}
use_switch("configs", path_ext_set(model_label, "yml"))
if (model_label == "test") {
env_folder <- "apptest"
} else {
env_folder <- "configs"
}
use_switch(env_folder, path_ext_set(model_label, "yml"))
}

ch_get_ymls <- function(menu = TRUE) {
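The branch added in this hunk can be sketched on its own: the "test" label reads its YAML config from the internal "apptest" folder, while every other label uses "configs". This is a simplified stand-in, not the package's actual `use_switch()` call.

```r
# Simplified stand-in for the folder-selection branch added above.
config_folder <- function(model_label) {
  if (model_label == "test") "apptest" else "configs"
}

config_folder("test")
config_folder("gpt4")
```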
6 changes: 6 additions & 0 deletions R/chattr.R
@@ -3,6 +3,12 @@
#' @inheritParams ch_submit
#' @returns The output of the LLM to the console, document or script.
#' @export
#' @examples
#' library(chattr)
#' chattr_use("test")
#' chattr("hello")
#' chattr("hello", preview = TRUE)
#'
chattr <- function(prompt = NULL,
preview = FALSE,
prompt_build = TRUE,
53 changes: 51 additions & 2 deletions man/ch_submit.Rd

Some generated files are not rendered by default.

7 changes: 7 additions & 0 deletions man/chattr.Rd


3 changes: 2 additions & 1 deletion man/chattr_use.Rd

