
(Ollama) temperature/max_token settings lost when using model-specific request parameters #473

Open
andrewdea opened this issue Nov 12, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@andrewdea

Please update gptel first -- errors are often fixed by the time they're reported.

  • I have updated gptel to the latest commit and tested that the issue still exists

Bug Description

setq: Symbol’s value as variable is void: options-plist

see my fix here
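For context, the error comes from `options-plist` being mutated with `setq` without ever being bound. A minimal sketch of the shape of the fix (hypothetical, not the actual commit) is to bind it alongside `prompts-plist`:

```elisp
;; Hypothetical sketch, not the actual fix: give `options-plist' a
;; binding in the `let*' instead of `setq'-ing an unbound variable.
(let* ((prompts-plist (list :model (gptel--model-name gptel-model)
                            :messages (vconcat prompts)))
       (options-plist nil))           ; <- the missing binding
  (when gptel-temperature
    (setq options-plist
          (plist-put options-plist :temperature gptel-temperature)))
  (when gptel-max-tokens
    (setq options-plist
          (plist-put options-plist :num_predict gptel-max-tokens)))
  ;; ... merge options-plist into the request plist as before ...
  )
```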

Backend

Ollama

Steps to Reproduce

M-x gptel-send

Additional Context

Emacs version: 29.4

Backtrace

Debugger entered--Lisp error: (void-variable options-plist)
  (plist-put options-plist :temperature gptel-temperature)
  (setq options-plist (plist-put options-plist :temperature gptel-temperature))
  (progn (setq options-plist (plist-put options-plist :temperature gptel-temperature)))
  (if gptel-temperature (progn (setq options-plist (plist-put options-plist :temperature gptel-temperature))))
  (let ((prompts-plist (list ':model (gptel--model-name gptel-model) ':messages (vconcat prompts) ':stream (or (and gptel-stream gptel-use-curl (progn (or ... ...) (aref gptel-backend 5))) :json-false)))) (if gptel-temperature (progn (setq options-plist (plist-put options-plist :temperature gptel-temperature)))) (if gptel-max-tokens (progn (setq options-plist (plist-put options-plist :num_predict gptel-max-tokens)))) (gptel--merge-plists prompts-plist (progn (or (progn (and (memq (type-of gptel-backend) cl-struct-gptel-backend-tags) t)) (signal 'wrong-type-argument (list 'gptel-backend gptel-backend))) (aref gptel-backend 10)) (gptel--model-request-params gptel-model)))
  (progn (let ((prompts-plist (list ':model (gptel--model-name gptel-model) ':messages (vconcat prompts) ':stream (or (and gptel-stream gptel-use-curl (progn ... ...)) :json-false)))) (if gptel-temperature (progn (setq options-plist (plist-put options-plist :temperature gptel-temperature)))) (if gptel-max-tokens (progn (setq options-plist (plist-put options-plist :num_predict gptel-max-tokens)))) (gptel--merge-plists prompts-plist (progn (or (progn (and (memq ... cl-struct-gptel-backend-tags) t)) (signal 'wrong-type-argument (list 'gptel-backend gptel-backend))) (aref gptel-backend 10)) (gptel--model-request-params gptel-model))))
  (closure (cl-struct-gptel-ollama-tags t) (_backend prompts) "JSON encode PROMPTS for sending to ChatGPT." (progn (let ((prompts-plist (list ':model (gptel--model-name gptel-model) ':messages (vconcat prompts) ':stream (or ... :json-false)))) (if gptel-temperature (progn (setq options-plist (plist-put options-plist :temperature gptel-temperature)))) (if gptel-max-tokens (progn (setq options-plist (plist-put options-plist :num_predict gptel-max-tokens)))) (gptel--merge-plists prompts-plist (progn (or (progn ...) (signal ... ...)) (aref gptel-backend 10)) (gptel--model-request-params gptel-model)))))(#s(gptel-ollama :name "OLL🦙M🦙" :host "localhost:11434" :header nil :protocol "http" :stream t :endpoint "/api/chat" :key nil :models (granite3-dense:2b qwen2.5:7b) :url "http://localhost:11434/api/chat" :request-params nil :curl-args nil) ((:role "system" :content "You are a large language model living in Emacs and...") (:role "user" :content "*** Welcome to IELM ***  Type (describe-mode) or p...")))
  apply((closure (cl-struct-gptel-ollama-tags t) (_backend prompts) "JSON encode PROMPTS for sending to ChatGPT." (progn (let ((prompts-plist (list ':model (gptel--model-name gptel-model) ':messages (vconcat prompts) ':stream (or ... :json-false)))) (if gptel-temperature (progn (setq options-plist (plist-put options-plist :temperature gptel-temperature)))) (if gptel-max-tokens (progn (setq options-plist (plist-put options-plist :num_predict gptel-max-tokens)))) (gptel--merge-plists prompts-plist (progn (or (progn ...) (signal ... ...)) (aref gptel-backend 10)) (gptel--model-request-params gptel-model))))) #s(gptel-ollama :name "OLL🦙M🦙" :host "localhost:11434" :header nil :protocol "http" :stream t :endpoint "/api/chat" :key nil :models (granite3-dense:2b qwen2.5:7b) :url "http://localhost:11434/api/chat" :request-params nil :curl-args nil) ((:role "system" :content "You are a large language model living in Emacs and...") (:role "user" :content "*** Welcome to IELM ***  Type (describe-mode) or p...")))
  gptel--request-data(#s(gptel-ollama :name "OLL🦙M🦙" :host "localhost:11434" :header nil :protocol "http" :stream t :endpoint "/api/chat" :key nil :models (granite3-dense:2b qwen2.5:7b) :url "http://localhost:11434/api/chat" :request-params nil :curl-args nil) ((:role "system" :content "You are a large language model living in Emacs and...") (:role "user" :content "*** Welcome to IELM ***  Type (describe-mode) or p...")))
  (let* ((gptel--system-message (if (and gptel-context--alist (eq gptel-use-context 'system)) (gptel-context--wrap system) system)) (gptel-stream stream) (start-marker (cond ((null position) (if (use-region-p) (set-marker (make-marker) (region-end)) (save-excursion (skip-syntax-forward "w.") (point-marker)))) ((markerp position) position) ((integerp position) (set-marker (make-marker) position buffer)))) (full-prompt (cond ((null prompt) (gptel--create-prompt start-marker)) ((stringp prompt) (let ((temp-buffer ...)) (save-current-buffer (set-buffer temp-buffer) (unwind-protect ... ...)))) ((consp prompt) prompt))) (request-data (gptel--request-data gptel-backend full-prompt)) (info (list :data request-data :buffer buffer :position start-marker))) (if context (progn (plist-put info :context context))) (if in-place (progn (plist-put info :in-place in-place))) (if dry-run nil (funcall (if gptel-use-curl #'gptel-curl-get-response #'gptel--url-get-response) info callback)) request-data)
  (progn (gptel--sanitize-model) (let* ((gptel--system-message (if (and gptel-context--alist (eq gptel-use-context 'system)) (gptel-context--wrap system) system)) (gptel-stream stream) (start-marker (cond ((null position) (if (use-region-p) (set-marker ... ...) (save-excursion ... ...))) ((markerp position) position) ((integerp position) (set-marker (make-marker) position buffer)))) (full-prompt (cond ((null prompt) (gptel--create-prompt start-marker)) ((stringp prompt) (let (...) (save-current-buffer ... ...))) ((consp prompt) prompt))) (request-data (gptel--request-data gptel-backend full-prompt)) (info (list :data request-data :buffer buffer :position start-marker))) (if context (progn (plist-put info :context context))) (if in-place (progn (plist-put info :in-place in-place))) (if dry-run nil (funcall (if gptel-use-curl #'gptel-curl-get-response #'gptel--url-get-response) info callback)) request-data))
  (progn (let ((--cl-keys-- --cl-rest--)) (while --cl-keys-- (cond ((memq (car --cl-keys--) '(:callback :buffer :position :context :dry-run :stream :in-place :system :allow-other-keys)) (if (cdr --cl-keys--) nil (error "Missing argument for %s" (car --cl-keys--))) (setq --cl-keys-- (cdr (cdr --cl-keys--)))) ((car (cdr (memq ... --cl-rest--))) (setq --cl-keys-- nil)) (t (error "Keyword argument %s not one of (:callback :buffer ..." (car --cl-keys--)))))) (progn (gptel--sanitize-model) (let* ((gptel--system-message (if (and gptel-context--alist (eq gptel-use-context ...)) (gptel-context--wrap system) system)) (gptel-stream stream) (start-marker (cond ((null position) (if ... ... ...)) ((markerp position) position) ((integerp position) (set-marker ... position buffer)))) (full-prompt (cond ((null prompt) (gptel--create-prompt start-marker)) ((stringp prompt) (let ... ...)) ((consp prompt) prompt))) (request-data (gptel--request-data gptel-backend full-prompt)) (info (list :data request-data :buffer buffer :position start-marker))) (if context (progn (plist-put info :context context))) (if in-place (progn (plist-put info :in-place in-place))) (if dry-run nil (funcall (if gptel-use-curl #'gptel-curl-get-response #'gptel--url-get-response) info callback)) request-data)))
  (let* ((callback (car (cdr (plist-member --cl-rest-- ':callback)))) (buffer (car (cdr (or (plist-member --cl-rest-- ':buffer) (list nil (current-buffer)))))) (position (car (cdr (plist-member --cl-rest-- ':position)))) (context (car (cdr (plist-member --cl-rest-- ':context)))) (dry-run (car (cdr (plist-member --cl-rest-- ':dry-run)))) (stream (car (cdr (plist-member --cl-rest-- ':stream)))) (in-place (car (cdr (plist-member --cl-rest-- ':in-place)))) (system (car (cdr (or (plist-member --cl-rest-- ':system) (list nil gptel--system-message)))))) (progn (let ((--cl-keys-- --cl-rest--)) (while --cl-keys-- (cond ((memq (car --cl-keys--) '...) (if (cdr --cl-keys--) nil (error "Missing argument for %s" ...)) (setq --cl-keys-- (cdr ...))) ((car (cdr ...)) (setq --cl-keys-- nil)) (t (error "Keyword argument %s not one of (:callback :buffer ..." (car --cl-keys--)))))) (progn (gptel--sanitize-model) (let* ((gptel--system-message (if (and gptel-context--alist ...) (gptel-context--wrap system) system)) (gptel-stream stream) (start-marker (cond (... ...) (... position) (... ...))) (full-prompt (cond (... ...) (... ...) (... prompt))) (request-data (gptel--request-data gptel-backend full-prompt)) (info (list :data request-data :buffer buffer :position start-marker))) (if context (progn (plist-put info :context context))) (if in-place (progn (plist-put info :in-place in-place))) (if dry-run nil (funcall (if gptel-use-curl #'gptel-curl-get-response #'gptel--url-get-response) info callback)) request-data))))
  gptel-request(nil :stream t)
  (if (and arg (require 'gptel-transient nil t)) (call-interactively #'gptel-menu) (message "Querying %s..." (progn (or (progn (and (memq (type-of gptel-backend) cl-struct-gptel-backend-tags) t)) (signal 'wrong-type-argument (list 'gptel-backend gptel-backend))) (aref gptel-backend 1))) (gptel--sanitize-model) (gptel-request nil :stream gptel-stream) (gptel--update-status " Waiting..." 'warning))
  (closure (t) (&optional arg) "Submit this prompt to the current LLM backend.\n\nBy..." (interactive "P") (if (and arg (require 'gptel-transient nil t)) (call-interactively #'gptel-menu) (message "Querying %s..." (progn (or (progn (and ... t)) (signal 'wrong-type-argument (list ... gptel-backend))) (aref gptel-backend 1))) (gptel--sanitize-model) (gptel-request nil :stream gptel-stream) (gptel--update-status " Waiting..." 'warning)))(nil)
  apply((closure (t) (&optional arg) "Submit this prompt to the current LLM backend.\n\nBy..." (interactive "P") (if (and arg (require 'gptel-transient nil t)) (call-interactively #'gptel-menu) (message "Querying %s..." (progn (or (progn (and ... t)) (signal 'wrong-type-argument (list ... gptel-backend))) (aref gptel-backend 1))) (gptel--sanitize-model) (gptel-request nil :stream gptel-stream) (gptel--update-status " Waiting..." 'warning))) nil)
  (if (derived-mode-p 'org-mode) (let* ((val (seq-mapn #'(lambda (a b) (or a b)) (gptel-org--entry-properties) (list gptel--system-message gptel-backend gptel-model gptel-temperature gptel-max-tokens)))) (progn (ignore (consp val)) (let* ((x24 (car-safe val)) (x25 (cdr-safe val))) (progn (ignore (consp x25)) (let* ((x26 ...) (x27 ...)) (progn (ignore ...) (let* ... ...))))))) (apply send-fun args))
  gptel-org--send-with-props((closure (t) (&optional arg) "Submit this prompt to the current LLM backend.\n\nBy..." (interactive "P") (if (and arg (require 'gptel-transient nil t)) (call-interactively #'gptel-menu) (message "Querying %s..." (progn (or (progn (and ... t)) (signal 'wrong-type-argument (list ... gptel-backend))) (aref gptel-backend 1))) (gptel--sanitize-model) (gptel-request nil :stream gptel-stream) (gptel--update-status " Waiting..." 'warning))) nil)
  apply(gptel-org--send-with-props (closure (t) (&optional arg) "Submit this prompt to the current LLM backend.\n\nBy..." (interactive "P") (if (and arg (require 'gptel-transient nil t)) (call-interactively #'gptel-menu) (message "Querying %s..." (progn (or (progn (and ... t)) (signal 'wrong-type-argument (list ... gptel-backend))) (aref gptel-backend 1))) (gptel--sanitize-model) (gptel-request nil :stream gptel-stream) (gptel--update-status " Waiting..." 'warning))) nil)
  gptel-send(nil)
  funcall-interactively(gptel-send nil)
  command-execute(gptel-send record)
  execute-extended-command(nil "gptel-send" "gptel-send")
  funcall-interactively(execute-extended-command nil "gptel-send" "gptel-send")
  command-execute(execute-extended-command)

Log Information

No response

@andrewdea andrewdea added the bug Something isn't working label Nov 12, 2024
@karthink
Owner

Thanks for reporting!

Fixed in 4aa6b7c. I was already working on it; I explain in the PR thread (#472).

@karthink
Owner

@andrewdea There is still a potential issue with merging plists non-recursively in the gptel--merge-plists call in that function: the options set above (temperature and max_tokens) can be lost. I'll leave this thread open in case you have ideas on how to address it.
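To illustrate the hazard (a standalone illustration, not gptel's actual merge code): a non-recursive, last-wins merge replaces nested plists wholesale, so an `:options` plist coming from model-specific request params would clobber the one carrying temperature and max_tokens:

```elisp
;; Standalone illustration of a shallow, last-wins plist merge.
(defun demo/shallow-merge (a b)
  "Return a copy of plist A with top-level keys of plist B overriding."
  (let ((result (copy-sequence a)))
    (while b
      (setq result (plist-put result (pop b) (pop b))))
    result))

(demo/shallow-merge '(:options (:temperature 0.7 :num_predict 100))
                    '(:options (:num_keep 5)))
;; => (:options (:num_keep 5))   ; temperature and num_predict are lost
```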

@karthink karthink changed the title void variable in gptel-ollama.el (Ollama) temperature/max_token settings lost when using model-specific request parameters Nov 12, 2024
@andrewdea
Author

Hi @karthink sorry about the delay.
If I'm understanding correctly, we want to keep any global options, but temperature and max_tokens are a bit different because they can be set both at the global level and for a specific backend or model.

I created this PR to address this problem: #490. Let me know if it's working as expected.

@karthink
Owner

@andrewdea Thanks for the PR.

If I'm understanding correctly, we want to keep any global options

Actually this is not the case. It was decided in #471, specifically here, that the priority order is

  1. model-specific
  2. backend-specific
  3. global variable

So the feature is working as intended. If you think the order should be different, please let us know your reasoning in #471, or by creating a new discussion and linking to #471 and this issue.
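The gptel--merge-plists call visible in the backtrace passes the request plist first, then the backend's :request-params, then the model's request params. Assuming later arguments override earlier ones, that call order corresponds to exactly this precedence:

```elisp
;; Sketch of the precedence, assuming later arguments to the merge
;; take priority over earlier ones (the accessor names below are the
;; cl-defstruct defaults and may differ in the source):
(gptel--merge-plists
 prompts-plist                                 ; 3. global variables (lowest)
 (gptel-backend-request-params gptel-backend)  ; 2. backend-specific
 (gptel--model-request-params gptel-model))    ; 1. model-specific (highest)
```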

@andrewdea
Author

@karthink perfect, yes: that's what I meant, though I phrased it poorly the first time. We keep options according to that priority order.
I meant to say that we want to keep any model- or backend-specific option, which is why we merge the options plist with the global values. The fix should make it so that this merge doesn't unconditionally override the global values for temperature and max_tokens: we only override those when the model- or backend-specific options actually include replacement values.
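One way to get that behavior (a hypothetical sketch, not the code in #490) is a merge that recurses into nested plists, so an override only replaces the keys it actually supplies:

```elisp
;; Hypothetical recursive plist merge; `plistp' requires Emacs 29+.
(defun demo/merge-plists-recursive (base override)
  "Return BASE with OVERRIDE merged in, descending into nested plists."
  (let ((result (copy-sequence base)))
    (while override
      (let* ((key (pop override))
             (val (pop override))
             (old (plist-get result key)))
        (setq result
              (plist-put result key
                         (if (and old (plistp old) (plistp val))
                             (demo/merge-plists-recursive old val)
                           val)))))
    result))

(demo/merge-plists-recursive
 '(:options (:temperature 0.7 :num_predict 100))
 '(:options (:num_keep 5)))
;; => (:options (:temperature 0.7 :num_predict 100 :num_keep 5))
```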
