
Issue applying sampling parameters to the Inference Server #25

Open
WagyuShark opened this issue Nov 11, 2024 · 0 comments

Comments

@WagyuShark
Contributor

  • We are currently running an LLM inference server through OpenAILike.
  • Sampling parameters cannot be changed and applied on a per-request basis.
  • We need to find and modify the function in LlamaIndex that would allow sampling parameters to be included with every inference request (see the sketch after this list).
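A minimal sketch of the intended behavior follows, assuming a recent llama-index with the llama-index-llms-openai-like integration installed. The endpoint URL, API key, and model name below are placeholders, not values from this issue, and whether per-call keyword arguments are actually forwarded into the request payload depends on the installed llama-index version:

```python
# Sketch only: placeholders throughout, and per-call kwarg forwarding
# is version-dependent in llama-index.
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="served-model",                 # placeholder model name
    api_base="http://localhost:8000/v1",  # placeholder inference-server URL
    api_key="dummy",                      # many self-hosted servers ignore the key
    is_chat_model=True,
    # Sampling parameters set at construction apply to every request:
    temperature=0.7,
    additional_kwargs={"top_p": 0.9},
)

# Desired behavior: complete()/chat() accept extra keyword arguments
# that get merged into the request payload, so sampling parameters
# can differ per call instead of being fixed on the LLM object.
greedy = llm.complete("Summarize the issue.", temperature=0.0)
creative = llm.complete("Summarize the issue.", temperature=1.2, top_p=0.95)
print(greedy.text)
print(creative.text)
```

If the installed release does not forward per-call kwargs this way, the fallback is the modification described above: patching the llama-index code path that assembles the request parameters so caller-supplied sampling values take precedence, or mutating `llm.additional_kwargs` between calls as a stopgap.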