Issues: konveyor/kai
Amazon Bedrock: InvokeModel operation: The provided model doesn't support on-demand throughput - "meta.llama3-2-90b-instruct-v1:0"
wontfix
#411 opened Oct 4, 2024 by jwmatthews
Build priority-aware task manager to focus on relevant work/changes first
#409 opened Oct 2, 2024 by fabianvf
Merging `kai/config.toml` and `build/config.toml` is causing confusion with changing models
bug
#400 opened Sep 27, 2024 by jwmatthews
podman compose - broken log/trace to disk when using the supplied build/config_example.toml
bug
podman compose up: psycopg2.OperationalError: connection to server at "127.0.0.1", port 5432 failed: Connection refused
bug
#395 opened Sep 27, 2024 by jwmatthews
The class `ChatOpenAI` was deprecated in LangChain 0.0.10 and will be removed in 0.3.0
bug
#393 opened Sep 27, 2024 by jwmatthews
[Experiment] Explore solved incident generated via semantic diff instead of llm_summary
experiment
#383 opened Sep 21, 2024 by jwmatthews
Solved incident "llm_lazy" - "llm_summary" running against coolstore and llama3 is seeing multiple issues
bug
#382 opened Sep 21, 2024 by jwmatthews
Capture number of tokens in a request and response when possible
#373 opened Sep 17, 2024 by jwmatthews