Commit

lint
philippemiron committed Sep 8, 2024
1 parent 761560e · commit e10d635
Showing 1 changed file with 4 additions and 2 deletions.
src/app.py (6 changes: 4 additions & 2 deletions)
@@ -66,12 +66,14 @@ def call_llm(user_query: str) -> Generator[str, None, None]:

     # RAG config
     config = genai.GenerationConfig(
-        temperature=0.3, # [0, 1] lower values are more deterministic
+        temperature=0.3,  # [0, 1] lower values are more deterministic
         top_p=0.9,
         top_k=20,  # decreased from 40 to 20 to keep the model focused on high-probability tokens
     )

-    response = st.session_state["llm"].generate_content(prompt, generation_config=config, stream=True)
+    response = st.session_state["llm"].generate_content(
+        prompt, generation_config=config, stream=True
+    )
     for chunk in response:
         yield chunk.text
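
For context, a minimal sketch of how this streaming generator is typically wired into the rest of a Streamlit app. Only the GenerationConfig and the streaming generate_content call are confirmed by the diff above; the API-key secret name, model name, and chat widgets below are illustrative assumptions, not code from this repository.

# Hypothetical wiring sketch, not code from this repository: only the
# GenerationConfig and the streaming generate_content call appear in the
# diff above. The secret name, model name, and chat UI are assumptions.
import google.generativeai as genai
import streamlit as st

genai.configure(api_key=st.secrets["GOOGLE_API_KEY"])  # assumed secret name

if "llm" not in st.session_state:
    # Assumed model; the model actually used by the app is not shown here.
    st.session_state["llm"] = genai.GenerativeModel("gemini-1.5-flash")

if user_query := st.chat_input("Ask a question"):
    with st.chat_message("assistant"):
        # st.write_stream consumes the generator returned by call_llm,
        # rendering each yielded chunk as it arrives and returning the
        # concatenated response text.
        answer = st.write_stream(call_llm(user_query))

Pairing stream=True with st.write_stream lets tokens render as they arrive instead of blocking until the full completion is ready.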
