Does TruLens work with a local LLM using LangChain without an OpenAI key? #673

Answered by jimwhite
navinchandrac asked this question in Q&A

You don't show your code, so the error is hard to interpret: it references LlamaIndex, but the notebook you link appears to use LangChain.

The LiteLLM class provides compatibility with various LLM implementations: https://www.trulens.org/trulens_eval/api/litellm_provider/

Here's a snippet of how I use it with Gemini (and LlamaIndex):

from trulens_eval import Feedback, LiteLLM, TruLlama
from trulens_eval.feedback import Groundedness

# Initialize provider class
GEMINI_PROVIDER = LiteLLM(model_engine="gemini-pro")

GROUNDED = Groundedness(groundedness_provider=GEMINI_PROVIDER)

# Define a groundedness feedback function
f_groundedness = (
    Feedback(GROUNDED.groundedness_measure_wi…
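Since the original question was about running without an OpenAI key, here is a minimal sketch of pointing the same LiteLLM provider at a local model instead of Gemini. This assumes a local Ollama server; the model name, the `completion_kwargs` parameter, and the endpoint URL are assumptions about your local setup, not something from the answer above.

```python
from trulens_eval import Feedback, LiteLLM

# Assumption: LiteLLM routes "ollama/<model>" requests to a local
# Ollama server, so no OpenAI key is required. The api_base below is
# Ollama's default port; adjust for your environment.
local_provider = LiteLLM(
    model_engine="ollama/llama2",
    completion_kwargs={"api_base": "http://localhost:11434"},
)

# Feedback functions then call the local model for their judgments.
f_answer_relevance = Feedback(local_provider.relevance).on_input_output()
```

The same pattern should apply to the groundedness snippet above: construct the provider once and pass it wherever an LLM-backed feedback function is needed.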

Answer selected by joshreini1