Does TruLens work with Local LLM using Langchain without OpenAI Key? #673
-
Does TruLens work with a local LLM via LangChain, without an OpenAI key? (Example notebook for a local LLM: https://colab.research.google.com/drive/1XVjur9cdYLhxe6mrQwQUlWIm4VJmiYlu?usp=sharing#scrollTo=wJ7k5mYXj0vG)

I get the error below when using TruLens with a local LLM via LangChain for the above example:

```
ImportError: cannot import name 'LLM' from 'llama_index.llms.base' (/usr/local/lib/python3.10/dist-packages/llama_index/llms/base.py)
```

Also, which LLM does TruLens use internally to produce the expected feedback results that are compared against the LLM app's actual feedback results?
-
You don't show your code, so the error doesn't make much sense: it references LlamaIndex, but the notebook you link appears to use LangChain.

The LiteLLM class provides compatibility with various LLM implementations: https://www.trulens.org/trulens_eval/api/litellm_provider/

Here's a snippet of how I use it with Gemini (and LlamaIndex):

```python
from trulens_eval import Feedback, LiteLLM, TruLlama
from trulens_eval.feedback import Groundedness

# Initialize provider class
GEMINI_PROVIDER = LiteLLM(model_engine="gemini-pro")
GROUNDED = Groundedness(groundedness_provider=GEMINI_PROVIDER)

# Define a groundedness feedback function
f_groundedness = (
    Feedback(GROUNDED.groundedness_measure_with_cot_reasons)
    .on(TruLlama.select_source_nodes().node.text.collect())
    .on_output()
    .aggregate(GROUNDED.grounded_statements_aggregator)
)
```