
Error: AWS Bedrock Guardrails not supported #1028

Open
mamahajan opened this issue Oct 11, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@mamahajan

Problem statement: I am trying to use an AWS Bedrock guardrail with the jupyter_ai extension.
Steps taken to install the jupyter_ai extension on an AWS SageMaker instance:

pip install jupyter_ai_magics==1.0.15
pip install jupyter_ai==1.0.15
sudo systemctl restart jupyter-server

jupyter lab --AiExtension.default_language_model=bedrock-chat:anthropic.claude-3-haiku-20240307-v1:0

I then tried to enable guardrails with either of the following forms (see also the standalone sketch below):

  1. jupyter lab --AiExtension.model_parameters bedrock-chat:anthropic.claude-3-haiku-20240307-v1:0='{"guardrails":{"guardrailIdentifier":"********","guardrailVersion":"2"}}'
    OR
  2. jupyter lab --AiExtension.model_parameters bedrock-chat:anthropic.claude-3-haiku-20240307-v1:0='{"guardrails":{"id":"*******","version":2}}'
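
To isolate whether the failure comes from jupyter_ai or from the underlying LangChain provider, the same guardrails mapping can be passed directly to langchain_community's Bedrock chat model. A minimal sketch, assuming the same langchain_community 0.1.x install (the guardrail ID and region are placeholders):

```python
# Minimal repro outside jupyter_ai (a sketch; guardrail ID and region are placeholders).
# langchain_community's Bedrock integration expects the keys "id" and "version",
# per the TypeError raised in _guardrails_enabled (see traceback below).
from langchain_community.chat_models import BedrockChat

llm = BedrockChat(
    model_id="anthropic.claude-3-haiku-20240307-v1:0",
    region_name="us-east-1",  # assumed region
    guardrails={"id": "<guardrail-id>", "version": "2"},
)
print(llm.invoke("Hello").content)
```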

Installed versions:

boto3: 1.35.16
langchain: 0.1.20
jupyter_ai: 1.0.15
jupyter_ai_magics: 1.0.15
jupyterlab: 3.6.8 (can't upgrade to 4.x, as the AWS SageMaker instance doesn't support it)

LLM model used: anthropic.claude-3-haiku-20240307-v1:0

As the commands above show, I have tried both the id/version form and the guardrailIdentifier/guardrailVersion form, but both fail with the following error messages.

1. Error when tried with the guardrail id parameter (form 2 above):
Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 545, in _prepare_input_and_invoke
response = self.client.invoke_model(**request_options)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/client.py", line 569, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/client.py", line 980, in _make_api_call
request_dict = self._convert_to_request_dict(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/client.py", line 1047, in _convert_to_request_dict
request_dict = self._serializer.serialize_to_request(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/botocore/validate.py", line 381, in serialize_to_request
raise ParamValidationError(report=report.generate_report())
botocore.exceptions.ParamValidationError: Parameter validation failed:
Unknown parameter in input: "guardrail", must be one of: body, contentType, accept, modelId, trace, guardrailIdentifier, guardrailVersion

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/base.py", line 125, in on_message
await self.process_message(message)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/default.py", line 61, in process_message
response = await self.llm_chain.apredict(input=message.body, stop=["\nHuman:"])
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 333, in apredict
return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 157, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 428, in acall
return await self.ainvoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 212, in ainvoke
raise e
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 203, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 298, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 165, in agenerate
return await self.llm.agenerate_prompt(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 570, in agenerate_prompt
return await self.agenerate(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 530, in agenerate
raise exceptions[0]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 719, in _agenerate_with_cache
result = await self._agenerate(messages, stop=stop, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 689, in _agenerate
return await self._generate_in_executor(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 289, in _generate_in_executor
return await loop.run_in_executor(executor, _call_with_args)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/chat_models/bedrock.py", line 300, in _generate
completion, usage_info = self._prepare_input_and_invoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 552, in _prepare_input_and_invoke
raise ValueError(f"Error raised by bedrock service: {e}")
ValueError: Error raised by bedrock service: Parameter validation failed:
Unknown parameter in input: "guardrail", must be one of: body, contentType, accept, modelId, trace, guardrailIdentifier, guardrailVersion

2. Error when tried with the guardrailIdentifier parameter (form 1 above):

Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 480, in _guardrails_enabled
and bool(self.guardrails["id"])
KeyError: 'id'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/base.py", line 125, in on_message
await self.process_message(message)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai/chat_handlers/default.py", line 61, in process_message
response = await self.llm_chain.apredict(input=message.body, stop=["\nHuman:"])
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 333, in apredict
return (await self.acall(kwargs, callbacks=callbacks))[self.output_key]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/_api/deprecation.py", line 157, in awarning_emitting_wrapper
return await wrapped(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 428, in acall
return await self.ainvoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 212, in ainvoke
raise e
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/base.py", line 203, in ainvoke
await self._acall(inputs, run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 298, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain/chains/llm.py", line 165, in agenerate
return await self.llm.agenerate_prompt(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 570, in agenerate_prompt
return await self.agenerate(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 530, in agenerate
raise exceptions[0]
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 719, in _agenerate_with_cache
result = await self._agenerate(messages, stop=stop, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 689, in _agenerate
return await self._generate_in_executor(*args, **kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 289, in _generate_in_executor
return await loop.run_in_executor(executor, _call_with_args)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/chat_models/bedrock.py", line 300, in _generate
completion, usage_info = self._prepare_input_and_invoke(
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 519, in _prepare_input_and_invoke
if self._guardrails_enabled:
File "/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_community/llms/bedrock.py", line 485, in _guardrails_enabled
raise TypeError(
TypeError: Guardrails must be a dictionary with 'id' and 'version' keys.

Can you confirm whether AWS Bedrock Guardrails are supported in the jupyter_ai extension?

Reference issues :
langchain-ai/langchain-aws#156
langchain-ai/langchain-aws#25

@mamahajan mamahajan added the bug Something isn't working label Oct 11, 2024
@mamahajan
Author

@dlqqq Will you be able to comment on this?
It seems AWS has moved away from langchain_community to https://github.com/langchain-ai/langchain-aws.

Reference : langchain-ai/langchain#20216 (comment)

@dlqqq
Member

dlqqq commented Oct 15, 2024

@mamahajan This issue is likely occurring because you are using Jupyter AI v1.x, which is no longer maintained as JupyterLab 3 has reached its end-of-life. We will not backport any features to Jupyter AI v1.x.

It seems AWS has moved away from langchain_community to https://github.com/langchain-ai/langchain-aws

Jupyter AI has migrated to use langchain-aws as of v2.20.0, which was released 3 months ago: 19c7b6b

Can you try reproducing this issue with the latest release of Jupyter AI (v2.25.0) in JupyterLab 4?
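
For reference, langchain-aws uses the Bedrock InvokeModel API's own key names in the guardrails mapping, so on v2.20.0+ the guardrailIdentifier/guardrailVersion form should be the expected shape. A sketch of the equivalent direct call, assuming langchain-aws is installed (the guardrail ID is a placeholder):

```python
# A sketch using langchain-aws, which jupyter_ai >= 2.20.0 depends on.
# The keys match the Bedrock InvokeModel API: guardrailIdentifier and
# guardrailVersion (the ID below is a placeholder).
from langchain_aws import ChatBedrock

llm = ChatBedrock(
    model_id="anthropic.claude-3-haiku-20240307-v1:0",
    guardrails={"guardrailIdentifier": "<guardrail-id>", "guardrailVersion": "2"},
)
print(llm.invoke("Hello").content)
```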
