SNOW-1022303: Cannot import snowflake.connector on amazonlinux due to dependency conflict with cryptography #1865

Closed
metalaureate opened this issue Jan 29, 2024 · 8 comments

metalaureate commented Jan 29, 2024

Python version

Python 3.12.0 (main, Jan 29 2024, 22:33:41) [GCC 11.4.1 20230605 (Red Hat 11.4.1-2)]

Operating system and processor architecture

Linux-6.4.16-linuxkit-aarch64-with-glibc2.34

Installed packages

asn1crypto==1.5.1
certifi==2023.11.17
cffi==1.16.0
charset-normalizer==3.3.2
cryptography==41.0.5
filelock==3.13.1
idna==3.6
packaging==23.2
platformdirs==3.11.0
pycparser==2.21
PyJWT==2.8.0
pyOpenSSL==23.3.0
pytz==2023.4
requests==2.31.0
snowflake-connector-python==3.7.0
sortedcontainers==2.4.0
tomlkit==0.12.3
typing_extensions==4.9.0
urllib3==2.1.0

What did you do?

I am trying to use snowflake.connector in the AWS Lambda environment, but both in my local Docker build of the amazonlinux environment and remotely in an AWS Lambda layer, attempting to import snowflake.connector produces:

>>> import snowflake.connector
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/snowflake/connector/__init__.py", line 19, in <module>
    from .connection import SnowflakeConnection
  File "/snowflake/connector/connection.py", line 30, in <module>
    from cryptography.hazmat.primitives import serialization
  File "/cryptography/hazmat/primitives/serialization/__init__.py", line 7, in <module>
    from cryptography.hazmat.primitives._serialization import (
  File "/cryptography/hazmat/primitives/_serialization.py", line 11, in <module>
    from cryptography.hazmat.primitives.hashes import HashAlgorithm
  File "/cryptography/hazmat/primitives/hashes.py", line 10, in <module>
    from cryptography.hazmat.bindings._rust import openssl as rust_openssl
ImportError: /cryptography/hazmat/bindings/_rust.abi3.so: cannot open shared object file: No such file or directory
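An ImportError on a compiled extension such as _rust.abi3.so usually means either that the file really is missing or that the wheel was built for a different architecture than the interpreter loading it. A quick way to check, sketched here using the paths from the traceback above (install the file utility via yum if it is missing):

uname -m                                                    # architecture of the running container
python3.12 -c "import platform; print(platform.machine())"  # architecture of the interpreter
ls -l /cryptography/hazmat/bindings/_rust.abi3.so           # is the binding actually on disk?
file /cryptography/hazmat/bindings/_rust.abi3.so            # if so, is it built for x86-64 or aarch64?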

This reproduces it:

create the amazonlinux environment

docker run -it --rm amazonlinux
yum update -y
yum groupinstall "Development Tools" -y
yum install wget openssl-devel bzip2-devel libffi-devel -y

compile Python 3.12

wget https://www.python.org/ftp/python/3.12.0/Python-3.12.0.tar.xz
tar -xf Python-3.12.0.tar.xz
cd Python-3.12.0
./configure --enable-optimizations
make altinstall

create virtual environment

python3.12 -m venv venv
source venv/bin/activate

install as per https://repost.aws/knowledge-center/lambda-import-module-error-python

pip3 install --upgrade --platform manylinux2014_x86_64 --target . --python-version 3.12 --only-binary=:all: snowflake-connector-python

(I tried every which way but still got the same error.)
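A quick sanity check on what that targeted install actually produced (a sketch; it assumes the commands above were run from the directory passed to --target, and it reads the WHEEL metadata pip writes next to each package):

grep Tag cryptography-*.dist-info/WHEEL                  # should show a manylinux*_x86_64 tag
grep Tag snowflake_connector_python-*.dist-info/WHEEL
uname -m                                                 # if this prints aarch64, x86_64 wheels will not load here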

What did you expect to see?

I expected import snowflake.connector not to throw low-level binding dependency errors.

Can you set logging to DEBUG and collect the logs?

NA

@github-actions github-actions bot changed the title Cannot import snowflake.connector on amazonlinux due to dependency conflict with cryptography SNOW-1022303: Cannot import snowflake.connector on amazonlinux due to dependency conflict with cryptography Jan 29, 2024
@sfc-gh-aling
Collaborator

Thanks for the report! We will bump the pinned dependency versions and publish a new release after verifying them.
PR is here: #1866

@metalaureate
Author

Thanks. Any timeline on that?

@sfc-gh-ssampat

@sfc-gh-aling My customer would like to know when the PR #1866 will be reviewed and merged.

@sfc-gh-aling
Collaborator

We will do a release in the coming two weeks.
If you're looking for an immediate patch, you can find our nightly build here: https://pypi.org/project/snowflake-connector-python-nightly/

@sfc-gh-ssampat

@sfc-gh-aling Is it ok to send the customer this link to the nightly build that you mentioned above?


metalaureate commented Feb 8, 2024

@sfc-gh-aling I tried creating an AWS Lambda layer using the nightly build, but the same problem happens. Any suggestions for what I could try next? Here is what I did:

Note: I could not install the nightly build with targeted options. I tried pip3 install --upgrade --platform manylinux2014_x86_64 --target . --python-version 3.12 --only-binary=:all: snowflake-connector-python-nightly but it was not found, so I fell back to pip3 install snowflake-connector-python-nightly.

docker run -it --rm amazonlinux
# and then after python3.12 is installed:
python3.12 -m venv venv
source venv/bin/activate

pip3 install snowflake-connector-python-nightly
cd venv/lib/python3.12/site-packages
mkdir -p /tmp/python
cp -r ./* /tmp/python/
cd /tmp
zip -r my_lambda_layer.zip python/
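One way to sanity-check the finished zip before uploading it (a sketch; the paths assume the packaging commands above, and unzip/file need to be installed in the container):

unzip -l my_lambda_layer.zip | grep _rust.abi3.so        # confirm the binding made it into the zip
unzip -p my_lambda_layer.zip python/cryptography/hazmat/bindings/_rust.abi3.so | file -
# for an x86_64 Lambda runtime the second command should report x86-64;
# aarch64 output here would explain the ImportError below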

When the layer is installed on AWS Lambda and import snowflake.connector is called, the same error occurs:

Response
{
  "errorMessage": "/opt/python/cryptography/hazmat/bindings/_rust.abi3.so: cannot open shared object file: No such file or directory",
  "errorType": "ImportError",
  "requestId": "3aaa88d1-ba8b-4eab-b44e-2222899d99f5",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 10, in lambda_handler\n    import snowflake.connector\n",
    "  File \"/opt/python/snowflake/connector/__init__.py\", line 19, in <module>\n    from .connection import SnowflakeConnection\n",
    "  File \"/opt/python/snowflake/connector/connection.py\", line 30, in <module>\n    from cryptography.hazmat.primitives import serialization\n",
    "  File \"/opt/python/cryptography/hazmat/primitives/serialization/__init__.py\", line 7, in <module>\n    from cryptography.hazmat.primitives._serialization import (\n",
    "  File \"/opt/python/cryptography/hazmat/primitives/_serialization.py\", line 10, in <module>\n    from cryptography.hazmat.primitives.hashes import HashAlgorithm\n",
    "  File \"/opt/python/cryptography/hazmat/primitives/hashes.py\", line 9, in <module>\n    from cryptography.hazmat.bindings._rust import openssl as rust_openssl\n"
  ]
}

I tried to install a previous version, but there are no previous versions with targeted binaries until 1.9.0, which I am unable to install. I tried going back to pip3 install snowflake-connector-python==3.5.0, but that had the same problem. So whatever this is, it is an old problem. Have none of your clients ever run the Snowflake Python connector in an AWS Lambda before?
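For reference, one way to check whether a targeted binary wheel exists for a given version without installing it is pip download with the same targeting flags used earlier in this thread (a sketch; the pinned version is just an example):

pip3 download --only-binary=:all: --platform manylinux2014_x86_64 --python-version 3.12 \
    --no-deps -d /tmp/wheels snowflake-connector-python==3.5.0
ls /tmp/wheels    # if a matching wheel exists, its filename carries the platform tag; otherwise pip errors out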

@sfc-gh-aling
Collaborator

@metalaureate
We have test coverage on AWS Lambda.

Could this be related to how you prepare your Docker image/Python? Have you tried installing Python 3.12 from a binary installer or conda to see how it goes?

@sfc-gh-ssampat Yes, you can share the link with our customer.

@metalaureate
Author

This is fixed by making sure that, on Apple Silicon, the amazonlinux container is run emulating amd64 (x86_64).

Build Python 3.12 on Amazon Linux, emulated as x86

docker run --platform linux/amd64 -it --rm amazonlinux

yum update -y
yum groupinstall "Development Tools" -y
yum install wget openssl-devel bzip2-devel libffi-devel -y

wget https://www.python.org/ftp/python/3.12.0/Python-3.12.0.tar.xz
tar -xf Python-3.12.0.tar.xz
cd Python-3.12.0
./configure --enable-optimizations
make -j $(nproc)

make altinstall

Prepare Env

rm -rf /tmp/python
rm -rf /venv
rm -rf /tmp/my_lambda_layer.zip

python3.12 -m venv venv
source venv/bin/activate

pip3 install snowflake

package for layer

cd venv/lib/python3.12/site-packages
mkdir -p /tmp/python
cp -r ./* /tmp/python/
cd /tmp
zip -r my_lambda_layer.zip python/
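A quick verification before uploading the layer (a sketch; it assumes the steps above were run inside the emulated amd64 container and that unzip is installed):

uname -m                                                   # should print x86_64 under --platform linux/amd64
python3.12 -c "import snowflake.connector; print('import ok')"
unzip -l /tmp/my_lambda_layer.zip | grep _rust.abi3.so     # confirm the binding made it into the zip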
