serverless / serverless-python-requirements

⚡️🐍📦 Serverless plugin to bundle Python packages

Unable to import module 'api': /lib64/libc.so.6: version `GLIBC_2.28' not found (required by /tmp/sls-py-req/cryptography/hazmat/bindings/_rust.abi3.so)

Areix opened this issue · comments

commented

Are you certain it's a bug?

  • Yes, it looks like a bug

Are you using the latest plugin release?

  • Yes, I'm using the latest plugin release

Is there an existing issue for this?

  • I have searched existing issues, it hasn't been reported yet

Issue description

I get the error below after deploying a Serverless Lambda function.

Unable to import module 'api': /lib64/libc.so.6: version `GLIBC_2.28' not found (required by /tmp/sls-py-req/cryptography/hazmat/bindings/_rust.abi3.so)

Python runtime: 3.9

Service configuration (serverless.yml) content

pythonRequirements:
    dockerizePip: non-linux  
    zip: true
    useDownloadCache: true
    useStaticCache: true
    staticCacheMaxVersions: 10
    slim: true
    # layer: true
    strip: false
    slimPatternsAppendDefaults: false
    slimPatterns:
      - '**/*.py[c|o]'
      - '**/__pycache__*'
      - '**/*.egg-info*'
    noDeploy:
      - pytest
      - boto3
      - botocore
      - docutils
      - jmespath
      - pip
      - python-dateutil
      - s3transfer
      - setuptools
      - six

Command name and used flags

sls deploy --stage dev --region ap-east-1 --verbose

Command output

Adding Python requirements helper to apis/xxxxx
Adding Python requirements helper to apis/yyyyy
Generated requirements from /home/runner/work/apis/xxxxx/requirements.txt in /home/runner/work/.serverless/apis/xxxxx/requirements.txt
Installing requirements from "/home/runner/.cache/serverless-python-requirements/52e8e73b448018a1369d4f64ef169579fb6397704f9a51744dfa78c2a6e65611_x86_64_slspyc/requirements.txt"
Using download cache directory /home/runner/.cache/serverless-python-requirements/downloadCacheslspyc
Running ...
Generated requirements from /home/runner/work/apis/yyyyy/requirements.txt in /home/runner/work/.serverless/apis/yyyyy/requirements.txt
Installing requirements from "/home/runner/.cache/serverless-python-requirements/deedc524f94a2ae84b643da28db4779abd06c0e4362c975c7473481cdbcc8a6f_x86_64_slspyc/requirements.txt"
Using download cache directory /home/runner/.cache/serverless-python-requirements/downloadCacheslspyc
Running ...
Zipping required Python packages for apis/xxxxx
Zipping required Python packages for apis/yyyyy
Excluding development dependencies for function "xxxxx"
Excluding development dependencies for function "yyyyy"
Removing Python requirements helper from apis/xxxxx
Removing Python requirements helper from apis/yyyyy
Injecting required Python packages to package

Environment information

Running "serverless" from node_modules
Framework Core: 3.22.0 (local) 3.29.0 (global)
Plugin: 6.2.2
SDK: 4.3.2

I had a similar issue recently after updating bcrypt, and found that adding the following to pythonRequirements helped (not that I really know what's happening here, but plenty of posts elsewhere pointed to this):

    pipCmdExtraArgs:
      - "--platform manylinux2014_x86_64"
      - "--only-binary=:all:"
commented

It works like a charm, thanks!

I had a similar problem, but I solved it by using a venv (not dockerizing it) with this syntax:
For ARM:

pipCmdExtraArgs: ['--platform', 'manylinux2014_aarch64', '--only-binary=:all:']

For x86_64:

pipCmdExtraArgs: ['--platform', 'manylinux2014_x86_64', '--only-binary=:all:']

It never worked with:

    pipCmdExtraArgs:
      - "--platform manylinux2014_x86_64"
      - "--only-binary=:all:"

My particular use case was that I wanted to use the AWS Lambda Python 3.11 runtime, but a transitive dependency, pendulum, did not have native builds past Python 3.9. I ended up just dockerizing pip, which allowed any native dependencies to be built as needed:

  pythonRequirements:
    fileName: requirements/aws.txt
    dockerizePip: true
    useStaticCache: false
    useDownloadCache: false

I'm not exactly sure why the cache needed to be disabled, but without it the requirements weren't getting updated as I expected. I need to look into that more.
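As a side note, the plugin also accepts a dockerImage option, so the build container can be pinned to match the target runtime; a sketch (the image tag here is an assumption, check the plugin README for your version's default):

    pythonRequirements:
      fileName: requirements/aws.txt
      dockerizePip: true
      # hypothetical pin matching the python3.11 target runtime
      dockerImage: public.ecr.aws/sam/build-python3.11:latest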

@rsyring The issue with the cache is that if you build, e.g., without dockerization and end up with a broken build for a given arch, the result is still cached and reused in the next packaging attempt.
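When a broken build has been cached like that, clearing the plugin's caches before repackaging forces a fresh install; assuming a plugin version that exposes the cleanCache command:

    # wipe the static and download caches, then repackage
    sls requirements cleanCache
    sls deploy --stage dev --region ap-east-1 --verbose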