I read these Stack Overflow questions (1, 2) that cover the same issue, as well as this GitHub thread. I still feel like I am missing something.
My Issue
I have a function that is deployed to AWS Lambda/API Gateway using the Serverless Framework (python 3.10 runtime). I am also using the serverless-python-requirements plugin. My function uses the pydantic module. I have the following in my requirements.txt (excerpt):
pydantic==2.4.2
pydantic_core==2.10.1
I am not using Flask or FastAPI. My function works just fine when invoked locally (serverless invoke local -f my_function).
After deploying and invoking the deployed function with the same command (minus local), I get this error:
Running "serverless" from node_modules
{
"errorMessage": "No module named 'pydantic_core._pydantic_core'",
"errorType": "ModuleNotFoundError",
"requestId": "fd4eb321-5f81-42a2-9880-ea6a76a626d5",
"stackTrace": [
" File \"/opt/python/serverless_aws_lambda_sdk/instrument/__init__.py\", line 598, in stub\n return self._handler(user_handler, event, context)\n",
" File \"/opt/python/serverless_aws_lambda_sdk/instrument/__init__.py\", line 580, in _handler\n result = user_handler(event, context)\n",
" File \"/var/task/serverless_sdk/__init__.py\", line 144, in wrapped_handler\n return user_handler(event, context)\n",
" File \"/var/task/s_post_rental_app_from_airtable.py\", line 25, in error_handler\n raise e\n",
" File \"/var/task/s_post_rental_app_from_airtable.py\", line 20, in <module>\n user_handler = serverless_sdk.get_user_handler('functions.post_rental_app_from_airtable.handler.lambda_handler')\n",
" File \"/var/task/serverless_sdk/__init__.py\", line 56, in get_user_handler\n user_module = import_module(user_module_name)\n",
" File \"/var/lang/lib/python3.10/importlib/__init__.py\", line 126, in import_module\n return _bootstrap._gcd_import(name[level:], package, level)\n",
" File \"<frozen importlib._bootstrap>\", line 1050, in _gcd_import\n",
" File \"<frozen importlib._bootstrap>\", line 1027, in _find_and_load\n",
" File \"<frozen importlib._bootstrap>\", line 1006, in _find_and_load_unlocked\n",
" File \"<frozen importlib._bootstrap>\", line 688, in _load_unlocked\n",
" File \"<frozen importlib._bootstrap_external>\", line 883, in exec_module\n",
" File \"<frozen importlib._bootstrap>\", line 241, in _call_with_frames_removed\n",
" File \"/var/task/functions/post_rental_app_from_airtable/handler.py\", line 9, in <module>\n from pydantic import BaseModel, ValidationError\n",
" File \"/var/task/pydantic/__init__.py\", line 3, in <module>\n import pydantic_core\n",
" File \"/var/task/pydantic_core/__init__.py\", line 6, in <module>\n from ._pydantic_core import (\n"
]
}
Environment: darwin, node 20.2.0, framework 3.36.0 (local) 3.35.2v (global), plugin 7.1.0, SDK 4.4.0
Credentials: Local, "default" profile
Docs: docs.serverless.com
Support: forum.serverless.com
Bugs: github.com/serverless/serverless/issues
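The last two frames of the trace show the failure mode: pydantic's __init__ does import pydantic_core, whose own __init__.py does from ._pydantic_core import ..., where _pydantic_core is a compiled (Rust) extension shipped inside the wheel. If the packaged binary doesn't match the Lambda OS/architecture, its platform-tagged filename doesn't match any extension suffix the interpreter accepts, so the import machinery never sees it and raises ModuleNotFoundError rather than a linker error. A minimal sketch reproducing the same failure with a stand-in package (fake_pydantic_core is my own illustrative name):

```python
import sys
import tempfile
from pathlib import Path

# Build a package whose __init__.py imports a compiled submodule that
# isn't there - the same shape as pydantic_core without a usable .so.
pkg_root = Path(tempfile.mkdtemp())
pkg = pkg_root / "fake_pydantic_core"
pkg.mkdir()
(pkg / "__init__.py").write_text(
    "from ._pydantic_core import SchemaValidator\n"
)

sys.path.insert(0, str(pkg_root))
try:
    import fake_pydantic_core  # no _pydantic_core extension present
except ModuleNotFoundError as e:
    print(e)  # No module named 'fake_pydantic_core._pydantic_core'
```

This is why the error only appears after deployment: locally pip installed a wheel matching your Mac, and that binary is invisible to the Linux Lambda interpreter.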
What I Tried
I read in the threads above that this problem can be caused by an instruction set architecture (ISA) mismatch in the compiled pydantic_core extension. So I tried the following:
Specifying the ISA for all functions in serverless.yml (provider -> architecture, trying both 'arm64' and 'x86_64'):
provider:
name: aws
# latest supported python by lambda + serverless as of 2023-10-23
runtime: python3.10
# NOTE: arm64 may offer better price/performance
architecture: 'arm64'
Specifying in serverless.yml that pip should build inside a Docker container, that it should use an arm64 image, and passing the platform to pip via dockerRunCmdExtraArgs:
custom:
pythonRequirements:
# this is necessary to avoid cross-platform build issues
dockerizePip: true
# explicitly pass the arm64 platform to the docker build
dockerImage: public.ecr.aws/sam/build-python3.10:latest-arm64
# explicitly tell pip to fetch the arm64 version of the package
dockerRunCmdExtraArgs: [ '--platform', 'linux/arm64/v8' ]
I'm not sure what else to try.
I believe the issue in your case comes from the static cache used by the serverless-python-requirements plugin. I managed to reproduce this exact issue by first building without dockerizePip: true (with mismatched architectures) and then trying to deploy again with dockerizePip: true plus the proper build image. When running with --verbose, I noticed that the requirements are not rebuilt but are instead injected from a previously cached build.
To solve this, you can either clean the cache manually (running in --verbose mode will show you where it lives on your machine) or set useStaticCache: false, at least temporarily. Once the cached builds are correct, you can rely on the cache again to speed up packaging.
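In serverless.yml, the temporary change is one extra plugin option alongside the settings you already have (a sketch; check the plugin README for your version):

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerImage: public.ecr.aws/sam/build-python3.10:latest-arm64
    # skip the stale static cache so requirements are actually rebuilt
    useStaticCache: false
```

The plugin also exposes a cache-clearing command (sls requirements cleanCache in recent versions, worth verifying against your plugin version), which avoids hunting for the cache directory by hand.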