{"id":1098,"hash":"6a2598acd3f7b139a1b60917fe64fe86c78a3f8505b0da5329526e8cd1b0b421","pattern":"\"cannot import name 'DEFAULT_CIPHERS' from 'urllib3.util.ssl_'\" on AWS Lambda using a layer","full_message":"What I want to achieve\nTo scrape a website using AWS Lambda and save the data to S3.\n\nThe issue I'm having\nWhen I execute the Lambda, the following error message appears.\n\n{   \"errorMessage\": \"Unable to import module 'lambda_function': cannot\nimport name 'DEFAULT_CIPHERS' from 'urllib3.util.ssl_'\n(/opt/python/urllib3/util/ssl_.py)\",   \"errorType\":\n\"Runtime.ImportModuleError\",   \"requestId\":\n\"fb66bea9-cbad-4bd3-bd4d-6125454e21be\",   \"stackTrace\": [] }\n\nCode\nThe minimal Lambda code is as follows.\n\nimport requests\nimport boto3\n\ndef lambda_handler(event, context):\n    s3 = boto3.client('s3')\n    # Note: no leading slash in the key, or the object lands under a key\n    # beginning with '/' instead of in the raw folder\n    upload_res = s3.put_object(Bucket='horserace-dx', Key='raw/a.html', Body='testtext')\n    \n    return event\n\nA layer was added to the Lambda. Files were saved into a python folder using the commands below, compressed into a zip file, then uploaded to AWS Lambda as a layer.\n\n!pip install requests -t ./python --no-user\n!pip install pandas -t ./python --no-user\n!pip install beautifulsoup4 -t ./python --no-user\n\nThe bucket horserace-dx exists.\nThe folder raw exists.\nThe role of the Lambda is properly set; it can read from and write to S3.\nThe runtime of the Lambda is Python 3.9. The Python version on the local computer is 3.9.13.\n\nWhat I did so far\nI googled \"cannot import name 'DEFAULT_CIPHERS' from 'urllib3.util.ssl_'\" and found some suggestions. I rebuilt the layer with the following commands and tried again, in vain.\n\n!pip install requests -t ./python --no-user\n!pip install pandas -t ./python --no-user\n!pip install beautifulsoup4 -t ./python --no-user\n!pip install urllib3==1.26.15 -t ./python --no-user\n\nSo what should I do to achieve this? 
Any suggestions would be greatly appreciated.","ecosystem":"pypi","package_name":"amazon-web-services","package_version":null,"solution":"Execute the following commands.\n\npip install requests==2.25.0 -t ./python --no-user\npip install beautifulsoup4 -t ./python --no-user\npip install pytz -t ./python --no-user\n\nFrom PyPI, download the following whl files from the numpy and pandas pages:\n\nnumpy-1.24.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl\npandas-2.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl\n\nUnzip the wheel files and move their contents into the python folder.\n\nZip the python folder and upload it as an AWS Lambda layer.\n\nAttach the layer to the Lambda function.\n\nThen the code runs without errors.","confidence":0.75,"source":"stackoverflow","source_url":"https://stackoverflow.com/questions/76414514/cannot-import-name-default-ciphers-from-urllib3-util-ssl-on-aws-lambda-us","votes":71,"created_at":"2026-04-19T04:52:20.953424+00:00","updated_at":"2026-04-19T04:52:20.953424+00:00"}