Python Question

AWS - OS Error permission denied Lambda Script

I'm trying to run a Lambda function in Python that invokes a bundled MediaInfo executable via subprocess, but I'm getting a permission error.
I'm also getting an alert about the database, but the database queries are called after the subprocess, so I don't think the two are related. Could someone explain why I get this error?

Alert information

State changed to INSUFFICIENT_DATA at 2016/08/16. Reason: Unchecked: Initial alarm creation

Lambda Error

OSError: [Errno 13] Permission denied

Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 36, in lambda_handler
    xml_output = subprocess.check_output(["./mediainfo", "--full", "--output=XML", signed_url])
  File "/usr/lib64/python2.7/subprocess.py", line 566, in check_output
    process = Popen(stdout=PIPE, *popenargs, **kwargs)
  File "/usr/lib64/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1335, in _execute_child
    raise child_exception
OSError: [Errno 13] Permission denied

Lambda code

import logging
import subprocess

import boto3

SIGNED_URL_EXPIRATION = 300  # The number of seconds that the Signed URL is valid
DYNAMODB_TABLE_NAME = "TechnicalMetadata"
DYNAMO = boto3.resource("dynamodb")

logger = logging.getLogger('boto3')


def lambda_handler(event, context):
    """
    :param event: S3 event that triggered the function
    :param context: Lambda context object
    """
    # Loop through records provided by S3 Event trigger
    for s3_record in event['Records']:
        logger.info("Working on new s3_record...")
        # Extract the Key and Bucket names for the asset uploaded to S3
        key = s3_record['s3']['object']['key']
        bucket = s3_record['s3']['bucket']['name']
        logger.info("Bucket: {} \t Key: {}".format(bucket, key))
        # Generate a signed URL for the uploaded asset
        signed_url = get_signed_url(SIGNED_URL_EXPIRATION, bucket, key)
        logger.info("Signed URL: {}".format(signed_url))
        # Launch MediaInfo
        # Pass the signed URL of the uploaded asset to MediaInfo as an input
        # MediaInfo will extract the technical metadata from the asset
        # The extracted metadata will be output in XML format and
        # stored in the variable xml_output
        xml_output = subprocess.check_output(["./mediainfo", "--full", "--output=XML", signed_url])
        logger.info("Output: {}".format(xml_output))
        save_record(key, xml_output)


def save_record(key, xml_output):
    """
    Save record to DynamoDB

    :param key: S3 Key Name
    :param xml_output: Technical Metadata in XML Format
    """
    logger.info("Saving record to DynamoDB...")
    table = DYNAMO.Table(DYNAMODB_TABLE_NAME)
    table.put_item(
        Item={
            'keyName': key,
            'technicalMetadata': xml_output
        }
    )
    logger.info("Saved record to DynamoDB")


def get_signed_url(expires_in, bucket, obj):
    """
    Generate a signed URL

    :param expires_in: URL Expiration time in seconds
    :param bucket: S3 Bucket name
    :param obj: S3 Key name
    :return: Signed URL
    """
    s3_cli = boto3.client("s3")
    presigned_url = s3_cli.generate_presigned_url('get_object',
                                                  Params={'Bucket': bucket, 'Key': obj},
                                                  ExpiresIn=expires_in)
    return presigned_url
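
For local debugging, a minimal driver along these lines can be used to call lambda_handler with a trimmed-down S3 event; the bucket and key values below are placeholders rather than anything from the original post, and the record contains only the fields the handler actually reads:

if __name__ == "__main__":
    # Hypothetical test event; replace bucket/key with real values before running
    test_event = {
        'Records': [
            {
                's3': {
                    'bucket': {'name': 'my-upload-bucket'},    # assumed bucket name
                    'object': {'key': 'videos/sample.mp4'}     # assumed object key
                }
            }
        ]
    }
    lambda_handler(test_event, None)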


I'm fairly certain this is a restriction imposed by the Lambda execution environment, but it can be worked around by executing the binary through the shell.
Try passing shell=True to your subprocess call. Note that with shell=True the command should be a single string rather than a list:

xml_output = subprocess.check_output("./mediainfo --full --output=XML '{}'".format(signed_url), shell=True)
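
Because the presigned URL contains characters such as ? and & that the shell would otherwise interpret, it should be quoted before being interpolated into the command string. A minimal sketch using the standard library on Python 2.7 (the version shown in the traceback) might look like this:

import pipes

# Quote the URL so shell metacharacters in the query string are passed through literally
command = "./mediainfo --full --output=XML {}".format(pipes.quote(signed_url))
xml_output = subprocess.check_output(command, shell=True)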