A simple demo project that uploads Python and CSV files to an S3 bucket; the upload then triggers a Lambda function which reads the contents of each CSV file and uploads them to a DynamoDB table.
Run PowerShell as administrator to install the following software:
- Python
- PyCharm Professional
- AWS CLI: `pip install awscli`
- Pipenv: `pip install pipenv`
- Boto3: `pip install boto3`
- Navigate to the Users page.
- Create the AWS IAM user whose credentials you will be using.
- Under the 'Permissions' section, attach the policy called 'AdministratorAccess'.
- Navigate to the Roles page.
- Create a role.
- Under the 'Permissions' section, attach the policies 'AmazonS3FullAccess', 'AmazonDynamoDBFullAccess', and 'AWSOpsWorksCloudWatchLogs'.
- Create a bucket with a unique bucket name.
- Select the region `ap-south-1`. [S3 bucket names are globally unique.]
- Select the region `ap-south-1`. [DynamoDB tables are region-specific.]
- Create a table named `movies_characters` with primary key `actor_id` as a Number.
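If you prefer to create the table from code rather than the console, the same table can be sketched with boto3. This is only an illustration: the function name is hypothetical, and on-demand billing is an assumption (the console flow above uses whatever defaults you pick).

```python
def create_movies_table(client, table_name="movies_characters"):
    """Create the demo table: partition key actor_id, stored as a
    Number ("N") -- matching the console setup described above."""
    return client.create_table(
        TableName=table_name,
        KeySchema=[{"AttributeName": "actor_id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "actor_id", "AttributeType": "N"}],
        BillingMode="PAY_PER_REQUEST",  # assumption: on-demand billing
    )

# Usage (requires configured AWS credentials):
# import boto3
# create_movies_table(boto3.client("dynamodb", region_name="ap-south-1"))
```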
- Select the region `ap-south-1`. [Lambda functions are region-specific.]
- Create a new Lambda function with 'Author from scratch', named `csv-to-dynamodb`; select the runtime `Python 3.8`, and for the execution role use the existing role created back in IAM Roles.
- Add a trigger for our S3 bucket, with event type `PUT` and suffix `.csv`.
- Upload the `lambda_function.py` file to the `csv-to-dynamodb` Lambda.
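The repository's `lambda_function.py` is the source of truth for what the function actually does. As a rough sketch of what such a handler typically looks like (assumptions: the CSV has an `actor_id` column plus string columns, and the helper name `rows_to_items` is made up for illustration):

```python
import csv
import io
import urllib.parse

def rows_to_items(csv_text):
    """Turn CSV text into DynamoDB-ready items: actor_id becomes a
    number, every other column is kept as a string (an assumption
    about the CSV layout -- see the real lambda_function.py)."""
    items = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        item = {k: v for k, v in row.items() if k != "actor_id"}
        item["actor_id"] = int(row["actor_id"])
        items.append(item)
    return items

def lambda_handler(event, context):
    # Triggered by the S3 PUT event configured above.
    import boto3
    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("movies_characters")
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        # batch_writer buffers put_item calls into efficient batch writes
        with table.batch_writer() as batch:
            for item in rows_to_items(body):
                batch.put_item(Item=item)
    return {"statusCode": 200}
```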
You need to set up your AWS security credentials before the code can connect to AWS. You can obtain the keys from the created IAM user's Security Credentials tab. Open a command prompt, run `aws configure`, and enter:

```
AWS Access Key ID: YOUR_KEY
AWS Secret Access Key: YOUR_SECRET
Default region name: ap-south-1
Default output format:
```
You can also set the credentials up manually by creating the files in the `.aws` directory (`C:\Users\USER_NAME\.aws\` for Windows users). First, set up credentials (in e.g. `~\.aws\credentials`):
```
[default]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET
```
Then, set up a default region (in e.g. `~\.aws\config`):
```
[default]
region=ap-south-1
```
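Both files are plain INI files, which boto3 and the AWS CLI read automatically. As a quick sanity check of the layout, you can parse them with Python's `configparser` (a sketch; the function name is made up, and it only looks at the `[default]` profile):

```python
import configparser
from pathlib import Path

def read_default_profile(aws_dir):
    """Read the access key and region from the default profile in an
    .aws-style directory containing 'credentials' and 'config' files."""
    aws_dir = Path(aws_dir)
    creds = configparser.ConfigParser()
    creds.read(aws_dir / "credentials")
    conf = configparser.ConfigParser()
    conf.read(aws_dir / "config")
    return {
        "aws_access_key_id": creds["default"]["aws_access_key_id"],
        "region": conf["default"]["region"],
    }
```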
This sample application connects to Amazon S3 and uploads the Python and CSV files to the created bucket. The `csv-to-dynamodb` Lambda is then invoked, and the contents of the CSV files are uploaded to the `movies_characters` DynamoDB table. Change the `upload_file_bucket` variable in the `dejavu.py` file to your S3 bucket name, then run:

```
python dejavu.py
```
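A rough sketch of what that upload step can look like with boto3 (an illustration, not the repository's actual `dejavu.py`; only the `upload_file_bucket` name comes from the project, and the helper names are made up):

```python
from pathlib import Path

upload_file_bucket = "YOUR_BUCKET_NAME"  # change to your S3 bucket name

def files_to_upload(directory, extensions=(".py", ".csv")):
    """Collect the Python and CSV files the demo uploads."""
    return sorted(p for p in Path(directory).iterdir() if p.suffix in extensions)

def upload_all(directory, bucket=upload_file_bucket):
    """Upload each file to S3 under its own filename. Uploading a
    .csv fires the S3 PUT trigger, invoking the csv-to-dynamodb Lambda."""
    import boto3
    s3 = boto3.client("s3")
    for path in files_to_upload(directory):
        s3.upload_file(str(path), bucket, path.name)
```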