cloudtrail_efk

Takes CloudTrail logs from S3 and puts them in Elasticsearch.

Setup:

IAM

You need to create a policy and attach it to a role; that gives your Lambda function the permissions it needs to execute. The easiest way to do this is to create a single IAM policy that grants all the permissions you need and attach that one policy to the Lambda role. (Alternatively, you could create separate policies for Elasticsearch, S3, and CloudWatch Logs and attach all three to the one role.)

The IAM policy allows three things: reading your S3 bucket to fetch the CloudTrail log objects, posting records to your Elasticsearch cluster, and writing any errors or logging to CloudWatch Logs. A scripted sketch of this setup follows the list below.

  1. Edit the lambda-iam-policy.json file:
    1. Add the name of your S3 bucket.
    2. Add the domain name you assigned to your Elasticsearch domain.
  2. Create an IAM policy, name it something like cloudtrail_efk, and set its contents to the edited lambda-iam-policy.json file.
  3. Create an AWS IAM role.
  4. Choose Lambda as the service for the role.
  5. Attach the policy you created.
  6. Attach the AWS-managed AWSLambdaVPCAccessExecutionRole policy as well.
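
If you prefer to script this setup, the sketch below shows roughly what it looks like with boto3. The bucket name, region, account ID, domain name, and the cloudtrail_efk policy/role names are placeholders, and the real permissions should come from your edited lambda-iam-policy.json; this only mirrors the three permission groups described above.

# Minimal sketch of the policy/role setup with boto3; names and ARNs below
# are placeholders, and the authoritative policy is lambda-iam-policy.json.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {   # 1. Read CloudTrail objects from the S3 bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::my-cloudtrail-bucket/*",
        },
        {   # 2. Post records to the Elasticsearch domain
            "Effect": "Allow",
            "Action": ["es:ESHttpPost", "es:ESHttpPut"],
            "Resource": "arn:aws:es:us-east-1:123456789012:domain/my-es-domain/*",
        },
        {   # 3. Write errors and logging to CloudWatch Logs
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
            ],
            "Resource": "arn:aws:logs:*:*:*",
        },
    ],
}

policy = iam.create_policy(
    PolicyName="cloudtrail_efk",
    PolicyDocument=json.dumps(policy_document),
)

assume_role_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

role = iam.create_role(
    RoleName="cloudtrail_efk",
    AssumeRolePolicyDocument=json.dumps(assume_role_document),
)

iam.attach_role_policy(
    RoleName="cloudtrail_efk",
    PolicyArn=policy["Policy"]["Arn"],
)
iam.attach_role_policy(
    RoleName="cloudtrail_efk",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole",
)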

Lambda

  1. Pull this repo
  2. pip install requests -t .
  3. Make any changes you need
  4. Tag appropriately (use semver)
  5. zip -r cloudtrail_efk.zip cloudtrail2ES.py *
  6. Create a new Lambda function in the AWS console with a Python 3 runtime
  7. Set the handler to be cloudtrail2ES.lambda_handler
  8. Fill in your environment variables. Example below
  9. Test the lambda function:
  • Edit test-cloudtrail-event.json to have the correct bucket and a real key (filename in S3)
  • Run the test and confirm your data shows up in Elasticsearch (a scripted invoke is sketched after this list)
  10. Publish a Lambda version that matches your Git tag
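
If you would rather run the test from the command line than the console, a minimal invoke looks like the sketch below; the function name cloudtrail_efk is a placeholder for whatever you named yours.

# Invoke the Lambda with the sample S3 event and print the result.
import boto3

client = boto3.client("lambda")

with open("test-cloudtrail-event.json", "rb") as f:
    payload = f.read()

response = client.invoke(
    FunctionName="cloudtrail_efk",      # placeholder name
    InvocationType="RequestResponse",   # wait for the result
    Payload=payload,
)

print(response["StatusCode"])
print(response["Payload"].read().decode())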

Example environment variables:

ES_INDEX: cloudtrail
ES_HOST: foo.example.com:9200
ES_USER: cloudtrail_lambda
ES_PASS: very_good_password
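
For orientation, a handler wired to these variables looks roughly like the sketch below. This is an outline of the flow, not the actual contents of cloudtrail2ES.py: the S3 event supplies the bucket and key, the object is a gzipped CloudTrail file containing a Records array, and each record is posted to the index named by ES_INDEX.

# Rough outline of a CloudTrail-to-Elasticsearch handler; see cloudtrail2ES.py
# in this repo for the real implementation.
import gzip
import json
import os
from urllib.parse import unquote_plus

import boto3
import requests

s3 = boto3.client("s3")


def lambda_handler(event, context):
    # Index documents at https://ES_HOST/ES_INDEX/_doc (Elasticsearch 7.x style).
    es_url = "https://{}/{}/_doc".format(os.environ["ES_HOST"], os.environ["ES_INDEX"])
    auth = (os.environ["ES_USER"], os.environ["ES_PASS"])

    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])

        # CloudTrail delivers gzipped JSON objects with a top-level Records array.
        obj = s3.get_object(Bucket=bucket, Key=key)
        trail = json.loads(gzip.decompress(obj["Body"].read()))

        for entry in trail.get("Records", []):
            resp = requests.post(es_url, auth=auth, json=entry, timeout=30)
            resp.raise_for_status()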

S3

  1. Go to your S3 Bucket in the console.
  2. Click on Properties
  3. Click on Events
  4. Click + Add Notification
  5. Name the event
  6. For Events, select "All object create events"
  7. For Prefix, enter an appropriate prefix (typically AWSLogs/)
  8. For Send to, select the lambda function you created
  9. Click Save. (A scripted version of this setup is sketched below.)
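
The same notification can also be configured from code, which is handy when the bucket is managed outside the console. A sketch with boto3 follows; the bucket name, account ID, and function ARN are placeholders. Note that when you script this you also need an add_permission call so S3 is allowed to invoke the function, a step the console performs for you automatically.

# Wire the S3 bucket to the Lambda from code instead of the console.
import boto3

s3 = boto3.client("s3")
lam = boto3.client("lambda")

bucket = "my-cloudtrail-bucket"   # placeholder
function_arn = "arn:aws:lambda:us-east-1:123456789012:function:cloudtrail_efk"  # placeholder

# Allow S3 to invoke the function (the console adds this permission automatically).
lam.add_permission(
    FunctionName=function_arn,
    StatementId="s3-cloudtrail-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::{}".format(bucket),
)

# Fire the Lambda for every object created under the AWSLogs/ prefix.
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "Id": "cloudtrail-to-es",
                "LambdaFunctionArn": function_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {
                        "FilterRules": [{"Name": "prefix", "Value": "AWSLogs/"}]
                    }
                },
            }
        ]
    },
)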
