noqqe / terraform-aws-functionbeat

A Terraform module for Elastic Functionbeat to ship Cloudwatch logs

README

Terraform wrapper module to ship Cloudwatch logs to Kibana via Functionbeat. See the official docs.
The official Functionbeat distribution is based on Cloudformation and ships with its own deployment CLI, so it does not integrate with a Terraform code base out of the box. This module wraps the base function to package the Functionbeat Lambda and deploys it via Terraform.

Requirements

Since this module executes a script, ensure the following software is available on your machine:

  • jq
  • curl
  • tar
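A quick way to verify the prerequisites before running terraform is a small shell check (a sketch; adapt the tool list to your setup):

```shell
# Check that the host has the tools this module's packaging script needs.
for tool in jq curl tar; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool" >&2
  fi
done
```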

Running under Alpine

ℹ️ The Functionbeat installer is not compatible with Alpine out of the box, because glibc is missing. To be able to use this module on Alpine, e.g. in a CI pipeline, you need to provide the missing dependency: install libc6-compat using apk add --no-cache libc6-compat.
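A minimal Alpine-based CI image could then look like this (base image tag is only an example; libc6-compat is the required piece, the other packages cover the requirements above):

```dockerfile
# Sketch of an Alpine-based CI image for this module.
FROM alpine:3.19
# libc6-compat provides the glibc shim the Functionbeat installer needs;
# jq, curl and tar are required by the module's packaging script.
RUN apk add --no-cache libc6-compat jq curl tar
```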

Simple example

For a detailed example using the Elasticsearch output, please refer to this blog post. Note that output to Logstash is also possible, but this example uses Elasticsearch.

resource "aws_security_group" "functionbeat_securitygroup" {
  name   = "Functionbeat"
  vpc_id = data.aws_vpc.vpc.id

  egress {
    from_port   = 443
    protocol    = "tcp"
    to_port     = 443
    description = "HTTPS"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

module "functionbeat" {
  source = "git::ssh://git@github.com/PacoVK/functionbeat.git"

  application_name     = "crazy-test-application"
  functionbeat_version = "7.17.1"
  lambda_config = {
    name = "my-kibana-exporter"

    vpc_config = {
      vpc_id             = data.aws_vpc.vpc.id
      subnet_ids         = data.aws_subnets.private.ids
      security_group_ids = [aws_security_group.functionbeat_securitygroup.id]
    }

    output_elasticsearch = {
      hosts : ["https://your-endpoint:443"]
      protocol : "https"
      username : "elastic"
      password : "mysupersecret"
    }
  }
}

Advanced example

Head over to example/elasticsearch.tf for a more advanced example.

Usage

| Parameter | Required | Description |
|---|---|---|
| application_name | X | Name of the application to ship the logs from |
| functionbeat_version | X | Functionbeat version to download and deploy |
| lambda_config | X | Functionbeat and Lambda config (see below) |
| tags | - | Tags to add to all created AWS resources |
| lambda_reserved_concurrent_execution | - | Reserved concurrency (default: 5) |
| lambda_memory_size | - | Memory size (default: 128 MB) |
| lambda_timeout | - | Timeout (default: 3 s) |
| lambda_description | - | Description added to the Lambda (default: "Lambda function to ship cloudwatch logs to Kibana") |
| fb_log_level | - | Functionbeat log level, set as an environment variable on the Lambda for easy adjustment (default: info) |
| lambda_write_arn_to_ssm | - | Whether the Lambda ARN should be written to SSM (default: true) |
| fb_extra_configuration | - | HCL map with additional Functionbeat config (default: {}) |
| fb_extra_tags | - | Tags of the shipper, included in their own field with each published transaction (default: []) |
| loggroup_name | - | Name of the Cloudwatch log group to add as trigger for the function (default: null) |
| loggroup_filter_pattern | - | Filter pattern to select the logs which trigger the Lambda (default: "") |
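For illustration, a module call combining several of the optional parameters might look like this (all values are placeholders):

```hcl
module "functionbeat" {
  source = "git::ssh://git@github.com/PacoVK/functionbeat.git"

  application_name     = "crazy-test-application"
  functionbeat_version = "7.17.1"

  # Optional tuning (defaults listed above)
  fb_log_level            = "debug"
  lambda_memory_size      = 256
  lambda_timeout          = 30
  lambda_write_arn_to_ssm = true

  # Trigger the function from an existing log group
  loggroup_name           = "/aws/lambda/my-app"
  loggroup_filter_pattern = ""

  lambda_config = {
    name = "my-kibana-exporter"
    ...
  }
}
```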

lambda_config (required)

You configure your lambda here.

  lambda_config = {
    name = "<NAME-OF-YOUR-LAMBDA>"
    vpc_config = {
      vpc_id = <TARGET-VPC>
      subnet_ids = <TARGET-SUBNET-IDS>
      security_group_ids = [<A-SECURITYGROUP-ID>]
    }
    # You can put any HCL-Map with valid Functionbeat config for Elasticsearch Output 
    output_elasticsearch = {
      hosts = ["https://your-endpoint:443"]
      protocol = "https"
      username = "elastic"
      password = "mysupersecret"
    }
  }

Converting YAML into HCL

You can easily extend the Functionbeat configuration by setting fb_extra_configuration; head over to the official documentation for the available options. To ease your life, make use of an online YAML to HCL converter to translate the YAML reference into valid HCL.

Example:

processors:
    - add_fields:
        target: project
        fields:
          name: myproject
          id: '574734885120952459'

becomes

processors = [
  {
    add_fields = {
      fields = {
        id = "574734885120952459"
        name = "myproject"
      }
      target = "project"
    }
  }
]

which results in the following module configuration

fb_extra_configuration = {
  processors = [
    {
      add_fields = {
        fields = {
          id = "574734885120952459"
          name = "myproject"
        }
        target = "project"
      }
    }
  ]
}
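The mapping is mechanical: YAML mappings become HCL maps, sequences become lists, and scalars are quoted. A rough stdlib-only Python sketch of that translation (for illustration only, not a full HCL serializer):

```python
def to_hcl(value, indent=0):
    """Render a nested Python structure as HCL-like text."""
    pad = "  " * indent
    if isinstance(value, dict):
        inner = "\n".join(
            f"{pad}  {k} = {to_hcl(v, indent + 1).lstrip()}"
            for k, v in value.items()
        )
        return f"{pad}{{\n{inner}\n{pad}}}"
    if isinstance(value, list):
        inner = ",\n".join(to_hcl(v, indent + 1) for v in value)
        return f"{pad}[\n{inner}\n{pad}]"
    if isinstance(value, str):
        return f'{pad}"{value}"'
    return f"{pad}{value}"

# The processors example from above, as parsed from the YAML
processors = [
    {"add_fields": {"target": "project",
                    "fields": {"name": "myproject",
                               "id": "574734885120952459"}}}
]
print("processors = " + to_hcl(processors).lstrip())
```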

Outputs

This module exposes:

  • the functionbeat lambda ARN
  • if lambda_write_arn_to_ssm is set to true, the name of the created SSM parameter
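With the SSM parameter in place, another Terraform stack can look the ARN up and wire its own log groups to the shipper. A sketch (resource names are hypothetical; the `<lambda-name>_arn` parameter naming follows the serverless framework example in this README, so verify it against your module version):

```hcl
# Read the shipper's ARN from the SSM parameter the module wrote.
data "aws_ssm_parameter" "functionbeat_arn" {
  name = "my-kibana-exporter_arn"
}

# Subscribe an additional log group to the Functionbeat Lambda.
resource "aws_cloudwatch_log_subscription_filter" "app_logs" {
  name            = "ship-to-kibana"
  log_group_name  = "/aws/lambda/my-app"
  filter_pattern  = ""
  destination_arn = data.aws_ssm_parameter.functionbeat_arn.value
}
```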

Quick test

Requirements

  • Setup AWS config locally
  • Setup Terraform cli

In examples/ there is an advanced example. Simply check out the module source and run:

cd examples/
terraform init
terraform apply -auto-approve

Clean up after you're done

terraform destroy -auto-approve

Integrate with serverless framework

You can easily attach the Cloudwatch log groups of your Serverless application by using the serverless-plugin-log-subscription.

  1. Use this module to install the Lambda; ensure lambda_write_arn_to_ssm is set to true (the default).
module "functionbeat" {
  lambda_config = {
    name = "my-kibana-log-shipper"
    ...
  }
}
  2. To attach the logs of all your Serverless application's Lambdas, add the following plugin config to your serverless.yml:
custom:
  logSubscription:
    enabled: true
    destinationArn: '${ssm:my-kibana-log-shipper_arn}'
