BBR PCF Pipeline Tasks

This is a collection of Concourse tasks for backing up a Pivotal Cloud Foundry installation using bbr (https://github.com/cloudfoundry-incubator/bosh-backup-and-restore).

The tasks are grouped into the following categories:

  • All Foundations
  • PAS
  • PKS
  • Helper


Requirements

GitHub Account

For Concourse to pull the tasks it needs to reach out to GitHub. The example pipelines use the SSH method to download the tasks from GitHub, and we strongly recommend against using the HTTPS method. Concourse typically polls GitHub for changes to the target Git repo, and the HTTPS method is subject to rate limits. The SSH method is not subject to the same rate limits because it authenticates the client against a GitHub user, which has much higher limits.

Please create an SSH key and add it to your GitHub account, as it will need to be included in the pipeline secrets.
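
As a rough sketch, the git resource for this repository configured to use SSH might look like the following; the ((git-private-key)) variable name is an assumption and should match whatever name you use in your pipeline secrets:

resources:
- name: bbr-pipeline-tasks-repo
  type: git
  source:
    uri: git@github.com:pivotal-cf/bbr-pcf-pipeline-tasks.git
    branch: master
    private_key: ((git-private-key))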

Networking

To use any of these tasks, apart from export-om-installation, you will need either:

  • a Concourse worker with access to your Ops Manager private networks. You can find an example template for deploying an external worker in a different network from your Concourse deployment here
  • or, provide the OPSMAN_PRIVATE_KEY to use an SSH tunnel via the Ops Manager VM. This key is not required if your Concourse worker has access to the Ops Manager private networks. Note that backup artifacts can be very large, so draining them through an SSH tunnel adds significant network overhead and may increase backup time. A sketch of the corresponding secrets entries follows this list.
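
As an illustration, the matching entries in a pipeline secrets (vars) file might look like the following; all values shown are placeholders, and the variable names simply mirror the ((...)) parameters used in the task examples below:

skip-ssl-validation: false
opsman-url: https://opsman.example.com
opsman-username: admin
opsman-password: example-password
opsman-private-key: |
  -----BEGIN RSA PRIVATE KEY-----
  ...
  -----END RSA PRIVATE KEY-----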

Disk space

The backup tasks will run bbr commands on your Concourse worker. Ensure that your Concourse workers have enough disk space to accommodate your backup artifacts.


Example pipelines

Example pipelines and secrets are provided to show how to use these tasks to back up PAS or PKS.
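
As a rough sketch (not one of the shipped example pipelines), a job that fetches this repository and runs the PAS backup task could be wired up as follows; the job name is a placeholder and the task params are omitted here (see the full parameter list in the HTTP Proxies example below):

jobs:
- name: backup-pas
  plan:
  - get: bbr-pipeline-tasks-repo
  - task: bbr-backup-pas
    file: bbr-pipeline-tasks-repo/tasks/bbr-backup-pas/task.yml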

Triggers

Running regular backups (at least every 24 hours) and storing multiple copies of backup artifacts in different datacenters is highly recommended. The Concourse time resource can be added to the pipeline to trigger backups on a regular schedule.
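
For example, a minimal sketch of a time resource triggering a backup job; the resource name and interval are illustrative only:

resources:
- name: daily-trigger
  type: time
  source:
    interval: 24h

jobs:
- name: backup-pas
  plan:
  - get: daily-trigger
    trigger: true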

Backup artifact storage

There are a variety of Concourse resources, such as the s3 resource, that can be used to move backup artifacts to external storage. A list of Concourse resources can be found here.
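
As an illustration, an s3 resource could be used to upload the artifact produced by a backup task; the bucket, regexp, credential names, and the output directory name are assumptions, so check the task files for the exact output names:

resources:
- name: pas-backup-bucket
  type: s3
  source:
    bucket: my-backup-bucket
    region_name: us-east-1
    regexp: pas-backup/(.*).tar
    access_key_id: ((aws-access-key-id))
    secret_access_key: ((aws-secret-access-key))

# added to the job plan after the backup task
  - put: pas-backup-bucket
    params:
      file: pas-backup-artifact/*.tar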

HTTP Proxies

BBR tasks for backing up deployments use the BOSH API and will result in HTTP requests to the director.

Setting the SET_NO_PROXY parameter on the tasks will result in a NO_PROXY environment variable being exported that contains the BOSH Director IP.

- task: bbr-backup-pas
  file: bbr-pipeline-tasks-repo/tasks/bbr-backup-pas/task.yml
  params:
    SKIP_SSL_VALIDATION: ((skip-ssl-validation))
    OPSMAN_URL: ((opsman-url))
    OPSMAN_USERNAME: ((opsman-username))
    OPSMAN_PASSWORD: ((opsman-password))
    OPSMAN_PRIVATE_KEY: ((opsman-private-key))    
    SET_NO_PROXY: true

Semantic Versioning

The inputs, outputs, params, filename, and filepath of all task files in this repo are part of a semantically versioned API. See our documentation for a detailed discussion of our semver API. See www.semver.org for an explanation of semantic versioning.

Pinning to a version

This repository has git tags that can be used to pin to a specific version. For example, here is how to pin to v1.0.0 using tag_filter:

resources:
- name: bbr-pipeline-tasks-repo
  type: git
  source:
    uri: https://github.com/pivotal-cf/bbr-pcf-pipeline-tasks.git
    branch: master
    tag_filter: v1.0.0

License

Apache License 2.0

