2s3

Stream / Pipe files to Amazon S3

Performs a multipart upload to Amazon S3 of data read from stdin.

Example: tar -C / -cpjO /home | 2s3 -k aws.key -b com-example-backup -o home.tar.bz2

Usage: 2s3 -k AWS_KEY_FILE -b BUCKET_NAME -o OBJECT_NAME [-s CHUNK_SIZE] [-t SECONDS] [-d]

Options:
  -h, --help            show this help message and exit
  -k AWS_KEY_FILE, --keyfile=AWS_KEY_FILE
                        File containing <AWS_KEY>:<AWS_SECRET> for S3
                        access.
  -b BUCKET_NAME, --bucket=BUCKET_NAME
                        Name of the target bucket.
  -o OBJECT_NAME, --object=OBJECT_NAME
                        Name of the target object.
  -s CHUNK_SIZE, --size=CHUNK_SIZE
                        Split the upload into parts of CHUNK_SIZE bytes.
                        Default is 5 MiB, the minimum part size S3 allows.
  -t SECONDS, --time=SECONDS
                        Time in seconds to wait before retrying the upload
                        of a failed part. Default is 5.
  -d, --debug           Print debug information.
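
For reference, the sketch below illustrates the kind of streaming logic the options above describe: parse the keyfile, read stdin in CHUNK_SIZE pieces, retry a failed part after a fixed wait, then complete (or abort) the multipart upload. It is written against boto3 purely for illustration; 2s3's actual implementation and dependencies may differ, and the `stream_to_s3` helper is a hypothetical name, not part of the tool.

    import sys
    import time

    import boto3  # assumed S3 client library; 2s3 itself may use another

    CHUNK_SIZE = 5 * 1024 * 1024  # -s default: 5 MiB parts
    RETRY_WAIT = 5                # -t default: seconds between retries

    def read_keyfile(path):
        # Keyfile format as documented above: <AWS_KEY>:<AWS_SECRET>
        with open(path) as f:
            key, secret = f.read().strip().split(":", 1)
        return key, secret

    def stream_to_s3(keyfile, bucket, object_name):
        key, secret = read_keyfile(keyfile)
        s3 = boto3.client("s3", aws_access_key_id=key,
                          aws_secret_access_key=secret)
        upload = s3.create_multipart_upload(Bucket=bucket, Key=object_name)
        parts, part_number = [], 1
        try:
            while True:
                chunk = sys.stdin.buffer.read(CHUNK_SIZE)
                if not chunk:  # EOF on the pipe: no more parts to send
                    break
                while True:  # retry a failed part after RETRY_WAIT seconds
                    try:
                        resp = s3.upload_part(
                            Bucket=bucket, Key=object_name,
                            UploadId=upload["UploadId"],
                            PartNumber=part_number, Body=chunk)
                        break
                    except Exception:
                        time.sleep(RETRY_WAIT)
                parts.append({"PartNumber": part_number,
                              "ETag": resp["ETag"]})
                part_number += 1
            s3.complete_multipart_upload(
                Bucket=bucket, Key=object_name,
                UploadId=upload["UploadId"],
                MultipartUpload={"Parts": parts})
        except Exception:
            # Abort so S3 does not keep storing (and billing) orphaned parts
            s3.abort_multipart_upload(Bucket=bucket, Key=object_name,
                                      UploadId=upload["UploadId"])
            raise

    if __name__ == "__main__":
        stream_to_s3("aws.key", "com-example-backup", "home.tar.bz2")

Reading fixed-size chunks from stdin is what lets the upload begin before the pipe has finished producing data, so no temporary file the size of the archive is ever needed.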

About

License: MIT


Languages

Language: Python 100.0%