ThreeCopies.com is a hosted service that regularly archives your server-side resources. We create three copies: hourly, daily and weekly.
What's interesting is that the entire product will be written in EO,
a truly object-oriented programming language.
The logo is made by Freepik from flaticon.com, licensed by CC 3.0 BY.
Each script is a Bash script, which you design yourself. ThreeCopies just starts it regularly and records its output. Here are some recommendations on how to design the script. There are three parts: input, package, and output. First, you collect some data from your data sources (input). Then, you compress and encrypt the data (package). Finally, you store the package somewhere (output).
We start your script inside the yegor256/threecopies Docker container; here is the Dockerfile.
If you don't want your script to be executed too frequently, you may put this code in front of it (to skip hourly executions, for example):
if [ "${period}" == "hour" ]; then exit 0; fi
To retrieve data from a MySQL database, use mysqldump:
mysqldump --lock-tables=false --host=www.example.com \
--user=username --password=password \
--databases dbname > mysql.sql
To download an entire FTP directory, use wget:
wget --mirror --tries=5 --quiet --output-file=/dev/null \
--ftp-user=username --ftp-password=password \
ftp://ftp.example.com/some-directory
To package a directory, use tar:
tgz="${period}-$(date "+%Y-%m-%d-%H-%M").tgz"
tar czf "${tgz}" some-directory
We recommend using exactly that naming for your .tgz
archives. The ${period} environment variable is provided by our server to your
Docker container; it will be set to either hour, day, or week.
To upload a file to Amazon S3, use s3cmd:
echo "[default]" > ~/.s3cfg
echo "access_key=AKIAICJKH*****CVLAFA" >> ~/.s3cfg
echo "secret_key=yQv3g3ao654Ns**********H1xQSfZlTkseA0haG" >> ~/.s3cfg
s3cmd --no-progress put "${tgz}" "s3://backup.example.com/${tgz}"
The tc-scripts table contains all registered scripts:
fields:
login/H: GitHub login of the owner
name/R: Unique name of the script
bash: Bash script
hour: Epoch-sec when its most recent hourly log was scheduled
day: Epoch-sec when its most recent daily log was scheduled
week: Epoch-sec when its most recent weekly log was scheduled
The tc-logs table contains all recent logs:
fields:
group/H: Concatenated GitHub login and script name, e.g. "yegor256/test"
finish/R: Epoch-msec of the script finish (or MAX_LONG if still running)
login: GitHub login of the owner
period: Either "hour", "day", or "week"
ocket: S3 object name for the log
ttl: Epoch-sec when the record has to be deleted (by DynamoDB)
start: Epoch-msec time of the start
container: Docker container name
exit: Bash exit code (error if not zero)
mine (index):
login/H
finish/R
Just submit a pull request. Make sure mvn passes.
(The MIT License)
Copyright (c) 2017 Yegor Bugayenko
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: the above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. The software is provided "as is", without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and non-infringement. In no event shall the authors or copyright holders be liable for any claim, damages or other liability, whether in an action of contract, tort or otherwise, arising from, out of or in connection with the software or the use or other dealings in the software.