Minimal bootstrapped version.
beporter opened this issue
Write a shell script that:
- Fetches the latest default ZIP file from puphpet.com.
- Unpacks the zip into the local folder. (A subfolder maybe? We could keep the tooling in the root then.)
- Commits all changes unceremoniously.
- Adds a semver tag for the new commit that is one point release (`1.0.x`) greater than the largest in the repo.
- Pushes the commit and the tag to the remote.
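As a first pass, the script could look something like the sketch below. The download URL, the `release/` subfolder, and the `1.0.x` tag prefix are assumptions to be confirmed, not decisions:

```sh
#!/usr/bin/env bash
set -euo pipefail

ZIP_URL="https://puphpet.com/dl/puphpet.zip"   # assumed download endpoint
RELEASE_DIR="release"                          # unpack into a subfolder, per the note above
ZIP_FILE="$(mktemp)"

# 1. Fetch the latest default ZIP from puphpet.com.
curl -fsSL -o "$ZIP_FILE" "$ZIP_URL"

# 2. Unpack it into the local (sub)folder, replacing whatever was there.
rm -rf "$RELEASE_DIR"
mkdir -p "$RELEASE_DIR"
unzip -q "$ZIP_FILE" -d "$RELEASE_DIR"
rm -f "$ZIP_FILE"

# 3. Commit all changes unceremoniously (bail out quietly if nothing changed).
git add -A "$RELEASE_DIR"
if git diff --cached --quiet; then
    echo "No changes since the last import; nothing to commit."
    exit 0
fi
git commit -m "Automated import of the latest puphpet.com release."

# 4. Tag one point release higher than the largest existing 1.0.x tag.
latest="$(git tag -l '1.0.*' | sort -t. -k3,3n | tail -n 1)"
latest="${latest:-1.0.0}"
new_tag="1.0.$(( ${latest##*.} + 1 ))"
git tag "$new_tag"

# 5. Push the commit and the tag to the remote.
git push origin HEAD "$new_tag"
```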
Additional work:
- Write a README explaining what this is and how it works.
- Write a minimal composer.json file to make this repo a Packagist package.
- Write a minimal post-install composer script that will: (This may be the trickiest part; see the sketches after this list.)
  - Move/overwrite(?) the `ROOT/vendor/loadsys/puphpet-release/release/Vagrantfile` to `ROOT/Vagrantfile`.
  - Move/overwrite(?) `ROOT/vendor/loadsys/puphpet-release/release/puphpet/` to `ROOT/puphpet/`.
  - Replace the stock `ROOT/puphpet/config.yaml` (that comes with our composer package) with the (customized) version from `ROOT/puphpet.yaml` (???), if present.
- Change this project's composer.json to use that custom composer installer.
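For reference, a bare-bones composer.json along these lines might be enough to list the repo on Packagist. The description, license, and `type` here are placeholders rather than decisions; the final `type` in particular will depend on the custom installer:

```json
{
    "name": "loadsys/puphpet-release",
    "description": "Automatically repackaged releases of the puphpet.com configuration tooling.",
    "type": "library",
    "license": "MIT"
}
```

And whatever form the post-install step ultimately takes (composer script vs. custom installer), it essentially has to perform file moves like these in the consuming project's root. Paths are taken from the list above; the `puphpet.yaml` override name is still an open question:

```sh
# Run from the consuming project's root (ROOT) after `composer install`.
cp -f vendor/loadsys/puphpet-release/release/Vagrantfile ./Vagrantfile
rm -rf ./puphpet
cp -R vendor/loadsys/puphpet-release/release/puphpet ./puphpet

# Prefer a project-specific config over the stock one, if the project ships one.
if [ -f ./puphpet.yaml ]; then
    cp -f ./puphpet.yaml ./puphpet/config.yaml
fi
```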
Final steps:
- Make the repo public.
- Enable the packagist web hook.
- Set this repo up somewhere and schedule the script described above as a cron job. (A sample crontab line follows this list.)
- Test it in a few target projects, and open issues for things that don't work correctly or need improvement (`exec-*` scripts, for example).
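Scheduling should just be a normal crontab entry; something along these lines, where the clone path, script name, and schedule are all assumptions:

```
# Nightly at 02:00: run the scraper from its clone, logging output.
0 2 * * * cd /srv/puphpet-release && ./scrape-release.sh >> /var/log/puphpet-release.log 2>&1
```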
What am I forgetting?
Info on creating a composer custom installer, which, if I'm reading correctly, would have to be its own repo: https://getcomposer.org/doc/articles/custom-installers.md
I'm going to see what I can do with the first half of this list today since shell scripting is kind of up my alley.
The scraper script is largely complete now, and will auto-tag new minor versions. Currently the pushing back to `origin` is disabled until I can test things a little more.
We're about ready for a composer installer that knows what to do with our `release/` folder now.
The only thing holding us up from scheduling the scraper to run regularly on our testing server is #3. We need that clone of this repo to auto-update itself before it fetches a new puphpet release and tries to push it back to the remote (GitHub).
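On the scraper side, that auto-update probably amounts to a hard sync with GitHub at the top of the script, before it fetches anything from puphpet.com. A minimal sketch, assuming the clone tracks `origin/master`:

```sh
# Make sure the clone matches GitHub before importing a new release,
# so the eventual push back is a clean fast-forward.
git fetch --tags origin
git reset --hard origin/master
```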