jsonnet-libs / k8s

Code generator for Jsonnet Kubernetes libraries.

Version discovery and automated updates

yuriy-yarosh opened this issue

It would be nice to use the GitHub API and go-git to query the existing project repos for newer releases and package versions, and to emit a separate versions.libsonnet file for more manageable version control.

Then create a separate Makefile build target to update all the version files on demand and automate further updates.
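
For illustration, a minimal sketch of what the discovery step could look like with go-git v5, listing a remote's tags without cloning; the repository URL and the tag filter are just placeholders, not part of the current generator:

```go
// versiondiscovery.go: list upstream release tags without cloning (illustrative sketch).
package main

import (
	"fmt"
	"strings"

	"github.com/go-git/go-git/v5"
	"github.com/go-git/go-git/v5/config"
	"github.com/go-git/go-git/v5/storage/memory"
)

// listReleaseTags queries a remote repository and returns its tag names.
func listReleaseTags(url string) ([]string, error) {
	remote := git.NewRemote(memory.NewStorage(), &config.RemoteConfig{
		Name: "origin",
		URLs: []string{url},
	})
	refs, err := remote.List(&git.ListOptions{})
	if err != nil {
		return nil, err
	}
	var tags []string
	for _, ref := range refs {
		if ref.Name().IsTag() {
			tags = append(tags, ref.Name().Short())
		}
	}
	return tags, nil
}

func main() {
	// Hypothetical upstream repo; any project tracked by the generator would work the same way.
	tags, err := listReleaseTags("https://github.com/cert-manager/cert-manager")
	if err != nil {
		panic(err)
	}
	for _, tag := range tags {
		if strings.HasPrefix(tag, "v1.") {
			fmt.Println(tag)
		}
	}
}
```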

I could make a PR with updated lib versions, for now.

It's important to compare all CRDs between release versions and remove the duplicates, keeping only the most recent ones. It's pretty pointless to process multiple releases with the same CRDs over and over.
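
As a rough sketch of what I mean, the duplicate check could hash each release's rendered CRD documents and only keep the releases where the hash changes (real CRDs would need to be normalised before hashing, so this is deliberately simplified):

```go
// dedupecrds.go: keep only the releases where a CRD schema actually changes (illustrative sketch).
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// crdFingerprint is a stable fingerprint of a rendered CRD document.
func crdFingerprint(doc []byte) string {
	sum := sha256.Sum256(doc)
	return hex.EncodeToString(sum[:])
}

// dedupeReleases walks releases in order and drops the ones whose CRDs
// are identical to the previously kept release.
func dedupeReleases(releases []string, crdsByRelease map[string][]byte) []string {
	var kept []string
	lastFingerprint := ""
	for _, release := range releases {
		fp := crdFingerprint(crdsByRelease[release])
		if fp == lastFingerprint {
			continue // same CRDs as the previous release, skip regeneration
		}
		kept = append(kept, release)
		lastFingerprint = fp
	}
	return kept
}

func main() {
	// Hypothetical data: v1.1.0 ships the same CRDs as v1.0.0.
	crds := map[string][]byte{
		"v1.0.0": []byte("crd-schema-a"),
		"v1.1.0": []byte("crd-schema-a"),
		"v1.2.0": []byte("crd-schema-b"),
	}
	fmt.Println(dedupeReleases([]string{"v1.0.0", "v1.1.0", "v1.2.0"}, crds))
	// Output: [v1.0.0 v1.2.0]
}
```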

Incremental extraction of the versions table would also be great.

I'm not sure what you are optimizing, this simple model of generating a library per software version keeps maintenance low. Doing something smarter inevitably brings more code that we need to maintain.

It would be best to consolidate the config.jsonnet file structure and the version matrices, and to automate the updates with a dedicated build task. Updating the current versions just helped me get a grasp of the general structure and the styling inconsistencies.

So, if you don't mind, I'll make some amendments to extract versioning into separate version.json files for further automation.

Please do this for a single lib first so we can come to an agreement before putting all the work into the other libs too.

I know there is quite a bit of repetition, but DRY isn't always better. Keep in mind that most libraries move at different velocities, depending on their upstream project and popularity.

@Duologic If we're code-generating the same CRDs' helpers over and over for different software versions, it makes sense to skip generating the duplicates entirely and keep only the first or the last version where the respective CRD changes were introduced.

I'm proposing to extract a versions.json file out of config.jsonnet and to write a separate build-tool pass, for instance an updater CLI package, which would query the respective git repos and re-populate the versions.json files for every lib/* automatically, so we won't need any manual updates or double-checking of CRD schemas.
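
To make the idea concrete, here is a rough sketch of such an updater pass, assuming a hypothetical per-library versions.json with a repository URL and a release list; none of the file names or fields here reflect the current generator:

```go
// updater.go: re-populate a hypothetical versions.json under every lib/* (illustrative sketch).
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
)

// Versions is a hypothetical schema for a per-library versions.json.
type Versions struct {
	Repository string   `json:"repository"` // upstream git repo to query
	Releases   []string `json:"releases"`   // tags to generate libraries for
}

// updateLib reads lib/<name>/versions.json, re-queries the upstream repo
// via the supplied discovery function, and writes the file back.
func updateLib(dir string, discover func(repo string) ([]string, error)) error {
	path := filepath.Join(dir, "versions.json")
	raw, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	var v Versions
	if err := json.Unmarshal(raw, &v); err != nil {
		return err
	}
	releases, err := discover(v.Repository)
	if err != nil {
		return err
	}
	v.Releases = releases
	out, err := json.MarshalIndent(v, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(path, append(out, '\n'), 0o644)
}

func main() {
	dirs, err := filepath.Glob("lib/*")
	if err != nil {
		panic(err)
	}
	for _, dir := range dirs {
		// Plug in a real discovery function here, e.g. the go-git tag listing sketched above.
		stub := func(repo string) ([]string, error) { return []string{"v1.0.0"}, nil }
		if err := updateLib(dir, stub); err != nil {
			fmt.Fprintf(os.Stderr, "%s: %v\n", dir, err)
		}
	}
}
```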

I'm up for adding even more libs here, but without proper automated version-management policies, further support may become cumbersome, to say the least.

> @Duologic If we're code-generating the same CRDs' helpers over and over for different software versions, it makes sense to skip generating the duplicates entirely and keep only the first or the last version where the respective CRD changes were introduced.

I don't think there is much advantage in comparing CRDs. Simply rendering the library whenever the version number changes is cheap, and we don't have to maintain the code that does the comparison.

> I'm proposing to extract a versions.json file out of config.jsonnet and to write a separate build-tool pass, for instance an updater CLI package, which would query the respective git repos and re-populate the versions.json files for every lib/* automatically, so we won't need any manual updates or double-checking of CRD schemas.

I see how this could potentially be useful. However, beware that different projects implement versioning differently; on the surface they may seem to do it the same way, but subtle differences can turn this into a lot of work.

Generally, when I'm updating a piece of software on our systems, I take the update of the jsonnet library into account, so I haven't worried much about automating that part; major version jumps usually require a person to look at the changelog anyway.

> I'm up for adding even more libs here, but without proper automated version-management policies, further support may become cumbersome, to say the least.

Which pieces of software are you using that you would like to add? I would very much prefer to have fewer libraries that are actually used over many libraries that nobody uses. Again, lots of code requires lots of maintenance.

Sorry if I sound negative about this, I'm just being cautious.