cncf / k8s-conformance

🧪CNCF K8s Conformance Working Group

Home Page: https://cncf.io/ck

Conformance test names and descriptions are redundant, missing or inconsistent

spiffxp opened this issue

ref: https://github.com/kubernetes/community/blob/master/contributors/devel/conformance-tests.md#sample-conformance-test

I would like all of the existing conformance tests to have consistent formatting and descriptions. Someone should be able to read this doc and feel confident that all behaviors of a given resource are adequately captured.

I am interested first and foremost in these changes being made for 1.12 and forward. I am open to these changes being backported to earlier release branches just to update docs, so that by reading this report someone can see what new conformance functionality was added (vs. the entire doc changing). I am not open to any test functionality changes being backported.

I plan on having issues in k/k link back to this as an umbrella issue.


Examples from https://github.com/cncf/k8s-conformance/blob/master/docs/KubeConformance-1.9.md:

  • "ServiceAccounts should allow opting out of API token automount": has no description.
  • "configmap-in-env-field" ("Make sure config map value can be used as an environment variable in the container (on container.env field)"): the title isn't human readable.
  • The doc as a whole says "Release : v1.9", but nowhere do any of these tests display what release they were added in.


One nit I have to pick is the redundancy we have in the example in the linked guidelines:

  • the Ginkgo Describe(...) title: Kubelet
  • the optional Ginkgo Context(...) info: when scheduling a busybox command in a pod
  • the Ginkgo It(...) behavior: it should print the output to logs
  • the recommended Testname comment: Kubelet: log output
  • the recommended Description comment: By default the stdout and stderr from the process being executed in a pod MUST be sent to the pod's logs.

That's five pieces of redundant info to somehow keep in sync, let alone the actual test code. It's worth noting the suggested comments aren't even in the codebase today, and the test in question doesn't actually verify the stderr part.
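
For concreteness, here is roughly how those five pieces stack up in a single test (a sketch following the linked guidelines; the framework helpers shown, KubeDescribe and ConformanceIt, follow e2e conventions but may not match the real Kubelet test file exactly):

```go
package e2e

import (
	. "github.com/onsi/ginkgo"

	"k8s.io/kubernetes/test/e2e/framework"
)

var _ = framework.KubeDescribe("Kubelet", func() { // 1. the Describe title
	Context("when scheduling a busybox command in a pod", func() { // 2. the Context info
		// 4. the Testname comment and 5. the Description comment:
		/*
			Testname: Kubelet: log output
			Description: By default the stdout and stderr from the process being
			executed in a pod MUST be sent to the pod's logs.
		*/
		framework.ConformanceIt("should print the output to logs", func() { // 3. the It behavior
			// ...the actual test code, a sixth thing to keep in sync with the above
		})
	})
})
```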

Can we simplify this at all before I unleash the hounds on standardizing everything?

I agree, the Testname and Description parts of the comment section should be iterated on properly. To the best of my knowledge I have added descriptions and tried grouping tests using test name prefixes, etc.

I do understand that carefully crafted Describe+Context+It blocks should make a human-readable string that the tool can use in place of the Description. But having a Description field in the comment section allows you to describe the test in detail, with additional information nicely formatted as bullet points, etc.
When promoting a test to conformance, the idea is then to tell the customer everything they need to know about the test through the Description field (instead of going back and changing all the wording top-down from the Describe block onwards).
BTW: if there is no Description field, the tool will just pick up the text from the Describe, Context and It blocks (see the v1.11 document at the end for tests that do not have documentation).

The Release tag was introduced recently for the same reason: a non-technical user shouldn't have to go through the source history to find out when a test changed, but can instead rely on the tag to see when the test was added, followed by the release numbers in which changes to the test happened.
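
For example, the documentation comment above a test could then look something like this (a sketch; the "Release :" spelling mirrors the v1.9 doc, while the exact formatting for listing later changes may differ):

```go
package e2e

import "k8s.io/kubernetes/test/e2e/framework"

var _ = framework.KubeDescribe("Kubelet", func() {
	/*
		Release : v1.9, v1.12
		Testname: Kubelet: log output
		Description: By default the stdout and stderr from the process being
		executed in a pod MUST be sent to the pod's logs.
	*/
	framework.ConformanceIt("should print the output to logs", func() {
		// test body unchanged; only the comment block gains the Release tag
	})
})
```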