
OPERA-SDSVnV-13: Functional validation of PGE for DSWX_HLS products

LucaCinquini opened this issue · comments

L3 requirement:
Verify that the PGE for generation of L3_DSWX_HLS products, wrapping the final version of the corresponding SAS algorithm, is delivered to PCM. Test that the PGE can be successfully executed using the RunConfig file specified in the PGE/PCM ICS.

Verify that all products generated by the L3_DSWX_HLS PGE contain a version number in their file name. Verify that the version number is incremented if and only if the checksum of the output data product has changed. In other words, generating the same product twice with exactly the same software should NOT result in a version increment.
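
As a rough illustration of how a tester might check this from the shell (the run1/run2 paths and output layout are assumptions, not the actual PGE conventions):

  # Run the PGE twice with identical inputs into run1/ and run2/ (paths
  # illustrative), then compare the checksums of the output products.
  sha256sum run1/output_dir/* | awk '{print $1}' | sort > run1.sums
  sha256sum run2/output_dir/* | awk '{print $1}' | sort > run2.sums
  if diff -q run1.sums run2.sums > /dev/null; then
      echo "Identical outputs: version numbers in the file names should NOT differ"
  else
      echo "Outputs differ: the version number should have been incremented"
  fi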

Verify that the DSWX_HLS PGE is accompanied by a corresponding Interface Control Specification (ICS) document, which should contain the complete information needed by the PCM system to invoke and monitor the PGE, including:
o The input and ancillary data sources
o The format of the RunConfig file that encapsulates all the input parameters
o The set of EC2 machines that the PGE should be running on
o The success and error status codes
o Any other relevant information

Inspect the output location of a PGE execution to verify that it contains files with quality metrics information, with fields specific to each PGE

Verify that DAAC metadata is correctly produced for DSWX_HLS products, and that this is done for each DSWX_HLS product.

Inspect the data products, log files and metadata generated by the PGE and verify that all timestamps use Coordinated Universal Time (UTC) and include the "Z" suffix to denote the UTC ("Zulu") time zone.
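
A quick spot-check could grep for ISO 8601 timestamps that lack the trailing "Z" (the regex and the output_dir location are assumptions):

  # List any ISO 8601 timestamps in text files that do not end in "Z"
  grep -rIEo '[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}(\.[0-9]+)?Z?' output_dir/ \
      | grep -v 'Z$' || echo "All timestamps include the Z suffix"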

TestRail link: https://cae-testrail.jpl.nasa.gov/testrail/index.php?/tests/view/3351586&group_by=cases:section_id&group_order=asc&group_id=69570

To Do:

o Expand Section 5; explain that the goal is to verify that the file counter is incremented

o Review English for syntax and misspellings

o Step 21: change title to "Validate counter is incremented upon re-ingestion".
o Please fix English syntax and misspellings in multiple places.

Some general comments:

  • All of the steps should start with some sentences describing what to do. Including the exact bash commands is nice too, but ultimately these should be instructions to a human.
  • Test files pulled down from Artifactory (or any other CM source) should not be modified during the testing. Ideally set them to read-only when setting up the test directory.
  • Each test case should have its own distinct directory. Everything specific to that test (input and output) should be contained within it. When setting up the test, the tester should ensure that they are working with a "fresh checkout" of the test files.
    I suggest naming those directories by the test case ID. For example, call ${TEST_ROOT} the path to the working directories for testing; then everything for this test would be under ${WORKING_DIR}=${TEST_ROOT}/opera-sdsvnv-13/. Underneath that should be additional subdirs (e.g. input_dir, output_dir, expected_dir, etc.). The tester should also ensure that the directories for this test are "clean" when starting out ("clean" = empty save for copies of the CM'd test files). Call that out in the instructions, and include it in the Expected Result column of the step. All input files and generated output files related to this test case should live under ${WORKING_DIR}; see the sketch after this list.
  • Anything that is TBD should be called out explicitly (it's OK to have some at this point). I'm not sure exactly what rich text options exist in TestRail, but make it stand out from the rest of the text somehow.
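
A minimal sketch of the suggested directory setup, assuming ${TEST_ROOT} is already defined and using the subdirectory names from the example above:

  export WORKING_DIR=${TEST_ROOT}/opera-sdsvnv-13
  rm -rf ${WORKING_DIR}    # guarantee a clean starting state
  mkdir -p ${WORKING_DIR}/{runconfig_dir,input_dir,output_dir,expected_dir}
  # ... fresh checkout: copy the CM'd test files from Artifactory into
  # input_dir and expected_dir here (exact Artifactory paths omitted) ...
  chmod -R a-w ${WORKING_DIR}/input_dir ${WORKING_DIR}/expected_dir    # read-only, per the comment above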

Step-specific comments:

For Step 3, there are additional things we should validate in the ICS:

  1. output files are described
  2. doc contains an example/sample RunConfig, not just a specification of the format
  3. trigger conditions for the PGE
  4. expected time & computer resources for running the PGE
  5. description of the catalog metadata provided by the PGE
  6. (maybe more)

On Step 4, let's change the title to something like "prepare test data". We're not creating any test data in these steps.

Combine Steps 9 and 11.

Step 14 - let's change the name to "Product Inspection and Verification". Technically, the receivers of our system will validate that it does the right thing.

The steps in Section 4 could use some refinement. There should be a step comparing all of the output files against the expected output files (not by hand - we should use a script or tool for the comparisons). Essentially, everything should match exactly (filenames, metadata, science data), and the instructions should note where that's not true (e.g. the production time in the metadata), which should be a small list. For floating point comparisons, make sure to note the numerical tolerances (both absolute and relative) used for the comparisons. Having a manual step to inspect the output is good as well, though that inspection could happen on the expected output files at any time during the test.
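
As a sketch of what that scripted comparison might look like (compare_dswx_products.py is a hypothetical helper, and the tolerance values shown are placeholders, not agreed numbers):

  # File names must match exactly between actual and expected output
  diff <(ls ${WORKING_DIR}/output_dir | sort) <(ls ${WORKING_DIR}/expected_dir | sort)
  # Per-file comparison of metadata and science data, with numerical tolerances
  # for floating point fields; known-variable fields (e.g. production time) excluded.
  # compare_dswx_products.py is a hypothetical helper script.
  for f in ${WORKING_DIR}/expected_dir/*; do
      python compare_dswx_products.py --atol 1e-6 --rtol 1e-5 \
          --ignore-field production_time \
          ${WORKING_DIR}/output_dir/$(basename $f) $f
  done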

For Section 5, I'm not sure we really need that as part of this test case. The functionality it would show should be covered by the previous steps. But let's talk that through during the next team meeting. Maybe there's something I'm missing.

commented

I have updated test 13 but it needs additional changes/TBDs: https://cae-testrail.jpl.nasa.gov/testrail/index.php?/tests/view/3351586

Step 1: please add the location where the ICS can be found, which should be:
https://artifactory-fn.jpl.nasa.gov/artifactory/webapp/#/artifacts/browse/tree/General/general/gov/nasa/jpl/opera/sds/pge/documentation

Step 13: must first set the WORKING_DIR
export WORKING_DIR=$HOME/mozart/ops/opera-sds-pcm/tools/opera-sdsvnv-13/

Step 13: do we need brackets around the subdirectories? For example, should "<runconfig_dir>" be simply "runconfig_dir"?

Step 13: the "-v" options are not placed on new lines; I think the command needs to be reformatted?
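
For example, the command could be reformatted with a line continuation per "-v" mount, along these lines (image name, tag, container paths, and PGE arguments are placeholders):

  docker run --rm -u $(id -u):$(id -g) \
      -v ${WORKING_DIR}/runconfig_dir:/home/conda/runconfig:ro \
      -v ${WORKING_DIR}/input_dir:/home/conda/input_dir:ro \
      -v ${WORKING_DIR}/output_dir:/home/conda/output_dir \
      <pge_image>:<tag> <pge_arguments>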

Step 14: must "cd" into the directory and then run "tail ..."
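
i.e. something like the following (the log file name is a placeholder):

  cd ${WORKING_DIR}/output_dir
  tail -n 50 <pge_log_file>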

Step 19: add "TBD: which fields???"

Remove section 5 and in section 4 verify that the counter in the file names matches what is in the RunConfig.
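
A rough sketch of that check, assuming the counter is the last numeric field in the product file name and that the RunConfig exposes it under a key like ProductCounter (both naming assumptions):

  # Counter from a product file name (field position is an assumption)
  fname_counter=$(ls ${WORKING_DIR}/output_dir/*.tif | head -1 | grep -oE '[0-9]+' | tail -1)
  # Counter from the RunConfig (key name is an assumption)
  rc_counter=$(grep -i 'ProductCounter' ${WORKING_DIR}/runconfig_dir/*.yaml | grep -oE '[0-9]+' | tail -1)
  if [ "$fname_counter" = "$rc_counter" ]; then
      echo "File name counter matches RunConfig: $fname_counter"
  else
      echo "MISMATCH: file name=$fname_counter RunConfig=$rc_counter"
  fi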

It seems like not all of the above changes have been executed?

commented

I have removed Section 5 and added the product counter check in Step 4:
https://cae-testrail.jpl.nasa.gov/testrail/index.php?/tests/view/3351586

Looks good, thanks, closing.