score-p / scorep_binding_python

Allows tracing of python code using Score-P

Use pytest instead of unittest

Flamefire opened this issue

Pytest is a superior alternative to unittest; its only drawback is that it needs to be installed first. I'd recommend using it anyway:

I found myself wanting to write a test that is executed for each instrumenter. Doing that with the default unittest framework is cumbersome and would not provide good information on failure. With pytest it could look like this:

import pytest

INSTRUMENTERS = ('scorep_profile', 'scorep_trace')

class TestScorepBindingsPython:
    @pytest.mark.parametrize('instrumenter', INSTRUMENTERS)
    def test_foo(self, instrumenter):
        print(instrumenter)

This will create multiple tests, one per instrumenter; the technique is called test parametrization: https://docs.pytest.org/en/latest/parametrize.html#parametrize-basics

We could go further and derive a custom decorator foreach_instrumenter, as the above @pytest.mark.parametrize('instrumenter', INSTRUMENTERS) will likely be used for many tests.
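A minimal sketch of such a derived decorator (the name foreach_instrumenter is the one proposed above; the test body is invented for illustration):

```python
import pytest

INSTRUMENTERS = ('scorep_profile', 'scorep_trace')

# A MarkDecorator can be stored in a variable and reused, so the
# parametrization is written once and applied to many tests.
foreach_instrumenter = pytest.mark.parametrize('instrumenter', INSTRUMENTERS)

@foreach_instrumenter
def test_instrumenter_known(instrumenter):
    # Hypothetical test body: just check the parameter arrives as expected.
    assert instrumenter in INSTRUMENTERS
```

Running pytest would then collect test_instrumenter_known[scorep_profile] and test_instrumenter_known[scorep_trace] as separate tests.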

Final note: pytest supports assert foo == bar style assertions, which automatically expand to the operand values on error: a shorter version of self.assertEqual(foo, bar) that I find much more readable and faster to write. The current unittest tests run without change under pytest, so the migration can be gradual or on an as-needed basis.
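For comparison, a small sketch of the two assertion styles (the tested values are invented):

```python
import unittest

# unittest style: assertions go through wrapper methods on TestCase.
class TestSumUnittest(unittest.TestCase):
    def test_sum(self):
        self.assertEqual(sum([1, 2, 3]), 6)

# pytest style: a plain assert; on failure pytest reports both operand values.
def test_sum_pytest():
    assert sum([1, 2, 3]) == 6
```

Since pytest also collects and runs unittest.TestCase classes, both styles can coexist during a gradual migration.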

I am in favor of using pytest. However, I like to use the tests to debug things, as it avoids setting up the Score-P environment every time. How would you run a certain test with a certain instrumenter from the command line?

Best,

Andreas

Do you mean a specific test from test.py or one of the test files invoked from there? If the latter: they don't change, so the same as now.

For running a specific test: VS Code, for example, integrates pytest and allows running a specific test with a given parameter from the IDE. I assume a command-line option exists to filter the tests to a given parameter, as new test cases with new names are generated by the decorator. They are named like test_demo2[advanced], i.e. with the parameter in brackets, so I expect it is possible to execute the test with pytest -k 'test_demo2[advanced]' test.py from the command line.

Related to that: I'd also rename all files, stripping their test_ prefixes, for two reasons:

  • Running pytest test (the folder) collects all files with a test prefix and gathers all methods and classes with a test prefix in those, creating a test case for each of them. So no inheriting is required. The currently prefixed names would confuse pytest and fail, as the scorep import likely won't work.
  • I was also confused, as I expected each file to contain a test case, but they were files to be executed. Given that they are already in a test folder, there is no need to prefix them with test, which avoids confusing new developers.

This also means we could have multiple smaller, more focused test case files when required, instead of growing that one test.py bigger and bigger as test cases are added.

To allow evaluation: would you agree to renaming the test files? Then I'll open a simple PR doing only that.

> To allow evaluation: would you agree to renaming the test files? Then I'll open a simple PR doing only that.

Yes. What do you think about a folder test_cases or something to group them?

> For running a specific test: VS Code, for example, integrates pytest and allows running a specific test with a given parameter from the IDE. I assume a command-line option exists to filter the tests to a given parameter, as new test cases with new names are generated by the decorator. They are named like test_demo2[advanced], i.e. with the parameter in brackets, so I expect it is possible to execute the test with pytest -k 'test_demo2[advanced]' test.py from the command line.

I have never used VS Code ;-). What I usually do is run

python -m unittest test.TestClass.test_case or something, which will execute the test where I had an issue.

> This also means we could have multiple smaller, more focused test case files when required, instead of growing that one test.py bigger and bigger as test cases are added.

Sounds interesting.

> Yes. What do you think about a folder test_cases or something to group them?

See #81 for what it would look like. I think having (or not having) the test_ prefix is enough to group them; I don't see a benefit in introducing another folder. It also makes sense to run python -m scorep test/someexample.py, and the distinction via the (missing) prefix makes clear how the files are intended to be run.

> I have never used VS Code ;-)

This was just to provide an argument that running a single test case is possible

> python -m unittest test.TestClass.test_case or something, which will execute the test where I had an issue.

So that would become pytest -k 'test_case[profile]' (or similar; I can check after #81 or in that tree to confirm). Does this meet your needs?

> See #81 for what it would look like. I think having (or not having) the test_ prefix is enough to group them; I don't see a benefit in introducing another folder. It also makes sense to run python -m scorep test/someexample.py, and the distinction via the (missing) prefix makes clear how the files are intended to be run.

I am not sure we are talking about the same thing. The idea is to group the files that are executed into a separate folder. Probably I misused words here. The folder would look something like:

test/
--> test_scorep.py
--> test_whatever_else.py
--> cases/
    --> instrumentation.py
    --> mpi.py
    --> call_main.py
    --> context.py

So the actual tests and the executed files are separate.

> So that would become pytest -k 'test_case[profile]' (or similar; I can check after #81 or in that tree to confirm). Does this meet your needs?

Yes.

Understood. What I wanted to say is that the extra folder is not required as the grouping is already achieved by the test_ prefix. So like https://github.com/Flamefire/scorep_binding_python/tree/remove_test_prefix/test or:

test/
--> test_scorep.py
--> test_whatever_else.py
--> instrumentation.py
--> mpi.py
--> call_main.py
--> context.py

But I guess your solution is more useful, as especially with pytest we might want test helper modules. Pytest has great dependency-injection-style fixtures. Something like (IIRC):

import os
import shutil
import tempfile

import pytest

@pytest.fixture
def myfixture():
    name = tempfile.mkdtemp()  # setup: create a temporary directory
    yield name                 # the yielded value is injected into the test
    shutil.rmtree(name)        # teardown: runs after the test finishes

def test_something(myfixture):
    print(os.path.join(myfixture, 'foo'))

Pytest will then automatically call myfixture, pass the yielded value in as the parameter, and run the rest of the fixture at the end. Point being: we might want such general fixtures for test setup/teardown in separate files and not mix them with the cases.

> So that would become pytest -k test_case[profile]

I just validated this, and it works exactly like that. The only downside: the tests must be converted to pytest-style tests, as the parametrization doesn't work with unittest subclasses: https://docs.pytest.org/en/latest/unittest.html#pytest-features-in-unittest-testcase-subclasses
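A hypothetical before/after sketch of such a conversion (class and test names invented):

```python
import pytest

# Before (unittest style; @pytest.mark.parametrize has no effect on
# methods of a unittest.TestCase subclass):
#
#     class TestBindings(unittest.TestCase):
#         def test_instrumenter_name(self):
#             self.assertTrue('scorep_profile'.startswith('scorep_'))
#
# After (pytest style): dropping the TestCase base class makes
# parametrization work and shortens the assertion.
@pytest.mark.parametrize('instrumenter', ('scorep_profile', 'scorep_trace'))
def test_instrumenter_name(instrumenter):
    assert instrumenter.startswith('scorep_')
```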

But it should be straightforward to do and should even reduce the amount of code.

Merged #81. Feel free to modify the CI builds and move to pytest.

As this is done, I close the issue.