python-coincidence / coincidence

Helper functions for pytest.

Home Page: https://coincidence.readthedocs.io/en/latest

0.6.2: pytest is failing in the `tests/test_params.py::testing_boolean_values` unit test

kloczek opened this issue

I'm packaging your module as an RPM package, so I'm using the typical PEP 517-based build, install, and test cycle used when building packages from a non-root account.

  • python3 -sBm build -w --no-isolation
  • because I'm calling build with --no-isolation, only locally installed modules are used during the whole process
  • install the .whl file in </install/prefix>
  • run pytest with PYTHONPATH pointing to sitearch and sitelib inside </install/prefix>

Here is the list of modules installed in the build environment, followed by the pytest output:

Package                       Version
----------------------------- -----------------
alabaster                     0.7.12
apeye                         1.2.0
appdirs                       1.4.4
asn1crypto                    1.5.1
attrs                         22.1.0
autodocsumm                   0.2.9
Babel                         2.11.0
bcrypt                        3.2.2
beautifulsoup4                4.11.1
Brlapi                        0.8.3
build                         0.9.0
CacheControl                  0.12.11
cffi                          1.15.1
charset-normalizer            3.0.1
click                         8.1.3
consolekit                    1.4.1
contourpy                     1.0.6
cryptography                  38.0.4
cssselect                     1.1.0
cssutils                      2.6.0
cycler                        0.11.0
deprecation                   2.1.0
deprecation-alias             0.3.1
dict2css                      0.3.0
dist-meta                     0.6.0
distro                        1.8.0
dnspython                     2.2.1
docutils                      0.19
dom_toml                      0.6.0
domdf-python-tools            3.3.0
exceptiongroup                1.0.0
extras                        1.0.0
extras-require                0.4.3
fixtures                      4.0.0
fonttools                     4.38.0
gpg                           1.17.1-unknown
handy-archives                0.1.2
html5lib                      1.1
idna                          3.4
imagesize                     1.4.1
importlib-metadata            5.1.0
iniconfig                     1.1.1
Jinja2                        3.1.2
kiwisolver                    1.4.4
libcomps                      0.1.19
lockfile                      0.12.2
louis                         3.23.0
lxml                          4.9.1
MarkupSafe                    2.1.1
matplotlib                    3.6.2
mistletoe                     0.9.0
msgpack                       1.0.4
natsort                       8.0.2
numpy                         1.23.1
olefile                       0.46
packaging                     21.3
pbr                           5.9.0
pep517                        0.13.0
Pillow                        9.3.0
pip                           22.3.1
platformdirs                  2.5.2
pluggy                        1.0.0
ply                           3.11
pyasn1                        0.4.8
pyasn1-modules                0.2.8
pycparser                     2.21
Pygments                      2.13.0
PyGObject                     3.42.2
pyparsing                     3.0.9
pyproject-parser              0.5.0
pytest                        7.2.0
pytest-datadir                1.4.1
pytest-regressions            2.4.1
pytest-timeout                2.1.0
python-dateutil               2.8.2
pytz                          2022.4
PyYAML                        6.0
requests                      2.28.1
rpm                           4.17.0
ruamel.yaml                   0.17.21
ruamel.yaml.clib              0.2.6
scour                         0.38.2
setuptools                    65.6.3
shippinglabel                 1.4.1
six                           1.16.0
snowballstemmer               2.2.0
soupsieve                     2.3.2.post1
Sphinx                        5.3.0
sphinx_autodoc_typehints      1.19.4
sphinx-jinja2-compat          0.2.0
sphinx-prompt                 1.4.0
sphinx-pyproject              0.1.0
sphinx-tabs                   3.4.1
sphinx-toolbox                3.2.0
sphinxcontrib-applehelp       1.0.2.dev20221204
sphinxcontrib-devhelp         1.0.2.dev20221204
sphinxcontrib-htmlhelp        2.0.0
sphinxcontrib-jsmath          1.0.1.dev20221204
sphinxcontrib-qthelp          1.0.3.dev20221204
sphinxcontrib-serializinghtml 1.1.5
tabulate                      0.9.0
testtools                     2.5.0
toml                          0.10.2
tomli                         2.0.1
tpm2-pkcs11-tools             1.33.7
tpm2-pytss                    1.1.0
typing_extensions             4.4.0
urllib3                       1.26.12
webencodings                  0.5.1
wheel                         0.38.4
whey                          0.0.23
zipp                          3.11.0
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-coincidence-0.6.2-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-coincidence-0.6.2-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.15, pytest-7.2.0, pluggy-1.0.0
Test session started at 18:51:37
rootdir: /home/tkloczko/rpmbuild/BUILD/coincidence-0.6.2, configfile: tox.ini
plugins: datadir-1.4.1, regressions-2.4.1, timeout-2.1.0
timeout: 300.0s
timeout method: signal
timeout func_only: False
collected 75 items

tests/test_fixtures.py ...s                                                                                                                                          [  5%]
tests/test_params.py F......                                                                                                                                         [ 14%]
tests/test_regressions.py .....................................                                                                                                      [ 64%]
tests/test_selectors.py ..sss.sssss.ss.s.s.ss.                                                                                                                       [ 93%]
tests/test_utils.py .....                                                                                                                                            [100%]

================================================================================= FAILURES =================================================================================
__________________________________________________________________________ testing_boolean_values __________________________________________________________________________

cls = <class '_pytest.runner.CallInfo'>, func = <function call_runtest_hook.<locals>.<lambda> at 0x7f7b91cdc280>, when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        """Call func, wrapping the result in a CallInfo.

        :param func:
            The function to call. Called without arguments.
        :param when:
            The phase in which the function is called.
        :param reraise:
            Exception or exceptions that shall propagate if raised by the
            function, instead of being wrapped in the CallInfo.
        """
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:339:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:260: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/_hooks.py:265: in __call__
    return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
/usr/lib/python3.8/site-packages/pluggy/_manager.py:80: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
/usr/lib/python3.8/site-packages/_pytest/runner.py:175: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:167: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1794: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/_hooks.py:265: in __call__
    return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
/usr/lib/python3.8/site-packages/pluggy/_manager.py:80: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

pyfuncitem = <Function testing_boolean_values>

    @hookimpl(trylast=True)
    def pytest_pyfunc_call(pyfuncitem: "Function") -> Optional[object]:
        testfunction = pyfuncitem.obj
        if is_async_function(testfunction):
            async_warn_and_skip(pyfuncitem.nodeid)
        funcargs = pyfuncitem.funcargs
        testargs = {arg: funcargs[arg] for arg in pyfuncitem._fixtureinfo.argnames}
        result = testfunction(**testargs)
        if hasattr(result, "__await__") or hasattr(result, "__aiter__"):
            async_warn_and_skip(pyfuncitem.nodeid)
        elif result is not None:
>           warnings.warn(
                PytestReturnNotNoneWarning(
                    f"Expected None, but {pyfuncitem.nodeid} returned {result!r}, which will be an error in a "
                    "future version of pytest.  Did you mean to use `assert` instead of `return`?"
                )
            )
E           pytest.PytestReturnNotNoneWarning: Expected None, but tests/test_params.py::testing_boolean_values returned MarkDecorator(mark=Mark(name='parametrize', args=('boolean_string, expected_boolean', [(True, True), ('True', True), ('true', True), ('tRUe', True), ('y', True), ('Y', True), ('YES', True), ('yes', True), ('Yes', True), ('yEs', True), ('ON', True), ('on', True), ('1', True), (1, True), (False, False), ('False', False), ('false', False), ('falSE', False), ('n', False), ('N', False), ('NO', False), ('no', False), ('nO', False), ('OFF', False), ('off', False), ('oFF', False), ('0', False), (0, False)]), kwargs={})), which will be an error in a future version of pytest.  Did you mean to use `assert` instead of `return`?

/usr/lib/python3.8/site-packages/_pytest/python.py:204: PytestReturnNotNoneWarning
=========================================================================== slowest 25 durations ===========================================================================

(25 durations < 0.005s hidden.  Use -vv to show these durations.)
========================================================================= short test summary info ==========================================================================
SKIPPED [1] tests/test_fixtures.py:33: Output differs on platforms where os.sep == '/'
SKIPPED [5] tests/test_selectors.py:36: Success
SKIPPED [5] tests/test_selectors.py:52: Success
SKIPPED [1] tests/test_selectors.py:67: Success
SKIPPED [1] tests/test_selectors.py:79: Success
SKIPPED [1] tests/test_selectors.py:91: Success
SKIPPED [1] tests/test_selectors.py:97: Success
FAILED tests/test_params.py::testing_boolean_values - pytest.PytestReturnNotNoneWarning: Expected None, but tests/test_params.py::testing_boolean_values returned MarkDecorator(mark=Mark(name='parametrize', args=('boolean_...
================================================================= 1 failed, 59 passed, 15 skipped in 0.56s =================================================================
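For context, the failure is pytest 7.2's `PytestReturnNotNoneWarning`, which this run treats as an error (the project's pytest configuration appears to promote warnings to errors). pytest collects `testing_boolean_values` because the name matches the default `test*` collection glob, and the collected function returns a `MarkDecorator` instead of applying it as a decorator. The sketch below is a hypothetical reconstruction of that pattern and its warning-free equivalent; it is not the actual `tests/test_params.py` source, and the `_to_bool` helper is only a stand-in for whatever string-to-bool conversion the real test exercises.

```python
import pytest


def _to_bool(value) -> bool:
    # Stand-in for the real string→bool conversion under test.
    return str(value).strip().lower() in {"true", "y", "yes", "on", "1"}


def testing_boolean_values():
    # Collected as a test because the name matches pytest's default "test*" glob.
    # Returning the MarkDecorator (instead of applying it) is what triggers
    # PytestReturnNotNoneWarning on pytest >= 7.2.
    return pytest.mark.parametrize(
        "boolean_string, expected_boolean",
        [(True, True), ("yes", True), ("on", True), ("0", False), ("off", False)],
    )


# Warning-free equivalent: apply the mark as a decorator and assert in the body.
@pytest.mark.parametrize(
    "boolean_string, expected_boolean",
    [(True, True), ("yes", True), ("on", True), ("0", False), ("off", False)],
)
def test_boolean_values(boolean_string, expected_boolean):
    assert _to_bool(boolean_string) is expected_boolean
```

This reading also matches the difference between the two runs: 0.6.2 collects 75 items (seven in tests/test_params.py, one failing), while 0.6.6 collects 74 and tests/test_params.py shows only six passing tests, consistent with the offending function no longer being picked up by collection.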

Gentle ping .. 🤔
I still see this unit test failing in 0.6.4.

Closing, as with 0.6.6 I no longer see that issue.

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-coincidence-0.6.6-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-coincidence-0.6.6-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra -m 'not network'
==================================================================================== test session starts ====================================================================================
platform linux -- Python 3.8.18, pytest-8.0.2, pluggy-1.3.0
Test session started at 02:16:27
rootdir: /home/tkloczko/rpmbuild/BUILD/coincidence-0.6.6
configfile: tox.ini
plugins: hypothesis-6.84.3, datadir-1.5.0, regressions-2.5.0, timeout-2.2.0
timeout: 300.0s
timeout method: signal
timeout func_only: False
collected 74 items

tests/test_fixtures.py ...s                                                                                                                                                           [  5%]
tests/test_params.py ......                                                                                                                                                           [ 13%]
tests/test_regressions.py .....................................                                                                                                                       [ 63%]
tests/test_selectors.py ..sss.sssss.ss.s.s.ss.                                                                                                                                        [ 93%]
tests/test_utils.py .....                                                                                                                                                             [100%]

=================================================================================== slowest 25 durations ====================================================================================

(25 durations < 0.005s hidden.  Use -vv to show these durations.)
================================================================================== short test summary info ==================================================================================
SKIPPED [1] tests/test_fixtures.py:35: Output differs on platforms where os.sep == '/'
SKIPPED [5] tests/test_selectors.py:36: Success
SKIPPED [5] tests/test_selectors.py:52: Success
SKIPPED [1] tests/test_selectors.py:67: Success
SKIPPED [1] tests/test_selectors.py:79: Success
SKIPPED [1] tests/test_selectors.py:91: Success
SKIPPED [1] tests/test_selectors.py:97: Success
============================================================================== 59 passed, 15 skipped in 0.39s ===============================================================================