silx-kit / hdf5plugin

Set of compression filters for h5py

Home Page: http://www.silx.org/doc/hdf5plugin/latest/


import hdf5plugin should not print "filter already loaded, skip it."

lacker opened this issue · comments

I don't think it's appropriate for a library to print out a warning message by default at import time when nothing has gone wrong.

For example, some python environments may require hdf5plugin to be imported to get bitshuffle, and some may not. If you are writing your code to work on either one of those environments, simply importing hdf5plugin is the right thing to do. It isn't a problem that should be warned about until you fix it.

I know you can turn down the library's log level, but that isn't something you should have to put in every script that uses the HDF5 format.
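For reference, "turning the log level down" amounts to raising the threshold of the package's logger before the import. This is a minimal sketch using only the standard `logging` module; the logger name `"hdf5plugin"` is an assumption that the package logs through a logger named after itself, and the demonstration below mimics that logger rather than importing the real package.

```python
import logging

# Assumed logger name: "hdf5plugin". Raising its threshold to ERROR before
# the import would silence the "filter already loaded, skip it." warning
# for this process only:
logging.getLogger("hdf5plugin").setLevel(logging.ERROR)
# import hdf5plugin  # the import-time WARNING would now be filtered out

# Stdlib-only demonstration that the threshold drops WARNING records:
captured = []
handler = logging.Handler()
handler.emit = lambda record: captured.append(record.levelname)

log = logging.getLogger("hdf5plugin")
log.addHandler(handler)
log.warning("filter already loaded, skip it.")  # dropped: below ERROR
log.error("real problem")                       # passes the threshold
print(captured)
```

The point of the issue is precisely that this boilerplate should not be needed in every consuming script.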

I think logger level INFO should be enough. What do you think @t20100?

I am sorry, I do not agree: this warning is useful for spotting plugins overridden through an environment variable with a (deprecated) version, the typical one being an ancient bitshuffle compiled without multi-threading.

All I can say is that NumPy can run with multiple versions of mathematical libraries (none, BLAS, MKL, ...) and it does not emit a warning because it uses one or the other. If one wants to know which one is in use, one has to look for it explicitly.

Python guidelines suggest that info level should be used to "Report events that occur during normal operation of a program". Whereas warning level is supposed to be for "An indication that something unexpected happened, or indicative of some problem in the near future." Importing hdf5plugin when some plugins are already installed is normal operation. The issue is not avoidable, in an environment where some plugins are already installed; it is not unexpected, nor does it indicate a problem. Thus "info" level seems more appropriate.

https://docs.python.org/3/howto/logging.html

Whether already-loaded filters will cause an issue or not depends on the installation.
Here is a situation where already-loaded filters cause problems: if the bitshuffle filter is installed system-wide (and thus linked to the system libhdf5) and you are using a virtualenv (or install packages with --user) with h5py installed from a wheel (which bundles and links against its own libhdf5), you can get errors when compressing with bitshuffle.
BTW, this linking issue is why we made the hdf5plugin package in the first place.
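The "already loaded" filters typically come from directories on `HDF5_PLUGIN_PATH`, the environment variable HDF5 searches for dynamically loaded filter plugins. A hedged sketch for inspecting it (the variable name is documented HDF5 behavior; whether a stale plugin actually lives there depends on the installation):

```python
import os

# HDF5 searches the directories in HDF5_PLUGIN_PATH for filter plugins.
# Listing them can help spot a stale system-wide bitshuffle build that
# shadows the copy shipped with hdf5plugin.
plugin_path = os.environ.get("HDF5_PLUGIN_PATH")
if plugin_path:
    for directory in plugin_path.split(os.pathsep):
        print(directory)
else:
    print("HDF5_PLUGIN_PATH is not set")
```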

In this case, the message is "indicative of some problem in the near future", yet in other situations it does not lead to an error and merely "reports events that occur during normal operation of a program".

We've already considered switching it to info level (#157 (comment)), and I'm not sure users will relate an unclear error message like ValueError: Unable to create dataset (error during user callback) later on to this initial warning, so +1 to turn this into "info" level.