interpretml / interpret-community

Interpret-Community extends the Interpret repository with additional interpretability techniques and utility functions to handle real-world datasets and workflows.

Home Page: https://interpret-community.readthedocs.io/en/latest/index.html

Repository from GitHub: https://github.com/interpretml/interpret-community

Object of type 'Timestamp' is not JSON serializable

lucazav opened this issue · comments

Hi all,

I'm trying to show the Explanation Dashboard in the following way:

raw_explanations = client.download_model_explanation(raw=True)

ExplanationDashboard(raw_explanations, explainer_setup_class.automl_pipeline, datasetX=explainer_setup_class.X_test_raw)

but I'm getting the following error:


TypeError Traceback (most recent call last)
<ipython-input> in <module>
3 print(raw_explanations.get_feature_importance_dict())
4
----> 5 ExplanationDashboard(raw_explanations, explainer_setup_class.automl_pipeline, datasetX=explainer_setup_class.X_test_raw)

/anaconda/envs/azureml_py36/lib/python3.6/site-packages/interpret_community/widget/explanation_dashboard.py in __init__(self, explanation, model, dataset, true_y, classes, features, port, use_cdn, datasetX, trueY, locale, public_ip, with_credentials)
261 if ExplanationDashboard.service.env != "cloud":
262 explanation_input.enable_predict_url()
--> 263 html = generate_inline_html(explanation_input, local_url)
264 ExplanationDashboard.explanations[str(ExplanationDashboard.model_count)] = explanation_input
265

/anaconda/envs/azureml_py36/lib/python3.6/site-packages/interpret_community/widget/explanation_dashboard.py in generate_inline_html(explanation_input_object, local_url)
294
295 def generate_inline_html(explanation_input_object, local_url):
--> 296 explanation_input = json.dumps(explanation_input_object.dashboard_input)
297 return ExplanationDashboard.default_template.render(explanation=explanation_input,
298 main_js=ExplanationDashboard._dashboard_js,

/anaconda/envs/azureml_py36/lib/python3.6/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
229 cls is None and indent is None and separators is None and
230 default is None and not sort_keys and not kw):
--> 231 return _default_encoder.encode(obj)
232 if cls is None:
233 cls = JSONEncoder

/anaconda/envs/azureml_py36/lib/python3.6/json/encoder.py in encode(self, o)
197 # exceptions aren't as detailed. The list call should be roughly
198 # equivalent to the PySequence_Fast that ''.join() would do.
--> 199 chunks = self.iterencode(o, _one_shot=True)
200 if not isinstance(chunks, (list, tuple)):
201 chunks = list(chunks)

/anaconda/envs/azureml_py36/lib/python3.6/json/encoder.py in iterencode(self, o, _one_shot)
255 self.key_separator, self.item_separator, self.sort_keys,
256 self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)
258
259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

/anaconda/envs/azureml_py36/lib/python3.6/json/encoder.py in default(self, o)
178 """
179 raise TypeError("Object of type '%s' is not JSON serializable" %
--> 180 o.__class__.__name__)
181
182 def encode(self, o):

TypeError: Object of type 'Timestamp' is not JSON serializable

Is this a known bug?

I've just spun up a new Compute Instance to try this code, so I'm using the following packages:

  • Azure ML Python SDK (1.16.0)
  • interpret-community (0.15.1)
  • interpret-core (0.2.1)
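
For reference, the failure can be reproduced with json alone, independently of the dashboard. The DataFrame and column names below are hypothetical, but any pandas Timestamp in the data passed as datasetX triggers the same error; converting the datetime columns to strings first is a possible workaround on affected versions:

import json
import pandas as pd

# Hypothetical test set containing a datetime column
X_test_raw = pd.DataFrame({
    "ship_date": pd.to_datetime(["2020-10-01", "2020-10-02"]),
    "amount": [10.5, 20.0],
})

try:
    # json.dumps cannot encode pandas Timestamp values
    json.dumps(X_test_raw.values.tolist())
except TypeError as err:
    print(err)  # same TypeError as in the traceback above

# Possible workaround: convert datetime columns to strings before
# passing the frame as datasetX to ExplanationDashboard
X_test_serializable = X_test_raw.copy()
for col in X_test_serializable.select_dtypes(include=["datetime", "datetimetz"]).columns:
    X_test_serializable[col] = X_test_serializable[col].astype(str)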

This issue should be fixed in this PR:

#341

Timestamp is not JSON serializable, so we convert it to its string representation.
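
A sketch of that kind of conversion (hypothetical helper, not the actual PR code):

import json
import pandas as pd

def _to_serializable(value):
    # Hypothetical helper: fall back to the string representation for
    # values json cannot encode natively, such as pandas Timestamps
    if isinstance(value, pd.Timestamp):
        return str(value)
    return value

row = [pd.Timestamp("2020-10-01"), 42, "label"]
print(json.dumps([_to_serializable(v) for v in row]))
# ["2020-10-01 00:00:00", 42, "label"]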

The fix should be available in the most recent release of interpret-community. Closing this issue.
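
Upgrading the package, for example with pip install --upgrade interpret-community, should pick up the fix.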

Regards,