ionicsolutions / do-you-follow-what-im-explaining

Do You Follow What I’m Explaining? A Practitioner’s Guide to Opening the AI Black Box for Humans

Kilian Kluge @ PyData Global 2022, December 1st, 12:00 UTC, Track II

Slides (PDF)

Abstract

Methods and techniques from the realm of artificial intelligence (AI), such as machine learning, find their way into ever more software and devices. As more people interact with these highly complex and opaque systems in their private and professional lives, there is a rising need to communicate AI-based decisions, predictions, and recommendations to their users.

So-called “interpretability” or “explainability” methods claim to allow insights into the proverbial “black boxes.” Many data scientists use tools like SHAP, LIME, or partial dependence plots in their day-to-day work to analyze and debug models.
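To illustrate what these tools compute, here is a minimal, dependency-free sketch of a partial dependence curve. The toy price model and the data are made up for illustration; in practice you would pass a trained estimator and your real dataset (e.g. to `sklearn.inspection.partial_dependence` or a SHAP explainer).

```python
def model(features):
    # Hypothetical stand-in for a trained estimator:
    # price depends linearly on area (m^2) and number of rooms.
    area, rooms = features
    return 1000 * area + 5000 * rooms

def partial_dependence(model, data, feature_index, grid):
    """Average the model output over the dataset while
    sweeping one feature across a grid of values."""
    curve = []
    for value in grid:
        outputs = []
        for row in data:
            modified = list(row)
            modified[feature_index] = value  # override the swept feature
            outputs.append(model(modified))
        curve.append(sum(outputs) / len(outputs))
    return curve

data = [(50, 2), (80, 3), (120, 4)]   # (area, rooms) samples
grid = [50, 100, 150]                 # values to sweep area over
print(partial_dependence(model, data, 0, grid))
```

Plotting the returned curve against the grid shows how the model's prediction changes, on average, as one feature varies while all others keep their observed values.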

However, as numerous studies have shown, even experienced data scientists are prone to interpret the “explanations” generated by these tools in ways that support their pre-existing beliefs. This problem becomes even more severe when “explanations” are presented to end-users in hopes of allowing them to assess and scrutinize an AI system’s output.

In this talk, we’ll explore the problem space using the example of counterfactual explanations for price estimates. Participants will learn how to employ user studies and principles from human-centric design to implement “explanations” that fulfill their purpose.

No prior data science knowledge is required to follow the talk, but a basic familiarity with the concept of minimizing an objective function will be helpful.
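A common way to generate a counterfactual explanation is to minimize an objective that trades off reaching a desired model output against staying close to the original input. The following sketch uses a made-up one-feature price model and a simple greedy search; real implementations use gradient-based optimizers over many features.

```python
def model(area):
    # Hypothetical price model: 3000 EUR per m^2.
    return 3000 * area

def counterfactual(model, x0, target, lam=1.0, step=1.0, iters=1000):
    """Greedy search minimizing
        lam * (model(x) - target)^2 + (x - x0)^2,
    i.e. 'reach the target prediction while changing x as little
    as possible'. Returns the counterfactual feature value."""
    def objective(x):
        return lam * (model(x) - target) ** 2 + (x - x0) ** 2

    x = x0
    for _ in range(iters):
        # Move to whichever neighbor (or staying put) lowers the objective.
        x = min([x - step, x, x + step], key=objective)
    return x

# "Your flat was priced at 240,000 EUR. Had it measured 100 m^2
# instead of 80 m^2, the estimate would have been 300,000 EUR."
print(counterfactual(model, 80, 300000))
```

The resulting value answers the user-facing question "what would have had to be different for the price estimate to come out at the target?", which is the kind of explanation the talk examines for real estate price predictions.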

Further reading on topics covered in the talk

Explanation Fundamentals

Evaluating Explanations

  • Doshi-Velez & Kim (2017): Towards A Rigorous Science of Interpretable Machine Learning arXiv:1702.08608
  • Förster et al. (2020): Evaluating Explainable Artificial Intelligence – What Users Really Appreciate ECIS 2020 Proceedings

Meaningful Explanations

Misleading Explanations

Generating Meaningful Explanations for Real Estate Price Predictions

Explainable AI Slack Community

Affiliations

About