Towards Directive Explanations

ACM CHI 2024 Presentation - Towards Directive Explanations

Crafting Explainable AI Systems for Actionable Human-AI Interactions

Session presented at ACM CHI 2024: https://chi2024.acm.org/

Pre-print of the paper: https://arxiv.org/abs/2401.04118

Access the presentation and poster from Zenodo: https://zenodo.org/records/11086580

Abstract

With Artificial Intelligence (AI) becoming ubiquitous in every application domain, the need for explanations is paramount to enhance transparency and trust among non-technical users. Despite the potential shown by Explainable AI (XAI) for enhancing understanding of complex AI systems, most XAI methods are designed for technical AI experts rather than non-technical consumers. Consequently, such explanations are overwhelmingly complex and seldom guide users in achieving their desired predicted outcomes. This paper presents ongoing research for crafting XAI systems tailored to guide users in achieving desired outcomes through improved human-AI interactions. It highlights the research objectives and methods, along with key takeaways and implications learned from user studies. It also outlines open questions and challenges for enhanced human-AI collaboration, which the author aims to address in future work.

For more information, please contact Aditya Bhattacharya: https://www.linkedin.com/in/aditya-bhattacharya-b59155b6/

References:

[1]. Aditya Bhattacharya. 2024. Towards Directive Explanations: Crafting Explainable AI Systems for Actionable Human-AI Interactions. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '24). Association for Computing Machinery, New York, NY, USA, Article 442, 1–6. https://doi.org/10.1145/3613905.3638177

[2]. Aditya Bhattacharya. 2024. ACM CHI DC Presentation and Poster for Towards Directive Explanations: Crafting Explainable AI Systems for Actionable Human-AI Interactions. Zenodo. https://zenodo.org/records/11086580

