
Conference Paper/Proceeding/Abstract

Maintaining Coherence in Explainable AI: Strategies for Consistency Across Time and Interaction

Alan Dix, Tommaso Turchi, Ben Wilson, Alessio Malizia, Anna Monreale, Matt Roach

SYNERGY 2025 – Designing and Building Hybrid Human–AI Systems

Swansea University Authors: Ben Wilson, Matt Roach


Published in: SYNERGY 2025 – Designing and Building Hybrid Human–AI Systems
Published: CEUR-WS.org 2025
Online Access: https://ceur-ws.org/Vol-4074/short9-5.pdf
URI: https://cronfa.swan.ac.uk/Record/cronfa71398
Abstract: Can we create explanations of artificial intelligence and machine learning that have some level of consistency over time as we might expect of a human explanation? This paper explores this issue, and offers several strategies for either maintaining a level of consistency or highlighting when and why past explanations might appear inconsistent with current decisions.
Keywords: human-AI interaction, explainable AI, synergistic human-AI systems, user interface, artificial intelligence, design, adaptive interfaces, user experience
College: Faculty of Science and Engineering
Funders: Tango Project