Eliciting Explainability Requirements for Safety-Critical Systems: A Nuclear Case Study (Experience Paper)
This program is tentative and subject to change.
[Context & Motivation] Explainable autonomous systems are increasingly essential for engendering trust, especially when they are deployed in safety-critical scenarios. [Question/Problem] Despite the robust reliability needed in critical settings, there remains a gap between Explainable AI and Requirements Engineering (RE), raising the question: can current RE techniques sufficiently elicit explainability requirements, and what characteristics do these requirements have? [Principal Ideas/Results] We examine whether established RE techniques can be used to elicit explainability requirements and analyse the characteristics of such requirements. We answer these questions in the context of a nuclear robotics case study focused on navigation and task scheduling missions. [Contribution] We contribute: (1) an experience report of eliciting explainability requirements, (2) categories of explainability requirements for explainable autonomous robotic systems, and (3) practical guidance for applying our approach in other safety-critical domains.
Thu 10 Apr (displayed time zone: Brussels, Copenhagen, Madrid, Paris)
14:00 - 15:30 | Research Track - Session R9: RE for Safety-Critical and Autonomous Systems | Research Track at C2 - Sala Actes

14:00 (30m, Talk) | Extending Behavior Trees for Robotic Missions with Quality Requirements (Technical Paper, Research Track) | Razan Ghzouli (Chalmers University of Technology & University of Gothenburg), Rebekka Wohlrab (Chalmers University of Technology), Jennifer Horkoff (Chalmers and the University of Gothenburg)

14:30 (30m, Talk) | Sharper Specs for Smarter Drones: Formalising Requirements with FRET (Experience Paper, Research Track) | Oisin Sheridan (Maynooth University), Leandro Buss Becker (Automation and Systems Department, Federal University of Santa Catarina, Florianópolis, Brazil), Marie Farrell (The University of Manchester), Matt Luckcuck (University of Nottingham, UK), Rosemary Monahan

15:00 (30m, Talk) | Eliciting Explainability Requirements for Safety-Critical Systems: A Nuclear Case Study (Experience Paper, Research Track) | Hazel Taylor (The University of Manchester), Matt Luckcuck (University of Nottingham, UK), Marie Farrell (The University of Manchester), Caroline Jay (Department of Computer Science, University of Manchester, M13 9PL, United Kingdom), Angelo Cangelosi (The University of Manchester), Louise Dennis (University of Manchester)