Inside this Book
If you make use of this material, you may credit the authors as follows:
Holzinger Andreas et al. (Editors), "xxAI - Beyond Explainable AI", Springer Nature, 2022, DOI: 10.1007/978-3-031-04083-2, License: http://creativecommons.org/licenses/by/4.0/
This is an open access book. Statistical machine learning (ML) has triggered a renaissance of artificial intelligence (AI). While the most successful ML models, including deep neural networks (DNNs), have become ever more predictive, they have also become increasingly complex, at the expense of human interpretability (correlation vs. causality). The field of explainable AI (xAI) has emerged with the goal of creating tools and models that are both predictive and interpretable and understandable for humans.

Explainable AI is receiving considerable interest in the machine learning and AI research communities, across academia, industry, and government, and there is now an excellent opportunity to push towards successful explainable AI applications. This volume will help the research community accelerate this process, promote a more systematic use of explainable AI to improve models in diverse applications, and ultimately better understand how current explainable AI methods need to be improved and what kind of theory of explainable AI is needed.

After overviews of current methods and challenges, the editors include chapters that describe new developments in explainable AI. The contributions are from leading researchers in the field, drawn from both academia and industry, and many of the chapters take a clearly interdisciplinary approach to problem solving. The concepts discussed include explainability, causability, and AI interfaces with humans, and the applications include image processing, natural language, law, fairness, and climate science.
Computer Science, Informatics, Conference Proceedings, Research, Applications
Rights | License
Except where otherwise noted, this item has been published under the following license: Creative Commons Attribution 4.0 International (CC BY 4.0), http://creativecommons.org/licenses/by/4.0/