
Book, English, 420 pages, format (W × H): 155 mm × 235 mm, weight: 663 g

Series: Communications in Computer and Information Science

Guidotti / Schmid / Longo

Explainable Artificial Intelligence

Third World Conference, xAI 2025, Istanbul, Turkey, July 9-11, 2025, Proceedings, Part IV
Publication year: 2025
ISBN: 978-3-032-08329-6
Publisher: Springer


This open access five-volume set constitutes the refereed proceedings of the Third World Conference on Explainable Artificial Intelligence, xAI 2025, held in Istanbul, Turkey, during July 9-11, 2025.


The 96 revised full papers presented in these proceedings were carefully reviewed and selected from 224 submissions. The papers are organized in the following topical sections:

Volume I:

Concept-based explainable AI; human-centered explainability; explainability, privacy, and fairness in trustworthy AI; and XAI in healthcare.

Volume II:

Rule-based XAI systems & actionable explainable AI; feature importance-based XAI; novel post-hoc & ante-hoc XAI approaches; and XAI for scientific discovery.

Volume III:

Generative AI meets explainable AI; intrinsically interpretable explainable AI; benchmarking and XAI evaluation measures; and XAI for representational alignment.

Volume IV:

XAI in computer vision; counterfactuals in XAI; explainable sequential decision making; and explainable AI in finance & legal frameworks for XAI technologies.

Volume V:

Applications of XAI; human-centered XAI & argumentation; explainable and interactive hybrid decision making; and uncertainty in explainable AI.


Target audience


Research

Further information & material


XAI in Computer Vision:
- Comparing XAI Explanations and Synthetic Data Augmentation Strategies in Neuroimaging AI
- Superpixel Correlation for Explainable Image Classification
- On Background Bias of Post-Hoc Concept Embeddings in Computer Vision DNNs
- Explaining Vision GNNs: A Semantic and Visual Analysis of Graph-based Image Classification

Counterfactuals in XAI:
- HalCECE: A Framework for Explainable Hallucination Detection through Conceptual Counterfactuals in Image Captioning
- Diffusion Counterfactuals for Image Regressors
- Mitigating Text Toxicity with Counterfactual Generation
- Guiding LLMs to Generate High-Fidelity and High-Quality Counterfactual Explanations for Text Classification
- Exploring Ensemble Strategies for Graph Counterfactual Explanations

Explainable Sequential Decision Making:
- Leveraging XAI Techniques for Context-Aware Energy Consumption Forecasting
- ConformaSegment: A Conformal Prediction-Based, Uncertainty-Aware, and Model-Agnostic Explainability Framework for Time-Series Forecasting
- FLEXtime: Filterbank Learning to Explain Time Series
- From Text to Space: Mapping Abstract Spatial Models in LLMs during a Grid-World Navigation Task
- Class-Dependent Perturbation Effects in Evaluating Time Series Attributions

Explainable AI in Finance & Legal Frameworks for XAI Technologies:
- XAI In Fraud Detection: A Causal Perspective
- Detecting Fraud in Financial Networks: A Semi-Supervised GNN Approach with Granger-Causal Explanations
- Legal Requirements, Trust Issues and Engineering Challenges - a Multi-Disciplinary Case for User-Specific Explainability
- Explainable Fairness in Mortgage Lending
- Cyber Risk Management with Time Varying Artificial Intelligence Models


