Please use this identifier to cite or link to this item:
http://hdl.handle.net/1893/36283
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhou, Ryan | en_UK |
dc.contributor.author | Bacardit, Jaume | en_UK |
dc.contributor.author | Brownlee, Alexander | en_UK |
dc.contributor.author | Cagnoni, Stefano | en_UK |
dc.contributor.author | Fyvie, Martin | en_UK |
dc.contributor.author | Iacca, Giovanni | en_UK |
dc.contributor.author | McCall, John | en_UK |
dc.contributor.author | van Stein, Niki | en_UK |
dc.contributor.author | Walker, David | en_UK |
dc.contributor.author | Hu, Ting | en_UK |
dc.date.accessioned | 2024-10-08T00:01:25Z | - |
dc.date.available | 2024-10-08T00:01:25Z | - |
dc.identifier.uri | http://hdl.handle.net/1893/36283 | - |
dc.description.abstract | AI methods are finding an increasing number of applications, but their often black-box nature has raised concerns about accountability and trust. The field of explainable artificial intelligence (XAI) has emerged in response to the need for human understanding of AI models. Evolutionary computation (EC), as a family of powerful optimization and learning tools, has significant potential to contribute to XAI. In this paper, we provide an introduction to XAI and review various techniques in current use for explaining machine learning (ML) models. We then focus on how EC can be used in XAI, and review some XAI approaches which incorporate EC techniques. Additionally, we discuss the application of XAI principles within EC itself, examining how these principles can shed some light on the behavior and outcomes of EC algorithms in general, on the (automatic) configuration of these algorithms, and on the underlying problem landscapes that these algorithms optimize. Finally, we discuss some open challenges in XAI and opportunities for future research in this field using EC. Our aim is to demonstrate that EC is well-suited for addressing current problems in explainability and to encourage further exploration of these methods to contribute to the development of more transparent and trustworthy ML models and EC algorithms. | en_UK |
dc.language.iso | en | en_UK |
dc.publisher | Institute of Electrical and Electronics Engineers | en_UK |
dc.relation | Zhou R, Bacardit J, Brownlee A, Cagnoni S, Fyvie M, Iacca G, McCall J, van Stein N, Walker D & Hu T (2024) Evolutionary Computation and Explainable AI: A Roadmap to Transparent Intelligent Systems. *IEEE Transactions on Evolutionary Computation*. | en_UK |
dc.rights | © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_UK |
dc.subject | Explainability | en_UK |
dc.subject | Interpretability | en_UK |
dc.subject | Evolutionary Computation | en_UK |
dc.subject | Machine Learning | en_UK |
dc.title | Evolutionary Computation and Explainable AI: A Roadmap to Transparent Intelligent Systems | en_UK |
dc.type | Journal Article | en_UK |
dc.citation.jtitle | IEEE Transactions on Evolutionary Computation | en_UK |
dc.citation.issn | 1941-0026 | en_UK |
dc.citation.issn | 1089-778X | en_UK |
dc.citation.peerreviewed | Refereed | en_UK |
dc.type.status | AM - Accepted Manuscript | en_UK |
dc.author.email | alexander.brownlee@stir.ac.uk | en_UK |
dc.description.notes | Output Status: Forthcoming | en_UK |
dc.contributor.affiliation | Queen's University, Ontario | en_UK |
dc.contributor.affiliation | Newcastle University | en_UK |
dc.contributor.affiliation | Computing Science and Mathematics - Division | en_UK |
dc.contributor.affiliation | University of Parma | en_UK |
dc.contributor.affiliation | Robert Gordon University | en_UK |
dc.contributor.affiliation | Trento University | en_UK |
dc.contributor.affiliation | Robert Gordon University | en_UK |
dc.contributor.affiliation | Leiden University | en_UK |
dc.contributor.affiliation | University of Exeter | en_UK |
dc.contributor.affiliation | Queen's University, Ontario | en_UK |
dc.identifier.wtid | 2051482 | en_UK |
dc.contributor.orcid | 0000-0003-2892-5059 | en_UK |
dc.date.accepted | 2024-09-29 | en_UK |
dcterms.dateAccepted | 2024-09-29 | en_UK |
dc.date.filedepositdate | 2024-09-30 | en_UK |
rioxxterms.apc | not required | en_UK |
rioxxterms.type | Journal Article/Review | en_UK |
rioxxterms.version | AM | en_UK |
local.rioxx.author | Zhou, Ryan| | en_UK |
local.rioxx.author | Bacardit, Jaume| | en_UK |
local.rioxx.author | Brownlee, Alexander|0000-0003-2892-5059 | en_UK |
local.rioxx.author | Cagnoni, Stefano| | en_UK |
local.rioxx.author | Fyvie, Martin| | en_UK |
local.rioxx.author | Iacca, Giovanni| | en_UK |
local.rioxx.author | McCall, John| | en_UK |
local.rioxx.author | van Stein, Niki| | en_UK |
local.rioxx.author | Walker, David| | en_UK |
local.rioxx.author | Hu, Ting| | en_UK |
local.rioxx.project | Internal Project|University of Stirling|https://isni.org/isni/0000000122484331 | en_UK |
local.rioxx.freetoreaddate | 2024-10-07 | en_UK |
local.rioxx.licence | http://www.rioxx.net/licenses/all-rights-reserved|2024-10-07| | en_UK |
local.rioxx.filename | ECXAI_Review__IEEE_Format_R1.pdf | en_UK |
local.rioxx.filecount | 1 | en_UK |
local.rioxx.source | 1941-0026 | en_UK |
Appears in Collections: Computing Science and Mathematics Journal Articles
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
ECXAI_Review__IEEE_Format_R1.pdf | Fulltext - Accepted Version | 1.01 MB | Adobe PDF |
This item is protected by original copyright.
Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
The metadata of the records in the Repository are available under the CC0 public domain dedication (No Rights Reserved): https://creativecommons.org/publicdomain/zero/1.0/