Campbellsville University, USA.
World Journal of Advanced Research and Reviews, 2025, 26(01), 1294-1301
Article DOI: 10.30574/wjarr.2025.26.1.1160
Received on 01 March 2025; revised on 07 April 2025; accepted on 10 April 2025
The integration of artificial intelligence in financial systems has revolutionized decision-making processes, particularly in credit scoring and risk assessment. However, this technological advancement raises crucial questions about transparency and accountability. This article examines how Explainable AI (XAI) addresses these concerns by providing interpretable insights into algorithmic decisions while maintaining model performance. Through analysis of implementation frameworks, regulatory requirements, and case studies, it demonstrates how financial institutions are balancing the need for sophisticated AI systems with demands for transparency. The article explores both model-specific and model-agnostic techniques, evaluating their effectiveness across financial applications and considering the challenges of implementation and compliance. Finally, it examines the evolution of regulatory frameworks across jurisdictions and their impact on XAI adoption, offering insights into future directions for both technical innovation and regulatory standardization.
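As a minimal, hedged illustration of the model-agnostic techniques the abstract refers to, the sketch below applies SHAP values to a credit-scoring classifier to attribute an individual decision to input features. The feature names, synthetic data, and model choice are illustrative assumptions, not taken from the article itself.

```python
# Illustrative sketch: model-agnostic explanation of a credit-scoring model
# using SHAP. Feature names and data are synthetic and hypothetical.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
feature_names = ["income", "debt_ratio", "credit_history_len", "num_delinquencies"]

# Synthetic applicant data; the default label is loosely tied to debt ratio
# and delinquencies purely for demonstration purposes.
X = rng.normal(size=(500, 4))
y = (X[:, 1] + X[:, 3] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = GradientBoostingClassifier().fit(X, y)

# TreeExplainer computes SHAP values: each value is a feature's contribution
# to this applicant's predicted score relative to the average prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:1])

for name, value in zip(feature_names, shap_values[0]):
    print(f"{name}: {value:+.3f}")
```

A per-applicant attribution of this kind is the sort of interpretable output that can accompany a credit decision without altering the underlying model.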
Explainable Artificial Intelligence; Financial Decision-Making; Regulatory Compliance; Model Transparency; Banking Technology
Rajeev Reddy Chevuri. The role of explainable AI in promoting transparency in financial decision-making. World Journal of Advanced Research and Reviews, 2025, 26(01), 1294-1301. Article DOI: https://doi.org/10.30574/wjarr.2025.26.1.1160.
Copyright © 2025 Author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.