EXPLAINABLE AI (XAI) FOR DETECTING GREENWASHING: A HYBRID NLP-GOVERNANCE MODEL FOR TRANSPARENT ESG REPORTING
Sayali Girish Patankar
🤖 gxceed AI Summary
This study proposes a hybrid governance framework combining Natural Language Processing (NLP) and Explainable AI (XAI) to detect greenwashing in ESG reports. Unlike black-box AI, it provides clear reasoning for each detection. Based on a survey of 101 stakeholders, it identifies sectors where trust has declined most and outlines a technological roadmap for improving transparency.
Unofficial AI-generated summary based on the public title and abstract. Not an official translation.
📝 gxceed Editorial Commentary — Why this matters
In the Japanese GX context
As adoption of the SSBJ standards advances in Japan, ensuring the credibility of ESG reporting is an urgent task. By automatically detecting greenwashing and presenting the findings in an explainable form, this model strengthens audit processes for investors and regulators and contributes to greater transparency in Japan's disclosure infrastructure.
In the global GX context
Globally, greenwashing undermines trust in ESG disclosures amid rising regulatory scrutiny (e.g., CSRD, SEC climate rules). This XAI-driven model offers a transparent auditing tool that can enhance the credibility of corporate sustainability reporting and support stakeholder verification.
👥 Implications for different readers
🔬 Researchers: Provides a novel hybrid NLP-XAI framework for greenwashing detection, with empirical insights from stakeholder surveys.
🏢 Practitioners: Offers a practical tool to audit ESG reports automatically while generating explainable outputs, improving internal and external transparency.
🏛 Policymakers: Highlights the need for transparent AI in regulatory oversight of ESG claims, supporting rulemaking on greenwashing prevention.
📄 Abstract (original)
In the current phase of technological development, often described as Industry 5.0, the direction of innovation is no longer limited to automation alone. Earlier industrial transitions primarily emphasized efficiency, digital connectivity, and the integration of systems such as the Internet of Things (IoT). However, the emerging Industry 5.0 perspective introduces a stronger emphasis on the relationship between technology, human wellbeing, and environmental sustainability. In simple terms, technology is now expected to support social goals rather than operate purely as an efficiency tool. Because of this shift, the idea of corporate accountability has become increasingly important. Organizations are now expected to demonstrate responsible practices not only in terms of profit generation but also in terms of environmental and social impact. As a result, many corporations publish Environmental, Social, and Governance (ESG) reports. These disclosures are intended to show investors, regulators, and consumers how the company manages sustainability-related responsibilities. The idea of Explainable Artificial Intelligence has gained attention during the last few years as organizations attempt to improve transparency in machine learning systems. Scholars working in the field of governance technology have pointed out that interpretability becomes extremely important when artificial intelligence is used in decision-making environments that affect public trust. If an AI system produces results without explanation, users may struggle to understand or verify its conclusions. At the same time, increased pressure to appear sustainable has unintentionally created a new challenge known as greenwashing. Greenwashing occurs when organizations invest heavily in marketing themselves as environmentally responsible while making relatively small changes to their actual environmental performance. 
Instead of focusing on genuine sustainability improvements, companies may emphasize promotional communication designed to create a positive image. This phenomenon has broader consequences than simple misleading advertising. When companies exaggerate environmental achievements or selectively present sustainability data, they distort the global sustainability landscape. Investors may unknowingly support organizations that appear responsible but are not implementing meaningful environmental strategies. As a result, progress toward global sustainability objectives—such as those outlined by the United Nations Sustainable Development Goals—can slow down. A major cause of this problem lies in information asymmetry between corporations and stakeholders. ESG reports are often extremely long and filled with technical terminology. For many readers, verifying the claims contained in these documents is difficult. Even when artificial intelligence is used to analyze these reports, the algorithms themselves often operate as opaque systems. These systems may identify suspicious patterns but do not clearly explain how the conclusion was reached. Because of this situation, a second trust gap emerges. Stakeholders are told that artificial intelligence has verified the information contained in a report, yet the reasoning behind the verification remains hidden. This lack of clarity reduces the credibility of the auditing process. To address this issue, the present study proposes a hybrid governance framework that combines Natural Language Processing (NLP) with Explainable Artificial Intelligence (XAI). Instead of relying on hidden computational logic, the system is designed to present understandable reasoning for every detection it produces. By moving from opaque decision-making toward interpretable analysis, ESG auditing can become a more transparent and participatory process. 
The research presented in this paper includes findings from a survey involving 101 stakeholders drawn from diverse professional and academic backgrounds. Based on these insights, the study identifies sectors where public trust has declined most significantly and proposes a technological roadmap for improving transparency through explainable AI methods.
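The abstract describes a system that pairs NLP-based detection with human-readable reasoning for every flag it raises. As a loose illustration only (this is an invented sketch, not the paper's actual method, and the phrase lists below are made up), a minimal "explainable" detector can be written so that every score comes with the exact textual evidence behind it:

```python
import re

# Hypothetical phrase lists for this sketch; a real system would learn
# such signals from labeled ESG corpora rather than hard-code them.
VAGUE_CLAIMS = [
    r"\beco-friendly\b",
    r"\bgreen\b",
    r"\bsustainab\w+\b",
    r"\bcarbon[- ]neutral\b",
]
CONCRETE_SIGNALS = [
    r"\b\d+(\.\d+)?\s*%",     # quantified targets, e.g. "30%"
    r"\b(19|20)\d{2}\b",      # explicit years / deadlines
    r"\b(tco2e?|kwh|mwh)\b",  # measurable units
]

def detect_greenwashing_signals(text: str) -> dict:
    """Score a passage and return the matched phrases as the 'explanation'."""
    lowered = text.lower()
    vague = [m.group(0) for p in VAGUE_CLAIMS
             for m in re.finditer(p, lowered)]
    concrete = [m.group(0) for p in CONCRETE_SIGNALS
                for m in re.finditer(p, lowered)]
    # Heuristic: many vague claims with no measurable backing is a red flag.
    score = len(vague) / (len(vague) + len(concrete) + 1)
    return {"score": round(score, 2),
            "vague_terms": vague,
            "concrete_evidence": concrete}

result = detect_greenwashing_signals(
    "Our green, eco-friendly operations will be carbon-neutral.")
# Unlike a black-box classifier, the output lists the flagged phrases,
# so a stakeholder can verify why the passage was scored as it was.
```

The point of the sketch is the output shape, not the rules: whatever model sits underneath (here a keyword heuristic, in the paper presumably a trained NLP pipeline with XAI attribution), the verdict travels together with the evidence that produced it.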
🔗 Provenance — source where this record was found
- openalex https://doi.org/10.5281/zenodo.20003790 (first seen 2026-05-15 18:05:59)
gxceed is a research-support dataset based on public metadata. Summaries, translations, and commentary are generated with AI assistance; final interpretation and verification are expected to be carried out by users against the original source materials.