Sector News

Why businesses need explainable AI—and how to deliver it

October 8, 2022
Sustainability

Businesses increasingly rely on artificial intelligence (AI) systems to make decisions that can significantly affect individual rights, human safety, and critical business operations. But how do these models derive their conclusions? What data do they use? And can we trust the results?

Addressing these questions is the essence of “explainability,” and getting it right is becoming essential. While many companies have begun adopting basic tools to understand how and why AI models render their insights, unlocking the full value of AI requires a comprehensive strategy. Our research finds that companies seeing the biggest bottom-line returns from AI—those that attribute at least 20 percent of EBIT to their use of AI—are more likely than others to follow best practices that enable explainability.1 Further, organizations that establish digital trust among consumers through practices such as making AI explainable are more likely to see their annual revenue and EBIT grow at rates of 10 percent or more.2
Even as explainability gains importance, it is becoming significantly harder. Modeling techniques that today power many AI applications, such as deep learning and neural networks, are inherently more difficult for humans to understand. For all the predictive insights AI can deliver, advanced machine learning engines often remain a black box. The solution isn’t simply finding better ways to convey how a system works; rather, it’s about creating tools and processes that can help even deep experts understand the outcome and then explain it to others.

To shed light on these systems and meet the needs of customers, employees, and regulators, organizations need to master the fundamentals of explainability. Gaining that mastery requires establishing a governance framework, putting in place the right practices, and investing in the right set of tools.

What makes explainability challenging
Explainability is the capacity to express why an AI system reached a particular decision, recommendation, or prediction. Developing this capability requires understanding how the AI model operates and the types of data used to train it. That sounds simple enough, but the more sophisticated an AI system becomes, the harder it is to pinpoint exactly how it derived a particular insight. AI engines get “smarter” over time by continually ingesting data, gauging the predictive power of different algorithmic combinations, and updating the resulting model. They do all this at blazing speeds, sometimes delivering outputs within fractions of a second.

Disentangling a first-order insight and explaining how the AI went from A to B might be relatively easy. But as AI engines interpolate and reinterpolate data, the insight audit trail becomes harder to follow.

Complicating matters, different consumers of the AI system’s data have different explainability needs. A bank that uses an AI engine to support credit decisions will need to provide consumers who are denied a loan with a reason for that outcome. Loan officers and AI practitioners might need even more granular information to help them understand the risk factors and weightings used in rendering the decision to ensure the model is tuned optimally. And the risk function or diversity office may need to confirm that the data used in the AI engine are not biased against certain applicants. Regulators and other stakeholders also will have specific needs and interests.
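
To make the lending example concrete, the sketch below shows one common way a team might surface which factors drive a credit model's decisions. It uses permutation importance from scikit-learn on synthetic applicant data; the feature names, the data, and the choice of tool are illustrative assumptions, not details from the article.

# Minimal sketch of model-level explainability for a credit decision model.
# Feature names and data are illustrative; the article does not name a tool,
# so this uses permutation importance from scikit-learn as one common approach.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
feature_names = ["income", "debt_ratio", "credit_history_len", "num_open_accounts"]

# Synthetic applicant data: approval depends mostly on income and debt ratio.
X = rng.normal(size=(2000, len(feature_names)))
y = (X[:, 0] - 1.5 * X[:, 1] + 0.2 * rng.normal(size=2000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: how much does shuffling each feature hurt accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, mean_imp in sorted(zip(feature_names, result.importances_mean),
                             key=lambda t: -t[1]):
    print(f"{name:>22s}: {mean_imp:.3f}")

In practice, a global view like this is often paired with per-decision attributions (for example, SHAP values) so that a loan officer can explain an individual outcome, not just the model's overall behavior.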

By Liz Grennan, Andreas Kremer, Alex Singla, and Peter Zipparo

Source: mckinsey.com

Related News

July 21, 2024

Capgemini: is hydrogen fuel the key to sustainable aviation?

Sustainability

Sustainable Aviation Fuels (SAFs) are a hot talking point because they can be used without modifying the aircraft currently in service. However, they are not a net-zero option. Aviation accounts for 2.5% of global CO2 emissions, and Capgemini and Chalmers University of Technology are exploring whether hydrogen fuel could be the solution.

July 14, 2024

A greener future: What is the industry doing to meet its sustainability targets?

Sustainability

As the global spotlight on environmental sustainability intensifies, the food and beverage industry is committed to adopting greener manufacturing practices. Across all facets, from production to packaging, the industry is pursuing the integration of sustainable energy sources, affirming a steadfast commitment to a responsible and eco-friendly future. FoodBev explores.

July 7, 2024

When people thrive, business thrives: The case for human sustainability

Sustainability

Human connections drive everything of value to an organization, including revenue, innovation and intellectual property, efficiency, brand relevance, productivity, retention, adaptability, and risk. Yet organizations’ current efforts to prioritize these connections are falling short—in part because many organizations are stuck in a legacy mindset that centers on extracting value from people rather than working with them.
