Sector News

Why businesses need explainable AI—and how to deliver it

October 8, 2022
Sustainability

Businesses increasingly rely on artificial intelligence (AI) systems to make decisions that can significantly affect individual rights, human safety, and critical business operations. But how do these models derive their conclusions? What data do they use? And can we trust the results?

Addressing these questions is the essence of “explainability,” and getting it right is becoming essential. While many companies have begun adopting basic tools to understand how and why AI models render their insights, unlocking the full value of AI requires a comprehensive strategy. Our research finds that companies seeing the biggest bottom-line returns from AI—those that attribute at least 20 percent of EBIT to their use of AI—are more likely than others to follow best practices that enable explainability. Further, organizations that establish digital trust among consumers through practices such as making AI explainable are more likely to see their annual revenue and EBIT grow at rates of 10 percent or more.
Even as explainability gains importance, it is becoming significantly harder. Modeling techniques that today power many AI applications, such as deep learning and neural networks, are inherently more difficult for humans to understand. For all the predictive insights AI can deliver, advanced machine learning engines often remain a black box. The solution isn’t simply finding better ways to convey how a system works; rather, it’s about creating tools and processes that can help even the deep expert understand the outcome and then explain it to others.
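
One widely used tool of this kind is a global surrogate: a simple, inherently interpretable model trained to mimic the black box's predictions so that experts can inspect an approximation of its logic. The sketch below is a minimal illustration, assuming Python with scikit-learn; the neural network, data, and tree depth are placeholders rather than a reference to any specific system discussed here.

```python
# Minimal sketch: approximating a black-box model with an interpretable
# "global surrogate" (assumes scikit-learn; data and models are illustrative).
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.metrics import accuracy_score

# Stand-in for an opaque production model (e.g., a neural network).
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
black_box = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                          random_state=0).fit(X, y)

# Train a shallow decision tree to imitate the black box's *predictions*,
# not the ground-truth labels: the tree explains the model, not the world.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# "Fidelity" is how often the surrogate agrees with the black box. A low
# score means the simple explanation does not faithfully describe the model.
fidelity = accuracy_score(black_box.predict(X), surrogate.predict(X))
print(f"surrogate fidelity: {fidelity:.2f}")
print(export_text(surrogate, feature_names=[f"feature_{i}" for i in range(8)]))
```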

To shed light on these systems and meet the needs of customers, employees, and regulators, organizations need to master the fundamentals of explainability. Gaining that mastery requires establishing a governance framework, putting in place the right practices, and investing in the right set of tools.

What makes explainability challenging
Explainability is the capacity to express why an AI system reached a particular decision, recommendation, or prediction. Developing this capability requires understanding how the AI model operates and the types of data used to train it. That sounds simple enough, but the more sophisticated an AI system becomes, the harder it is to pinpoint exactly how it derived a particular insight. AI engines get “smarter” over time by continually ingesting data, gauging the predictive power of different algorithmic combinations, and updating the resulting model. They do all this at blazing speeds, sometimes delivering outputs within fractions of a second.
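
In practice, auditing how a model operates and which data drive it often begins with measuring how much each input actually contributes to predictions. The following is a minimal, model-agnostic sketch using permutation importance, assuming scikit-learn; the dataset and model stand in for a real pipeline.

```python
# Minimal sketch: model-agnostic permutation importance (assumes scikit-learn;
# dataset and model are placeholders for a real pipeline).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle one feature at a time on held-out data and record how much the
# score drops: features whose shuffling hurts most matter most to the model.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)
ranked = sorted(zip(result.importances_mean, data.feature_names), reverse=True)
for score, name in ranked[:5]:
    print(f"{name}: {score:.3f}")
```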

Disentangling a first-order insight and explaining how the AI went from A to B might be relatively easy. But as AI engines interpolate and reinterpolate data, the insight audit trail becomes harder to follow.

Complicating matters, different consumers of the AI system’s data have different explainability needs. A bank that uses an AI engine to support credit decisions will need to provide consumers who are denied a loan with a reason for that outcome. Loan officers and AI practitioners might need even more granular information to help them understand the risk factors and weightings used in rendering the decision to ensure the model is tuned optimally. And the risk function or diversity office may need to confirm that the data used in the AI engine are not biased against certain applicants. Regulators and other stakeholders also will have specific needs and interests.
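
To make the credit example concrete: with a linear scoring model, the per-applicant risk factors and weightings can be read directly off the model's coefficients. The sketch below shows one illustrative way to generate reason codes for a denied applicant, assuming scikit-learn; the feature names and data are invented for illustration.

```python
# Minimal sketch: per-applicant "reason codes" from a linear credit model.
# Assumes scikit-learn; features and data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

features = ["income", "debt_ratio", "late_payments", "credit_history_years"]
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                  # stand-in applicant data
y = (X @ np.array([1.0, -1.5, -2.0, 0.8]) + rng.normal(size=500)) > 0

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

def reason_codes(applicant):
    """Per-feature contribution to the log-odds of approval for one applicant.

    For a linear model, coefficient * (scaled feature value) is an exact
    decomposition of the score, so the weightings are directly auditable.
    """
    z = scaler.transform(applicant.reshape(1, -1))[0]
    contributions = model.coef_[0] * z
    order = np.argsort(contributions)          # most negative reasons first
    return [(features[i], contributions[i]) for i in order]

denied_applicant = X[0]
for name, c in reason_codes(denied_applicant)[:2]:
    print(f"{name}: {c:+.2f} to approval log-odds")
```

More complex models cannot be decomposed this directly and typically need post-hoc attribution methods, such as Shapley values, to produce comparable per-decision breakdowns.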

By Liz Grennan, Andreas Kremer, Alex Singla, and Peter Zipparo

Source: mckinsey.com


Related News

February 4, 2023

Developing talent strategies for the energy transition

Sustainability

Navigating the energy transition will be a generational challenge, requiring top-tier talent to solve incredibly complex problems. Meeting this challenge will require retaining and reskilling today’s workers, while integrating new people with varied backgrounds and capabilities.

January 29, 2023

How playgrounds are becoming a secret weapon in the fight against climate change

Sustainability

Schoolyards can do more than absorb rainwater and cool neighborhoods. They can also help close the park equity gap nationwide: One hundred million Americans, including 28 million kids, do not live within a 10-minute walk of a park or green space. Communities of color and low-income neighborhoods have even less access to green spaces.

January 22, 2023

BCG-WEF Project: The Net-Zero Challenge

Sustainability

The race to net-zero emissions will forever change the way many companies do business. The immediacy, pace, and extent of change are still widely underestimated, and early movers can seize significant advantage. In this report, coauthored with the WEF Alliance of CEO Climate Leaders, the authors explore how other companies can take a similar path by identifying, creating, and scaling green businesses.
