Sector News

AI can help address inequity — if companies earn users’ trust

September 19, 2021
Diversity & Inclusion

Bullish predictions suggest that artificial intelligence (AI) could contribute up to $15.7 trillion to the global economy by 2030. From autonomous cars to faster mortgage approvals and automated advertising decisions, AI algorithms promise numerous benefits for businesses and their customers.

Unfortunately, these benefits may not be enjoyed equally. Algorithmic bias — when algorithms produce discriminatory outcomes against certain categories of individuals, typically minorities and women — may also worsen existing social inequalities, particularly when it comes to race and gender. From the recidivism prediction algorithm used in courts to the medical care prediction algorithm used by hospitals, studies have found evidence of algorithmic biases that make racial disparities worse for those impacted, not better.

Many firms have put considerable effort into combating algorithmic bias in their management and services. They often use data-science-driven approaches to probe an algorithm’s predictions before launching it into the world. These can include examining different AI model specifications, specifying the objective function the model should minimize, selecting the input data fed into the model, pre-processing that data, and post-processing the model’s predictions.
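To make one of these steps concrete, the sketch below illustrates a common post-processing idea: equalizing the rate of favorable predictions across demographic groups by choosing a separate score threshold for each group. This is a generic illustration, not the method used by any firm named in this article; the group labels, scores, and target rate are all hypothetical.

```python
def group_thresholds(scores_by_group, target_rate):
    """For each group, pick the score cutoff whose positive rate is
    closest to target_rate (a demographic-parity-style adjustment)."""
    thresholds = {}
    for group, scores in scores_by_group.items():
        ordered = sorted(scores, reverse=True)
        # Number of favorable decisions needed to hit the target rate,
        # clamped to at least one and at most all members of the group.
        k = round(target_rate * len(ordered))
        k = max(1, min(k, len(ordered)))
        thresholds[group] = ordered[k - 1]
    return thresholds

# Hypothetical model scores for two applicant groups.
scores = {
    "group_a": [0.91, 0.80, 0.72, 0.55, 0.40],
    "group_b": [0.70, 0.62, 0.48, 0.35, 0.20],
}
cutoffs = group_thresholds(scores, target_rate=0.4)
for group, cutoff in cutoffs.items():
    approved = [s for s in scores[group] if s >= cutoff]
    print(group, cutoff, len(approved) / len(scores[group]))
```

With these inputs, both groups end up with the same 40% approval rate even though their score distributions differ, which is the essence of this post-processing family of fixes; real deployments must also weigh accuracy and other fairness criteria against such adjustments.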

However, the final outcome of deploying an algorithm depends not only on the algorithm’s predictions but also on how businesses and customers ultimately use it, and this critical context of receptivity and adoption is often overlooked. We argue that algorithm deployment must consider the market conditions under which the algorithm is used. Such market conditions may determine whom and what the algorithm’s decisions affect, and to what extent, and hence shape the benefits users actually realize from the algorithm.

For example, to help its hosts maximize their income (i.e., property revenue), Airbnb launched an AI algorithm-based smart pricing tool that automatically adjusts a listing’s daily price. Airbnb hosts have very limited information on competing Airbnb properties, hotel rates, seasonality, and various other demand shocks that they could use to price their properties correctly. The smart pricing algorithm was meant to help with this, incorporating relevant information on host, property, and neighborhood characteristics from the company’s enormous information sources to determine the best price for a property. In our recently published study, we found that the average daily revenue of hosts who adopted smart pricing increased by 8.6%. Nevertheless, after the launch of the algorithm, the racial revenue gap increased (i.e., white hosts earned more) at the population level, which includes both adopters and non-adopters, because Black hosts were significantly less likely to adopt the algorithm than white hosts were.

by Shunyuan Zhang, Kannan Srinivasan, Param Vir Singh, and Nitin Mehta

Source: hbr.org


Related News

January 23, 2022

Supporting neurodiverse talent in the workplace

Diversity & Inclusion

Neurodiversity is a fast-growing category of organizational diversity and inclusion that employers and managers need to be aware of in order to embrace and maximize the talents of people who think differently. Sam Bevan, director of emerging at Snapchat, joins Stephen Frost, CEO of Included, to discuss inclusion of neurodiverse employees at work.

January 15, 2022

Why femtech is becoming increasingly prominent

Diversity & Inclusion

All signs point to an uptick in “femtech,” the all-purpose term that is applied to technology dealing with women’s health. More money is being invested in the sector, more enterprises are emerging, and there is, finally, a greater awareness of women’s healthcare needs.

January 9, 2022

How does your company support “First-Generation Professionals”?

Diversity & Inclusion

Existing research has shown that moving up the socioeconomic ladder is becoming more difficult, and class bias has been shown to impact lifetime earnings. Few studies have investigated the workplace experience of those from different socioeconomic backgrounds. To fill this knowledge gap, the authors conducted a study on first-generation professionals (FGPs). Here’s what they learned about FGPs and what company leaders can do to support them.
