It’s a cold February night in London. Academics from the London School of Economics prepare to address a packed audience in a public event on the future of work.
People of all ages and backgrounds are in attendance, but young professionals and students dominate. The question of how work is changing is not a topic exclusive to the senior HR managers of this world. It is a real, modern and tangible issue relevant to everyone.
But how exactly is work changing, and just what kind of role is artificial intelligence (AI) playing?
It’s true that AI could significantly transform the way we work. But I’ve found that AI is a hijacked term, and is notably absent from real workplaces.
In the research for our new book, entitled Robotic Process and Cognitive Automation: The Next Phase, Professor Mary Lacity of the University of Missouri-St Louis and I have found there is an exaggeration surrounding the application of AI and cognitive automation to work.
Building on two previous studies, we developed our new findings from 65 case studies in a range of organisations which are using automation. Over 600 automation providers, clients and analysts were also surveyed as part of our study.
It’s true – automation at work is evident. The focus for now is robotic process automation (RPA), which is being used to perform tasks previously completed by humans using rules to process structured data. This area is accelerating, maturing and scaling in global enterprises.
But despite the existing notion that our workforce is being run by robots, who will soon cast humans aside, we question whether AI applications that get computers to replicate how the mind works are anywhere near ready for major workplace deployment. In fact, we suggest they are unlikely to be so for decades.
In relation to RPA, our previous studies have shown high levels of adoption: the technology is now cheap and adaptable enough to be implemented in most organisations. By taking over routine tasks such as data entry, which can be monotonous and unrewarding for humans, companies can benefit from significant cost savings, faster processing and fewer errors, while simultaneously freeing up human workers to take on more interesting, rewarding roles, generating greater employee satisfaction.
More importantly, such technologies are not yet advanced enough to operate without human intellect to guide them, meaning humans will stay at the helm of our industries for many years to come.
Moreover, the impact that AI may have on future net job losses is also much overstated. In reality, the deployment of these technologies will take a lot longer than many assume. As advanced as IBM's AI system Watson or Google's DeepMind may be, these technologies currently exist as one-of-a-kind inventions, and are a long way off being deployed into the mainstream.
There are also a range of factors which may positively impact work opportunities. Shortfalls in productivity, dramatic increases in the amount of work we have to do, and skills shortages – amongst other factors, like job creation – are all reasons why we shouldn’t be too worried about the robotic revolution.
Certainly our work suggests every person’s job is likely to be changed by at least 25% over the next decade, as technology increasingly permeates task performance. Therefore, business leaders need to become much more willing to invest in the development of their human workforces, helping to build the skills needed to work in the digital businesses that most organisations will increasingly become. The past record on this is, unfortunately, not good, so it is up to individuals to identify the relevant skills bases and make their own investments if they are to remain employable.
Despite what you may have heard, the game is not over for us yet, but business leaders and individuals alike must act now to ensure they’re able to stay afloat in our unavoidably digital future.
By Leslie Willcocks, Professor of Work, Technology and Globalisation in the Department of Management at the London School of Economics and Political Science
Source: HuffPost UK