Over the past several years, technology has rapidly changed what enterprise analytics can do. Analytical approaches that incorporate predictive models have begun to displace merely descriptive approaches. Descriptive analytics, which continue to be valuable for many users, have evolved as well, making greater use of visual analytics and moving toward a self-service model in which nontechnical users can often develop their own analyses. In general, analytics are quickly becoming both easier to use and more powerful.
Despite this progress, it’s still difficult to use data and analytics to understand and predict many important phenomena in organizations. Predictive models require a substantial amount of historical data and a fair amount of expertise to build and use, which limits how and when they can be deployed. And while giving users of descriptive analytics more control is good, that shift often requires users to invest more time. Furthermore, existing analytics approaches, both descriptive and predictive, have historically been narrow, focused on particular functions or business units even though many business problems cut across them. This broader contextual information is rarely incorporated into analytics models: silos of systems and data make it hard to do, and data analysts are often unaware of, or unable to easily access, relevant data because of poor cataloging.
The good news is that a new generation of enterprise analytics is emerging, one that incorporates some degree of both automation and contextual information. The innovations rely on AI and automation, connections across existing information systems, and role-based assumptions about what decisions will be made on the basis of data and analytics. The result is insights and recommendations that can be delivered directly to decision makers without an analyst having to prepare them in advance.
Just as Google applications can tell you, on the basis of your home address, calendar entries, and map information, that it’s time to leave for the airport if you want to catch your flight, companies can increasingly take advantage of contextual information in their enterprise systems. Automation in analytics — often called “smart data discovery” or “augmented analytics” — is reducing the reliance on human expertise and judgment by automatically pointing out relationships and patterns in data. In some cases the systems even recommend what the user should do to address the situation identified in the automated analysis. Together these capabilities can transform how we analyze and consume data.
The Power of Context
Historically, data and analytics have been separate resources that needed to be combined to achieve value. If you wanted to analyze financial or HR or supply chain data, for example, you had to find the data — in a data warehouse, mart, or lake — and point your analytics tool to it.
This required extensive knowledge of what data was appropriate for your analysis and where it could be found, and many analysts lacked knowledge of the broader context. However, analytics and even AI applications can increasingly provide that context. Key vendors now regularly include these capabilities in their transactional system offerings, such as enterprise resource planning (ERP) and customer relationship management (CRM).
There are many examples of emerging context that involve assumptions about the decision to be made and the workflow for making it. In human resources, for example, the human capital management system can automatically optimize candidate selection by identifying the most suitable candidates for a particular job description (using natural language processing of resumes and matching of terms) and ranking them in order of fit. To do so, the application needs the contextual knowledge to connect background and skill information to job requirements. In supply chains, contextual analytics can draw on enterprise resource planning data, which companies use to optimize inventory levels, predict product fulfillment needs, and identify potential backlogs. The context there includes supply chain benchmarks, an understanding of the component stages of a business process, and knowledge of where process bottlenecks can occur.
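The term-matching step described above can be sketched in a few lines. This is a hypothetical illustration, not how any vendor's system actually works: it scores resumes against a job description by cosine similarity of word counts, whereas commercial human capital management systems use far richer NLP. All names and data are invented.

```python
# Hypothetical sketch: rank candidates by cosine similarity between
# term-count vectors of the job description and each resume.
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    """Lowercase, tokenize, and count terms."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_candidates(job_description: str, resumes: dict) -> list:
    """Return (name, score) pairs, best match first."""
    job_vec = vectorize(job_description)
    scores = {name: cosine(job_vec, vectorize(text))
              for name, text in resumes.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

job = "data analyst with SQL and Python experience in supply chain"
resumes = {
    "Avery": "five years as a data analyst, strong SQL and Python",
    "Blake": "marketing manager with social media experience",
}
ranking = rank_candidates(job, resumes)
print(ranking[0][0])  # the closer match ranks first
```

The contextual knowledge the article describes would live in the matching rules themselves, for example, linking skill terms to job requirements rather than relying on raw word overlap.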
Unlike previous analytics systems, these new tools typically don’t require the user to make the contextual links. The data needed to perform the analyses is automatically employed, and the user interface is consistent with that for other analytical activities. Somewhat more advanced contextual applications can automatically access and analyze data across functional areas, relating, for example, proposed hiring increases to the implications for budgets and financial performance.
The insurance giant AXA XL implemented a cloud-based human capital management system that provides a variety of contextual information to users. Without customization, it included human resource KPIs, best practice benchmarks, and the capability to monitor HR trends such as diversity and attrition levels. A new enterprise reporting tool with these capabilities was introduced throughout the company in only eight weeks.
In addition to automatically finding the right data and connecting it to provide context, the analytical process itself is increasingly being automated and supported by AI. Vendors and users are moving toward these augmented analytics, in which the analytics software automatically finds patterns in data and makes it possible to query and analyze data using natural language interfaces. In short, AI and traditional analytics are being combined to make data analysis easier and more effective.
The tools involved are predictive analytics and machine learning, natural language processing (NLP) and generation (NLG), and more-traditional technologies such as rule engines. Predictive analytics and machine learning can identify trends and anomalies in data and describe which variables or features are contributing most to outcomes. NLP-enhanced analytics make it possible to analyze data through simple spoken commands, and NLG capabilities provide automated summaries of analyses in natural language.
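Two of the capabilities named above, automated anomaly detection and natural-language summaries of findings, can be sketched minimally. The thresholds, template, and data here are invented for illustration; commercial augmented-analytics tools use far more sophisticated models than a simple z-score.

```python
# Minimal sketch: flag anomalies in a metric automatically, then
# summarize the finding in templated natural language (a toy stand-in
# for the NLG capability described above).
import statistics

def find_anomalies(values, threshold=2.0):
    """Return indices whose z-score exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values)
            if stdev and abs(v - mean) / stdev > threshold]

def summarize(metric, values, anomalies):
    """Template-based natural-language summary of the analysis."""
    if not anomalies:
        return f"No unusual values of {metric} were detected."
    points = ", ".join(f"period {i + 1} ({values[i]})" for i in anomalies)
    return f"{metric} shows unusual values at {points}."

monthly_returns = [102, 98, 101, 99, 100, 103, 97, 240, 101, 99]
flagged = find_anomalies(monthly_returns)
print(summarize("Product returns", monthly_returns, flagged))
# → Product returns shows unusual values at period 8 (240).
```

The point of such automation is that the user never writes the query: the system scans the data, surfaces the outlier, and explains it in plain language.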
These new capabilities remove barriers of expertise and time from the process of data preparation, insight discovery, and analysis and make it possible for “citizen data analysts” to create insights and take actions that improve their businesses. They also have positive implications for analytical and data science support organizations. Experts in analytical and AI software can focus on truly difficult problems and analyses and don’t have to spend their valuable time instructing or supporting less-expert analysts and business users. The expert-created models, along with those created by vendors, can also increasingly be incorporated into transactional systems through automated requests for data or analysis.
These augmented analytics are just being rolled out by leading vendors such as Oracle and Salesforce, but some companies have already undertaken successful experiments with them. One manufacturing company had a data-oriented culture but lacked professional data scientists. It had rolled out a discount pricing approach for customers that was based on the volume tiers of their purchases. A business analyst asked the augmented analytics system to review and find any trends in pricing data. It discovered that while relatively few customers were taking advantage of volume tier discounts, many were getting special discounts from salespeople. The company took steps to end the unauthorized discounts and gained hundreds of thousands of dollars in revenue.
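The manufacturer's pricing finding can be reconstructed as a simple check: compare each order's discount with the authorized volume-tier discount and total the revenue lost to off-tier discounts. The tier rules, customer names, and order amounts below are entirely invented; the actual analysis was performed by an augmented analytics system, not hand-written code.

```python
# Hypothetical reconstruction: surface discounts that exceed the
# authorized volume-tier discount and estimate the revenue they cost.
def tier_discount(units):
    """Authorized discount by purchase-volume tier (assumed rules)."""
    if units >= 1000:
        return 0.15
    if units >= 500:
        return 0.10
    return 0.0

orders = [  # (customer, units, list_price_total, discount_given)
    ("Acme", 1200, 60_000, 0.15),   # on-tier
    ("Birch", 300, 15_000, 0.12),   # off-tier: salesperson special
    ("Cedar", 600, 30_000, 0.20),   # off-tier: above authorized 10%
]

unauthorized = [
    (cust, (given - tier_discount(units)) * total)
    for cust, units, total, given in orders
    if given > tier_discount(units)
]
lost_revenue = sum(amount for _, amount in unauthorized)
print(f"Off-tier discount cost: ${lost_revenue:,.0f}")
# → Off-tier discount cost: $4,800
```

An augmented analytics tool would arrive at the same kind of finding without the analyst specifying the comparison, which is exactly what made the pattern visible to a business analyst rather than a data scientist.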
Big Changes Are Coming
It’s early days for this new generation of business analytics, but the potential of the tools is striking. They promise to offer more and better insights to more people within organizations more quickly. Business intelligence analysts and quantitative professionals will still have important tasks to perform, but many will no longer have to provide support and training to amateur data users. Small to mid-size businesses that haven’t been able to afford data scientists will be able to analyze their own data with higher precision and clearer insight. All that will matter to organizations’ analytical prowess will be a cultural appetite for data, a set of transactional systems that generate data to be analyzed, and a willingness to invest in and deploy these new technologies.
by Tom Davenport and Joey Fitts