Since computers were first invented, people have been looking for ‘the next big thing.’ Now that close to half the world owns a phone faster than the earliest supercomputers, it is difficult to keep track of what deserves our attention. While many of the inventions that harness this powerful technology barely make it past the headlines (has anyone ever seen a smart fridge?), the advances that made them possible often fall victim to the same hype.
Big Data, AI, and IoT are three of the most widely misused terms of recent times, and many people do not know how these technologies are linked or how they have paved the way for the technological progress we have come to expect. This article will shed some light on these concepts, and further articles will delve into their importance in industry, the obstacles they face, and what lies ahead.
The big bang
In the years following the invention of the World Wide Web in 1989, the number of machines connected to each other grew enormously, and when GPS became commercially viable between 1994 and 2000, the amount of data generated by computers and connected devices skyrocketed. The potential of this network of devices was soon recognized, and in 1999 the term ‘Internet of Things’ was coined by Kevin Ashton of MIT, who postulated: ‘If we had computers that knew everything there was to know about things, using data they gathered without any help from us, we would be able to track and count everything and greatly reduce waste, loss and cost.’
With GPS technology taking off, RFID tags being used in loyalty card systems, and the PDA market heating up, businesses were able to ‘see’ into their processes, and the conditions were perfect for an information explosion. In 2005, the term ‘Big Data’ was first used by Roger Mougalas, as the amount of data being generated had become too much for existing tools to process. The launch of the iPhone in 2007 marked Big Data’s move into the consumer sphere, and since then the rise of smartphones, wearables, tablets, and countless other smart devices has changed our perception of the physical and digital worlds.
Big Data, big changes
At the same time, the rise of social media and e-commerce led to the idea of a digital persona, and the incredible value of data became apparent. The ’00s also saw the emergence of the data sector, with companies forming specifically to help enterprises manage organizational data and use it to improve processes. Venkat Viswanathan, the co-founder and chairman of LatentView Analytics, had experienced the power of data in consumer marketing and saw interest growing in business environments too. ‘What enabled a lot of this transformation is that digital data is so much more granular,’ said Viswanathan. ‘Companies were taking ideas from the consumer sector and applying them in industry.’
Industrial environments were already accustomed to technology and data collection, but they used data to inform decisions only in real time – checking pressure levels, temperatures, and so on. Storing that data for later analysis was not even considered until specialized sensors became refined enough and the cost of storage dropped: ‘As storage costs came down and cloud storage came into use over the last 5-8 years, the opportunity to look back at historical data and identify patterns arose,’ Viswanathan said.
AI’s time to shine
Once data storage became a viable option for businesses of all kinds, and the cloud made it possible to collect huge and detailed data sets, AI finally had a solid foundation to build on. Over the years, AI research has gone through several ‘winters’, periods in which the development of algorithmic techniques hit a wall for lack of interest or investment. As more and more data became available, and research branched into ever narrower applications, the latest generation of algorithms has made huge progress in fundamental areas – such as natural language processing, computer vision, and machine translation – because of the massive amount of information available to learn from.
This surge in available training data from a huge range of sources has led to a vast improvement in AI systems, a phenomenon known as the unreasonable effectiveness of data: even simple algorithms, given enough data, can reach accurate conclusions. Combined with decades of work refining these algorithms to perform specific tasks at near-human levels, AI finally has something to sink its teeth into. But Big Data has not just given AI an added boost – it is a requirement. In the words of Devin Gharibian-Saki, CSO of Redwood Software: ‘AI systems run on statistical models, so you can’t run AI without a lot of data to feed it.’
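To make the ‘unreasonable effectiveness of data’ idea concrete, consider a minimal Python sketch (purely illustrative, using synthetic data rather than anything from the article): the same deliberately simple model is trained on ever-larger slices of one dataset, and its test accuracy tends to climb on the strength of data volume alone.

```python
# Illustrative sketch: a fixed, simple algorithm improves with more data.
# Synthetic data stands in for real sensor or user data (an assumption).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Generate a synthetic classification problem.
X, y = make_classification(n_samples=50_000, n_features=20,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=5_000, random_state=0)

# Same model every time; only the amount of training data changes.
for n in (100, 1_000, 10_000, 45_000):
    model = LogisticRegression(max_iter=1_000)
    model.fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{n:>6} training examples -> test accuracy {acc:.3f}")
```

The model itself never gets smarter; only its diet of examples grows, which is precisely the point the phrase makes.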
A holistic view
With a host of sensors, IoT devices, and even user data available to teach an AI system, predictions and decisions can now be made across all areas of a business, provided that users understand what the data means and where it comes from. ‘You have to have an idea of what you want in the end, otherwise all that data, technology and sensors are useless,’ says Gharibian-Saki, especially in an environment where anything is measurable and the risk of getting lost in data is greater than ever. Companies must remember that there is no quick win from using IoT, Big Data, or AI in isolation: ‘We always look for one thing that will make all our problems go away,’ says Gharibian-Saki, ‘but IoT devices, sensors, robotics, and AI are all components in a system – without that holistic view it will take a long time to achieve a big win.’
IoT, Big Data, and AI all feed into each other and create an ecosystem of automation: IoT devices collect data on millions of variables, which is then collated in the cloud and used to train and improve AI algorithms. As such, ensuring that people understand how IoT, Big Data, and AI interact and improve each other is the most important thing we can do to bring real improvements to our lives.
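As a rough sketch of that collect-collate-train loop (the device names, readings, and trivial ‘model’ below are hypothetical, not drawn from the article), the following self-contained Python simulates sensors pushing readings into a shared store, a statistical model being trained on the accumulated history, and new readings being checked against it:

```python
# Hypothetical sketch of the IoT -> Big Data -> AI feedback loop.
import random
import statistics

cloud_store = []  # stand-in for a cloud data lake

def iot_device(device_id: str, n: int) -> None:
    """Simulate a temperature sensor pushing n readings to the store."""
    for _ in range(n):
        cloud_store.append({"device": device_id,
                            "temp": random.gauss(70.0, 2.0)})

def train_model(records) -> tuple:
    """'Train' a trivial anomaly model from the collated history."""
    temps = [r["temp"] for r in records]
    return statistics.mean(temps), statistics.stdev(temps)

def is_anomaly(model, reading: float, k: float = 3.0) -> bool:
    """Flag readings more than k standard deviations from the mean."""
    mean, std = model
    return abs(reading - mean) > k * std

# Collect, collate, train, then act on new data.
for device in ("sensor-a", "sensor-b", "sensor-c"):
    iot_device(device, n=1_000)
model = train_model(cloud_store)
print(is_anomaly(model, 70.5))  # False: within the normal range
print(is_anomaly(model, 95.0))  # True: flagged for attention
```

A real deployment would replace the in-memory list with cloud storage and the running statistics with a learned model, but the shape of the loop is the same.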
Subsequent articles in this series will focus on the intersection of IoT, Big Data, and AI in industry, the obstacles this ecosystem currently faces, and what the future holds for this group of technologies.
By Charles Towers-Clark