The Fourth Industrial Revolution: powered by AI & machine learning
02 January 2019
We are being met with a tidal wave of new and disruptive technology as the digital revolution that underpins the Fourth Industrial Revolution gathers momentum. Artificial intelligence (AI) has become the buzzword – but will the technology live up to the hype?
This article was originally featured in the January 2019 issue of EPDT magazine.
Francis Griffiths, founder & CEO of AI & machine learning experts Maiple, outlines three examples – across semiconductor manufacturing, home energy monitoring and securing wireless communications in implantable medical devices (IMDs) – that demonstrate practical applications of the technology.
It’s estimated that AI will deliver a whopping $20 trillion to global GDP over the next 20 years, so it’s no surprise that governments across the world and global corporate giants, such as Google, Amazon, IBM, Microsoft and Intel, are investing billions of dollars into this technology. And while AI won’t be the solution for every problem, it does represent an opportunity to take a leap forward that will transform the way we live and work, with the scope to touch everyone in the world.
The Fourth Industrial Revolution is progressing at an exponential rate, disrupting many existing systems and processes, while also creating entirely new and innovative solutions. Mass digitisation and the industrial internet of things (IIoT) are set to transform many industries. Research and advisory firm Gartner defines the IoT as a network of physical objects that contain embedded technology to communicate, sense and interact with their internal states or the external environment – the possibilities are endless. Networking giant Cisco predicts that 500 billion devices will be connected to the internet by 2030, spanning factories, lighting, homes and buildings, autonomous vehicles, urban infrastructure, smart grids, robotics and more.
Industry 4.0 is a term coined by the German government at the Hannover fair in 2011 to promote the use of digitisation in manufacturing. The proliferation of connected machine-to-machine (M2M) technology, augmented with smart software, secure networks and data analytics, is helping to improve equipment utilisation and prevent factory downtime, for example. According to market intelligence firm IDC, manufacturers predict a 48% reduction in unplanned downtime from solutions such as ‘connected factories’.
But the benefits of Industry 4.0 and IIoT require careful consideration of existing processes and infrastructure if a clear business case is to be made, as it is often necessary to integrate existing and legacy systems with new sensors and networks. One fundamental question comes first: what critical decisions need to be made, and what new insights do we need to make them? Starting here avoids wasting time and energy mining data without a clear focus on outcomes.
Data is available from a plethora of connected devices: from PLCs and sensors, to robots and information systems across factories, and indeed entire supply chains. Careful consideration needs to be given to some fundamental requirements: the number of sensors and their environment; the kinds of data payloads to be handled; network topology and redundancy; required response times; types of algorithms; whether AI and machine learning (ML) can be added; user interface requirements; and cloud versus mobile and desktop implementations. In most situations, and very critically, we must ask: what security measures are in place across the ecosystem to avoid cyberattacks? In many mission-critical applications, the security overheads required to mitigate risks could well offset the cost-benefits of collecting and analysing the data itself. Therefore, careful consideration must be given to how the system is constructed – and how well it will perform to meet the desired outcomes.
Combining sensors in vehicles, factory floors, health monitoring and many other applications has given rise to the term sensor fusion. In many existing commercial applications, sensor systems predate the intelligent systems, reinforcing the traditional method of data collection first and analysis second. As our IT infrastructure starts converging with IoT technology, we expect that the critical decisions will come first, and that the design of sensor systems will be integrated as part of the ‘smart’ data collection process – bringing about high-performance computing platforms that combine operational technology and traditional IT systems. This advancement has the capacity to integrate IoT on an extremely large scale, and gives the option to deploy AI and ML solutions not only in the cloud, but at the edge and directly on smart devices or machines. These solutions provide a way to manage big data implementations without compromising performance and security.
Building AI solutions: discovery, development & deployment
The following examples in semiconductor, energy and healthcare all highlight the key phases of implementing an AI solution. Maiple is building a platform that allows customers to intuitively build and deploy these types of solutions. We are guiding customers through the discovery, development and deployment stages, providing subject matter expertise at each phase of the process.
Discovery requires correct data acquisition, data formatting and usually augmentation using synthetic simulators to build valid inputs and extract features needed to train our algorithms. In most cases, we need significant amounts of data, so this phase is critically important.
Development is working with the AI and ML tools; it can be a combination of supervised, unsupervised or transfer learning. The proliferation of cloud services such as IBM Watson, Amazon Web Services, Microsoft Azure and Google Cloud offers a rich set of resources for testing different scenarios.
Deployment is often application-specific and can be to the cloud, on-premise, mobile or embedded solutions – the trade-offs usually being speed, cost, energy consumption and security considerations.
Semiconductor manufacturing: wafer map defect classification
As complexity rises and prices drop, improving manufacturing yields, lowering costs and accelerating time-to-market are key factors within the semiconductor manufacturing industry. Applying new techniques using AI and machine learning can significantly improve wafer yields and operational efficiency.
Wafer defect maps are used in semiconductor manufacturing to identify root cause analysis within the fabrication process. Wafer inspection systems detect the presence of faults caused by particles on the wafer surface or pattern defects caused by mask, exposure or circuit pattern issues. Being able to classify these defect patterns and automatically identify systematic failures allows process engineers to focus on root cause analysis and yield improvement.
Deep convolutional neural networks (CNNs) have been developed to allow state-of-the-art image classification. Maiple has developed a process to generate synthetic defect cluster maps that facilitate training the CNN without the need for complex feature extraction or deep subject matter expertise. Figure 1 illustrates typical process-related defects on a wafer map used to train the CNN. Expected classification accuracy is more than 97%. Fabrication engineers can quickly see systematic failure patterns and inspect images across wafer lots to identify process issues in real time. Combining these defect maps with fabrication data provides a faster identification process that speeds up new product introduction and improves yields.
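Maiple’s generation process is not published in detail; purely as an illustrative sketch, synthetic wafer maps for two common defect classes (a ‘scratch’ line and an ‘edge ring’) could be produced along the following lines – all function names, map sizes and parameters here are assumptions, not Maiple’s implementation:

```python
import numpy as np

def wafer_mask(size=32):
    """Circular wafer mask: 1.0 for dies inside the wafer, 0.0 outside."""
    yy, xx = np.mgrid[:size, :size]
    r = np.hypot(xx - size / 2, yy - size / 2)
    return (r <= size / 2).astype(float)

def scratch_map(size=32, rng=None):
    """Synthetic 'scratch' defect: a straight line of failing dies."""
    if rng is None:
        rng = np.random.default_rng()
    m = np.zeros((size, size))
    x0, y0 = rng.integers(4, size - 4, 2)     # random anchor point
    angle = rng.uniform(0, np.pi)             # random scratch direction
    for t in range(-size, size):
        x = int(x0 + t * np.cos(angle))
        y = int(y0 + t * np.sin(angle))
        if 0 <= x < size and 0 <= y < size:
            m[y, x] = 1.0
    return m * wafer_mask(size)               # keep failures on the wafer

def edge_ring_map(size=32, width=3):
    """Synthetic 'edge-ring' defect: failures concentrated at the rim."""
    yy, xx = np.mgrid[:size, :size]
    r = np.hypot(xx - size / 2, yy - size / 2)
    return ((r <= size / 2) & (r > size / 2 - width)).astype(float)

# A small labelled set of binary fail maps, ready to feed a CNN classifier.
rng = np.random.default_rng(0)
maps = [scratch_map(rng=rng) for _ in range(10)] + [edge_ring_map() for _ in range(10)]
labels = [0] * 10 + [1] * 10                  # 0 = scratch, 1 = edge ring
```

Because the labels come for free with the generator, a CNN can be trained on thousands of such maps without hand-annotated fab data.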
Verv: home energy monitoring
Verv is a great example of the application of AI and machine learning to new untapped data sets that greatly improves the customer experience in home energy management and safety. This is so much more than a typical first generation ‘smart meter’ that simply showed overall energy consumption for your house. Verv is capable of managing data sets to bring real-time insights in new and valuable ways.
We’re now capable of collecting much more data, disaggregating it in real time and showing energy usage per device connected to your supply. By combining AI and ML into the device, we can now identify not just the type of appliance, but also its condition. The consumer is empowered to save energy, and is safer as a result, because the system will identify any faulty devices or appliances left on. Figure 2 illustrates the capability to digitise the signal at high speed (1 million samples/sec) to extract details of a washing machine cycle.
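Verv’s disaggregation algorithms are proprietary; as an illustrative sketch of the first step – turning a high-rate waveform into candidate appliance events that an ML classifier could then label – the following computes windowed RMS and flags step changes. The 1 MHz sample rate comes from the article; the window size, threshold and signal amplitudes are assumptions for the toy example:

```python
import numpy as np

FS = 1_000_000      # sample rate: 1 million samples/sec, as quoted above
WINDOW = 10_000     # 10 ms analysis window (half a 50 Hz mains cycle)

def window_rms(samples, window=WINDOW):
    """RMS of each non-overlapping window of a raw current waveform."""
    n = len(samples) // window
    chunks = samples[:n * window].reshape(n, window)
    return np.sqrt((chunks ** 2).mean(axis=1))

def detect_events(rms, threshold=0.5):
    """Window indices where RMS jumps by more than `threshold` amps --
    candidate appliance on/off events for downstream classification."""
    steps = np.diff(rms)
    return np.flatnonzero(np.abs(steps) > threshold) + 1

# Toy waveform: background mains load, with a large appliance (say, a
# washing machine heater) switching on halfway through one second.
t = np.arange(FS) / FS
mains = 0.2 * np.sin(2 * np.pi * 50 * t)
appliance = np.where(t >= 0.5, 2.0 * np.sin(2 * np.pi * 50 * t), 0.0)
rms = window_rms(mains + appliance)
events = detect_events(rms)   # one event, at the 0.5 s window boundary
```

In a real disaggregator, the waveform segments around each detected event would be the inputs to the appliance-identification model.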
Verv has further developed deep learning algorithms that offer predictive energy consumption and, combined with blockchain technology, provide an automatic energy trading platform. So, whether you’re a net consumer or producer of energy, the Verv platform algorithms will ensure you automatically get the best deal.
Securing insulin pump wireless communications channel
Diabetic patients rely on insulin pumps, a type of implantable medical device, for the infusion of insulin to control blood glucose levels. While these devices offer many clinical benefits, there has been a recent increase in the number of cases in which the wireless communication channel of such devices has been compromised. This can not only cause the device to malfunction, but also potentially threaten the patient’s life.
In this example, a neural network based, real-time deep learning classifier – specifically a multilayer perceptron (MLP) model – was designed for wireless medical device security. The MLP model was deployed to an edge-based FPGA, where it classified fake versus genuine glucose measurements with 98.1% accuracy. The system used LabVIEW to train the neural network in TensorFlow, then deployed it for inference on a National Instruments FPGA platform.
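The published system was built with LabVIEW, TensorFlow and NI FPGA hardware; as a minimal, framework-free stand-in, the sketch below trains a tiny numpy MLP to separate genuine from spoofed glucose readings on synthetic data. The feature set (reading plus change from the previous reading), network size and data distributions are all assumptions for illustration, not the system’s actual design:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic features per measurement: [reading in mmol/L, delta vs. last].
# Assumed premise: spoofed injections show implausibly large jumps.
n = 200
genuine = np.column_stack([rng.normal(6.0, 1.0, n), rng.normal(0.0, 0.3, n)])
fake = np.column_stack([rng.normal(6.0, 1.0, n),
                        rng.choice([-1, 1], n) * rng.normal(4.0, 1.0, n)])
X = np.vstack([genuine, fake])
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardise features
y = np.concatenate([np.zeros(n), np.ones(n)])   # 1 = fake injection

# One-hidden-layer MLP (2 -> 8 -> 1), full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))        # sigmoid: P(fake)
    return h, p

lr = 0.5
for _ in range(2000):
    h, p = forward(X)
    g = (p - y) / len(y)                        # binary cross-entropy grad
    W2 -= lr * h.T @ g; b2 -= lr * g.sum()
    gh = np.outer(g, W2) * (1 - h ** 2)         # backprop through tanh
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(axis=0)

accuracy = ((forward(X)[1] > 0.5) == y).mean()
```

The trained weights are just a handful of small matrices, which is what makes this class of model practical to run as fixed-point logic on an edge FPGA.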
This deployment example highlights the benefit of a deep neural network deployed at the edge on low-powered devices. Many existing deep learning algorithms for classification, identification and segmentation, although very effective, are computationally and memory intensive, making them hard to deploy on low-power embedded frameworks. This landscape looks set to change with the development of new ‘AI-ready’ chips designed to optimise intense matrix computation.
The Fourth Industrial Revolution is happening at a much faster pace than previous industrial revolutions. There is no doubt that mass digitisation is set to transform the way we live and work. Industry 4.0 and the industrial IoT will benefit greatly from new data sources that, combined with artificial intelligence and data analytics, offer new ways to make leaps forward in productivity, quality and discovery.