The fourth wave of computing is being driven by a combination of forces – organizational demands to increase revenue and profitability and to serve customers better, along with robust use cases that can be solved by the latest technology advances. The first wave of computing started with centralized mainframes, the second wave swung to distributed client-server PCs, and the third wave shifted back toward centralized cloud computing. Now the fourth wave, also called “distributed core” or “fog computing”, distributes computing back to the sources of data and the points where services are consumed.
Operational efficiency leads the first phase of fourth-wave deployments. These include Internet of Things (IoT) sensors installed in high-value, mission-critical products deployed in the field. Adding sensors to field devices is a low-cost entry point that cuts across many industry verticals. Early adopters such as manufacturing and facilities management benefit economically from monitoring and acting on critical or expensive end-point data generated by everything from manufacturing platforms, IT equipment, electrical grids, oil and gas pipelines, and shipping containers to transportation fleet equipment. Another low-risk, high-return application is the automation of building systems such as HVAC, lighting, and other facilities equipment.
Extending monitoring to predictive maintenance is the logical next step. This requires developing machine learning infrastructure and applications in the back-end data center to predict upcoming failure conditions in field equipment. Predictive maintenance is an easy upsell of a value-added service that creates an extra revenue stream for the provider and improves customer satisfaction. The Dell Technologies Support and Deployment Services (SDS) group is applying this through SupportAssist. Over the last several years, the SDS team has collected more than 260 attributes from 50 million hard drives. Using this data, they’ve trained neural network models to make highly accurate predictions of drive failures. Those models are then pushed to the edge to monitor real-time attributes of active in-service drives and to generate alerts, with associated metadata, of any imminent drive failures weeks in advance. This allows customers to take corrective actions such as backing up data or replacing drives dispatched by the services team.
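To make the train-then-deploy pattern concrete, here is a minimal sketch of the core-side training step for a drive-failure predictor. The attribute values, model choice, and alert threshold are illustrative assumptions, not Dell’s actual SupportAssist pipeline, and synthetic data stands in for the collected telemetry.

```python
# Hypothetical sketch: train a neural network on drive attributes at the
# core, then score live attributes and raise alerts at the edge.
# All data and thresholds here are synthetic/illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for telemetry: rows are drives, columns are
# collected attributes (e.g., reallocated sectors, seek errors, temp).
n_drives, n_attrs = 5000, 12
X = rng.normal(size=(n_drives, n_attrs))
# In this toy setup, failures correlate with a few "wear" attributes.
y = (X[:, 0] + 0.8 * X[:, 1]
     + rng.normal(scale=0.5, size=n_drives) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(32, 16),
                      max_iter=500, random_state=0)
model.fit(X_train, y_train)

# At the edge, the trained model scores live attributes and flags any
# drive whose failure probability crosses a chosen alert threshold.
probs = model.predict_proba(X_test)[:, 1]
alerts = probs > 0.9
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
print(f"drives flagged for proactive replacement: {int(alerts.sum())}")
```

The key operational point is the split: the expensive training loop runs once at the core on aggregated data, while the cheap scoring step runs continuously next to the drives.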
The next round of IoT deployments is around security. These include analysis of sound, images, and video from cameras and surveillance equipment deployed in many private and public spaces. Today, that analysis happens in batch mode, retrospectively or with a significant time lag. Infrastructure with machine learning intelligence detects unusual or abnormal situations in these physical spaces. Automating this repetitive and mundane process reduces human labor and decreases the risk of overlooking real anomalies; moreover, the humans it frees up can be redeployed to address those abnormal situations.
In the initial phases of these use cases, compute and storage are deployed at the back-end data center (aka the core) or cloud. Scaling out these applications increases round-trip data transfer volumes, and adding a real-time component overloads the end-to-end infrastructure. Real-time security alerts require very low-latency decision making. To satisfy that, compute (along with storage and communication) must be pushed closer to the action, driving the need for a new layer of computing: edge computing.
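A back-of-the-envelope latency budget shows why proximity matters. The distances and the fixed processing overhead below are illustrative assumptions, using the rule of thumb that light in fiber covers roughly 200 km per millisecond.

```python
# Illustrative latency-budget sketch: round-trip time to a distant
# centralized data center versus a nearby edge node. Distances and the
# processing overhead are assumed numbers for illustration only.

def round_trip_ms(distance_km, processing_ms=5.0):
    """Round trip: fiber propagation (~200 km per ms) plus processing."""
    fiber_km_per_ms = 200.0
    return 2 * distance_km / fiber_km_per_ms + processing_ms

cloud_rtt = round_trip_ms(2000)   # distant centralized data center
edge_rtt = round_trip_ms(10)      # edge node near the sensor/camera

print(f"cloud round trip: {cloud_rtt:.1f} ms")  # 25.0 ms
print(f"edge round trip:  {edge_rtt:.1f} ms")   # 5.1 ms
```

Even before queuing and bandwidth constraints are counted, the propagation component alone puts distant clouds outside tight real-time budgets, while an edge node keeps the round trip dominated by processing time rather than distance.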
Smart Devices and Smart Services: Two Main Categories of Edge Computing
There are two broad categories of edge computing – the “Internet of Things,” described above, and “smart services.” While IoT primarily consists of interacting with inanimate “things” like field equipment, smart services add a layer of rich interactions with humans as well. Examples of smart services at the edge include smart multimedia services, first responder services during emergencies, multi-player gaming with real-time and richer audio-visual experiences, and immersive experiences like AR/VR. This new class of “born in the edge and for the edge” services demands faster response times, high data bandwidth, and predictable quality of service. These cannot be well served by long round trips back and forth between the consumption point and a centralized data center. The majority of data created at the edge needs to be computed on and acted upon locally.
In smart multimedia services, real-time analysis of audio and video can be extended to object and profile recognition, analysis of human sentiment and intent, and more. These can be used to provide improved, personalized services or to take precautionary actions as the case may be.
In retail, object recognition can be used for inventory management and faster checkouts. In manufacturing, image analysis of finished-product quality at the end of the assembly line can improve yield and lower cost at higher speeds with less human involvement.
More viewpoints are available in “The Hitchhiker’s Guide to Edge Computing” by Moor Insights, posted on Forbes.
Dell Technologies and Microsoft have kicked off a joint proof of concept to bring Microsoft Project Brainwave to the edge. Brainwave improves quality by processing product images in manufacturing use cases. Brainwave executing on PowerEdge servers with custom FPGAs will improve accuracy and speed up the process while reducing the amount of human intervention, improving both operational efficiency and customer experience. More info is available at “Microsoft unveils Project Brainwave,” the accompanying video, and “Open to External Testers.”
Today, augmented reality (AR) and virtual reality (VR) add immersive experiences to mostly two-dimensional internet experiences. They provide distinct competitive advantages by changing the experience for your users, customers, or partners. Even today’s content delivery requires content caching at edge cloud locations to reduce latency. Immersive experiences, however, are computationally more intensive, requiring high data bandwidth and low-latency responses matching human reaction time. It is infeasible to drive this from centralized data centers or distantly located clouds. Practical applications such as consumer shopping, virtual real estate visits, and remote physical examinations will require a combination of powerful edge computing distributed closer to consumers and 5G mobile networks. On the consumer side, these need to be supported with affordable and lightweight headsets. To learn more, refer to “Enabling Mobile Augmented and Virtual Reality with 5G Networks.”
The technology improvements described here are key drivers for moving compute closer to the edge, as servers and accelerators can now be deployed away from traditional data centers. They include compute that is more power-efficient, lower-cost, secure, and resilient for training and inferencing; fast NVMe storage; storage-class memory; software-defined paradigms; and cheaper sensors and AR/VR headsets. 5G promises theoretical bandwidths in the gigabits per second and will reduce latency to sub-millisecond levels – ideal when data needs to be transferred back to the core.
OT and IT Dynamics
Are the groups inside your organization ready for the fourth wave? Strong collaboration is required between Operational Technology (OT) groups and IT groups. OT groups from lines of business (LoB) lead the charge on use cases and business cases. IT needs to deliver its strengths in software, infrastructure, security, and analytics to the OT groups. 451 Research reports that OT budgets will rise by an average of 49% in 2018, higher than IT’s 35% gains, although IT still has the bigger budget. The good news is that both groups rate the business value of such projects very similarly.
Organizations leading the change will work through longer investigation and deployment cycles, but will gain competitive advantages that let them better serve their customers and improve revenue streams.
The fourth wave will require modernization of IT infrastructure for several reasons:
- IT will be deployed in less-than-ideal data center environments like manufacturing floors, retail spaces, closets, switching stations, telecom central offices, and remote and branch offices. Infrastructure in these “extended” environments must operate 10–20°C above data center ambient temperatures, in constrained spaces, or with fresh-air cooling. Infrastructure that can operate in these near-data-center environments (“near edge”) is already available today, including from Dell EMC.
- To further reduce decision latency and data transfer sizes, infrastructure will be rolled out one level further into harsher environments like parking lots, rooftops, and rugged settings such as hills, fields, deserts, barges, glaciers, or trucks. These locations are hot and dusty and experience severe humidity, temperature, and power variations. Infrastructure there must be resilient enough to operate at sustained temperatures of 50°C to 70°C in polluted and humid environments. Self-contained data centers with optional power and cooling, called micro data centers (MDCs), are emerging to tackle such environments. Additionally, servers, storage, and network switches will be hardened beyond what is possible today.
- Software-defined hyperconverged architectures will be more prevalent in edge deployments compared to traditional 3-tier architectures deployed by core IT. These converged devices will integrate server, storage, and networking and will be easier to deploy and manage with minimal IT resources.
- The remote management of these edge infrastructures and the services that run on them will add a layer of complexity beyond traditional data center management. Additionally, some level of autonomous management will be required in situations of dropped connections to core operations.
- Putting more devices outside the cozy confines of the data center significantly increases the risk of hacking and security interference. There will be more machine-to-machine interactions with third-party devices of uncertain pedigree. These will demand both physical and virtual security with stronger firewalls so that any intrusions can be contained as much as possible. Refer to “6 Ways IoT Is Vulnerable.” The solution is to invest in vendors who offer 7 to 10 years of support and more end-to-end systems.
- Despite improvements in mobile networks, network services will not be uniformly available in many near-edge situations. Therefore, network services acceleration will be required using innovations like SmartNICs.
- This data-centric approach naturally leads to the application of machine learning models with global training at the core, and fast inferencing for decision making and localized learning at the edge. This will require infrastructures to support accelerators of different types, such as GPUs or IPUs at the core and FPGAs at the edge.
- Lastly, expect more diversity of agents or parties at the edge, autonomously setting up and executing multi-party transactions without being constrained by centralized, single-owner databases, leading to the need for blockchain-based databases.
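The core/edge split in the list above can be sketched in a few lines: a model is trained centrally, only its parameters are exported, and a minimal pure-NumPy inference routine (standing in here for an FPGA or other lightweight edge runtime) scores locally generated data. The model type, feature shapes, and serialization format are assumptions for illustration.

```python
# Illustrative sketch of global training at the core and lightweight
# inference at the edge. Names, shapes, and the JSON export format are
# hypothetical choices for this example.
import json
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# --- Core: global training on aggregated telemetry ---
X = rng.normal(size=(1000, 4))
y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) > 0).astype(int)
core_model = LogisticRegression().fit(X, y)

# Export only the learned parameters; the edge node does not need the
# full training stack, just enough to score incoming samples.
exported = json.dumps({"coef": core_model.coef_[0].tolist(),
                       "intercept": float(core_model.intercept_[0])})

# --- Edge: lightweight inference on locally generated data ---
params = json.loads(exported)
w = np.array(params["coef"])
b = params["intercept"]

def edge_infer(sample):
    """Sigmoid scoring with the exported weights; no sklearn required."""
    return 1.0 / (1.0 + np.exp(-(sample @ w + b)))

local_sample = rng.normal(size=4)
print(f"edge decision score: {edge_infer(local_sample):.3f}")
```

The design point is that the edge-side dependency footprint stays tiny, which matters on hardened, resource-constrained infrastructure, while the heavy training and retraining cycles remain at the core.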
To participate in proof of concepts, contact your Dell EMC sales team.
- “OT Stakeholder Perspective – Q1 2018 Advisor Report – Voice of the Enterprise, Internet of Things,” 451 Research
- “Edge Computing – The Fourth Wave Rises,” Matt Kimball, Moor Insights and Strategy
- “6 ways IoT is vulnerable,” in IEEE Spectrum, vol. 55, no. 7, pp. 21-21, July 2018
- “Enabling Mobile Augmented and Virtual Reality with 5G Networks”, AT&T Foundry, Jan 2017
- “Azure AI – Productivity for virtually every developer and scenario”