AI is not new, and each of us benefits from it every day as it transforms many aspects of our lives. This trend is fueled by edge computing, which provides opportunities to move AI workloads from the Intelligent Cloud to the Intelligent Edge for improved response times and bandwidth savings. In combination with Digital Twins and IoT, there is a strong trend, not only in manufacturing but also in other industries, to leverage AI/ML analytics for better and faster insights, improved Predictive Maintenance, and more.

The benefit of edge deployments is especially strong for computer vision models that take large data streams like images or live video as input. With edge computing, these large data streams can be processed locally at the device or client, eliminating both the need for significant bandwidth and the privacy concerns associated with streaming into a cloud data center. Edge video analytics systems can execute computer vision and deep-learning algorithms either directly integrated into the camera or on an attached edge computing system. The computer vision AI models are usually pre-trained in the cloud and then deployed to an edge device. This approach uses the best of both worlds: the power and scalability of the Cloud during the resource-intensive training phase, and the low latency of the Edge during real-time model inference and evaluation.

This is further propelled by the rise of dedicated AI accelerator chips like the Neural Processing Unit (NPU), Tensor Processing Unit (TPU), Vision Processing Unit (VPU), and similar specialized silicon designed and built to accelerate AI workloads. We are entering a time like the 1990s, when dedicated graphics cards (GPUs) became mainstream and made 3D computer graphics take off. We expect something similar to happen with the commoditization of AI-acceleration chips like NPUs and TPUs in edge devices.
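The train-in-the-cloud, infer-at-the-edge split can be sketched in a few lines. The frame data and the brightness-threshold "detector" below are stand-ins for a real camera feed and a pre-trained vision model; the point is that only small detection events, not the raw video stream, ever leave the device:

```python
# Minimal sketch of edge video analytics: run inference locally and
# upload only detection events instead of the raw video stream.
# The threshold "detector" is a stand-in for a real pre-trained model.

def detect(frame):
    """Toy detector: flag frames whose mean pixel value crosses a threshold."""
    return sum(frame) / len(frame) > 0.5

def process_stream(frames):
    """Process every frame on-device; return only events worth uploading."""
    events = []
    for i, frame in enumerate(frames):
        if detect(frame):                       # inference happens at the edge
            events.append({"frame": i, "alert": True})
    return events                               # tiny payload vs. raw frames

frames = [[0.1, 0.2], [0.9, 0.8], [0.3, 0.1]]   # simulated pixel data
print(process_stream(frames))                   # only frame 1 triggers an upload
```

In a real deployment the detector would be a model exported after cloud training (e.g. to ONNX) and executed by an on-device runtime, but the data-flow pattern is the same.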
We mentioned Edge Computing and (Video) Analytics in our Summer of AI 2021 post here, where we showcased a simple but powerful edge AI demo for workplace safety leveraging no-code training and a Raspberry Pi with Adafruit for edge execution. Our Top Trends 2022 post also highlighted Edge AI as an ongoing trend to watch closely. In this post, we will share some of the newer devices and services announced at Build, along with Edge AI trends from market research that are clearly encouraging business leaders to begin integrating this cutting-edge tech into their business processes to prepare for changing customer and employee needs.

Edge AI at the Center of Maturity

Some leaders in classic businesses might still think it is too early to invest in AI workloads and that the technology is not mature enough. In fact, the opposite is the case: companies that don't invest in AI now, across all aspects of their business, will get left behind. Market analysts place Edge AI in the maturity phase, right at the center, with a transformational character, which is Gartner's highest benefit rating.

Figure 1: Gartner Emerging Technologies Trend Impact Radar with Edge AI at the center of maturity (Source: Gartner article, 8 December 2021). GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

In order to have a strong ROI for AI investments, it’s important for companies to be agile and to continually enhance value through rapid AI change. AI DevOps and AI engineering play a vital role here.

According to an NVIDIA survey of an audience interested in Edge AI, 42% of respondents named latency reduction for AI workloads as the top benefit they hoped to gain from deploying applications at the edge.

Additionally, autonomous mobile robots (AMRs) and, more generally, autonomous vehicles rely on real-time object detection and the ability to segment and interpret the surrounding physical environment. Mobile robotics goes hand in hand with IoT and Edge AI, as these robots are essentially intelligent IoT Edge devices on the move. Industrial IoT (IIoT) has a range of use cases, like connecting machines, tools, and sensors on the manufacturing floor to provide a real-time view into production. This level of visibility can help find the root cause of problems, identify bottlenecks, and improve efficiency. Enabling this real-time continuous monitoring has a wide assortment of benefits that drive business outcomes, such as reduced quality management system (QMS) costs, improved overall production quality, increased operator productivity, and optimized machine utilization.
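To make the continuous-monitoring idea concrete, here is a minimal sketch that flags sensor readings drifting away from a rolling baseline. The window size, tolerance, and temperature values are illustrative assumptions; a production IIoT system would use proper models and streaming infrastructure:

```python
from collections import deque

def monitor(readings, window=5, tolerance=0.2):
    """Flag readings that deviate from the recent rolling average by more
    than `tolerance` (relative) — a stand-in for real-time shop-floor
    quality monitoring."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if recent:
            baseline = sum(recent) / len(recent)
            if abs(value - baseline) > tolerance * baseline:
                anomalies.append(i)   # candidate root-cause / bottleneck signal
        recent.append(value)
    return anomalies

# A (hypothetical) spindle temperature spiking out of range is flagged at once:
print(monitor([100, 101, 99, 100, 140, 100]))  # -> [4]
```

Flagged indices would typically feed an alerting pipeline or dashboard, giving operators the real-time view into production described above.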

According to our research, the Global IIoT market is estimated at $89.3 billion in 2021. Growing at a Compound Annual Growth Rate (CAGR) of 7.4%, it is forecasted to reach $110.6 billion by 2025.
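The forecast follows the standard compound-growth formula; note that the quoted figures line up with roughly three compounding years on the $89.3 billion base:

```python
def project(base, cagr, years):
    """Compound annual growth: projected value after `years` at rate `cagr`."""
    return base * (1 + cagr) ** years

# $89.3B growing at a 7.4% CAGR; three compounding years reproduce
# the $110.6B figure quoted above.
print(round(project(89.3, 0.074, 3), 1))  # -> 110.6
```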

Figure 2: Global IIoT market growth according to our own market research data analysis

You can find more in-depth analysis in our Industrial IoT Platform vendor analysis report which contains a global IIoT market overview, key trends, more industry analysis data and use cases. Click below for that report.

Cybersecurity Is Even More Critical with Edge AI

Cybersecurity is top of mind for everyone given ever-increasing hacks and malicious usage. Cloud security is one piece of the equation that organizations need to keep in mind, but cybersecurity does not only affect networks, servers on-premises or in the cloud, PCs, and people; it also affects smartphones and, in fact, every machine with access to the organization's IT infrastructure, which includes IoT devices. In our Quantum Security post, we mentioned a Forrester IoT Security Report which found that security decision-makers indicate, on average, that 10% of their security budget will go to IoT security, with a growing spending priority.

The move to edge AI, where the computing is not physically housed in a cloud data center but scattered in the field, brings additional security risks. Many IoT Edge setups don't have the same physical access model as a cloud data center, where access to servers and other infrastructure is highly controlled. With edge deployments, far more scattered attack entry points are exposed, making Edge AI systems vulnerable to theft of locally stored data and IP. Not just for IoT and other edge deployments, but for them in particular, the assumption must be that any person could potentially gain physical access to the edge device, which makes Zero Trust an important paradigm shift for IoT as well.
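One building block of Zero Trust for edge devices is that every message is authenticated, never trusted by network location. A minimal sketch using a per-device HMAC key (the device ID and secret are hypothetical; production systems would use hardware-backed keys, certificates, and rotation):

```python
import hmac
import hashlib

# Keys provisioned per device at enrollment (illustrative values).
DEVICE_KEYS = {"edge-cam-01": b"per-device-secret"}

def sign(device_id, payload: bytes) -> str:
    """Device side: authenticate outgoing telemetry with the device's key."""
    return hmac.new(DEVICE_KEYS[device_id], payload, hashlib.sha256).hexdigest()

def verify(device_id, payload: bytes, signature: str) -> bool:
    """Cloud side: check every message; constant-time comparison avoids
    timing side channels."""
    return hmac.compare_digest(sign(device_id, payload), signature)

msg = b'{"frame": 42, "alert": true}'
tag = sign("edge-cam-01", msg)
print(verify("edge-cam-01", msg, tag))          # -> True
print(verify("edge-cam-01", b"tampered", tag))  # -> False
```

The same principle — authenticate and authorize each request individually — applies whether the edge device talks to the cloud or to a neighboring device.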

Project Volterra
The Microsoft developer conference Build recently took place and introduced new developer AI services and even hardware. Project Volterra, for example, is a new Windows on ARM device powered by the Qualcomm Snapdragon compute platform. Qualcomm also announced the Qualcomm Neural Processing SDK for Windows toolkit, which allows developers to explore many AI scenarios. Project Volterra contains a powerful Neural Processing Unit (NPU) that speeds up inference at the edge, but it is also a dev box at the same time, and Microsoft is going to make it easy for developers to leverage these new capabilities by baking support for NPUs into the end-to-end Windows platform. Visual Studio, C++, .NET, and many more will work on this new platform, allowing development of AI solutions on the same hardware used for execution.
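Runtimes that target accelerators like NPUs typically accept an ordered preference list and fall back to the CPU when the device is absent (ONNX Runtime's execution providers work this way). A pure-Python sketch of that selection logic, with simplified, hypothetical provider names:

```python
def pick_provider(preferred, available):
    """Return the first requested accelerator that is actually present,
    mirroring the fallback behavior of runtimes like ONNX Runtime."""
    for provider in preferred:
        if provider in available:
            return provider
    raise RuntimeError("no usable execution provider")

# On an NPU-equipped, Volterra-class device the NPU is picked;
# elsewhere the same code silently falls back to the CPU.
prefs = ["NPU", "GPU", "CPU"]
print(pick_provider(prefs, {"NPU", "CPU"}))  # -> NPU
print(pick_provider(prefs, {"CPU"}))         # -> CPU
```

This fallback pattern is what lets a single application binary run on heterogeneous edge hardware without per-device code paths.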

Microsoft is not alone in the field and other companies are working on similar intelligent edge devices. For example, the NVIDIA Jetson Nano and others are available in the market. Regardless of which edge device is used, it is an exciting time where developers can quickly build custom AI solutions with fast execution time on intelligent edge devices.

Hybrid Loop
Another announcement for the intelligent cloud and intelligent edge was Hybrid Loop, a powerful, cross-platform development pattern for building AI experiences that span the cloud and edge. Hybrid Loop allows developers to decide at runtime whether to run inferencing in the Azure cloud or on the local edge device. It can also dynamically shift the load between client and cloud. This is an exciting new offering that Microsoft is working on, as it will allow workloads to be managed dynamically and scaled quickly into the cloud, enabling fluid decisions that trade off edge network latency against compute power in the cloud.

Figure 3: Hybrid Loop: decide during runtime whether to run inferencing on the Azure cloud or the local edge device.
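A runtime edge-versus-cloud decision of the kind Hybrid Loop enables can be approximated with a simple cost comparison. The thresholds and numbers below are illustrative assumptions, not Microsoft's actual heuristics:

```python
def choose_target(edge_latency_ms, cloud_rtt_ms, cloud_compute_ms, edge_load):
    """Route inference to the edge unless the device is saturated or the
    cloud round trip plus compute time still beats local execution."""
    if edge_load > 0.9:                          # device busy: shift to cloud
        return "cloud"
    cloud_total = cloud_rtt_ms + cloud_compute_ms
    return "edge" if edge_latency_ms <= cloud_total else "cloud"

# Fast local model, idle device: stay at the edge.
print(choose_target(edge_latency_ms=30, cloud_rtt_ms=40,
                    cloud_compute_ms=5, edge_load=0.2))   # -> edge
# Heavy model that the edge runs slowly: pay the network cost instead.
print(choose_target(edge_latency_ms=200, cloud_rtt_ms=40,
                    cloud_compute_ms=5, edge_load=0.2))   # -> cloud
```

In practice such a router would sample latency and load continuously, so the decision shifts fluidly as network conditions and device utilization change.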

Are you ready to build your own customized Intelligent Edge AI solutions with us, or do you want to learn more? Valorem Reply's Data & AI team can help you reach your goals quickly and efficiently. Reach out to us to schedule time with one of our industry experts.