News & Analysis

AI on the Edge – A Panacea for Smarter Computing


By Sindhu Ramachandran

While the original objective of Edge computing was to bring compute, storage and processing capabilities closer to Internet of Things (IoT) devices, its functionality today goes well beyond that.

Artificial Intelligence (AI) is rapidly leaving the realm of science fiction and making much of what that fiction envisaged possible. Its applications are impacting almost every industry and the way it operates. As data becomes the mainstay of almost all business decisions, AI is also becoming one of the most efficient business tools for leveraging the power of that data. Combined with IoT capabilities, AI has effectively taken over the enterprise technology market. Statistics continue to show that the combination of AI and IoT, backed by the power of data, is set to become the biggest technology force in business. Statista reports that by 2020, projected spending on IoT platforms, systems and services by the discrete manufacturing, transportation/logistics and utilities industries will exceed $40B for each category. Gartner predicts that by 2020, 80% of all smartphones will have on-device AI, enabling much greater conservation of network bandwidth and power.

However, to develop solutions for business, the key decision is choosing the infrastructure best placed to deliver them. Cloud was the way forward, but some subtle drawbacks, chiefly latency and the time and distance gap it introduces into decision-making speed, are now becoming an impediment. Since businesses need to operate in real time, all the time, the distance between the computing platform and the cloud makes a huge difference to latency. For roughly every 100 miles data travels, it picks up about 0.82 milliseconds of additional delay.
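To put that figure in perspective, here is a back-of-the-envelope sketch based on the per-100-mile number quoted above; the 1,500-mile distance is simply an assumed example, not a benchmark.

```python
# Rough estimate of the extra network latency added by distance to the cloud,
# using the ~0.82 ms per 100 miles figure quoted above.

MS_PER_100_MILES = 0.82  # approximate added delay per 100 miles of travel

def added_latency_ms(distance_miles: float) -> float:
    """One-way latency added purely by the distance the data travels."""
    return distance_miles / 100.0 * MS_PER_100_MILES

# Example: an IoT gateway ~1,500 miles from its cloud region (assumed distance).
one_way = added_latency_ms(1500)
print(f"one-way: {one_way:.1f} ms, round trip: {2 * one_way:.1f} ms")
# -> one-way: 12.3 ms, round trip: 24.6 ms; a local edge node avoids this entirely.
```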

Computing on the edge takes care of this issue. It is useful to think of AI on the edge as similar to what we human beings do: we learn from our environment to make locally optimal decisions, dynamically, from local data. Since almost all business matters now have an AI angle, the new world of business decisions increasingly runs with AI on the edge.

The advantages of running AI on Edge

One of the biggest advantages of running an AI-based solution on the edge is that there is no lag, or latency, between the storage and analysis of data and the delivery of decisions from the derived insights, making IoT and analytics more powerful than ever before. Costs are also less of a challenge, since the edge requires less connectivity for data transfer.

Data, the new gold, oil or platinum, has its idiosyncrasies. Enterprises are often unable to derive the best benefits even when they possess reams of data, because the real value lies in combining data sets from multiple devices and understanding patterns that can help predict future trends. Complex algorithms can take this data as input and process it on powerful computing devices such as GPUs, enabling enterprises to uncover patterns that are far more insightful than basic analytics, even from large data sets. When operating on the Edge, AI has the added power to provide more accurate data for various business scenarios, supporting better decision making. GPUs are now being deployed increasingly on the edge to speed up this ability of AI and reduce computing time.

So AI-enabled IoT devices that collate data from both the hardware and the software stack help operational technology deliver intelligent, real-time decisions for business. The advantage the Edge has here is its hive-like architecture of small yet powerful computing devices, which can shift the collection, storage and analysis of this data away from the cloud, keeping it light and agile.

Another big advantage is that on the Edge it is easy to implement mitigation tactics quickly, since a thorough risk analysis here helps identify all possible points of entry for attackers. AI-enabled solutions can help proactively detect and thwart cyber-attacks that occur through IoT devices, even before the attackers gain access to the cloud.
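As a loose illustration of this kind of proactive detection, the sketch below trains a simple anomaly detector, an isolation forest from scikit-learn, which is one common off-the-shelf technique rather than any specific product's method, on normal device traffic and flags outliers locally on the edge gateway. The feature names and all numbers are made-up placeholders.

```python
# Minimal sketch: flag anomalous IoT traffic locally with an isolation forest.
# Features (packets/sec, bytes/sec, distinct destination ports) and all values
# are illustrative placeholders, not real telemetry.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Pretend baseline of normal device behaviour collected on the gateway.
normal_traffic = rng.normal(loc=[50, 4000, 3], scale=[5, 300, 1], size=(500, 3))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_traffic)

# New observations: one normal-looking sample, one that resembles a port scan.
new_samples = np.array([[52, 4100, 3],
                        [55, 4200, 120]])
labels = detector.predict(new_samples)  # 1 = normal, -1 = anomaly

for sample, label in zip(new_samples, labels):
    status = "anomaly - investigate" if label == -1 else "normal"
    print(sample, status)
```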

Over the recent past, the need for fast inference with machine learning and deep learning has significantly driven efforts to increase computing power on the Edge and to make edge devices smarter and more efficient.

Ensuring Faster Processes

However, compared to a cloud infrastructure, the Edge can suffer from limited computing power, a fact that could blunt the advantages it offers. The answer lies in another new development: chip manufacturers are now helping to bridge this gap with purpose-built accelerators that speed up the inferencing process. Well-known offerings such as NVIDIA Jetson, Intel Movidius Myriad chips, and Google Edge Tensor Processing Units provide highly optimized edge inference pipelines, and enterprises will lean on this support increasingly over the near future. The combination of hardware accelerators and software platforms ensures that inference models run efficiently on the edge. Once AI inference gathers speed, the Edge will become an even more valuable place to leverage machine learning and AI.
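For a concrete sense of what running an inference model on the edge looks like, here is a minimal sketch using the TensorFlow Lite runtime, one common software path on boards like those mentioned above; the model file name and the zeroed dummy input are placeholders for an assumed setup, not a vendor-specific workflow.

```python
# Minimal sketch of on-device inference with the TensorFlow Lite runtime.
# "model.tflite" is a placeholder for whatever quantized model is deployed.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input with the shape and dtype the model expects (e.g. a camera frame).
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()                    # runs locally; no round trip to the cloud
prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)
```

On accelerator-backed boards, the same flow would typically hand the heavy layers to the accelerator through a vendor-supplied delegate or runtime rather than running them on the CPU.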

In addition, enterprise IoT applications often come with very heavy workloads that the cloud may not be able to sustain in data-heavy industries such as medical devices, manufacturing and transportation. As Edge devices with more computing power and storage are introduced, they must also be supported with reliable operational infrastructure.

Trends for the near future

The step beyond plain-vanilla IoT is Industrial IoT (IIoT), which needs to deliver all the advantages of IoT in an industrial environment. The ability to deliver high-speed, cost-effective, real-time decisions based on constant data collation will be the hallmark of IIoT as well. With data coming from almost every possible source, sophisticated machine learning models on the Edge will help make much better sense of every video frame, speech signal, and even unstructured data from sensors and AV devices.

With the widespread rollout of 5G networks, high-speed connectivity will make AI on the Edge even more exciting, offering higher data rates at much lower cost and near-zero latency. It is natural that AI on the Edge will lead to intelligent digital twins that learn from their environment and contribute to decision making locally, based on local goals.

The future is certainly much faster, smarter and on the Edge!

(The author is Principal Architect at QuEST Global. The views represented here are her own and are not endorsed by this platform.)
