
Deep Learning Shows Promising Growth Amid Challenges

Deep learning, a subset of machine learning and artificial intelligence (AI), has been around for a while, but it became an overnight “sensation” in 2016, when Google’s AI program AlphaGo beat human grandmaster Lee Sedol at the famed game of Go. Since then, deep learning training methods have become widely acknowledged for “humanizing” machines. Many of the advanced automation capabilities now found in enterprise AI platforms stem from the rapid growth of machine learning and deep learning technologies, and researchers predict that deep learning will provide formidable momentum for the adoption and growth of AI, even though most of these experiments are still in their infancy.

For enterprises, deep learning is a powerful tool for gaining actionable insights and enabling automated responses to a flood of data, especially unstructured data, from all kinds of devices, the Internet of Things (IoT), social media and, of course, corporate data systems. Deep learning works remarkably well with unstructured data such as images, sound and time series of events, and is often seen as an “upgrade” to traditional data analysis techniques.


Growth momentum in APAC

A recent study projects that the worldwide deep learning market will grow by US$31 billion between 2020 and 2025, at a compound annual growth rate of 41%, with the Asia-Pacific region expected to see exceptional growth. Data and analytics company GlobalData predicts that, in the Asia-Pacific region, deep learning will become part of mainstream deployments over the next few years, bringing notable changes to businesses.

GlobalData estimates that the APAC region will account for approximately 30% of global AI platform revenue (around US$97.5bn) by 2024. That share is expected to rise significantly, given the incumbent technology companies and the increasing number of start-ups specializing in this field.

Technological enhancements supporting higher computational capabilities (CPU and GPU), along with the volume of data, which is predicted to grow many times over as the connected-devices ecosystem expands, are also expected to contribute to this growth.

Digital assistants like Cortana, Siri, Google Now and Alexa leverage deep learning to some extent for natural language processing (NLP) as well as speech recognition. Other key usage areas of deep learning include multilingual chatbots, voice and image recognition, data processing, surveillance, fraud detection and diagnostics.


In APAC, deep learning is increasingly being adopted for various applications, driven by product launches and technical enhancements by regional technology vendors.

For instance, China-based SenseTime leverages its deep learning platform to deliver image recognition, intelligent video analytics and medical image recognition to its customers, including through its facial recognition technology DeepID. Similarly, DeepSight AI Labs, an India-based start-up that also operates in the US, uses deep learning in its SuperSecure Platform, a smart retrofit video surveillance solution that works with any CCTV system to provide contextualized AI detection of objects and behaviors.

Australia-based Daisee, too, offers an algorithm called Lisa, which leverages a speech-to-text engine to identify key conversational elements, determine their meaning and derive their context. Similarly, Cognitive Software Group is using deep learning and machine learning to tag unstructured data and enhance natural language understanding.


In the enterprise world, deep learning algorithms are applied to customer data in CRM systems, social media and other online sources to better segment clients, predict churn and detect fraud. The financial industry relies increasingly on deep learning to deliver stock price predictions and execute trades at the right time. In healthcare, researchers are using deep learning networks to explore repurposing known and tested drugs against new diseases, shortening the time before those drugs reach the general public.

Government institutions are also turning to deep learning for real-time insights into metrics such as food production and energy infrastructure by analyzing satellite imagery.

Cheaper computing, combined with large datasets and open-source frameworks like TensorFlow and Keras, has brought deep learning into the enterprise, unlocking tremendous pools of value. Google, one of the companies at the forefront of the deep learning revolution, includes deep learning features in its products (for example, auto-complete suggestions in Gmail, image search in Photos and language translation). Google also uses deep learning to optimize its business internally, from improving search results to reducing the energy consumption of its data centers, among thousands of other projects. Other technology giants such as Amazon, Facebook, Microsoft, Baidu and Alibaba are similarly embedding deep learning in their products, and enterprise players continue to find useful applications for the technology.
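To give a sense of how these open-source frameworks lowered the barrier to entry, the sketch below defines and trains a small image classifier in a few lines of Keras. The dataset and architecture are illustrative assumptions, not examples drawn from any of the companies mentioned above.

```python
# A minimal sketch of an enterprise-style deep learning workflow in TensorFlow/Keras.
# Model shape and dataset (MNIST handwritten digits) are illustrative choices only.
import tensorflow as tf

# Load a small, freely available image dataset and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Define a simple feed-forward network for 10-class classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train and evaluate on commodity hardware.
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```

A handful of lines like these, runnable on an ordinary laptop or a cloud GPU, is what turned deep learning from a research specialty into an enterprise tool.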

“The market is proactively deploying deep learning-based AI solutions to bring increased offline automation, safety and security to businesses and their assets. In addition, AI hardware optimization with increased computing speed on small devices will result in the cost reduction and drive deep learning adoption across the region,” says Sunil Kumar Verma, Lead ICT analyst at GlobalData.

Challenges remain…

As new use cases for deep learning are uncovered, so are the challenges that need to be addressed, the primary one being the need for lots of data.

“How much data is actually enough to train my algorithm?” is one of the questions most frequently asked by anyone who works with deep learning. Unfortunately, there is no straightforward answer, but as a rule of thumb data scientists say that the more powerful the abstraction you want, the more data is required.

Experts believe that, in the case of neural networks, the amount of data needed for training is much higher than for other machine learning algorithms. The reason is that the task of a deep learning algorithm is twofold: first it needs to learn about the domain, and only then can it solve the problem. When training begins, the algorithm starts from scratch, and to learn about a given domain it needs a huge number of parameters to tune and “play around with”.
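To make that concrete, the short sketch below counts the trainable parameters of a modest, fully connected network in Keras. The layer sizes are illustrative assumptions, but they show how quickly the number of weights that must be learned from data adds up.

```python
# A rough illustration of why deep networks are data-hungry: even a modest
# fully connected network has hundreds of thousands of trainable parameters,
# each of which has to be fitted from the training data.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                 # e.g. a flattened 28x28 image
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# (784*512 + 512) + (512*256 + 256) + (256*10 + 10) = 535,818 weights and biases.
print(model.count_params())
```

A classical algorithm with a handful of tunable coefficients can get by on far less data; half a million parameters cannot.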

Another widely discussed limitation of deep learning is that we do not understand how a neural network arrives at a particular solution. The neural networks at the core of deep learning are black boxes: it is practically impossible to look inside one and see how it works. Much like in a human brain, the reasoning of a neural network is embedded in the behavior of thousands of simulated neurons, arranged into dozens or even hundreds of intricately interconnected layers.
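The black-box point is easy to demonstrate with a toy model, sketched below under the same Keras assumptions as the earlier examples: a network’s “knowledge” can be printed out, but it is only large arrays of numbers, which do not explain the reasoning behind any individual prediction.

```python
# A hypothetical stand-in model: its weights can be inspected, but the raw
# arrays say nothing interpretable about how a given decision was reached.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

for layer in model.layers:
    kernel, bias = layer.get_weights()
    # Each layer is just a matrix of weights and a bias vector, e.g. (32, 64) and (64,).
    print(layer.name, kernel.shape, bias.shape)
```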

Even though neural networks produce great results, the lack of transparency in their “thinking” process makes it hard to predict when failures might occur. The same argument renders them unsuitable for domains where verification of the process is important. Consider Deep Patient, a deep learning program applied to the records of more than 700,000 patients at Mount Sinai Hospital in New York. After a long training period, Deep Patient was able to detect certain illnesses better than human doctors.

On one hand, this is good news. On the other, if a tool like Deep Patient is actually going to be helpful to medical personnel, it needs to provide the reasoning behind its predictions, both to give confidence in their accuracy and to justify a change in someone’s treatment. Without that justification, it is difficult to gain the trust of patients or to learn why any mistakes in diagnosis were made.

Lots of potential

So, while deep learning has a lot of potential, it needs to overcome a few challenges before becoming a more versatile tool. Interest and enthusiasm for the field are, however, growing, and we can already see remarkable real-world applications of this technology.

As Verma observes, “Even in its infancy, deep learning is proving to be a stepping stone for technology evolution. However, with the lack of skilled professionals and the fact that only a handful of technology companies are focusing on investing, hiring and training their workforce specifically for Deep Learning, there would be some initial roadblocks before witnessing success in adoption rates.”


Sohini Bagchi
Sohini Bagchi is Editor at CXOToday, a published author and a storyteller. She can be reached at [email protected]