Big Data From IoT May Pose Challenges For Data Centers
Internet of Things (IoT) deployments will generate large quantities of big data that must be processed and analyzed in real time, bringing a new set of challenges to CIOs and data centre providers, says Gartner in a recent report. The report forecasts that the IoT will comprise 26 billion installed units by 2020, by which time IoT product and service suppliers will be generating incremental revenue exceeding $300 billion.
Fabrizio Biscotti, research director at Gartner, sees a potentially transformational effect on the data center market, its customers, technology providers and business models. He warns that processing large quantities of IoT data in real time will account for a growing proportion of data center workloads, leaving providers facing new security, capacity and analytics challenges.
The IoT connects remote assets and provides a data stream between the asset and centralized management systems. Those assets can then be integrated into new and existing organizational processes to provide information on status, location, functionality, and so on. Real-time information enables a more accurate understanding of status, and it enhances utilization and productivity through optimized usage and more accurate decision support. Business analytics applied to the data feeds from the IoT environment yield insights into business requirements and help predict fluctuations in IoT-enriched data and information.
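To make the pattern concrete, the flow described above can be sketched in a few lines: remote assets report telemetry records to a centralized system, which aggregates them into decision-support figures such as utilization. This is a minimal illustration, not part of the Gartner report; the record fields (`asset`, `status`, `utilization`) and the `fleet_summary` function are assumptions chosen for the example.

```python
import statistics

# Hypothetical telemetry records streamed from remote assets
# (field names are assumptions for illustration).
telemetry = [
    {"asset": "pump-01", "status": "running", "utilization": 0.82},
    {"asset": "pump-02", "status": "idle",    "utilization": 0.10},
    {"asset": "pump-03", "status": "running", "utilization": 0.74},
]

def fleet_summary(records):
    """Aggregate per-asset status into simple decision-support figures."""
    running = [r for r in records if r["status"] == "running"]
    return {
        "running_count": len(running),
        "mean_utilization": statistics.mean(r["utilization"] for r in records),
    }
```

In a real deployment the aggregation would run continuously against the incoming stream rather than over an in-memory list, which is precisely where the real-time workload pressure on data centers comes from.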
“The enormous number of devices, coupled with the sheer volume, velocity and structure of IoT data, creates challenges, particularly in the areas of security, data, storage management, servers and the data center network, as real-time business processes are at stake,” said Joe Skorupa, vice president and distinguished analyst at Gartner.
“Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT,” added Skorupa.
For example, a data center infrastructure management (DCIM) system can align IT and operational technology (OT) standards and communications protocols so that the facility can proactively process IoT data points according to priorities and business needs. Such comprehensive scenarios will drive design and architecture changes, pushing data centers toward virtualization as well as cloud services.
Biscotti believes that, consequently, organizations will have to automate selective backup of the data they believe will be valuable. This sifting and sorting will itself generate additional big data processing loads, consuming further processing, storage and network resources that will have to be managed.
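The selective-backup idea above amounts to a filtering rule applied before data is persisted. The sketch below shows one hypothetical such rule: keep a reading if it is flagged anomalous or belongs to a whitelisted metric. The `Reading` structure and the retention criteria are assumptions for illustration, not anything prescribed by the report.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """A single IoT data point (fields are assumptions for this sketch)."""
    device_id: str
    metric: str
    value: float
    anomalous: bool

def select_for_backup(readings, keep_metrics=frozenset({"temperature"})):
    """Return only the readings deemed valuable enough to back up:
    anomalies, plus any metric on the retention whitelist."""
    return [r for r in readings if r.anomalous or r.metric in keep_metrics]
```

Note that even this trivial rule has to touch every incoming record, which is why the sifting step itself adds measurable processing load at scale.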