"Big Data Deployments To Go Mainstream In 2016"
“2016 will be the year when big data becomes more mainstream and is adopted across various sectors to drive innovation and capture digitization opportunities,” said Neil Mendelson, Oracle’s VP of Big Data Product Management. Over the past year, his company has seen organizations increase their investment in big data technology as it has emerged as a core component of their strategy.
Mendelson also highlights the biggest trends he expects in the big data space in 2016, trends that lead him to predict that, unlike in previous years, big data will go mainstream.
1. Increased demand for data scientists. With more hypotheses to investigate, professional data scientists will see increasing demand for their skills from established companies. Banks, insurers, and credit-rating firms will turn to algorithms to price risk and guard against fraud more effectively. 2016 will see a proliferation of experiments in default risk, policy underwriting, and fraud detection as firms try to identify hotspots for algorithmic advantage faster than the competition.
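The article does not describe any specific fraud-detection algorithm, but a minimal sketch of the idea, flagging transactions that deviate sharply from the norm, might use a robust outlier test such as the median absolute deviation (the thresholds and data here are illustrative assumptions):

```python
from statistics import median

def flag_anomalies(amounts, threshold=3.5):
    """Flag amounts far from the median, using the median absolute
    deviation (MAD), which is robust to the outliers themselves."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    # 0.6745 scales MAD to be roughly comparable to a standard deviation
    return [abs(a - med) * 0.6745 / mad > threshold for a in amounts]

# Seven ordinary transactions and one suspiciously large one
amounts = [42.0, 39.5, 41.2, 40.8, 38.9, 43.1, 40.0, 950.0]
print(flag_anomalies(amounts))  # only the last amount is flagged
```

A median-based test is used here instead of mean/standard deviation because a single large fraud case can inflate the standard deviation enough to mask itself.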
2. Emergence of new management tools. New management tools will decouple and encapsulate the big data foundation technologies from higher-level data-processing needs. We will also see the emergence of dataflow programming, which offers simpler reusability of functional operators and pluggable support for statistical and machine-learning functions.
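The article names no particular dataflow framework, but the core idea, reusable functional operators composed into a pipeline, with statistical steps plugged in through the same interface, can be sketched as follows (all names here are illustrative):

```python
from functools import reduce

def pipeline(*operators):
    """Compose operators into one dataflow: each operator is a plain
    function that takes an iterable of records and yields records."""
    return lambda records: reduce(lambda data, op: op(data), operators, records)

# Reusable functional operators
def select(predicate):
    return lambda records: (r for r in records if predicate(r))

def transform(fn):
    return lambda records: (fn(r) for r in records)

# A pluggable statistical step with the same operator interface
def running_total(records):
    total = 0
    for r in records:
        total += r
        yield total

flow = pipeline(select(lambda x: x % 2 == 0),
                transform(lambda x: x * 10),
                running_total)
print(list(flow(range(6))))  # evens 0,2,4 → 0,20,40 → totals [0, 20, 60]
```

Because every stage shares one interface (iterable in, iterable out), operators can be reused across flows and new statistical functions dropped in without touching the rest of the pipeline.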
3. Data civilians operate more and more like data scientists. While complex statistics may still be limited to data scientists, data-driven decision-making will not be. In 2016, simpler big data discovery tools will let business analysts shop for datasets in enterprise Hadoop clusters, reshape them into new mashup combinations, and even analyze them with exploratory machine-learning techniques. This will both broaden self-service access to big data and yield richer hypotheses and experiments that drive the next level of innovation.
4. DIY gives way to solutions. Early big data adopters had no choice but to build their own big data clusters and environments. But building, managing, and maintaining these bespoke systems on Hadoop, Spark, and other emerging technologies is costly and time-consuming. In 2016, these technologies will mature and become more mainstream thanks to cloud services and appliances with pre-configured automation and standardization.
5. Data virtualization becomes a reality. Companies will not only capture a greater variety of data in 2016, they will use it in a wider variety of algorithms, analytics, and apps. Their focus will shift from using a single technology, such as NoSQL, Hadoop, relational, spatial, or graph, to increasing reliance on data virtualization. Users and applications will connect to virtualized data via SQL, REST, and scripting languages. Successful data virtualization technology will offer performance equal to that of native methods, complete backward compatibility, and security.
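The article does not tie data virtualization to any product, but the pattern it describes, one query interface in front of heterogeneous stores, can be sketched with a small facade (the catalog, table names, and backends below are all hypothetical):

```python
class VirtualCatalog:
    """Minimal sketch: a single query interface over heterogeneous
    sources. Each source registers a fetch function; callers never see
    which backend (relational, NoSQL, REST, ...) holds the data."""

    def __init__(self):
        self._sources = {}

    def register(self, table, fetch_fn):
        self._sources[table] = fetch_fn

    def query(self, table, where=lambda row: True):
        rows = self._sources[table]()  # delegate to the backend
        return [row for row in rows if where(row)]

catalog = VirtualCatalog()
# Hypothetical backends: one list-backed, one computed on demand
catalog.register("orders", lambda: [{"id": 1, "amt": 250}, {"id": 2, "amt": 90}])
catalog.register("scores", lambda: [{"id": i, "score": i * 10} for i in range(3)])

print(catalog.query("orders", where=lambda r: r["amt"] > 100))
```

Real data virtualization layers additionally push predicates down to the backends for native-level performance; this sketch only shows the uniform-access idea.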
6. Big data gives AI something to think about. 2016 will be the year when Artificial Intelligence (AI) technologies such as Machine Learning (ML), Natural Language Processing (NLP), and Property Graphs (PG) are applied to ordinary data-processing challenges. This shift will include widespread application of these technologies in IT tools that support applications, real-time analytics, and data science.
7. Data swamps try provenance to clear things up. Data lineage used to be a nice-to-have capability because so much of the data feeding corporate dashboards came from trusted data warehouses. But in the big data era, data lineage is a must-have because customers are mashing up company data with third-party data sets. Some of these new combinations will incorporate high-quality, vendor-verified data. Others will use data that isn't officially perfect but is good enough for prototyping. When surprisingly valuable findings come from these opportunistic explorations, managers will consult the lineage to know how much work is required to raise the data to production quality.
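The article does not prescribe a lineage mechanism; a minimal sketch of the idea is to record, alongside every derived dataset, which inputs and which transformation produced it (the record schema below is an assumption for illustration):

```python
from datetime import datetime, timezone

lineage = []  # append-only provenance log

def derive(name, inputs, transform_desc, fn, *datasets):
    """Apply fn to the datasets and record where the result came from."""
    result = fn(*datasets)
    lineage.append({
        "output": name,
        "inputs": inputs,            # names of upstream datasets
        "transform": transform_desc,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return result

internal = [100, 120, 95]
third_party = [1.1, 0.9, 1.05]       # e.g. unverified vendor adjustments
mashup = derive("adjusted_sales", ["internal", "third_party"],
                "elementwise multiply",
                lambda a, b: [x * y for x, y in zip(a, b)],
                internal, third_party)
print(lineage[-1]["inputs"])  # shows the mashup's upstream sources
```

When a prototype mashup turns out to be valuable, the log answers the manager's question above: which third-party inputs must be verified before the result can be promoted to production.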
8. Emergence of big data cloud services with the help of IoT. Highly secure IoT cloud services will help manufacturers create new products that safely take action on analyzed data without human intervention.
9. Data politics drives hybrid cloud. Knowing where data comes from, not just which sensor or system but within which nation's borders, will make it easier for governments to enforce national data policies. Multinational corporations moving to the cloud will be caught between competing interests. Increasingly, global companies will move to hybrid cloud deployments, with machines in regional data centers that act like a local wisp of a larger cloud service, honoring both the drive for cost reduction and the need for regulatory compliance.
10. New security classification systems balance protection with access. Increasing consumer awareness of the ways data can be collected, shared, stored, and stolen will amplify calls for regulatory protection of personal information. The industry can expect to see politicians, academics, and columnists grappling with boundaries and ethics. Companies will increase their use of classification systems that categorize documents and data into groups with pre-defined policies for access, redaction, and masking. The continuous threat of ever more sophisticated hackers will prompt companies to both tighten security and audit access to and use of data.
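The article leaves the classification scheme abstract; one way to picture "pre-defined policies for access, redaction and masking" is a policy table keyed by classification level, applied at release time (the levels, policy fields, and masking rule here are illustrative assumptions):

```python
import re

# Pre-defined policy per classification level (assumed labels)
POLICIES = {
    "public":       {"mask": False},
    "confidential": {"mask": True},  # mask personal identifiers
}

def mask_email(text):
    """Redact the local part of any email address in the text."""
    return re.sub(r"[\w.+-]+@", "***@", text)

def release(document, level):
    """Apply the pre-defined policy for the document's classification."""
    policy = POLICIES[level]
    return mask_email(document) if policy["mask"] else document

doc = "Contact jane.doe@example.com for the audit report."
print(release(doc, "confidential"))  # → "Contact ***@example.com for the audit report."
print(release(doc, "public"))        # unchanged
```

Centralizing the rules in one policy table is what makes the approach auditable: tightening security or adding a new classification level means editing the table, not every consumer of the data.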