Perceiving big data in the real world
It doesn’t matter which big-data conference you attend; the concluding note remains the same: big data continues to be an emerging technology, and it promises beneficial insights that no traditional analytic tool can deliver. This vague statement brings in a certain element of chaos. Are you expected to play the wait-and-watch game, and possibly lose the first-mover advantage? Or are you supposed to be a trailblazer, and perhaps find yourself at the point of no return? To address this dilemma, let’s take a look at four aspects big-data adopters ought to consider before taking the plunge.
In his blog, Lessons learned from real world BigData implementations, Michael Kopp says that while there are many drivers for big data, the two imperative ones are analytics and the technical need for speed. Kopp, a technology strategist and evangelist at Compuware, further suggests that while we get overwhelmed by the volume of data, it is in fact the insight this data provides that is valuable.
A recent RSA study suggested that big data is expected to dramatically alter almost every discipline within information security. What makes this trend more interesting is that leveraging big data to devise security solutions would itself drive the adoption of big data, as security concerns affect everyone.
The human factor
Furthermore, when you talk about analysing big data, you must consider the human element. At the end of the day, analysts are people, and people don’t like to wait for hours. So while ad hoc analytics doesn’t have to be instant, it must not take hours, either. And since business analytics is often an iterative process, taking a Map/Reduce approach makes sense.
Size does matter
It is necessary that potential big-data adopters consider the size of the big-data environment. Constant data growth means that ad hoc queries either get slower over time or need to work on samples. To remedy this, companies are writing scrubbing and categorizing MapReduce jobs, which enable analysts to work on a cleansed data set. The implication, however, is that scrubbing jobs need to be maintained continuously (as the data input changes over time), and they need to be able to keep up with the velocity of that input.
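To make the idea concrete, here is a minimal sketch of such a scrubbing-and-categorizing pass, written as plain Python map and reduce functions rather than an actual Hadoop job. The comma-separated record format and the field names (`category`, `user`, `amount`) are hypothetical, chosen purely for illustration; a real deployment would run equivalent logic inside a MapReduce framework against its own schema.

```python
# Hypothetical scrubbing MapReduce pass: the map step validates and
# normalizes raw records, the reduce step groups the cleansed records
# by category so analysts query a tidy data set instead of raw input.
from collections import defaultdict

def map_scrub(raw_line):
    """Emit (category, record) pairs; drop malformed input."""
    parts = [p.strip() for p in raw_line.split(",")]
    if len(parts) != 3 or not parts[2].isdigit():
        return []  # discard records that fail validation
    category, user, amount = parts
    return [(category.lower(), {"user": user, "amount": int(amount)})]

def reduce_group(pairs):
    """Collect cleansed records under their category key."""
    grouped = defaultdict(list)
    for key, record in pairs:
        grouped[key].append(record)
    return dict(grouped)

raw = ["Sales, alice, 120", "SALES, bob, 90", "garbage line", "ops, carol, 40"]
pairs = [p for line in raw for p in map_scrub(line)]
cleansed = reduce_group(pairs)
# "garbage line" is dropped; "Sales" and "SALES" merge under "sales"
```

Because the map step both discards bad records and normalizes keys, the downstream data set stays clean even as new input arrives, which is exactly the maintenance burden the paragraph above describes: the validation rules in `map_scrub` must evolve whenever the input does.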
Big data comes at a price
While it sounds obvious, it is something vendors don’t talk about unless they’re specifically asked. For instance, the open-source-based Hadoop requires a lot of hardware and a lot of expertise, which is hard to come by as yet. And while hardware might be cheap, the bigger the environment, the higher the operational cost.
Finally, it must be understood that for big data to deliver on its promises, it must be cost- and time-effective for those in the business who harness its value, and not just for technology experts.