Evolving dimensions of analytics in business models
Everyone understands the value of information that helps businesses make decisions faster. As technologies evolve rapidly across digital, virtual, and metaverse experiences, business leaders are focusing on strategies to address the blurring lines between the physical and virtual worlds. These technological unions produce a tremendous amount of big-data exhaust that can be analysed for insights. According to a report by Gartner, organizations that limit their data analytics strategies to mere customer acquisition could be missing the larger picture. Some of the smartest visionaries on the planet are restructuring their analytics strategies to tap into this convergence of technology and innovation. Sandhya Balakrishnan, Regional Head, US, at Brillio Analytics, shared more insights on the subject in a discussion with CXOToday.
1. As the dimensions of analytics evolve with the concept of big data, how are organizations addressing the scale conundrum?
According to a recent IDC report, global data volume will reach a staggering 175 ZB by 2025. As the supply of data accelerates, so does the demand for data and data/AI products from various stakeholders across the enterprise and its customers.
As supply and demand continue to grow, the scale conundrum puts immense pressure on delivering data/AI products that show quick RoI, are repeatable, are easy to incorporate into existing tools and processes, and reinforce change management with high believability. The scale conundrum needs to be addressed along the three dimensions below:
a. Value Realization with a shorter gestation period
This requires enterprises to build clear frameworks to measure adoption and the impact on KPIs when new data/AI products are introduced. These frameworks should help prioritize investments in the right programs, secure a leadership mandate that aligns different parts of the organization to drive adoption, and support ongoing measurement to iterate on and elevate the data/AI products. Architectural and operating-model choices that keep the long-term TCO low are other critical decisions to be made during the design phase. It is equally important to successively cut down time to market with repeatable frameworks that simplify data ingestion, consumption, and data/model/BI management, releasing bandwidth for valuable data professionals in the organization.
b. Experience-driven adoption
Big data can be overwhelming, so the experience with which it is delivered dictates the level of adoption in most cases. The simpler we make access to data, insights, and models, the greater the adoption. We need to reduce this complexity by incorporating data and insights into the processes and workflows that already exist in the organization, instead of building yet another set of applications users need to learn.
c. Trust and believability
Many high-investment data and analytics programs fail to drive usage due to skepticism about how good the data is, how current it is, how to interpret the data or the insights, or simple uncertainty about whether certain data can be used without regulatory breaches. Investments to automate data quality, data security, data observability, Responsible AI, and model observability are extremely critical to ensure big data and analytics programs are not perceived as black-box systems that add risk to the decision-making process.
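To make the idea of automated data-quality gates concrete, here is a minimal sketch in Python of the kind of checks described above (completeness and freshness) that could run before data is published to consumers. The field names, batch data, and thresholds are all hypothetical, chosen purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical automated data-quality checks: completeness and freshness
# gates of the kind a pipeline might run before publishing a dataset.

def completeness(records, field, threshold=0.95):
    """True if at least `threshold` of records have a non-null `field`."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records) >= threshold

def freshness(records, ts_field, max_age_hours=24):
    """True if the newest record is within the allowed age."""
    newest = max(r[ts_field] for r in records)
    return datetime.now() - newest <= timedelta(hours=max_age_hours)

# Illustrative batch of customer records (not real data).
batch = [
    {"customer_id": 1, "email": "a@example.com", "updated": datetime.now()},
    {"customer_id": 2, "email": None, "updated": datetime.now() - timedelta(hours=2)},
    {"customer_id": 3, "email": "c@example.com", "updated": datetime.now() - timedelta(hours=5)},
]

checks = {
    "email_completeness": completeness(batch, "email", threshold=0.5),
    "freshness_24h": freshness(batch, "updated"),
}
print(checks)
```

In practice such checks would be wired into an orchestration or observability tool so that failed gates block publication and alert the data owner, which is what keeps downstream consumers from perceiving the system as a black box.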
2. What pathways are available for enterprises in data analytics to stay relevant?
Relevance in data analytics is increasingly defined by the ease of use of data analytics systems and tools. New paradigms such as Data Marketplaces enable intuitive, high-quality workflows for the various personas who want to use data. The more we can shield users from the complexity of the technologies, governance, and processes involved in using data and analytics, the more they will use data and analytics at the decision points that matter most.
Greater leverage of AI also unlocks highly relevant and contextual information and insights from semi-structured and unstructured data. Resolving bottlenecks in the process of applying AI, such as annotation, makes it easier to scale insights from such high-potential, relevant sources.
3. How has Brillio improved customer experience? Can you share some examples?
At Brillio, we drive comprehensive digital transformation for our customers by leveraging the four superpowers of technology: AI/ML, IoT, Cloud, and Mobility. We help our clients elevate their customers' experience across the lifecycle, from prospect to active customer to the post-sale and re-activation cycle. We focus on ensuring the experiences we help our customers build are omnichannel and consistent across the lifecycle, constantly nudging the customer to derive more value and buy more.
For one of the largest online real estate listing services, we have helped transform the experience delivered to both homebuyers and agents/realtors. We built sophisticated ML-driven systems that help homebuyers discover homes easily, stay current on new updates, and gain insights into properties and neighborhoods of interest. Similarly, we helped improve the agent/realtor experience by augmenting them with homebuyers' interest, propensity, and sensitivity to buy, along with better value-based pricing of leads.
Another example is our work with one of the largest Quick Service Restaurants. Over many years we have helped them improve the customer journey across all digital properties (web, mobile, kiosk), thereby improving average order value. We also extend those insights to improve the experience in drive-throughs and mobile orders, and to constantly improve loyalty as customers' preferred ordering channels evolve.
4. What are the future trends in the data analytics space for 2022 and beyond?
Federated enablement of Data and Analytics
More organizations are adopting a federated model of Data and Analytics to keep pace with demand. Architectural patterns like data mesh and data fabric, and new paradigms such as Data Marketplaces and Data Exchanges, are fast becoming the norm to enable this. Agile models of data governance and security will become a central pillar for successful execution.
Data monetization
As more enterprises have their data and AI backbone ready, the demand to monetize data by launching new products and services that leverage this capability is accelerating. It also provides the opportunity to make expensive data and AI investments self-funding. Regulations in this space, beyond GDPR, are fast evolving, as the ethics of this strategy remain grey in many industries.
Industrialization of AI
AI is breaking out of the pilot stage to become part of mainstream technologies. Enablers are constantly evolving: domains such as ModelOps, Responsible AI, model observability, and federated learning all contribute significantly to the potential of industrializing this capability. Change management with human-in-the-loop mechanisms is also becoming highly popular. Applications for Digital Twins and the Metaverse will further evolve the enablers in this space.
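As a small illustration of the model-observability enablers mentioned above, the sketch below compares a feature's live distribution against its training baseline using a Population Stability Index (PSI) style check, a common drift signal. The bin proportions and thresholds here are hypothetical, for illustration only:

```python
import math

# Hypothetical model-observability check: Population Stability Index (PSI)
# comparing a feature's live distribution against its training baseline.

def psi(expected, actual):
    """PSI over pre-binned distributions (lists of bin proportions)."""
    score = 0.0
    for e, a in zip(expected, actual):
        e, a = max(e, 1e-6), max(a, 1e-6)  # guard against log(0)
        score += (a - e) * math.log(a / e)
    return score

# Baseline (training) vs. live bin proportions for one feature (illustrative).
baseline = [0.25, 0.25, 0.25, 0.25]
live     = [0.30, 0.30, 0.20, 0.20]

drift = psi(baseline, live)
# A common rule of thumb: PSI < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 drift.
print(f"PSI = {drift:.4f}")
```

Industrialized AI platforms typically run checks like this continuously over production traffic, triggering retraining or human review when drift crosses a threshold, which is how human-in-the-loop mechanisms stay connected to deployed models.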