
Adapting to Disruption: Data-Driven Strategies in Today’s Changing Landscape

CXOToday has engaged in an exclusive interview with Varun Babbar, Managing Director, India and SAARC, Qlik.

 

Q1: In light of the current macroeconomic conditions and disruptive events, how are data-driven businesses strategically adapting to navigate the changing landscape?

Today, nearly 7 out of 10 global tech leaders are concerned about the growing technology investment required to remain competitive. Yet most are also looking to increase their data efforts. Market surveys indicate that data integration, analytics, automation, API management, and AI are all top technologies CXOs rely on for crisis management. Data-driven businesses are leveraging emerging streams of data and the power of artificial intelligence (AI) to address growth challenges while confronting ongoing changes and resource constraints. This year it has been crucial to focus on two areas: honing decision accuracy at speed and scale to better react and “pre-act” to unexpected events; and achieving connected governance by accessing, combining, and overseeing distributed data sets to handle a fragmented world.

Q2: What are some of the trends in data and data integration?

The technology needs of the future are shaped by the dynamic changes in today’s geopolitical, social, and economic landscape. As we look ahead, the importance of having trusted, analytics-ready data, and making decisions at the speed of business will be more crucial than ever. Here are some prominent trends we observe:

  1. Data-to-action pipelines to mitigate supply-chain disruptions: solving problems before they occur. Although many organizations have the infrastructure to support real-time decision-making, the discipline has yet to reach its potential. It is crucial to move quickly in response to market changes, build contingency plans for potential crises, and capitalize on the opportunities presented by data processing at the edge. Supply-chain disruption has driven the need for real-time information, but the same need also appears at a much smaller scale. For example, Netflix gives recommendations based on viewing preferences. This might seem simple, but a great deal of real-time data goes into achieving that accuracy. It is no longer enough to make decisions in a day, week, or month; some decisions must happen now, or they become irrelevant.
  2. Data storytelling can compel action – moving beyond just static charts to augmented analytics embedded where people work. Fortunately, you don’t have to get all the data to all the people all the time. And not every insight has to be arrived at through user exploration. Many can be more prescriptive and recommendation-oriented, delivered straight from the data. Data storytelling has been touted as the way to get data to make sense to users; stories can reach people emotionally and compel them to act. But data storytelling needs to be much more than adding charts to infographics or PowerPoints. It must be connected with action.
  3. “X fabric” holds connected governance together. The discussion in recent years has been about data fabric (as well as hubs and mesh), an important methodology that connects distributed data sets through semantic models. But for connected governance, we need more than that. In a world with millions of builders, we need other fabrics, or “X fabrics.” Being able to reuse data and analytic assets is critical, spanning models, scripts, and analytics content. As businesses move towards the ultimate goal of universal metadata that can be accessed by everyone in the business, the “X fabric” concept is emerging as a way to ensure governance for both the data and analytics processes that underpin this architecture. It is important to remember that a data fabric is a technology pattern, not something tied to a single vendor. It requires technology independence to work with existing and future infrastructure. Implemented correctly, it fulfils several essential requirements, such as unlocking all data with minimal latency.
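The data-to-action idea in the first trend can be made concrete with a small sketch: a rule that reacts to each event in a stream the moment a condition is met, instead of waiting for a daily batch report. The function name, stream shape, and reorder threshold below are illustrative assumptions, not Qlik features.

```python
def check_stock(events, reorder_level=20):
    """Scan a stream of (sku, on_hand) events and emit a reorder action
    as soon as inventory drops below the reorder level."""
    actions = []
    for sku, on_hand in events:
        if on_hand < reorder_level:
            actions.append(f"REORDER {sku} (on hand: {on_hand})")
    return actions

# A tiny simulated event stream: the second and third readings trigger action.
stream = [("widget-a", 50), ("widget-b", 12), ("widget-a", 18)]
print(check_stock(stream))
```

In a real deployment this rule would sit on a streaming platform and fire per event; the point is that the decision happens as the data arrives, not after the fact.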

Q3: How is data governance playing a key role in unlocking the full potential of enterprise data management strategy?

To ensure the effectiveness of data-driven decision-making and action, organizations also need to prioritize data governance. Data governance involves the management of data availability, usability, consistency, integrity, and security, among other factors. A data fabric can be an effective tool for achieving effective data governance. A data fabric is a unified data architecture that enables organizations to connect, manage, and govern their data assets across multiple sources, locations, and formats. It provides a flexible and scalable framework for data integration, data management, and data analytics, allowing organizations to make better decisions based on trusted data.

With a data fabric, organizations can achieve a holistic view of their data assets, gain greater control over their data, and ensure that data is used in accordance with compliance and regulatory requirements. By leveraging a data fabric for governance, organizations can ensure that their data-driven initiatives are successful and that they achieve their strategic goals.

Q4: How can the integration of AI technology into data pipelines enhance data management and analysis, freeing up time for skilled employees?

The integration of AI technology into data pipelines can enhance data management and analysis, freeing up time for skilled employees. By incorporating AI into data management, routine tasks in data engineering can be automated, transforming the traditional 80/20 distribution between data preparation and analysis. AI can handle mundane data preparation tasks such as anomaly detection, reporting, self-healing capabilities, just-in-time deployment, and identification of sensitive attributes like personally identifiable information (PII). This automation allows data engineers and scientists to focus on more impactful work, leveraging their skills in synthesizing complex problems and deriving insights from data. By offloading repetitive tasks to AI, skilled employees have more time for in-depth analysis and decision-making, leading to more efficient and effective data-driven strategies.
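Two of the routine tasks mentioned above, anomaly detection and spotting likely PII, can be illustrated with simple stand-ins. The email regex and z-score rule below are generic sketches of the idea, not how Qlik's platform implements these capabilities.

```python
import re
import statistics

# A crude pattern for one common PII type: email addresses.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def find_pii(records):
    """Return string field values that look like email addresses."""
    return [v for r in records for v in r.values()
            if isinstance(v, str) and EMAIL_RE.search(v)]

def find_anomalies(values, z=3.0):
    """Flag values more than z standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [v for v in values if stdev and abs(v - mean) / stdev > z]
```

Automating even checks this simple across every incoming table is exactly the kind of repetitive work that shifts the 80/20 split back toward analysis.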

Q5: Generative AI has been described as a transformative evolution in the market, with the potential to augment analysis and processes by incorporating relevant external data. How can organizations leverage this potential to enhance their data and analytics capabilities?

Generative AI became popular so quickly because it empowers the end user: right now, anyone can log on to ChatGPT and start using it, which is a first for an AI application. You can start talking to it, and it will understand what you want to do and offer a response. This user-centric approach eliminates barriers to entry and enables anyone to engage with AI effortlessly. Generative AI is an emerging technology best suited for use cases around content generation and summarization, or for extending the capabilities of traditional chatbots. At Qlik, we leverage the strengths of both generative and traditional AI to deliver measurable value to organizations across industries. Qlik’s new OpenAI connectors have been built on its long history of enabling customers with AI, ML, and NLP capabilities directly within the platform. Here are a few use cases:

  • Natural Language Insights

Users can augment their analytics with natural language and incorporate third-party data into existing data models. They can ask a specific question by making a selection in their data and receive the information directly back in their sheet. Further, it can be used to augment existing data and KPIs with a narrative summary.

  • Sentiment Analysis

Another great use case is sentiment analysis, a process to determine if a piece of text, like a sentence or a social media post, expresses positive, negative, or neutral feelings. With our analytics connector, users can enrich text-based data sets like product reviews, surveys, or service tickets using OpenAI to generate sentiment analysis.
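To show what sentiment analysis produces, here is a toy lexicon-based classifier. This is purely illustrative: Qlik's connector delegates the classification step to OpenAI rather than using a word list like the one below.

```python
# Tiny hand-picked word lists; a real system would use a model, not a lexicon.
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"bad", "terrible", "hate", "poor"}

def sentiment(text):
    """Classify text as positive, negative, or neutral by word overlap."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

reviews = ["Great product, love it", "Terrible support", "Arrived on time"]
print([sentiment(r) for r in reviews])  # → ['positive', 'negative', 'neutral']
```

Applied to columns of product reviews, surveys, or service tickets, the same per-row labeling turns free text into a field that can be aggregated and charted.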

  • Real-Time Question & Answer

The ability to ask questions and get answers in real time is another great use case. Users can ask any question they want and get contextually relevant content with the most up-to-date responses available from OpenAI. Combined with small subsets of data delivered by the Qlik engine in real time, this greatly enriches the context and value of internal analytics.
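The pattern of combining a small data subset with a user's question can be sketched as simple prompt assembly. The prompt shape and function name below are assumptions for illustration only; they are not Qlik's actual payload or API.

```python
def build_prompt(question, rows):
    """Embed a small data subset as context ahead of the user's question."""
    context = "\n".join(
        ", ".join(f"{k}={v}" for k, v in row.items()) for row in rows
    )
    return f"Given this data:\n{context}\n\nAnswer: {question}"

rows = [{"region": "APAC", "sales": 120}, {"region": "EMEA", "sales": 95}]
print(build_prompt("Which region sold the most?", rows))
```

Because only a small, selection-driven slice of data is sent, the model's answer is grounded in the user's current analytics context rather than in generic training data.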
