Amid the geopolitical and economic uncertainties, and the phenomenal rise of AI and data in the digital economy, how should organizations in Asia Pacific leverage these technological trends to survive and succeed?
It is the worst of times; it is the best of times… Volatility marks the current global economy, even as digital transformation continues its relentless climb.
AI and data trends are a huge contributing factor. But with growing scrutiny on AI transparency and data governance, there’s real pressure on enterprises to make sure their GenAI models are built on reliable, real-time data.
Recent high-profile AI missteps have underscored how weak data foundations contribute to such failures, and why organizations in Asia Pacific need to keep driving organizational change and digital transformation to leverage AI and data.
DigiconAsia discussed the challenges, trends and developments related to AI transparency and data governance – as well as green technologies and open source – in this Q&A with Andrew Sellers, Head of Technology Strategy, Confluent.
What are the key technology trends you’ve observed impacting organizations this year?
Sellers: 2025 has seen rapid evolution in technology – spanning AI’s pervasive implementation across industries, growing interest in green technology investments, and an increased focus on improving automation capabilities.
Generative AI remains a staple for smart, evolving businesses that wish to leverage its intuitive abilities. In particular, agentic AI has emerged as a top trend, with organizations looking to lean on AI agents that react to their environments to automate business functions.
This also ramps up the importance of real-time data. GenAI’s successful implementation into business operations hinges on the quality of the data that trains it and later contextualizes it at inference time. On their own, traditional LLMs are ill-suited to business applications because they are trained only on historical data. AI models will increasingly rely on seamless data movement across workloads, driven by customers’ expectations of faster, smarter experiences and by regulatory mandates.
Data streaming, which produces and consumes data in real time across completely disparate parts of the organization, has played a pivotal role in AI for good reason. Data streaming platforms allow organizations to build a shared source of real-time truth for sophisticated model building and fine-tuning, and to bring real-time context to query time, as organizations increasingly rely on live data streams for on-demand predictions in fraud detection, customer personalization, and modernization efforts.
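To make that pattern concrete, here is a minimal sketch of the produce/consume model described above, using the confluent-kafka Python client. The broker address, topic name, and event fields are illustrative assumptions, not details from the interview.

```python
# A minimal sketch of real-time produce/consume with Apache Kafka via the
# confluent-kafka Python client. Broker, topic, and fields are hypothetical.
import json
from confluent_kafka import Producer, Consumer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker

# Publish a payment event the moment it happens, rather than waiting for a
# nightly batch ETL job.
event = {"account_id": "A-123", "amount": 250.0, "currency": "SGD"}
producer.produce("transactions", key=event["account_id"], value=json.dumps(event))
producer.flush()

# Elsewhere in the organization, a fraud-detection service consumes the same
# stream and can score each event on arrival.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "fraud-detection",   # each service reads independently
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["transactions"])

msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    txn = json.loads(msg.value())
    print(f"scoring transaction {txn['account_id']} for {txn['amount']}")
consumer.close()
```

The point of the pattern is that every downstream service sees an event within milliseconds of it being produced, which is what lets fraud scoring or personalization run on live context rather than yesterday’s batch.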
Interest around developing green technologies has risen significantly, with private green investments in South-east Asia reaching US$8 billion in 2024, a 43% increase from the previous year.
In working towards more sustainable operations, we envision that new hardware, cloud services, skills, algorithms and applications will be required. In the short term, these efforts will likely push energy prices up as demand for green energy increases.
How can APAC organizations survive and thrive in the current digital economy that is fraught with uncertainties?
Sellers: In today’s volatile digital economy, high-velocity data is your competitive differentiator. Organizations can’t just seek to survive in an evolving tech landscape – they should be seeking to set the pace for the next wave of innovation across APAC.
Some areas of consideration to increase organizational resiliency:
- Leverage data streaming for AI and digital transformation:
Data streaming platforms (DSPs) are key enablers for AI and machine learning initiatives, providing the real-time, contextual data needed for effective AI models. Intuitive, real-time data also allows businesses to automate, personalize, and scale services efficiently.
Confluent’s Data Streaming Report (DSR) 2025 found that Singapore businesses expect DSPs to deliver direct benefits through better information flow, such as product/service innovation (52%), increased revenue and/or growth (48%), and improved customer satisfaction (45%).
- Break down data silos
Technology like data streaming helps unify data across business units and geographies, addressing the challenge of fragmented data that is common in APAC’s diverse markets.
APAC organizations are increasingly adopting “data products”: reusable, real-time data sets that empower teams to innovate and share insights confidently (see the sketch after this list). 98% of respondents say data products built on streaming platforms enable better data sharing and drive digital transformation.
- Overcome organizational silos through upskilling and education
In DSR 2025, Singapore IT leaders also pointed to insufficient skills and expertise in managing AI workflows and projects as a frequent challenge, one that impedes collaboration across teams. Addressing these gaps through upskilling, cross-team collaboration, and a culture of data-driven decision-making is essential for sustained competitiveness.
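As a rough illustration of the “data products” idea above, here is a minimal Python sketch of a versioned record contract that a producing team validates against before publishing, so consuming teams can rely on its shape. The record type, field names, and helper function are hypothetical, not from the report.

```python
# A minimal sketch of a "data product": a named, versioned record contract
# validated before publishing. All names here are illustrative assumptions.
from dataclasses import dataclass, asdict
import json


@dataclass(frozen=True)
class CustomerOrderV1:
    """Contract for the hypothetical 'customer-orders' data product, v1."""
    order_id: str
    customer_id: str
    total: float
    currency: str

    def validate(self) -> None:
        if self.total < 0:
            raise ValueError("total must be non-negative")
        if len(self.currency) != 3:
            raise ValueError("currency must be an ISO-4217 code")


def publish(record: CustomerOrderV1) -> str:
    """Validate, then serialize for the stream; transport is out of scope here."""
    record.validate()
    return json.dumps({"schema": "customer-orders/v1", **asdict(record)})


# Any consuming team can now rely on every record honouring the v1 contract.
print(publish(CustomerOrderV1("O-1", "C-9", 42.5, "SGD")))
```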
Overall, it’s important to remember that technology is only part of the equation: real value comes when it’s paired with the right organizational mindset and a strategy for change. This entails organizations rethinking how they work with data across teams, systems, and processes, and enabling a culture where data is treated as a shared, reusable asset. Without this shift, even the best technology will fall short.
With many organizations embarking on AI projects, the importance of data governance cannot be overstated. What are the challenges they face and what steps should they take to ensure transparency, good governance and compliance?
Sellers: Data governance isn’t the responsibility of any single department – it requires close collaboration across the entire organization, from the top down to the workers who rely on data daily.
Some challenges faced in employing data governance effectively include:
- Cross-functional alignment: Ensuring teams across the organization are aligned on how data is used, and that security is upheld in data usage. When new governance processes are introduced, it is critical to communicate the changes effectively.
- Data silos vs data as a product: Keeping a flat hierarchy across the board while driving standardized data usage. To break the chain of data silos, organizations need to empower data owners to act as stewards, promoting data as both an asset and a product that benefits the wider organization.
- Deferring security and compliance: Many important projects are delayed because the security layer is added too late. Deferring security implementation creates high costs that erode project value and make compliance so expensive that teams frequently ask for exceptions.
Ultimately, this dials back to the fundamental truth of data ingestion and analysis: organizations need to build trust in their AI systems. While frameworks and standards are emerging, this remains a fragmented area with patchwork solutions.
The call is to employ the ABCs of governance & compliance:
- Accountability of leadership: The organization needs to maintain executive oversight, which is critical to ensuring adoption across business lines. Where necessary, there should be designated data stewards and owners for specific data assets.
- Balancing data quality & security: The first step is creating clear, documented data governance policies that cover data quality, access, usage, and security, while implementing regular security audits to protect sensitive data. In the same breath, to build a framework around upholding data integrity, it helps to identify high-value use cases and the data needed to implement them.
- Certifying metrics of data governance: This includes outlining metrics to quantify and assess the effectiveness of data governance, while leveraging data governance tools for cataloging, lineage tracking, quality monitoring, and metadata management.
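As one hedged illustration of the “Certifying metrics” point, the following Python sketch scores hypothetical catalog entries on ownership, documentation, and quality-check coverage. The catalog structure, equal weighting, and 0.8 threshold are assumptions for illustration, not a prescribed methodology.

```python
# A minimal sketch of quantifying data-governance effectiveness by scoring
# catalog entries. Entries, weights, and threshold are illustrative.
from typing import Dict, List

catalog: List[Dict] = [  # hypothetical catalog entries
    {"name": "customer-orders", "owner": "sales-data", "documented": True,
     "quality_checks_passed": 48, "quality_checks_total": 50},
    {"name": "clickstream", "owner": None, "documented": False,
     "quality_checks_passed": 10, "quality_checks_total": 40},
]


def governance_score(entry: Dict) -> float:
    """Average of three signals: owned, documented, quality-check pass rate."""
    owned = 1.0 if entry["owner"] else 0.0
    documented = 1.0 if entry["documented"] else 0.0
    quality = entry["quality_checks_passed"] / max(entry["quality_checks_total"], 1)
    return (owned + documented + quality) / 3


for entry in catalog:
    score = governance_score(entry)
    flag = "OK" if score >= 0.8 else "NEEDS STEWARD ATTENTION"
    print(f"{entry['name']}: {score:.2f} {flag}")
```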
How does collaboration in the open-source community drive innovation and business growth, especially across borders?
Sellers: We are seeing more successful corporate champions (e.g. Confluent, MongoDB, Redis) of critical open-source projects, which professionalize and resource these efforts while keeping some core of the project independent from corporate interests, so that others are still motivated to contribute. The implication is that open source needs a business model, but it is still possible for everyone to win.
Open source is a framework for collaboration, particularly across borders. Companies otherwise rarely work together across geopolitical boundaries, yet we see strong examples of companies supporting open-source projects alongside competitors in opposing intergovernmental spheres. Consider Apache Flink, which has core contributors from the United States, Germany, and China.
Companies donate IP to avoid maintaining forks and merging back new community features (an expensive process), and to foster innovation. Growth comes from developers being able to focus on differentiated and specialized business applications, rather than on more generalizable software and infrastructure.
Within Confluent’s developer community, this is bolstered through active community engagement, with members tapping into each other’s knowledge and exchanging best practices across borders.
By leveraging the collective intelligence of the open-source Kafka and Flink communities, Confluent stays at the forefront of real-time data streaming innovation. Community contributions help shape the roadmap and ensure the platform meets diverse, real-world needs.
How is the role of developers changing as a result of current DevOps and AIOps trends?
Sellers: In the DevOps and AIOps space, there has been a convergence of automation, intelligence, and security that has evolved how developers operate and what their roles look like.
Current DevOps trends
- There are better frameworks (e.g. Durable Execution, stream processing) that make it more possible than ever for developers to focus on differentiated solutions and spend less time wrangling infrastructure (a stream-processing sketch follows this list).
- Infrastructure-as-Code, commoditized monitoring/observability, and more sophisticated cloud primitives promote better architecture patterns, allowing developers to get complex distributed systems to market faster.
- Better data governance and reusability mean data modeling is less onerous when bringing new applications and insights to market.
- AI and machine learning are also deeply embedded in DevOps workflows, automating repetitive tasks, optimizing CI/CD pipelines, and enabling predictive analytics for incident management and deployment optimization.
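As a small illustration of the stream-processing point above, here is a minimal consume-transform-produce loop with the confluent-kafka Python client: the framework handles delivery and offsets, and the developer writes only the business logic in the enrich step. The broker, topic names, record fields, and tax rate are illustrative assumptions.

```python
# A minimal consume-transform-produce loop: infrastructure is plumbing,
# enrich() is the differentiated logic. All names here are hypothetical.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed cluster
    "group.id": "order-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders-raw"])
producer = Producer({"bootstrap.servers": "localhost:9092"})


def enrich(order: dict) -> dict:
    """The only business-specific code in the whole pipeline."""
    order["total_with_tax"] = round(order["total"] * 1.09, 2)  # assumed 9% GST
    return order


try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        enriched = enrich(json.loads(msg.value()))
        producer.produce("orders-enriched", value=json.dumps(enriched))
        producer.poll(0)  # serve delivery callbacks without blocking
finally:
    consumer.close()
    producer.flush()
```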
What this means for developers: Developers spend less time on manual, repetitive tasks and infrastructure management. Automation, platform engineering, and self-service environments free up their headspace to innovate – whether by writing code or otherwise delivering business value.
Self-healing infrastructure and AI-driven root cause analysis further reduce the operational burden on developers, improving system reliability and developer productivity.