If another pandemic arrives next year, will businesses be fully prepared to react quickly? These five strategies could boost that readiness…
At the start of this year we all breathed a deep sigh of relief, thinking that unprecedented events were behind us. As the year continued, however, it became clear that change on a macro scale is here to stay, and we now find ourselves caught in a perfect storm.
An economic recession is on the horizon, conflicts continue to disrupt global markets, and organizations all over the world are scrutinizing their bottom lines, working out where the smart investments lie and why. We are already seeing the effects on the tech landscape: VC funding is declining, tech ecosystems are de-coupling, access to data skills remains scarce, and more complex regulations are coming into force.
With so much pressure to innovate, it can be hard to know what to focus on. What is clear is that making accurate decisions, and integrating siloed and distributed data sets to see the big picture in real time, will be vital to survival and future success. That is why we have outlined five key trends that every data-driven business should act on in 2023.
- Move AI deeper into the data pipeline: As economic uncertainty continues, many firms will pull back on investment and hiring. With the global skills shortage still affecting companies of all sizes, using technologies such as AI and ML to automate the more menial data preparation tasks will be crucial. By moving AI deeper into the data pipeline, before an application or dashboard has even been built, we can finally start to shift the balance of time spent on data preparation versus data analytics. That would free hard-to-come-by data talent to focus on the value-add: cross-pollinating and generating insights that were not possible before, a far more productive use of their time.
- Invest more in derivative and synthetic data: If the last few years have taught us anything, it is the value of investing time and resources in risk prediction and management using historical data. Some research now suggests that models trained on synthetic data can be as accurate as, or more accurate than, those trained on real data, while sidestepping some of the privacy, copyright, and ethical concerns that real data carries. Derivative data, meanwhile, lets us repurpose information for multiple needs and enables the scenario planning that is crucial for preparing for the future.
- Be ready for natural language capabilities to rival humans: Think about how often you have interacted with a customer-support chatbot to resolve an issue with your bank or insurance provider. This technology is about to evolve dramatically: several new models in development are significantly more powerful than anything we use today. More powerful natural language capabilities will have huge implications for how we query information and how it is interpreted and reported: we will find not only the data we seek but also the data we never thought to ask about.
- Mitigate supply-chain disruption with real-time data: Anyone who has tried to buy a new car (or even something as basic as toilet paper) in the last few years knows how badly supply chains have been impaired. The disruption shows no sign of abating, so the need to react quickly, or ideally to preempt issues before they hit, is only growing. The power to analyze data in real time and in context is key to this. IDC predicts that by 2027, 60% of data capture and tech spending will go toward enabling real-time simulation, optimization, and recommendation capabilities.
- Consider a data fabric architecture to realize the full power of complex data: Rapidly shifting rules and regulations around data privacy, distribution, diversity, and dynamics are holding organizations back from squeezing the best competitive edge out of their data. As data governance grows more complex, realizing the full power of data gets harder. This is why a growing number of organizations are turning to a data control plane architecture: an "X-fabric" not just for data but also for applications, business intelligence dashboards, and algorithms, enabled by catalog and cloud data integration solutions. For any organization that wants to act with certainty, this architecture could be a critical component of the modern distributed environment, where data and analytics professionals must adjust to ever-more fragmentation: disparate data centers, disrupted supply chains, a constant need for innovation, and hampered access to skilled labor.
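To make the first trend above concrete, here is a minimal, purely illustrative sketch of the kind of menial preparation work that AI/ML tooling can push deeper into the pipeline: inferring whether each column is numeric or categorical and imputing missing values automatically, before any dashboard is built. The function name and the simple mean/mode strategy are assumptions for illustration only, not a reference to any particular product; real pipeline tools use far richer models.

```python
from statistics import mean
from collections import Counter

def auto_prepare(rows):
    """Illustrative sketch: automatically impute missing values in tabular data.

    Numeric columns are filled with the column mean; categorical columns
    with the most common observed value (the mode).
    """
    columns = {key for row in rows for key in row}
    prepared = [dict(row) for row in rows]
    for col in columns:
        # Collect the non-missing values observed for this column.
        values = [row[col] for row in rows if row.get(col) is not None]
        if not values:
            continue  # nothing to learn a fill value from
        if all(isinstance(v, (int, float)) for v in values):
            fill = mean(values)                        # numeric: use the mean
        else:
            fill = Counter(values).most_common(1)[0][0]  # categorical: use the mode
        for row in prepared:
            if row.get(col) is None:
                row[col] = fill
    return prepared
```

An analyst who would otherwise hand-write this cleanup for every new data set can instead spend that time on the cross-pollination and insight generation the trend describes.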
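The synthetic data trend can likewise be sketched in a few lines. The toy synthesizer below fits an independent model per column (a Gaussian for numeric columns, observed frequencies for categorical ones) and samples new rows that share the originals' statistics without reproducing any real record. All names here are hypothetical, and the independence assumption is a deliberate simplification: production synthesizers (GANs, copulas, and similar) also capture correlations between columns.

```python
import random
from statistics import mean, stdev

def fit_synthesizer(rows, seed=0):
    """Illustrative sketch: fit per-column models and return a sampler
    that generates synthetic rows resembling the training data."""
    rng = random.Random(seed)
    numeric, categorical = {}, {}
    for col in rows[0]:
        values = [row[col] for row in rows]
        if all(isinstance(v, (int, float)) for v in values):
            numeric[col] = (mean(values), stdev(values))  # Gaussian fit
        else:
            categorical[col] = values  # resample by observed frequency

    def sample(n):
        out = []
        for _ in range(n):
            row = {c: rng.gauss(m, s) for c, (m, s) in numeric.items()}
            row.update({c: rng.choice(v) for c, v in categorical.items()})
            out.append(row)
        return out

    return sample
```

Because the sampler only retains summary statistics and category frequencies, the synthetic rows can be shared more freely than the underlying records, which is the privacy benefit the trend points to.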
The good news is that, after the last few years, we should all be better prepared to roll with the punches than ever before. In a world where crisis has become a constant, calibrating for it becomes a core competence, so that we can react in the moment and anticipate what is coming next.