How would software-defined storage and AI-driven automation define the next generation of data management solutions?
Data management is getting tougher. With the cloud – and now AI – complicating the way we create, store, share, use and analyze data, organizations today need to plan fast for the deployment of next-gen data management solutions.
DigiconAsia discussed the challenges for today’s data infrastructure, and the technologies that can help organizations better manage and leverage their valuable data, with Justin Chiah, Vice President and General Manager, Data Services and Storage, Asia Pacific, HPE.
What is driving the massive data surge and infrastructure fragmentation that many enterprises are facing?
Justin Chiah (JC): In this age of insight, businesses are more data-driven and data-intensive than ever as they increasingly rely on data to facilitate decision making and enhance business outcomes. There is an ever-growing number of data-intensive applications and services spread across multiple IT environments, especially with the rapid growth of enterprise AI adoption in recent years.
Also, the proliferation of edge devices, especially IoT devices forecast to grow to around 150 billion by 2025, means more data is being created and processed outside traditional data centers and cloud applications. As a result, data volume, usage, and complexity are increasing exponentially, and at a faster pace than many organizations are prepared to handle. In fact, by next year, data growth is expected to exceed 200 zettabytes.
Enterprises today are grappling with how to manage data processing and storage, and are struggling to implement a successful data-first modernization strategy that supports data optimization. This is further accentuated by concerns around data sovereignty and latency, as well as spiraling costs driven by a noticeable shift from a cloud-only/cloud-first strategy to a cloud-smart/hybrid cloud approach.
The massive adoption of data-intensive AI will only heighten the need for data to be utilized effectively in a hybrid manner, both on premises and in the cloud. Against this backdrop, leading enterprises increasingly see the benefits of a “hybrid by design” approach to data – recognizing the value of consistent data management across all environments, and taking advantage of the simplicity of a cloud-like operating experience.
How can businesses effectively leverage their data for greater success and innovation?
JC: Traditional data infrastructure, with its silos and manual processes, is insufficient to handle the computing and storage requirements of today’s businesses. To effectively harness data and transform it into insights for innovation, businesses must adopt a data-first strategy that reduces data silos, eliminates inconsistent management of on-premises and cloud data, and addresses the security concerns associated with both.
Another aspect that will help businesses use data effectively is ensuring that their data adheres to data sovereignty rules as well as performance and latency requirements.
A data-first approach means helping customers adopt a modernized and unified data fabric that gives different stakeholders in the data pipeline global access to data no matter where it resides. It also necessitates a cloud-native platform that is automated, optimized, agile and flexible, giving users visibility and control of their data whether it sits on premises or in the cloud.
What are some examples of organizations in Asia Pacific that have successfully enabled seamless data access across hybrid cloud environments?
JC: We are working with many organizations across industries in different parts of the APAC region.
For example, Amana Living, a not-for-profit aged care service provider in Western Australia, used the HPE GreenLake edge-to-cloud platform to replace its legacy IT infrastructure in two data centers, migrated from Azure to on-premises infrastructure, and moved from MPLS to SD-WAN, achieving a noticeable application performance improvement of 200%.
Korea Customs Service (KCS), the administrative agency of the Republic of Korea in charge of tariffs and customs clearance, selected HPE Ezmeral for unified real-time access to all relevant sources of internal and external, structured and unstructured data – including, for example, customs and trade databases, scanned documents, photos, and mass data such as system logs. With the help of HPE Ezmeral, KCS expected to increase smuggling crackdowns by 5% every year, saving roughly 42 billion Korean won, or US$34 million.
Patrick Terminals in Australia also selected HPE to provide an as-a-service private cloud operating model, as it needed the flexibility to support the rapidly changing demands of the cargo business while also managing costs effectively.