Where is edge computing employed in an organization? How does it handle data and latency?
Data surrounds us, and it is a crucial part of technologies such as self-driving cars, smart factories, and even online gaming. Such applications require vast amounts of data to be available on demand to ensure optimal performance for users.
Edge computing helps organizations deal with this data so they can make better-informed decisions at the edge. But with global data creation projected to exceed 180 zettabytes by 2025, how can one guard against latency issues?
To get answers to this question and more, DigiconAsia connected with Ravi Mayuram, Chief Technology Officer, Couchbase.
Where do we see edge computing used in everyday business? How does edge computing help business operations?
Ravi: Edge computing architectures bring data processing to the edge, close to applications. This makes them faster, as data doesn’t have to travel all the way to the cloud and back. With local data processing extending the cloud to the near side of the network, data is also more reliable: there is less exposure to poor connections and a lower risk of malicious actors intercepting data in motion.
Through Couchbase Mobile, an edge architecture and a web gateway handle overall network synchronization, so data is synchronized across the application ecosystem whenever connectivity is available. This lets businesses store, query, search and analyse data in the cloud, at the edge, or on a device, even when there is no internet connectivity.
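As a simple illustration of this offline-first pattern (a generic Python sketch, not Couchbase Mobile’s actual API; the SYNC_URL endpoint and table layout are invented for the example), writes land in a local store immediately, and a background step pushes anything unsynced to the cloud whenever connectivity returns:

```python
import json
import sqlite3
import urllib.request

SYNC_URL = "https://sync.example.com/records"  # hypothetical cloud gateway


def open_store(path: str = "edge.db") -> sqlite3.Connection:
    """Open (or create) the on-device store that keeps working with no internet."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, body TEXT, synced INTEGER)"
    )
    return db


def save_locally(db: sqlite3.Connection, record: dict) -> None:
    """Write the record locally first; it is marked as pending sync."""
    db.execute("INSERT INTO records (body, synced) VALUES (?, 0)", (json.dumps(record),))
    db.commit()


def sync_pending(db: sqlite3.Connection) -> None:
    """Push unsynced records to the cloud; call this whenever connectivity returns."""
    pending = db.execute("SELECT id, body FROM records WHERE synced = 0").fetchall()
    for row_id, body in pending:
        req = urllib.request.Request(
            SYNC_URL, data=body.encode(), headers={"Content-Type": "application/json"}
        )
        try:
            urllib.request.urlopen(req, timeout=5)
        except OSError:
            return  # still offline; try again on the next connectivity event
        db.execute("UPDATE records SET synced = 1 WHERE id = ?", (row_id,))
        db.commit()
```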
Examples of how edge computing has improved business operations:
- For a healthcare provider serving rural communities, unreliable electricity supply and unstable internet connections meant that medical records had to be kept on paper. This led to an inefficient data storage system that made it challenging to trace patients’ medical history and provide follow-up care. With edge computing, all patient records sync to the cloud when internet access is available; in the meantime, an offline-first peer-to-peer network works in the background, enabling devices to sync with each other even without internet access. This creates a seamless patient experience that reduced manual processes and increased the number of patients seen by 30%.
- In Singapore, where we have reliable electricity supply and fast mobile and fixed internet connections, edge computing supports our efforts to digitalise. For example, during a telehealth consultation, doctors can pull up information and other software without compromising the seamlessness of the high-definition video call needed for remote medical care.
- Edge computing ensures that the resources typically required – such as note-taking software, medical history and medication information – are all readily available on the device that clinicians work on.
- In the healthcare sector, edge computing powers next-gen healthcare applications and device management, enabling technology to keep up with the pace of work where data affects life-and-death decisions.
- Similarly, other industries require data to be easily available and accessible 24/7. For example, e-commerce and social media platforms generate millions of queries every second. Businesses need to ensure that queries and data are available to all stakeholders at the tap of a finger, and this can be achieved through edge computing.
How has edge computing evolved to accommodate the Internet of Things?
Ravi: When Internet of Things (IoT) devices were first developed, their main purpose was to track inventory. This required data to flow in only one direction – from the remote machine or device to headquarters.
However, with the proliferation of IoT and more interconnected systems and devices, more data now needs to be trafficked across multiple devices and in multiple directions. As such, edge computing is necessary to reduce latency in data transfer across regions and devices.
Primary motivations for edge computing initiatives coalesce around themes of speed, availability and governance. They include:
- Security and data protection
- Deterministic latency and distance limitations
- Compliance with data sovereignty requirements and industry regulations
- Continuous operation if network access is interrupted
For example, edge computing can be leveraged to deliver medical services or attention to rural or remote locations, as well as enable oil and gas operators to analyze critical warning data from pressure sensors in time to act.
These use cases represent a growing class of apps that require 100% uptime, regardless of where in the world they are operating. While the wider network offers no guarantees on availability or latency, edge computing can cut down the distance data has to travel. Ultimately, across web, mobile and IoT apps, edge computing can deliver faster insights, improved response times and better bandwidth availability on a secure platform.
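As a minimal illustration of the oil-and-gas scenario above (a hypothetical Python sketch; the sensor driver, pressure threshold and shutdown action are invented for the example), the critical decision is taken at the edge in milliseconds, and only a small summary is forwarded upstream when bandwidth allows:

```python
import random
import statistics
import time

PRESSURE_LIMIT_KPA = 900.0  # hypothetical safety threshold


def read_sensor() -> float:
    """Stand-in for a real pressure-sensor driver (simulated readings here)."""
    return random.gauss(750.0, 60.0)


def trigger_local_shutdown() -> None:
    """Stand-in for an actuator call that cannot wait on a cloud round trip."""
    print("pressure limit exceeded: shutting valve locally")


def monitor(window_s: int = 60) -> dict:
    """Evaluate every reading at the edge; forward only a summary to the cloud."""
    readings = []
    deadline = time.time() + window_s
    while time.time() < deadline:
        value = read_sensor()
        readings.append(value)
        if value > PRESSURE_LIMIT_KPA:  # local decision, no network dependency
            trigger_local_shutdown()
            break
        time.sleep(1)
    return {"mean_kpa": statistics.mean(readings), "max_kpa": max(readings)}
```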
What is the impact of latency issues on an organization?
Ravi: At the backend, high latency can lead to incomplete business processes, loss of data and downtime. This hinders employees from performing at maximum productivity and can also compromise security. For example, when a user’s permissions change, data sync should instantly reflect the change across the app ecosystem; latency can delay that update, leaving personnel able to access something they are no longer authorized to.
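One way to bound that risk (an illustrative Python sketch, not a description of how any specific sync product behaves; the 30-second staleness budget is an assumption) is to treat locally synced permission data as valid only for a limited window, forcing a re-sync before any decision is made on stale data:

```python
import time

MAX_STALENESS_S = 30.0  # hypothetical tolerance for the age of permission data


class PermissionCache:
    """Locally synced copy of user permissions, stamped with its last sync time."""

    def __init__(self) -> None:
        self.permissions: dict[str, set[str]] = {}
        self.last_synced = 0.0

    def apply_sync(self, permissions: dict[str, set[str]]) -> None:
        """Called whenever a sync from the central permission store completes."""
        self.permissions = permissions
        self.last_synced = time.time()

    def can_access(self, user: str, resource: str) -> bool:
        # Refuse to answer from data older than the staleness budget: better to
        # force a re-sync than to honour a permission that may have been revoked.
        if time.time() - self.last_synced > MAX_STALENESS_S:
            raise RuntimeError("permission data is stale; re-sync before deciding")
        return resource in self.permissions.get(user, set())
```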
Furthermore, as technology advances, we have become accustomed to real-time transactions, and any form of delay will compromise the user experience. The consequences of high latency can therefore be expensive, costing reputational damage and loss of market share. For example:
- Social gaming and online sports betting are competitive environments. For these companies, user retention is critical. Gamers expect to play whenever they want for as long as they want and have low tolerance for delays or lag. In this instance, latency can cause players to become frustrated and lose interest in the game.
- The travel and hospitality industry depends on a complex network of interactions between travellers, consolidators, suppliers, maintenance providers, call centers, web services and management services. Millions of queries are generated per second on popular travel booking websites. In this instance, latency would result in potential travellers dropping off the site and making the purchase on a more responsive travel site.
According to Forrester’s report The Real Costs Of Planned And Unplanned Downtime, IT leaders indicate that the costliest aspects of downtime are lost revenue (53%), lost productivity (47%), and negative impact on corporate reputation (41%).
Please share some tips on keeping latency low.
Ravi: IoT deployments require applications that can still operate when network connectivity is interrupted or unavailable, and that make the most efficient use of the network when it is available.
To do this, organizations must bring data processing and compute infrastructure to the near side of the network: edge computing. Examples include sensors and IoT devices used to monitor and analyze energy consumption in real time, reducing dependence on distant cloud data centers.
Below are some tips:
Offload legacy workloads
When incorporating cloud solutions, it is important to update legacy databases to match the speed and flexibility of new platforms; otherwise, the new cloud technology will be held back by old systems. With Couchbase, corporations can either offload legacy workloads piecemeal or migrate databases as a whole.
Choose the right database
Databases have their strengths and weaknesses, and the choice of database will depend on the specific needs of the application and the available resources. Relational databases, also known as SQL databases, are well suited to applications with structured data and complex queries. Non-relational databases, also known as NoSQL databases, excel in applications with unstructured or semi-structured data and high scalability requirements.
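As a rough, generic illustration of that trade-off (using only Python’s built-in sqlite3 and json modules; the customer/order data is invented), the same information can live as joined relational rows or as one self-contained document:

```python
import json
import sqlite3

# Relational shape: structured rows, relationships resolved with a JOIN.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Mei');
    INSERT INTO orders VALUES (10, 1, 42.50);
""")
rows = db.execute("""
    SELECT c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
    WHERE o.total > 10
""").fetchall()
print(rows)  # [('Mei', 42.5)]

# Document shape: the same information kept together as semi-structured JSON,
# easier to distribute across nodes and to evolve without a schema migration.
order_doc = json.dumps({
    "type": "order",
    "customer": {"name": "Mei"},
    "total": 42.50,
    "notes": ["gift wrap"],  # new fields can appear without an ALTER TABLE
})
print(json.loads(order_doc)["customer"]["name"])  # Mei
```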
Employ distributed caching over memory buffering
Buffering stores transitory data in memory temporarily while it’s being read or written. By caching frequently used data in memory instead of making database round trips and incurring disk input/output (IO) overhead, companies can cut down on application response times.
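A minimal sketch of that idea in plain Python (an in-process stand-in for a distributed cache shared across edge nodes; the loader function and the 60-second TTL are placeholders): repeated reads are answered from memory, and only cache misses reach the database.

```python
import time
from typing import Any, Callable


class ReadThroughCache:
    """Serve repeated reads from memory; fall back to the database only on a miss."""

    def __init__(self, loader: Callable[[str], Any], ttl_s: float = 30.0) -> None:
        self._loader = loader          # the function that actually hits the database
        self._ttl_s = ttl_s            # how long a cached value stays fresh
        self._store: dict[str, tuple[float, Any]] = {}

    def get(self, key: str) -> Any:
        hit = self._store.get(key)
        if hit and time.time() - hit[0] < self._ttl_s:
            return hit[1]              # no database round trip, no disk IO
        value = self._loader(key)      # cache miss: one trip to the database
        self._store[key] = (time.time(), value)
        return value


# Usage: wrap whatever function performs the expensive database read.
def load_profile_from_db(user_id: str) -> dict:
    return {"user_id": user_id}        # placeholder for the real query


cache = ReadThroughCache(load_profile_from_db, ttl_s=60.0)
profile = cache.get("u-123")           # first call loads; later calls come from memory
```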