Manual coding may seem like a panacea for quick results, but this expert makes the case against it and offers alternative solutions.

In the face of fast-changing economic landscapes and business demands, IT is often tasked with delivering data quickly and precisely. As such, it is common for implementations to be hand-coded for fast results. But is such a data management approach truly ideal in a technology-driven world where both the speed of deployment and human-resource turnover are increasing?

With digital transformation, leveraging data as part of everyday business operations and analytics is now a staple in organizations. Data management is the bedrock of any digital transformation process, because data needs to be taken from silos across the organization, ingested, cleansed, analyzed and then shared securely.

Often, data is ingested from one or more data sources and imported into a data warehouse or data lake. Developers may choose to write customized code in SQL, Java or Spark to deliver the ingested data from source to destination, for several reasons:

  • It is fast at resolving immediate business needs when performed by an in-house developer
  • It is low-cost, as no additional development tools need to be purchased
  • It is easy to deploy, as the integration is specific to its targeted destination

Given these factors, hand coding can seem like a panacea: an agile, cheap and effective way for the IT team to meet immediate business needs.
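To make this concrete, here is a minimal sketch of what such a hand-coded ingestion script often looks like. It is written in Python with hypothetical source data and table names: it extracts rows, cleanses them, and loads them into a SQL destination.

```python
import csv
import io
import sqlite3

# Hypothetical source data; in practice this would come from a file, API or feed.
RAW_CSV = "id,amount\n1,10.50\n2,not_a_number\n3,7.25\n"

def ingest(raw_csv, conn):
    """Hand-coded extract-transform-load: parse CSV, drop bad rows, load to SQL."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL)")
    loaded = 0
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            amount = float(row["amount"])  # the cleansing/transform step
        except ValueError:
            continue  # silently skips malformed rows, typical of quick hand-coded fixes
        conn.execute("INSERT INTO sales (id, amount) VALUES (?, ?)",
                     (int(row["id"]), amount))
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
print(ingest(RAW_CSV, conn))  # prints 2: the malformed row is dropped
```

A script like this is quick to write for one source, but the schema, cleansing rules and error handling are all hard-coded, which is exactly what makes such scripts brittle as sources multiply.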

Is hand coding truly the cure?

According to a poll conducted in 2019, organizations with at least 1,000 employees drew on an average of 400 data sources to feed their business intelligence and analytics efforts. Considering that each of these 400 data sources has its own data structure and format, hand coding is not a viable approach in organizations where IT resources are limited: streamlining all these differences would consume much of the team's manpower.

  • In terms of resource management, developers skilled in Spark and Java may be better deployed on more critical digital projects. A CNBC report (2018) found that too many developers were tied up in projects designed to prop up legacy systems and bad software, at a cost of US$300 billion a year, of which US$85 billion went just to dealing with bad code.
  • The ease of maintaining code is another consideration. If the project is small, one-off and does not require regular maintenance or updates, hand coding may be a suitable approach. However, if the integration spans multiple sources and targets, a data integration tool is better suited: when upgrades to the various database sources and targets are not managed well, the result is 'brittle' integration.
  • A related consideration is how easily and quickly developers can learn the code. With proper documentation tracked and maintained in integration software, integration code is easier to pick up and learn; hand-coded projects, by contrast, are prone to documentation gaps once the original developer leaves the organization.
  • Lastly, the reusability of data integration patterns is another dimension to evaluate. It is often more efficient to integrate across similar databases or targets using an existing pattern already created with a data integration tool than to start from scratch with hand coding, saving both time and resources.
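The reuse point can be sketched in code. The following is a simplified illustration in Python, with hypothetical table and column names, of how one parameterized integration pattern can serve many similar sources, instead of a bespoke script for each:

```python
import sqlite3

# A reusable integration "pattern": one parameterized function rather than a
# bespoke hand-coded script per source. Schema names here are hypothetical.
def sync_table(src, dst, table, columns):
    """Copy rows for the given table/columns from a source connection to a target."""
    col_list = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    dst.execute(f"CREATE TABLE IF NOT EXISTS {table} ({col_list})")
    rows = src.execute(f"SELECT {col_list} FROM {table}").fetchall()
    dst.executemany(f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})", rows)
    dst.commit()
    return len(rows)

# The same pattern then applies to any number of similar sources:
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id, name)")
src.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Lin")])
dst = sqlite3.connect(":memory:")
print(sync_table(src, dst, "customers", ["id", "name"]))  # prints 2
```

Commercial integration tools generalize this idea much further, but the contrast with a from-scratch script per source is the same.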

Thinking long-term

For an organization to benefit fully from its data management strategy, immediate short-term gains need to be balanced against the longer-term benefits the organization will derive. A report by McKinsey & Company (2020) noted that industry spending on data-related costs is expected to increase, on average, by nearly 50% over 2019–2021 versus 2016–2018.

Although hand coding addresses integration needs immediately, it is a short-term fix: with the exponential increase in data most organizations face today, the number of integrations needed within an organization will only grow.

With a centralized approach to data integration, by contrast, companies can reap cost and time savings when individual data management functions such as data quality and governance are addressed by a single platform. Hand-coded data management projects, on the other hand, demand ever more IT budget to hire developers with different skill sets as the business's data requirements increase.

A data management platform also capitalizes on learnings from other organizations, as pattern recognition algorithms enable suitable learned data patterns to be applied immediately. Hand coding, by contrast, is highly dependent on the skills of the individual. In addition, security loopholes that a developer might miss are covered by the maintenance patches of a data integration platform.

Finally, from a governance perspective, each hand-coded project represents a single point 'tool' that needs to be properly managed. As the number of data sources or destinations increases, the number of tools to manage multiplies, and each becomes an additional governance silo to be integrated into the organization's governance and security framework.

In the long term, this exponential increase in governance silos becomes untenable to manage in terms of security, cost and resources. McKinsey (2020) reported that by improving data governance, data architecture, data sourcing and data consumption, companies can reap overall savings of 15–35% of data spend, with 5–15% savings in the near term.

As Warren Buffett put it: “Someone is sitting in the shade today because someone planted a tree a long time ago.” Good IT leaders have to understand their organization’s long-term strategic goals, map out the corresponding digital strategy and metrics, and finally, architect and deploy a central data management platform in the immediate present to meet tomorrow’s internal capabilities and evolving stakeholder needs.