Demystifying Interoperability

by Priyanka Akhouri | May 29, 2007

Today, information systems are seen as critical to government, business productivity, and growth. One of the keys to success in the IT era is the seamless exchange of information across heterogeneous IT infrastructure. As systems connect to each other, the issue of interoperability assumes increasing significance and is a ‘top of mind’ issue for technical and business leaders and policy makers.

Imagine a hypothetical scenario: a business transfers funds from an overseas bank account to its supplier in India, but the transfer takes place without the appropriate conversion between the foreign currency and Indian Rupees. Or a landlord, dealing with two different municipal departments, suddenly finds his property tax rates shooting through the roof because one department’s system recorded the area of his property in square meters whilst the other ‘sent’ the value in square feet! Though these examples are hypothetical, they illustrate the consequences of IT systems failing to ‘communicate’ or ‘interoperate’ with each other.

For the government, defining an interoperability roadmap assumes even greater significance, as it is currently building a robust IT infrastructure to support its elaborate e-governance initiatives. Here are some aspects which the government and businesses should keep in mind while outlining an interoperability strategy.

On interoperability:
While interoperability can mean different things in different contexts, from an IT perspective it can be defined as: “The ability of disparate IT products and services to exchange and use data and information (that is, to ‘talk’) in order to function together in a networked environment.” Simply put, interoperability is about ensuring systems work together.

Systems need to address two aspects in order to be interoperable. The first is the ability to exchange data, often called technical interoperability; the second is the ability to correctly interpret that data, known as semantic interoperability.

Technical interoperability deals with the linking up of computer systems for transporting, exchanging, collecting, processing, and presenting data. It spans infrastructure-level interoperability, such as network protocols, and system-level interoperability, such as the use of web services. One means of achieving technical interoperability is the use of standards or ‘standards-based’ protocols: for example, conforming to the TCP/IP protocol stack for network interoperability, or using XML for the transfer of data between systems.
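To make this concrete, here is a minimal sketch in Python of two systems exchanging a payment record as XML (the record fields and function names are invented for illustration). Because both sides agree on the XML format, the receiver can parse the message regardless of the platform or language the sender runs on:

    import xml.etree.ElementTree as ET

    # System A serializes a payment record to XML, a standards-based,
    # platform-neutral format, before sending it over the network.
    def serialize_payment(payee, amount, currency):
        root = ET.Element("payment")
        ET.SubElement(root, "payee").text = payee
        ET.SubElement(root, "amount").text = str(amount)
        ET.SubElement(root, "currency").text = currency
        return ET.tostring(root, encoding="utf-8")

    # System B, possibly written in another language on another platform,
    # can parse the same bytes because both sides conform to XML.
    def parse_payment(xml_bytes):
        root = ET.fromstring(xml_bytes)
        return {
            "payee": root.findtext("payee"),
            "amount": float(root.findtext("amount")),
            "currency": root.findtext("currency"),
        }

    message = serialize_payment("Acme Supplies", 125000.0, "INR")
    print(parse_payment(message))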

Semantic interoperability, on the other hand, ensures that the systems exchanging data share the same meaning for the data exchanged. Very frequently, systems can technically exchange data but fail to interpret it correctly in the absence of a shared meaning for that data.
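The property-tax example above is essentially a semantic failure, and a small hypothetical sketch (the field names and tax rate are invented) shows how it plays out: the record crosses the wire intact, so technical interoperability succeeds, yet the receiving system computes the wrong result because the unit of measurement is not part of the data:

    SQ_FT_PER_SQ_M = 10.7639  # square feet per square meter

    # Department A records the plot area in square meters, but nothing
    # in the record itself says so.
    record = {"plot_id": "B-42", "area": 150.0}

    # Department B receives the record intact (technical interoperability
    # succeeded) but assumes the figure is in square feet.
    def assess_tax(area_sq_ft, rate_per_sq_ft=2.0):
        return area_sq_ft * rate_per_sq_ft

    print(assess_tax(record["area"]))                   # wrong: 300.0
    print(assess_tax(record["area"] * SQ_FT_PER_SQ_M))  # right: ~3229.17

    # The semantic fix is to make the meaning part of the data itself:
    record_with_unit = {"plot_id": "B-42", "area": 150.0, "unit": "sq_m"}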

In addition to technical and semantic interoperability, there are also organizational and process interoperability to consider - the organization of business processes and organizational structures, including process restructuring, the elimination of duplication, and the development of interoperability ‘frameworks’ for better exchange of data within and between organizations. These also involve cultural issues, such as inter- or intra-departmental ownership of information and the perceived loss of control and power that comes with the creation of shared assets. All of these bear upon the achievement of interoperability.

As systems and organizations become more complex, the relevance and importance of semantic and process interoperability increase sharply. Unfortunately, most organizations prefer a ‘bottom-up’ approach that focuses solely on technical interoperability, which has serious consequences: it gives rise to prescriptive guidelines and ‘brittle’ systems. These systems emphasize technical requirements rather than architecture, and interoperability becomes difficult to achieve, or breaks down, for want of semantic and process consistency. The silos tend to remain silos!

Conversely, organizations that adopt a ‘top-down’ approach to interoperability, concentrating on semantics and process definition, avoid brittleness in their systems. A top-down approach leads to implementation guidelines driven by business requirements, which can keep pace with market evolution, rather than a set of straitjacketed rules. Projects are characterized by business value rather than technical requirements; real interoperability becomes feasible and silos disintegrate.

Importance of interoperability for the IT industry:
Interoperability is important to businesses, other organizations, consumers, and governments for various reasons, but there are a couple of important trends - megatrends - that are driving the focus on interoperability and making it a necessity. Today, software, hardware, and telecommunications technologies are increasingly converging: take a look at the mobile phone - it’s a music player, a video recorder, a camera, and, oh, by the way, it’s also a communications device. Customers, therefore, expect solutions that integrate to achieve the desired levels of functionality and form, and we need to make sure we meet these expectations of interoperability.

The other trend we are increasingly seeing is the deployment of heterogeneous systems. Unlike the 1980s, when most IT deployments were vertical, proprietary solutions with limited interoperability, people today no longer deploy technology from a single vendor or source. Rather, they combine technologies from different sources and vendors, and they want to ensure that all these disparate technologies work well together to deliver the functionality they were seeking in the first place. So interoperability is increasingly important today, as the IT industry has become more competitive and heterogeneous than it was 20 years ago. These and other factors are a reality in the marketplace, and something that users demand, making interoperability a necessity.

On standards and their relevance:
Adherence to standards is one way to facilitate technical interoperability. However, simply adhering to standards does not guarantee interoperability. Nor does the mere existence of a formal standard automatically imply that it will achieve great market adoption or usage. A case in point is the widespread adoption of the TCP/IP networking stack, which became the de facto standard for the internet, rather than the ISO-developed Open Systems Interconnection (OSI) networking model, which many expected would become the dominant standard across governments and industry.

A specification is unlikely to achieve broad market support and adoption if it is not good enough or does not fulfill the requirements; the fact that it was developed by a standards organization (making it a de jure standard) counts for little. Experience confirms that other standardization methods are equally relevant and can help achieve technical compatibility. For instance, the emergence of a widely used software specification or product (a de facto standard!) can often induce widespread compatibility more effectively than formal de jure standards. These de facto standards come into being because innovation in technology, especially IT, does not always march in step with the development of de jure standards by standards organizations. When technology innovation outpaces the development of standards, and as that technology is absorbed by users and the market, the vacuum gets filled by the technology becoming a de facto standard. Examples of such successful de facto standards include PDF and XML.

When most users deploy products based on de facto standards, compatibility and interoperability come about naturally. In fact, it is the diffusion and ubiquitous use of certain technologies that simplifies the interoperability process. Companies offering such de facto standards can further encourage them by publishing and licensing (often on royalty-free terms!) their proprietary technologies and Intellectual Property (IP) to other market players, allowing more users to gain access and achieve free information exchange across platforms.

On Open Source:
People often use the phrases ‘open standard’ and ‘open source’ synonymously. This is a widely held misconception; they are not the same. Open Source Software (OSS) is an implementation: a type of software described by its development and licensing model. An open standard, on the other hand, is a technical specification, a set of technical instructions and procedures to ensure that a product meets its purpose and consistently performs to its intended use. The open-standards process is neutral with regard to software development models, and so an open standard can just as easily be implemented in proprietary software as in OSS.

Another fallacy is the erroneous equation of OSS with interoperability itself, that is, the belief that using OSS will ensure interoperability. Depending upon the circumstances, quite the opposite can result. Because OSS source code may be modified by anyone, an OSS product that is initially standards-conformant and interoperable may be rendered non-conformant or incompatible by subsequent modifications. At the very least, the freedom to modify code, combined with the lack of market incentives to maintain backward compatibility and fidelity, encourages the creation of many permutations of the same type of software application. This can add a significant implementation and testing overhead to interoperability efforts.

The proliferation of standards today can confuse users and actually hinder interoperability. A good strategy for achieving interoperability is to apply widely accepted de facto standards and open standards based on pervasively used technologies. TCP/IP and HTML, for example, are among the most common open standards that allow users to exchange information over the internet, irrespective of the source and destination platforms. On the other hand, although PDF, Microsoft Office formats, Java, and Win32 APIs are proprietary, they have, through their wide adoption, become de facto standards, widely accepted among mass users.

Interoperability is not guaranteed just by adopting a standard. Building complex systems that interoperate is as much about semantic and process interoperability as it is about technical interoperability.

As told to Priyanka Akhouri
