For any company to survive in the market, it has to be responsive to whatever its customers demand. It has to accept the change and challenge of the marketplace. No one can say with authority that a particular company – be it Microsoft, Cisco, SAP, Oracle, or, for that matter, a large consumer electronics company like Sony or Samsung, or a consumer goods company like P&G – can survive in the market forever with its existing profile, whatever excellence it may have. The market is dynamic and evolving every day. There is no definite, fixed pattern of human usage and choice; it keeps changing all the time.
In the realm of IT, and especially in the software space, it has become ever more relevant to collaborate with competitors and with open source communities in order to be interoperable with a variety of technologies. Today, a customer wants best-of-breed solutions from different technology providers – something from Microsoft, something else from IBM, still something else from Oracle – and wants to integrate all these technologies so that they work in sync with each other for better performance and higher RoI.
Today, Microsoft – the world's largest software company – is moving precisely in the direction of working in collaboration with every technology vendor. People have long been under the impression that Microsoft lacks resilience and is monopolistic. But today the company has come a long way; its DNA is undergoing a complete transformation. If one recalls the tie-up between Microsoft and SUSE Linux, one can understand the magnitude of the shift in terms of interoperability. Similarly, Microsoft Corp. joined the Trusted Computing Group (TCG), which develops open standards for computing security, to provide customers and partners with interoperability between TCG's Trusted Network Connect (TNC) architecture and Microsoft Network Access Protection (NAP) for network access control (NAC). Microsoft and Yahoo! have also entered into beta testing of interoperability between their instant messaging (IM) services, enabling users of Windows Live Messenger, the next generation of MSN Messenger, and Yahoo! Messenger with Voice to connect with each other. And Microsoft has formed the Interoperability Customer Executive Council to identify areas for interoperability improvements across its products and the overall software industry.
Vijay Kapur, National Technology Officer, Microsoft India, says, "We heard it loud and clear from our customers that interoperability is becoming very important. There are a couple of mega trends that are driving this for customers. Firstly, convergence is happening everywhere – in hardware, software, telecom, etc. Customers want these technologies to work harmoniously together and provide the benefits they are looking for. Secondly, environments are becoming extremely heterogeneous. So, you have multiple sets of platforms coexisting in the same operational environment because people are choosing the platforms that best fit their work. When you have heterogeneity, all these platforms need to talk to each other or work together in unison."
The entire thrust on interoperability started with Bill Gates' memo on the Trustworthy Computing initiative in 2002. Vijay says, "There are four pillars to the trustworthy computing drive: reliability, security, privacy and business integrity. And interoperability falls squarely under business integrity."
In fact, the last pillar, business integrity, is actually there to make sure that the other three happen. Trust is not all about technology; it is about how you behave as a company. And if a large enterprise like Microsoft does not behave in a responsive manner, then people are not going to trust it. Vijay adds, "In the past, it has been a bit of a challenge, but if you look at how we have responded to this in the last five years, there has been a significant difference. We have responded, and the whole company DNA has changed over the five-year period."
What is the need for interoperability?
Although interoperability has different meanings in different contexts, in the IT area the term is generally understood to mean the ability of different information technology (IT) networks, applications, or components to exchange and use information – that is to say, to "talk" to each other. To put it simply, interoperability is about connecting people, data and diverse systems. It is a capability increasingly desired by customers, who are seeking practical solutions and choice. It addresses the complexity of relationships among people, processes and technical infrastructure – all leading to the optimization of organizational performance.
The other element that I would like to highlight is that "interoperability" is usually treated as a single, rather homogeneous attribute, whereas it has various connotations, or levels, that need to be considered and catered for in order to achieve it.
So, at the lowest level, we have technical interoperability, which deals with IT environments – the linking up of IT systems for transporting, exchanging, collecting, processing and presenting data. It spans infrastructure, such as network protocols, and system-level interoperability, such as using Web services. At the next level, we have semantic interoperability, which is about making sure that the systems exchanging data share the same meaning for the data exchanged. And finally, at the topmost level, there is organizational and process interoperability, which is about the organization and management of business processes and organizational structures, doing away with duplication, and the development of interoperability frameworks for better exchange of data within and with other organizations. This level also deals with cultural issues, such as inter- or intra-departmental ownership of information and perceptions of loss of control and power due to the creation of shared assets. All of these impinge upon the achievement of interoperability, but it is these aspects of management and policy that pose the most challenges, whereas technical interoperability is often the easiest and most readily achieved.
So, as systems and organizations become more complex, the relevance and importance of semantic and process interoperability increase sharply. Unfortunately, the reality is that, faced with the challenge of defining semantics or processes, most organizations avoid or postpone dealing with them and choose to focus instead on the far easier (and more tangible!) issues of technical interoperability. This "bottom-up" approach can have serious consequences for interoperability; it often gives rise to prescriptive guidelines that emphasize technical requirements rather than architecture, and can lead to "brittle" systems. Interoperability becomes difficult to achieve, or breaks down, for want of semantic and process consistency. The silos tend to remain silos!
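To make the distinction between these levels concrete, here is a minimal sketch in Python, with hypothetical field names that are not drawn from any real system: two applications are technically interoperable because both can parse the same XML on the wire, yet they fail semantically because they attach different currencies to the same amount field.

```python
# A minimal, hypothetical sketch of the technical vs. semantic distinction.
# Both "systems" can parse each other's XML (technical interoperability),
# but they interpret the <amount> field differently (a semantic mismatch).

import xml.etree.ElementTree as ET

# System A exports an invoice, implicitly pricing amounts in US dollars.
invoice_from_system_a = "<invoice><id>42</id><amount>100.00</amount></invoice>"

def system_b_total(xml_text: str) -> float:
    """System B parses the XML without difficulty, but assumes amounts are in euros."""
    root = ET.fromstring(xml_text)          # parsing succeeds: the wire format is shared
    return float(root.findtext("amount"))   # meaning is not shared: USD read as EUR

# Semantic interoperability would require an agreed schema or explicit metadata,
# e.g. <amount currency="USD">, so that both sides attach the same meaning to the data.
print(system_b_total(invoice_from_system_a))  # prints 100.0, silently wrong by currency
```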
Why is interoperability important for consumers?
Interoperability is important to businesses, other organizations, consumers and governments for various reasons, but there are a couple of important trends – mega trends – that are driving the focus on interoperability and making it a necessity.
The first is convergence. Today, software, hardware and telecommunication technologies are increasingly converging. Take a look at the mobile phone: it is a music player, a video recorder, a camera and, by the way, also a communications device. Customers therefore expect solutions that integrate to deliver the desired levels of functionality and form, and we need to make sure that we meet these expectations of interoperability.
The other trend is the deployment of heterogeneous systems. Unlike the 80s, when most IT deployments were vertical, proprietary solutions with limited interoperability, people today no longer deploy technology from a single vendor or source. Rather, they combine technologies from different sources and vendors, and they want to be sure that all these disparate technologies work well together to deliver the functionality they were seeking in the first place. So interoperability is increasingly important today, as the IT industry has become far more competitive and heterogeneous than it was twenty years ago. These and other factors are a marketplace reality and something that users demand, making interoperability a necessity.
Is it meant to stymie the movement towards open standards architecture?
Not at all. Microsoft has consistently invested in helping customers integrate our platform and applications with a broad array of popular (and even not-so-popular) hardware, software and networks. As a result of these efforts, Microsoft offers a comprehensive portfolio of interoperability capabilities, from the operating system to individual applications. As a matter of fact, Microsoft has been a prime contributor to the development of Web services, which, as a technology, are completely platform-agnostic and all about interoperability. The development of XML is another area where Microsoft has made significant contributions – Jean Paoli, General Manager for Interoperability and XML Architecture at Microsoft, was a co-creator of the XML 1.0 standard.
In fact, in a survey by Jupiter Research, 72 per cent of IT managers rated Microsoft technologies as the most interoperable within their existing environments. Similarly, Microsoft .NET was recently named by Waters magazine as the best business environment for enhancing interoperability in the financial industry via Web services.
Is open source software better at achieving interoperability than proprietary software?
Not necessarily. In certain cases, the opposite could be true. Since OSS source code may be modified by anyone, any OSS product that is initially interoperable may be altered by a user in a manner that renders it non-conformant and incompatible with other users' versions of the software. At the very least, the freedom to modify code necessarily encourages the creation of many permutations of the same type of software application, which can add implementation and testing overhead to interoperability efforts.
The risk of proprietary software being altered in a non-conformant manner is significantly lower than with OSS, since the source code is typically under the control of a commercial entity whose customers demand interoperability. This is true, for example, even with Microsoft's "Shared Source Initiative", in which the source code is visible to the customer or government, but a premium is still placed on ensuring that any changes to the source code are properly managed and do not break interoperability.
What is your strategy for interoperability?
With a subject as complex and nuanced as interoperability, there is no real "one size fits all" strategy; we need to approach it with a "tool set" that allows us to use the appropriate mechanism for each scenario.
So, Microsoft's commitment to interoperability is implemented through a multi-faceted strategy that fosters a more comprehensive approach than any single mechanism alone. We deal with interoperability through product engineering, purposefully building interoperability into our products; through community, collaborative agreements with others in the industry and outreach to partners, customers and competitors to drive customer solutions; through access, the licensing and availability of intellectual property to promote innovation and to drive translation between Microsoft and non-Microsoft products; and through standards, engaging in global standards-setting organizations to foster industry collaboration and contributing to the development of standards. Microsoft participates in many formal and informal industry standards organizations to help define the specifications that are a prerequisite for interoperability.
Our product interoperability strategy works at two levels. First, we continue to support customers’ needs for software that works well with what they have today. Second, we are working with the industry to define a new generation of software and Web services based on eXtensible Markup Language (XML), which enables software to efficiently share information and opens the door to a greater degree of “interoperability by design” across many different kinds of software.
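As a rough illustration of what XML-based "interoperability by design" can look like in practice, the sketch below (Python, with hypothetical element names rather than any real Microsoft schema) writes and then reads a small XML document using only a standard library; any platform with an XML parser can consume the same bytes, regardless of how they were produced.

```python
# A minimal sketch of XML as a platform-neutral exchange format.
# Element names here (order, customer, total) are hypothetical, not a real schema.

import xml.etree.ElementTree as ET

# Producer side: serialize structured data to XML text.
order = ET.Element("order", attrib={"id": "1001"})
ET.SubElement(order, "customer").text = "Acme Corp"
ET.SubElement(order, "total", attrib={"currency": "USD"}).text = "250.00"
xml_text = ET.tostring(order, encoding="unicode")

# Consumer side: any system with an XML parser (Java, .NET, Python, ...)
# can read the same document back without knowing how it was produced.
parsed = ET.fromstring(xml_text)
print(parsed.findtext("customer"))            # -> Acme Corp
print(parsed.find("total").get("currency"))   # -> USD
```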
Our partnerships today demonstrate how we are working constructively with other companies, including working with open-source vendors like JBoss, SugarCRM, XenSource, and Novell to meet the needs of customers. This is because we understand that we have common customers that deploy our technologies in a heterogeneous environment. For example, SugarCRM, the open-source CRM solution, is deployed 30 per cent of the time on the Windows platform and JBoss open-source middleware is deployed some 50 per cent of the time on the Windows platform – and we need to work together to optimize this interaction for the benefit of those customers.
And we have recently begun using our Open Specification Promise (OSP) in those cases where it makes sense to make our patented technologies available to the widest possible set of users who need them to implement a particular specification. We worked with key people in the open-source community – including the open-source attorney Larry Rosen and Mark Webbink from Red Hat – to develop the OSP. Since its launch, we have brought 38 Web services specifications, virtual hard drive formats, anti-spam technologies and OpenXML under the Open Specification Promise.
Who are the technology vendors that you are partnering with at present to promote interoperability?
A great example is our partnership with Novell, which marks a historic bridging of the divide between open-source and proprietary software. We are collaborating on virtualization, identity management, systems management in environments where both SUSE Linux and Microsoft servers are deployed, and interoperability between the OpenXML and ODF document formats. I have already mentioned our collaboration with others such as XenSource, JBoss and SugarCRM.
Just a few days ago, at Interop Las Vegas 2007, Microsoft made a number of interoperability-related announcements: in the identity space, to foster improved interoperability for online identity management, and with Juniper Networks, to provide customers and partners with open standards-based interoperability between Juniper Networks Unified Access Control (UAC) and Microsoft Network Access Protection (NAP).
Another great example of our cross-industry, cross-organization collaboration on interoperability is the Ecma Open XML standard, an open standard developed in collaboration between Microsoft, Apple, Barclays Capital, BP, The British Library, Essilor, Intel, NextPage, Novell, Statoil, Toshiba, and the U.S. Library of Congress. The vote by Ecma International to approve the Office Open XML formats as an official Ecma standard, and to submit the formats to the International Organization for Standardization (ISO), represents a major step in reiterating our commitment to interoperability, and we feel privileged to have been a part of the process. As a starting point, Microsoft shared the technology behind billions of existing documents. Every company involved brought different perspectives and objectives, but all understood the importance of improved interoperability, functionality and security, balanced against the practical realities of document technology.
What kind of investment are you making in these activities?
As you can well imagine, interoperability is a journey. This is a long-term effort by the company, and we will continue to invest in this area. Other examples include the creation of the Interoperability Vendor Alliance with more than 35 other companies, and a year-long effort of working with the newly formed Interoperability Customer Executive Council, which enables our customers to play a key role in this process.