Collaboration between Firms in Information Technology

EE 290X Group G

Chris Rigatuso
Takeshi Tachi
Dennis Sylvester
Mark Soper


Contents

Introduction & Discussion of General Business Issues

Introduction
General Business Issues of Alliances
Systemic Interactions: forces and factors
Adoption of Technologies
Examples of general business issues

Standards Organizations

Cooperative and Competitive Standards Environment
Case Study:  MPEG - Collaboration among Different Industries
Case Study:  IETF - De facto Standard vs. De jure Standard

Consortia

Definition and Description of Consortia
Strategic Issues for Firms in Consortia
Success or Failure
Adoption of Technologies
Case Study:  SEMATECH
Case Study:  Alvey Program

Strategic Alliances and Joint Ventures

Definitions of Strategic Alliance and Joint Venture
Strategic Issues for Firms
Success or Failure
Case Study: General Magic

Technology Webs

Definition of Technology Web
Strategic Issues for Firms
Success or Failure
Case Study: The Network Computer Technology Web

Conclusions

Bibliography

Appendix: ODMG Specification


Introduction and General Business Issues


Introduction

In this paper, we deal with the forces that drive companies to form collaborative ventures. We focus on technology companies from the software, semiconductor, and consumer electronics industries. The various forms of interorganizational cooperation are explored, with their resultant costs and benefits. We attempt to evaluate the success and failure of collaborations, and to compare different forms of collaboration. This problem is compounded by the obvious result that more literature is available describing successes than failures. Add to this the problem that company formation internalizes the process and structure of the interorganizational form; standards bodies, by comparison, are more transparent and more publicized. We hope that the reader will appreciate this asymmetry of available information, and bear with us as we attempt to find the fundamental drivers of interorganizational collaboration and to predict future trends based on the evolution we see today.  We narrow our focus to technology products that several firms have a vested interest in developing and marketing.  The definitions of the various forms of interorganizational collaboration are deferred until their respective sections of the paper. We use the term collaboration to mean a generic, cooperative interaction between firms to achieve some agreed-upon objective.

The organization of our paper is as follows: we first discuss the general business issues of alliances, then examine standards organizations, consortia, strategic alliances and joint ventures, and technology webs in turn, illustrating each with case studies, before drawing conclusions.



Economics

Interorganizational forms, such as consortia, joint ventures, etc., are often chartered to create standard products. These products in the broadest sense can include consumer storage media such as tapes, compact disks, and digital video disks, as well as database software, which is typically purchased only by businesses. From an economic perspective, "standard" means "commodity": the standard product is non-unique, and economic theory says that, in this case, the basis for competition will be price. In technology, however, "standard" is something of a misnomer. It means "compatible", interoperable, or "interfaces conforming to industry-wide specifications", enabling buyers to purchase without fear of integration problems with other vendors' products that adhere to the same standard. Complementary products are produced by many vendors, each one embodying a complex set of tradeoffs among attributes, features, performance, functionality, price, and distribution. They may fulfill similar needs in the marketplace, but they are not "standard" themselves.

Differentiation, the ability to distinguish your product from all others, is considered the most powerful way to earn positive returns on invested capital for the producing firm. In a competitive market, this comes from building a product that is both 1) desired, and 2) hard to imitate. However, for products that display network externalities, we find that demand for the product is geometrically proportional to its "standardization" (see the sketch following the list below). The reason is that complementary products depend on rigid conformance to specifications. If one vendor controls all such complementary products, that is called a "proprietary technology". We do see cases of this, as evidenced by the Iomega Zip drive. This follows Gillette's razor/blade model, whereby the manufacturer sells the initial product (the player, or reader) at a seemingly good price and sells the repeat-purchase item (the blades, or cartridges) at a hefty profit margin, knowing that you do not want to purchase another "reader" (razor). This works for a subset of technology products. In this paper, we consider several cases and technologies where this model does not apply, because industry forces attract multiple suppliers. Some of the attributes of products which cause high switching costs are as follows:

  1. Many complementary products (computer operating systems, software applications)
  2. Many other users of some shared resource (spreadsheets or documents that are transferred between users)
  3. Learning time for understanding and using features (reading the manual, and trial-and-error)

Once these issues are taken into account, users are reluctant to switch to other products, unless there are substantial cost/benefit advantages.
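A minimal sketch of the network-externality intuition mentioned above, under a Metcalfe-style assumption of our own (value grows with the number of possible pairings among compatible users); all numbers are purely illustrative:

    # Hypothetical illustration: total value of a compatible product when
    # every pair of users can interoperate, and the resulting value per user.
    def network_value(users, value_per_link=1.0):
        """Total value if every pair of compatible users can interoperate."""
        return value_per_link * users * (users - 1) / 2

    for n in (10, 100, 1000):
        # Value per user grows roughly linearly with the installed base,
        # so demand feeds on itself as the product becomes more "standard".
        print(n, network_value(n), network_value(n) / n)

Under this assumption, doubling the installed base roughly quadruples total value, which is the sense in which demand grows geometrically with standardization.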

Technology product offerings are plagued with ever-increasing complexity. Ease of use may be improving, but system complexity plays a role in the consumer's decision process. A common question is: what will need to change or interoperate so that our new system will work?  High complexity means high switching costs. This means that complex products (such as relational database systems) generate high margins even at maturity, as evidenced by most large players in the RDBMS market. The switching costs are sometimes called "vendor lock-in". The risk aversion of the firm contemplating purchases, and the relative importance of the product or solution to the business itself (revenue generation), will also reduce the willingness to shift to a new paradigm (emerging technology). Databases have very high network externalities, and thus the value of market share is perhaps the highest of any product category.

The bottom line is that two principal factors serve to slow down new technology adoption, and both of them weigh heavily on interorganizational forms. First, market share: a first mover should leverage the complementary assets of other firms in achieving a successful product roll-out, to overcome the high switching-cost considerations listed above. Second, compatibility: the ability to work seamlessly with a host of existing products, and the hope of working with future add-on products. This expectation of long-lived usefulness is particularly strong in corporate investments in technology. It is analogous to the expectation of future returns that sets the prices of traded assets in the financial markets. This future-returns consideration often rules out suppliers that are new (start-up companies) in favor of established leaders in the industry. For example, Oracle Corporation is more likely to win a database installation contract than a smaller rival with superior technology, such as Informix Corporation. Thus, to prove viability, small start-up companies join alliances where complementary products are integrated, hoping to leverage the name recognition that a large, established player brings. In the same way, contract wins with large established customers are publicized, in hopes of imputing viability or acceptance to future customers.



Framework for cost/benefit considerations of interorganizational relationships

Costs to producers

  1. R&D required to bring out new products
  2. Participation (time/efforts) in standards definition efforts
  3. Possible cannibalization of existing product lines
  4. Possible obsolescence of existing product inventory
  5. Possible obsolescence of existing manufacturing equipment (write downs)
  6. Possible learning curve costs for employees
  7. Competitive advantage sharing amongst competitors: striking the balance

Costs to customers

  1. Costs to purchase new equipment (players, browsers, etc.)
  2. Costs to purchase new content-based (new format) information products
  3. Costs of new add-on peripheral devices for storage, archival
  4. Costs to replace previous formatted titles (old tapes, records, CDs, movies)
  5. Costs to determine suitability and value of new formatted products

Benefits to producers

  1. Growth in revenues for existing customers/markets
  2. Defend against declining markets: Survival
  3. Leverage skills, assets and relationships in place for excess returns
  4. Obtain entry into new markets which are growing
  5. Strategic options to enter new markets, should changes dictate it

Benefits to customers (influences willingness to pay)

  1. Better performance: speed, size, density, compatibility
  2. Better quality: feature combinations for "whole product solution"
  3. Better integration: more complementary products now and future
  4. Status: a signal of social and cultural success



Strategic Issues for Firms

Strategic decisions are often defined as "decisions regarding resource allocation and investment for the firm which are irreversible". According to Michael Porter [Strat], bargaining power shifts with the lifecycle of the technology or product. Among the constituents of the value chain, the group with the highest relative knowledge of the customer's requirements and buying decisions wields the most bargaining power. Typically, this is first the systems integrators, with their customer service and ability to integrate a new product with existing system infrastructure, then the value-added resellers (VARs). Next it becomes the direct sales force of the product company, as demand for a popular product grows. Then it becomes the downstream distribution channel, which may be a retailer. Finally it becomes the customer group, when the product becomes an accepted commodity. The final phase coincides with technology (or market) maturity: the knowledge is diffused throughout the market and held by consumers, with little need for specialized consulting.

Value added depends on the core competencies of your company supporting the life-cycle stage of the products currently provided. This takes careful attention over time. Systems integrators must keep their service mixture focused on early-stage technologies where they can maintain margin. This might seem contrary to the notion of forming an expertise in any one technology. The key is the relative value (knowledge) of the systems integrator to the rest of the market. As knowledge of the new product diffuses throughout the market, the early knowledge of the systems integrator becomes relatively less valuable; hence margins shrink and bargaining power is reduced. Thus a systems integrator should renew its mixture of supported products to focus on early-stage products. From the point of view of the innovating product companies, partnerships with consistently successful systems integrators who know this principle are key.


Resource Fits in Interorganizational Forms

Synergies in demand

Synergy in demand comes from demand increases that are more than the sum of the individual demands arising from separate product, service, or business assets. Marketing, distribution, and manufacturing each create value that the customer (at any level in the supply chain) will pay for. When these assets are combined, customers will pay more. Note that the set of customers may change in size and constituency in this case. Because it is difficult to compare demand as the product offering changes, and as the number of companies involved changes, this type of synergy is hard to measure. The combination of assets creates a harder-to-replicate whole product offering. "A whole product itself is the minimum set of products and services needed for customers in the target market to achieve the value proposition promised" [Moore 96]. This is particularly visible in computing technologies, where software, computers, networking hardware, and common internet infrastructure all work together to be valuable to consumers. The number of possible combinations of such products is mind-boggling, and most firms specialize in one area, partnering in various ways with other firms that specialize in the other (complementary) components.

"Progress toward a whole product is the critical feedback mechanism of partnership" [Moore 96].

In the same way, combinations across companies allow for unique "packaging" of assets. Combinations create complexity, and this allows uniqueness in product positioning. When the assets of many firms are involved, leveraging across customer bases, using brand equity in concert, and sharing dissimilar knowledge capital provide a strong offensive, and change the risk/reward characteristics of the product initiative markedly.

Synergies in cost

Synergy in cost comes from removing overlap in costs. Because the component costs of providing services and departmental functions are visible, cost sharing, or cost reduction by eliminating unnecessary functions, is easily quantifiable. Because partnerships are temporary, they usually do not achieve the cost savings that mergers do: companies retain their existing assets. Increasingly, technology is stateless. It crosses borders in the form of scientific papers, foreign sponsorship of university research, cross-border equity stakes in high-tech start-ups, and international academic conferences. Tapping into the global market for technology is a potentially important source of resource leverage. [Era]

Higher market share creates cost synergies by amortizing fixed overhead (managerial costs, advertising, administration) over a larger number of units shipped. This causes the total cost per unit to drop, and thus the profit margin per unit to increase. Firms combine forces to create the possibility of a larger overall market, reducing the average cost per unit for the market as a whole. This comes from two factors: "standardization", which increases overall demand from consumers, and a lack of competing choices, since firms are collaborating on a single industry-standard design. Notice that standardization is conceptually the opposite of differentiation, discussed earlier.
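The amortization argument above can be made concrete with a small sketch (all numbers hypothetical):

    # Toy illustration of cost synergy: fixed overhead (managerial costs,
    # advertising, administration) spread over more units lowers unit cost.
    def unit_cost(fixed_overhead, variable_cost_per_unit, units_shipped):
        return variable_cost_per_unit + fixed_overhead / units_shipped

    fixed = 10_000_000    # hypothetical annual fixed overhead
    variable = 50.0       # hypothetical variable cost per unit
    for units in (100_000, 500_000, 2_000_000):
        print(units, unit_cost(fixed, variable, units))
    # 100,000 units -> 150.0 per unit; 2,000,000 units -> 55.0 per unit

As units shipped grow twentyfold, the fixed-cost share per unit falls from 100 to 5, so the margin per unit rises at any given price.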

The "transaction cost" of equity stakes to combine resources across firms is much higher overall, with the associated loss in time, clarity of purpose, and morale, and money. The alliance benefits all players, and, in theory, chastises none.

"…the costs and problems of integrating cultures and harmonizing policy loom much larger in an acquisition than they do in an alliance." [Era]

Alliances multiply internal resources, especially when international markets need penetrating. Each partner wants something the originator has. The deal is a negotiated match of comparative advantages between companies.

Supply chain relationships are an important economic outcome from joint development efforts. When downstream firms help determine the specifications, they are satisfied that future products will meet their needs. These specs also influence components manufactured and delivered downstream. So while the downstream partner develops their own complementary product, they can confidently purchase and embed other conforming component products.


Marketing

In its simplest form, a partnership can occur between producer and consumer. The producer keeps the requirements and their evolution in full view of the customer during a perhaps lengthy design process. The continuous attention of both parties signals strong intentions, trust, and ongoing commitment to the process and to the financial transactions. It reduces risk for both parties, and helps ensure satisfaction with the outcome. This form of partnership is informal for most companies, but is central to the strategy of some very successful ones. For example, Nordstrom, an upscale retailer, has calculated the lifetime value (profit to the firm) of the average Nordstrom customer to be $80,000. This allows sales representatives to invest time, effort, and money in the relationship that transcend what most retailers would invest. This signals partnership with the customer and creates customer loyalty, making the strategy a self-fulfilling prophecy. Market research in technology has indicated that customers want a solution that is integrated, not just a collection of products that are claimed to work together. The customer does not want to incur the technology risk of configuring the solution. This gives rise to partnerships between product providers and service providers.

Joe Rodriguez, director of worldwide advertising for Novell, said, "Our [advertising] campaign is based on extensive research we conducted in the US, Europe, Australia, and Asia, which told us that the overriding concern among IT (information technology) professionals was their ability to connect disparate systems and people. Users also told us they need clear information on how to build an intranet solution." [Newsbytes 9/96]

Borrowing resources from more attractive factor markets.

The negotiated alliance in a sense creates a secondary factor market, in which business assets are combined according to the best possible allocation of resources, given the other resource constraints within each firm and the relative attractiveness of the new product potential offered via the alliance. This is supplemental to the firm's existing resource base, and presumably the new product market is supplemental to the firm's revenue base.

In earlier times, market entry was accomplished through internal development. The entry barriers are often high for any one company; entry would engender reactions from incumbent firms, and the resulting retaliation would be a competitive threat [Strat]. In the partnership development situation, the roles that each company plays are discussed (negotiated) up front, and thus companies can position their offerings to avoid as much direct competition as possible. Niches are defined at the round table in weeks or months, as opposed to over years in the free market. The goal is to perform this business development function by leveraging outside resources; the alliance carries a much smaller overhead and obligation than reassigning internal staff away from existing projects, or hiring or acquiring these business assets and resources on the open market. A collaboration is an exchange of knowledge capital. It is a dialogue, and a recognition of the value of an ongoing, close relationship as part of the economic imperative of value creation.

Accounting

Accounting issues are usually incidental to interorganizational forms, but they can play a role in defining a corporate strategy that takes advantage of existing laws and accounting rules. For instance, a research and development partnership allows a company to defer the recognition of R&D costs. The partnership raises funds from investors, often the partners that enter into the agreement. Those funds are then used to pay the company for research. Any patents or products resulting from that research belong to the partnership, but the company can either purchase the partnership or license the product. Thus the company controls the technology without reporting the expenses resulting from research costs, as the "revenue" from the partnership offsets the research expense. This arrangement has many of the attributes of a financial market option: the firm has a call option on the patents or products developed by the partnership, with the purchase price being equivalent to the exercise or strike price.

Consider Centocor, a biotechnology firm. This firm has organized three limited partnerships, CCIP, CPII, and CPIII. The general partners of these three are each wholly owned subsidiaries of the firm. The firm and each of the three partnerships sold units consisting of limited partnership interests and warrants to purchase common stock of the parent firm "Centocor". The CCIP, CPII, and CPIII development agreements allow for 110%, 114.5%, and 102.5% reimbursement of R&D funds provided by Centocor. These R&D financing arrangements (more complex than illustrated herein) permitted the company to conduct research without incurring debt or equity dilution, in addition to avoiding the effects of reporting the research costs as an expense. A complete treatment of these issues is beyond the scope of this paper, and the reader is referred to [White 94].
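A simplified sketch of the mechanics described above; the reimbursement rates are from the Centocor example, but the research cost and technology values are hypothetical, and the option framing is our own restatement of the analogy:

    # The sponsoring firm books partnership reimbursements against its
    # research spending, and holds what amounts to a call option on the
    # technology developed by the partnership.
    def net_reported_expense(research_cost, reimbursement_rate):
        """Research expense net of the partnership's reimbursement."""
        return research_cost - research_cost * reimbursement_rate

    def option_payoff(technology_value, purchase_price):
        """Call-option payoff: buy the partnership only if it pays to."""
        return max(technology_value - purchase_price, 0.0)

    for name, rate in (("CCIP", 1.10), ("CPII", 1.145), ("CPIII", 1.025)):
        # A negative result means the reimbursement exceeds the cost incurred.
        print(name, net_reported_expense(1_000_000, rate))

    # Exercise the "option" only if the developed technology is worth more
    # than the negotiated purchase price (hypothetical values).
    print(option_payoff(technology_value=5_000_000, purchase_price=2_000_000))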


Framework of Key Issues for Interorganizational Collaboration Success

  1. Centralization/Decentralization of decision authority
  2. Centralization/Decentralization of information resources
  3. Ownership of deliverables
  4. Fluidity/Rigidity of Process and procedures
  5. Admission requirements of new entrants
  6. Distribution rights and royalties
  7. Mixture of social/professional/business activities
  8. Time horizon for the development and distribution of deliverables
  9. The abilities and commitment of the constituents relative to their other commitments
  10. Members' interpersonal relationships (Academic, Professional, Industry, Media)
  11. Mixture of motivations and their time horizons for member firms
  12. Mixture of motivations and their time horizons for individual members
  13. Legal structure (non-profit, joint venture, etc.)


Managerial thinking

A focus on the individual decision maker continues to dominate American management tenets. As the market grows in complexity from continuing trends such as globalization, technology, and competition, new forms of organization appear to provide and redistribute economic benefits. These benefits take the form of time, collective effort, and investment capital as inputs, and of reduced risk and return on invested capital as (hoped-for) outputs. Technology development is increasingly influenced by questions and problems of integration. Interoperation (the ability of technologies to work together) is a dominant theme. Typically, products and technologies originate from different firms at different points in time, and have lifespans ranging from 1 to 20 years. This yields an amazing array of compatibility problems, giving rise to industries devoted to providing the knowledge, labor, and tools to solve the problems thus created.

Amid this trend of increasing complexity, information technology (business computing applications and infrastructure) and consumer electronics are two highly visible industries with changing technology; in the last 10 years especially, there has been a rise in the number of standards, and of standards organizations devoted to defining rules of design for products. In this paper, we examine several case studies, and consider some of the economic, social, and political issues surrounding the formation of interorganizational groups that define and promote technology standards. This is a broad topic, and this small paper cannot claim to do it justice. However, we hope to bring to bear new research, interview results, a survey of past results from the academic literature, and some recent cases discussed in the press. Through these, in conjunction with our list of book references and hypertext links to technology consortium web sites, we hope to provide value to future readers everywhere.

The current environment -- the range of users and technologies -- is so complex it defies comprehensive analysis. There's no time for painstaking consideration of a given product's implications for a range of current customers and systems. There's no way to create a bulletproof, comprehensive environment. Vendors discover the implications of their new technologies during beta tests involving hundreds of thousands of sites and during the first product release cycle. Only in actual use can vendors see the real problems and the appropriate responses. And so we have episodes such as Netscape's recent product release that contained unanticipated security holes. [Taligent 95]

Decision Making

Interorganizational decision making seeks means of facilitating the coordination of decisions between decision-making units. The first step in facilitating this coordination is to make decision makers aware of the potential increase in collective payoff available to interdependent decision makers who coordinate their actions. This is the fundamental starting point which creates motivation for future efforts. Consider the effort that must be devoted to determining the payoffs available to the parties who comprise a joint decision problem, such as a technology standard. Two common barriers, perception and communication, often forestall this coordination. These barriers also prevent decision units from recognizing opportunities for cooperation. The focus on individual decision responsibility has contributed to the suboptimal performance of decisions which involve interdependent parties. Classical hierarchical management seems ill-suited to problems in which joint decision making provides synergistic benefits to the member organizations. [IODM 72]
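A toy coordination game (payoffs hypothetical, our illustration) makes the collective-payoff point concrete: two firms each pick a design, and the collective payoff is highest only if they pick the same one.

    # (firm_a_choice, firm_b_choice) -> (payoff_a, payoff_b)
    payoffs = {
        ("design A", "design A"): (5, 5),
        ("design B", "design B"): (4, 4),
        ("design A", "design B"): (1, 1),
        ("design B", "design A"): (1, 1),
    }

    # The coordinated outcomes strictly dominate the uncoordinated ones,
    # but neither firm can reach them without awareness and communication.
    best = max(payoffs.items(), key=lambda item: sum(item[1]))
    print(best)   # (('design A', 'design A'), (5, 5))

Unless both firms perceive the (5, 5) outcome and communicate about it, each may settle for an uncoordinated (1, 1) result, which is exactly the perception and communication barrier described above.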

In the broadest sense, there is a tradeoff between centralization (control, individual authority) and decentralization. The organizational form determines the cost of, and access to, the information on which joint decisions will be based. Fundamental questions ensue: where is the information we need? Who has the best access to it? Who is best equipped to make rapid use of this information? Clearly these questions can excite both competitive and cooperative urges in interorganizational groups. Information pertaining to customers, existing products' performance, and future products' success can be highly sensitive, yet critical to the success of near-term joint decisions. It is here where the perception and communication problems become influenced by politics. The changing importance of individual payoffs versus collective payoffs gives rise to a changing mix of political influence and group dynamics.


Horizontal Layering Phenomenon

Horizontal layering of technology solutions diffuses risk and reward across multiple vendors, allowing many companies to share the results of a successful whole-product solution. The old world of the vertically integrated solution from one vendor was a winner-take-all market structure, which IBM dominated for many years in the computing realm. In that structure, every company plays against the leader, fighting for wins at the expense of each other and of the leader. The market efficiency of this structure is clearly less than that of cooperative horizontal layering, in which there is a leader for each layer of a whole product solution. Additionally, because many companies are vying for dominance in any layer, the customer is relatively confident that a cohesive long-term architecture will endure. No one player can switch the interfaces without risking incompatibility, with a ripple effect across neighboring layers.

Because many companies participate in the horizontal layering phenomenon, marketing promotion becomes a shared resource, providing returns across all players which are part of the solution. In the case of vertical integration, each single company must become its own bottleneck by definition; because so much more of the solution is provided per vendor (perhaps the entire solution), the throughput in providing product (and therefore revenue) is much lower. When the whole product solution is dispersed across multiple vendors, ideally each vendor picks the component which best matches its core competencies, and thus the returns come quicker. Other players are encouraged to enter to supply the missing pieces of the whole product solution. The notion of creating a web, an open market that leverages the marketing hype of all participants, can be combined with complementary products to form an open-systems cooperative business environment. Sun Microsystems essentially created this phenomenon, in its attempt to lower costs by designing workstations to use off-the-shelf parts:

"The reason that Sun (Microsystems) could out-produce Apollo (Computer, Inc.) was that its open systems strategy kept it from ever becoming a bottleneck in its whole product development. Instead it leveraged partnerships to outsource the needed components, relying on the natural mechanisms of a free market to bring together the whole product at the end." [Moore 96]

Case Studies

Object Database Standard ODMG-93: an interorganizational group that became a standard

Object databases first appeared as technology products around 1990. The first companies were formed in 1988, in response to growing demand for storage systems designed to support object-oriented programming languages, especially C++. Computer-aided design and manufacturing (CAD/CAM) systems were among the first applications to use this new technology. The first standards efforts began in 1991, and the first reference standard was published in 1993, under the name ODMG-93, after the Object Data Management Group that produced it. Appendix 1 describes the details of ODMG's goals and the structure of the reference model. ODMG-93 is a standard established by a working group composed of representatives of five of the OODBMS vendors. The standard is documented in "The Object Database Standard: ODMG-93", ed. R. G. G. Cattell, ISBN 1-55860-302-6. Updates can be found online at http://www.odmg.org/.

Reason for Founding

Rick Cattell, an engineering manager with Sun Microsystems, started ODMG in 1991. From 1991 through 1993, it was a loose organization. There was an emerging industry of object databases. These databases allowed new forms of linking and defining "objects" that supported complex applications. Objects represented a new style of software programming that packaged data with procedures and allowed higher productivity through reuse of past work. Traditional databases did not support multimedia objects, or networks of tightly interconnected objects, well. The rise in popularity of C++, the object-oriented descendant of C, was a driving influence. Most relational databases, which grew in popularity throughout the 1980s with companies such as Oracle, Sybase, and Informix, did not yet have C++ interfaces. This was due in part to the mismatch in representation style between tables (rows and columns) and the C++ object-definition style of programming.

Technical Goals

The ODMG is an informal consortium of ODBMS vendors working on standards to allow portability of customer software across their products. By supporting the same interface, they expect to significantly accelerate progress towards effective adoption of the standard. By submitting it to relevant organizations (OMG, ANSI, ISO, STEP, PCTE, CFI, etc.), they plan to encourage commonality among the object database portion of all those standards. To date, the lack of a standard for ODBMS has been a limitation to their more widespread use. Much of the success of relational database systems is due to their standardization in terms of a data model and SQL language. Standardization is even more important for ODBMS because their scope is more far-reaching, integrating the programming language and database system, and encompassing all of an application's operations and data. A standard is critical to making such applications practical.

Initial Constituents

Five vendors, Objectivity, Object Design, Versant, Servio, and Ontos, had representatives on the team. Rick had been an academic database researcher prior to his position at Sun Microsystems. In 1992, work began on the published standard that would define the interfaces for programming and data definition for object-oriented database systems. In 1993, ODMG was incorporated as a non-profit organization, drawing on experience with the SQL Access Group: members of ODMG corresponded with that group, and some of the procedures were modeled after its own. In 1993, Doug Barry, an independent consultant, was asked to become the business director because of his neutrality. Doug knew the technology, knew the press contacts, and knew the issues from the customer's point of view. Additionally, he no longer had ties to any product vendor, and thus could act as an unbiased representative of the standard.

The Structure of ODMG

ODMG is affiliated with the broader industry standards group OMG (the Object Management Group, which defined CORBA). The roles of OMG (to define, to promote, to educate, etc.) have in some instances interfered with the charter of ODMG during its evolution. The early meetings involved up to 40 attendees in a lecture-style setting, and with no continuity of attendees, redundant questions were often brought up. Because of the urgency of ODMG's charter, the board members of ODMG agreed to break away from the structure of OMG meetings. This led to the structure defined below.

Types of Constituents

Doug realized early on the benefits of inviting the user community for ODBMS products to be involved in the standards definition process. The notion of "reviewers" was added. Reviewer members must be accepted by the voting members of ODMG. The two top criteria for acceptance were technical knowledge and consistent attendance. The first criterion kept people with a marketing or sales focus out of the group, and the second ensured continuity. The continuity of course helped the efficiency and productivity of the meetings. Consistent attendance applied to all voting members, non-voting members, and reviewers. The voting members were required to spend 25% of their time on the ongoing definition of the standard. This included aspects such as feature requirements from customers, functionality specification, and a subtle understanding of issues between computer language specifications (C++, Smalltalk, SQL) and their impact on the interface definitions for the databases. Non-voting members were asked to secure a 10% time commitment, and to attend all meetings on a consistent basis. No rotation between employees at a firm was permitted.

The published standard was organized into chapters of a book. All members were organized into workgroups of roughly five people: "dinner-party size". This was found to be an efficient size for the exchange of ideas and for progress. For any change proposal to be ratified, it needed to pass two rounds of voting: first at the workgroup level, and second at the board level. The chair of each workgroup was nominated by members, based on experience and technical respect, and that chairperson empowered others within the workgroup to take on individual tasks.

Voting members' dues were twice those of non-voting members. The executive director, Doug, receives a retainer to cover the allocation of his time, since he is not a salaried employee of any firm in this industry. The organization is virtual in the sense that it has no headquarters location, only meetings and results: a published standard, an agenda of new features and functions that all members support (in principle), committed members, and, of course, a website.

The recent popularity of Java has provided an interesting opportunity to the ODMG group. Past languages were defined well in advance of the group's organization (C++, Smalltalk, C, SQL). Since the ODMG group has a well-defined process and is running smoothly, there exists the possibility of defining the standard before early proprietary features are added into the product definitions of the ODBMS vendors. This prevents retrofitting risk, and provides vendors the security that their products will be compatible with the standard. Clearly some work needs to progress in parallel; it is hoped, however, that general releases will adhere to the published standard.

Forces driving the evolution of the standard

Customers demanded certain features, and patterns of requirements emerged from different industries of customers. The members of ODMG (including ODBMS customers) recognize patterns in demanded features, and seek to meet the requirements in a way that maximizes customer satisfaction, technical feasibility, and elegance of design principles (symmetry, orthogonality, predictability, and other laws of language design). Because the objects in the database are created and accessed over long periods of time, by possibly many users, in different locations, with different types of applications written in different programming languages, a myriad of problems is possible. Resolving these (at times) competing issues is non-trivial. Add to this the fact that member-vendors, having different products, experiential backgrounds, and programming philosophies, often have competing desires for the standard's evolution. They are bound by the common good of expanding the size of the market. Customers fear lock-in, and need the standard as a signal of economic integrity which acts as a "guarantee of quality" over time. The vendors remain incented to provide continuing product excellence and reasonableness of price, for fear of prompting a switch to a "standard-compatible" database from another vendor. The voting members have the vision and commitment to make progress on the standard beyond the features demanded by current customers.

Renewal

The eight board members are the workgroup chairs. They decide the charter for the ODMG organization. At the end of a release of the standard (release 2.0 is due out in summer 1997), they solicit feature proposals from any members. All features are recorded, and members are asked to rate each one on a scale of 1 to 5. The tallies are made; features at or above the 50th percentile are kept, and those below are dropped. From this new list, the same rating process is repeated, until a coherent set of features that can be managed by the workgroups is achieved (a sketch of this procedure follows).
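A sketch of this culling procedure as we understand it (feature names and ratings are hypothetical):

    import statistics

    def cull_features(ratings, target_size):
        """Repeatedly drop features rated below the median until the list
        is small enough for the workgroups to manage."""
        features = dict(ratings)
        while len(features) > target_size:
            scores = {f: statistics.mean(r) for f, r in features.items()}
            cutoff = statistics.median(scores.values())
            survivors = {f: v for f, v in features.items()
                         if scores[f] >= cutoff}
            if len(survivors) == len(features):
                break             # all tied; stop rather than loop forever
            features = survivors  # in practice, members re-rate survivors
        return sorted(features)

    proposals = {"views": [5, 4, 4], "triggers": [2, 3, 2],
                 "java binding": [5, 5, 4], "bulk load": [3, 2, 3]}
    print(cull_features(proposals, 2))   # ['java binding', 'views']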

Staying Focused

The charter exists in written form, and the IRS requires the non-profit organization to stay within its charter; a two-thirds vote of the voting members would be required to change it. Because all the vendors are under time-to-market pressure, and the members of ODMG are employees committed to their respective organizations, time is the key resource. The policies and procedures provide an ongoing means of achieving standards-definition progress that benefits the member vendors and their customers. The group's charter is to generate the standard, not to disseminate, promote, or teach it. This standard has in no small part secured the longevity of the nascent ODBMS industry, despite the overwhelming economic advantages of the entrenched relational database vendors. [ODMG Int97]



The Flashpix Case

Presently, the increasing digitization of older forms of communication and expression is creating applications that require standards which bridge separate industries. Digital photography and video creation are examples of this. Consider the digital camera: you take a snapshot; it is stored in a standard file format; that format is read by a special application on the computer. That application may allow filtering, editing, and changes to the image. The new processed image may become part of a multimedia application or presentation. Three different industries are therefore involved: the camera manufacturer, the image application developer, and the developer of the presentation or media application software. Transformations are usually possible in the intermediate application; however, the destination must also be considered.

Flashpix is a digital image format for digital photography and computer applications that was introduced in 1996 by Kodak, Microsoft, Hewlett-Packard, and Live Picture. It promised to overcome the limitations of previous standards (de facto GIF from Compuserve, and ISO committee standard JPEG) which were faulted for their slow transmission and loss of resolution, respectively. By the end of 1996, Flashpix had gained support from a number of significant manufacturers due in part to the free licensing plan created by the founders. Flashpix was appealing due to the structure of the format which allowed multiple resolutions to be stored and the common elements of picture colorization to be separated from each resolution. These features overcame many of the technical limitations of the previous formats.
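The multiple-resolution idea can be sketched as an image pyramid (our illustration of the concept, not the actual Flashpix file layout):

    # Store an image at a ladder of sizes, so a browser or printer can
    # fetch only the resolution it needs instead of transferring the
    # full image and discarding most of it.
    def pyramid_levels(width, height, min_side=64):
        """Return the (width, height) of each stored resolution level."""
        levels = []
        while width >= min_side and height >= min_side:
            levels.append((width, height))
            width //= 2    # each level halves the previous in each dimension
            height //= 2
        return levels

    print(pyramid_levels(3072, 2048))
    # [(3072, 2048), (1536, 1024), (768, 512), (384, 256),
    #  (192, 128), (96, 64)]

Shared data such as color information can then be stored once for the whole file rather than once per level, which is the separation the paragraph above describes.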


Synopsys PDEF Case

Switching cost concerns lead interface definitions

Oftentimes, competing software vendors are drawn together when they face a single large customer who needs their separate products to work together in some large integrated system. For example, Synopsys and Cadence, both leaders in the EDA (Electronic Design Automation) market, have tools that Intel needed for future semiconductor designs. Synopsys with its Design Compiler, and Cadence with its Dracula and Verilog tools, each perform part of the coordinated effort to produce and test next-generation computer chips. In general, vendors may have a suite of tools, but the success of one particular tool may draw the majority of customers.

Competing vendors could in theory choose to standardize or "open up" an interface so that its details are published, to make interoperability easy and to encourage new entrants for complementary tools. However, they often don't do this -- unless pushed or dragged by large customers. The reason is that by having a closed system, vendors retain control. If an interface to some software tool is not published, then tools that work with it must either come from the original vendor (customer lock-in) or be created illegally through reverse engineering. This is similar to patent protection, in that it is a legally enforceable barrier to entry for products which use the interface. The size of the customer and the potential orders provide the incentive for competitors to reconsider cooperation.

Vendors don't want to lose the power to innovate that a closed system affords. Adherence to a standard may preclude future functionality because of its technical constraints. The burden of a standard also means vendors may lose control of customer accounts. The chip design flow at issue here has three major stages:

  1. Logical Design: Create the netlists (Synopsys Design Compiler, the leading tool)
  2. Physical Design: Routing the circuits and placing the components for optimal performance
  3. Verification: Simulating the operation of the chip design, collecting performance data (Verilog from Cadence, the leading tool)

The customer wants best-of-breed tools, and wants them to work together in an integrated fashion. Since the tools are from different vendors, the ability to integrate different versions of the tools together, to allow upgrades over time, is a concern for the customer. The dominance of the customer in this case, Intel, provides a huge incentive to the two potential suppliers, Cadence and Synopsys. Intel can easily state its intention to buy tools under the condition that they work together, thus simplifying the entire design process for Intel. This increase in value is part of the "consumer surplus": the sum of savings accrued through the combination of process simplification, automation, improved time-to-market, and the accuracy and reliability permitted by the new tools. Some of this cost is borne by the "competing" suppliers, in this case Synopsys and Cadence. However, these vendors can readily see a stream of revenues from future product versions, annual support contracts, and complementary future products in the pipeline. This revenue stream, taken into account by the suppliers, may be harder for the consumer (Intel) to quantify; thus each party sees a different set of costs and benefits, and a win-win contract is more probable in this context. From this interaction, a standard did emerge. It is called PDEF (Physical Design Exchange Format), and it is used to exchange chip design data between logical design and physical design tools.

An additional benefit of doing business with Intel is the "halo effect" of working with a world-class customer. It helps win other new business. This is anticipated by the suppliers, and provides incentive for them to work together to create the standard.

The customer is focused on the standard as a credible guarantee of interoperability, which essentially means the customer is not locked in. Future offerings which conform to the standard are available and permit easier switching (reduced switching costs), should the customer feel that the price/performance tradeoff is an improvement over its current technology. So, standards are a risk to the vendors which adhere to them, yet they reduce buying resistance from customers. In fact, customers with enough buying power can induce the formation of standards by asking for an integrated solution, as in the above example. Alternatively, the customer may be presented with a proposal for a standard by a consortium of vendors seeking to induce a purchase from the reluctant customer. [Synopsys Int97]


The Taligent Case

The Taligent OS project was born under the code name "Pink" at Apple Computer Inc. When Apple spun off Taligent in 1991, the new company was given the mission to deliver the first completely object-oriented operating system; IBM and Apple incorporated the joint venture "Taligent" in 1992. IBM's decision to pull the plug on Taligent as an independent company confirms a shift to a new business model for software products. Based on past experience with NeXT, Inc. and Taligent, both providing entirely new operating systems and programming environments -- and failing -- the new model of software success emphasizes incremental innovation with technology and rejects the assumption that customers will make sweeping changes to their technology infrastructures with high upfront risks and costs.

Taligent promised to solve a broad array of problems with object-oriented programming, component frameworks, a distributed object infrastructure and an innovative new user interface designed for networks. In the end, however, Taligent was too sweeping, too grand -- in every respect. Taligent's technology included a new operating system platform, a development environment and a user environment. Taligent hired hundreds of developers and marketers and hosted high-tech demonstrations at key venues. Unfortunately, the company demanded too much of the customer.

Taligent will now be absorbed by IBM; however, Big Blue already has many competing investments in distributed computing and object-oriented technologies. Furthermore, IBM has never seemed to know what to do with the bright ideas coming out of Taligent. Taligent's passing signals the end of an era in which companies introduced sweeping new environments designed to replace obviously inferior products. But while Taligent was a leading proponent of this approach to product development, it is not the only recent example. IBM managed OS/2 according to this model for a long time. And when General Magic, Inc. sought to design a complete environment for intelligent distributed applications, it found no existing technology worthy of the challenge; everything had to be new. In retrospect, it's clear the ending we have now reached has been coming for a long time. We've now entered a new age: an era of opportunistic technology. This era features companies that make incremental use of new technology within existing structures. They view existing technology as a base upon which to innovate, and they recognize users' preference to adopt new technologies in stages. Netscape Communications Corp. exemplifies the thinking of this new era. Netscape's products are based on protocols and services layered on top of the Internet by a loose-knit collection of developers. Netscape constantly improves this foundation work and builds on top of it. New products appear every six months or less. Product strategies change even more frequently. [Taligent 95]

With Taligent and NeXT Computer, the limits of technology adoption were stretched beyond business viability. Both of these companies lost the battle. These were tightly coupled joint ventures, with the assets and motivations concentrated within the companies' walls. They are examples of joint ventures, which are discussed in detail below.  NeXT Software, Inc., a downsized and focused outgrowth of NeXT Computer, was recently acquired by Apple Computer, with plans to enhance the operating systems for future Apple product designs.



Cooperative and Competitive Standards Environment


1. Today's standards-making architecture

The architecture of standards-making organizations in the telecommunication and IT industry fields has fundamentally changed over the past decade. The old architecture was simple and well-bounded around a handful of bodies with explicit international, regional, national, and subject matter jurisdictions. These standards-making bodies were virtually sovereign, following slow, deliberate, time-honored processes that remained essentially unchanged for the preceding 130 years since the first multilateral telecom standards conference.

This old architecture has been fundamentally altered. Constellations of new bodies now exist, with diverse new constituencies and boundaries, and all are competing in a global standards marketplace. Even the form of these new bodies differs dramatically from that of traditional organizations.

The chart below, depicting the standards-making universe (popularly known as the Rosetta Stone), seeks to provide perspective and portray relationships within this new architecture. The chart was originally prepared for the first Standards Summit in 1990 and has been revised continually since that time to reflect the newest architecture [Rut1996].

 

[Chart: Standards Organizations in Telecommunications]


2. Factors which generated this rapid transition

The reasons for these architectural changes include:

    1. Moore's Law (i.e., electronic technologies are changing dramatically on an average of every two years). Furthermore, in highly dynamic environments, fundamental rates of change are measured even in months. (Rutkowski's Law)
    2. Most telecommunication and information markets are very competitive. The marketplace, not institutions and government, decides winners and losers.
    3. Most of the information infrastructure has moved from being a public good to now being a private commodity. Millions of individuals and organizations now own and design a collective national and global infrastructure.
    4. An increasingly global competitive environment prevents organizations from setting solutions that are favorable to a particular country or market segment. Attempts by governments to mandate specific directions will likely only disadvantage that nation or market by limiting both the quality and performance of available products.
    5. The requisite manner in which standards are developed and implemented for computer network environments is fundamentally different from hardware-oriented telecommunication fields.
    6. Time-to-market has become the single most compelling factor for both service providers and product vendors. This concern is a byproduct of rapid technology change, a robust competitive marketplace, and a globally competitive environment.
    7. Differences between "bottom up" and "top down" initiatives. Top down initiatives are characterized by grand telecommunication and information infrastructure standards programs begun through traditional international organizations. Meanwhile, most revolutions in the telecommunication and IT industry have occurred from the bottom up.

These factors have produced a very different standards making architecture. Today, direct government involvement in picking winners and losers is likely to be avoided. With few exceptions, every direct governmental intrusion into the standards marketplace over the past decade has had major adverse consequences. On the other hand, minimal government involvement, designed primarily to foster research, collaboration and technology transfer among developers and rapid dissemination of standards, appears to work well.

 


MPEG (Moving Picture Experts Group)

Collaboration among Different Industries 

1. What is MPEG?

MPEG (Moving Picture Experts Group) is a group of people that meet under ISO (the International Organization for Standardization) to generate standards for digital video (sequences of images in time) and audio compression. In particular, they define a compressed bit stream, which implicitly defines a decompressor. However, the compression algorithms are up to the individual manufacturers, and that is where proprietary advantage is obtained within the scope of a publicly available international standard [Gad1996].

MPEG meets roughly four times a year for about a week each time. In between meetings, a great deal of work is done by the members, so it doesn't all happen at the meetings. The work is organized and planned at the meetings. 


2. How did firms cooperate?

The MPEG standard was established through the joint work of organizations across industries: consumer electronics, the computer industry, the telecommunications industry, and the broadcasting industry. While the original standard, MPEG-1, was designed mainly for storage media, MPEG-2 covers a much wider range of applications, such as digital broadcasting (including HDTV), video-on-demand, and higher-density storage media (DVD) (refer to the figure below). This strong incentive for interoperability beyond industry bounds naturally required collaborative design work among the participating organizations.  As a result, due to its technological superiority and potential benefits, this newest video compression algorithm was standardized under ISO and even adopted by the ITU as an authorized international standard.  Moreover, it has been adopted in the Grand Alliance of HDTV by the United States and Europe.

[Figure: 1-Core & 3-Layer Model of Multimedia Service [Fuj1995]]


3. A success of General Instrument Corporation

When talking about MPEG, one episode sometimes mentioned concerns another digital compression technology, DigiCipher, developed by General Instrument Corporation. DigiCipher is a technology similar to MPEG-2, but was realized prior to MPEG-2.

While participating in the MPEG standardization activity as one of the cooperative members, GI kept its strategy flexible. Their contribution to the MPEG standard was of course large, given their background in digital video compression technology, but they also began to release DigiCipher as their own-brand video server targeted at broadcasting companies.

One reason for this early start was their strategy of keeping a certain market share in professional video servers. Since standardization activities are time-consuming, it was reasonable to release their incompatible products for stand-alone use, in which interoperability is not necessarily a big issue. Then, once MPEG-2 was finished, they could offer upgrades to their existing customers.

Another reason for GI's decision is more technological and complicated. Generally speaking, selection of open standards is a prerequisite for any modern system. However, especially in the case of MPEG, adoption of open standards is an insufficient criterion for ensuring high quality of compressed video or even interoperability.

As mentioned in the first paragraph, in MPEG the compression algorithms are left up to the individual manufacturers, which is where proprietary advantage is obtained within the scope of the international standard. This means that, even within the range of the standard, the picture quality of 1.5 Mbit/s-compressed video varies widely depending on how effectively each manufacturer compresses it. This compression know-how is available only to well-experienced manufacturers, and is deliberately not mentioned in the standard at all, to preserve room for competitive improvement.
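A toy quantizer (not MPEG itself, just our illustration of the principle) shows how a fixed decoder still leaves quality up to the encoder; all numbers are illustrative:

    # The "standard" fixes dequantize(); every vendor's decoder behaves
    # identically. Encoders are free to pick the quantization step, and a
    # cruder choice yields a conforming stream with worse reconstruction.
    def quantize(samples, step):
        return [round(s / step) for s in samples]

    def dequantize(codes, step):
        return [c * step for c in codes]

    signal = [0.9, 2.7, 3.1, 4.8]
    for step in (0.5, 2.0):          # a careful encoder vs. a crude one
        recon = dequantize(quantize(signal, step), step)
        error = sum(abs(a - b) for a, b in zip(signal, recon))
        print(step, recon, round(error, 2))

Both outputs decode correctly under the same rule, but the coarser encoder loses far more detail; that gap is where vendor know-how lives.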

Also, in many cases, only by supplementing the use of open standards with a comprehensive licensing program will true interoperability of equipment from multiple suppliers result (open standard vs. licensing) [Tay1996]. This is mainly due to the fact that MPEG targeted such a wide range of applications that full conformance to the standard would lead to unrealistically expensive designs.

In summary, these contradictions arise from the flexibility and expandability of MPEG, which may be one characteristic feature of recent interorganizational designs. Indeed, General Instrument is now an MPEG product supplier alongside its own DigiCipher products.


4. Remaining patent problems

MPEG core technology includes many different patents from different companies and individuals worldwide, but the MPEG committee only sets the technical standards; it does not deal with patents and intellectual property issues. An effort was therefore started to form an MPEG-related licensing entity that would provide efficient access to the intellectual property rights (IPR) necessary for implementing MPEG technology worldwide. This group supports initiatives leading to the establishment of a patent pool for MPEG-2, outside of MPEG [Sch1995].

So far, the MPEG IPR Group has reached consensus on a two-phase action plan for establishing a licensing entity. The first phase will identify which patent holders are willing to participate in this effort and whether they own rights necessary for implementing MPEG core technology. The effort will also concentrate on identifying patent holders who believe they own, or are likely to own, patents that others must license in order to conform to the MPEG standard.

The second phase will include determining the entity's administrative structure as an ongoing effort that works with new licensees and licensers, the licensing structure, and the allocation of royalties.

In order to promote the distribution of MPEG technology with low entry barriers for manufacturing firms, participants in the MPEG standardization effort postponed efforts to settle the complicated patent issues. It is clear, however, that as the MPEG market grows, more conflicts among patent holders are likely. The MPEG IPR Group's effort to settle the patent problem will be a key point for the future of the MPEG business.

 


IETF (Internet Engineering Task Force)

De facto Standard vs. De jure Standard

1. Standards in the Internet

In data communications, a standard specifies a set of procedures. A specification typically pertains to computer-to-computer interaction but might be more limited, such as describing only the format of data, rather than all of the rules for passing that data back and forth. While mildly controversial, it also is legitimate to specify characteristics of information that is exchanged among humans, such as for electronic mail address strings that should be placed on business cards. Standardizing such strings greatly facilitates the "out of band" passing of information which eventually winds up as input data to a computer.

A standard might also specify the procedures to use when operating a system. Typically, Internet standards shy away from such dictates, since there is a strong desire to leave network operators free to conduct business as they see fit.  However, guidelines are occasionally published when conformance to them will be highly beneficial for the overall health of the Internet. Still, such guidelines are not formal standards.


2. De facto standard vs. de jure standard

Discussions often distinguish de jure from de facto standards. As the name denotes, the former is made legitimate by force of law, whereas the latter is legitimate by virtue of popularity. Since the Internet's researchers had less intention of developing a global service than telecom researchers, its technology definitely falls into the de facto camp. Unfortunately, this fact is sometimes used against the Internet, to suggest that it is less legitimate than the formally commissioned products of other groups. One should note instead that its adoption has been possible only because of its very strong virtues.


3. Noticeable points in IETF success

The Internet standards process did not set out to achieve its current role. It was only the side-effect of a small research community. While that community was reasonably clear about the basis for its good work, the global perception of that success is quite recent. Hence, it is worth considering the constituents of this remarkable process [Cro1993].

  • Simple, immediate goals: IETF specifications usually attempt to solve specific, immediate problems rather than encompass a wide range of long-term goals. This permits work to be directly responsive to immediate requirements. Keeping goals simple tends to make the resulting designs simple as well.

  • Comprehensible specifications: The IETF has very loose requirements for the style in which its standards are written. In general, this results in documents that are easily read by the average implementer. Although formal analysis often uncovers ambiguities and errors in such documents, the informal network of implementers conveys whatever additional information is necessary.

  • Easy access to specifications: The existence of the Internet Repository means that anyone with Internet access can obtain standards and working documents of the IETF at no additional cost. This is in marked contrast with many other standards organizations. It is another example of the way IETF work is highly accessible to the broadest audience, permitting better analysis and broader use.

  • Incremental enhancement: Because the Internet can field solutions quickly, Internet standards benefit from considerable operational feedback. This, in turn, permits another round of specification, if needed.

  • Diverse contribution: Newcomers to the IETF never quite believe that the process is as open as it is. Anyone with a fresh perspective and clear insight is always welcome. As the Internet increases its global reach, many IETF contributors participate exclusively by email.

  • Live with the results: IETF participants usually are directly involved in producing or using the technology; in particular, they are not professionals in standards development. Even more important, IETF members build what they specify and then use it. The Internet itself provides a very large-scale live test environment, and, as is often true with software, once something passes the test it is instantly used in production. If a working group's efforts are not useful, this is quickly evident before the work is made into a standard.

  • People, not procedure: For all of the increasingly formal procedure in the IETF standards process, the real work of the IETF relies on individual judgment as well as individual effort. The formal rules provide beacons for guidance and synchronization. The real test applied to difficult choices is whether the people involved conducted themselves fairly and made the best choices under the circumstances.

  • On-line collaboration: As described above, the ability to have on-line working group participation is paramount. It fundamentally eliminates the barriers of time and cost for participation and contribution, enormously increasing the number and diversity of people who can contribute.


4. Failure of the OSI community

Over the last several years, work from the Internet community has shown vastly greater market acceptance and use than the work of the OSI community. The engineering reasons for this are puzzling to pin down. One possibility is the OSI community's desire for functional completeness and accommodation of all interests, which leads to a philosophy of including as much as possible in a design.

    In contrast, successful IETF working groups are driven by near-term needs and consequently try to produce designs that remove as much as possible. At first glance, this approach should produce highly limited designs. The trick in the process appears to be the group consensus requirement. As one would expect, each participant contributes their list of desired features, but the short time-fuse on the work requires that the group reach consensus quickly. This can only be done by removing features, since only a small core of features will be clearly acceptable to most participants. (The alternative approach of including all of everyone's preferences requires too much group debate and results in a design that is obviously unacceptable.)

However, the process of removing features also requires some assurance that some of those features can be added later. Hence, the design usually permits extensibility, which must itself be designed with an approximate sense of the types of extensions likely to be made.
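One common way such designed-in extensibility is realized in practice is a type-length-value (TLV) option format, in which old implementations simply skip option types they do not recognize. The sketch below is a generic illustration of this idea, assuming a minimal one-byte type and length field; it is not taken from any particular IETF protocol.

    import struct

    def encode_options(options):
        # Pack (type, payload) pairs as type-length-value (TLV) records.
        out = b""
        for opt_type, payload in options:
            out += struct.pack("!BB", opt_type, len(payload)) + payload
        return out

    def decode_options(data, known_types):
        # Parse TLV records, silently skipping unknown option types, so a
        # message may carry extensions an older parser has never seen.
        opts, i = [], 0
        while i < len(data):
            opt_type, length = struct.unpack_from("!BB", data, i)
            if opt_type in known_types:
                opts.append((opt_type, data[i + 2:i + 2 + length]))
            i += 2 + length
        return opts

    # A newer implementation adds option type 7; an old parser that only
    # knows type 1 still decodes the message without error.
    msg = encode_options([(1, b"core"), (7, b"future-extension")])
    print(decode_options(msg, known_types={1}))     # [(1, b'core')]
    print(decode_options(msg, known_types={1, 7}))  # both options

Because unknown records are skipped rather than rejected, features trimmed from the core design during consensus-building can be restored later without breaking deployed implementations.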

     


    Consortia


    Definition

    A consortium is typically a loose, long-term alliance between competitors in a given industry. Research and development consortia are a specific type of consortium that focus on basic and sometimes applied research, rather than downstream activities such as production. While joint ventures and licensing partnerships are relatively traditional forms of inter-firm collaboration, R&D consortia are new to the scene. Under the National Cooperative Research Act (NCRA) of 1984, such industry-based consortia gained protection from anti-trust legislation in the United States. The NCRA emphasizes the pre-competitive aspect of R&D, as shown in section 2.6 of that act, which protects "theoretical analysis, experimentation, or systematic study of phenomena or observable facts, the development or testing of basic engineering techniques, the experimental production and testing of models, prototypes, equipment, materials, and processes, the collection, exchange, and analysis of research information." To date, no U.S. consortium has been prosecuted under anti-trust legislation. The collaboration of competitors in the early phases of the innovation process can yield great advances for the entire industry involved, and this is often the main goal of a consortium: the advancement of the industry concerned. While a multitude of industries engage in significant research activity, the information technology industry accounts for 42 percent of worldwide strategic alliances by one account [Hag 1992]. This fact makes consortia a key factor in examining IT inter-organizational design practices.

    Differences from other alliances

    Consortia normally consist of many firms, making them much larger than typical joint ventures. Also, most consortia have relatively small budgets (i.e., $1-2 million) compared to the large amounts of capital that large firms put into joint ventures, although there are examples of consortia with budgets approaching $1 billion. The flexible nature of most consortia suits longer-term projects and the volatile nature of the IT industry. Changes in the industry's direction can sometimes be accommodated by a consortium (given an effective organizational structure), whereas a joint venture or established technology web would have difficulty adjusting. This flexibility is sometimes a result of a consortium's rather open-ended goals: the potential output of these organizational forms is somewhat uncertain, and it may be difficult to get members to agree on very specific goals [Eva 1990]. It should also be noted that some consortia are non-profit organizations (e.g., the Open Software Foundation and the MIT World Wide Web Consortium).

    R&D consortia description

    There are numerous examples of R&D consortia, both in the U.S. and elsewhere. One of the first U.S. consortia, commonly used as a model for study (both positively and negatively) by newly emerging consortia, is the Microelectronics and Computer Technology Corporation (MCC). Another semiconductor-related consortium is the Semiconductor Manufacturing Technology (SEMATECH) consortium, and in telecommunications there is the Telecommunications Information Networking Architecture (TINA) consortium. The proliferation of R&D consortia in the U.S. is documented in Table 1. The large number of registrations in 1985 is due to the fact that the NCRA was passed in 1984 and no registration of these types of organizations was done before then. Thus, some of the 50 consortia that registered in 1985 actually formed before that date (e.g., MCC in 1982). In fact, it was some of these original consortia that lobbied the U.S. Congress for passage of the NCRA. U.S. consortia are often compared to a similar organizational form in Japan, the Technology Research Association (TRA). Indeed, the similarities are apparent, and some U.S. and European consortia are seen as direct responses to TRA's formed in Japan. For example, the Fifth Generation Computer Systems (FGCS) program was announced in Japan in 1981, just before the founding of MCC in the U.S. and the Alvey program in the U.K. Declining microelectronics/semiconductor market share in the United States led to the formation of SEMATECH, whose primary goal was to revitalize the entire industry and regain market share lost to the Japanese. The Alvey program was similar; its aims were to improve U.K. competitiveness in the entire IT industry.

    TABLE 1. CONSORTIUM REGISTRATION IN THE U.S. BY YEAR

    year                                 1985  1986  1987  1988  1989  1990  1991  1992
    # of consortia registered in U.S.      50    17    27    33    34    45    59    59


    Strategic Issues for Firms

    Motivations to join / Advantages

    The advantages a firm enjoys in being part of an industry-directing consortium are somewhat apparent. The reduction of R&D costs, as well as the chance to work with other industry leaders, prompts many large firms to join consortia. There are other, less obvious reasons to participate, though. Participants in the Alvey program claimed benefits that included the acquisition of technology knowledge, the development of tools and techniques, increased ties with other industrial and academic researchers, and possible participation in the development or application of emerging IT standards. These cited advantages are not directly related to traditional cost/benefit analyses. In other words, while there is great difficulty in determining the tangible direct outputs of R&D collaboration, many important benefits are inherent in the cooperation rather than in the results. One participant in the European Dedicated Road Infrastructure for Vehicle safety in Europe (DRIVE) program indicated that government subsidies did not provide sufficient support for the project to be considered cost-effective, but that the cooperative work with other companies was the primary factor in joining the program [Coo 1996].

    A comprehensive list of the advantages of an R&D consortium includes:

    1) reduction and sharing of R&D costs
    2) the chance to work with other industry leaders
    3) acquisition of technology knowledge and the development of new tools and techniques
    4) increased ties with other industrial and academic researchers
    5) possible participation in the development or application of emerging standards

    Another issue firms should consider when joining a consortium is the complementary nature of the work to be done. In other words, companies should be sure to use consortia to complement what they are presently doing; a consortium will not replace in-house R&D. Joining a consortium to investigate interesting technological areas that a firm would not otherwise fund is another strategy some firms take. This tactic is useful in exploring new markets or becoming more international. Finally, non-R&D-based consortia have distinct motivations from those above. For instance, the consideration of safety issues, which benefits all participants without sacrificing proprietary rights, is common in industries such as automobiles.

    Disadvantages

    A discussion of consortium drawbacks, or potential stumbling blocks, is instructive, as it is these factors that hinder the market expansion and/or general product development that is the goal of these consortia. Knowledge or technology transfer between companies is one of the biggest hurdles during the formation of R&D groups. The diffusion of a new technology or process among the members of a consortium can be extremely touchy, especially when the majority of research is undertaken in member firms' laboratories as opposed to a joint facility. Proprietary and tacit knowledge ("know-how") can only be transferred through reliable and dense communication paths. This sort of communication is difficult to maintain, particularly toward the beginning of a consortium's lifetime, when companies are not used to working with the other members. This type of initial difficulty can snowball, making success all the more difficult in the long run. Along with technology transfer come intellectual property rights issues. Here the organizational decision to utilize joint research facilities makes it more difficult for companies to maintain intellectual property rights, since company scientists making a discovery are working right alongside workers from other corporations. A strict proprietary policy in a joint facility would inevitably lead to segregation between different members, undermining the entire purpose of the consortium's existence.

    It is interesting here to look at Stiglitz's notion [Coo 1996] that technological learning is localized, in relation to R&D consortia. This notion holds that technology, as opposed to science, is specific to the context in which it is developed and thus susceptible to obsolescence or inapplicability elsewhere. For instance, collaboration between two companies may result in a product or process that is extremely beneficial for those two companies but practically useless for other firms. While this may be true in some instances, the theory is directed more toward collaborative efforts at the far end of the innovation process (see Table 2). Inventions or feasibility studies are inherently non-localized; a new type of transistor developed at Bell Labs would work just as well at Intel. It is these sorts of projects that are undertaken in a consortium environment: pre-competitive R&D. In other sorts of alliances, however, the concept of localized learning makes intellectual property issues more manageable, as company-specific gains are more easily protected. Other problems encountered in consortia include cost overheads due to excessive meetings (time and money), travel, etc. For a successful alliance these costs would seem minimal, although group R&D participants have cited them in the literature [Von 1991]. In addition, the actual organizational structure of a consortium can lead to difficulties in decision making and dominance by larger members. Consortia often function through committees and by majority rule, meaning response time can be slow and some members are left unhappy. These sorts of problems need to be recognized and addressed as quickly as possible to prevent internal conflict or lack of direction from hurting the consortium.

          In general, drawbacks of consortia include:

    International Consortia Models

    The impact of national culture and policy on business practices can sometimes be seen in technological collaboration. In studying consortia and TRA's in the U.S. and Japan, Aldrich [Ald 1995] sheds light on both organizational forms of collaboration, highlighting their different methods of achieving similar goals. For example, R&D consortia in the United States tend to perform research in joint facilities much more often than Japanese TRA's, and U.S. consortia also utilize university facilities on a much larger scale than their Japanese counterparts. The latter discrepancy is a cultural phenomenon: in Japan a stigma attaches to university professors who become too involved with private industry. Joint-facility research leads to easier sharing of results but more difficulty in protecting proprietary information. Japanese TRA's have been around much longer than U.S. consortia and are heavily funded and influenced by the government. It is an interesting footnote that the Japanese government has spent extensive time and money supporting collaborative R&D in the second half of the 20th century, while the U.S. government has spent much of that time investigating the same activity for possible anti-trust prosecution. By way of contrast, Japanese TRA's receive 53% of their funding from the government, while a typical U.S. consortium obtains just 17% from government channels. Many U.S. organizations must rely heavily on member dues for revenue, although there is significant diversity in U.S. consortium funding arrangements in comparison with TRA's. A glaring organizational difference between TRA's and consortia is lifespan. TRA's are basically self-contained research projects rather than ongoing industry-related research efforts. For this reason, a TRA terminates either upon the successful completion of its project or upon the (infrequent) realization that the project is failing. Consortia in the U.S., on the other hand, have indefinite lifetimes, and although approximately 10% of those registered have been disbanded, this is more often due to project failure than to project completion.


    Success or Failure

    Technical vs. Economic

    Many consortia are not-for-profit, so direct economic indicators are not always helpful. For R&D consortia specifically, a better measure of success is the development of new technologies and the advancement of the industry, or at least of the member companies. Does the consortium meet the goals set forth at its formation? These goals are often very open-ended and can change throughout the lifetime of the organization (e.g., SEMATECH). The organizational structure (centralized or umbrella), funding sources, specific goals of members, and other factors are all important in determining whether a consortium succeeds or fails. Typical barriers to success include technology transfer back to members, disagreement between members about goals, the differing time lines of firms, managerial impatience (returns may take longer than usual), and companies that are too secretive, leading to low consortium awareness of important problems.

    Technologies being developed?

    Are new technologies being developed as a direct result of research done at or funded by the consortium? Is technology being licensed out? MCC companies took money from their in-house R&D budgets and re-appropriated it to MCC, leading those companies to expect quick returns and more control over consortium projects. This made it difficult for the consortium to act independently but, more importantly, led to member companies "pulling" research projects in different directions, making useful results harder to come by.

    Technology Transfer

    If technology is being developed, is it being transferred effectively back to the member companies? MCC was forced to hire most of its people from outside the member companies because many members were unwilling to assign their best workers to an unproven commodity. This led to difficulty in transferring technology back, since MCC researchers had no ties to member companies. SEMATECH overcame this problem: the vast majority of its research positions are filled by member company assignees who return to their companies after an average of 2 years at SEMATECH. This greatly facilitates technology transfer, especially of the tacit knowledge accumulated over an assignee's term in Austin.

    Lifecycle Analysis

    Just as there are various steps of technological innovation, there are various forms of collaboration along those steps. The table below illustrates the stages of innovation and their accompanying inter-organizational arrangements [Ald 1995]. As mentioned above, R&D consortia focus on more generic or basic research, while joint ventures tend to occur downstream, in the manufacturing realm of innovation.

    TABLE 2. TECHNOLOGICAL INNOVATION PROCESS

                                      idea        feasibility  product      prototype and  interim        full
                                      generation  study        development  pilot plant    manufacturing  commercialization
                                                                            construction
    Industry/University
    Cooperative Research
    Centers (IUCRC)                   yes         yes
    Research & Development
    Limited Partnerships (RDLP)                   yes          yes          yes            yes
    Joint Ventures                                                          yes            yes            yes
    Consortium                        yes         yes          yes          yes

    A technology lifecycle analysis can help determine the most effective form of collaboration at a particular stage in an industry. Technology itself can be broken down into 5 stages [Dod 1993].

    The early stages of a technology's lifecycle are more conducive to research consortia as well as standards committees. It therefore makes sense that industries such as telecommunications, microelectronics, and biotechnology (not mentioned previously, but an industry that accounts for a significant portion of consortia) contribute heavily to the number of R&D consortia in existence. Older industries such as automotive and steel are mature enough that most inter-firm agreements concentrate on production or marketing.


    Adoption of Technology

    Help or hinder?

    Research and development consortia are normally flexible, long-term collaborations between competitors. Most consortia are not limited in membership (usually within a given country), as joint ventures and other collaborative efforts are. Because anti-trust laws limit these consortia (in the U.S. at least) to more generic research as opposed to production, new ideas are generated and transferred back to a large number of competitors. This basic outcome of an R&D consortium is beneficial to the adoption of new technologies by industry: many companies, and indeed entire industries, can benefit and apply the new technology in very different ways.

    It is possible for unsuccessful or ineffective consortia to hinder the adoption of new technologies. For instance, a consortium that has outlived its usefulness but continues to exist takes funding away from other, more productive projects. A consortium that is unsuccessful due to poor technology transfer could also be argued to hinder new technologies: research performed there may conceivably have led to new products or processes benefiting companies and the public, but ineffective technology transfer back to the member firms prevented the proper use of that research. A successful, or even moderately effective, consortium should not have these problems. So, consortia in general can be seen to help bring about new technologies and their incorporation into society.

    Risky projects

    R&D consortia allow many risky projects to be undertaken that might otherwise not be performed. This makes consortia very conducive to the creation of new technologies; it is then up to the member companies to implement these technologies and market the resultant products. For instance, the very ambitious Fifth Generation Computer Systems (FGCS) project in Japan ran from 1982 to 1992 and had the ultimate goal of producing a computer capable of "intelligent information processing" [Gib 1992]. Basically, the participants planned to incorporate artificial intelligence into a computer by 1992. The program was completely subsidized by the Japanese government (as are many TRA's), and there is little chance that any one of the member companies would have undertaken this work without the formation of the FGCS project. Despite the failure of the project, usable results were disseminated to the member companies for use in other VLSI research.

    Examples

    There are numerous examples of consortia that produced significant and direct contributions to products and services. The European Commission did a study of the ESPRIT (European Strategic Program for R&D in Information Technology) program in late 1990 and determined that approximately 3/4 of the program's 400 projects resulted in direct contributions to software tools, products, international standards, and services. SEMATECH has developed a cost-of-ownership (COO) model to estimate the cost of owning an individual piece of semiconductor manufacturing equipment over its lifetime; these costs include purchase, support, service, and maintenance costs. Intel, along with over 100 other firms, has implemented this COO model, and Intel claims that it has reduced operating costs by over 50% for a specific etching tool. This is an excellent example of a consortium benefiting an entire industry, not just member firms.
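    The spirit of such a cost-of-ownership calculation can be sketched in a few lines. All figures and the formula below are invented for illustration; the actual SEMATECH COO model accounts for many more factors (yield, uptime, floor space, and so on).

        # Hypothetical, simplified cost-of-ownership (COO) calculation for one
        # piece of fab equipment over its lifetime. All numbers are invented;
        # the real SEMATECH model is far more detailed.

        purchase_price = 2_000_000   # dollars
        annual_service = 150_000     # service and maintenance per year
        annual_support = 80_000      # operators, spares, facilities per year
        lifetime_years = 5
        wafers_per_year = 40_000
        utilization = 0.80           # fraction of time the tool is productive

        lifetime_cost = purchase_price + lifetime_years * (annual_service + annual_support)
        good_wafers = lifetime_years * wafers_per_year * utilization

        print(f"COO: ${lifetime_cost / good_wafers:.2f} per wafer")  # COO: $19.69 per wafer

    A model like this lets a buyer compare tools on lifetime cost per wafer rather than on sticker price, which is what makes an industry-wide, consortium-maintained version of it so valuable.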


    Case Studies

    SEMATECH

    In late 1986 the United States semiconductor industry came to a startling realization. Japanese semiconductor market share had surpassed that of the U.S., and projections showed U.S. market share, once at 85%, dwindling to 20% by 1993. In addition, two reports were issued, from the Defense Science Board (DSB) and the Semiconductor Industry Association (SIA), that served to heighten the sense of urgency in the microelectronics industry. The DSB report concluded that American firms were falling behind the Japanese in high-volume manufacturing processes. This had dire implications for the defense industry, which depended on U.S. firms for its technological superiority. As a result, the report recommended the formation of a new semiconductor manufacturing facility, jointly owned by the government and industry. The SIA report, on the other hand, attacked the same problem somewhat differently: its recommendation was for the creation of a non-profit industrial consortium concentrating on research as opposed to manufacturing. The SIA acted first and formed a committee to create this new organization. The committee then became the consortium itself, the Semiconductor Manufacturing Technology consortium, or SEMATECH.

    TABLE 3. ORIGINAL MEMBERS OF SEMATECH

    Advanced Micro Devices            LSI Logic
    AT&T Microelectronics             Micron Technology
    Digital Equipment Corporation     Motorola
    Harris Corporation                National Semiconductor
    Hewlett-Packard                   NCR
    Intel                             Rockwell International
    International Business Machines   Texas Instruments

    SEMATECH's 14 founding members are shown in Table 3. These companies accounted for approximately 80 percent of the total U.S. semiconductor industry in 1987, a fact not to be taken lightly. The semiconductor industry is notoriously competitive, with strict adherence to the free-market concept. Court battles over proprietary technology were not uncommon (e.g., Intel and Advanced Micro Devices over microprocessor technologies), and even lobbying for trade restrictions against Japan was cut short because rival companies could not agree on industry objectives [Yof 1988]. Thus, it is nothing short of amazing that soon after the realization of impending Japanese domination, all the major semiconductor companies agreed to form this consortium. The purpose of SEMATECH was put forth in its mission statement as follows: "To provide the U.S. semiconductor industry the capability of achieving a world-leadership manufacturing position by the mid-1990's" [Pet 1988]. As the following discussion will show, the means of achieving this purpose evolved throughout the lifespan of SEMATECH.

    Upon deciding to form a consortium, industry leaders needed to settle several important matters, including funding sources, the organizational form of the new entity, whether to build new facilities or use existing ones, and the filling of key leadership positions. First and foremost, SEMATECH was to be funded by both the government and industry. The federal government agreed to provide $100 million annually for the initial five-year lifespan of the consortium, to be matched by member firms. Membership dues were set at 1% of a company's semiconductor sales, with a minimum of $1 million and a maximum of $15 million. This provision has acted (purposefully or not) to discourage smaller semiconductor firms from joining SEMATECH, as the $1 million minimum is somewhat prohibitive. In fact, the membership of SEMATECH has not grown at all since its formation, which many companies (including Cypress Semiconductor) claim is a result of these expensive dues. The operating budget of SEMATECH was thus set at $200 million per year for five years at formation. SEMATECH has never been (and probably never will be) open to companies that are not based in the U.S., which makes sense given the goals of the consortium. The aforementioned exclusion of smaller companies seems disturbing at first, in that it may lead to a "rich get richer" environment in the semiconductor industry. However, SEMATECH's original goal of enhancing the entire U.S. industry, not just consortium members, has proven to be more than empty rhetoric, as will be shown shortly.
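    The dues formula as described, 1% of semiconductor sales with a $1 million floor and a $15 million cap, is simple enough to state directly. The sketch below just restates that rule; the sales figures are hypothetical, chosen to show why the floor weighed so heavily on small firms.

        def sematech_dues(annual_semiconductor_sales):
            # 1% of sales, floored at $1M and capped at $15M, per the text above.
            return min(max(0.01 * annual_semiconductor_sales, 1_000_000), 15_000_000)

        # A firm with $60M in sales owes the $1M minimum, an effective 1.7%
        # of sales, which helps explain why smaller firms stayed out.
        print(sematech_dues(60_000_000))      # 1000000  (the $1M floor)
        print(sematech_dues(500_000_000))     # 5000000.0  (the plain 1%)
        print(sematech_dues(3_000_000_000))   # 15000000  (capped)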

    A vital second decision was that of facilities or laboratories. As we have already seen, Japanese collaborations tended to concentrate research in member firms, which had proven successful for them. However, some early American R&D consortia, such as MCC, had opted for joint facilities, and this is the path SEMATECH decided to take. In fact, SEMATECH set up its headquarters in Austin, Texas, just as MCC had several years earlier. By choosing joint facilities, the consortium's founders also determined that SEMATECH would need a sizable staff, including assignees from member firms. All of these decisions were at least partially influenced by the fact that technology transfer from SEMATECH back to members would be easier in this format. It should be mentioned that not all SEMATECH research takes place in Austin; member firms' laboratories also undertake consortium-related research. Approximately one-third of SEMATECH personnel were assignees from member firms, and these workers constituted most of SEMATECH's engineering and scientific workforce; most of the remaining direct hires were not research personnel but technicians and administrative staff [Gri 1993]. The technology transfer model used at SEMATECH is shown in the following figure, where SEMATECH assignees occupy a key position between SEMATECH activities and the primary receivers. Also important are the member companies' technology transfer managers, who focus specifically on bringing SEMATECH technology into use at their firms.

    Member-company technology transfer from SEMATECH

    The selection of a leader, or CEO, for any industry-wide consortium is a daunting task. This single decision can make the difference between a successful consortium and a failure. Research has concluded that the presence of a "champion" for a consortium or industry greatly increases the chances of success for a new venture [Sou 1993]. The difficulties of managing a consortium are well documented: they include dissimilar motives for collaboration, proprietary rights issues, differences in business cultures, and effective technology transfer. Any of these issues, and many more, can result in the dissolution of an otherwise well-meaning consortium. A CEO needs a strong vision, plenty of charisma, and industry-wide respect to set a consortium on a track to success. For this reason, SEMATECH took an entire year to find its first CEO, Bob Noyce, co-founder of Intel [Bro 1995]. His appointment most likely helped attract top researchers from member firms to the fledgling organization, avoiding MCC's fate of using outside hires to conduct important research. Just as significantly, his appointment helped assure SEMATECH of government funding, because Noyce lobbied consistently for government aid. The rest of SEMATECH's organizational structure was mainly borrowed from successful industry models and is shown in the figure below. Each Focus Technical Advisory Board (FTAB) includes one representative per member company, assuring members of significant input on technical issues. An interesting feature is the use of only 3 levels of management (directors, managers, and project managers) under the CEO's three closest executives. By minimizing the number of management levels, the founders were probably attempting to steer clear of management difficulties and bureaucracy while emphasizing the research being done.

    The governance of SEMATECH

    SEMATECH inevitably encountered a few problems during its start-up phase. For instance, when assignees came over from member firms, they filled positions unevenly: high-level positions sometimes had very few people underneath them, while lower-level positions had a multitude of subordinates. This led to much confusion in the consortium's hierarchy, and Noyce's determination not to use internal organization charts did not help matters. Some such difficulty is to be expected in a new entity, especially when assignees are coming in from many different companies at all sorts of levels inherent to their own firms. SEMATECH's experience in this matter should be taken into account when future consortia are formed; this initial state of hierarchical disorder should be avoidable with a little planning before start-up.

          Another problem was related to business culture.  Employees from 14 different firms were being thrown together in the same facilities in Austin, along with direct hires who previously worked for still more companies.  Researchers used to a more formal environment might be somewhat put off by new, informal co-workers. Also, the issue of technical jargon and numerous acronyms could result in less effective communication between workers from different member firms.  SEMATECH addressed this problem by compiling a dictionary of all technical terms and acronyms upon formation [Bro 1995].  This solution may sound somewhat mundane but it proved effective in this case.  

    The most important obstacle in the way of SEMATECH's goals was just that: its goals. The original purpose of SEMATECH was to allow members to work together to improve their manufacturing process technology. After construction of a new fabrication facility in Austin, members began to question whether the sharing of process technology, which can be quite specific, exposed too much proprietary information. For example, it is the superior manufacturing capabilities of Intel that have in part propelled it to the top of the world market; to share process technology so vital to a company's prosperity is rather risky. Thus, SEMATECH altered its focus. This point is of extreme importance when dealing with technological collaboration: the ability to change direction and adapt to market needs as well as member needs is vital to the success of any sort of strategic alliance. The new focus of SEMATECH was on generic technology and the improvement of the semiconductor manufacturing equipment infrastructure. To demonstrate the improvements in equipment, SEMATECH set goals of producing specific linewidths on silicon on a very specific timeline; for instance, a goal of 0.35 micron linewidths on 8-inch wafers by late 1992 was satisfied. The belief here was that by improving the manufacturing equipment infrastructure, U.S. semiconductor manufacturers would be returned to a dominant role. With this change in strategy, SEMATECH's focus moved to process development and the consequent technology transfer to its members.

    As mentioned above, SEMATECH kept its promise to revitalize the U.S. semiconductor industry. U.S. market share, once projected to fall to 20% by 1993, instead overtook Japan's that same year for the first time since 1986. The situation has continued to improve in recent years, with American companies accounting for over 44 percent of world semiconductor sales in 1996, almost 8% more than Japanese firms. Since SEMATECH has focused on manufacturing technology, it is interesting to note that yields of U.S. equipment improved from 15% below Japanese equipment in 1985 to 9% below in 1991 [USG 1992]. In addition, Applied Materials, a California-based company, has become the top worldwide supplier of semiconductor manufacturing equipment. The contribution of SEMATECH to new and improved technologies is exemplified by the COO model mentioned earlier as well as by the equipment improvement projects (EIP) it undertakes; one EIP improved the mean operating time between breakdowns of a chemical vapor deposition (CVD) tool from 75 hours to several hundred hours. SEMATECH's more applied research agenda has nicely complemented the work of two other U.S. semiconductor consortia: MCC, which focuses on artificial intelligence (AI) and computers, and the Semiconductor Research Corporation (SRC), which focuses on very basic research and emphasizes industry-academic collaboration. With the bulk of the U.S. semiconductor industry as members, and the industry itself flourishing, it can easily be said that SEMATECH has been a successful attempt at inter-organizational collaboration. As Craig Barrett, Chief Operating Officer of Intel Corporation, said, "I judge SEMATECH by results. The organization set out to recover market share from Japan; five years later, market share had been recovered. At Intel we call that a results-oriented, successful project" [Bro 1995].

    Alvey

    Much like the establishment of SEMATECH, the UK's Alvey program was instituted to counteract a competitive decline in the British IT industry. The balance of trade in IT goods in the UK went from a 100 million pound surplus in 1975 to a 300 million pound deficit just 5 years later. Add to this the 1981 announcement of the Japanese Fifth Generation Computer Systems program, and the UK government determined it needed to improve competitiveness in the IT industry. To do so it implemented the Alvey program in 1983, whose fundamental goals were to [Qui 1995]:

    1) increase the competitiveness of UK IT suppliers
    2) ensure some self-reliance for commercial and especially defense purposes
    3) strengthen the R&D base and encourage collaboration between industry and academia
    4) achieve more specific technical goals as were laid out at the beginning of the program

          The Alvey program isolated 5 specific areas of IT to focus on:

    1) VLSI (very large scale integration) semiconductor technology
    2) IKBS (intelligent knowledge-based systems) or artificial intelligence (possibly to compete with Japan's FGCS)
    3) Software Engineering
    4) Man-Machine Interface
    5) Systems Architecture (parallel processing)

    Strictly speaking, Alvey was not a consortium but a national program. The similarities are striking, however, and comparisons between this program and consortia such as SEMATECH can provide valuable insight into collaborative research efforts. In the 6-year duration of the program, 198 collaborative projects were supported, with each project averaging a 2- to 3-year lifetime. A very large number of firms were involved in this national program, as well as academic institutions and government research laboratories: in all, 115 firms, 68 academic institutions, and 27 government labs participated. Most projects involved only a small number of these organizations, typically 3 to 5. This extensive collaboration was unprecedented in the UK, whose IT industry was extremely fragmented compared to that of other nations such as Japan or the U.S.

    The Alvey program is similar in some ways to SEMATECH. For instance, government support of the program was set at 50%, with the participants supporting the remainder; over the 6-year lifetime of the program, the government provided 200 million pounds of the 350 million total. Also, both SEMATECH and the Alvey program were begun in hopes of turning around a national industry, and in line with this goal, no international companies were allowed to participate in either venture. There are significant differences, however. First of all, Alvey was a fixed-term project (much like Japanese TRA's), while SEMATECH is an ongoing collaboration. Research in Alvey was undertaken at participating firms' sites rather than at a central location, making technology transfer completely dependent on meetings, telephones, and electronic mail. According to numerous surveys of consortium managers and researchers in the literature, employee visits and technical demonstrations are the most effective means of transferring technology; by using a central facility, these types of transfer mechanisms become inherent to the consortium [Gib 1992]. The decentralized structure of Alvey was most likely chosen for the advantages of more firm-level control over the research and less knowledge spillover to other firms (or loss of proprietary rights). In addition, Alvey relied much more heavily on academia than SEMATECH did. Finally, SEMATECH research is less fundamental and more applied than that undertaken in Alvey; for example, SEMATECH houses its own joint production line for testing new processes in-house.

    Was the Alvey program successful in advancing information technology in the UK, or did it fail to meet its goals? This is debatable, but there is one clear result of the Alvey program: its strong impact on UK R&D expenditure. The figure below shows the dramatic change in single-company project funding as well as joint project spending throughout the lifetime of the Alvey program. The effects of this national program on the British IT industry cannot be denied, but the results were mixed.

    UK R&D expenditure during Alvey

    Successful aspects of Alvey

    1. The projects addressed primarily core technologies while maintaining a focus on long-term returns [Qui 1995]. Since Alvey was created to perform basic, or unapplied, research, long-term goals were paramount. At the same time, a focus was kept on the 5 enabling technologies outlined above, and 64% of industrial participants felt Alvey allowed them to develop new tools and techniques in these areas.
    2. Strong ties were created between academia and industry that were not present before the program's initiation. Nearly half of the industrial participants claim that Alvey enabled them to absorb know-how and access technology from academic partners (only 22% said the same about industrial partners). Meanwhile, academia benefited from better links to the commercial sector.
    3. Many of Alvey's industrial participants have continued to work with partners (both industrial and academic) on projects begun during Alvey. Approximately 3/4 of industrial participants maintained follow-up R&D on projects begun under Alvey funding.
    4. The continuation of Alvey projects led to the expansion of in-house R&D at many larger companies, and smaller companies were encouraged by their successful collaborations during the program to establish new R&D departments.

    Unsuccessful aspects of Alvey

    1. The UK government stressed that the work done under Alvey funding should be in addition to in-house R&D. This emphasis on "additional" research implies that the work done in the program was non-essential to the participants. Indeed, one author [Sci 1987] concluded from his study of Alvey and ESPRIT that firms will gladly accept public funding for projects that they do not consider commercially worthwhile to undertake on their own. This would seem to contradict the first successful aspect listed above. The somewhat vague definition of core technology may account for much of this contradiction, as different participants have different interests and may be reluctant to admit that most of their research was done only on peripheral technologies. These points suggest that the research performed during Alvey was not as "vital" as the government would have liked.
    2. The focus on more basic research, as opposed to SEMATECH's more applied focus on manufacturing, makes Alvey more difficult to assess in terms of reaching its initial technological goals. The UK has not, however, gained significant market share in the IT industry since the completion of Alvey, except slightly in a few small niche markets such as ASIC's (application-specific integrated circuits). Strides were made overall, including meeting a pre-determined goal of fabricating 1 micron devices on silicon by the end of the project, though it is difficult to say whether these advances would have been made without the program. So, in terms of the fundamental goals listed at the beginning of this case study, the increase in worldwide competitiveness does not seem to have been attained, but a higher degree of self-reliance does appear to be a fair assessment of Alvey's outcome.
    3. The amount of funding for the program was insignificant in comparison to the R&D budgets of the larger participants. Addressing a country's weak IT industry would require much more funding, and as a result Alvey became a more limited program with a specific focus on ASIC's [Von 1991]. Aspects of the program such as the man-machine interface and artificial intelligence did not see significant gains because of the attention given to the VLSI program, both in catching up to world leaders in fabricating small devices and in ASIC design. In this sense, Alvey failed to address many of its original concerns due to limited funding.
    4. At the beginning of the program, 60 million pounds were allocated to support training and research at academic institutions, because the government had determined there was a severe shortage of well-qualified engineers and scientists coming out of UK schools. While Alvey itself aided academia by creating better ties to industry, this funding did not alleviate the shortage of qualified personnel [Von 1991].

    Looking back at the initial 4 fundamental goals listed above, it seems that the first and fourth were not met satisfactorily, while the second and third were achieved; this example of inter-organizational collaboration was thus both successful and unsuccessful. One lesson to be learned is to set more prudent goals that reflect the amount of funding available. Other European programs similar to Alvey, such as Eureka and JESSI, have greatly increased funding, with support of over 5 billion pounds. Also, it may be difficult to run a successful program with only long-term goals in mind. SEMATECH, for example, has focused on producing results against short-term goals, including the improvement and adoption of technology. This sort of work seems better suited to R&D consortia: the strengthening of established industries rather than the creation of new industries or the resurrection of dead ones. It is difficult to create a prosperous industry within the span of a 5- or 6-year program, as was attempted with Alvey; U.S. and Japanese consortia have learned this and now tend to focus more on short-term projects [Gri 1994]. Also, the umbrella structure of the Alvey program led to purely administrative leadership rather than the technological direction that SEMATECH CEO's such as Noyce and his successor Bill Spencer have provided. The flexibility discussed in the SEMATECH case study is missing from such an organizational structure. European programs have seemed reluctant to let control of research projects move away from firms (to reduce knowledge spillovers), even though central control appears to be the most efficient method of consortium management.


    Strategic Alliances and Joint Ventures


    According to data from the MERIT Cooperative Agreements and Technology Indicators (CATI) databank, about 70% of the alliances made during the 1980s were related to the new core technologies: information technology, biotechnology, and new materials [Hag1993].

    In light of this statistic, and given the barrage of collaborative activity surrounding the information superhighway and the near-daily computer industry alliance announcements of the 1990s, one can safely assume that information technology industries account for a majority of strategic alliances today. In this section we consider two modes of inter-firm collaboration, strategic alliances and joint ventures, an understanding of which is undeniably critical to any organization active in the computer or communications industries.


    Definitions of Strategic Alliance and Joint Venture

    Before we begin our discussion of the strategic alliance and the joint venture, let us clarify these terms in the context of our inter-firm collaboration framework. Because both "strategic alliance" and "joint venture" are broad terms which can include standards organizations, consortia, and other forms of formal collaboration, we offer the following narrow interpretations: here, a strategic alliance is a formal cooperative agreement between firms that remain independent entities, while a joint venture is a separate business entity created and jointly owned by the partner firms.


    Strategic Issues for Firms

    "Alliances are burgeoning in the information technology industry, which is characterized by rapid change and short innovation cycles" [Rai1996].  Indisputably, understanding the key issues surrounding strategic alliances and joint ventures is paramount to the corporate strategy of a firm active in the computer or communications industries. This section addresses these issues, which include several key categories of factors which motivate firms to collaborate in this way, potential costs or drawbacks associated with alliances.

    Synergies between Firms

    The most fundamental reason why firms form strategic alliances is that each firm is convinced that participation will better enable that firm to accomplish a business goal. In other words, by working together the reward enjoyed by each participant will be greater than the reward associated with an analogous solo campaign.

    Collaborative vertical integration

    A common motivation for a firm in forming an alliance is to combine its product or service with a complementary product or service offered by another firm. Typically in the computer industry this means that the companies are collaborating to build a vertical solution from the specialized components provided by each participant. This vertical integration phenomenon, also a central theme in technology web dynamics, has become increasingly important as the computer industry has matured. In the mainframe era, large corporations like IBM, Unisys, and DEC provided complete vertical computer systems, with each vendor developing its own processor, peripherals, operating system, database and networking software, and applications. In today's personal computer era, the most successful firms are much more likely to be specialists in a particular horizontal system layer, meaning that a firm's product will almost certainly be combined with the products of several other firms to provide the complete solution.

    Some companies manufactured microprocessors and semiconductors; others assembled printed circuit boards or CPUs and peripherals. Some developed system software or software tools; others, specific application software. Yet others focused on integrating computer systems, supplying consulting or training services, or offering after-sales support. [Hag1996]

    This characteristic of today's computing systems makes collaboration between firms critical, and, as Rai et al. suggest, "companies relying on strategic alliances are more profitable than their vertically integrated counterparts" [Rai1996].

    One need only examine the packing slip or applications directory of a modern personal computer to witness the variety of firms whose products make up the entire package. A common configuration might combine a microprocessor from Intel, operating system software from Microsoft, a network interface card from 3Com, a modem from US Robotics, application software from Lotus, etc. Although the presence of products from such a variety of firms in a single computer system doesn't imply that these firms have collaborated formally, formal cooperative agreements are very common in the industry, as we shall reiterate in our discussion of technology webs. One example of a strategic alliance strongly motivated by this collaborative vertical integration phenomenon is the PowerPC alliance [Moo1994] between IBM, Apple, and Motorola. This broad-reaching alliance aspired to create a new personal computer platform, which required a new microprocessor (the PowerPC chip); new supporting hardware such as motherboards and chipsets; new operating system software; and new application software. Clearly this project represented a massive endeavor, and it certainly benefited from the respective competencies of the participating firms.

    Leveraging complementary resources

    Often alliances are formed because the participating firms are able to leverage the resources or competencies of the other firms. "Strategic alliances provide an effective means to improve both the economies of scale and scope offered by traditional modes of organization" [Rai1996]. Because the introduction of a new technology increasingly involves the integration of a variety of potentially complicated component technologies and difficult-to-master component competencies, firms are often compelled to work together in bringing a technology to market. An example of how a strategic alliance serves as a vehicle for participant firms to leverage each other's resources can be found in the Hewlett Packard - Netscape Intranet Alliance [Tak1996]. In collaborating to deliver complete intranet solutions to businesses, HP was able to leverage Netscape's leadership in the web server and browser software markets, while Netscape was able to take advantage of HP's workstation line, extensive customer service organization, consulting force, customer contacts, and established reputation.

    In general, firms may realize a number of benefits by leveraging resources via a strategic alliance.

    Less quantifiable synergies

    For a technology-oriented firm with a strong innovative culture, there are additional, equally compelling reasons to collaborate whose value is less quantifiable.

    External information inputs are as, if not more, important to innovative activities (than internal inputs). Successful innovation depends on effective interactions between organizations. Partnerships with suppliers can provide privileged access to state of the art components. Strong links with important customers facilitate effective feedback on market requirements and productive performance. Collaboration with other firms, perhaps even with competitors, and even with university, government, and private research laboratories, can extend a firm's options in innovation. Such linkages are nothing new. They may, however, be extending and intensifying, and may involve increasing use of information technology in cementing them. [Dod1993]

    Collaboration can be viewed as a means for exploiting synergies between firms along three dimensions [Dod1993].

    In other words, by working together, organizations can share ideas to the advantage of all parties. Indeed, the subtle benefits of cooperation for firms are analogous to the benefits that individuals realize when working in teams. Effective management of a strategic alliance is a crucial requirement if firms are to realize these subtle benefits. In the following section, Success or Failure, we take a look at key factors in effective alliance management and discuss issues pivotal to successful collaboration.

    Managing Risk

    Firms often form alliances to help them manage the risk associated with a particular business proposition. Any enterprise, from a development project to an advertising campaign, brings with it a degree of risk. Forming strategic alliances or joint ventures can help firms distribute or otherwise reduce that risk.

    Spreading out R&D costs and other investments

    A new semiconductor wafer fabrication plant costs many hundreds of millions of dollars to construct, and may only be state-of-the-art for a few years. By forming alliances, firms can share the enormous costs often associated with developing or manufacturing new technologies, while reducing unnecessary duplication of research or development efforts.[Dod1993]

    Reducing risk of market failure

    In the computer and communications industries, a typical development project can be very expensive, and if the initiative does not achieve critical mass, it may never earn a significant return on investment. "As more firms are discouraged from undertaking further risky ventures when they cannot recover their seed capital from earlier investments that faced shortened product lives, even bitter competitors will be sharing laboratory results to commercialize cutting-edge technologies before they become obsolete." [Har1987] The likelihood of such a scenario is especially real in information technology because of the prevalence of strong network externalities.

    Companies often form alliances around a particular technology initiative to reduce the likelihood of that product failing in the market. The mindset is that if several successful technology companies are backing a particular campaign, the combined engineering expertise, marketing clout, customer contacts, and public relations experience of these firms will be strong enough for critical mass to be achieved. However, as we will discuss in the following section, Success or Failure, the support of powerful, respected companies is no guarantee of a venture's success.


    Success or Failure

    It is clear that strategic alliances offer numerous benefits to companies, especially those involved with information technology. However, managing strategic alliances and joint ventures is difficult, and in many cases they do not achieve the goals set out for them by the participating firms. [She1992] In this section we examine implementation issues, upon which an alliance's success often hinges, and present some key recurring points of wisdom gleaned from the business literature.

    Implementation Issues

    Are Alliances Effective?

    With collaborative activity between firms at an all-time high, it seems appropriate to ask whether strategic alliances and joint ventures are accomplishing the objectives laid out for them by their parent organizations. According to Sherman, "roughly one-third of the 49 alliances tracked by McKinsey were flops, failing to live up to the parents' expectations." [She1992] The consensus in the literature is that alliances tend to fail most often because of implementation issues such as those mentioned above.

    The evidence strongly suggests that alliances are an important, even crucial, strategic consideration, and can be used to great benefit. However, several recurring factors are pivotal to their success. First, only alliances that represent long-term partnerships should be considered. "A partnership that is going to last only five to seven years simply doesn't warrant that kind of investment. Today's corporate partners are less interested in short-term ventures designed to save a few dollars and more focused on long-term alliances where gains can be harvested for years to come." [She1992] Another vital characteristic of a successful alliance is the flexibility to change over time; the partners' goals and expectations should be reevaluated periodically. Finally, perhaps the most fundamental ingredient of any successful form of cooperation is trust. According to Ernst, "We Americans are used to acquisition and control. That's our mind-set. The lesson for U.S. companies is that it's necessary to have the mind-set of collaboration." [Ern1993]


    Strategic Alliance Case Study: General Magic

    Background

    General Magic was founded in 1990 as a joint venture led by CEO Marc Porat and funded primarily by seven major international corporations: Apple, AT&T, Sony, Motorola, Matsushita Electric Industrial, Philips, and Nippon Telegraph and Telephone (NTT).  General Magic commenced with the ambitious goal of revolutionizing the way people communicate with one another and gather information.  Porat's dream centered on the notion of intelligent networks, built on the existing telecommunications infrastructure and populated by smart software agents that interact with each other to accomplish tasks for their masters, the human end users.  An intelligent network would allow new and exciting services to be offered to all sorts of individuals and businesses, creating enormous revenue for the telecommunications providers, software companies, and electronic equipment manufacturers listed above.

    General Magic manifested its vision in two products: Magic Cap, an operating system and user interface for personal communicator devices, and Telescript, a communication-oriented programming language for building the intelligent agents that would populate the network.

    General Magic undertook an initial public offering in early 1995 with a great deal of fanfare, its stock price doubling on the first day of trading [Mar1995].  In the years since, however, Wall Street's interest in the company has waned, as it has been unable to maintain profitability and its technological vision has fallen by the wayside with the explosion of the Internet and Java.

    Motivational Issues

    Many of the factors which caused General Magic to form as a joint venture of large international corporations are those discussed above.  In order to fulfill his vision of a worldwide intelligent network, an enormously expensive and complicated endeavor indeed, Porat knew he would have to enlist the cooperation and support of many of the world's telecom and electronics powers, and he was successful in doing so.

    In joining this alliance, the participant firms demonstrated an interest in working together to apply their respective competencies to deliver an overall package to the customer, with AT&T providing network infrastructure and switching, Apple providing software expertise, and Sony providing electronic hardware manufacturing capability, all of which are necessary to the implementation of a widespread intelligent networking environment.  Furthermore, the firms were able to disperse the enormous costs and risks associated with the venture.  Finally, the firms hoped to gain from the organizational synergies that would result from bringing together the talents of many of the world's most successful companies.

    Success or Failure

    For reasons both external and internal to the alliance, General Magic has been by most accounts a failure.  As mentioned above, factors outside the control of the alliance affect a joint venture much as they would a company that was not hatched from an alliance of firms.  In addition to these external factors, a joint venture must contend with the problems and hurdles that accompany the benefits and rewards of collaboration.

    As for external factors, General Magic was derailed by the most outstanding technological development of the past decade.  "In a word, the Internet ... while the telephone companies were thinking in their own ponderous way about setting up such a network, the Internet was emerging under their nose." [Eco1996]  The Internet provided many of the intelligent features envisioned by Porat, and Java emerged as a technology analogous to Telescript, but with open standards and the momentum of the Internet behind it.

    While external factors did certainly play a role in the downfall of General Magic, difficulties arising from the implementation of the alliance have also been significant.  "General Magic is suffering the tribulations of managing a huge partnership.  Executing a business plan becomes an order of magnitude more difficult when a lot of giant partners must be coddled, catered to and satiated."   [Bra1995]  The alliance seems to have been less than adequate with regard to the implementation issues of congruity, leadership, and human resource management mentioned above in Success or Failure.

    General Magic quickly found itself with too many partners pulling in too many different directions, and this lack of congruity led to indecision and inconsistency on the part of management.  "Not only are Porat's plans extremely ambitious, he must try to fulfill them while pleasing many partners at the same time."  Given the diversity of industries and nationalities represented by the alliance partners, it is not surprising that their interests were difficult to reconcile.  "At one time, one hated Magic Cap and another hated Telescript."  The leadership structure of the joint venture was neither dominated by a single partner nor a democracy of equal representation; it consisted of an independent group headed by Porat, which had to balance commitments, financial and otherwise, to each of the partners.  This framework proved ineffectual, with the partners becoming either fiercely demanding or harmfully negligent of General Magic.  Motorola, a member of the overly demanding camp, required that its 68000 microprocessor family be the first to which Magic Cap was ported, a decision which was "clearly a mistake, since even Apple has abandoned this has-been PC processor."  Apple, as the company most responsible for starting the venture in the first place, surprisingly wound up in the negligent camp, having "refused to put the Magic Cap interface in its Newton handheld computers".  Finally, human resources issues became a stumbling block.  Brandt contends that none of the partners wanted to devote their best employees to the General Magic team, and that, with the exception of Sony and Motorola, the people assigned to General Magic were "not that great". [Bra1995]


    Technology Webs



    While studying strategic alliances and joint ventures, we noticed a powerful collaborative dynamic characterized by informal cooperation among many firms around a particular technology. Too informal to be considered under the heading of strategic alliance, this dynamic sent us searching through the recent business literature, which turned up an interesting analysis of the phenomenon: the technology web.

    Definition of Technology Web

    An economic web is a set of companies that use a common architecture to deliver independent elements of an overall value proposition that grows stronger as more companies join the set. Within economic webs, technology webs organize around specific technology platforms. Webs are not alliances, however. They operate without any formal relationship between participants. [Hag1996]

    Perhaps the most salient example of an industry dominated by technology webs is the desktop computer industry, where the past decade has seen the Intel PC Web pitted against the Apple PC Web.


    Strategic Issues for Firms

    Today's most successful high-tech companies, especially in information technology, understand the dynamics of webs, which involve firms that cooperate informally and at times even reluctantly.

    Web Formation

    Before a web can form, two conditions must be present: a technology standard, whether official or de facto, around which companies can organize, and increasing returns, whereby the value of participating grows as more companies join. [Hag1996]

    The existence of a standard, whether official or de facto, brings companies into the web because it gives the particular technological platform around which the web is built legitimacy and longevity, thus reducing the risk borne by entering firms that the market for products related to the technology may never mature. The increasing returns condition ensures the continued incentive for firms to remain cooperative in order to grow and hence strengthen the web.
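
    To make the increasing returns condition concrete, consider a toy model of our own devising (not drawn from [Hag1996]): if the value each participant derives is assumed to grow in proportion to the number of participants n, total web value grows roughly as n squared, so each new entrant raises every incumbent's payoff. A minimal C++ sketch of this assumed payoff structure:

        // Toy model of increasing returns in a web (illustrative assumption:
        // per-firm value v(n) = a * n, so total web value ~ a * n^2).
        #include <cstdio>

        int main() {
            const double a = 1.0;                 // assumed value coefficient
            for (int n = 1; n <= 5; ++n) {
                double per_firm = a * n;          // each firm's payoff rises with n
                double total    = n * per_firm;   // total value grows quadratically
                std::printf("n=%d  per-firm=%.1f  total=%.1f\n", n, per_firm, total);
            }
            return 0;
        }

    Under this assumption, the payoff to every existing participant rises each time the web grows, which is precisely the incentive to remain cooperative described above.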

    Motivating Factors

    Many of the reasons described as motivating factors for companies to form strategic alliances and joint ventures can also be cited as issues which compel firms to participate in a technology web.

    Clearly exemplified by the personal computer web, collaborative vertical integration, wherein "highly specialized participants acted both independently and interdependently to assemble a complex package of technology components and services", is an important deliverable of a technology web. Often an information technology customer wants to purchase a vertically integrated, turnkey solution rather than buying each component separately and integrating them in-house. This market pressure is a powerful motivation for firms who do not offer complete systems to participate in a technology web.

    Because firms active in technology webs do not necessarily collaborate formally, leveraging each other's resources is less of a motivating factor than in the strategic alliance or joint venture scenario. However, web participants do take advantage of the resources of other participants. Because of the increasing returns condition which prevails in a technology web, it is to each participant's benefit for the web itself to grow, so member firms often engage in activities targeted primarily at growing the web. For example, a microprocessor company might fund an advertising campaign or even launch a development project which in itself may not be directly profitable, but which is really aimed at expanding the market for the personal computer. Although no formal collaboration may be involved, operating system companies, peripheral manufacturers, and other participants in the personal computer web would certainly benefit from the market expansion. This scenario is exemplified by Intel's recent strategic initiatives, including ProShare video-conferencing, designed to find "new uses and new users for PCs." [Mel1994]

    Again the recurring thread of risk management as a motivation for collaboration is evident in technology webs:

    Webs are a natural response to environments fraught with risk and uncertainty - which is why they are so prevalent in high-technology arenas. The "safety net" created by the other participants in a web allows a firm to focus exclusively on activities in which it can offer distinctive value. In this way, webs reduce overall investment requirements, focus individual participants' investments on areas most likely to succeed, and promote the emergence of multiple suppliers for bottleneck components.[Hag1996]

    As mentioned above, the existence of a standard is a prerequisite for web formation, and that standard reduces the risk involved in bringing a product to market. When a firm joins a web, this risk is spread over all of the participants. Additionally, web participation reduces complexity for member firms, because each participant can feel more comfortable specializing in its own core competencies.

    Strategic Roles

    Hagel identifies two distinct strategic roles that a web participant may adopt: the shaper, which defines and drives the technology platform around which the web organizes, and the adapter, which builds its own products and services on that platform.

    Critical to a firm's success in a web environment is understanding the strategic differences between the two roles and effectively casting the company into one of them. "Too often, senior management selects one of these approaches without explicitly considering either the choice itself or what it means for business strategy. Yet whether a company decides to be an adapter or a shaper has profound implications for the strategy and tactics that it must pursue to be successful." [Mel1994]


    Success or Failure

    Despite the increasing returns phenomenon present in technology webs, it is important to note that the success of the web itself is no guarantee of the success of the firm. For a firm to maintain long-term growth, however, the size of the web must continue to grow as well. The relationship between a particular firm's success and the success of its web can be summarized in four cases:

    Web fails, firm fails: The web failed, so its firms failed. Both excellent shapers and adapters may end up here if they do not effectively grow the web as a whole. Example: Apple, a competent shaper, failed because the Apple PC web stopped growing and has recently lost ground.

    Web fails, firm succeeds: No firms in this state.

    Web succeeds, firm fails: This may happen to an incapable shaper or a slow adapter. Example: IBM in the Wintel PC web.

    Web succeeds, firm succeeds: The firm succeeded in its role and the web grew as well. Examples: Microsoft and Compaq in the Wintel PC web.


    Perhaps one of the most interesting facets of the technology web is the dependence of a firm's success on the success of the web in which it participates. Overall web growth is so crucial that firms may directly or indirectly assist their competitors in the interest of growing the web. "Web strategies demand a completely different mindset from that employed in traditional strategic thinking. The context for defining strategy expands from maximizing value for the enterprise to maximizing value for the web." [Hag1996] Ultimately, for firms competing in a technology web to be successful, they must understand the tension between maximizing value for the web and maximizing value for the firm, and find a balance between the two.

    Adoption of Technology

    As was the case for strategic alliances and joint ventures, technology webs are beneficial in the sense that they promote innovation and technological cooperation between companies. "Technology webs improve the climate for innovation. Webs are largely shaped by information flows; in them, information is distributed far more widely and intensely than in conventional markets. Webs disperse information to many participants and provide robust mechanisms for disseminating learning via information links and dependencies" [Hag1996].

    In another sense, however, webs may hinder the adoption of new technologies. Once a web achieves critical mass, a new technology which competes with the standardized technology within the web faces an often insurmountable battle against inertia. Once the web gets rolling it is hard to stop, and it can be next to impossible to get a new web rolling in direct competition with an established one. Path dependencies and technology lock-in effects, which occur once a technology achieves ubiquity, only exacerbate this problem.

    Case Study: The Network Computer Technology Web

    Background

    In February 1996, Oracle Corporation unveiled a working prototype of the Network Computer, described at the time as a "family of low-cost devices that run a variety of common applications across the network" [Ora1996], sparking a great deal of interest and activity in network computing. In this section we cast the network computer movement as the beginnings of a technology web and revisit the key technology web issues through the lens of the network computer.

    Network Computer Alliance Timeline

    February 1996: Oracle demos an NC prototype, hinting that it would bring on a new era in computing. (press release)
    May 1996: Apple, IBM, Netscape, Oracle, and Sun announce a general standard for network computing devices. This collaboration results in the publication of the NC Reference Profile 1, the "official" NC standard as dictated by these five companies. (press release)
    Oct 1996: Sun Microsystems introduces its network computer, the JavaStation. Henceforth the phrase network computer turns up in reference to a wide range of thin clients, many running Java. (press release)
    Oct 1996: Microsoft and Intel respond by announcing the NetPC reference platform. The NC and NetPC standards are quite similar; in fact, the NetPC conforms to the NC standard. (press release)
    March 1997: IBM, Oracle, Sun, and Netscape announce an agreement to focus on open standards for network computing. (press release)

    Web Formation

    The network computer movement as described above is beginning to resemble a technology web in its nascent stages.  Indeed, it exhibits the two requisite criteria: a standard, embodied in the NC Reference Profile and the Java platform, and the prospect of increasing returns as more companies join.  Companies are beginning to rally around these proposed standards, especially Java, for which a $100 million venture fund was recently established.

    Motivating Factors

    Clearly the primary motivations for a company to attempt to create a new technology web are dissatisfaction with its position in existing webs, or the realization that the new web would have a high likelihood of success and, if successful, would offer the company a valuable role within it. Companies like Sun and Oracle, for example, which have been successful in general but which play no winning role in the dominant personal computer technology web, that of Wintel, would like the computer users of the world to shift to a different desktop computing paradigm, especially one which prominently features networking, a specialty of Sun, and central information servers, a specialty of Oracle.

    Strategic Roles

    In the burgeoning network computer technology web, companies are already beginning to align themselves in the strategic role framework outlined previously in the Strategic Issues for Firms section.

    Success or Failure

    Of course it is difficult to prognosticate about the success or failure of a particular technological movement, but several factors suggest that, while the network computer will not replace the workstation or the personal computer, the network computer web will enjoy a great deal of growth in the coming years.

    For example, it is clear that the companies who are driving the network computer movement have learned lessons from the victories and defeats of recent memory. They appear to be focused on maximizing the overall value of the web via open systems, avoiding the common pitfall of trying to grab everything for themselves through proprietary technologies.  Companies are out there evangelizing not only for their own products, but for the concept of network computing in general.

    So while it seems likely that the network computer web will attain a measure of success, predicting which companies will thrive within it is a trickier matter indeed.  Chances are it will be those firms that understand and capitalize on the technology web dynamics outlined here.


    Conclusions


    Summary

    In this report we have discussed the major modes of collaboration (standards organizations, consortia, strategic alliances and joint ventures, and technology webs) and described several key issues which compel firms to collaborate.  These motivational issues, ranging roughly from the altruistic to the purely commercial, are strongly correlated with the various collaborative forms.  For example, when promotion of social welfare or national excellence is the goal of collaboration, a standards organization or consortium is typically the most appropriate vehicle to achieve it.  More business-oriented motivations, such as market domination or sharing the design cost burden, are typically best served by a strategic alliance or technology web.

    Adoption of Technology

    On the whole, collaboration between firms promotes the adoption of technology.  It is difficult these days to identify a high tech product or service which came to market without the benefit of a standards organization, consortium, strategic alliance, or technology web.  The nature of information technology industries, characterized by large numbers of complex interrelated technologies, a rapid pace of innovation, and high development costs, makes collaboration important if not absolutely vital for firms.  Although in some cases, collaboration aimed at market domination may impede the evolution of new technologies, cooperation among organizations is instrumental in disseminating knowledge and enhances the overall productivity of an industry.

    Future of Collaboration

    Recent decades have seen a drastic increase in inter-firm collaboration, especially in information technology industries.  Several authors have suggested that this increase in collaborative activity is merely a temporary phase.  It is difficult to prognosticate about the future of the collaborative phenomenon, but there does not seem to be any indication that the factors which currently compel firms to work together will become less important in the coming years.  As long as IT industries are characterized by a high rate of growth and innovation, and a high degree of uncertainty, firms will continue to regard collaboration as an attractive business strategy.


    Bibliography


    Introduction and General Business Issues:

    [Newsbytes 9/96] Novell Begins New Marketing Strategy, Newsbytes News Network, September 11, 1996.

    [Moore 96] Moore, Geoffrey, Inside the Tornado, 1996.

    [InfoPart] Information Partnerships - Shared Data, Shared Scale, Benn Konsynski and Warren McFarlan, Harvard Business Review (HBR)

    [Stretch] G. Hamel, and C.K. Prahalad, Strategy as Stretch and Leverage , HBR.

    [Era] Crawford, Richard, The Era of Human Capital, 1991.

    [Strat] Porter, Michael, Competitive Strategy, 1980.

    [Value] Taggert, M.C., The Value Imperative.

    [Innov] Utterback, James, Mastering the Dynamics of Innovation, 1996.

    [Shared] Schrage, Michael, Shared Minds: The New Technologies of Collaboration, 1990.

    [White 94] White, Sondhi, Fried, The Analysis and Use of Financial Statements, 1994.

    [Taligent 95] Taligent's Passing, Network World, 1995.

    [IODM 72] Mathew Tuite, et al. (editors), Interorganizational Decision Making, Aldine Publishing, 1972.

    Information Systems Meta-List http://www.cait.wustl.edu/cait/infosys.html

    Object Communications Information http://www.mcc.com/projects/ndvas/tech-object.html

    ETSI infoCentre http://www.etsi.fr/dir/infocentre/vista/vista.htm

    GOSS - Guide to Open Systems Specifications http://www.ewos.be/goss/top.htm

    UniForum Association Extends Blocks of Compli. http://www.opengroup.org/launch/Uniforum.html

    Technology Policy Working Group http://nii.nist.gov/cat/tp/t960723.html


    Standards Organizations:

    [Rut1996] Rutkowski, Anthony M, Today's Cooperative Competitive Standards Environment For Open Information and Telecommunication Networks and the Internet Standards-Making Model, Standards Development and Information Infrastructure Workshop, June, 1994

    [Gad1996] Gadegast, Frank, MPEG-FAQ Version 4, http://www.powerweb.de/mpeg/mpegfaq/, March, 1996

    [Fuj1995] Hiroshi Fujiwara, MPEG textbook, ASCII Corporation, November, 1995

    [Sch1995] Schwartz, Mike, Group to Handle MPEG Intellectual Property Issues, http://www.cablelabs.com/PR/MPEG-IPR_release.html, Cable Television Laboratories, Inc., 1995

    [Tay1996] Tayer, Marc L., Beyond MPEG: Why not All MPEG-2 Digital Television Systems are created equal, http://www.powerweb.de/mpeg2/doc/dcsystem/dcsystem.htm, General Instrument Corporation, March, 1996

     [Cro1993] Crocker, D, Making Standards the IETF Way, Standard View, Vol.1, No.1, 1993


    Consortia:

    [Coo 1996] Coombs, Rod (editor), Technological Collaboration: The Dynamics of Cooperation in Industrial Innovation, Elgar Publishing, 1996.

    [Gib 1992] Gibson, David (editor), Technology Transfer in Consortia and Strategic Alliances, Rowman & Littlefield Publishers, 1992.

    [Dod 1993] Dodgson, Mark, Technological Collaboration in Industry, Routledge, 1993.

    [Von 1991] Vonortas, Nicholas, Cooperative Research in R&D-intensive Industries, Avebury, 1991.

    [Sou 1993] Souder, W.E., Getting together: a state-of-the art review of the challenges and rewards of consortia, International Journal of Technology Management, vol. 8, 1993: p. 784-801.

    [Ouc 1988] Ouchi, W.G. and Bolton, M.K., The Logic of Joint Research and Development, California Management Review, Spring 1988: p. 9-33.

    [Sou 1990] Souder, W.E. and Nassar, S., Managing R&D Consortia for Success, Research-Technology Management, Sep/Oct 1990: p. 44-50.

    [Eva 1990] Evan, W.M. and Olk, P., R&D Consortia: A New U.S. Organizational Form, Sloan Management Review, Spring 1990: p. 37-46.

    [Hag 1992] Hagedoorn, J., Leading Companies and Networks of Strategic Alliances in Information Technologies, Research Policy, April 1992, p. 163-190.

    [Pet 1988] Peterman, J., Address to the Industrial Research Institute at the meeting of the National Academy of Sciences, Washington, D.C.

    [Yof 1988] Yoffie, D.B., How an industry builds political advantage, Harvard Business Review, 66(3), 1988, p. 82-89.

    [Gri 1994] Grindley, Peter; Mowery, David; Silverman, Brian, SEMATECH and Collaborative Research: Lessons in the Design of High-Technology Consortia, Journal of Policy Analysis and Management, Fall 1994, p. 723-758.

    [Bro 1995] Browning, Larry; Beyer, Janice; Shetler, Judy, Building cooperation in a competitive industry: SEMATECH and the semiconductor industry, Academy of Management Journal, Feb 1995: p. 113-151.

    [Ald 1995] Aldrich, H. and Sasaki, T., R&D consortia in the United States and Japan, Research Policy, Mar 1995: p. 301-316.

    [Qui 1995] Quintas, P. and Guy, K., Collaborative, pre-competitive R&D and the firm, Research Policy, May 1995: p. 325-348.

    [Sci 1987] Sciberras, E., Government Sponsored Programmes for International Technological Exchanges and Applied Collective Research, R&D Management, Jan 1987, p. 15-23.

    [USG 1992] U.S. General Accounting Office, SEMATECH's Technological Progress and Proposed R&D Program, GAO/RCED-92-223BR, Washington D.C., July 1992


    Strategic Alliances and Technology Webs:

    [Tak1996] Takahashi, Dean, Netscape and Hewlett-Packard Form an Alliance, San Jose Mercury News, May 15, 1996.

    [Hag1996] Hagel, John III, Spider versus Spider, The McKinsey Quarterly, 1996 Number 1, March 1996: 4.

    [Sch1996] Schlender, Brent, A Conversation with the Lords of Wintel, Fortune, July 8, 1996: 42.

    [She1992] Sherman, Stratford, Are Strategic Alliances Working?, Fortune, September 21, 1992: 77.

    [Moo1994] Moore, Charles R. and Stanphill, Russell C., The PowerPC Alliance, Communications of the ACM, Vol. 37, No. 6, June 1994: 25-27.

    [Kes1994] Kessler, Andrew J., Is there any hope for IBAppleola?, Forbes, v154, n14, December 19, 1994: 319.

    [Hag1993] Hagedoorn, John, Strategic technology partnering during the 1980s: trends, networks and corporate patterns in non-core technologies. Research Policy v. 24, 1995: 207-231.

    [Hag1991] Hagedoorn, John and Schakenraad, Jos, Leading companies and networks of strategic alliances in information technologies, Research Policy v. 21, 1992: 163-194.

    [Har1987] Harrigan, Kathryn, Strategic Alliances: Their New Role in Global Competition, Columbia Journal of World Business, Summer 1987.

    [Dod1993] Dodgson, Mark, Technology Collaboration in Industry, Routledge, London, 1993.

    [Rai1996] Arun Rai, Santanu Borah, and Arkalgud Ramaprasad, Critical Success Factors for Strategic Alliances in the Information Technology Industry: An Empirical Study, Decision Sciences Volume 27 Number 1, Winter 1996.

    [Mar1995] Markoff, John, Does General Magic have the touch in hand-held computers?, New York Times v144 (Mon, Feb 13, 1995).

    [Bra1995] Brandt, Richard, Dangerous Liaisons, Upside v7, n12, Dec 1995.

    [Eco1996] General Magic:  Unintelligent, Economist v338, n7951, Feb 3, 1996.

    [Mel1994] Melanson, Daniel, Intel veers off in new direction, Info Canada v19, n3, March 1994: 3, 32.

    [Ern1993] Ernst, David and Bleeke, Joel, Collaborating to Compete, Wiley, New York, 1993.


    Appendix: ODMG Specification

    The ODMG-93 specification includes an introductory chapter explaining the goals and discussing architectural issues and how the standards fit together; an object model chapter, which is an extension of the OMG object model; an object definition language (ODL) standard, which provides a programming-language-independent mechanism to express user object models (schemas) and is an extension of the OMG IDL; an object query language (OQL), which provides a declarative access interface for interactive and programmatic query as an extension of SQL; a binding to C++ for all functionality, including object definition, manipulation, and query; a similar binding for Smalltalk; an appendix mapping the object model to OMG's; an appendix mapping the architecture to OMG's ORB; and an appendix suggesting enhancements to ANSI C++ which would allow better language integration and facilitate other, more general application needs in C++.

    The ODMG effort might be considered analogous to early work on the SQL standard for relational systems. Note, however, that ODMG had no de facto language to start with, and had to produce a standard that spans much more, including integration with application programming languages such as C++ and Smalltalk. Substantial creative effort has been invested in ODMG-93.

    The ODMG-93 release does not cover all possible areas of functionality; for example, it does not cover distributed database interoperability or versioning. However, it covers all the basic capabilities necessary for an application to use an ODBMS: to create, modify, and share objects. Applications written to these interfaces will operate across all compliant ODBMS implementations with a re-compile. Continuing work is planned for later releases that will address added functionality, track evolving related standards (e.g., changing C++), and map to further languages and domains.

    Object Oriented Database Management Systems (ODBMS)

    Object-oriented databases are databases that support objects and classes. They differ from the more traditional relational databases in that they allow structured sub-objects, each object having its own identity or object-id (as opposed to a purely value-oriented approach). In addition, they provide support for object-oriented features such as methods and inheritance. It is also possible to provide relational operations on an object-oriented database. ODBMSs offer all the benefits of object orientation, as well as the ability to maintain a strong equivalence with object-oriented programs, an equivalence that would be lost if an alternative were chosen, such as a purely relational database. Rather than providing only a high-level language such as SQL for data manipulation, an ODBMS transparently integrates database capabilities with the application programming language. This transparency makes it unnecessary to learn a separate data manipulation language (DML), obviates the need to explicitly copy and translate data between database and programming language representations, and supports substantial performance advantages through data caching in applications.
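
    To make this transparency concrete, here is a minimal C++ sketch, assuming an ODMG-style binding. The names used (the odmg.h header, d_Database, d_Transaction, d_Ref, and the Employee class) follow the flavor of the ODMG C++ binding but are illustrative assumptions, not a definitive vendor API:

        // Sketch of transparent persistence in an assumed ODMG-style C++ binding.
        #include <string>
        #include <odmg.h>   // hypothetical umbrella header; real headers are vendor-supplied

        // A persistent-capable class: ordinary C++ members, with relationships
        // expressed as typed references (d_Ref<T>) rather than foreign keys.
        class Employee {
        public:
            std::string     name;
            double          salary;
            d_Ref<Employee> manager;
        };

        int main() {
            d_Database db;
            db.open("personnel");                  // open an ODBMS database by name

            d_Transaction txn;
            txn.begin();

            // Objects are created and updated in ordinary C++; there is no separate
            // DML and no explicit copying between database and program representations.
            d_Ref<Employee> e = new(&db) Employee; // persistent new, per the binding
            e->name   = "Ada";
            e->salary = 90000.0;

            txn.commit();                          // changes become durable at commit
            db.close();
            return 0;
        }

    Per the portability claim above, code of this kind should move across compliant ODBMS implementations with a re-compile.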


    The ODMG specification consists of the following:

    Object Model

    The common data model to be supported by the ODBMS that integrates with OMG's Object Model. Components (such as relationships) that are necessary for ODBMS have been added to the OMG model.

    Object Definition Language (ODL)

    The data definition language for the database. ODL is a superset of the OMG's Interface Definition Language (IDL) and provides a programming-language-independent mechanism to express user object models (schemas).

    Object Query Language (OQL)

    A declarative (non-procedural) language for querying and updating database objects, supporting interactive and programmatic query as an extension of SQL.

    Programming language bindings

    Define how applications written in the supported programming languages (currently C++ and Smalltalk) can manipulate persistent objects, including object definition, manipulation, and query. [Figure: ODMG hierarchy of languages and steps in the generation of an ODBMS application.]
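
    As an illustration of how OQL is reached from the programming language bindings, here is a brief C++ sketch reusing the hypothetical Employee class from the sketch above. The names d_OQL_Query, d_Bag, and d_oql_execute follow the spirit of the ODMG C++ binding but should be read as assumptions, not a vendor-exact API:

        // Sketch of issuing an OQL query through an assumed ODMG-style C++ binding.
        #include <odmg.h>   // hypothetical umbrella header, as in the previous sketch

        // Assumes a database has been opened and that "Employees" names an extent
        // or collection visible to the query processor.
        void list_well_paid() {
            d_Transaction txn;
            txn.begin();

            // OQL is declarative and SQL-like; parameters are bound with operator<<.
            d_OQL_Query q("select e from e in Employees where e.salary > $1");
            q << 80000.0;

            // Results come back as a collection of typed references.
            d_Bag< d_Ref<Employee> > result;
            d_oql_execute(q, result);   // run the query and fill the result collection

            txn.commit();
        }

    The point of the binding is that the query result is immediately a collection of typed C++ references, ready to navigate without translation between database and program representations.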