
Industrial and Economic Properties of Software




DOCUMENT INFORMATION

Basic information

Title: Industrial and Economic Properties of Software: Technology, Processes, and Value
Authors: David G. Messerschmitt, Clemens Szyperski
Institution: University of California
Field: Electrical Engineering and Computer Sciences
Document type: Essay
Year: 2000
City: Berkeley
Pages: 50
File size: 427 KB

Structure

  • 1.1 Software—a unique good
  • 1.2 Software—a unique industry
  • 1.3 Foundations of information technology
  • 1.4 Perspectives and issues
  • 2.1 Productivity and impact
  • 2.2 Network effects
  • 2.3 Usage
  • 2.4 Quality and performance
  • 2.5 Usability
  • 2.6 Security and privacy
  • 2.7 Flexibility and extensibility
  • 2.8 Composability
  • 3.1 Advancing technology
  • 3.2 Program execution
    • 3.2.1 Platform and environment
    • 3.2.2 Portability
    • 3.2.3 Compilation and interpretation
    • 3.2.4 Trust in execution
    • 3.2.5 Operating system
  • 3.3 Software development process
    • 3.3.1 Waterfall model
    • 3.3.2 Development tools
    • 3.3.3 Architecture
    • 3.3.4 Interfaces and APIs
    • 3.3.5 Achieving composability
  • 3.4 Software as a plan and factory
  • 3.5 Impact of the network
  • 3.6 Standardization
  • 4.1 Four value chains
  • 4.2 The stages of the supply value chain
    • 4.2.1 Development
    • 4.2.2 Provisioning
    • 4.2.3 Operations
    • 4.2.4 Use
  • 4.3 Total cost of ownership
  • 5.1 Copyright
  • 5.2 Patents
  • 6.1 Industrial organization
  • 6.2 Business relationships
    • 6.2.1 Types of customers
    • 6.2.2 Software distribution
    • 6.2.3 Software pricing
    • 6.2.4 Acquiring applications
    • 6.2.5 Acquiring infrastructure
  • 6.3 Vertical heterogeneity
  • 6.4 Horizontal heterogeneity
    • 6.4.1 Multiple platforms
    • 6.4.2 Shared responsibility
    • 6.4.3 Distributed partitioning
  • 6.5 An industrial revolution?
    • 6.5.1 Frameworks
    • 6.5.2 Components
  • 7.1 Demand
    • 7.1.1 Network effects vs. software category
    • 7.1.2 Lock-in
  • 7.2 Supply
    • 7.2.1 Risk
    • 7.2.2 Reusability
    • 7.2.3 Competition
    • 7.2.4 Dynamic supply chains
    • 7.2.5 Rapidly expanding markets
  • 7.3 Pricing
    • 7.3.1 Value pricing and versioning
    • 7.3.2 Variable pricing
    • 7.3.3 Bundling
    • 7.3.4 Third-party revenue
  • 7.4 Evolution
  • 7.5 Complementarity
  • 8.1 Information appliances
  • 8.2 Pervasive computing
  • 8.3 Mobile and nomadic information technology
  • 8.4 A component marketplace
  • 8.5 Pricing and business models

Content

Software—a unique good

Like information, software is an immaterial good—it has a logical rather than physical manifestation, as distinct from most goods in the industrial economy, which are material.

Both software and information rely on a physical infrastructure to be effective. Information gains value through its ability to inform, necessitating a tangible medium for storage and access, such as paper, disks, or displays. Similarly, software is appreciated for its functionality but depends on a computer processor to execute its purpose. In this sense, software parallels services in the industrial economy, as both are immaterial yet require a provider—whether mechanical or human—to deliver their intended outcomes.

Software is distinct from other goods and services due to its significant economies of scale, characterized by high initial creation costs but minimal reproduction and distribution expenses. This unique aspect aligns software more closely with information than with traditional material products.

Software, unlike information that is valued for its ability to inform, derives its worth from the actions and behaviors it performs. It often substitutes for human-provided services in areas such as computation, robotics, email, and word processing. Additionally, software can replace physical goods, similar to how typewriters and telephones functioned. Ultimately, the value of software lies in its execution rather than the insights it offers.

Software—a unique industry

The software industry is distinct due to its unique combination of familiar characteristics. Similar to novel writing, investing in software development carries risks; however, it crucially requires collaboration with end-users to define features. While software applications play a vital role in business operations akin to an organizational hierarchy, they are often created by external vendors who may struggle to adapt to specific or evolving requirements.

Software is unique in its value, as it relies heavily on an execution infrastructure rather than incurring unit manufacturing costs like physical goods. Unlike most material products, a single software application is composed of various internal units, or modules, which may come from different vendors and possess distinct ownership. This concept of ownership in software is influenced by intellectual property laws, differing significantly from the traditional notions of title and physical possession associated with material goods.

The software industry faces unique challenges that impact end-users, service providers, and regulatory communities. This paper aims to explore the multifaceted nature of software creation and usage in the real world. The authors, who are technical specialists with a focus on industrial, business, and economic issues, seek to enhance software technology and improve the organizational processes that contribute to its success. By encouraging software professionals and managers to consider improved strategies for software investments, we believe that a deeper examination of software's characteristics and related strategies will benefit the industry. Our primary objective is to foster a better understanding of software as a distinct good, separate from material and informational goods, and to examine the processes that surround it, ultimately inspiring further investigation into the strategic challenges that arise from this understanding.

Foundations of information technology

Software is often perceived through our interactions with personal computers, where it functions as a program executing various instructions to perform useful tasks for users. However, the reality is far more complex, as software plays a crucial role in the operations of organizations and society as a whole.

Information technology (IT) is designed to acquire, manipulate, and retrieve information, defined as recognizable patterns such as text, images, and audio that inform individuals or organizations. IT comprises three key components: processing, which modifies information; storage, which preserves it over time; and communications, which transmits it across different locations.

Often, products valued for their behavior or actions have a material or hardware embodiment.

Hardware in information technology encompasses the physical components governed by principles of electronics, magnetics, and optics. While it is possible to build an entire information technology system using only hardware, the essence of computing lies in the programmability of hardware. Ultimately, a computer's capabilities are shaped not just by its hardware but also by how that hardware is programmed.

Hardware functionality, initially fixed at the time of manufacture, can be modified post-production through the addition and execution of new software. This highlights the fundamental interchangeability of hardware and software, where each can potentially replace the other, allowing us to perceive software as immaterial hardware. The distinction between software and hardware achievements is somewhat arbitrary and evolves over time.

Information exists in various forms, including sound as pressure waves, images as two-dimensional intensity fields, and text as sequences of characters and punctuation. Regardless of its medium, all information can ultimately be represented as collections of bits (immaterial entities assuming two values: zero and one); such a collection is also known as data.
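A minimal illustration in code (ours, not the paper's): any information, here a short text, reduces to a collection of bits once digitally encoded.

    message = "Hello"                               # text, one form of information
    data = message.encode("utf-8")                  # the same information as bytes
    bits = "".join(f"{byte:08b}" for byte in data)  # ...and as raw bits
    print(bits)  # 0100100001100101011011000110110001101111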

In information technology, all data and software are represented digitally, enabling seamless integration and interaction among various types of information. This digital representation enhances the processing, storage, and communication of bits, making IT systems versatile and efficient.

An operating IT system transmits bits through time and space, with communication links facilitating the flow from sender to multiple receivers. Storage allows for the temporal conveyance of bits, where a sender can store information for later retrieval by a recipient. Processing alters bits at designated points in space-time, executed by hardware that interprets software instructions. Essential to this process is the material hardware, composed of atoms and photons, which enables the existence of bits and supports their processing, storage, retrieval, and communication across different locations.

Perspectives and issues

This paper examines software through six distinct perspectives: users, software engineers, managers, lawyers, owners, and economists. While it does not claim to cover every aspect of these viewpoints, it emphasizes the most relevant and significant issues. The main body is structured according to these perspectives, following the sequence presented in Table 1.

The article emphasizes three fundamental aspects of software: technology, processes, and value. In terms of technology, it distinguishes between software applications that deliver direct benefits to end-users and software infrastructure that supports multiple applications. Processes outline the necessary steps for effectively supplying and utilizing software, highlighting their interdependencies. The value component focuses on the sequential value chains in software, where the supplier value chain, beginning with the software vendor, enhances functionality for users during execution, while the requirements value chain, originating from business ideas, captures user objectives to create detailed implementation requirements. Together, these value chains create a cyclical relationship, illustrating how innovations often stem from software developers who rely on end-user feedback for validation and enhancement.

The article highlights the interdependencies of technology, processes, and value, emphasizing key considerations at the intersection of various perspectives and issues. A reference table is provided to illustrate these relationships, which will aid in understanding the numerous topics discussed later. The subsequent sections will explore the six perspectives outlined in the table and their interconnections.

Table 1. Examples of issues (columns) and perspectives (rows) applying to commercial software.

  • Users: flexibility; security, privacy; functionality, impact
  • Software engineers: representation, languages, execution, portability, layering; architecture, composition vs. decomposition, standardization
  • Managers: infrastructure; development, provisioning, operations
  • Owners: intellectual property (patent, copyright, trade secret)
  • Lawyers: licensing, business process patents, antitrust
  • Economists: software and content supply, outsourced development, system integration, service provision; business relationships, terms and conditions

Software is designed primarily to meet the needs of its end users, which can include individuals, organizations such as companies and universities, groups of organizations like commerce, and society as a whole, encompassing areas like entertainment and politics.

Technology directly affects users by necessitating the acquisition and management of complementary infrastructure, including the necessary hardware and software for processing, storage, and communication. Coordinating the design and provisioning of software applications with existing organizational processes presents a significant challenge for both end-users and vendors, requiring adaptation of processes to fit the software or vice versa.

Software provides instructions that dictate a computer's behavior, making its primary value stem from the actions it enables on behalf of users. While the value of software can vary depending on the specific application context, there are essential universal aspects to consider. Acquiring, provisioning, and operating software incurs various costs, such as payments to suppliers, hardware acquisition, and employee salaries. Therefore, software that is more cost-effective significantly enhances the user's value proposition.

Productivity and impact

Valuing an application often hinges on its tangible impact on an organization or individual user, enhancing effectiveness and success. By improving productivity, reducing task completion time, fostering collaboration among workers, and better managing knowledge assets, applications can significantly elevate the quality of outcomes. Furthermore, certain applications enable achievements that would be impossible otherwise, such as in the realms of movie special effects and design simulations.

Network effects

The value of many software products is influenced not only by their inherent features but also by the number of users adopting similar solutions, a phenomenon known as a network effect or network externality. This effect manifests in two forms: the stronger direct network effect, where user interaction enhances value with more participants, and the weaker indirect network effect, where value derives from secondary assets such as content, trained staff, and complementary applications, leading to increased investment as more users join. For instance, remote conferencing tools exemplify direct network effects by facilitating user interaction, while the Web illustrates indirect effects through the content it attracts. Additionally, widely used word processing applications provide significant value to individual users while also benefiting from enhanced collaboration as more users share documents.
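One common stylized way to put numbers on this distinction (a textbook model, not one given in this paper) is to count what each additional user adds: every new user of a directly networked product can interact with every existing user, while complementary assets tend to grow only in proportion to adoption:

    V_direct(n)   is proportional to n(n-1)/2   (potential pairwise interactions)
    V_indirect(n) is proportional to n          (complementary assets track adoption)

Under this model, doubling the user base roughly quadruples the value of a directly networked product but only doubles the value of an indirectly networked one, which is one way to see why the direct effect is the stronger of the two.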

Usage

Generally speaking, software that is used more offers more value. Usage has two factors: the number of users, and the amount of time spent by each user.

Quality and performance

Quality is defined by the user's perceptual experience, encompassing the number and severity of defects alongside performance metrics. Key performance indicators include the volume of work done, such as the number of web pages served per unit time, and interactive delay, which refers to the time taken from clicking a hyperlink to the page's appearance. While observed performance can be affected by perceptual factors, objective measures are relevant when the observer is another software component.
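As a minimal sketch of how the two measures are computed (fetch_page is a hypothetical stand-in for serving one web page; none of this comes from the paper):

    import time

    def fetch_page():
        time.sleep(0.05)  # hypothetical: serving one page takes about 50 ms

    n_requests = 20
    start = time.perf_counter()
    for _ in range(n_requests):
        fetch_page()
    elapsed = time.perf_counter() - start

    # The two performance measures named above:
    print(f"throughput: {n_requests / elapsed:.1f} pages per second")
    print(f"interactive delay: {1000 * elapsed / n_requests:.0f} ms per page")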

Completely avoiding perceived and real defects in software development is impossible due to inherent mismatches between what is built and user needs. Accurately capturing individual user requirements at any given moment is challenging, especially when software is designed for a broad audience to maximize revenue and must adapt to evolving user needs over time. As a result, the requirements of diverse users can only be approximated. Additionally, the impracticality of identifying every design flaw further contributes to the presence of defects in software.

Despite these observations, it's crucial to recognize that defects vary in severity based on their perceptual and quantifiable impact. For instance, a defect that results in substantial loss of time and effort invested is considered far more severe than one that merely causes a temporary disruption in display resolution.

Usability

Usability is a critical aspect of quality, defined by users' perceptions of how easy or difficult it is to complete tasks within an application. This perception is subjective and varies significantly among users, influenced by factors such as education, background, skill level, preferred interaction methods, and prior experience. To enhance usability for a diverse audience, applications must provide alternative ways to achieve the same objectives. Similar to quality, usability can be challenged by the necessity to meet the diverse and evolving needs of a broad user base.

Security and privacy

Security aims to prevent external attacks that seek to reveal confidential information or harm software and data, while privacy focuses on protecting individuals or organizations from being tracked or having their activities correlated. Both security and privacy are essential in minimizing unwanted external influences and safeguarding sensitive information.

Security and privacy are governed by policies that outline permissible actions, established by the end-user or organization and enforced through software and hardware. However, as these policies become more stringent, they can negatively affect usability. It is therefore valuable to offer configurability, based on the needs of the individual or organization and on the sensitivity of the information being protected.

Establishing and maintaining trust is a critical component of security and privacy, especially in transactions involving multiple parties. A mutual network of trust must be created or recognized, often facilitated by reliable third parties.

Flexibility and extensibility

In today's fast-paced business environment, the ability to adapt to evolving requirements is crucial, especially in the context of organizational changes such as mergers and divestments, as well as the introduction of new products and services.

End-user organizations invest significantly in adopting specific application solutions, often restructuring their business processes to align with these applications. Software suppliers that offer a clear roadmap for future enhancements instill confidence in users, reducing the likelihood of needing to switch solutions in the future.

Composability

A single closed software solution often lacks the value found in systems that can integrate with other solutions for enhanced functionality. This concept, known as the composability of complementary software, allows for seamless information sharing and formatting across applications, such as word processors and spreadsheets within an office suite. More complex scenarios involve the ability to combine various business applications to develop innovative products or services.

Software engineering primarily focuses on the development of functional software, encompassing design, implementation, testing, maintenance, and upgrades. While users represent the demand side, software development constitutes the supply side, with various intermediaries involved in the supply chain. Although a thorough exploration of software development could fill numerous volumes, this article will highlight a few key aspects.

Advancing technology

The rapid advancements in processing, storage, and communication technologies are marked by an exponential increase in performance at a consistent cost, doubling approximately every 1.5 to 2 years, with even more significant gains in storage and fiber-optic communication. Future improvements are anticipated to reach an impressive factor of a million, although physical laws will ultimately set the limits. Currently, the pace of technological advancement is influenced by economic factors, as technology suppliers invest in innovation based on current revenues and market size expectations. This investment strategy, shaped by the anticipated return on investment and associated risks, ultimately dictates the rate of research, development, and manufacturing. Additionally, a predictable advancement rate fosters collaboration among various industry players, including microprocessor manufacturers and semiconductor equipment vendors.
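A back-of-the-envelope check of those figures, assuming only what the paragraph states (performance doubles every 1.5 to 2 years at constant cost):

    # Exponential improvement: a factor of 2 ** (years / doubling_time).
    for doubling_time in (1.5, 2.0):
        for years in (10, 20, 30, 40):
            factor = 2 ** (years / doubling_time)
            print(f"doubling every {doubling_time} yr: after {years} yr, x{factor:,.0f}")

    # A factor of a million is about 2 ** 20, i.e. 20 doublings:
    # roughly 30 years at one doubling per 1.5 years, or 40 years at one per 2.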

Recent technological advancements significantly influence the software industry by allowing developers to focus on enhancing usability features, such as graphical user interfaces and real-time video. This shift not only reduces the time to market but also enables the addition of valuable functionalities, ultimately improving the overall user experience.

Program execution

Platform and environment

A target program typically does not operate in isolation; it depends on complementary software, while also serving as a foundation for other software. The platform encompasses all hardware and software considered available and static from the target's perspective, such as a computer and its operating system. Additionally, there may be other software that exists outside the platform's control, which, when combined with the platform, forms the environment for the target. If other software relies on the target being stable and available, the target becomes part of that software's platform. Therefore, the definition of a platform is relative to the specific target it supports.

Portability

Programming should not be overly dependent on specific processor instruction sets, as this can lead to challenges in writing, reading, and understanding code due to the simplistic nature of individual instructions. Additionally, ensuring that programs are portable and can execute on various processors is essential for broad usability and efficiency. For this reason, software is developed using an abstract execution model, divorced from the instruction set of a particular processor.

Portability of software ensures that its full functionality is maintained across various computers and operating systems. To achieve this, a consistent platform must be established for the portable application, which can be accomplished by integrating software into each operating system. This consistent platform, commonly referred to as a virtual machine, standardizes interactions with operating system resources, input/output devices, and network connections. Consequently, it provides a uniform representation for programs, facilitating their portable execution across diverse computing environments.
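To make the idea concrete, here is a deliberately tiny sketch of a virtual machine (every name is invented; real virtual machines such as the JVM are far richer but follow the same principle). A program in the VM's uniform instruction format runs unchanged on any computer that hosts the interpreter:

    def run(program):
        """Interpret a program given in a uniform, processor-independent format."""
        stack = []
        for op, arg in program:
            if op == "PUSH":
                stack.append(arg)
            elif op == "ADD":
                stack.append(stack.pop() + stack.pop())
            elif op == "PRINT":
                print(stack.pop())

    # The same "portable object code" executes wherever the interpreter runs.
    run([("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PRINT", None)])  # prints 5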

Particularly in the networked age, portability is an essential business requirement for many applications (see Section 3.5).

Compilation and interpretation

Source code is the format of a program that is directly manipulated by software developers and is designed to be read by both humans and various development tools. One such tool is an automatic translator that converts source code into another program format known as object code. The object code that can be executed directly on a target processor is referred to as native code. It is not always necessary to translate source code directly into native code; instead, multiple transformations can be applied, which can occur at different stages and locations.

Traditionally, software undergoes a single transformation during development, known as compilation, or just before execution, referred to as interpretation. Compilation enables developers to convert code into native format tailored for a specific target processor, streamlining the execution process. Interpretation, by contrast, allows transformation on the fly, at the time of execution, by the target processor.

Compiled object code is designed to run on a specific target processor, while interpreted code can execute on various processors without modification. Although portable execution through compilation necessitates separate software distributions for each target, interpretation enables portable execution from a single software source distribution.

In multi-stage translation, such as that used in the Java language, compilation and interpretation can be effectively combined, allowing software to run on various targets while maintaining the performance benefits of compilation. For applications that execute repeatedly on the same processor, interpretation can lead to unnecessary performance penalties, which can be mitigated through just-in-time (JIT) compilation. This process involves invoking a compiler within the interpreter to convert some intermediate object code into native code, potentially incorporating online optimization to enhance compilation by monitoring local execution. Current Java implementations exemplify this approach.
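CPython offers a readily inspectable example of the first stage of such a scheme: source code is compiled once into a portable intermediate bytecode, which a virtual machine then interprets; Java-style JIT compilers go one step further and translate that bytecode into native code at run time.

    import dis

    def add(a, b):
        return a + b

    # Show the intermediate object code the source above was compiled into;
    # this representation is portable to any machine running the interpreter.
    dis.dis(add)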

Interpretation and just-in-time (JIT) compilation are crucial for achieving execution portability in software applications. By utilizing install-time or JIT compilation, as seen in the Microsoft .NET Framework's common language runtime, interpretation can be entirely eliminated without sacrificing portability. Moreover, platform vendors can leverage interpretation and JIT compilation to enable software designed for one platform, such as Windows applications on a Pentium architecture, to run seamlessly on different platforms, like the Digital Alpha.

Trust in execution

Users inherently trust executing programs, which poses risks if the software is untrustworthy, potentially leading to data damage and privacy violations. This concern emphasizes the importance of selecting a secure intermediate object code format. Currently, two models address this issue: the first employs cryptographic technology to ensure that object code is from a reputable vendor and has not been altered, while the second allows for real-time verification of the code's behavior during execution, enforcing policies to prevent malicious actions.

Operating system

An application program does not operate in isolation; it functions alongside an operating system that creates an abstract execution environment. This environment shields the program from the complexities of the computer hardware, such as data storage methods, and manages multitasking by allowing multiple programs to run simultaneously. Additionally, the operating system allocates shared resources like memory and processor cycles and offers essential services, including network communications. Consequently, the operating system is a vital component of any computing platform, working in tandem with the hardware.

Software development process

Waterfall model

The requirements value chain focuses on the end-user experience to define software development requirements, complemented by the waterfall model, which consists of distinct phases that add value sequentially. In the conceptualization and analysis phase, a vision and detailed development plan are created, establishing the necessary requirements for investment. The architecture and design phase employs a "divide and conquer" strategy to break the system into manageable components, allowing for independent implementation and testing. Finally, these modules are integrated and evaluated to ensure functionality and performance.

Traditional software development methods focused on starting with end-user requirements and delivering software, but this approach is becoming outdated. Today, most new software is developed through modifications and updates of existing software. The key asset for producing new software is the established source base, which consists of the repertoire of source code that an organization has mastered. An alternative to maintaining a large source base is the use of software components, which allows for viewing source code as a collection of individual units that can be evolved separately. This method promotes the composition of various software products from individually developed components, rather than arbitrarily modifying a growing source base.

The waterfall model effectively outlines distinct development activities; however, it oversimplifies the process by failing to consider the existing code base, overlooking the significant overlap between phases, and neglecting the fact that requirements often change throughout the development lifecycle.

Development tools

Development tools significantly enhance project value by minimizing both development time and costs. They automate time-consuming tasks and perform various functions, including tracking and merging changes. For large projects that involve hundreds or thousands of software engineers, advanced toolkits are essential for effective management and long-term success.

Architecture

Building software using existing assets can be approached more systematically by designing software systems that are inherently interconnected. This method reduces reliance on developers to identify reusable code or components, as the architecture itself fosters these relationships. Software architecture is crucial in managing system complexity, enabling the overall system to be constructed from independently developed components.

Architecture serves as a foundational framework for designing software systems, guiding the development of individual designs that adhere to its principles. When executed effectively, architecture breaks down systems into distinct modules, clarifying their interdependencies and interactions while defining the parameters for configurability. As depicted in Figure 1, architecture encompasses three key aspects: the modular decomposition of the system, the functionality of each module, and the interactions between them. The overall qualities of the system, such as performance, maintainability, extensibility, and usability, arise from the specific arrangement and composition of these modules.

Figure 1. An illustration of a simple software architecture.

"Modular" refers to architectural designs that enhance development methodologies and manage complexity effectively A fundamental characteristic of modular architectures is strong cohesion, which signifies high internal dependencies within modules, paired with weak coupling, indicating low dependencies between modules Over time, additional beneficial attributes of modular architectures have gained widespread acceptance.

Modular architectures are typically organized in a hierarchical structure, where larger modules are made up of smaller, finer-grain modules. This design allows for varying levels of system granularity, balancing the simplicity of a coarse-grain view with the detailed implementation of fine-grain modules. Consequently, the cohesion among modules tends to be stronger at the lower levels of the hierarchy compared to the upper levels.

Figure 2. An illustration of hierarchical decomposition: a coarser grain (few modules) is easier to understand, while a finer grain (small modules) is easier to implement.

Software architecture has interesting parallels in the design of human organizations [Lan00, Lan92, Bal97, San96]. The principles of modularity can be applied there as well.

Interfaces and APIs

The interaction between modules centers on their interfaces, which define how other modules can utilize them. Specifically, an interface outlines a set of atomic actions along with their associated data parameters and return values. Additionally, it includes protocols, which are combinations of actions needed to achieve specific objectives, with multiple protocols potentially sharing the same action.

The interface serves to guide module developers on necessary implementations, with each action functioning as an operation on internal data and often necessitating interactions with other modules. It is crucial that the interface conceals irrelevant internal details, allowing for modifications without creating dependencies on other modules. This encapsulation prevents the circumvention of the interface, thereby avoiding unnecessary or unintended dependencies.

An interface meant to accept a broad and open class of extensions—modules that are added later, following deployment—is called an application programming interface (API).
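A hedged sketch of this vocabulary, with every name invented: the interface below declares three atomic actions, each with its data parameters and return value, and the function after it is one protocol, that is, one particular combination of those actions that achieves an objective.

    from typing import Protocol

    class FileLike(Protocol):
        """Interface: three atomic actions with parameters and return values."""
        def open(self, name: str) -> None: ...
        def write(self, data: bytes) -> int: ...
        def close(self) -> None: ...

    def write_all(dst: FileLike, chunks: list[bytes]) -> None:
        """One protocol built from the actions: open, then write, then close."""
        dst.open("out.bin")
        for chunk in chunks:
            dst.write(chunk)
        dst.close()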

Achieving composability

Modular software development involves two main approaches: decomposition and composition. Decomposition defines modules based on the necessary system functionality, while composition achieves functionality by combining pre-existing modules. The concept of composition is central to component software, as elaborated in Section 6.5.

Architecture and development prioritize the creation and implementation of modular components that can be composed later. This process, known as emergence, adds significant value during the development phase of the supply chain. While essential, achieving composability is challenging; it is notably simpler through top-down decomposition than bottom-up composition. Successful composability necessitates two key properties: interoperability and complementarity.

For effective communication between two modules, three essential requirements must be fulfilled: a communication infrastructure for the physical transfer of data, a mutually agreed-upon protocol for initiating and concluding communication, and a shared encoding method for the transmitted messages. When these criteria are met, the modules are considered interoperable.

Interoperability alone does not guarantee effective communication; for meaningful interaction, modules must complement each other in their functions and capabilities. For instance, a facsimile machine and a telephone answering machine can interoperate through a network, but their lack of complementary features renders them ineffective together. When modules are both interoperable and complementary for a specific purpose, they are considered composable, providing enhanced value as their combined functionality surpasses that of individual components. A prime example of this is the relationship between web browsers and servers, which demonstrate interoperability, complementarity, and composability. Additionally, usability can be viewed as a form of composability between users and software applications.

Software as a plan and factory

The nature of software as a good raises questions about its classification; is it akin to information goods in the "new economy" or material goods in the "industrial economy"? Unlike information, software is valued for its functionality rather than its informational content. Similar to material goods, such as automobiles that serve a practical purpose, software may share characteristics with traditional engineering products from the user's perspective. However, from the development standpoint, a distinct property sets software apart, highlighting its unique position in the market.

Software can be compared to a physical product or machine, where it consists of predefined modules that interact to fulfill a specific purpose, similar to how parts of a machine work together. This perspective suggests that, much like the industrial revolution transformed manufacturing, we could significantly enhance software development by constructing applications from standard, reusable components.

The notion that the modules in an executing program are pre-defined is a misconception. In reality, a diverse array of modules is generated dynamically during execution, tailored to the specific needs identified at that moment. For instance, a word processor can create millions of modules based on the content of the document being processed. Programmers establish the available modules and outline a detailed plan for their dynamic creation and interaction during execution to fulfill overarching objectives.
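An invented miniature of the word-processor example: the programmer defines which kinds of modules are available, but how many are created, and with what content, is decided by the document at run time.

    class Paragraph:
        """One kind of module the programmer makes available in advance."""
        def __init__(self, text):
            self.text = text

        def render(self):
            return self.text.capitalize() + "."

    document = "alpha beta gamma delta"                  # the input decides
    modules = [Paragraph(w) for w in document.split()]   # how many modules exist
    print(len(modules), "modules created at run time")   # -> 4 modules ...
    print(modules[0].render())                           # -> Alpha.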

Programming is analogous to creating a plan for a very flexible factory in the industrial economy.

At execution, programs function as versatile factories that produce a diverse range of intangible artifacts on demand, assembling them to fulfill greater objectives. Unlike hardware products, programs are akin to flexible factories for hardware components. The raw materials for these factories are represented by reusable IT resources, including instruction cycles, storage capacity, and communication bandwidth.

Software products can be likened to a blueprint for a highly adaptable factory, representing the supply side, while on the demand side, they resemble the tangible products produced by that factory. This blueprint serves as a form of information, characterized by high initial creation costs and low reproduction costs, guiding the factory's operations (the executing program) rather than directly informing the consumer.

Many engineering fields face challenges in developing systematic methods for creating new, particularly flexible factories. The notion that software engineering lags behind more established engineering disciplines is, therefore, an overstatement.

Impact of the network

The rise of the Internet has significantly transformed software development, enabling the creation of distributed applications that run on various computers and interact over networks. These applications, which can function across diverse platforms, cater to a broader audience and provide enhanced value through network effects. While portability was previously beneficial for expanding market reach, it has become essential in the interconnected landscape. However, this networked environment also complicates interoperability, as modules from different vendors or operating in varied administrative settings may face challenges in coordinated decision-making.

The network presents a significant opportunity by enabling software programs to be transmitted alongside information, as both can be represented as data. This creates an appealing distribution channel characterized by low costs and minimal delays.

Mobile code revolutionizes software usage by allowing programs to be transported and executed on computers without the need for pre-installation. This approach enhances user experience by operating transparently and overcoming interoperability challenges between different software versions. By executing applications on demand, mobile code can optimize performance by running programs closer to the user or utilizing available resources effectively, thereby improving responsiveness and efficiency.

Mobile agents facilitate the dynamic movement of programs between processors during execution, allowing them to carry both code and data. This technology has valuable applications in information access and negotiation; however, it also presents significant security and privacy challenges that must be addressed.

The multi-stage translation process is crucial for networked software distribution and mobile code, as distributing source code is often impractical for business reasons. While native object code poses challenges due to the diversity of platforms, an intermediate form of object code serves as an ideal solution for software distribution, depending on compatible interpreters available on each platform.

Standardization

An open industry standard is a widely accepted and well-documented set of specifications that is freely accessible and typically free from burdensome intellectual property restrictions. Unlike proprietary specifications, open standards facilitate interoperability among software from different vendors, which is crucial in a diverse platform environment. This makes standards processes vital for collaborative development in the networked software industry. Additionally, users and managers favor open standards as they enable the integration of various products, fostering competition and specialization within the industry, ultimately leading to enhanced availability, cost-effectiveness, and quality.

The primary goal of standards in software interfaces is to facilitate interoperability among modules created by different vendors. This begins with defining a reference model, which outlines the system's decomposition into typical modules, focusing on aspects relevant to the standard. The standards process subsequently specifies the functionality and interfaces of these modules to ensure composability. Additionally, standards often target data representation for common information types, like HTML for web documents and MPEG for video. De facto standards emerge from market forces and represent widely adopted interfaces or data formats.

Standards play a crucial role in addressing significant challenges in software engineering, particularly in managing the complexity of interfaces between modules. While theoretically a new interface can be created for every module combination, limiting the number of interfaces is essential to minimize development and maintenance costs. Additionally, the open world problem complicates matters, as new modules may emerge after the initial system design, making it impractical to create a comprehensive set of proprietary interfaces. To ensure interoperability, it is vital to standardize interfaces, their functionalities, system decompositions, and data representations. Anticipated needs can be addressed by standardization bodies, allowing for preemptive standards that multiple vendors can adopt. However, past attempts at standardization have often failed or been sluggish, particularly when venturing into uncharted territory, prompting the need for new standardization processes that integrate research efforts, exemplified by organizations like the Internet Engineering Task Force (IETF).
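The complexity argument can be made concrete with a standard counting exercise (the arithmetic is ours, not the paper's). With proprietary pairwise interfaces, n modules require up to n(n-1)/2 distinct interfaces, growing quadratically; with a single standardized interface, each module needs only one implementation, so the count grows linearly:

    n = 10 modules:    45 pairwise interfaces vs. 10 standard implementations
    n = 50 modules: 1,225 pairwise interfaces vs. 50 standard implementations

Under the open world problem the gap widens further: each newly released module adds n new pairwise interfaces but only one new implementation of the standard.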

Layering is a standardization approach that enables incremental development and continuous enhancement of standards, as exemplified by the IETF. The foundational layer, known as wiring or plumbing standards, focuses on establishing basic connection-level standards.

Just like wiring or plumbing, creating connections at this level can lead to meaningless or even detrimental outcomes during composition. To address this, standardization can be gradually implemented, allowing for increasingly sophisticated rules of interoperation and composability to be established.

Software presents severe management challenges, some of them relating directly to the software, and some relating to the organizational context of the software application.

The supplier value chain from software vendor to user consists of four key stages: development, provisioning, operation, and use. Each stage presents unique management challenges and opportunities for value creation. The development stage encompasses initial design, implementation, and ongoing software maintenance and upgrades. In the provisioning stage, necessary facilities such as networks, servers, and PCs are acquired and set up based on performance needs, followed by software installation, integration, and testing. The operations stage focuses on ensuring the application and its infrastructure run reliably and securely. Finally, the use stage delivers direct value to users and end-user organizations through the application's functionality.

Table 2. Stages of the supplier value chain (rows) vs. generic tasks (columns).

The generic tasks are planning, deployment, facilitation, maintenance, and evolution. Representative entries include build systems and software tool support, as well as installation, integration, configuration, and testing.

Four value chains

There are two distinct types of software, and hence in reality two supplier value chains.

Application software delivers targeted functionalities to meet end-user needs, while infrastructure offers foundational capabilities utilized by various applications. This infrastructure encompasses both hardware—such as computers, peripherals, and communication links—and software, including operating systems and middleware. Although infrastructure itself does not offer direct value to users, it is crucial for the effective operation of applications.

Figure 3. Three value chains in the software industry.

The separation of application and infrastructure creates two distinct supplier value chains and one user requirements value chain. Infrastructure development, provisioning, and operation indirectly enhance application functionality, making them easier to manage. In contrast, application development directly delivers value to users. User requirements play a crucial role in shaping application development by establishing criteria that align with user needs, although they do not directly influence infrastructure, which supports multiple applications. Additionally, the combined needs of various applications contribute to a secondary requirements value chain.

The stages of the supply value chain

Development

Software developers hold ongoing responsibilities throughout the product lifecycle, providing essential support for inquiries and troubleshooting. Their role includes continuous maintenance through service packs and patches to address reported issues. Additionally, software requires regular upgrades, which often involve significant reprogramming to rectify flaws, adapt to changing requirements, and introduce new features.

Provisioning

Provisioning encompasses the selection, negotiation, and acquisition of essential facilities, including equipment and communication links, along with the necessary software to support an application and its infrastructure. This process often involves a design component, ensuring that the facilities are appropriately sized to fulfill the performance requirements of users.

Provisioning also encompasses the installation and testing of equipment and software, ensuring they meet specific functionality and performance standards. Typically, the communication and information processing components have distinct provisioning processes. These elements can be outsourced to specialized firms that offer systems integration services.

Operations

Effective software system operation necessitates ongoing attention to various factors, including adjustments to authorization levels due to organizational and personnel changes. Security remains a critical concern, requiring the installation of patches to address vulnerabilities. These responsibilities fall under system administration. Additionally, system management involves reconfiguring facilities to adapt to evolving organizational needs and workload distributions, serving as an extension of the provisioning phase. Together, system administration and management are essential for enhancing an application's overall effectiveness and efficiency.

Use

End-user organizations play a crucial role in supporting individuals who utilize applications. During the planning phase, key considerations include the alignment of business processes with the application being used. A significant decision arises between developing a custom application tailored to specific business needs or opting for a commercial off-the-shelf (COTS) solution, which may require adjustments to fit predefined design assumptions. COTS vendors strive to enhance configurability and parameterization, offering flexibility while also demanding substantial effort during the provisioning phase.

Before the operations phase begins, it is essential to train workers on the effective use of the application and other business processes. Additionally, during operations, organizations must support users by offering assistance and establishing a dedicated helpdesk for addressing any issues that may arise.

Total cost of ownership

Managers must carefully evaluate the total cost of ownership (TCO) of an application, which encompasses provisioning, operational expenses, and user support. Additionally, TCO may factor in development and maintenance costs for internally developed applications. It is essential to include imputed costs for user responsibilities, such as administering personal desktop computers or providing training and assistance to peers, to gain a comprehensive understanding of TCO.
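As a minimal sketch of such a tally (every figure is invented for illustration), TCO is simply the sum over all the cost categories listed above, including the easily forgotten imputed ones.

    annual_costs = {
        "provisioning (licenses, hardware)": 120_000,
        "operations (administration, security)": 90_000,
        "user support and training": 60_000,
        "imputed user self-administration": 45_000,  # often omitted, per the text
    }
    tco = sum(annual_costs.values())
    print(f"annual TCO: ${tco:,}")  # -> annual TCO: $315,000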

As total cost of ownership increasingly impacts organizational budgets, reducing TCO has become a priority for managers and suppliers. This drive for cost reduction places pressure on vendors to enhance the administration and management of applications and infrastructure while simplifying training and support. Notably, the high costs associated with managing desktop computers have led to a trend towards centralization, shifting management from individual desktops (clients) to centralized servers. This approach minimizes the number of computers that need administration, reminiscent of earlier mainframe systems but with modern adaptations. Thin clients, which rely on dynamically loaded mobile code, represent one extreme, while rich clients offer local customizability supported by centralized management. Most organizations utilize a blend of both thin and rich clients to accommodate diverse job requirements.

The legal system is crucial in defining and enforcing software property rights, while government regulation is being considered to tackle various software-related issues, including privacy concerns, access control—especially for children—and the regulation of encryption usage.

Copyright

Software can be easily replicated, making unauthorized copying and distribution trivially easy. While security measures aim to prevent piracy, they often hinder usability and face significant customer pushback. To effectively combat software piracy—defined as the large-scale unauthorized manufacture and sale of software—social constructs such as established ethics, legal restrictions, and active law enforcement are essential. These measures not only deter piracy but also promote substantial investments in software development.

Copyright safeguards original software creations by granting exclusive rights to the creator, including the ability to sell or license the work, while preventing unauthorized replication or sale by others. However, it does not stop individuals from independently creating similar software inspired by the same concepts or objectives. Additionally, the original developer retains control over derivative works, such as updates or new versions.

Software licensing typically grants rights to the individual or entity that provisions and operates the software, with terms that can vary widely regarding usage, payment, and distribution. Due to low replication costs, unique licensing arrangements can be economically feasible. Freeware allows users to replicate and distribute software at no cost, while shareware is free but requests voluntary payment for productive use. Copyleft promotes the creation of derivative works, stipulating that these must also be freeware or adhere to copyleft principles.

Copyrights safeguard the property rights of both source and object code, although it is primarily object code that is distributed and licensed. The use of object code complicates reverse engineering, making it harder to uncover proprietary trade secrets. Additionally, object code is significantly more challenging to modify, which is crucial since customer alterations could void warranties and hinder customer support. Furthermore, distributing object code helps encapsulate implementation details, thereby minimizing unnecessary dependencies when integrating with other software.

Open source is a form of freeware in which source rather than object code is released.

Associated with open source is usually an informal (but coordinated) group of dedicated volunteers who maintain and upgrade it.

Patents

A patent provides exclusive rights for a limited time to produce, utilize, or market products that incorporate an invention, defined as a novel, non-obvious, and practically useful idea. Unlike copyrights, patent holders can prevent others from using their inventions, even if those individuals arrive at the same idea independently. Recently, patents have been extended to software and business processes, which frequently form the foundation of software applications.

Inventions inherently reveal themselves through use, and because they are non-rival, others can freely copy and exploit them, which discourages investment in the absence of property rights. Patents play a crucial role by incentivizing research and development through guaranteed exclusivity, while simultaneously making inventions public, enabling others to build on them even when the patent holder does not actively exploit them.

The software industry's organization is heavily influenced by technology, processes, and value, which shape the relationships between cooperating and competing firms. Ownership and the ability to generate monetary value are crucial incentives for software production, making a supportive industrial and societal structure essential for effective output. The interplay between software architecture and industrial organization is significant, as firm boundaries are defined by specific module interfaces. Additionally, the value chain stages create distinct business functions with separate ownership, which can be further divided into specific job roles or combined into unified businesses.

Industrial organization

Industrial organization can be understood as the division of the value chain into separate companies, each focusing on individual units of value that enhance internal synergies and leverage shared expertise. This approach leads to the formation of distinct businesses, as depicted in the partitioning of the value chain illustrated in the accompanying figures.

Application software suppliers often integrate analysis and development functions while collaborating with end-user organizations to identify their specific requirements. Likewise, infrastructure software suppliers need to be aware of the diverse requirements set by various applications.

A system integrator specializes in provisioning by acquiring software from multiple application and infrastructure suppliers, ensuring compatibility, and overseeing installation and testing processes. This role often involves programming tasks. Additionally, consultants play a crucial part in helping end-user organizations adapt their business processes and configure the software to meet specific needs.

An application service provider (ASP) specializes in licensing and operating applications, while an infrastructure service provider (ISP) focuses on acquiring and managing the necessary hardware and software infrastructure, including computers, storage, networks, and operating systems.

Different configurations of the value chain can be established, with a common approach being an end-user organization that manages one or both service provider functions within its information systems (IS) department. Larger organizations may develop some applications internally and act as their own systems integrator. Other typical arrangements include an infrastructure service provider that manages its own systems integration and a service provider that oversees both application and infrastructure services. The ISP function can be fragmented among multiple companies, such as separate providers for backbone networks, processing, and storage, or shared between the end-user organization and a service provider. Additionally, a software developer may transition to an application service provider, altering its relationship with end-users from licensing object code to offering application services on a subscription basis.

Figure 4. Natural business partitioning of the value chain.

The information content supplier plays a crucial role in applications by manipulating and presenting data to users. This information can originate from independent sources, such as stock analysts who share insights on company prospects with users of stock brokerage applications.

The industrial landscape is increasingly shifting towards complex organizations where applications are formed from multiple unbundled modules that communicate over the network. A prime example of this is Web services, where one Web-based application directly utilizes capabilities from another through the Web infrastructure. In this scenario, the 'customer' is another software component rather than the end-user, positioning the first application as an intermediary that enhances user experience by providing added value through customization, aggregation, filtering, and consolidation.

Business relationships

Types of customers

In today's digital landscape, nearly all organizations and a significant number of individuals are either customers of software vendors or software developers themselves. Users fall into four main categories: individuals who license applications for personal productivity, collaboration, information access, and entertainment; organizations that license, purchase, or develop software to enhance their internal operations and external business relationships; original equipment manufacturers (OEMs) that embed software within the products they manufacture; and software applications that serve as customers to other software, exemplified by Web services.

Software distribution

Software is typically delivered to customers in the form of object code, which is encoded in a binary format This object code can be distributed via various methods, including magnetic or optical media, as well as through network transfers.

The network has become a crucial distribution channel for software, offering a cost-effective and timely method to bypass traditional distribution chains This approach facilitates the frequent release of maintenance updates and new software versions, allowing for greater flexibility in application management with reduced concerns about incompatibility Nevertheless, intermediaries still play a vital role in the distribution process, especially when navigating numerous alternative suppliers or when there is a need for product integration and bundling.

Before a customer can execute software, it must be provisioned in one of four main ways. First, the software may be pre-installed in equipment (an appliance) before it is sold to the end-user. Second, the customer may install the software, participating actively in the process (user self-provisioning). Third, the software may be downloaded over the network and run without an installation step (mobile code). Lastly, the customer may use software that runs remotely and is operated by an application service provider (ASP). From the customer's viewpoint, these methods are largely similar, with the notable exception of the self-installation process.

For user-installed or mobile software, a traditional productization and marketing business model is suitable. In contrast, embedded and ASP/ISP-operated software is acquired and provisioned by OEMs or service providers. One key distinction lies in the scale and sophistication of the buyers: there are fewer, more sophisticated OEMs and service providers than end-users. Another is who decides: a third party (the OEM or service provider) decides on behalf of end-users whether to adopt software updates or competitive offerings.

In practice, these distribution models often blend. Appliances can receive automatic upgrades via the network; an ASP can use mobile code to improve interactivity by relocating part of the execution closer to the end-user; and an ASP may also require the user to install complementary software.

Software pricing

In designing pricing models there are many alternatives and challenges to consider, as discussed in Section 7.3. Nevertheless, the industry follows certain standard practices that vary with the distribution model. User-installed software, for instance, is typically sold at a fixed price, similar to traditional products. This approach makes selling new releases important to maintaining a consistent revenue stream, particularly after high market penetration is achieved; consequently, the supplier's own installed base becomes its most significant competitor.

OEM and service-provider models introduce intermediaries into the value chain, necessitating dual pricing strategies: one from the software supplier to the intermediary and another from the intermediary to the end-user Software suppliers often adopt user-installed pricing models based on end-user adoption rates, which can include fixed pricing per appliance sold or pricing proportional to the number of customers an Application Service Provider (ASP) attracts Additionally, ASPs benefit from the ability to easily measure various usage metrics, such as the total number of completed transactions, allowing for more flexible pricing options.

ASP pricing for end-users typically follows traditional service-industry models, primarily subscription, pay-per-use, and cross-subsidy. The subscription model allows use of the service for a specified period, with either limited or unlimited capacity. The pay-per-use model meters actual usage and bills accordingly. The cross-subsidy model recovers the cost of providing the service (possibly incurring a loss or a profit margin) by attaching it to a technically unrelated revenue source, such as advertising or bundling with another paid service.
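
To make the three models concrete, here is a minimal sketch in Python; the rates, fees, and function names are invented for illustration and are not taken from the paper.

    # Illustrative ASP billing models; all rates and figures are invented.

    def subscription_charge(months, monthly_fee=50.0):
        """Flat fee per period, regardless of usage."""
        return months * monthly_fee

    def pay_per_use_charge(transactions, rate=0.05):
        """Metered billing: the charge tracks actual usage."""
        return transactions * rate

    def cross_subsidy_shortfall(cost_to_serve, ad_revenue):
        """Service is free to the user; costs are recovered (or not)
        from a technically unrelated source such as advertising."""
        return max(0.0, cost_to_serve - ad_revenue)

    print(subscription_charge(12))            # 600.0
    print(pay_per_use_charge(8000))           # 400.0
    print(cross_subsidy_shortfall(300, 450))  # 0.0 -- ads cover the cost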

The superdistribution model promotes the sharing of useful software modules by allowing anyone to distribute them, while incorporating a payment mechanism that ensures compensation flows to the original owner based on usage metrics.

Acquiring applications

An end-user organization acquiring software for internal use can make, buy, license, or subscribe. Each option presents distinct benefits and drawbacks that the organization must weigh.

The make option keeps all development stages in-house, which maximizes competitive differentiation but forgoes cost-sharing with other organizations and introduces significant delays and risks. The buy option outsources development to a specialized firm, with the end-user organization typically acquiring ownership of the source code, while maintenance and upgrades may fall to either the developer or the user. Both approaches allow business processes, organizations, and software to be designed together, fostering efficiency and competitive advantage.

In the licensing model, end-users obtain existing software from suppliers, while in the subscription model they purchase application services directly from an ASP. Both options are typically cost-effective but offer limited differentiation from competitors. In these scenarios the software often dictates business processes and organizational structures, and consultants are frequently engaged to facilitate the necessary adjustments.

Acquiring infrastructure

Application software can migrate into the infrastructure category when it becomes widely used and incorporated into other applications. For instance, the Web, initially an application for scholarly information access, has become a crucial infrastructure for e-commerce and many other applications. Similarly, productivity application suites can serve as infrastructure for custom applications needing word processing and spreadsheet capabilities. Some infrastructure is also developed specifically for a class of applications, or for universal application use.

Infrastructure can be categorized into general platforms and systems specialized to particular applications, and supplier firms focusing on different infrastructure types employ distinct business models. Some infrastructure is created specifically to ease application development and reduce its cost, and is bundled and licensed together with the application. This allows the infrastructure development cost to be subsidized by application revenue, but it depends on platform support to prevent redundant or conflicting installations of the same infrastructure software.

A platform that supports multiple applications is typically sold separately, in part because substantial resources are required to deploy it and a single licensing and installation process is preferable. Separate applications are then developed to realize higher-level functionalities, such as information sharing. A key advantage of a platform is the interoperability it provides among applications, which would not be feasible if each application were bundled with its own infrastructure.

Some infrastructure is neither made nor bought outright, but licensed or subscribed to; wide-area networking and communication services are prime examples. The subscription model is favored there because it is impractical for organizations to deploy their own communication links, and public networks provide extensive connectivity that would be difficult to replicate with dedicated facilities. As application subscription grows, subscription-based infrastructure offerings can be expected to increase in both variety and popularity. A notable example is caching, which enhances information-distribution performance by placing temporary storage closer to end-users.
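
As a rough sketch of the caching idea (Python; the time-to-live policy and the fetch callback are invented for illustration), a cache serves repeated requests locally and returns to the origin only when its copy is missing or stale:

    import time

    class Cache:
        """Temporary storage placed near the user to speed distribution."""
        def __init__(self, ttl_seconds=60):
            self.ttl = ttl_seconds
            self.store = {}  # url -> (content, time fetched)

        def get(self, url, fetch):
            entry = self.store.get(url)
            if entry and time.time() - entry[1] < self.ttl:
                return entry[0]          # fresh local copy: no wide-area traffic
            content = fetch(url)         # miss or stale: contact the origin
            self.store[url] = (content, time.time())
            return content

    cache = Cache(ttl_seconds=30)
    page = cache.get("http://example.com/report", fetch=lambda u: "...body...")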

Vertical heterogeneity

The software industry exhibits both vertical and horizontal heterogeneity, and both influence its structure and business relationships. Vertical heterogeneity creates dependencies and complementarities among firms, such as the reliance of application software on infrastructure software. Within the infrastructure itself, these dependencies manifest as layering, which is closely connected to the layering of standards.

Layering is an architectural approach in which modules maintain a vertical relationship, each layer depending on the one below, forming a complementary structure that supports applications. The lower layers concern technology components such as processing, storage, and connectivity, offering common representations of information and standard infrastructure services. Above these are integrative layers that combine the services of the various technologies; at the top sit the application components and the applications themselves.

For a thriving ecosystem, a diversity of applications must coexist with a diversity of core technologies, permitting free market entry and innovation with minimal interdependence. The middle infrastructure layers serve this end by establishing common and universal representations and services: applications interact only with these shared elements, which can in turn be adapted to new technologies.

Figure 6. A separation of technological progress from applications: applications rest on a middle layer of common services, representations, and structures for information, which in turn rests on a diversity of processing, storage, and connectivity technologies.

The modern objective is for each layer to be a versatile, configurable platform able to support a wide range of applications; this goal has not yet been fully realized. Historically, the industry instead built vertical stovepipes, with separate infrastructures dedicated to distinct applications such as voice, data, and video distribution. That model hindered the emergence of diverse applications, because each new application type required the significant investment of a new infrastructure.

The shift from stovepipes to layering significantly alters industry dynamics: companies must accommodate all applications rather than operate in isolated application marketplaces. This fosters specialization in individual layers rather than in specific application classes, promoting horizontal rather than vertical integration. Consequently, no single company provides a fully integrated solution; suppliers compete at each layer, and customers (or systems integrators acting on their behalf) combine products to realize a complete solution.

Horizontal heterogeneity

Multiple platforms

Owing to history and industry competition, multiple operating system and processor platforms coexist, such as IBM mainframe, SUN/Solaris, Macintosh, and PC/Windows; this is horizontal heterogeneity. In the post-Internet era, applications are expected to operate across these diverse platforms, which emphasizes the importance of code portability and mobility, as outlined in Section 3.5. Similar heterogeneity exists in storage and networking: multiple object-relational database products compete in the market, and there is a strong push toward unified standards. Internet technologies are also evolving to support a broader array of applications, including high-quality voice and video, potentially subsuming the traditional telephone and video-distribution networks.

Shared responsibility

At the application layer there is heterogeneity introduced by the necessary partitioning of the application across hosts and across multiple organizations.

Distributed applications often span multiple end-user organizations, particularly in areas like supply-chain management and business-to-business e-commerce. This forces the organizations to share responsibility for planning, deployment, provisioning, and operation. Such applications frequently must also integrate with legacy systems within each organization, complicating shared ownership and operational control and raising numerous practical challenges.

Coordinated decision-making on platform adoption is often impractical, so alternative approaches are needed. One is to establish common standards and provide suitable conversions so that distinct platforms and legacy applications can interoperate; using XML to represent business documents is an example. Another is to employ an intermediary responsible for managing interoperability between the organizations, as with the common business-to-business e-commerce intermediary being formed in the automotive industry.
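
As an illustration of the first approach, a hypothetical intermediary might map each partner's in-house order record to an agreed XML representation. The element names and the record layout below are invented for illustration, not taken from any actual standard:

    import xml.etree.ElementTree as ET

    def to_common_xml(order):
        """Map one partner's native record to the shared XML form."""
        root = ET.Element("PurchaseOrder", id=order["po_number"])
        ET.SubElement(root, "Buyer").text = order["customer"]
        item = ET.SubElement(root, "Item", sku=order["part_id"])
        ET.SubElement(item, "Quantity").text = str(order["qty"])
        return ET.tostring(root, encoding="unicode")

    native = {"po_number": "PO-1001", "customer": "Acme",
              "part_id": "AX-7", "qty": 12}
    print(to_common_xml(native))
    # <PurchaseOrder id="PO-1001"><Buyer>Acme</Buyer>
    #   <Item sku="AX-7"><Quantity>12</Quantity></Item></PurchaseOrder>
    # (output shown wrapped here for readability)

Each organization then needs only one translation to and from the common representation, rather than one per business partner.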

Format and protocol conversions are far from the only issues in shared responsibility; complications arise in all aspects of provisioning and operations.

Distributed partitioning

When an application is distributed across multiple hosts, how to partition it must be considered, since distribution by itself adds no functionality. Networking an application can improve performance and scalability through concurrent execution on multiple hosts. Whereas a centralized application requires centralized administration, a distributed application allows partitioned control, which is vital for applications that cross organizational boundaries, such as business-to-business e-commerce. Distribution can also enhance security, by letting each organization keep tighter control over its critical data assets, addressing privacy and ownership concerns.

An industrial revolution?

Frameworks

A framework is essentially an architecture that is reusable across multiple applications. In essence, it is a pre-plan for the decomposition of an application, including interface specifications.

A framework can be customized by replacing the functionality within its modules, and extended by integrating new modules through established gateways. Because application needs are so diverse, the scope of any framework is inherently constrained: no single architecture can effectively accommodate a broad spectrum of applications.
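
A minimal sketch of these two mechanisms in Python (the class and method names are invented): the framework fixes the overall flow, a subclass replaces the functionality behind a declared slot, and new modules attach through a plug-in gateway.

    from abc import ABC, abstractmethod

    class ReportFramework(ABC):
        """Pre-planned decomposition: flow is fixed, parts are replaceable."""
        def __init__(self):
            self.plugins = []            # gateway where new modules attach

        def run(self, data):
            body = self.render(data)     # slot whose functionality is replaced
            for plugin in self.plugins:  # extension through the gateway
                body = plugin(body)
            return body

        @abstractmethod
        def render(self, data): ...

    class HtmlReport(ReportFramework):
        def render(self, data):
            return "<p>" + ", ".join(data) + "</p>"

    report = HtmlReport()
    report.plugins.append(str.upper)      # extend without touching the framework
    print(report.run(["alpha", "beta"]))  # <P>ALPHA, BETA</P>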

Components

One effective form of reuse is sharing infrastructure; another is composing preexisting modules. Software components are reusable modules designed to be composed into many different applications, reducing development time and cost.

Although the software community has seen many technologies, methodologies, and processes aimed at reuse, the consensus today is that component software is the most promising approach.

A well-designed component minimizes context dependencies: rather than relying on the presence of particular other modules, it offers flexible connection points that are configured for each deployment. Components thus retain their individual identity within a deployed application, so the application can be updated and extended by replacing or adding individual components. Traditional applications, by contrast, consist of executable or dynamically loadable modules with configuration and context details hard-wired in, so updates require replacing the whole.
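
The contrast can be sketched in a few lines of Python (the tax example and all figures are invented): the first class hard-wires its context, while the second exposes the dependency as an explicit connection point supplied at composition time.

    CA_RATE = 0.0725   # invented figure

    class OrderModule:
        """Hard-wired context: the tax rule is baked in, so replacing
        it means replacing the module itself."""
        def total(self, items):
            return sum(items) * (1 + CA_RATE)

    class OrderComponent:
        """Explicit connection point: the tax rule is supplied when the
        component is composed, so it can be swapped per deployment."""
        def __init__(self, tax_rate):
            self.tax_rate = tax_rate

        def total(self, items):
            return sum(items) * (1 + self.tax_rate)

    billing = OrderComponent(tax_rate=0.05)   # configured, not hard-coded
    print(billing.total([100.0, 40.0]))       # 147.0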

Component-based architectures emphasize one aspect of modularity above all: weak coupling among components, which enables independent development and enhances composability. Strong cohesion within a component is less critical, in part because components can themselves be hierarchically decomposed to aid implementation.

Figure 7. Component frameworks separate dimensions of evolution: components build on common services, representations, and structures for information, which in turn build on a diversity of processing, storage, and connectivity technologies.

Component-based systems are inherently modular, and unlike modules in general, components are meant to be reused in multiple contexts. This makes them harder to design than traditional modular systems: by loosening the contextual constraints on components, a vast combinatorial space of configurations emerges, growing without bound as the number of components grows. Maintaining quality in such a space is a significant and largely unsolved challenge. Partly as a result, a market for software components began to develop only in the mid-1990s, and the prevailing approach still resembles a craft (modifying nearly suitable modules) rather than engineering (acquiring components that fit a reference model without modification).

Component assembly is the process of selecting, configuring, and integrating components to build applications. Assembling components peer-to-peer does not scale, because the number of dependencies grows rapidly as an application grows. To manage this complexity, a component framework can gather related component connections and configurations into a hierarchical structure, creating more manageable modules. An operating system is an example of a component framework: it integrates device-driver components below and application components above.
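
The scaling argument can be made concrete with a rough count (ignoring the possibility of multiple interfaces per pair): wiring n components peer-to-peer admits up to n(n-1)/2 pairwise connections to negotiate and maintain, while a mediating framework reduces this to n attachments.

    def peer_to_peer_links(n):
        return n * (n - 1) // 2   # every pair may need an agreed connection

    def framework_links(n):
        return n                  # each component attaches only to the framework

    for n in (10, 100, 1000):
        print(n, peer_to_peer_links(n), framework_links(n))
    # 10 45 10
    # 100 4950 100
    # 1000 499500 1000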

Component frameworks can themselves function as components, establishing a hierarchy within applications. For instance, an OS-hosted application becomes a framework when it accepts plug-ins, which are components by another name. While such frameworks may resemble traditional layers, they differ in that they actively invoke the components above them. Component frameworks thus recursively generalize the distinction between application and infrastructure.

Although most software systems are not primarily assembled from components, many hardware systems are, and it is often the customized software that differentiates them. Emerging component-software technologies and methodologies may bring about an industrial revolution in software, in which externally acquired components are routinely integrated into solutions. Even then, internally developed custom modules can be expected to remain a crucial source of differentiation.

Microeconomics offers valuable insights into business dynamics and strategies in the software industry. This section builds on the earlier discussion to highlight key economic characteristics of software. A main objective of this paper, however, is to encourage further research into the economic and business aspects of software; the analysis presented here is neither exhaustive nor conclusive.

Software economics diverges from traditional supply-demand analysis because of software's low replication and distribution costs and the non-rivalrous nature of its use. This makes it especially important to understand the incentives to invest in software development and the means by which suppliers can extract economic value from those investments.

Demand

Network effects vs. software category

Network effects have a strong influence on the software industry, but there are considerable differences among different types of software.

Before the Internet, platforms could coexist while competing for the same customers, with market share driven primarily by attracting application developers: a secondary network effect. Platforms also tended to segment the market, with mainframes serving back-office applications, PCs personal productivity, and UNIX servers scientific and departmental business needs. Post-Internet, the collective platform increasingly encompasses all computer systems, networks, and an expanding middleware layer. Infrastructure suppliers must therefore position themselves within an ecosystem of both complementary and competitive suppliers.

Prospective infrastructure solutions face two interconnected network effects. They must build a large user community to be valuable to each member, and their appeal to end-users depends on attracting a robust ecosystem of applications, while application developers are only inclined to build on infrastructure that already has significant market penetration.

Overcoming these two obstacles is difficult, but there are pathways. As with the Internet itself, initial infrastructure solutions may arise in the research community, attract experimental applications, and later move to the commercial marketplace. Alternatively, innovative capabilities may first be bundled into successful applications and later spun off as a distinct infrastructure product category.

Network effects also challenge certain application categories, though applications built on the client-server model experience relatively weak ones: the first client of a new server-based application gains full value. This has eased the success of the ASP model, since extensive software need not be installed on many clients. Peer-to-peer applications such as video conferencing, faxing, and instant messaging face stronger network effects, because clients must interoperate directly. Even so, there have been notable successes, since the necessary software can be distributed over the network with relative ease.

Lock-in

Switching costs deter customers from changing products, and thereby hinder competing suppliers seeking to attract them. A complete application depends on various complementary products, including application components and supporting infrastructure software, as well as intangibles such as employee training and established administration and operations. Switching to a new software vendor typically means replacing these complementary assets and retraining personnel. Lock-in thus imposes on a competitor's product a negative value equal to the switching costs, which the competitor must somehow overcome.
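
The arithmetic is simple in a stylized model (the figures are invented, and real switching decisions weigh many more factors): a functionally identical competitor must discount by at least the customer's switching cost before the customer is indifferent.

    def max_entrant_price(incumbent_price, switching_cost):
        """Highest price at which switching just breaks even for the customer."""
        return incumbent_price - switching_cost

    # A $200 license protected by $150 of retraining and lost complementary
    # assets leaves an otherwise identical entrant at most $50 of headroom.
    print(max_entrant_price(200.0, 150.0))   # 50.0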

Open standards appeal to customers because they allow products from different vendors to be mixed and reduce switching costs. But lock-in runs deeper, particularly for business applications entwined with specific organizational processes and structures: changing vendors may require extensive reengineering, restructuring, employee retraining, and possible disruption, all of which add to switching costs. Competitive suppliers may offer subsidies to offset customers' switching costs; even then, the incumbent retains an advantage, since a switching customer gives up the accumulated value of its lock-in assets.

Layering in infrastructure effectively lowers switching costs, because new capabilities can be added without discarding existing ones. This makes layering an attractive strategy for advancing infrastructure; indeed, layering arises as a natural outcome of market forces in both the material and immaterial worlds.

Supply

Risk

The high initial costs of traditional software development are largely sunk and unrecoverable, and new applications face an adoption hurdle: software is an experience good that must be used firsthand to be appreciated. Suppliers can exploit low replication and distribution costs by offering free trials, but as free trials proliferate, user attention becomes the scarce resource and the tactic loses force. A portfolio diversification strategy, investing in multiple products with overlapping life cycles, therefore helps to manage risk.

Reusability

Advances in reusable software can lower suppliers' development costs and time to market while mitigating risk. Reusability is harder to achieve in software than in physical materials, owing to software's inherent nature. Nevertheless, viable component technologies have emerged since Microsoft COM shipped with OLE 2 in 1992, several markets have grown around them, and companies have formed to fill merchant, broker, and triage roles.

A critical challenge is managing trust and risk when integrating components from external suppliers. Effective warranty and insurance models are needed to address the risks involved, but the unique complexities of software require rethinking traditional warranty practice, liability law, and insurance; this issue is as vital as the technical challenges.

Competition

To maximize revenue from software, a supplier must enhance customer value while differentiating from competitors. Once software is released, however, it is difficult to prevent competitors from imitating it: patents are often easier to circumvent than in industries such as biotechnology, and copyright does not protect an application's features and design. Competitors can reproduce similar features independently in a "clean room" setting.

Fundamental deterrents to competitors extend beyond intellectual property. Enlightened competitors pursue differentiation rather than imitation, since profits are elusive in a market of undifferentiated players with economies of scale. Suppliers with significant market share can use limit pricing, exploiting entrants' high entry costs. Customer lock-in also favors the established supplier, since switching costs may force competitors to subsidize customer transitions through discounts or other incentives. To raise switching costs, suppliers often add proprietary features or improve interoperability with complementary products; competitors, conversely, can lower switching costs by providing translations or backward compatibility. This interplay is especially pronounced in software and is an area ripe for further research.

Dynamic Supply Chains

For material goods, suppliers and customers typically engage through long-term contracts or dynamic procurement in marketplaces. Software supply chains can achieve greater sophistication, becoming fully "self-aware", as demonstrated by superdistribution: components are exchanged freely, even directly among customers, while the components themselves initiate and enforce fair compensation through a micro-billing infrastructure.
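
A minimal sketch of the idea in Python (the billing service, account name, and per-call rate are all invented): a freely redistributable component meters its own invocations and reports them for micro-billing, no matter who passed it along.

    class MeteredComponent:
        """A component that enforces compensation for its owner."""
        def __init__(self, owner_account, rate_per_call=0.001):
            self.owner = owner_account
            self.rate = rate_per_call
            self.calls = 0

        def invoke(self, work, *args):
            self.calls += 1              # meter every use
            return work(*args)

        def settle(self, billing):
            """Push accrued micro-charges to the owner's account."""
            billing.transfer(to=self.owner, amount=self.calls * self.rate)
            self.calls = 0

    class DemoBilling:
        def transfer(self, to, amount):
            print(f"credit {amount:.3f} to {to}")

    comp = MeteredComponent("vendor-123")
    comp.invoke(sum, [1, 2, 3])
    comp.settle(DemoBilling())           # credit 0.001 to vendor-123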

Rapidly expanding markets

Rapidly expanding software markets, combined with low replication and distribution costs, pose particular challenges for suppliers. In such markets the traditional lock-in analysis loses force: capturing a large share of today's customers does not guarantee long-term success, because a new competitor can quickly attract a substantial share of new customers with a better price-value proposition, attractive bundles, or superior integration, particularly when growth is brisk and network effects are weak.

Initial strength in a new technology such as software confers a first-mover advantage, but not long-term dominance. To exploit the advantage, a company must innovate rapidly and continuously, since the initial customer base quickly becomes a small fraction of the market.

Pricing

Value pricing and versioning

For products differentiated from the competition, the most effective supplier strategy is value pricing: setting prices according to customers' willingness to pay. The challenge is that willingness to pay varies greatly among customers, so maximizing revenue under value pricing requires price discrimination.

Among price-discrimination strategies, versioning is especially relevant to software: the supplier offers a portfolio of versions differing in features, quality, and performance, and customers self-select according to their willingness to pay. Often a basic version is given away free, either for a limited trial period or indefinitely, at minimal cost to the supplier. The tactic introduces customers to the product in the hope that they will upgrade to paid versions or buy complementary products; a prime example on the Web is giving browsers away while selling value-added servers.

Some business models offer more flexibility for price discrimination than others. Fixed pricing of "shrink-wrapped" software limits price-discrimination opportunities, while custom-developed applications with individually negotiated licenses can take account of factors such as usage and impact. The ASP model is appealing for its pricing flexibility, allowing adjustments based on willingness to pay, usage, and context.

Variable pricing

Willingness to pay is influenced by factors, such as usage and quality, that vary over time even for a single user. This suggests pricing that is variable rather than fixed, for example tracking usage, but implementing variable pricing raises practical challenges. Direct monitoring of usage is uncommon; suppliers instead estimate it, pricing by the number of "seats" or installations rather than by actual use.

A floating license bases the price on the maximum number of simultaneous users, rather than on total seats or individual users. With the wide availability of the Internet, tracking usage has become more straightforward, and the ASP model inherently supports such monitoring.
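
A floating-license server can be sketched in a few lines (Python; a toy in-process stand-in for what is normally a networked service):

    class FloatingLicenseServer:
        """Admit at most max_seats simultaneous users, whoever they are."""
        def __init__(self, max_seats):
            self.max_seats = max_seats
            self.checked_out = set()

        def acquire(self, user):
            if len(self.checked_out) >= self.max_seats:
                return False             # all floating seats busy; try later
            self.checked_out.add(user)
            return True

        def release(self, user):
            self.checked_out.discard(user)

    server = FloatingLicenseServer(max_seats=2)
    print(server.acquire("ann"), server.acquire("bob"), server.acquire("carol"))
    # True True False -- the third simultaneous user must wait
    server.release("ann")
    print(server.acquire("carol"))       # True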

A fundamentally different approach is to base pricing on direct impact rather than usage, such as per-transaction pricing for ASP e-commerce intermediaries.

The business model and pricing strategy shape the supplier's incentives and investment targets. A usage-based model, for instance, yields a revenue stream that continues regardless of product updates, relieving the pressure to enhance functionality except insofar as enhancements drive increased usage.

Bundling

Bundling products can reduce the dispersion in customers' willingness to pay, leading to simpler pricing and increased total revenue. In software there is an added benefit when the bundled components are complementary and composable, which enhances the value of the bundle as a whole.
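
A standard two-customer illustration (the numbers are invented) shows the effect. Suppose customer A values a word processor at $120 and a spreadsheet at $100, while customer B values them at $100 and $120. Sold separately, the revenue-maximizing uniform price is $100 per product, for $400 in total; a $220 bundle sells to both customers, for $440.

    # Willingness to pay in dollars; all numbers invented for illustration.
    wtp = {"A": {"wp": 120, "ss": 100}, "B": {"wp": 100, "ss": 120}}

    # Separate sale at $100 each: every valuation clears the price.
    separate = sum(100 for c in wtp for p in wtp[c] if wtp[c][p] >= 100)

    # Bundle at $220: both customers value the pair at exactly 220.
    bundle = sum(220 for c in wtp if sum(wtp[c].values()) >= 220)

    print(separate, bundle)   # 400 440

Bundling works here because preferences are negatively correlated across the two products, so total valuations are less dispersed than individual ones.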

Third party revenue

Advertising, with its low marginal cost, is an effective way to derive revenue from third parties rather than users, particularly within the ASP subscription model, where advertisements can be targeted using the application context. Here the supplier is selling user attention to advertisers rather than value to users. Unlike advertisements in traditional media, those in networked media can be updated dynamically and typically carry hyperlinks to the advertiser's site, where extensive information awaits.

Evolution

To blunt competitors and generate consistent revenue, suppliers pursue continuous innovation, issuing new releases with enhanced quality, performance, and functionality. Maintenance upgrades may be free, while new versions generate revenue. Regular releases support ongoing maintenance and improvement, attract new customers, and discourage prospective entrants. Once a supplier holds substantial market share, the primary competition for each new release is the installed base of its own previous versions.

Being immaterial, software does not wear out, but it can become less suitable over time as user requirements and complementary products evolve. Software must therefore be upgraded for as long as it retains a viable user base. Upgrades carry risks: dropping support for older data formats can alienate some customers, as can failures to interoperate with existing complementary software.

Legacy software becomes a liability for its supplier as the user base shrinks to the point where continued investment in updates cannot be justified. Discontinuing new releases alienates remaining users, who are left with static and eventually unusable software. A component-based approach eases this transition, allowing old and new versions to coexist and older versions to be phased out gradually, giving customers flexibility in the timing of an upgrade.

Complementarity

Offering a diverse range of complementary products is a common strategy in various markets, as it helps mitigate risks, lowers sales and marketing expenses, and provides consumers with seamless systems integration along with a single point of contact for both sales and support.

Software suppliers depend significantly on complementary products from other suppliers, especially through layering. Each supplier seeks to differentiate its own offerings and mute its own competition, while preferring robust competition among its complementors, so that customers enjoy better pricing and quality in the total solution.

Software technology continues to evolve rapidly, most notably in the integration of communication, storage, and processing. Although the major technological gaps have largely been closed, the market consequences of that convergence are only beginning to play out. Several other technological and market trends are already underway; while their full implications remain uncertain, we highlight them and their potential impacts here.

Information appliances

Information appliances bundle software applications with dedicated hardware to serve a specific purpose, avoiding the installation of specialized software on general-purpose computers. Exploiting the declining cost of hardware, these devices offer improved portability, ergonomics, and usability for their targeted tasks.

Software embedded in appliances resembles traditional industrial products, so both the opportunities and the technical challenges of composability are diminished. Maintenance and upgrades become part of the appliance's overall function, rather than independent software processes.

Pervasive computing

A growing trend is the embedding of software capabilities in everyday physical products: pervasive computing. Information technology, including network connectivity, processing, and storage, is built into common objects. Unlike information appliances, the aim is to enhance the functionality and capability of the material objects we use daily.

Networking these embedded computers opens many opportunities for communication and coordination, turning the everyday environment into a flexible web of largely hidden computing nodes that meet information-processing needs we often do not even articulate, as a seamless part of normal activity.

Pervasive computing shifts software development even further from the information-appliance model, aiming for flexible and opportunistic composability that is nearly universal; this poses significant technical challenges. Using information technology to make distinct physical products complementary also becomes a new and demanding objective of product marketing and design.

Mobile and nomadic information technology

With wireless Internet access widely available, users can connect from many access points, extending the mobility of nomadic users who already carry laptops while traveling. Either the device moves with the user, or the user moves among devices; either way, applications should remain continuously accessible across transitions.

Ensuring good transparency for nomadic and mobile users presents significant challenges to software engineers and managers. The infrastructure must provide a consistent user experience regardless of location or device, whether across access points or across computers. Applications must either adapt to large variations in connectivity or rely on the infrastructure to mask them. A major obstacle is accomplishing this across the diverse ownership and administrative domains of a global network.

A component marketplace

Assembling applications from finer-grained software components is often uneconomic for an individual software supplier, since the development costs may not be justified by that supplier's own usage. As with infrastructure software, the full potential of fine-grained components will be realized only with a marketplace for their exchange.

A component marketplace would differ significantly from markets for hardware and material goods: components are protected as intellectual property and typically licensed rather than sold, and pricing is more variable, potentially depending on factors that include usage. The possibilities invite considerable exploration and analysis.

Pricing and business models

The growing popularity of the ASP model reflects a significant shift in the software industry's business landscape. Several factors drive the trend: high-performance networks are widely available, opening new opportunities; monolithic applications are giving way to finer-grained components sourced from multiple vendors; pervasive computing invites higher-level capabilities composed across many devices; and increasingly mobile users create diverse scenarios for the ownership and operation of infrastructure. As a result, provisioning and operating applications has grown more complex, particularly amid greater application diversity and multi-component integration.

Traditional pricing based on the number of hosts or users suits these scenarios less and less; a shift toward usage-based subscription models is likely. Such pricing structures demand an infrastructure that makes billing for services transparent, efficient, and auditable.

Components can be offered by subscription, with usage monitored and micro-payments flowing to their vendors. This model encourages component adoption: each component is a miniature platform for other components to build on, reducing the entry barriers faced by later offerings in markets where earlier ones are entrenched.

While the details are unclear, the changing technology is stimulating widespread changes in industrial organization and business models.

The software marketplace combines supply and demand characteristics in unique ways, displaying exceptional flexibility, variability, and richness. These advantages are matched by significant challenges of a societal, organizational, technical, financial, and economic nature, and under continuing rapid technological change the marketplace remains immature.

This article suggests significant opportunities to better understand the challenges and prospects of investing in, developing, marketing, and selling software. With better understanding, we can formulate strategies for advancing software technology and create business models that serve suppliers and customers alike. We have aimed to take a first step by summarizing our admittedly limited current understanding.

Software is subject to a framework of fundamental laws akin to those of physics, embodied in the theories of information, computability, and communication; yet with continuing advances in electronics and photonics, these laws impose few practical restrictions. Like information, software is extraordinarily versatile, its primary limitation being human imagination. Given the nascent state of the technology and its markets, this paper doubtless overlooks many rich and largely unpredictable possibilities that lie ahead.

Despite the insights gained from studying other goods and services, the principles of software economics remain underexplored. Competitive market mechanisms, valuation and pricing models, investment recovery, risk management, insurance, and value chains all deserve reexamination from first principles in light of software's distinct nature.

David G. Messerschmitt is the Roger A. Strauch Professor of Electrical Engineering and Computer Sciences at the University of California at Berkeley. He served as Chair of EECS from 1993 to 1996 and was previously at AT&T Bell Laboratories. His current research addresses the future of wireless networks, the economics of networks and software, and the interplay between business and technology, and he works to incorporate information technology concepts into business and information science curricula. He is the author of the textbook Understanding Networked Applications: A First Course.

He is a co-founder and former Director of TCSI Corporation, and serves on the Advisory Board of the Fisher Center for Management & Information Technology at the Haas School of Business and on an advisory committee of the Directorate for Computer and Information Sciences and Engineering at the National Science Foundation. He recently co-chaired a National Research Council study on the future of information technology research. He received his B.S. from the University of Colorado and his M.S. and Ph.D. from the University of Michigan. He is a Fellow of the IEEE, a Member of the National Academy of Engineering, and a recipient of the IEEE Alexander Graham Bell Medal.

Clemens A. Szyperski is a Software Architect in the Component Applications Group of Microsoft Research, where he furthers the principles, technologies, and methods supporting component software. He is the author of the award-winning book Component Software: Beyond Object-Oriented Programming and numerous other publications, and the charter editor of the Addison-Wesley Component Software professional book series. He is a frequent speaker, panelist, and committee member at international conferences and events in both academia and industry. He received his first degree in Electrical Engineering in 1987 from the Aachen Institute of Technology in Germany, and his Ph.D. in Computer Science in 1992 from the Swiss Federal Institute of Technology (ETH) in Zurich under the guidance of Niklaus Wirth. In 1992-1993 he co-founded Oberon microsystems, Inc. in Zurich, Switzerland, with its 1998 spin-off, esmertec inc., also in Zurich. He held a postdoctoral scholarship at the International Computer Science Institute at the University of California, Berkeley in 1993, and from 1994 to 1999 was a tenured associate professor at the Queensland University of Technology in Brisbane, Australia, where he retains an adjunct professorship.

1 Technology is commonly defined as the application of physical laws for practical purposes, a definition that would traditionally exclude software. But because software and hardware are interchangeable in principle, software too can be considered a form of technology.

2 The theoretical mutability of hardware and software was the original basis of software patents, as discussed in Section 5: if it is reasonable to allow hardware inventions to be patented, then it should be equally reasonable to allow those same inventions, embodied in software, to be patented.

3 The computer is the first fully programmable product, surpassing earlier products that offered limited parameterizability (like the drafting compass) or configurability (like the erector set). Some products, like paper, can take on different content, but none match the computer, whose extensive functionality is not predetermined at the time of manufacture.

4 The main practical challenges in design are complexity and performance. High complexity is generally easier to achieve in software, while moving functionality to hardware significantly enhances performance. With recent advances in computer-aided design tools, hardware design has come to resemble software programming.
