
Specialized English for Accounting, Part 56



DOCUMENT INFORMATION

Basic information

Format
Number of pages: 10
File size: 100 KB

Content

…computers that functioned using the UNIX operating system. Bell Labs developed UNIX in the 1970s as an operating system for scientific applications, but it later became an accepted standard for commercial applications. Because the operating system and its associated applications were platform independent, they could run on a variety of manufacturers' computers, creating both opportunities for users and competition within the computer industry. Users were no longer inexorably tied to one manufacturer. UNIX became the standard as companies moved into the 1990s. However, standards changed rapidly in the nineties, and UNIX has lost ground to client-server technology.

In the early 1990s, technologists predicted the demise of the mainframe. IBM's stock declined sharply as the market realized that the company's chief source of margin was headed toward extinction. However, the mainframe has reinvented itself as a super server, and, while it has been replaced for some of the processing load, the mainframe and IBM are still positioned to occupy important roles in the future.

Server technology is heading toward designs built around multiple, smaller processors, all operating in parallel. Referred to as symmetric multiprocessors (SMPs), these units contain between two and eight processors. SMPs are available from a range of manufacturers and operating systems, and they provide processing power typically not available in a uniprocessor. Faced with the demanding environment of multiple, simultaneous queries against databases that exceed hundreds of gigabytes, massively parallel processors (MPPs) are being used more and more. MPPs are units that contain hundreds of smaller processors. The goal of both SMPs and MPPs is to split the processing load among the processors.
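The split-the-load idea can be illustrated with a short sketch; this is not from the text, just a minimal Python example in which a large table scan is divided among several worker processes, with the worker count, table size, and "query" chosen for illustration:

```python
# A minimal sketch of the SMP/MPP idea: divide one large scan across
# several worker processes instead of running it on a single processor.
from multiprocessing import Pool

def scan_chunk(rows):
    # Stand-in for one processor scanning its share of a large table.
    return sum(1 for r in rows if r % 97 == 0)

if __name__ == "__main__":
    table = range(1_000_000)        # stand-in for a very large database table
    n_workers = 4                   # an SMP unit has two to eight processors
    step = len(table) // n_workers
    chunks = [table[i * step:(i + 1) * step] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partials = pool.map(scan_chunk, chunks)  # chunks scanned in parallel
    print("matching rows:", sum(partials))       # combine the partial results
```

Each worker returns a partial result, and the final combining step plays the role of the coordinating processor.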
In a typical factory in the 1800s, one motor usually powered all of the machinery, to which it was connected by a series of gears, belts, and pulleys. Today, that is no longer the case: each machine has its own motor or, in some cases, multiple, specialized motors. For example, the automobile's main motor is the engine, but many other motors perform such tasks as opening and closing the windows, raising and lowering the radio antenna, and powering the windshield wipers. Computers are the firm's motors, and, like motors, they, too, have evolved. Initially, firms used a host-centric mainframe, one large computer; today, they use many computers to perform both specialized and general functions.

In the early 1990s, Xerox's prestigious Palo Alto Research Center introduced "ubiquitous computing," a model that it believes reflects the way companies and their employees will work in the future. In ubiquitous computing, each worker will have available differing quantities of computers in three sizes: 20 to 50 Post-it-note-size portable computers, three or four computers the size of a writing tablet, and one computer the size of a six-foot-by-six-foot whiteboard. All of the computers will work together by communicating with a network, in most cases over wireless connections.

The progress of chip technology has been highly predictable. In 1965, Gordon Moore, a cofounder of Intel, formulated Moore's Law, which predicts that the density of the components on a computer chip will double every 18 to 24 months, thereby doubling the chip's processing power. This hypothesis has proven to be very accurate. Exhibit 16.1 shows the growth of the various Intel CPU chips that have powered the personal computer and many other machines. As can be seen, the PC's power has just about doubled every 18 to 24 months. This growth can be seen more dramatically when the graph is plotted logarithmically, as in Exhibit 16.2.

EXHIBIT 16.1 Moore's Law: charting the growth of the power of the PC. [Chart: MIPS (millions of instructions per second), 0 to 1,000, by year, 1980 to 2002.]

EXHIBIT 16.2 Moore's Law: charting the growth of the PC (logarithmically). [Chart: MIPS on a logarithmic scale, 0.1 to 1,000, by year, 1980 to 2002.]
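As a quick worked illustration of the doubling rule, here is a minimal sketch; the starting value, years, and doubling period below are assumptions chosen for the example, not figures read from the exhibits:

```python
# Moore's Law as stated in the text: processing power doubles every
# 18 to 24 months. Starting MIPS and dates are illustrative assumptions.
def projected_mips(start_mips, years, months_per_doubling=18):
    doublings = years * 12 / months_per_doubling
    return start_mips * 2 ** doublings

# A 1-MIPS chip in 1980, projected forward at an 18-month doubling period:
for year in (1990, 2000):
    print(year, round(projected_mips(1.0, year - 1980)))
# 1990 -> about 102 MIPS; 2000 -> about 10,321 MIPS
```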
SOFTWARE

Exhibit 16.3 represents the information systems paradigm. Operational control systems, which run the company's day-to-day operations, are typically used by the lowest level of the organization, are run on a scheduled basis, and usually involve large volumes of input data, output reports, and information. These systems might be accounts payable, accounts receivable, payroll, order entry, or inventory control.

Decision support systems are generally used by middle-level managers to supply them with information that they can use to make decisions. These systems are usually run on an ad hoc basis and involve small amounts of data; budgets, exception reporting, cash-flow forecasting, accounts receivable dunning reports, "what if" analyses, audit analysis reports, and variance analyses are examples. Many of the newer application packages come with facilities that let managers without any programming knowledge create their own decision reports.

Strategic information systems are used by senior management to make decisions on corporate strategy. For example, a retail company might use demographic census data, along with a computerized geographical mapping system, to evaluate the most appropriate locations at which to open new stores. A manufacturing company, given its demands for both skilled and unskilled labor, might use a similar method to determine the optimal location for a new plant.

EXHIBIT 16.3 Types of information systems: operational control systems, decision support systems, strategic information systems.

While most older hardware has given way to newer computers, most companies use a combination of newly acquired and older, self-developed software. The latter was developed over a period of years, perhaps 20 or more, using COBOL, which, until the early 1990s, was the standard programming language in business applications. Today, many companies' mission-critical systems still run on mainframe technology, using programs written in COBOL; in fact, there are billions of lines of COBOL programming code still functional in U.S. business. These "legacy" systems have become a major issue for many, though, and were the key issue behind the Y2K problem. In many instances, they have grown like patchwork quilts, as they have been written and modified by programmers who are no longer with their firms. More often than not, documentation of these changes and enhancements is not available, and the guidelines for many of these software applications no longer exist. Replacing these applications is cost prohibitive, and the distraction to the organization caused by the need to retrain workers would be tremendous.

Nonetheless, as a result of the Y2K problem, many of these systems were replaced, but large numbers of them were merely patched to allow for the millennium change. These systems will eventually have to be replaced. If history is a lesson, many of them will not be replaced until it is too late. In any event, the business community should not again face the singular deadline it faced at the end of 1999.

Today, most programmers write in C++, C, or fourth-generation programming languages. C++ is an object-oriented programming language; object-oriented languages provide the programmer with a facility to create a programming object, or module, that may be reused in many applications. Fourth-generation programming languages are usually provided with sophisticated relational database systems. These database systems provide high-level tools and programming languages that allow programmers to create applications quickly without having to concern themselves with the physical and logical structure of the data. Oracle, Informix, Sybase, and Progress are some of the more popular relational database package companies.

INTERNET TECHNOLOGY

Nothing has impacted technology and society in the past 10 years more than the Internet. When Bill Clinton was inaugurated in January 1993, there were 50 pages on the Internet. Today, there are more than 200 billion pages. The underlying technology behind the Internet has its roots in a project begun by the U.S. government in the early 1970s. The network was originally developed by a consortium of research colleges and universities and the federal government that was looking for a way to share research data, provide a secure means of communicating, and back up defense facilities. The original network was called ARPANET, sponsored by the Department of Defense's Advanced Research Projects Agency (ARPA). It was replaced in the 1980s by the current network, which was originally not very user friendly and was used mostly by techies. The Internet's popularity exploded with the development of the World Wide Web and the software programs that made it much more user friendly to explore.

The Internet works on a set of software standards, the first of which, TCP/IP, was developed in the 1970s. The entire theory behind the Internet and TCP/IP, which enables computers to speak to each other over the Internet, was to create a network with no central controller. The Internet is unlike a string of Christmas lights, where if one light in the series goes out, the rest of the lights stop functioning. Rather, if one computer in the network is disabled, the rest of the network continues to perform.

Each computer on the Internet has an Internet, or IP, address. Similar to one's postal address, it consists of a series of numbers (e.g., 155.48.178.21), and it tells the network where to deliver your e-mail and data. When you access an Internet site through its URL (e.g., www.babson.edu), a series of computers on the Internet, called domain name servers (DNS), convert the URL to an IP address.
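A one-call sketch of that lookup using Python's standard library; the hostname is the text's own example, and the address printed will depend on where and when the lookup is run:

```python
# Ask the domain name system to convert a URL's hostname into an IP
# address, as the text describes. The result varies over time and by network.
import socket

host = "www.babson.edu"                       # the text's example URL
print(host, "->", socket.gethostbyname(host)) # e.g., an address like 155.48.x.x
```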
When an e-mail message or other data is sent to someone over the Internet, it is broken into a series of packets. These packets, similar to postcards, contain the IP address of the sender, the IP address of the recipient, the packet number within the message (e.g., 12 of 36), and the data itself. These packets may travel many different routes along the Internet; frequently, packets belonging to the same message do not travel the same route. The receiving computer then reassembles the packets into a complete message.
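A toy model of that process, not real IP, can make the mechanics concrete; the addresses and chunk size below are illustrative:

```python
# Each toy packet carries the sender, the recipient, a "12 of 36"-style
# sequence number, and a slice of the data. Shuffling imitates packets
# taking different routes and arriving out of order.
import random

def packetize(sender, recipient, message, size=8):
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"from": sender, "to": recipient, "seq": n + 1,
             "total": len(chunks), "data": chunk}
            for n, chunk in enumerate(chunks)]

packets = packetize("155.48.178.21", "192.0.2.7", "Packets may arrive in any order.")
random.shuffle(packets)                     # different routes, different arrival order
rebuilt = "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))
print(rebuilt)                              # the receiver reassembles by sequence number
```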
The second standard that makes the Internet work is HTML, or Hypertext Markup Language. This language allows data to be displayed on the user's screen. It also allows a user to click on an Internet link and jump to a new page on the Internet. While HTML remains the underlying programming language for the World Wide Web, there are many more user-friendly software packages, like FrontPage 2000, that help create HTML code. Moreover, HTML, while powerful in its own right, is not dynamic and has its limitations. Therefore, languages such as JavaScript, Java, and Perl, which create animation, perform calculations, create dynamic Web pages, and access and update databases on the host's Web server, were developed to complement HTML. Using a Web browser (e.g., Netscape Navigator or Microsoft's Internet Explorer), the computer converts the HTML or other programming languages into the information that users see on their computer monitors.

Internet technology has radically changed the manner in which corporate information systems process their data. In the early and mid-1990s, corporate information systems used distributed processing techniques. Under this method, some of the processing would take place on the central computer (the server) and the rest on the users' (the clients') computers, hence the term client-server computing. Many companies implemented applications using this technology, which ensured that processing power was utilized at both ends and that systems were scalable. The problem with client-server processing was that different computers (even within the IBM-compatible PC family) used different drivers and required tweaking to make the systems work properly. Also, if the software needed to be changed at the client end, and there were many clients (some companies have thousands of PC clients), maintaining the software for all of those clients could be a nightmare. Even with specialized tools developed for that purpose, it never quite worked perfectly.

As companies recognized the opportunity to send data over the Internet, whether for their customers or their employees, they started to migrate all of their applications to a browser interface. This change has required companies to rethink where the locus of their processing will occur. Prior to the 1990s, companies' networks were host-centric: all processing was conducted on one large mainframe. In the early 1990s, companies began using client-server architecture. Today, with current browser technology and the Internet, the locus has shifted back to a host-centric environment. The difference, though, is that the browser on the users' computers is used to display and capture data, and the processing actually occurs back at the central host on a series of specialized servers, not on one large mainframe computer. The only program users need is a standard browser, which solves the incompatibility problem presented by distributed data processing. No specialized software is stored on the users' computers.

Internet technology was largely responsible for many of the productivity enhancements of the 1990s. Intel's microprocessors, Sun's and Hewlett-Packard's servers, Cisco's communications hardware, and Microsoft's Windows operating systems have all facilitated this evolution. While Windows is the predominant client operating system, most servers run Windows NT or 2000, UNIX, or Linux.

TODAY'S APPLICATION SYSTEMS

In the 1970s and 1980s, application software systems were stand-alone. There was little sharing of data, leading to frequent redundancy of information. For example, in older systems there might have been vendor data files for both inventory and accounts payable, resulting in the possibility of multiple versions of the truth: each file may have contained address information, yet the addresses may have differed for the same vendor. Today, however, software applications are integrated across functional areas (accounts payable, accounts receivable, marketing, sales, manufacturing, etc.). Database systems contain only one vendor data location, which all systems utilize. These changes in software architecture better reflect the integration of functions that has occurred within most companies.
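A minimal sketch of that single-shared-record idea; the module names, vendor ID, and fields below are illustrative, not taken from any particular package:

```python
# One shared vendor table that both "modules" read: an address change is
# made once and every function sees it, unlike the duplicated files of
# older stand-alone systems.
vendors = {"V100": {"name": "Acme Supply", "address": "12 Mill St"}}

def pay_invoice(vendor_id, amount):
    v = vendors[vendor_id]                  # accounts payable reads the shared record
    return f"Pay {v['name']}, {v['address']}: ${amount:,.2f}"

def issue_purchase_order(vendor_id, item):
    v = vendors[vendor_id]                  # purchasing reads the same record
    return f"PO to {v['name']}, {v['address']}: {item}"

vendors["V100"]["address"] = "9 Harbor Rd"  # one update, visible to every module
print(pay_invoice("V100", 1250))
print(issue_purchase_order("V100", "bar-code scanners"))
```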
Accounting systems, while used primarily for accounting data, also provide a source of data for sales and marketing. While retail stores' point-of-sale cash registers are used as a repository for cash and to account for it, they are also the source of data for inventory, sales, and customer marketing. For example, some major retailers ask their customers for their zip codes when point-of-sale transactions are entered, and that data is shared by all of the companies' major applications.

Accounts receivable systems serve two purposes. On one hand, they allow the company to control an important asset, its accounts receivable. On the other, the availability of credit enables customers to buy items, both commercial and retail, that they otherwise would not be able to buy if they had to pay in cash. Credit card companies, which make their money from transaction fees and interest charges, understand this function well. Frequently, they reevaluate the spending and credit patterns of their client base and award increased credit limits to their customers. Their goal is to encourage their customers to buy more, without necessarily paying off their balances any sooner than necessary. Information systems make it possible for these companies to both control and promote their products, which in this case are credit card transactions.

These examples of horizontally integrated systems, as well as the understanding of the strategic and competitive uses of information technology, demonstrate where industry is headed.

ACCOUNTING INFORMATION SYSTEMS

As mentioned earlier, computer-based accounting systems were, for most companies, the first computerized applications. As the years progressed, these systems have become integrated and consist of the following modules:

• Accounts Payable.
• Order Entry and Invoicing.
• Accounts Receivable.
• Purchase Order Management and Replenishment.
• Inventory Control.
• Human Resource Management.
• Payroll.
• Fixed Assets.
• General Ledger and Financial Statements.

Whereas in past years some of these modules were acquired and others were self-developed, today most companies purchase packaged software. In the 1980s, "shrink-wrapped" software was developed and introduced. Lotus Corporation, along with other companies, was a pioneer, selling software like its 1-2-3 application in shrink-wrapped packages. The software was accompanied by sufficient documentation and available telephone support to ensure that even companies with limited technical expertise could manage their own destinies.

There is a host of software packages that will satisfy the needs of companies of all sizes. Smaller companies can find software selections that run on personal computers and networks, are integrated, and satisfy most of the companies' requirements. Quicken and Computer Associates have offerings that provide most of the necessary functional modules for small and medium-size companies, respectively. The more advanced packages, like Macola and AccPac, are equipped with interfaces to bar-code scanners and scales, which together track inventory and work in process and weigh packages as they are shipped, producing not only invoices but also shipping documents for most of the popular freight companies, such as FedEx and UPS. These packages range in price from $100 for the entire suite of accounting applications for the smallest packages to approximately $800 per module for the larger packages, which, of course, have more robust features. While some of the smaller packages are available through computer stores and software retailers, the larger packages are acquired through independent software vendors (ISVs), who, for a consulting fee, will sell, install, and service the software. The practice of using third-party ISVs began in the 1980s, when large hardware and software manufacturers realized that they were incapable of servicing all of the smaller companies that would be installing their products, many of whom required a lot of hand-holding. Consequently, a cottage industry of distributors and value-added dealers developed, in which companies earn profits on the sale of hardware and software and the ensuing consulting services.

Larger companies are following a trend toward large, integrated packages from companies like SAP and Oracle. These packages integrate not only the accounting functions but also the manufacturing, warehousing, sales, marketing, and distribution functions. These systems are referred to as enterprise resource planning (ERP) systems. Many ERP systems, available from companies such as SAP, Oracle, and Baan, also interface with Web applications to enable electronic commerce transactions. SAP has spawned an entire industry of consulting companies that assist large companies in implementing its software, a process that may take several years to complete. As in any software implementation, one must always factor the process's cost and the distraction it causes the organization into the timetable. In today's lean business environment, people have little extra time for new tasks. Implementing a major new system, or, for that matter, any system, requires a major commitment of time and effort.

INFORMATION TECHNOLOGY IN BANKING AND FINANCE

The financial services industry is the leading industry in its use of information technology.
As shown in Exhibit 16.4, according to a survey conducted in 1999 by the Computer Sciences Corporation, this sector spends 5.0% of its annual revenue on IT, nearly double that of any other industry except the technology-driven telecommunications industry.

This graph also illustrates how integral a role real-time information plays in the financial services industry, whether for accessing stock quotes or processing bank deposits. The industry has become a transaction-processing industry that is information dependent. Very little real money is ever touched; rather, all transactions, from stock purchases to the direct deposit of workers' checks, are processed electronically. Information technology has paved the way for innovations like the NASDAQ trading system, in which, unlike on the New York Stock Exchange (NYSE), all trades are conducted totally electronically.

EXHIBIT 16.4 Information technology budgets by industry. [Chart: percentage of revenue, 0.0 to 5.0, for financial services, health care, aerospace/defense, manufacturing, chemicals, retail, telecommunications, consumer goods, utilities, and oil/energy.]

NETWORKS AND COMMUNICATIONS

It is becoming increasingly common in industry to create virtual wide area networks using multiple, interconnected local area networks. These networks also connect the older mainframe and midrange computers that industry uses for its legacy systems to the client terminals on users' desks. Exhibit 16.5 is a model of a typical company's wide area network, and it demonstrates how all of the older technology interconnects with the newer local area networks and the Internet.

EXHIBIT 16.5 Model of a wide area network (local area network and Internet connection using an open communications protocol, c. 1997). [Diagram: servers, minicomputers, mainframes, and clients (e.g., a branch office) with processing capacity, joined by routers to local area networks and the Internet, all using TCP/IP (Transmission Control Protocol/Internet Protocol).]

In the early 1990s, there were numerous competing network operating systems and protocols. While Novell's NetWare software holds the largest market share, Microsoft's Windows NT is becoming the network operating system of choice, and, because of the Internet's overwhelming success, TCP/IP is rapidly becoming the standard communications protocol. Remember, though, success is very fragile in the world of information technology. Today's standard can easily become yesterday's news. If you are always prepared for change, then you will not be surprised by it.

Electronic Data Interchange (EDI) allows companies to communicate and conduct electronic commerce from one computer to another. EDI is one of industry's growing uses for data communications, and many companies are using it to send purchase orders to their suppliers, thereby lessening the time it takes for purchase orders to be mailed and then entered and processed by the suppliers. Inventories are lowered by speeding up the turnaround time of ordering and receiving goods and materials. On the flip side, many suppliers use EDI to send their customers advance ship notifications (ASNs), advising them of what has been shipped so that they can prepare their warehouses for the goods and materials.
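A toy sketch of that EDI round-trip, reduced to plain data structures; real EDI uses standardized message formats such as ANSI X12, which this does not attempt to reproduce, and all names here are illustrative:

```python
# Toy EDI flow: the buyer transmits a purchase order electronically and
# the supplier answers with an advance ship notification (ASN).
def send_purchase_order(buyer, supplier, items):
    # Replaces mailing the PO and rekeying it at the supplier.
    return {"type": "PO", "from": buyer, "to": supplier, "items": items}

def send_advance_ship_notice(po):
    # Tells the buyer what shipped so its warehouse can prepare to receive it.
    return {"type": "ASN", "from": po["to"], "to": po["from"], "shipped": po["items"]}

po = send_purchase_order("RetailCo", "Acme Supply", [("bar-code scanners", 12)])
print(send_advance_ship_notice(po))
```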

Posted: 07/07/2014, 13:20