GAO
United States General Accounting Office

Report to the Chairman and Ranking Member, Subcommittee on Readiness and Management Support, Committee on Armed Services, U.S. Senate

March 2001

DOD INFORMATION TECHNOLOGY
Software and Systems Process Improvement Programs Vary in Use of Best Practices

Contents

Letter

Appendixes
  Appendix I: Objectives, Scope, and Methodology
  Appendix II: Comments From the Department of Defense
  Appendix III: Description of SEI Capability Maturity Models
  Appendix IV: Detailed Results of Review of DOD Components' SPI Programs
  Appendix V: GAO Contact and Staff Acknowledgments

Tables
  Table 1: Comparison of Components With IDEAL™ Model
  Table 2: Software/Systems Units Selected for Review
  Table 3: Phases of the IDEAL™ Model
  Table 4: Phases and Tasks of the IDEAL™ Model
  Table 5: Army Examples of Alignment With the IDEAL™ Model
  Table 6: Comparisons of Army SPI Activities With the IDEAL™ Model
  Table 7: Air Force Examples of Alignment With the IDEAL™ Model
  Table 8: Comparisons of Air Force SPI Activities With the IDEAL™ Model

Figures
  Figure 6: Partial Marine Corps Organization Chart Highlighting Units Responsible for Software/Systems
  Figure 7: Partial DFAS Organization Chart Highlighting Units Responsible for Information Systems
  Figure 8: Partial DLA Organization Chart Highlighting Units Responsible for Software/Systems
Abbreviations

AFA       Air Force Academy
AFCA      Air Force Communications Agency
AMC       Army Materiel Command
AMCOM     Aviation and Missile Command
CECOM     Communications-Electronics Command
CIO       chief information officer
CMMI™     Capability Maturity Model Integration™
DFAS      Defense Finance and Accounting Service
DLA       Defense Logistics Agency
DOD       Department of Defense
FSO       Financial Systems Organization
IDEAL™    initiating, diagnosing, establishing, acting, and leveraging
ITD       Information and Technology Directorate
MCTSSA    Marine Corps Tactical Systems Support Activity
MSG       Materiel Systems Group
NAVAIR    Naval Aviation Systems Command
OSD       Office of the Secretary of Defense
SEC       Software Engineering Center
SED       Software Engineering Directorate
SEI       Software Engineering Institute
SEO       systems engineering organizations
SEPG      software engineering process group
SPAWAR    Space and Naval Warfare Systems Command
SPI       software/systems process improvement
SSC       SPAWAR Systems Center
SSG       Standard Systems Group
SW-CMM®   Software Capability Maturity Model®
United States General Accounting Office
Washington, D.C. 20548

March 30, 2001
The Honorable James M. Inhofe
Chairman
The Honorable Daniel K. Akaka
Ranking Member
Subcommittee on Readiness and Management Support
Committee on Armed Services
United States Senate
With an annual information technology budget of about $20 billion, and tens of billions more budgeted for technology embedded in sophisticated weaponry, the Department of Defense (DOD) relies heavily on software-intensive systems to support military operations and associated business functions, such as logistics, personnel, and financial management. One important determinant of the quality of these systems, and thus DOD's mission performance, is the quality of the processes used to develop, acquire, and engineer them. Recognizing the importance of these processes to producing systems that perform as intended and meet cost and schedule commitments, successful public and private organizations have adopted and implemented software/systems process improvement (SPI) programs.¹
¹SPI is a structured approach to continuously improving the processes used to develop, acquire, and engineer software and systems.
This report is part of our response to your request to compare and contrast DOD information technology practices with leading practices. In particular, you asked us to review DOD components' (military services' and Defense agencies') SPI management activities to ensure that DOD is taking the necessary steps to continuously strengthen its software and systems development, acquisition, and engineering processes. As agreed with your offices, our objectives were to (1) compare selected DOD components' SPI programs against Carnegie Mellon University's Software Engineering Institute's (SEI)² IDEAL™ model, which is a recognized best practices model; (2) determine how these components have approached management of their SPI programs; and (3) determine what DOD-wide efforts are under way to promote and leverage the components' SPI programs. The components that we selected were the Departments of the Army, Air Force, and Navy; the Marine Corps; the Defense Logistics Agency (DLA); and the Defense Finance and Accounting Service (DFAS).

²SEI is a nationally recognized, federally funded research and development center established at Carnegie Mellon University to address software engineering practices.

Because the Army, Navy, and Air Force do not manage SPI centrally and have delegated SPI responsibility to their respective subordinate organizational units, we selected at least two of the largest of these units within each service to review. Accordingly, all references in this report to the respective services' SPI programs refer only to the subordinate units that we reviewed. We performed our work from March through December 2000, in accordance with generally accepted government auditing standards. (See appendix I for details of our objectives, scope, and methodology, including the specific service units reviewed.) DOD provided us with written comments on a draft of this report. These comments are summarized in the "Agency Comments and Our Evaluation" section of this letter and are reproduced in full in appendix II.
Background

DOD maintains a force of about 3 million military and civilian personnel worldwide. To protect the security of the United States, the department relies on a complex array of computer-dependent and mutually supportive organizational components, including the military services and Defense agencies. It also relies on a broad array of computer systems, including
"Sir a anonalyreoogtand (orl fande esearch and develop cmon ‘Stash at Carnegie lon Unveray ose xo xe engineering penis
weapons systems, command and control systems, satellite systems, inventory management systems, financial systems, personnel systems, payment systems, and others. Many of these systems in turn are connected with systems operated by private contractors, other government agencies, and international organizations.

DOD's ability to effectively manage information technology is critical to its ability to accomplish its mission. Its reliance on software-intensive systems to support operations related to intelligence, surveillance, security, and sophisticated weaponry, along with financial management and other business functions, will only increase as the department modernizes and responds to changes in traditional concepts of warfighting.

The scope of DOD's information technology inventory is vast: over 1.5 million computers, 28,000 systems, and 10,000 computer networks. Further, many of DOD's most important technology projects continue to cost more than projected, take longer to produce, and deliver less than promised. As a result, we have designated DOD systems development and modernization efforts as a high-risk area.³

³Observations on the Department of Defense's Fiscal Year 1999 Performance Report and Fiscal Year 2001 Performance Plan (GAO/NSIAD-00-188R, June 2000).
"The quality ofthe processes involved in developing, acquiring, and
‘engineering software and systems has a significant effect onthe quality of the resulling products Accordingly, process improvement programs can Increase product quality and decrease product costs Public and private ‘organizations have reported significant returns on investment through such
rocess improvement programs SEI has published reports of benefits rallzed through process improvement programs For example, SEI reported in 1905" that a major defense contractor implemented a process Improvement program in 1088 and by 1905 had reduced its rework costs from about 40 percent of project cost to about 10 percent, increased staff productivity by about 170 percent, and reduced defects by about 75, percent According to a 1900 SBI report, a software development ‘contractor reduced its average deviation from estimated schedule tie {raat Dip a Det Pil War 100 Peace Rapa and ‘isl Tar 001 Peformance Pra (GADNSIAD- 188, ane 9 28)
from 112 percent to 6 percent between 1988 and 1996. During the same period, SEI reported that this contractor reduced its average deviation from estimated cost from 87 percent to minus 4 percent.
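These deviation figures compare actual outcomes with the original estimates. As a minimal sketch of the arithmetic, assuming deviation is simply the percentage difference between the actual and estimated value (the report does not state the formula, and the month figures below are hypothetical):

    def percent_deviation(actual: float, estimated: float) -> float:
        """Percentage by which an actual value deviates from its estimate."""
        return (actual - estimated) / estimated * 100

    # A project estimated at 12 months that takes 25.4 months deviates by
    # about +112 percent; one that takes 12.7 months deviates by about
    # +6 percent.
    print(percent_deviation(25.4, 12.0))   # ~111.7
    print(percent_deviation(12.7, 12.0))   # ~5.8

    # A negative result, like the minus 4 percent cost deviation cited
    # above, means the actual came in under the estimate.
    print(percent_deviation(96.0, 100.0))  # -4.0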
To aid organizations attempting to initiate and manage SPI programs, SEI has published a best practices model called IDEAL™, which defines a systematic, five-phase, continuous process improvement approach, with a concurrent sixth element addressing the program management tasks spanning the five phases (see figure 1).
Figure 1: Simplified Diagram of the IDEAL™ Model. [The figure shows the initiating, diagnosing, establishing, acting, and leveraging phases as a cycle, with a program management element spanning all five.]
• Initiating: During this phase, an organization establishes the management structure of the process improvement program, defines and assigns roles and responsibilities, allocates initial resources, develops a plan to guide the organization through the first three phases of the program, and obtains management approval and funding. Two key organizational components of the program management structure established during this phase are a management steering group and a software engineering process group (SEPG). Responsibility for this phase rests with senior management.

• Diagnosing: During this phase, the SEPG appraises the current level of process maturity to establish a baseline capability against which to measure progress and identifies any existing process improvement initiatives. The SEPG then uses the baseline to identify weaknesses and target process improvement activities. It also compares these targeted activities with any ongoing process improvement activities and reconciles any differences. Responsibility for this phase rests primarily with line managers and practitioners.

• Establishing: During this phase, the SEPG prioritizes the process improvement activities and develops strategies for pursuing them. It then develops a process improvement action plan that details the activities and strategies and includes measurable goals for the activities and metrics for monitoring progress against goals. Also during this phase, the resources needed to implement the plan are committed, and training is provided for technical working groups, who will be responsible for developing and testing new or improved processes. Responsibility for this phase resides primarily with line managers and practitioners.

• Acting: During this phase, the technical working groups formed under the establishing phase create and evaluate new and improved processes. Evaluation of the processes is based on pilot tests that are formally planned and executed. If the tests are successful, the working groups develop plans for organization-wide adoption and institutionalization and, once approved, execute them. Responsibility for this phase resides primarily with line managers and practitioners.

• Leveraging: During this phase, results and lessons learned from earlier phases are analyzed and applied, as appropriate, to enhance the structures and plans of process improvement programs. Responsibility for this phase rests primarily with senior management.

The model's sixth element, continuous program management, specifies management structures and tasks for planning, organizing, directing, staffing, and monitoring the program. Responsibility for this element rests with senior management.
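The phase structure above can also be written down as plain data. The sketch below is illustrative only: the phase names, responsibilities, and outputs paraphrase the report, while the Python representation is our own.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Phase:
        name: str
        responsibility: str   # who the model assigns the phase to
        key_outputs: tuple    # paraphrased from the descriptions above

    IDEAL = [
        Phase("initiating", "senior management",
              ("program management structure", "roles and responsibilities",
               "initial resources", "guiding plan", "approval and funding")),
        Phase("diagnosing", "line managers and practitioners",
              ("process maturity baseline", "targeted improvement activities")),
        Phase("establishing", "line managers and practitioners",
              ("prioritized activities", "action plan with goals and metrics",
               "committed resources", "trained technical working groups")),
        Phase("acting", "line managers and practitioners",
              ("piloted new processes", "organization-wide adoption plans")),
        Phase("leveraging", "senior management",
              ("lessons learned applied to program structures and plans",)),
        # The concurrent sixth element spans all five phases.
        Phase("program management", "senior management",
              ("planning", "organizing", "directing", "staffing", "monitoring")),
    ]

    for phase in IDEAL:
        print(f"{phase.name}: {phase.responsibility}")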
Each phase of the IDEAL™ model contains several recommended tasks. Appendix I, which describes our objectives, scope, and methodology, identifies all tasks for each phase.
Results in Brief

The DOD components that we reviewed vary in how they compare to SEI's IDEAL™ model. In particular, the Air Force, Army, and DFAS generally satisfied the model's recommended tasks, as did certain Navy units. However, DLA, the Marine Corps, and other Navy units did not. Specifically, DLA does not have an SPI program, although during the course of our review the DLA Chief Information Officer stated that she intends to establish one. Further, although the Marine Corps is performing many SPI activities, core tasks associated with an effective SPI program, such as a plan of action or dedicated resources to implement recommended improvements, are missing. Finally, certain Navy units also do not have SPI programs aligned with the IDEAL™ model, although one is performing a few of the model's recommended tasks.

The four components with SPI programs (Army, Air Force, DFAS, and parts of the Navy) are using different management strategies for directing and controlling their respective programs. Nonetheless, all components with SPI programs report that they have realized benefits in product quality and productivity. For example, DFAS uses a centralized management approach and reports that its SPI program has helped decrease its development costs to about one-third less than those of similar organizations. In contrast, the Army uses a decentralized approach and also reports that the SPI program for one of its organizational units has helped it almost double its productivity in developing software.
DOD-wide activities to promote and leverage component SPI programs do not exist. According to the IDEAL™ model, leveraging SPI experiences is fundamental to continuous process improvement. While two organizational units within the Office of the Secretary of Defense (OSD) that have important leadership roles to play in department software and system processes are both taking steps aimed at strengthening DOD software, these steps do not specifically include SPI. In particular, OSD does not have initiatives under way or planned to determine where in DOD SPI programs do and do not exist so that steps can be taken to promote programs in component units where they do not, such as at DLA. Similarly, actions do not exist to share information across the department about the experiences of successful SPI programs, such as those within the Army, Navy, Air Force, and DFAS. According to OSD officials, uncertainty about the costs versus
benefits of SPI, resource constraints, and other priorities have precluded such a focus. Without such actions, DOD is missing opportunities to realize potential SPI benefits in all DOD components. To address this, we are making recommendations to the Secretary of Defense.
DOD provided written comments on a draft of this report. In commenting, DOD agreed that SPI practices should be used and encouraged and that information about best practices should be shared. However, DOD stated that it is premature at this point to mandate SPI programs throughout the department, as we recommend, and that it has established a working group to review how best to proceed. While we believe that sufficient bases currently exist to mandate SPI, particularly in light of the evidence in this report on (1) components that are not implementing SPI in the absence of a mandate and (2) the benefits being reported by components that are implementing SPI, we do not view DOD's desire to await the results of its working group as being unreasonable or inconsistent with our recommendations.

Components' SPI Program Alignment With SEI IDEAL™ Model Varies
The Army and Air Force units that we reviewed, as well as DFAS and two of the four Navy units, have long-standing SPI programs that satisfy almost every task recommended in the IDEAL™ model (see table 1 for a summary of how each component and its units, if applicable, compared to the model). For example, in 1996 the Secretary of the Army mandated that all software development, acquisition, and maintenance activities establish SPI programs. Further, the Army requires that its software activities continually improve their process maturity and has set maturity goals for all of its units. Army regulations also mandate that contractors be evaluated for software process maturity. Moreover, the two specific units within the Army that we reviewed have SPI management structures, plans, and dedicated resources. In addition, these units have continuously evolved in software and system process maturity through many years of assessing their baseline process capabilities, implementing new and improved process initiatives, reassessing process maturity, and implementing lessons learned. Both Army units satisfy all IDEAL™ tasks.
Table 1: Comparison of Components With IDEAL™ Model

[The table lists each component and, where applicable, each software/systems unit reviewed (see table 2) and indicates whether its SPI program generally satisfied the IDEAL™ model's recommended tasks. The two Army units, the two Air Force units, DFAS, and two of the four Navy units generally satisfied the model; DLA, the Marine Corps unit, and the other two Navy units did not.]
In contrast, DLA, the Marine Corps, and two of the Navy's four units that we reviewed do not perform important IDEAL™ model tasks. In particular, DLA currently does not satisfy any of the model's recommended tasks. According to DLA officials, it had an SPI program prior to 1988, but at that time the program was terminated to reduce costs. During our review, DLA's CIO stated that the agency plans to begin a new SPI program and has taken a first step by assigning organizational responsibility for it.

The Marine Corps has many SPI activities under way that could form the foundation of a program. However, it is not performing several key SPI tasks that are fundamental to SPI program success. For example, the Marine Corps has assigned responsibility for process improvement, and it has baselined its process capability. However, it is not using this baseline as a basis for implementing recommended improvements, nor does it have an SPI plan or dedicated resources for these activities. As such, the likelihood of the Marine Corps' process improvement initiatives producing desired results is diminished.

Two of the four Navy software/systems units that we reviewed also do not have SPI programs that are aligned with the IDEAL™ model. To their credit, however, one has recently taken the first step toward initiating a program, and the other has activities under way that could form the beginnings of a program. (See appendix IV for more detailed results on each of the components that we reviewed.)

Components' SPI Management Approaches Vary, Yet All Report Positive Program Results
The four components that have SPI programs (Army, Air Force, DFAS, and parts of the Navy) have different approaches for directing and controlling their respective programs, ranging from centralized to highly decentralized; each, however, reports positive results. For example, DFAS has a centralized approach, with its headquarters office directing and controlling all SPI activities. In contrast, the Army, Air Force, and Navy have decentralized approaches to SPI program management. The Army, which began its SPI program centrally, has since delegated SPI responsibility to its commands, which, in the case of the two commands we reviewed, have further delegated SPI program management to their respective software/systems units. Similarly, the Air Force commands that we reviewed delegated SPI management to their respective software/systems units. The Navy commands follow different approaches: one manages its program centrally, and the other has delegated SPI management to its software/systems units.

Despite different approaches, each DOD component and unit with an SPI program reports positive effects on software/systems quality. DFAS, for example, reports that its SPI program has reduced its cost to deliver software to about one-third less than that of organizations of similar size. One Navy software activity reports reduced cost, improved product quality, and a 7:1 return on its SPI investment. An Army activity reports that it has almost doubled its productivity in writing software for new systems because of improvements made under its SPI program. (See appendix IV for more details.)
DOD-Wide Efforts to Promote and Leverage SPI Programs Do Not Exist
Within OSD, the Assistant Secretary for Command, Control, Communications, and Intelligence is responsible for establishing and implementing DOD's policies, processes, programs, and standards governing the development, acquisition, and operation of nonweapons systems software and information systems. Similarly, the Under Secretary for Acquisition, Technology, and Logistics is responsible for establishing DOD acquisition policies and procedures. Accordingly, OSD has an important leadership role to play in ensuring that DOD components reap the maximum possible benefits of effective SPI programs. Such leadership can include dissemination of policies and guidance promoting SPI programs and activities; knowledge of the nature and extent of components' SPI programs and activities, associated lessons learned, and best practices; and facilitation of SPI knowledge-sharing across DOD components.
Both OSD organizational units have efforts under way aimed at improving some aspects of DOD's ability to develop and acquire software and systems. For example, they have established teams to conduct software acquisition maturity assessments and established a software collaborators group. They also are collecting software metrics and establishing training for managers.
However, OSD has no SPI actions under way or planned, such as issuing policy and guidance on SPI programs; determining where in DOD SPI programs do and do not exist; promoting the establishment of programs in component units, such as DLA, where they do not exist; and sharing knowledge across DOD about the experiences of reportedly successful SPI programs, such as those within the Army, Air Force, DFAS, and parts of the Navy. According to OSD officials, uncertainty about the costs versus benefits of SPI, resource constraints, and other priorities have precluded such a focus. However, as stated earlier in this report, various organizations, including some DOD components, report positive returns on investment from SPI programs that argue for SPI being treated as a funding priority.
Conclusions

Several DOD components have SPI programs that are aligned closely with the best practices embodied in the SEI IDEAL™ model and thus provide excellent examples of SPI. However, such programs are lacking in other parts of the department. Where they exist, these programs are being credited with producing higher quality software and systems products faster and at less expense, whether managed in a centralized or decentralized fashion.

OSD has an important leadership role to play in expanding SPI across the department. In particular, it can seize opportunities to build upon and leverage the existing base of SPI programs within DOD's components and help ensure that all of its components realize the strategic value (i.e., benefits that exceed costs) that both private- and public-sector organizations, including some DOD components, attribute to these programs. While OSD is faced with making funding choices among competing leadership initiatives, such as its efforts to conduct software acquisition maturity assessments and collect software metrics, these are some of the very tasks that are embedded within an effective SPI program. Thus, by ensuring that DOD components have effective SPI programs, OSD
can leverage these programs to accomplish its other priorities.

Recommendations for Executive Action

To strengthen DLA, Marine Corps, and Navy software and systems development, acquisition, and engineering processes, we recommend that the Secretary of Defense direct the Director of DLA, the Commandant of the Marine Corps, and the Secretary of the Navy to establish SPI programs where this report shows none currently exist. In so doing, these officials should consider following the best practices embodied in the SEI IDEAL™ model and drawing from the experiences of the Army, Air Force, DFAS, and some Navy units.

Further, to strengthen DOD-wide SPI, we recommend that the Secretary of Defense direct the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, in collaboration with the Under Secretary of Defense for Acquisition, Technology, and Logistics, to (1) issue policy requiring DOD components that are responsible for software and systems development, acquisition, and engineering to implement SPI programs and (2) issue associated guidance that considers the best practices embodied in the SEI IDEAL™ model, as well as the positive examples of SPI within the Army, Air Force, DFAS, and some Navy units cited in this report.
We also recommend that the Secretary direct the Assistant Secretary for Command, Control, Communications, and Intelligence to (1) annually determine the components' compliance with the SPI policy and (2) establish and promote a means for sharing SPI lessons learned and best practices knowledge throughout DOD.
Agency Comments and Our Evaluation

In written comments on a draft of this report, the Deputy Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, who is also the DOD Deputy Chief Information Officer (CIO), agreed with the report's message that SPI practices should be used and encouraged and that information about SPI practices should be shared among DOD components. To this end, and since receiving a draft of this report, the Deputy CIO stated that the Under Secretary of Defense (Acquisition, Technology, and Logistics) has established a working group⁴ that is, among other things, to develop a plan for implementing SPI. According to the Deputy CIO, this plan will be ready for internal review in April 2001.
Further, the Deputy CIO stated that a January 2001 revision to DOD Regulation 5000.2-R⁵ represents a policy step toward addressing software improvement by including in the regulation a section on software management. According to the Deputy CIO, while this section does not specifically call for an SPI program, the regulation provides guidance for improving software by using, for example, SEI Capability Maturity Model level 3 or its equivalent for major acquisition programs with procurement costs in excess of $2.19 billion.⁶
⁴This group, called the Independent Expert Program Review Working Group, was established in January 2001.
⁵Interim Regulation 5000.2-R, "Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs," January 4, 2001.
⁶Regulation 5000.2-R refers to these programs as Acquisition Category (ACAT) I programs.
In light of the above, the Deputy CIO stated that DOD agreed with our recommendation to establish and promote a means for sharing SPI lessons learned and best practices knowledge throughout DOD, and added that a DOD steering group,⁷ which was chartered during the course of our review, has been assigned responsibility for this function. However, the Deputy CIO disagreed with our recommendation that DOD issue a policy to mandate SPI programs for all DOD components and their relevant activities. According to the Deputy CIO, establishing a policy requiring or otherwise directing DOD components that do not have SPI programs to implement them would be premature at this time because there are insufficient data to justify the sole use of the SEI IDEAL™ model, and unless a specific model were used, compliance with such a policy or directive would be problematic. Therefore, the Deputy CIO stated, a decision regarding the issuance of DOD-wide policy mandating the implementation of SPI programs will not be made until the work group reports its results and develops its plan for implementing SPI. At this point and without the work group's findings, according to the Deputy CIO, issuance of SPI guidance (as opposed to "policy") would be "a more beneficial approach."
In our view, the Deputy CIO's comments are not inconsistent with our recommendations, and our point of disagreement appears to center on the timing of actions rather than the recommended actions themselves. Specifically, while we continue to believe that sufficient bases currently exist for issuance of a DOD SPI policy requirement, especially in light of the evidence in our report that (1) without this requirement not all components are implementing SPI and (2) those components that are currently implementing SPI are reporting substantial benefits, it is reasonable for DOD to await its work group's results before making a decision on how to proceed. Further, we agree with the Deputy CIO's comment that there are insufficient data to justify citing in DOD policy the SEI IDEAL™ model as the single model for SPI. Our report recognizes that not all of the DOD components that we cited as having effective SPI programs are using the same model. As a result, our recommendations did not prescribe a specific SPI model. Instead, we recommended that in developing SPI policy and associated guidance, DOD should consider basing this guidance on the SEI IDEAL™ model as well as the positive
⁷This group is called the Software Intensive Systems Steering Group.
examples of SPI within the Army, Air Force, DFAS, and some Navy units cited in the report.
Regarding the Deputy CIO's comment that DOD has recently revised DOD Regulation 5000.2-R to include guidance for improving software management through the use of, for example, SEI Capability Maturity Model level 3, we note that level 3 requirements include performance of process improvement practices that are expanded upon by the SEI IDEAL™ model. Additionally, we note that the regulation does not apply to all DOD software/systems programs but, rather, only to acquisition programs that exceed a certain dollar threshold. Therefore, the revised regulation does not fulfill the intent of our recommendations. DOD's written comments, along with our responses, are reproduced in appendix II.
We are sending copies of this report to Senator John Warner, Senator Carl Levin, Senator Ted Stevens, Senator Daniel Inouye, and to Representative Bob Stump, Representative Ike Skelton, and Representative C.W. Bill Young, in their capacities as Chairmen, Ranking Members, or Ranking Minority Members of Senate and House Committees and Subcommittees. In addition, we are sending copies of this report to the Secretaries of the Army, Navy, and Air Force; the Commandant of the Marine Corps; the Directors of DLA and DFAS; and the Director, Office of Management and Budget. Copies will also be available at GAO's web site, www.gao.gov.

If you have any questions about this report, please contact me at (202) 512-399 or by e-mail at hiter@gao.gov. Key contributors to this report are listed in appendix V.
Randolph C. Hite
Director, Information Technology Systems Issues
Appendix I
Objectives, Scope, and Methodology
Our objectives were to (1) compare selected DOD components' SPI programs against SEI's IDEAL™ model, which is a recognized best practices model; (2) determine how these components have approached management of their SPI programs and what program results they are reporting; and (3) determine what DOD-wide efforts are under way to promote and leverage the components' SPI programs. The selected components include all four services (Army, Air Force, Navy, and Marine Corps) and two DOD agencies that have large, software-intensive system modernization programs under way: the Defense Finance and Accounting Service (DFAS) and the Defense Logistics Agency (DLA).
To address the first objective, we reviewed the components' respective information technology strategic plans as well as available SPI policies, guidance, and program documentation, and interviewed headquarters officials from each component. Using this information, we first ascertained whether SPI programs or activities existed for a component and, if so, how they were organized and structured. For the components in which we found SPI programs or activities, we then identified the units within the components responsible for implementing those programs and activities. In instances in which these responsibilities were decentralized (Army, Air Force, and Navy), we worked with component headquarters and command officials to select at least two units in each component that collectively (1) had missions involving both software-intensive weapons and business systems and (2) were responsible for the largest percentages of software and systems development, acquisition, and engineering activities within each component. Table 2 shows the DOD components and software/systems units where we reviewed SPI programs and activities. Where "not applicable" is indicated in the table, SPI responsibility resided at the "Command/major organizational unit," and therefore our work did not extend to a "Software/systems unit."
Table 2: Software/Systems Units Selected for Review

Component     Command/major organizational unit          Software/systems unit
Army          Communications-Electronics Command         Software Engineering Center
Army          Aviation and Missile Command               Software Engineering Directorate
Navy          Naval Aviation Systems Command             Not applicable
Navy          Space and Naval Warfare Systems Command    Systems Center, San Diego
Air Force     Electronic Systems Center                  Standard Systems Group; Materiel Systems Group
Air Force     Air Force Academy                          Not applicable
Marine Corps  Marine Corps Systems Command               Marine Corps Tactical Systems Support Activity
DFAS          Information and Technology Directorate     Not applicable
For each unit that we identified as being responsible for implementing an SPI program or activities, we analyzed relevant SPI program documentation, including program descriptions, plans, budgets, and progress and performance measures and reports, and interviewed program officials. We then compared this information with the SPI tasks specified and described in SEI's IDEAL™ model to determine whether the program satisfied the model.
Designed to assist organizations in implementing and managing effective SPI programs, the SEI-developed IDEAL™ model comprises five specific phases; a sixth element addresses overall management of the five phases. Table 3 provides more information about the tasks involved in each phase. Table 4 lists every task included under each phase.
Table 3: Phases of the IDEAL™ Model

Initiating: Senior managers establish the SPI program structure, define roles and responsibilities, allocate initial resources, and develop a plan to guide the organization through the first three phases. Management commitment and funding are obtained and sustained. Two key structural components established in this phase are a management steering group and a software engineering process group (SEPG).

Diagnosing: The SEPG, with line managers and practitioners, appraises the current level of software process maturity to establish a baseline from which to measure progress. Any existing process improvement initiatives are identified, along with weaknesses and needed improvements.

Establishing: The SEPG, line managers, and practitioners prioritize SPI activities and develop strategies and an action plan, including measurable goals and metrics for monitoring progress. Resources needed to implement the plan are committed, and training is provided for technical working groups that develop and test new or improved processes.

Acting: Technical working groups create and evaluate new and improved processes through formally planned and executed pilot tests. If the pilots succeed, plans are developed for organization-wide adoption and institutionalization and, once approved, are executed.

Leveraging: Senior managers analyze results and lessons learned from earlier phases and apply them, as appropriate, to enhance the structures and plans of the SPI program.

Managing: Senior managers plan, organize, direct, staff, and monitor the SPI program on a continuing basis.
Table 4: Phases and Tasks of the IDEAL™ Model

Initiating: identify business needs and drivers for improvement; organize a discovery team and develop an SPI proposal for management; educate and build support; obtain approval for the SPI proposal and initial resources; establish the SPI infrastructure; assess the climate for SPI; define general SPI goals; define the guiding principles of the SPI program; launch the program.

Diagnosing: determine what baselines are needed; plan for the baselines; conduct the baselines; present the findings; develop the final findings and recommendations report; communicate the findings to the organization.

Establishing: transform general SPI goals into measurable goals; create or update the SPI strategic plan; build consensus on, review, and approve the SPI strategic plan; commit resources; form technical working groups; develop tactical improvement strategies and plans.

Acting: create and pilot-test new and improved processes; package improvements and turn them over to the SEPG; roll out and institutionalize improvements; disband the technical working groups.

Leveraging: analyze results and lessons learned; revise the organization's SPI approach accordingly; continue with SPI.

Managing: set the stage for SPI; organize the SPI program; plan the SPI program; staff the SPI program; monitor the SPI program; direct the SPI program.

To address the second objective, we analyzed the aforementioned information, conducted additional interviews, and reviewed additional program information from the component units to which SPI management responsibility had been delegated. As part of this objective, we also
reviewed program progress and performance reports and discussed program accomplishments with responsible officials to identify examples of SPI benefits. We then analyzed each component's SPI program results in relation to its program management approach to determine whether any patterns were evident. We did not independently validate components' reported accomplishments and benefits.
To address the third objective, we interviewed responsible component officials, reviewed supporting records and documentation, and visited Internet sites to identify SPI program best practices and lessons learned, along with what efforts are being made to share these with other activities and components throughout the department. We also identified two offices within the Office of the Secretary of Defense (OSD) that have responsibility and activities under way relating to the advancement of software and system management practice in the department: the Office of the Deputy Under Secretary of Defense for Acquisition, Technology, and Logistics; and the Office of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence. For each office, we analyzed documentation describing their respective ongoing and planned activities and interviewed officials. In doing so, we focused on identifying any activities that specifically promoted and leveraged SPI programs and activities under way throughout DOD. We also discussed with SPI program officials in each component their awareness of the OSD efforts.
We performed our work at Army headquarters, the Pentagon, Arlington, Virginia, and interviewed officials and reviewed documentation from the Communications-Electronics Command Software Engineering Center at Fort Monmouth, New Jersey, and the Aviation and Missile Command Software Engineering Directorate at Redstone Arsenal, Alabama. We also performed our work at Navy headquarters in Arlington, Virginia, and interviewed officials and reviewed documentation from the Naval Aviation Systems Command at Patuxent River, Maryland, and the Space and Naval Warfare Systems Command Centers at San Diego, California; Chesapeake, Virginia; and Charleston, South Carolina. We also interviewed officials and reviewed documentation from the Air Force's Electronic Systems Center, Standard Systems Group at Maxwell Air Force Base, Alabama; the Materiel Systems Group at Wright-Patterson Air Force Base, Ohio; and the Air Force Academy in Colorado Springs, Colorado. We also performed our work at Marine Corps headquarters in Arlington, Virginia, and interviewed officials and reviewed documentation from the Marine Corps Systems Command in Quantico, Virginia, and the Marine Corps Tactical Systems Support Activity at Camp Pendleton, California. We also performed work at DFAS headquarters in Arlington, Virginia, and DLA headquarters at Fort Belvoir, Virginia. We conducted our work from March through December 2000, in accordance with generally accepted government auditing standards.
Appendix II

Comments From the Department of Defense

[This appendix reproduces the Department of Defense comment letter, dated March 2, 2001 (GAO code 511686, OSD case 326), with "See comment" annotations keyed to the GAO comments below. In the letter, DOD concurs with the recommendation to establish and promote a means for sharing SPI lessons learned and best practices but disagrees with issuing policy mandating SPI programs, and it quotes the software management section added to DOD Regulation 5000.2-R in January 2001.]
The following are GAO's comments on the Department of Defense letter dated March 2, 2001.

GAO Comments

1. We disagree. Sufficient bases currently exist for issuance of a DOD SPI policy requirement, especially in light of the evidence in our report that (1) without this requirement not all components are implementing SPI and (2) those components that are currently implementing SPI are reporting substantial benefits. Nevertheless, DOD's decision to await an OSD work group's results before making a decision on how to proceed is not unreasonable or inconsistent with our position.

2. See response to comment 1.

3. We disagree. Oversight is an important part of policy implementation, and without such oversight, DOD would incur significant risk that the policy would not be implemented. Further, establishing a baseline measure to determine compliance does not require the implementation of a specific model. The intent of our recommendations is to establish a policy requiring SPI that recognizes, as our report recognizes, that there is more than one model for doing so effectively.
Appendix III
Description of SEI Capability Maturity Models
Since 1984, the Software Engineering Institute (SEI) has worked to improve management of software/systems productivity and quality, primarily by addressing problems in acquiring, developing, engineering, or enhancing software/systems, through a series of capability maturity models. According to SEI, an organization's process capability provides a means of predicting the most likely outcome of the next software/systems project undertaken; process maturity implies that the productivity and quality resulting from an organization's software/systems processes can be improved as the maturity of the processes increases. The IDEAL™ model is based on lessons learned from SEI experiences as well as from SEI projects relating to software process capability and maturity. For example, during the initiating phase of the IDEAL™ model, general SPI program goals are defined, and this definition could be in terms of capability maturity model levels. In the diagnosing phase, IDEAL™ recommends developing an organization process maturity baseline; SEI's capability maturity model-based appraisal is one way of establishing this baseline.
The first of these capability maturity models, the Software Capability Maturity Model® (SW-CMM®), was designed to assist organizations in improving software development and maintenance processes. In this model, software process maturity, ranked from a low of level 1 to a high of level 5, serves as an indicator of the likely range of software cost, schedule, and quality that can be expected to be achieved by projects developed within an organization. (See figure 2.)
Figure 2: Maturity Levels of the Software Capability Maturity Model®. [The figure depicts the five levels as a staircase: at level 2, basic project management processes are established to track cost, schedule, and functionality; at level 3, the software process is documented and used consistently across the organization; at level 4, process and product quality are quantitatively measured; and at level 5, continuous process improvement is institutionalized.]
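As a rough illustration (ours, not SEI's), the five levels can be captured as a small lookup table; the one-line summaries paraphrase the standard SW-CMM® level definitions:

    # Illustrative only: the five SW-CMM maturity levels, with one-line
    # summaries paraphrasing the standard level definitions.
    SW_CMM_LEVELS = {
        1: ("Initial", "processes are ad hoc; success depends on individual effort"),
        2: ("Repeatable", "basic project management tracks cost, schedule, and functionality"),
        3: ("Defined", "a documented, standard software process is used consistently"),
        4: ("Managed", "detailed measures of process and product quality are collected"),
        5: ("Optimizing", "quantitative feedback enables continuous process improvement"),
    }

    def describe(level: int) -> str:
        name, summary = SW_CMM_LEVELS[level]
        return f"Level {level} ({name}): {summary}"

    # For example, the acquisition threshold discussed in this report keys
    # on level 3:
    print(describe(3))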
Since the SW-CMM® was published, SEI has developed additional models in the capability maturity series:
• The Software Acquisition CMM® (SA-CMM®) is a model for improving the software acquisition process. It follows the same five-level architecture as the SW-CMM® but emphasizes acquisition issues and the needs of individuals and groups planning and managing software acquisition activities.

• The Systems Engineering CMM® describes the essential elements of an organization's systems engineering process and provides a reference for comparing actual systems engineering practices against these elements. The model addresses the process aspects of systems engineering and the product development portion of the life cycle. This model was a collaboration of several organizations, including SEI.

• In 1997, a team led by DOD, in conjunction with SEI, government, and industry, concentrated on developing an integrated framework for maturity models and associated products. The result was the CMM Integration™ (CMMI™), which is intended to provide guidance for improving an organization's processes and the ability to manage the development, acquisition, and maintenance of products and services, while reducing the redundancy and inconsistency caused by using standalone models.

The CMMI™ combines earlier models from SEI and the Electronic Industries Alliance¹ into a single model for use by organizations pursuing enterprise-wide process improvement. However, the prototype CMMI™ does not include the acquisition features of the SA-CMM® because the team wanted to focus first on the development process. A CMMI™ that includes coverage for acquiring software-intensive systems is currently being developed. Additional disciplines may also be covered. Ultimately, the CMMI™ is to replace the models that have been its starting point.
¹The Electronic Industries Alliance is a trade organization representing over 2,100 companies devoted to the design and manufacture of electronic components, assemblies, and systems for consumer, commercial, industrial, and aerospace use.
Appendix IV

Detailed Results of Review of DOD Components' SPI Programs

Army SPI Program
Background

The Army depends on software-intensive systems to support each of its major commands and every facet of its mission, from weapons to financial management systems. The Army budgeted about $3 billion for information technology during fiscal year 2000.
The Army has assigned responsibility for information systems to the Army Materiel Command (AMC). Several major subcommands function under AMC. Three of these subcommands are responsible for the acquisition, development, engineering, and maintenance of information technology for the Army: the Communications-Electronics Command (CECOM), the Aviation and Missile Command (AMCOM), and the Tank-Automotive Command. We reviewed the Army's SPI activities at CECOM and AMCOM.
CECOM has assigned responsibility for information systems to its Software Engineering Center (SEC). The center, located at Fort Monmouth, New Jersey, is supported by several software/systems activities located across the United States. The center is responsible for overseeing about 85 percent of the Army's systems, including (1) command, control, communications, and computers; (2) intelligence, electronic warfare, and sensors; (3) sustaining base/power projection; and (4) AMC business systems.
Figure 3: Partial Army Organization Chart Highlighting Units Responsible for Software/Systems. [The chart shows Army headquarters and the subordinate commands and software/systems units discussed in this appendix.]
The Army's SPI program activities began in the early 1990s; in mid-1996 the Secretary of the Army mandated that all Army software acquisition, development, and maintenance activities establish SPI programs. At the same time, the Army published an SPI policy¹ that specified two requirements:
• First, contractors' capability to produce quality software will be part of the Army's source selection evaluation process. The Army has implemented this requirement by evaluating potential contractors against SW-CMM® level 3 maturity requirements and requiring contractors that do not meet these requirements to propose a strategy for mitigating the associated risks. This requirement is further enforced during milestone reviews of major acquisition programs.

¹This SPI policy is now part of an Army regulation.