CMMI(SM) Model Components Derived from CMMI(SM)-SE/SW, V1.0 — Appendixes, Continuous Representation

Pittsburgh, PA 15213-3890

CMMI Product Development Team
August 2000

Unlimited distribution subject to the copyright.

This work is sponsored by the U.S. Department of Defense. The Software Engineering Institute is a federally funded research and development center sponsored by the U.S. Department of Defense.

Copyright 2000 by Carnegie Mellon University.

NO WARRANTY

THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.

Internal use: Permission to reproduce this document and to prepare derivative works from this document for internal use is granted, provided the copyright and "No Warranty" statements are included with all reproductions and derivative works.

External use: Requests for permission to reproduce this document or prepare derivative works of this document for external and commercial use should be addressed to the SEI Licensing Agent.

This work was created in the performance of Federal Government Contract Number F19628-00-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 52.227-7013.

The following service marks and registered trademarks are used in this document: Capability Maturity Model(R), CMM(R), CMM Integration(SM), CMMI(SM). Capability Maturity Model and CMM are registered trademarks in the U.S. Patent and Trademark Office. CMM Integration and CMMI are service marks of Carnegie Mellon University.

Table of Contents

Appendixes
A  References
     Publicly Available Sources
     Sources Not Publicly Available
B  Acronyms
C  Glossary
D  Required and Expected Model Elements
     Process Management
       Organizational Process Focus
       Organizational Process Definition
       Organizational Training
       Organizational Process Performance
       Organizational Innovation and Deployment
     Project Management
       Project Planning
       Project Monitoring and Control
       Supplier Agreement Management
       Integrated Project Management
       Risk Management
       Quantitative Project Management
     Engineering
       Requirements Management
       Requirements Development
       Technical Solution
       Product Integration
       Verification
       Validation
     Support
       Configuration Management
       Process and Product Quality Assurance
       Measurement and Analysis
       Causal Analysis and Resolution
       Decision Analysis and Resolution
     Generic Goals and Generic Practices
E  CMMI Project Participants
F  Equivalent Staging

A References

Publicly Available Sources

The following documents were used in the development of the CMMI Product Suite and are publicly available.

Bate 95: Bate, Roger, et al. Systems Engineering Capability Maturity Model, Version 1.1. Enterprise Process Improvement Collaboration and Software Engineering Institute, Carnegie Mellon
University, November 1995.

Crosby 79: Crosby, P. B. Quality is Free. New York, NY: McGraw-Hill, 1979.

Curtis 95: Curtis, Bill; Hefley, William E.; & Miller, Sally. People Capability Maturity Model (CMU/SEI-95-MM-002). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, September 1995.

Deming 86: Deming, W. Edwards. Out of the Crisis. Cambridge, MA: MIT Center for Advanced Engineering Study, 1986.

DoD 91: Department of Defense. DoD Directive 5000.1: Defense Acquisition. Washington, DC: Department of Defense, 1991.

DoD 96a: Department of Defense. DoD Regulation 5000.2: Mandatory Procedures for Major Defense Acquisition Programs and Major Automated Information Systems. Washington, DC: Department of Defense, 1996.

DoD 96b: Department of Defense. DoD Guide to Integrated Product and Process Development (Version 1.0). Washington, DC: Office of the Under Secretary of Defense (Acquisition and Technology), February 5, 1996. Available WWW.

DoD 98: Department of Defense. Defense Acquisition Deskbook, Version 3.2. Available WWW. (Note: this is continually updated.)
Dunaway 96: Dunaway, D. & Masters, S. CMM-Based Appraisal for Internal Process Improvement (CBA IPI): Method Description (CMU/SEI-96-TR-007). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, April 1996.

EIA 94: Electronic Industries Association. EIA Interim Standard: Systems Engineering (EIA/IS-632). Washington, DC: Electronic Industries Association, 1994.

EIA 95: Electronic Industries Association. EIA Interim Standard: National Consensus Standard for Configuration Management (EIA/IS-649). Washington, DC: Electronic Industries Association, 1995.

EIA 98: Electronic Industries Association. Systems Engineering Capability Model (EIA/IS-731). Washington, DC: Electronic Industries Association, 1998. Available WWW.

FAA 97: Federal Aviation Administration-Integrated Capability Maturity Model, Version 1.0. Available WWW, November 1997.

Ferguson 96: Ferguson, Jack; Cooper, Jack; Falat, Michael; Fisher, Matthew; Guido, Anthony; Marciniak, Jack; Matejceck, J.; & Webster, R. Software Acquisition Capability Maturity Model, Version 1.01 (CMU/SEI-96-TR-020). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, December 1996.

Herbsleb 97: Herbsleb, James; Zubrow, David; Goldenson, Dennis; Hayes, Will; & Paulk, Mark. "Software Quality and the Capability Maturity Model." Communications of the ACM 40 (June 1997): 30-40.

Humphrey 89: Humphrey, Watts S. Managing the Software Process. Reading, MA: Addison-Wesley, 1989.

IEEE 90: Institute of Electrical and Electronics Engineers. IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York, NY: Institute of Electrical and Electronics Engineers, 1990.

INCOSE 96: Systems Engineering Capability Assessment Model, Version 1.50. International Council on Systems Engineering, June 1996.

ISO 87: International Organization for Standardization. ISO 9000: International Standard. New York, NY: International Organization for Standardization, 1987.

ISO 95: International Organization for Standardization & International Electrotechnical Commission. Information Technology: Software Life Cycle Processes (ISO 12207). Geneva, Switzerland: International Organization for Standardization/International Electrotechnical Commission, 1995.

JLC 96: Joint Logistics Commanders. Practical Software Measurement: A Guide to Objective Program Insight. Newport, RI: Department of the Navy, Naval Undersea Warfare Center, 1996.

Juran 88: Juran, J. M. Juran on Planning for Quality. New York, NY: Macmillan, 1988.

Masters 95: Masters, S. & Bothwell, C. CMM Appraisal Framework (CMU/SEI-95-TR-001). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, February 1995.

Paulk 93: Paulk, M. C.; Curtis, B.; Chrissis, M. B.; & Weber, C. V. Capability Maturity Model for Software, Version 1.1 (CMU/SEI-93-TR-024, ADA 263403). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 1993.

Paulk 98: Paulk, Mark. Software Capability Maturity Model (SW-CMM®) Case Study Bibliography [online]. Available WWW, 1998.

SEI 97: Software Engineering Institute. Software CMM, Version 2 (Draft C). Available WWW, Oct. 22, 1997.

SEI 98: Software Engineering Institute. CMMI A-Specification, Version 1.3. Available WWW, July 15, 1998.

SPMN 97: Software Program
Managers Network. Program Managers Guide to Software Acquisition Best Practices, Version V.2. Available WWW, April 1997.

SUPPORT

CONFIGURATION MANAGEMENT
Support

The purpose of Configuration Management is to establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.

Practices by Goal:

SG 1 Establish Baselines
Baselines of identified work products are established and maintained.

SP 1.1-1 Identify Configuration Items
Identify the configuration items, components, and related work products that will be placed under configuration management.

SP 1.2-1 Establish a Configuration Management System
Establish and maintain a configuration management and change management system for controlling work products.

SP 1.3-1 Create or Release Baselines
Create or release baselines for internal use and for delivery to the customer.

SG 2 Track and Control Changes
Changes to the work products under configuration management are tracked and controlled.

SP 2.1-1 Track Changes
Track change requests for the configuration items.

SP 2.2-1 Control Changes
Control changes to the content of configuration items.

SG 3 Establish Integrity
Integrity of baselines is established and maintained.

SP 3.1-1 Establish Configuration Management Records
Establish and maintain records describing configuration items.

SP 3.2-1 Perform Configuration Audits
Perform configuration audits to maintain integrity of the configuration baselines.

PROCESS AND PRODUCT QUALITY ASSURANCE
Support

The purpose of Process and Product Quality Assurance is to provide staff and management with objective insight into the processes and associated work products.
Practices by Goal:

SG 1 Objectively Evaluate Processes and Work Products
Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated.

SP 1.1-1 Objectively Evaluate Processes
Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures.

SP 1.2-1 Objectively Evaluate Work Products and Services
Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures.

SG 2 Provide Objective Insight
Noncompliance issues are objectively tracked and communicated, and resolution is ensured.

SP 2.1-1 Communicate and Ensure Resolution of Noncompliance Issues
Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers.

SP 2.2-1 Establish Records
Establish and maintain records of the quality assurance activities.

MEASUREMENT AND ANALYSIS
Support

The purpose of Measurement and Analysis is to develop and sustain a measurement capability that is used to support management information needs.

Practices by Goal:

SG 1 Align Measurement and Analysis Activities
Measurement objectives and practices are aligned with identified information needs and objectives.

SP 1.1-1 Establish Measurement Objectives
Establish and maintain measurement objectives that are derived from identified information needs and objectives.

SP 1.2-1 Specify Measures
Specify measures to address the measurement objectives.

SP 1.3-1 Specify Data Collection and Storage Procedures
Specify how measurement data will be obtained and stored.

SP 1.4-1 Specify Analysis Procedures
Specify how measurement data will be analyzed and reported.

SG 2 Provide Measurement Results
Measurement results that address identified information needs and objectives are provided.

SP 2.1-1 Collect Measurement Data
Obtain specified measurement data.

SP 2.2-1 Analyze Measurement Data
Analyze and interpret measurement data.

SP 2.3-1 Store Data and Results
Manage and store measurement data, measurement specifications, and analysis results.

SP 2.4-1 Communicate Results
Report results of measurement and analysis activities to all affected stakeholders.

CAUSAL ANALYSIS AND RESOLUTION
Support

The purpose of Causal Analysis and Resolution is to identify causes of defects and other problems and take action to prevent them from occurring in the future.

Practices by Goal:

SG 1 Determine Causes of Defects
Root causes of defects and other problems are systematically determined.

SP 1.1-1 Select Defect Data for Analysis
Select the defects and other problems for analysis.

SP 1.2-1 Analyze Causes
Perform causal analysis of selected defects and other problems and propose actions to address them.

SG 2 Address Causes of Defects
Root causes of defects and other problems are systematically addressed to prevent their future occurrence.

SP 2.1-1 Implement the Action Proposals
Implement the selected action proposals that were developed in causal analysis.

SP 2.2-1 Evaluate the Effect of Changes
Evaluate the effect of changes on process performance.

SP 2.3-1 Record Data
Record causal analysis and resolution data for use across the project and organization.

DECISION ANALYSIS AND RESOLUTION
Support

The purpose of Decision Analysis and Resolution is to make decisions using a structured approach that evaluates identified alternatives against established criteria.

Practices by Goal:

SG 1 Evaluate Alternatives
Decisions are based on an evaluation of alternatives using established criteria.

SP 1.1-1 Establish and Use Guidelines for Decision Analysis
Establish and use guidelines to determine which issues are subject to a structured decision analysis and resolution process.

SP 1.2-1 Select Evaluation Techniques
Select the decision-making techniques.

SP 1.3-1 Establish Evaluation Criteria
Establish the evaluation criteria and their relative ranking.

SP 1.4-1 Identify Proposed Alternatives
Identify alternative solutions to issues.

SP 1.5-1 Evaluate Alternatives
Evaluate alternative solutions using the documented criteria.

SP 1.6-1 Select Solutions
Select solutions from the alternatives based on the evaluation criteria.

GENERIC GOALS AND GENERIC PRACTICES

GG 1 Achieve Specific Goals
The process supports and enables achievement of the specific goals of the process area by transforming identifiable input work products to produce identifiable output work products.

GP 1.1 Identify Work Scope
Identify the scope of the work to be performed and work products or services to be produced, and communicate this information to those performing the work.

GP 1.2 Perform Base Practices
Perform the base practices of the process to develop work products and provide services to achieve the specific goals of the process area.

GG 2 Institutionalize a Managed Process
The process is institutionalized as a managed process.

GP 2.1 Establish an Organizational Policy
Establish and maintain an organizational policy for planning and performing the process.

GP 2.2 Plan the Process
Establish and maintain the requirements and objectives, and plan for performing the process.

GP 2.3 Provide Resources
Provide adequate resources for performing the process, developing the work products, and providing the services of the process.

GP 2.4 Assign Responsibility
Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process.
GP 2.5 Train People
Train the people performing or supporting the process as needed.

GP 2.6 Manage Configurations
Place designated work products of the process under appropriate levels of configuration management.

GP 2.7 Identify and Involve Relevant Stakeholders
Identify and involve the relevant stakeholders as planned.

GP 2.8 Monitor and Control the Process
Monitor and control the process against the plan and take appropriate corrective action.

GP 2.9 Objectively Evaluate Adherence
Objectively evaluate adherence of the process and the work products and services of the process to the applicable requirements, objectives, and standards, and address noncompliance.

GP 2.10 Review Status with Higher-Level Management
Review the activities, status, and results of the process with higher-level management and resolve issues.

GG 3 Institutionalize a Defined Process
The process is institutionalized as a defined process.

GP 3.1 Establish a Defined Process
Establish and maintain the description of a defined process.

GP 3.2 Collect Improvement Information
Collect work products, measures, measurement results, and improvement information derived from planning and performing the process to support the future use and improvement of the organization's processes and process assets.

GG 4 Institutionalize a Quantitatively Managed Process
The process is institutionalized as a quantitatively managed process.

GP 4.1 Establish Quality Objectives
Establish and maintain quantitative objectives for the process about quality and process performance based on customer needs and business objectives.

GP 4.2 Stabilize Subprocess Performance
Stabilize the performance of one or more subprocesses of the process to determine its ability to achieve the established quantitative quality and process performance objectives.

GG 5 Institutionalize an Optimizing Process
The process is institutionalized as an optimizing process.

GP 5.1 Ensure
Continuous Process Improvement
Ensure continuous improvement of the process in fulfilling the relevant business goals of the organization.

GP 5.2 Correct Common Cause of Problems
Identify and correct the root causes of defects and other problems in the process.

E CMMI Project Participants

The following people were involved in the CMMI project as product development team members, steering group members, or members of the stakeholder/reviewer team:

Ahern, Dennis; Albert, Cecilia; Allgood, Bruce; Angstadt, Kim; Armstrong, Jim; Austin, Darryl; Bailey, Mike; Baker, Michele; Barsotti, Dennis; Basili, Victor; Bate, Roger; Baxter, Brent; Bennett, Dan; Billi, Joseph; Blasewitz, Bob; Blazy, Louis; Blyler, John; Briganti, Kristine; Brown, Alan; Brown, Leroy; Capell, Peter; Carter, Dennis; Castellano, Dave; Cattan, Denise; Cavanaugh, Mark; Cepeda, Sandra; Chittister, Clyde; Chrissis, Mary Beth; Clouse, Aaron; Cole, David; Conrad, Tom; Consiglio, John; Costello, Joe; Coyle, Thomas; Craig, Rushby; Criss, William; Cukor, Jeff; Denny, Barbara; DeWolf, Barton; Doran, Terry; Draper, Geoff; DuBlanica, Walt; Dulai, Ajmel; Dunaway, Donna; Dutton, Jeffrey L.; Dzmura, Lucas; Eagan, Robert; Egeland, Jim; El-Emam, Khaled; Eskenasy, Antonio; Fantazier, Bob; Farinello, Joe; Ferguson, Jack; Fritz, Nick; Gaeta, Rob; Goldenson, Dennis; Graffius, Joe; Gramoy, Beth; Gray, Lewis; Green, Dan; Gross, Jon; Guerin, Joan; Gunning, Kelly; Haas, Sue; Haggerty, Chad; Hayes, Will; Hefner, Rick; Heijstek, Andre; Herman, Jeff; Hodyke, Andrew; Hollenbach, Craig; Ibrahim, Linda; Irion-Talbot, Wendy; Iyer, Seshadri; Jacobs, Debbie; Jarzombek, Joe; Johnson, Martha; Jones, Lawrence; Kansala, Kari; Karandikar, Harsh; Kayuha, Bob; Keeler, Kristi; Kellner, Marc; Kellogg, David; Kelly, Susanne; Kirschbaum, Alan; Kitson, Dave; Kitson, Loretta J.; Kohl, Ron; Konrad, Mike; Kopcho, Joanne; Kordik, John; Kormos, Christina; Kosco, Don; Koshetar, Paul; Langhout, Jacquelyn; Lanier, Kelly; Lentz, Robert; Le, Hien;
Loebig, Kathleen; Madhavan, Pg; Malpass, Peter; Marciniak, John; Martin, Rich; Matthews, Jeanne; McConnell, David; McNeill, Bob; McSteen, Bill; Menezes, Winifred; Midha, Anil; Mogilensky, Judah; Moon, Jane; Moore, James; Moore, Richard; Mosley, Mark; Mounts, Darryl; Nash, Dan; Nauman, Matt; Newberry, George; Norimatsu, So; Nygren, Steve; Ourada, Gerald; Parker, Thomas; Parry, Thomas; Patterson, Bob; Paulk, Mark; Peterson, Bill; Pflugrad, Alan; Phillips, David M. (Mike); Pillai, R.; Pinkney, Lisa; Pomietto, Robert J.; Prange, Mark; Raphael, Richard; Rassa, Bob; Rawat, A.; Richins, Kevin; Richter, Karen; Riddle, Bill; Rogoway, Paul; Salomon, Arthur; Sautter, John; Schoening, Bill; Scott, Terry; Sherer, Wayne; Shioya, Kazunori; Shrum, Sandy; Shuster, David; Sleder, Al; Smith, Dudrey; Steiner, Cliff; Stewart, Lee; Stratton, Duane; Svolou, Agapi; Tady, Carolyn; Tavan, Steve; Taylor, Guy D.; Totty, Lonnie; Trebbien-Nielsen, Claus; Tyson, Barbara A.; Vernick, Judy A.; Waina, Richard; Weber, Charles; Wells, Curt; Weszka, Joan; White, Barbara; White, David; Wilson, Hal; Wolf, Gary; Yeats, James; Zubrow, Dave.

F Equivalent Staging

Equivalent staging is a target staging that is defined so that the results of the target staging can be equivalent to the maturity levels of the staged representation. Such staging permits benchmarking of progress between organizations, enterprises, and projects, regardless of the CMMI representation used.

Figure 1 shows the target profiles that must be achieved when using the continuous representation in order to be equivalent to a maturity level when using a staged representation. The columns of the figure have the following meanings:

- "Category" is the category to which the process area is assigned.
- "Name" is the full name of the process area.
- "ML" is the maturity level assignment of the process area in the staged representation.
- "CL1," "CL2," "CL3," "CL4," and "CL5" are headings for the columns assigned to capability levels in the continuous
representation.

The shaded areas in the capability level columns indicate target profiles that are equivalent to maturity levels in the staged representation.

- To achieve Target Profile 2, the first seven process areas (Requirements Management to Configuration Management) must have satisfied Capability Levels 1 and 2.
- To achieve Target Profile 3, the first 18 process areas (Requirements Management to Organizational Training) must have satisfied Capability Levels 1, 2, and 3.
- To achieve Target Profile 4, the first 20 process areas (Requirements Management to Quantitative Project Management) must have satisfied Capability Levels 1, 2, and 3.
- To achieve Target Profile 5, all of the process areas must have satisfied Capability Levels 1, 2, and 3.

Name                                        Abbr   ML
Requirements Management                     REQM   2
Measurement and Analysis                    MA     2
Project Monitoring and Control              PMC    2
Project Planning                            PP     2
Process and Product Quality Assurance       PPQA   2
Supplier Agreement Management               SAM    2
Configuration Management                    CM     2
Decision Analysis and Resolution            DAR    3
Product Integration                         PI     3
Requirements Development                    RD     3
Technical Solution                          TS     3
Validation                                  VAL    3
Verification                                VER    3
Organizational Process Definition           OPD    3
Organizational Process Focus                OPF    3
Integrated Project Management               IPM    3
Risk Management                             RSKM   3
Organizational Training                     OT     3
Organizational Process Performance          OPP    4
Quantitative Project Management             QPM    4
Organizational Innovation and Deployment    OID    5
Causal Analysis and Resolution              CAR    5

Note: Target Profile N is equivalent to Maturity Level N in the Staged Representation.

Figure 1: Target Profiles and Equivalent Staging

To reach maturity levels 4 and 5, specific process areas are required to attain capability levels 4 and 5. The maturity level 4 process areas operate on the selection of the organization's subprocesses to be stabilized and quantitatively understood, based on the business objectives of the organization. Users of the continuous representation may wish to extend their capability level target profiles for individual process areas above capability level 3. This extension is assessable if a valid mapping of subprocesses to process areas has been constructed, so that you can tell whether a process area has been placed under quantitative management. Some past users of continuous models have found it beneficial to begin with the engineering process areas. The correlation of these process areas with maturity level three is due to equivalence with the staged maturity levels and is not intended to preclude earlier application.
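The target-profile rules above are mechanical, so they can be expressed as a small program. The sketch below is illustrative only and is not part of the CMMI model: the function name, data layout, and the simplifying assumption that a target profile is "satisfied" when each listed process area reaches the stated minimum capability level are all the author's, and the nuance about maturity levels 4 and 5 applying quantitative management to selected subprocesses is deliberately left out.

```python
# Illustrative sketch: deriving an equivalent staged maturity level from a
# continuous-representation capability-level profile, per Figure 1.

# Process areas in Figure 1 order: (abbreviation, staged maturity level).
PROCESS_AREAS = [
    ("REQM", 2), ("MA", 2), ("PMC", 2), ("PP", 2), ("PPQA", 2),
    ("SAM", 2), ("CM", 2),
    ("DAR", 3), ("PI", 3), ("RD", 3), ("TS", 3), ("VAL", 3), ("VER", 3),
    ("OPD", 3), ("OPF", 3), ("IPM", 3), ("RSKM", 3), ("OT", 3),
    ("OPP", 4), ("QPM", 4),
    ("OID", 5), ("CAR", 5),
]

# Target profiles: (maturity level, number of leading process areas covered,
# minimum capability level each of them must reach).
TARGET_PROFILES = [
    (2, 7, 2),    # Target Profile 2: REQM..CM at CL 2
    (3, 18, 3),   # Target Profile 3: REQM..OT at CL 3
    (4, 20, 3),   # Target Profile 4: REQM..QPM at CL 3
    (5, 22, 3),   # Target Profile 5: all process areas at CL 3
]

def equivalent_maturity_level(profile):
    """profile maps a process-area abbreviation to its achieved capability
    level. Returns the highest maturity level whose target profile is met."""
    achieved = 1  # no target profile met corresponds to maturity level 1
    for ml, count, min_cl in TARGET_PROFILES:
        if all(profile.get(abbr, 0) >= min_cl
               for abbr, _ in PROCESS_AREAS[:count]):
            achieved = ml
        else:
            break  # profiles are cumulative; stop at the first unmet one
    return achieved

# Example: capability level 2 in the seven maturity-level-2 process areas
# corresponds to maturity level 2.
profile = {abbr: 2 for abbr, _ in PROCESS_AREAS[:7]}
print(equivalent_maturity_level(profile))  # 2
```

Because the profiles are cumulative, the loop stops at the first unmet profile; extending individual process areas above capability level 3, as the text discusses, would not change the result computed here.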
