
Managing IT performance to create business value


DOCUMENT INFORMATION

Pages: 395
Size: 5.59 MB

Content

Managing IT Performance to Create Business Value
Jessica Keyes

CRC Press, Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2016 by Taylor & Francis Group, LLC. CRC Press is an imprint of Taylor & Francis Group, an Informa business. No claim to original U.S. Government works. Printed on acid-free paper. Version date: 20160504.

International Standard Book Number-13: 978-1-4987-5285-5 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged, please write and let us know so we may rectify it in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com and the CRC Press Web site at http://www.crcpress.com

Contents

PREFACE
ACKNOWLEDGMENTS
AUTHOR

CHAPTER 1  DESIGNING PERFORMANCE-BASED STRATEGIC PLANNING SYSTEMS
  IT Roadmap
  Strategic Planning
  Strategy Implementation
  Implementation Problems
  In Conclusion
  References

CHAPTER 2  DESIGNING PERFORMANCE MANAGEMENT AND MEASUREMENT SYSTEMS
  Developing the QI Plan
  Balanced Scorecard
  Establishing a Performance Management Framework
  Developing Benchmarks
  Looking Outside the Organization
  Process Mapping
  In Conclusion
  Reference

CHAPTER 3  DESIGNING METRICS
  What Constitutes a Good Metric?
  IT-Specific Measures
  System-Specific Metrics
  Financial Metrics
  Initial Benefits Worksheet
  Continuing Benefits Worksheet
  Quality Benefits Worksheet
  Other Benefits Worksheet
  ROI Spreadsheet Calculation
  Examples of Performance Measures
  In Conclusion
  Project/Process Measurement Questions
  Organizational Measurement Questions
  References

CHAPTER 4  ESTABLISHING A SOFTWARE MEASUREMENT PROGRAM
  Resources, Products, Processes
  Direct and Indirect Software Measurement
  Views of Core Measures
  Strategic View
  Tactical View
  Application View
  Use a Software Process Improvement Model
  Organization Software Measurement
  Project Software Measurement
  Software Engineering Institute Capability Maturity Model
  Identify a Goal-Question-Metric (GQM) Structure
  Develop a Software Measurement Plan
  Example Measurement Plan Standard
  In Conclusion

CHAPTER 5  DESIGNING PEOPLE IMPROVEMENT SYSTEMS
  Impact of Positive Leadership
  Motivation
  Recruitment
  Employee Appraisal
  Automated Appraisal Tools
  Dealing with Burnout
  In Conclusion
  References

CHAPTER 6  KNOWLEDGE AND SOCIAL ENTERPRISING PERFORMANCE MEASUREMENT AND MANAGEMENT
  Using Balanced Scorecards to Manage Knowledge-Based Social Enterprising
  Adopting the Balanced Scorecard
  Attributes of Successful Project Management Measurement Systems
  Measuring Project Portfolio Management
  Project Management Process Maturity Model (PM)2 and Collaboration
  In Conclusion
  References

CHAPTER 7  DESIGNING PERFORMANCE-BASED RISK MANAGEMENT SYSTEMS
  Risk Strategy
  Risk Analysis
  Risk Identification
  Sample Risk Plan
  RMMM Strategy
  Risk Avoidance
  Quantitative Risk Analysis
  Risk Checklists
  IT Risk Assessment Frameworks
  Risk Process Measurement
  In Conclusion
  Reference

CHAPTER 8  DESIGNING PROCESS CONTROL AND IMPROVEMENT SYSTEMS
  IT Utility
  Getting to Process Improvements
  Enhancing IT Processes
  New Methods
  Process Quality
  Process Performance Metrics
  Shared First
  Step 1: Inventory, Assess, and Benchmark Internal Functions and Services
  Tasks
  Step 2: Identify Potential Shared Services Providers
  Tasks
  Step 3: Compare Internal Services versus Shared Services Providers
  Step 4: Make the Investment Decision
  Step 5: Determine Funding Approach
  Step 6: Establish Service-Level Agreements
  Step 7: Postdeployment Operations and Management
  Configuration Management
  CM and Process Improvement
  Implementing CM in the Organization
  In Conclusion
  References

CHAPTER 9  DESIGNING AND MEASURING THE IT PRODUCT STRATEGY
  Product Life Cycle
  Product Life-Cycle Management
  Product Development Process
  Continuous Innovation
  Measuring Product Development
  In Conclusion
  References

CHAPTER 10  DESIGNING CUSTOMER VALUE SYSTEMS
  Customer Intimacy and Operational Excellence
  Customer Satisfaction Survey
  Using Force Field Analysis to Listen to Customers
  Customer Economy
  Innovation for Enhanced Customer Support
  Managing for Innovation
  In Conclusion
  References

APPENDIX I
APPENDIX II
APPENDIX III
APPENDIX IV
APPENDIX V
APPENDIX VI
APPENDIX VII
APPENDIX VIII
APPENDIX IX
APPENDIX X
APPENDIX XI
APPENDIX XII
APPENDIX XIII
APPENDIX XIV
APPENDIX XV
INDEX

Preface

One of the reasons why information technology (IT) projects so often fail is that the return on investment rarely drives the technology investment decision. It is not always the best idea that wins. Often, the project that wins the funding simply did a better job of marketing the idea and the team behind it. An even more important reason for all of the IT project chaos is that there is rarely any long-term accountability (i.e., a lack of performance management or measurement) in technology. There are literally hundreds of processes taking
place simultaneously in an organization, each creating value in some way. IT performance management and measurement is about pushing the performance of the automation and maintenance of these processes in the right direction, ultimately to minimize the risk of failure.

Every so often, a hot new performance management technique appears on the horizon. Complementary to the now familiar agile development methodology, agile performance management is designed for an environment where work is more collaborative, social, and faster moving than ever before. As should be expected from a methodology that stems from agile development, the most important features of agile performance management are a development focus and regular check-ins. Toward this end, this methodology stresses more frequent feedback; managers conducting regular check-ins with team members; crowdsourcing feedback from project team members and managers; social recognition that encourages people to do their best work; an emphasis on skills power, as opposed to the usual rigid hierarchical power; tight integration with development planning; and just-in-time learning. The goal is to improve the "performance culture" of the organization. Unsurprisingly, agile performance management is just a new name for a set of methodologies that have long been used by forward-thinking IT managers. Knowledge management and social enterprising methodologies, which we cover in this book, have always had a real synergy with performance management and measurement.

This volume thoroughly explains the concepts behind performance management and measurement from an IT "performance culture" perspective. It provides examples, case histories, and current research on critical issues such as performance measurement and management, continuous process improvement, knowledge management, risk management, benchmarking, metrics selection, and people management.

Questionnaires/Surveys

…may be unable to answer them. For example:
a. (poor) Each time I visit the library, the waiting line is long.
b. (better) Generally, the waiting line in the library is long.

Are the questions biased? Biased questions influence the customer to respond in a manner that does not correctly reflect his or her opinion. For example:
a. (poor) How much do you like our library?
b. (better) Would you recommend our library to a friend?

Are the questions objectionable? Usually, this problem can be overcome by asking the question in a less direct way. For example:
a. (poor) Are you living with someone?
b. (better) How many people, including yourself, are in your household?

Are the questions double-barreled? Two separate questions are sometimes combined into one. The customer is forced to give a single response, which would, of course, be ambiguous. For example:
a. (poor) The library is attractive and well maintained.
b. (better) The library is attractive.

Are the answer choices mutually exclusive? The answer categories must be mutually exclusive, and the respondent should not feel forced to choose more than one. For example:
a. (poor) Scale range: 1, 2–5, 5–9, 9–13, 13 or over
b. (better) Scale range: 0, 1–5, 6–10, 11–15, 16 or over

Are the answer choices mutually exhaustive? The response categories provided should be exhaustive; they should include all the possible responses that might be expected. For example:
a. (poor) Scale range: 1–5, 6–10, 11–15, 16–20
b. (better) Scale range: 0, 1–5, 6–10, 11–15, 16 or over

A quick way to test these last two properties mechanically is sketched below.
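The following is a minimal sketch, not anything from the book: it assumes each answer choice is encoded as an inclusive (low, high) pair, with None standing for an open-ended "or over" choice, and it checks a 0–100 response domain chosen purely for illustration.

```python
# Minimal sketch: verify that scale choices are mutually exclusive
# (no response value falls into two choices) and mutually exhaustive
# (every plausible response value falls into some choice).
# The (low, high) encoding and the 0-100 domain are assumptions.

def check_scale(choices, domain_max=100):
    covered = set()
    for low, high in choices:
        high = domain_max if high is None else high   # None = "or over"
        span = set(range(low, high + 1))
        if covered & span:
            return "not mutually exclusive: choices overlap"
        covered |= span
    if covered != set(range(domain_max + 1)):
        return "not mutually exhaustive: some values fit no choice"
    return "ok"

# The "poor" scale above: 1, 2-5, 5-9, 9-13, 13 or over
print(check_scale([(1, 1), (2, 5), (5, 9), (9, 13), (13, None)]))
# prints: not mutually exclusive: choices overlap  (5, 9, and 13 repeat)

# The "better" scale: 0, 1-5, 6-10, 11-15, 16 or over
print(check_scale([(0, 0), (1, 5), (6, 10), (11, 15), (16, None)]))
# prints: ok
```

Note that the "poor" scale fails the exclusivity test first; it would also fail the exhaustiveness test, since a response of 0 fits no choice.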
Tallying the responses will provide a "score" that assists in making a decision that requires the use of quantifiable information. When using interval scales, keep in mind that not all questions will carry the same weight; hence, it is a good idea to use a weighted average formula during the calculation. To do this, assign a "weight," or level of importance, to each question. For example, the aforementioned question might be assigned a weight of 5 on a scale of 1 to 5, meaning that it is a very important question. On the other hand, a question such as "Was the training center comfortable?" might carry a weight of only 1. The weighted score for a question is calculated by multiplying the weight by the score (w * s) to get the final score; thus, the formula is s_new = w * s.
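To make the weighting arithmetic concrete, here is a minimal sketch of the s_new = w * s calculation, extended with an overall average that divides by the total weight. The questions, scores, and weights are hypothetical, and the normalization step is a common convention rather than something this chapter prescribes.

```python
# Minimal sketch of weighted questionnaire scoring: each question's
# score s (1-5 interval scale) is multiplied by its weight w (1-5
# importance), i.e., s_new = w * s; the overall result is the
# weighted average sum(w * s) / sum(w). All data are hypothetical.

responses = [
    # (question, score s, weight w)
    ("Was the staff helpful?",                4, 5),  # very important
    ("Was the training center comfortable?",  3, 1),  # less important
    ("Would you recommend us to a friend?",   5, 4),
]

for question, s, w in responses:
    print(f"{question}  s_new = {w * s}")

total_weight = sum(w for _, _, w in responses)
weighted_average = sum(w * s for _, s, w in responses) / total_weight
print(f"Weighted average: {weighted_average:.2f}")  # (20 + 3 + 20) / 10 = 4.30
```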
There are several problems that might result from a poorly constructed questionnaire. Leniency is caused by respondents who grade nonobjectively, in other words, too easily. Central tendency occurs when respondents rate everything as average. The halo effect occurs when the respondent carries his or her good or bad impression from one question to the next. The first two patterns leave visible traces in the tallies, as sketched below.
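Once responses are tallied, these patterns can be screened for mechanically. The sketch below is an illustration built on assumptions (a 1-to-5 scale, invented thresholds, and made-up sample data), not a technique the book specifies; detecting the halo effect would additionally require looking at question order, which is omitted here.

```python
# Minimal sketch: flag possible leniency (consistently high marks)
# and central tendency (answers that barely vary) in per-respondent
# scores on a 1-5 scale. Thresholds below are illustrative guesses.
from statistics import mean, pstdev

def screen_respondent(scores, scale_max=5):
    flags = []
    if mean(scores) > 0.9 * scale_max:   # nearly all top marks
        flags.append("possible leniency")
    if pstdev(scores) < 0.5:             # almost no variation
        flags.append("possible central tendency")
    return flags or ["looks ok"]

respondents = {
    "R1": [5, 5, 5, 5, 3],   # grades almost everything very easily
    "R2": [3, 3, 3, 3, 3],   # rates everything as average
    "R3": [2, 5, 3, 4, 1],   # varied, engaged responses
}
for name, scores in respondents.items():
    print(name, screen_respondent(scores))
```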
There are several methods that can be used to successfully deploy a survey. The easiest and most accurate is to gather all respondents in a conference room and hand out the survey. For the most part, this is not realistic, so other approaches are more appropriate. E-mail and traditional mail are two methodologies that work well, although you often have to supply an incentive (i.e., a prize) to get respondents to fill out those surveys on a timely basis. Web-based surveys (Internet and intranet) are becoming increasingly popular, as they enable the inclusion of demos, audio, and video. For example, a web-based survey on what type of user interface is preferable could have hyperlinks to demos or screenshots of the choices.

Observation

Observation is an important tool that can provide a wealth of information. There are two forms of observation: silent and directed. In silent observation, the analyst merely sits on the sidelines, with pen and pad, and observes what is happening. If it is suitable, a tape recorder or video recorder can record what is being observed; however, this is not recommended if the net result will be several hours of random footage. Silent observation is best used to capture the spontaneous nature of a particular process or procedure, for example:

- When customers will be interacting with staff
- During group meetings
- On the manufacturing floor
- In the field

Directed observation provides the analyst with a chance to micro-control a process or procedure so that it is broken down into its observable parts. At one accounting firm, a tax system was being developed. The analysts requested that several senior tax accountants be coupled with a junior staff member. The group was given a problem as well as all the manuals and materials they needed. The junior accountant sat at one end of the table with the pile of manuals and forms, while the senior tax accountants sat at the other end. A tough tax problem was posed. The senior tax accountants were directed to think through the process and then direct the junior member to follow through on their directions to solve this problem. The catch was that the senior members could not walk over to the junior person nor touch any of the reference guides. The whole exercise had to be verbal, using just their memories and expertise. The entire process was videotaped. The net result was that the analyst had a complete record of how to perform one of the critical functions of the new system.

Participation

The flip side of observation is participation. Actually becoming a member of the staff, and thereby learning exactly what it is that the staff does so that it might be automated, is an invaluable experience.

Documentation

It is logical to assume that there will be a wide variety of documentation available to the analyst. This includes, but is not limited to, the following:

1. Documentation from existing systems. This includes requirements and design specifications, program documentation, user manuals, and help files. This also includes whatever "wish lists" have been developed for the existing system.
2. Archival information
3. Policies and procedures manuals
4. Reports
5. Memos
6. Standards
7. E-mail
8. Minutes from meetings
9. Government and other regulatory guidelines and regulations
10. Industry or association manuals, guidelines, and standards (e.g., accountants are guided not only by in-house "rules and regulations," but also by industry and other rules and regulations)

Brainstorming

In a brainstorming session, you gather together a group of people, create a stimulating and focused atmosphere, and let people come up with ideas without risk of being ridiculed. Even seemingly stupid ideas may turn out to be "golden."

Focus Groups

Focus groups are derived from marketing. These are structured sessions in which a group of stakeholders is presented with a solution to a problem and then closely questioned on their views about that solution.

Index

A

AHP, see Analytic hierarchy process (AHP)
AIAG, see Automotive industry action group (AIAG)
Analytic hierarchy process (AHP), 33–36
Anytime Feedback, 83
APIs, see Application programming interfaces (APIs)
Application programming interfaces (APIs), 112
Automated appraisal tools, 82–83
Automotive industry action group (AIAG), 101

B

Balanced scorecards, 18–20, 90–93
  metrics, 309–315
BCR, see Benefit–cost ratio (BCR)
Behavioral competencies, 273–278
Benchmarking, 22–25
  and brainstorming, 368
  and documentation, 368
  and focus groups, 368
  and interviewing, 361–364
  of metrics, 32–36
  and observation, 367
  and participation, 368
  and questionnaires/surveys, 364–367
Benefit–cost ratio (BCR), 46
BetterWorks, 82
BPM, see Business process management (BPM)
Brainstorming, and benchmarking, 368
Break-even analysis, 46
Break-even point, 46
Business Balanced Scorecard On-Line, 137
Business process improvements, 131
Business process management (BPM), 130
Business risk RMMM strategy, 112–113
Business risks, 107

C

Capability maturity model (CMM), 61
Capco, 82
CES, see Cost element structure (CES)
Chief information officers (CIOs), 2, 39, 132
CIOs, see Chief information officers (CIOs)
CM, see Configuration management (CM)
CMM, see Capability maturity model (CMM)
Comcast Interactive,
Competitive analysis, 26–27
Configuration management (CM), 133, 145–146
  metrics, 256
Continuous improvement, 131
Continuous innovation, 155–159
Cost–benefit analysis, 45
Cost element structure (CES), 289
Covey, Stephen, 84
Critical success factors (CSFs), 113–115, 136
CSFs, see Critical success factors (CSFs)
Customer economy, 166–167
Customer intimacy, and operational excellence, 161–162
Customer satisfaction survey, 162–164

D

DARPA, see Defense Advanced Research Project Agency (DARPA)
Data analytics, 77
Data collection, 361
Decision trees, 119
Defense Advanced Research Project Agency (DARPA), 154
Define, measure, analyze, improve, and control (DMAIC), 130
Design for Six Sigma (DFSS), 130
DevOps, 132
DFSS, see Design for Six Sigma (DFSS)
Direct software measurement program, 59
DirectTV,
DMAIC, see Define, measure, analyze, improve, and control (DMAIC)
Documentation, and benchmarking, 368

E

Earned-value management (EVM), 52
"Easy to do business with" (ETDBW), 166
Economic value added (EVA), 48
Employee appraisal, 78–81
Engineering process improvements, 131
Enterprise resource planning (ERP), 41–44, 265–266
ERP, see Enterprise resource planning (ERP)
ETDBW, see "Easy to do business with" (ETDBW)
EVA, see Economic value added (EVA)
EVM, see Earned-value management (EVM)

F

Factor analysis of information risk (FAIR), 122
FADE, see Focus, analyze, develop, execute, and evaluate (FADE)
Failure modes and effects analysis (FMEA), 116
FAIR, see Factor analysis of information risk (FAIR)
Financial metrics, 45–52
Five Forces model,
FMEA, see Failure modes and effects analysis (FMEA)
Focus, analyze, develop, execute, and evaluate (FADE), 17
Focus groups, and benchmarking, 368
Force field analysis, and VOC, 164–165
G

Goal-question-metric (GQM) paradigm, 63–64
GQM, see Goal-question-metric (GQM) paradigm

H

Holacracy, 71

I

IBS, see Information-based strategy (IBS)
IDEAL model, see Initiate, diagnose, establish, act, and leverage (IDEAL) model
IEEE, see Institute of Electrical and Electronics Engineers (IEEE)
Indirect software measurement program, 59
Information-based strategy (IBS), 172
Infrastructure as code, 133
Initiate, diagnose, establish, act, and leverage (IDEAL) model, 61
Innovation
  continuous, 155–159
  for enhanced customer support, 167–170
  managing for, 170–172
Institute of Electrical and Electronics Engineers (IEEE), 258–264
Intelligence quotient (IQ), 77
Interviewing, and benchmarking, 361–364
IQ, see Intelligence quotient (IQ)
IT risk assessment frameworks, 120–123
IT-specific metrics, 36–41, 255–271
IT staff competency survey, 217–221
IT utility, 127–130

J

JIT, see Just-in-time (JIT)
Just-in-time (JIT), 114

K

Kaizen, 132
KCO, see Key control over operations (KCO) model
Key control over operations (KCO) model, 317
Key performance indicators (KPIs), 93
K&IM, see Knowledge and information management (K&IM)
KM, see Knowledge management (KM)
Knowledge and information management (K&IM), 333–337
Knowledge-based social enterprising
  and balanced scorecards, 90–93
  measuring project portfolio management, 95–99
  overview, 89
  project management measurement systems, 93–95
  Project Management Process Maturity Model (PM)2 model, 99–102
Knowledge management (KM), 317–331
KPIs, see Key performance indicators (KPIs)

L

Lean software development (LSD), 130
LSD, see Lean software development (LSD)

M

Malcolm Baldrige National Quality Award, 97
Measurement case, 66
Measurement plan, 66–68, 279–284
Metrics
  balanced scorecard, 309–315
  benchmarking initiative of, 32–36
  configuration management, 256
  examples of performance measures, 52–54
  financial, 45–52
  Institute of Electrical and Electronics Engineers (IEEE) defined, 258–264
  IT-specific measures, 36–41, 255–271
  for knowledge management (KM), 317–331
  overview, 31–32
  process maturity framework, 256–258
  product and process, 255–256
  selected performance, 264
  Software Technology Support Center (STSC), 223–254
  system-specific, 41–45
Monte Carlo methods, 119
Motivation, 75–76

N

National Consumer Electronics,
Net present value (NPV), 51–52
New product development, 153–154
New York Times, 83
NPV, see Net present value (NPV)

O

Observation, and benchmarking, 367
OCTAVE, see Operationally critical threat, asset, and vulnerability evaluation (OCTAVE)
ODI, see Outcome-driven innovation (ODI)
OLTP, see Online transaction processing (OLTP)
Online transaction processing (OLTP), 44
Operationally critical threat, asset, and vulnerability evaluation (OCTAVE), 122
Organization software measurement, 61–62
Outcome-driven innovation (ODI), 167

P

Participation, and benchmarking, 368
PC, see Production capability (PC)
People improvement systems
  automated appraisal tools, 82–83
  employee appraisal, 78–81
  impact of positive leadership, 73–75
  motivation, 75–76
  overview, 71–73
  recruitment, 76–78
  and workplace stress, 83–86
Performance-based strategic planning systems
  overview, 1–2
  strategic planning, 2–5
  strategy implementation, 5–11
  technology roadmap,
Performance management and measurement systems
  balanced scorecard, 18–20
  competitive analysis, 26–27
  developing benchmarks, 22–25
  developing QI plan, 15–18
  establishing, 20–22
  overview, 13–15
  process mapping, 27–29
PMBOK, see Project Management Body of Knowledge (PMBOK)
PMG, see Portfolio Management Group (PMG)
PM2 model, see Project Management Process Maturity Model (PM)2 model
PMO, see Project management office (PMO)
Portfolio Management Group (PMG), 97
Positive leadership, 73–75
PQI, see Process quality index (PQI)
PQM, see Process quality management (PQM)
PRINCE2, see Projects in Controlled Environments (PRINCE2)
Probability trees, 119
Process configuration management, 146, see Configuration management (CM)
Process mapping, 27–29
Process maturity framework metrics, 256–258
Process performance metrics, 137–140
Process quality, 134–137
Process quality index (PQI), 134
Process quality management (PQM), 114, 351–355
Product development
  measuring, 159
  process, 153–154
Production capability (PC), 84
Product life cycle (PLC), 149–152
  management, 152–153
Project Management Body of Knowledge (PMBOK), 98
Project management measurement systems, 93–95
Project management office (PMO), 97–98
Project Management Process Maturity Model (PM)2 model, 99–102
Project portfolio management, measuring, 95–99
Project QA and collaboration plan, 339–349
Project quality plan, 357–359
Project risk RMMM strategy, 112
Project risks, 107
Projects in Controlled Environments (PRINCE2), 98
Project software measurement, 62
ProSTEP-iViP reference model, 101
Q

QA, see Quality assurance (QA)
QCE, see Quality of customer experience (QCE)
QI, see Quality improvement (QI)
QoE, see Quality of experience (QoE)
Quality assurance (QA), 134
Quality improvement (QI), 13–15
  developing plan, 15–18
Quality of customer experience (QCE), 166–167
Quality of experience (QoE), 166
Quantitative risk analysis, 116–119
Questionnaires/surveys, 364–367

R

Recruitment, 76–78
Requirements elicitation, 361
Return on attitude (ROA), 10
Return on excitement (ROE), 10
Return on intellect (ROI), 10
Return on investment (ROI), 45–51, 73, 97, 140, 172, 285
Return on management (ROM), 172
Risk analysis, 104–105
  quantitative, 116–119
Risk avoidance, 113–115
Risk checklists, 119–120
Risk identification, 105–108
Risk mitigation, monitoring, and management plan (RMMM), 108
  business strategy, 112–113
  project strategy, 112
  technical strategy, 112
Risk process measurement, 123–125
Risk strategy, 103–104
RMMM, see Risk mitigation, monitoring, and management plan (RMMM)
ROA, see Return on attitude (ROA)
ROE, see Return on excitement (ROE)
ROI, see Return on intellect (ROI); Return on investment (ROI)
ROM, see Return on management (ROM)
Rosemann–Wiese approach, 43

S

7 Habits of Highly Effective People (Covey), 84
Sample measurement plan, 279–284
Sample risk plan, 108–111
SDLC, see Systems development life cycle (SDLC)
SEI, see Software Engineering Institute (SEI)
SEI CMM, see Software Engineering Institute Capability Maturity Model (SEI CMM)
Selected performance metrics, 264
Service-level agreement (SLA), 144
Shared first approach, 140–145
Simulation models, see Monte Carlo methods
SLA, see Service-level agreement (SLA)
SLOC, see Source lines of code (SLOC)
SMART, see Specific, measurable, achievable, relevant, and time-framed (SMART)
SMEs, see Subject-matter experts (SMEs)
Software Engineering Institute Capability Maturity Model (SEI CMM), 62–63
Software Engineering Institute (SEI), 134
Software measurement program
  developing software measurement plan, 64–65
  direct and indirect, 59
  goal-question-metric (GQM) paradigm, 63–64
  measurement plan standard, 66–68, 279–284
  overview, 57–58
  Software Engineering Institute Capability Maturity Model (SEI CMM), 62–63
  software objects, 59
  software process improvement model, 61–62
  thematic outline, 68–70
  views of core measures, 60
Software process improvement model, 61–62
Software Technology Support Center (STSC), 223–254
Source lines of code (SLOC), 59
SOW, see Statement of work (SOW)
Specific, measurable, achievable, relevant, and time-framed (SMART), 16
Statement of work (SOW), 142
Strategic planning, 2–5
Strengths, weaknesses, opportunities, and threats (SWOT),
STSC, see Software Technology Support Center (STSC)
Subject-matter experts (SMEs), 141
Surveys/questionnaires, 364–367
SWOT, see Strengths, weaknesses, opportunities, and threats (SWOT)
Systems development life cycle (SDLC), 150
System-specific metrics, 41–45

T

TCO, see Total cost of ownership (TCO)
Technical risk RMMM strategy, 112
Technical risks, 107
Technology roadmap,
TIME magazine, 77
Time-to-market (TTM), 130
TiVo, 6–7
Total cost of ownership (TCO), 48
Total quality control (TQC), 18
TQC, see Total quality control (TQC)
TTM, see Time-to-market (TTM)

V

Value measuring methodology (VMM), 52, 285–307
VMM, see Value measuring methodology (VMM)
VOC, see Voice of the customer (VOC)
Voice of the customer (VOC)
  customer economy, 166–167
  customer intimacy and operational excellence, 161–162
  customer satisfaction survey, 162–164
  force field analysis, 164–165
  innovation for enhanced customer support, 167–170
  managing for innovation, 170–172

W

WEST, see Women in the Enterprise of Science and Technology (WEST)
Women in the Enterprise of Science and Technology (WEST), 155
Workplace stress, 83–86
Work unit measures, 213–216

X

X quotient (XQ), 77

Y

Y2K panic, 154

Z

Zeigarnik effect, 85

Posted: 03/01/2020, 13:33
