
2015 (EBOOK) MANAGING AND MEASURING PERFORMANCE IN PUBLIC AND NONPROFIT ORGANIZATIONS




DOCUMENT INFORMATION

Basic information

Format
Pages: 482
Size: 5.04 MB

Contents

MANAGING AND MEASURING PERFORMANCE IN PUBLIC AND NONPROFIT ORGANIZATIONS
An Integrated Approach
Second Edition

Theodore H. Poister, Maria P. Aristigueta, Jeremy L. Hall

Cover design by Wiley
Cover image: © iStock.com / aleksandarvelasevic

Copyright © 2015 by John Wiley & Sons, Inc. All rights reserved.

Published by Jossey-Bass, A Wiley Brand
One Montgomery Street, Suite 1200, San Francisco, CA 94104-4594
www.josseybass.com

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400, fax 978-646-8600, or on the Web at www.copyright.com. Requests to the publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, 201-748-6011, fax 201-748-6008, or online at www.wiley.com/go/permissions.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. Readers should be aware that Internet Web sites offered as citations and/or sources for further information may have changed or disappeared between the time this was written and when it is read.

Jossey-Bass books and products are available through most bookstores. To contact Jossey-Bass directly call our Customer Care Department within the U.S. at 800-956-7739, outside the U.S. at 317-572-3986, or fax 317-572-4002.

Wiley publishes in a variety of print and electronic formats and by print-on-demand. Some material included with standard print versions of this book may not be included in e-books or in print-on-demand. If this book refers to media such as a CD or DVD that is not included in the version you purchased, you may download this material at http://booksupport.wiley.com. For more information about Wiley products, visit www.wiley.com.

Library of Congress Cataloging-in-Publication Data
Poister, Theodore H.
[Measuring performance in public and nonprofit organizations]
Managing and measuring performance in public and nonprofit organizations: an integrated approach / Theodore H. Poister, Maria P. Aristigueta, Jeremy L. Hall. Second edition.
pages cm
Revised edition of Poister's Measuring performance in public and nonprofit organizations.
Includes bibliographical references and index.
ISBN 978-1-118-43905-0 (hardback)
1. Organizational effectiveness–Measurement. 2. Organizational effectiveness–Management. 3. Nonprofit organizations. 4. Public administration. 5. Performance–Measurement. 6. Performance–Management. I. Aristigueta, Maria Pilar. II. Hall, Jeremy L. III. Title.
HD58.9.P65 2015
658.4'013–dc23
2014021079

Printed in the United States of America
SECOND EDITION
HB Printing 10 9 8 7 6 5 4 3 2

CONTENTS

Preface ix
Acknowledgments xiii

PART 1: INTRODUCTION TO PERFORMANCE MANAGEMENT
1 Introduction to Performance Management and Measurement
2 Developing Effective Performance Management Systems 35

PART 2: METHODOLOGICAL ELEMENTS OF PERFORMANCE MANAGEMENT 51
3 Developing a Performance Framework: Program Logic Models and Performance Measures 53
4 Targeting Results: Clarifying Goals and Objectives 88
5 Defining Performance Indicators 112
6 Reporting Performance Data 155
7 Analyzing Performance Information 175

PART 3: STRATEGIC APPLICATIONS OF PERFORMANCE MANAGEMENT PRINCIPLES 197
8 Using Performance Measures to Support Strategic Planning and Management 199
9 Performance-Informed Budgeting 231
10 Managing Employees, Programs, and Organizational Units 274
11 Performance Management in Grant and Contract Programs 304
12 Improving Quality and Process 331
13 Soliciting Stakeholder Feedback 355
14 Using Comparative Measures to Benchmark Performance 384

PART 4: DESIGN AND IMPLEMENTATION OF PERFORMANCE MANAGEMENT PROCESSES 411
15 Designing and Implementing Effective Management Systems 413

Name Index 447
Subject Index 450

To my wonderful granddaughters, Susannah Grace and Caroline Elizabeth Tusher, who light up my life and make it all the more worthwhile.—Ted Poister
To my husband, Don Coons, for his unwavering love, patience, and support.—Maria Aristigueta
To my niece, Kadence Olivia Dick, who brightens each day and always motivates me to perform at my best.—Jeremy L. Hall

SUBJECT INDEX

Page references followed by fig indicate an illustrated figure; followed by t indicate a table.

A
ABB (Asea Brown Boveri), 345
Accenture, 385
Accountability: demand for performance management, 4; description of, 13–14; different government delivery mechanism and, 310t–311; examining performance management issues
of, 4, 13–17; five distinct dimensions of, 14; four types of public agency, 15–17; GPRA adopted for, 314; improving strategies for, 440; performance-based budgeting, 267; as performance measurement success factor, 420 ACT tests, 141–142 Adaptive Behavior Scales (ABS), 118–119 Adjusted performance measures: adjusted performance measures: cost per mile of surface treatment, 399t; comparative measures using, 397–400; county highway maintenance cost per mile of surface treatment, 398fig Affordable Care Act (2013), 304 450 Agency and administrative records, 122–123 AgencyStat, 418 AIDS education programs: cost-effectiveness measures for an, 72; efficiency and productivity measures of, 68; outcomes vs outputs, 58, 59t; output measures of a, 67–68 See also HIV/AIDS treatment programs Air quality index (AQI), 116–117t Albuquerque (New Mexico) city budgeting, 253, 258fig–260fig Allied Signal, 345 Alcoholics Anonymous (AA), 62 American Association of State Highway and Transportation Officials (AASHTO), 121, 122, 401 American Customers Satisfaction Index Model, 374 American Public Works Association photometric index, 126 American Recovery and Reinvestment Act (ARRA), 25, 306, 323 American Red Cross Field Operations Consolidated Information System (FOCIS), 206–207 American Review of Public Administration, 28 Annual Texas Index Crime Rate, 248 Assessment: organizational readiness for performance management, 37fig, 38, 424; performance measures, 151–152t See also Program evaluation Association of State Highway and Transportation Officials, 388 Auditor of the State of Hawaii, 320–321 Averages: applications of, 114–115; description of statistical, 114 B Balanced scorecard models: Kaplan and Norton’s, 216, 219; for service-oriented public and nonprofit sectors, 219–220 Balanced scorecards: City of Charlotte (North Carolina) and CDOT, 224–226; description of, 166; distinguishing between dashboards and, 167–168; Kenya Red Cross, 221fig–224; Library of Virginia, 214fig; 451 Subject 
Index logic models and, 226, 228; municipal police department, 168–169, 170fig; for public and nonprofit sectors, 219–220; public sector organizations that develop their own, 109; strategy management system incorporation of, 221; strategy maps and, 220–221; Virginia Performs, 211–212fig See also Feedback Baldrige Award, 41 Baltimore CitiStat system: DC Stat compared to, 363; description of the, 294–295; effectiveness of, 414; parking management program, 295–296t; performance results of the, 295 Bar charts: cluster construction of, 161, 163fig; customer ratings of driver license renewal process, fiscal year 2014, 163fig; description of, 160–161; stacked, 161, 162fig Beginning outcomes, 380 Benchmarking: availability of data issue of, 390–391; corporatestyle, 386; description and function of, 385; identifying best practices using, 400–402; prospects for, 408–409; public sector, 384–387; regulatory, 402–408; reliability of comparative data issue of, 391–392; statistical, 387–390; variation in operating conditions issue of, 392–393 Benchmarking statistical or comparative performance, 386 Benchmarking targeting approach, 101 Best practices: description and function of, 400; identifying, 400–402 Biases: nonresponse, 141–143; observer, 138–139 Block grants, 312, 313, 314–316 Board of Nursing Program logic model, 402–403fig British Audit Commission, 414 Bubble graphs: applications of, 165–166; description of, 164; VDOT highway construction projects (2002), 164fig–165 Budget Accounting and Procedures Act (1950), 236 Budgeting See Performance-based budgeting (PBB) Bureau of Labor Statistics, 130 Bush (George W.) 
administration: Government Performance and Results Act (GPRA) under the, 17; PART used for performance management and measurement during, 314–315 C Canadian Pension Plan Disability (CPPD) Program: logic model of, 81–82fig, 83; performance measures for the, 83t–84; systematic assessment of performance measures for the, 151–152t Cascading of measures, 274–275 Categorical grants, 312–313 Center for American Progress, 297 Center for Effective Government, 315 Centers for Disease Control (CDC) US Centers for Disease Control and Prevention (CDC) Centers for Medicare and Medicaid Services, 368, 404 CGI Federal (Canada), 304–305 Challenger disaster, 16 Charlotte Department of Transportation (CDOT): balanced scorecard of, 224–226; objectives and measures of, 227t Cheating issue, 143–144 Child support enforcement program: performance analysis of effectiveness of, 188–193; performance standards established for a, 96–99; program logic model of, 97fig Child support enforcement program effectiveness: child support enforcement performance (2006–2012), 190–192; percentage of cases in arrears by unemployment rate 92012), 192fig–193; performance analysis of, 188–193; selected performance measures by region, 190t; state child enforcement program results (2012), 189t CitiStat program: DC Stat compared to, 363; description of, 41; effectiveness of, 414; individual and programmatic performance management using the, 294–297; origins and development of the, 23; program improvement through the, 417–418; stakeholder involvement through the, 378 Citizens: as stakeholder engagement role, 356; as stakeholder partner addressing community issues, 376–378 City of Albuquerque (New Mexico) city budgeting, 253, 258fig–260fig City of Atlanta quality-important matrixes on public services, 371fig–372 City of Livingston (Montana) performance-informed budgeting, 252–253, 254fig–257fig City of Phoenix Planning and Research Bureau, 283, 284t City of Toronto’s child care spaces (Canada), 389fig–390 Clinical 
examinations data, 124 Collaboration: challenges of stakeholder, 356–357; expanding use of performance management, 439; government and nonprofit use of, 427; implementing interagency performance management, 428, 429t; managing the performance of, 427–428 See also Partners Commitment to Ongoing Regulatory Excellence (CORE), 367–368fig, 402, 405–408 “Community Conversations” (United Way and Harwood Institute), 376–377 Community Development Block Grant (CDBG) program, 312, 314–316 Community-level outcomes, 380 Comparative measures: adjusted performance measures approach to, 397–400; benchmarking using, 384–393; explanatory variables approach to, 393–394fig; peer groups approach to, 394–396; police performance, 396t; recalibrated measures approach to, 396–397fig Comparative performance measurement, 386 Compass system (NMSH&TD), 291–292 CompStat: description of, 23, 41; fighting crime effectiveness of, 414; New York City Police Department use of, 23, 292, 452 293t; tracking crime data in NYC, 172 Concurrent (or correlational) validity, 134 Confirmation of expectations model, 374–376 Consensual validity, 133–134 Consumer price index (CPI), 117 Content-oriented outcomes, 380 Content validity, 134 Contracts See Federal contracts Controllability: accountability dimension of, 14–15; definition of, 14 Core values (Kenya Red Cross), 221fig Corporate-style benchmarking, 386 Cost-effectiveness measures: Canadian Pension Plan Disability (CPPD) Program, 83t; description of, 72; percentages, rates, and ratios used as, 116–117; public transit system performance, 178e, 179; River Valley Transit performance standards, 182t, 183; teen mother parenting education program, 80t See also Performance-based budgeting (PBB) Cost-sensitive performance measures, 150–151 Council on Virginia’s Future: agency-based strategic goals of, 210–212; description of, 209; outcome and output measures of, 212–213; Virginia Performs framework, 210fig; Virginia Performs scorecard, 211– 212fig; vision statement 
and goals of, 209–210 Crime: CompStat tracking of, 23, 41, 172, 292, 293t, 414; outcomes vs outputs of programs to control, 59t Crisis stabilization logic model, 61fig–62 Customer feedback: on customer service, 361–363fig; Girl Scout Council of Northwest Georgia, 360–361t; issues to consider for, 359–360 Customer ratings of driver license renewal process (2014), 161, 163fig Customer response card data, 125–126 Customer satisfaction measures: description of, 72–73; Subject Index expectations, confirmation and disconfirmation models and, 374–376; integrated sets of, 77–78; objective performance measures relationship to, 372–374fig; teen mother parenting education program, 80t; tracking customer complaints for, 73 See also Service quality measures Customers: obtaining feedback from, 359–368fig; stakeholder engagement role of, 355 D Dashboards: description of, 166–167; distinguishing between scorecards and, 167–168; example of a pictorial display, 168fig; hospital patient feedback, 167fig; US Patent Office, 167–168, 169fig Data: categories of, 122; noncomparability of, 135–136; sources of, 121–126 See also Performance data analysis; Reporting performance data Data collection: developing procedures for, 37fig, 43–45, 424; evaluation phase of, 48 Data sources: agency and administrative records, 122–123; clinical examinations, 124–125; direct observation, 124; follow-up contacts, 123–124; overview of, 121–122; specially designed measurement tools, 126; surveys and customer response cards, 125–126 DC Stat, 363 Delaware Department of Natural Resources and Environmental Control (DNREG), 349 Department of Veterans Affairs See US Department of Veterans Affairs (VA) Direct observation data, 124 Disaggregated productivity measures, 338–339 Disconfirmation of expectations model, 374–376 Duke Activity Status Index, 119t E E-government stakeholder involvement, 378 Effectiveness measures: child support enforcement program analysis of, 188–193; HOPE Scholarship program (Georgia), 
146; outcomes as, 57–60, 71–72; performanceinformed budgeting, 263; teen mother parenting education program, 80t Efficiency measures: Canadian Pension Plan Disability (CPPD) Program, 83t; description and purpose of, 68; how to use, 68–69; percentages, rates, and ratios used as, 116–117; public transit system performance, 178e, 179; River Valley Transit performance standards, 181, 182t, 183; teen mother parenting education program, 80t End outcomes, 380 Evaluating performance management system: basic phases of, 48; data validation during, 48; HCAHCPS survey on hospital care by patients, 368fig; Management by Objectives (MBO) measures for, 282–283, 284t; process of, 37fig, 47–48, 425; strategic planning on future monitoring and, 206; systematic assessment of performance measures, 151–152t See also Program evaluation Evaluation question, 48 Evidence-based practice: data aggregation and utilization over time of, 29–30fig; Oregon’s ORS section 182.525 definition of scientifically, 300; program evaluation using, 300–302; selecting programs using, 298–300 External environmental measures: applications of, 75–76; description of, 75 F Face validity, 133 Fatality Analysis Reporting System, 94 Federal Aviation Administration’s air traffic control program, 70 Federal budget reform: Budget Accounting and Procedures Act (1950) on the, 236; GPRA conflict with the traditionally political process of, 237–238; history of, 236–237; zero-based budgeting (1970s), 237 See 453 Subject Index also Performance-based budgeting (PBB); State-level performance-informed budgeting system Federal contracts: accountability relationship with, 310t–311; challenges of transition to, 305–308; comparing performance measurement and management in grants and, 311–316; contractor and contractee relationship in, 308; distinguishing between grants and, 308–311; government service delivery using, 305–306; PART tool for performance measurement and management of, 314–315; performance management approaches used 
for, 316–321; purpose of performance measurement in, 321–323; spending trends (2000–2011), 306–307fig; types of, 311–312 Federal government: budget reform by the, 236–238; federalism relationship between state and, 309 See also Government; Legislation; Local government Federal grants: accountability relationship with, 310t–311; challenges of transition to, 305–308; considerations when using performance management, 327–329; distinguishing between contracts and, 308–311; grantor and grantee relationship in, 308; PART tool for performance measurement and management of, 314–315; performance-based management, 323–327; performance measurement and management of specific types of, 311–316; program logic models from grantee and grantor perspectives, 324–326; purpose of performance measurement in, 321–323; spending trends (2000–2011), 306–307fig Federal Health Care Financing Administration, 116 Federal Reserve Board, 117 Federal Transit Administration (FTA), 187–188 Federal Transit Administration’s Small Transit Intensive Cities (STIC) program, 313–314 Federalism, 309 Feedback: analyzing stakeholder, 368–378; Commitment to Ongoing Regulatory Excellence (CORE), 367– 368fig, 402, 405–408; dashboards form of, 166– 169fig; GradedDC.gov for, 362–363fig; HCAHCPS survey on hospital care by patients, 368fig; on how often patients receive help quickly from hospital staff, 404fig–405; logic model for stakeholder, 379fig–380; obtaining customer, 359–368fig; partner, 364fig– 367; 360 degree assessment model for, 364fig–365, 367 See also Balanced scorecards; Program evaluation Field Operations Consolidated Information System (FOCIS) [ARC], 206–207 5S methodology, 349–350 Florida Benchmarking Consortium, 401–402 Florida Benchmarks, 386 Follow-up contacts data, 123–124 Food and Nutrition Services, 418 Forecasting targeting approach, 100 Formula grants, 312, 313–314 FoxNews.com, 304 Full-time equivalent employee productivity, 333 G Gaming, 437–438 GED (Graduation Equivalency Diploma), 26–27 
General Electric, 345 Geographical information systems (GIS) maps, 169, 171fig–172 Georgia: “Cleaning Days” sponsored by youth groups of, 377–378; Department of Transportation (GDOT) of, 357–359, 365–367; expectations and satisfaction with traffic flow and congestion in, 375fig; high school graduation rates by percentage of students eligible for free lunch program in, 393–394fig; Office of Child Support Enforcement of, 362, 387; performance-informed budgeting by, 241, 242t; quality-important matrixes on local public services in the Atlanta area performance, 371fig–372; 360-degree assessment model used by GDOT, 364fig–367 Georgia Department of Transportation (GDOT): feedback grades for highway safety, 365–367; grades for highway safety, 366fig; stakeholder map of, 357–359; survey results regarding selected process dimensions, 365, 366fig; 360 degree assessment model used by, 364fig–367 Girl Scout Council of Northwest Georgia, 360–361t Girl Scouts of the Georgia, 360 Goal displacement: measures that are resistant to, 148–149; preventing, 437–438; sources of, 149–150 Goals: Council on Virginia’s Future use of vision to develop objectives, measures, and, 209–216; DHHS’s clarifying mission, objectives, and, 89–92; establishing program objectives, mission, and, 88–93, 110; performance-based budgeting planning to select, 265–266; performance management system tied to, 415–416; performance measures that are resistance to displacing, 148–150; program logic relationships to objectives and, 92–93; programmatic versus managerial objectives and, 105–108; strategic planning incorporation of both short-term and long-term, 206–207; structuring public and nonprofit, 108–110; United Way of Metropolitan Atlanta’s established targets and, 102 Google, 304 Governing (magazine), 306 Government: accountability relationships under different delivery mechanisms of, 310t–311; federalism between federal and state, 309; intergovernmental relations between, 309; stakeholder engagement and 
feedback to, 355–380 See also federal government; Local government; States government 454 Subject Index Government Accountability Office (GAO): best collaborative approaches study by, 428, 429t; PART report by the, 18, 20 Government channel (Time Warner Cable), 378 Government Performance and Results Act (GPRA): accountability through the, 314; description of, 17; DHHS clarifying mission, goals, objectives, and performance measures to comply with, 89; PART (Program Assessment Rating Tool) to fulfill, 17–20; tracking performance of specific operating programs under the, 226; traditionally political budget process as conflicting with intention of the, 237–238 Government Performance and Results Modernization Act (GPRAMA): DHHS clarifying mission, goals, objectives, and performance measures to comply with, 89; passage and use of the, 20; substantial changes enacted under the, 20–22; tracking performance of specific operating programs under the, 226 GradedDC.gov, 362–363fig Grants See Federal grants Graphical displays: bar charts, 160–162fig, 163fig; bubble graphs, 164fig–166; line graphs, 162–164; pie charts, 160; reporting data using, 160–166 Great Britain’s National Health Service, 149 Great Recession, 191 Group-level outcomes, 380 H Harwood Institute, 376 Hawaii Auditor’s Report, 320–321 Healthcare.gov debacle, 304–305, 320 Highway maintenance program outcomes vs outputs, 59t Hillsdale College, 328 HIV/AIDS treatment programs, 76 See also AIDS education programs HOPE Scholarship program (Georgia): description of, 145–146; effectiveness measures of the, 146 Hospital Consumer Assessment of Health Care Providers and Systems (HCAHCPS), 368, 404 Hospital patient feedback dashboard, 167fig J Juvenile justice programs: after-care program service standards, 105; boot campus outcomes vs outputs, 59t I K IBM Center for the Business of Government, 185 Improvement See Quality improvement Indexes: American Customers Satisfaction Index Model, 374; American Public Works Association 
photometric, 126; Annual Texas Index Crime Rate, 248; description of, 117; Duke Activity Status Index, 119t; illustrative applications for performance management, 117–120; international roughness index (IRI), 134 Individual-level outcomes, 380 Individual performance management: CitiStat program used for, 23, 41, 294–297, 363, 378, 414; Community Disaster Education Program targets and actual performance, 285– 287fig; the Compass system of New Mexico State Highway and Transportation Department (NMSH&TD), 291–292; CompStat system used for, 23, 41, 292, 293t; MBO (Management by Objectives) of programmatic and, 283, 285–298; measures for monitoring systems, 287–291; OASIS system used for, 292, 294 Instruments: decay of, 139–140; design problems, 138 Intermediate outcomes, 380 Internal consistency reliability, 128–129 Internal resistance: performance measurement system barrier of, 435–436; strategies for overcoming, 436–437 International City/County Management Association, 388, 395 International roughness index (IRI), 134 Iowa Department of Natural Resources (DNR), 350, 351t Kaizen events, 350, 352t Keep America Beautiful program, 126 Keep Nebraska Beautiful program, 56–57 Kennedy School of Government’s Innovations in American Government Awards, 414 Kentucky’s Wolf Creek Dam contract, 318–319 Kenya Red Cross: balanced scorecard on, 221fig–224; mission, vision, and core values overview, 221fig; strategy map of, 222fig Key performance indicators See Performance indicators KISS principle (Keep It Simple Stupid), 434 Korean Ministry of Public Administration and Security, 378 L Labor productivity measures See Productivity measures Leadership: performance management system design role of, 420; performance measurement system stakeholders and support of, 421–424 Lean improvement tool, 348 Lean organizations, 344 Lean Six Sigma, 250–252 Legislation: Affordable Care Act (2013), 304; American Recovery and Reinvestment Act (ARRA), 25, 306, 323; Budget Accounting and Procedures Act 
(1950), 236; Government Performance and Results Act (GPRA), 17, 89, 226, 237, 314; Government Performance and Results Modernization Act (GPRAMA), 20–22, 89, 226 See also Federal government Liability: accountability dimension of, 14; definition of, 14 455 Subject Index Library of Virginia: balanced scorecard of, 214fig; output measures of, 213–216; output performance measures and objectives used by the, 215t Life in Jacksonville, 386 Line graphs: description of, 162; MARTA (Metropolitan Atlanta Rapid Transit Authority), 162–163fig Little League World Series (Williamsport), 180 Local government: City of Albuquerque (New Mexico) city budgeting, 253, 258fig– 260fig; City of Atlanta quality-important matrixes on public services, 371fig–372; City of Livingston (Montana) performance-informed budgeting, 252–253, 254fig–257fig; City of Phoenix Planning and Research Bureau, 283, 284t; City of Toronto’s child care spaces (Canada), 389fig–390; comparative police performance, 395–396t; crimes per 1,000 residents and estimated daytime population, 396–397fig; intergovernmental relations between, 309 See also Federal government; Government; States government Logic models See Program logic models Los Angeles Police Department, 23 M Malcolm Baldrige National Quality Award program, 41 Management by Objectives (MBO): action plan for, 278; comparing performance monitoring systems and, 280–283; description of, 224; four steps of the process, 277; individual and programmatic performance management using, 283, 285–298; key characteristics of, 281t; measures to evaluate, 282–283, 284t; origins of, 276–277; role of performance measures in, 278–279; setting SMART objectives for, 278 Managerial goals, 105–108 Managerial objectives: comparing programmatic and, 105–108; of workers compensation program, 107t–108 Maps: description of, 169; geographical information systems (GIS), 169, 171fig–172; presenting geographically decentralized performance data using, 172–173 Maryland’s Managing for 
Results budgeting, 240 Measurement See Performance measurement Measuring Health: A Guide to Rating Scales and Questionnaires (McDowell), 121 Medicare Rural Hospital Flexibility (FLEX) program, 327 Metropolitan Atlanta Rapid Transit Authority (MARTA), 162–163fig Minnesota Milestones, 386 Mission: DHHS’s clarifying goals, objectives, and, 89–92; establishing program objectives, goals, and, 88–93; Kenya Red Cross, 221fig; performance management system tied to, 415–416 Mixed measures: description of, 120; illustrative highway maintenance performance measures, 120t Monitoring See Performance monitoring systems Motorola, 345 Municipal Benchmarks: Assessing Local Performance and Establishing Community Standards (Ammons), 121 Municipal police department scorecard, 168–169, 170fig N National Assessment of Educational Progress test, 130 National Association of State Budget Officers, 263 National Benchmarking Project (Sweden), 402 National Breast and Cervical Cancer Early Detection Program (NBCCEDP), 323–324 National Council of State Boards of Nursing (NCSBN): comparative performance measurement systems leadership by, 388; CORE progress conducted by, 367–368fig, 402, 405–408; regulatory benchmarking by, 402–408 National Health Service (Great Britain), 149 National Highway and Transportation Administration, 94 National Highway System service standards, 104 National Performance Management Advisory Commission, 9, 233 Needs indicators: description of, 76; evaluation and monitoring applications of, 76–77 Networks: expanding use of performance management, 439; government and nonprofit use of, 427; managing the performance of, 427–428 Nevada’s performance-informed budgeting, 241–244, 245fig New Mexico State Highway and Transportation Department (NMSH&TD): Compass system of the, 291–292; setting targets approach by, 99 New Public Management movement: description of, 12; privatization priority of the, 305 New York City Police Department CompStat system, 23, 292, 293t, 414 
Noncategorical grants, 312 Nonprofit organizations: balanced scorecard models for service-oriented public and, 219–220; goal structures for profit and, 108–110 Nonresponse bias, 141–143 North Carolina Benchmarking Project, 402 O OASIS system, 292, 294 Objectives: for balanced scorecards of public and nonprofit sectors, 219–220; Charlotte Department of Transportation (CDOT), 227t; Council on Virginia’s Future use of vision to develop goals, measures, and, 209–216; DHHS’s clarifying mission, goals, and, 89–92; establishing program goals, mission, and, 88–93, 110; Library of Virginia 456 output performance measures and, 215t; performance-based budgeting planning to select, 265–266; program logic relationships to goals and, 92–93; programmatic versus managerial goals and, 105–108; strategy maps for determining, 220; of workers’ compensation programs management objectives and, 107t–108 See also Smart objectives Observer bias, 138–139 Office of Child Support Enforcement (Georgia), 362, 387 Office of Management and Budget (OMB): GPRAMA as used by the, 21–22; PART data used for improvement by the, 20 Office of Rural Health Policy, 327 Ohio Department of Transportation (ODOT), 158–159fig Ontario Municipal CAO’s Benchmarking Initiative (Canada), 389–390 Operating efficiency See Efficiency measures Operational indicators: averages, 114–115; identifying possible performance indicators, 120–121; indexes, 117–120; mixed measures, 120t; overview of, 112–114; percentages, rates, and ratios, 115–117; raw numbers, 114 See also Performance measures Oracle, 304 Oregon: ORS section 182.525 mandating evidence-based practice use by, 299–300; performance-informed budgeting system of, 260–261, 262 Oregon Benchmarks, 386 Organisation for Economic Co-operation and Development, 440 Organization performance management readiness, 37fig, 38, 424 Organizational Six Sigma metrics, 344t Output-oriented systems: Department of Veterans Affairs disability compensation and patient expenditures 
(2000–2012), 335–337; Subject Index description of, 334–335; productivity analysis consideration of, 335–337 Out of the Crisis (Deming), 341–342 Outcome measures: backwardmapping approach to, 85–86; beginning, intermediate, and end, 380; Canadian Pension Plan Disability (CPPD) Program immediate, 83t; Canadian Pension Plan Disability (CPPD) Program long-term, 83t; challenges related to validity when monitoring, 129–133; community-level, 380; content-oriented, 380; Council on Virginia’s Future’s Virginia Perform, 212–213; description of, 57–58; group-level, 380; illustrations of outputs versus, 58–60; individual-level, 380; performance standards representing the, 97–99; process-oriented, 380; program logic, 57–60; program purpose tied to, 71–72; teen mother parenting education program, 80t; US Department of Transportation, 216, 217t–218t; user-oriented, 380 Output measures: Canadian Pension Plan Disability (CPPD) Program, 83t; Council on Virginia’s Future’s Virginia Perform, 212–213; description of, 57, 67–68; illustrations of outcomes versus, 58–60; Library of Virginia, 213–216; performance standards representing the, 97–99; program logic, 57–60 Overreporting problem, 137–138 P PART (Program Assessment Rating Tool): GAO report on, 18, 20; Government Performance and Results Act (GPRA) implementation by, 17–20; grant and contract performance measurement and management using the, 314–315; OMB use of the, 20 Partners: feedback gathered from, 364fig–367; stakeholder engagement by, 356 See also Collaboration Past trends targeting approach, 100 Peer groups: comparative measures using, 394–396; comparative police performance of cities, 395–396t; constructing, 394–935; crimes per 1,000 capita by percentage of population below poverty level, 395fig Pennsylvania: Department of Transportation qualityimportant matrixes, 370fig– 371; Department of Transportation (PennDOT) strategic management, 202–203; performanceinformed budgeting by, 240; satisfaction with ride quality and 
roughness by road type in, 373–374fig Percentages: description of, 115; statistical applications of, 115–116 Performance: performance management expectations impact on, 413–414; research on how performance management impacts, 414–416 Performance-based budgeting guidelines: four recommendations for implementing PBB, 268–269; on linkage to budget decision process, 267–268; on measurement, 266; overview of the key elements and concerns, 264–265; on planning, 265–266; on reporting, 266–267; on verification, 267 Performance-based budgeting (PBB): conceptual understanding of public, 232–235; effectiveness of, 263; examining the process of, 231–232; guidelines for implementing, 264–269; historical development and the current state of, 236–240; importance of direct linkage between budget decisions and, 267–268; as means to minimize political process of budgeting, 235; potential benefits of, 235–236t; successes and concerns over, 269–270 See also Cost-effectiveness measures; Federal budget reform; State-level performance-informed budgeting system Performance-based grant management: ARRA implementation of accountability in, 323–324; challenges related to a, 326–327; description of, 323; Medicare Rural Hospital Flexibility (FLEX) program, 327; National Breast and Cervical Cancer Early Detection Program (NBCCEDP), 323–324; program logic models from grantee and grantor perspectives, 324–326 Performance contracts: Auditor of the State of Hawaii reporting on, 320–321; description of, 320 Performance data analysis: of child support enforcement program effectiveness, 188–193; heavy influence of external forces on, 194; limitations of surface-level comparisons, 193–194; of public transit system performance, 175–188; questions that may be raised by, 194–195; relevant questions to address during, 193 See also Data Performance indicators: common measurement problems, 135–144; criteria for selecting, 144–153; data sources used for, 121–126; identifying possible, 
120–121; operational, 112–121; performance management system definition and use of, 37fig, 43; performance measurement based on, 5; reliability of, 126–129; SMART principle for, 314; validity of, 126–127, 129–135 See also Performance measures Performance management: accountability of, 4, 13–17; benefits of, 24; challenges related to, 5, 25–27; data aggregation and utilization over time across different, 29–30fig; description of, 1, 3, 35–36, 275; distinguished from related fields, 27–30; distinguishing between performance measurement and, 5, 36; grant and contract programs, 304–329; how expectations contribute to, 413–414; individual and programmatic, 283, 285–298; prospects for progress in, 439–440; public management context of, 4, 11–13; purposes of, 9–10; questions that structure our understanding of, 5–6; research on impact on performance by, 414–416; understanding the importance and potential of, 440–442; wide scope and increasing interest in, 3–4 Performance management doctrine, 10–11 Performance management framework: budgeting element of the, 7fig, 8; diverse logic models used to develop a, 60–66fig; illustrated diagram of, 7fig; management element of, 7fig, 8–9; overview of the, 7–11; planning element of, 7fig, 8; program logic used to develop a, 54–60; reporting element of, 7fig–8 Performance management institutionalization: early history of, 17; GPRAMA (Government Performance and Results Modernization Act) and, 20–22; PART (Program Assessment Rating Tool) and, 17–20; Texas’s strategic planning, budgeting, and monitoring system, 22fig–23 Performance management process: aligning mission, goals, and strategies, 415–416; barriers to success, 418–419; follow-up for program improvement, 417–418; selecting quality performance measures, 416–417 Performance management reform: demand for accountability and, 4; multiple efforts for, 4; New Public Management movement, 12, 305 Performance management strategies: data aggregation and utilization over time across different, 
29–30fig; for result-oriented tools and policy stages heuristic, 29–30fig Performance management system design: as deliberate process, 424–425; leadership role in, 420; specifying the, 37fig, 45–46, 424 Performance management system development: step 1: clarify the purpose of the system, 37fig–38, 424; step 2: assess organizational readiness, 37fig, 38, 424; step 3: identify external stakeholders, 39, 424; step 4: organize the system development process, 39, 424; step 5: identify key purposes and parameters for initiating performance management, 39–41, 424; step 6: define the components for the performance management system, 41–42, 424; step 7: define, evaluate, and select indicators, 43, 424; step 8: develop data collection procedures, 43–45, 424; step 9: specify system design, 45–46, 424; step 10: conduct a pilot if necessary, 46–47, 424; step 11: implement full-scale system, 47, 424; step 12: use, modify, and evaluate the system, 47–48, 424; step 13: share the results with stakeholders, 48–49, 424 Performance management systems: design and implementation of, 35–49; elements of successful, 420–428; flexibility of an effective, 49–50; implementing interagency collaborations, 428, 429t; managing the process of, 415–419; multiple types of, 36; overview and examples of, 275–276; public transit, 175–188; strategies for successful, 428, 430–439 Performance measure sets: Canadian Pension Plan Disability (CPPD) Program, 81–84; overview of integrated, 77–78; teen mother parenting education program, 78–81 Performance measurement: based on key performance indicators, 5; common problems related to, 135–144; considerations for, 67; data aggregation and utilization over time of, 29–30fig; definitions of, 6–7; distinguishing between performance management and, 5, 36; elements of successful, 420–421; federal grant and contract programs, 304–329; managerial purposes served by, 13; performance-based budgeting, 266; purposes of, 9–10; strategic management intersection with, 
228–229 Performance measurement problems: barriers to success, 418–419; cheating, 143–144; instrument decay, 139–140; noncomparability of data, 135–136; nonresponse bias, 141–143; observer bias, 138–139; poor instrument design, 138; reactive measurement, 140–141; tenuous proximate measures, 136–137; underreporting or overreporting, 137–138 Performance measurement system: deliberate process of a successful, 424–425; elements of, 420–421; implementing interagency collaborations for, 421, 429t; leadership and stakeholder support and involvement in, 421–424; networks and collaborations of, 427–428; project management of the, 425–427; strategies for successful, 428, 430–439 Performance measurement system strategies: avoiding system abuse, 438–439; consideration of resource requirements, 432–433; focusing on usefulness of information produced, 431–432; getting stakeholder buy-in, 434–435; guidelines for maximizing useful data, 433–434; issues to consider for successful, 428–430; to overcome internal resistance, 435–437; preventing goal displacement and gaming, 437–438 Performance measures: analyzing, 175–195; benchmarking performance using comparative, 384–409; cascading the, 274–275; Charlotte Department of Transportation (CDOT), 227t; compensation and rewards linked to, 421; considerations when selecting, 67; cost-effectiveness, 72; Council on Virginia’s Future use of vision to develop objectives, goals, and, 209–216; customer satisfaction, 72–73, 77–78, 372–376; developing logic models for, 84–86; DHHS’s clarifying mission, goals, objectives, and, 89–92; efficiency and productivity, 68–70; external and other environmental, 75–76; guidelines for defining, 153; illustrative highway maintenance, 120t; integrated sets of, 77–84; Management by Objectives (MBO) role of, 278–279; needs indicators, 76–77; performance standards and program logic model implications for, 97–99; productivity, 69–70, 80t, 116–117, 178e–179, 181, 182t, 183; for public organization 
goals and objectives, 110; public transit system performance, 177–179; quality, 416–417; resource, 74, 177–178e; service quality, 70–71, 80t, 177–178e, 181, 182t, 183, 337–338; strategic planning and management supported by, 199–229; system productivity, 73–74; workload, 74–75 See also Operational indicators; Outcome measures; Output measures; Performance indicators Performance measures criteria: balanced and comprehensive measures, 145–146; clear preferred direction of movement, 146–147; cost-sensitive performance measures, 150–151; description of, 144; guidelines for defining performance measures, 153; meaningful measures, 145; measures that are resistant to goal displacement, 148–150; systematic assessment of performance measures, 151–152t; timely and actionable measures, 147–148 Performance monitoring systems: challenges related to validity when, 131–133; Commitment to Ongoing Regulatory Excellence (CORE), 367–368fig, 402, 405–408; comparing Management by Objectives (MBO) and, 280–283; crisis stabilization logic model used for teen mother parenting education program, 61fig, 78; individual and programmatic measures for, 287–291; key characteristics of, 281t; needs indicators used for, 76–77; process improvement through, 332–339; of the program nuts and bolts, 352–353; public transit system performance trends over time, 179fig–181t; strategic planning on future evaluation and, 206; Texas’s strategic planning, budgeting, and, 22fig–23 Performance standards: child support enforcement program, 96–97fig; description of, 96; output and outcome measures representing, 97–99; performance measures implicated by program logic model and, 97fig–99; targets for established, 96–97 Pew Center on the States, 238 Pew-MacArthur Results First Initiative, 261, 263 Phoenix Planning and Research Bureau, 283, 284t Photometric indexes, 126 Pie charts, 160 Pilot testing performance management system, 37fig, 46–47 Policy research outcomes vs outputs, 59t Predictive validity, 134–135 
Privatization See Third-party implementation Process-oriented outcomes, 380 Process Six Sigma metrics, 344t Productivity: description of, 332–333; full-time equivalent employee, 333; micro level of, 333; process improvement by monitoring, 332–339 Productivity analysis: factors considered during, 333–339; focus of, 333 Productivity analysis factors: disaggregated measures, 338–339; monitoring service quality, 337–338; output-oriented systems, 334–337; rework, 338; standard hours, 334 Productivity improvement: monitoring, 332–339; quality management and, 339–352t See also Quality improvement Productivity measures: description and purpose of, 69; labor productivity ratios, 69–70; percentages, rates, and ratios used as, 116–117; public transit system performance, 178e–179; River Valley Transit performance standards, 181, 182t, 183; teen mother parenting education program, 80t See also System productivity measures Program Assessment Rating Tool (PART): description of, 19; improvement over time, 18fig, 20; origins and purpose of, 17 Program evaluation: data aggregation and utilization over time of, 29–30fig; evidence-based practice used for, 300–302; GDOT’s 360-degree assessment model for, 364fig–367; Girl Scout Council of Northwest Georgia customer surveys, 360–361t See also Assessment; Evaluating performance management system; Feedback Program goals: establishing program objectives, mission, and, 88–93; managerial objectives and goals versus, 105–108; performance measures that are resistant to displacing, 148–150; program logic relationships to objectives and, 92–93; strategic planning incorporation of both short-term and long-term, 206–207; United Way of Metropolitan Atlanta’s established targets and, 102; of workers’ compensation program, 107t–108 Program logic: developing program performance measures using, 54; goals, objectives, and relationship to, 92–93; models for, 54–57; outputs versus outcomes, 57–60 Program logic models: balanced scorecards 
and, 226, 228; Board of Nursing Program, 402–403fig; child support enforcement program, 97fig; collaborative process of developing a, 86; crisis stabilization, 61fig–62; on grantee and grantor perspectives, 324–326; illustrated diagram of generic, 55fig; local transit system, 176fig–177; overview and development of, 54–87; purpose of developing, 67; for stakeholder feedback, 379fig–380; state highway safety program, 63–65; state of workers’ compensation, 106fig–107; STD prevention program, 65–66fig; vocational rehabilitation program, 62–63fig Programmatic goals: comparing managerial objectives and goals to, 105–108; description of, 105 Programmatic objectives: comparing managerial goals and objectives to, 105–108; description of, 105; stated as SMART objectives, 105–106 Programmatic performance management: CitiStat program used for, 23, 41, 294–297, 363, 378, 414; Community Disaster Education Program targets and actual performance, 285–287fig; the Compass system of New Mexico State Highway and Transportation Department (NMSH&TD), 291–292; CompStat system used for, 23, 41, 292, 293t; MBO (Management by Objectives) of individual and, 283, 284t–298; measures for monitoring systems, 287–291; OASIS system used for, 292, 294 Programs: evidence-based practice used in choosing, 298–300; follow-ups required to improve, 417–418; setting goals for, 88–93, 102, 105–108, 148–150, 206–207 See also Quality improvement Progressive movement (1920s), Project grants, 312 Project management, 425–427 Public Administration Review, 12 Public management: definition of, 11–12; performance management context of, 4, 11–13; stakeholder engagement role of, 356; three “big questions” for, 12–13 Public organizations: clarifying mission, goals, and objectives for, 88–93; establishing performance measures for, 110; goal structures for nonprofit and, 108–110; programmatic versus managerial goals and objectives for, 105–108; service standards established for, 102–105; SMART objectives and targets 
established for, 93–102 Public sector: balanced scorecard models for service-oriented nonprofit and, 219–220; benchmarking in the, 384–387; conceptual understanding of budgeting performance in the, 232–235; five strategic “action areas” in the, 109; performance-based budgeting (PBB) by the, 231–270; privatization of services through third-party implementation by, 304–329 Public transit system performance: comparisons against standards and other agencies’ performance, 181–183; considerations for analyzing, 175–176; estimated variable cost recovery per route (2012), 186t–187; local transit system logic model, 176fig–177; monitoring trends over time, 179fig–181t; performance measures on, 177–179; RVT ridership by quarter, 180–181t; route analysis of, 184–187; RVT ridership (2008–2012), 179fig–180; RVT ridership by route (2012), 184fig; sampling and statistical reliability, 187–188 Purposes: clarifying the performance management system, 37fig–38, 424; identifying key performance management system, 37fig, 39–41 Q Quality-importance matrixes: description of, 369–370; local public services in the Atlanta area performance, 371fig–372; Pennsylvania Department of Transportation, 370fig–371; stakeholder feedback analysis using, 369–372 Quality improvement: examples of EPA process and organizational metrics used for, 344t; 5S methodology for, 349–350; Iowa Department of Natural Resources (DNR) results for, 350, 351t; kaizen events, 350, 352t; Lean Six Sigma, 350–352; lean tool for, 348; performance measure follow-ups required for, 417–418; as performance measurement purpose, 417; Six Sigma approach to, 343–344; Total Quality Management approach to, 339, 341–344; value stream mapping (VSM) tool for, 348–349 See also Productivity improvement; Program improvement; Programs Quality management: Deming’s fourteen steps toward, 341–342; Joseph Juran’s approach to, 342–343; productivity improvement through, 339–352t; Social Security Administration disabled worker 
beneficiary statistics (2012–2013) measures for, 339–341 Quality performance measures: description of, 416; major challenges to collecting, 416–417 R Rates: description of, 115; statistical applications of, 115–116 Ratios: description of, 115; statistical applications of, 115–116 Raw numbers, 114 Reactive measurement, 140–141 “The Real Jobless Rate” (2009), 130 Recalibrated measures: comparative measures to benchmark using, 396–397; crimes per 1,000 residents and estimated daytime population, 396–397fig Red Hat, 304 Regulatory benchmarking: Board of Nursing Program logic model, 402–403fig; Commitment to Ongoing Regulatory Excellence (CORE) used for, 367–368fig, 402, 405–408; National Council of State Boards of Nursing (NCSBN) example of, 402–408; number of actions taken to remove nurses from practice by total licensees, 405fig; patient feedback on how often patients receive help quickly from staff, 404fig–405 Reliability: benchmarking issue of comparative data, 391–392; comparing problems of validity and, 131; description and importance of, 126–127; internal consistency, 128–129; overview of issue, 127–129; possible problems associated with, 128; public transit system performance sampling and statistical, 187–188 Reporting: audience needs related to, 157–158; comparative frameworks and nature of, 156–157; performance-based budgeting, 266–267 Reporting audience: effective communication with the, 155–158; nature of performance data and comparative frameworks for the, 156–157; understanding the needs of the, 157–158 Reporting formats: basic tables and spreadsheets, 158–160; common graphical displays, 160–166; maps, 169, 171fig–173; scorecards and dashboards, 166–169fig, 170fig Reporting performance data: Fatality Analysis Reporting System, 94; performance management framework step for, 7fig–8; problem of overreporting or underreporting, 137–138; Uniform Crime Reporting System, 116 See also Data Resource measures: description of, 74; public transit system performance, 
177–178e Resource requirements, 432–433 Responsibility accountability dimension, 15 Responsiveness accountability dimension, 15 Results First Initiative (Pew-MacArthur), 261, 263 “Results owners,” 434 Rework productivity factor, 338 River Valley Transit (RVT): estimated variable cost recovery per route (2012), 186t–187; ridership (2008–2012) of the, 179fig–180; ridership by quarter, 180–181t; ridership by route (2012), 184fig A Road to Results: A Performance Measurement Guidebook for The Annie E Casey Foundation’s Educational Program (Manno, Crittenden, Arkin, and Hassel), 121 S San Francisco Metropolitan Area school ratings, 169, 171fig, 172 SAT test, 129, 141–142 Scorecards See Balanced scorecards Seoul’s information and communications technology (ICT) [South Korea], 378 September 11, 2001, 337 Service consumption, public transit system performance, 177–178e Service quality measures: common dimensions of, 70; description of, 70; examples of, 71; productivity analysis of, 337–338; public transit system performance, 177–178e; River Valley Transit performance standards, 181, 182t, 183; teen mother parenting education program, 80t See also Customer satisfaction measures Service standards: description of, 102–103; distinguishing between targets and, 102; examples of public, 103–104t; selecting a focus for, 104; targets selected for adherence to, 105; of workers’ compensation programs, 107t–108 Six Sigma: description and quality improvement approach of, 343–346; DMAIC steps of, 347; process and organizational metrics of, 344t; roles and tasks used in, 346; ultimate goal of, 348 SMART objectives: identifying performance measures using, 318; importance and benefits of establishing, 93–94; issues to consider when establishing, 95–96; MBO (Management by Objectives), 278; programmatic and managerial objectives stated as, 105–108; setting targets for, 99–102; targeted public and nonprofit programs, 94–95t, 96 See also Objectives Social Security Administration: 
disabled worker beneficiary statistics (2012–2013), 339–341; monitoring state agency performance by, 387 South Dakota’s performance-informed budgeting, 240 South Korean information and communications technology (ICT), 378 Southcoast Health System, 352 Spreadsheets report format, 158–160 Stakeholder audit, 357–358 Stakeholder buy-in, 434–435 Stakeholder engagement: build ownership through, 435; as citizens and partners in community issues, 376–378; e-government and, 378; logic model for stakeholder feedback and, 379fig–380; measuring and evaluating, 378–380; performance measurement buy-in for, 434–435; as performance measurement success factor, 420, 421–424; roles of the public in relation to the government for, 355–357 Stakeholder feedback analysis: customer satisfaction and objective performance measures, 372–374fig; expectations, satisfaction, and confirmation and disconfirmation models for, 374–376; overview of, 368–369; quality-importance matrixes used for, 369–372; of stakeholder involvement as citizens and partners in community issues, 376–378 Stakeholders: analyzing feedback by, 368–378; challenges of collaborations with, 356–357; citizens as, 356; customers as, 355; e-government involvement of, 378; Georgia Department of Transportation (GDOT) stakeholder map, 357–359; identifying, 357–359; identifying external, 37fig, 39; logic model for feedback from, 379fig–380; partners as, 356; political and agency, 356; public managers as, 356; sharing the performance management results with, 37fig, 48–49, 425 Standard hours productivity factor, 334 State highway safety program logic model, 63–65 State-level performance-informed budgeting system: Albuquerque (New Mexico) city budget process using, 253, 258fig–260fig; City of Livingston (Montana) budget representing, 252–253, 254fig–257fig; examining current practice of, 240; Georgia’s process of, 241, 242t; Maryland’s Managing for Results approach to, 240; Nevada’s approach to, 241–244, 245fig; Oregon’s adoption of 
legislative mandate on, 260–261, 262; Pennsylvania (1968) development of a, 240; Pew-MacArthur Results First Initiative adopted by numerous states, 261, 263; South Dakota’s beginnings with, 240; Texas’s long-term use of performance budgeting, 22fig–23, 244, 246–251fig; Virginia’s practice of, 252fig See also Federal budget reform; Performance-based budgeting (PBB) State governments: federalism relationship between federal government and, 309; performance-informed budgeting systems used by, 22fig–23, 240–263 See also Government; Local government; specific state Statistical benchmarking, 386 Statistics See Operational indicators STD prevention programs: explanatory variables for comparative measures of, 393; logic model of, 65–66fig, 161, 162fig Strategic management: balanced scorecards incorporated into, 221; concerns and processes of, 201–202fig; description of, 199–200; differentiating between strategic planning and, 199–200; incorporating strategic planning, budgeting, and measurement systems, 207; Pennsylvania Department of Transportation (PennDOT) example of, 202–203; performance measurement intersection with, 228–229 Strategic management model: description of, 207; illustrated diagram of, 208fig Strategic performance measures: strategic management model on, 208fig; United Way of Metropolitan Atlanta, 208–209 Strategic Performance Measures for State Departments of Transportation (American Association of State Highway and Transportation Officials), 121 Strategic planning: components of, 200; definition of, 200; differentiating between strategic management and, 199–200; illustration of conventional iterative process of, 204fig; implementation of, 202; incorporating both short-term and long-term goals into, 206–207; for performance-based budgeting, 265–266; performance management system tied to, 415–416; purpose of, 200–201; ten-step process of, 203–206 Strategisys, 17 Strategy maps: Kenya Red Cross, 222fig; objectives determined through, 220; origins and 
description of, 220 Subunits targeting approach, 100–101 Survey data, 125–126 Sustainable Seattle, 386 SWOC (strengths, weaknesses, and external opportunities and challenges) analysis, 204–205 SWOT analysis, 204 System productivity measures: description of, 73; examples of, 73–74 See also Productivity measures T Table report format, 158–160 Target setting approaches: benchmarking, 101; forecasting, 100; past trends, 100; production function, 100; subunits, 100–101 Targets: distinguishing between service standards and, 102; establishing performance standards with clear, 96–97; examples of program SMART objectives and, 94–95t, 96; service standards selected for adherence to, 105; set for SMART objectives, 99–102; United Way of Metropolitan Atlanta’s established goals and, 102 Teen mother parenting education program: crisis stabilization logic model used for monitoring a, 61fig, 78; logic model used for, 79fig; performance measures of, 78, 80t–81 Tenuous proximate measures, 136–137 Tests: ACT, 141–142; as data sources, 125; National Assessment of Educational Progress, 130; SAT, 129, 141–142; validity of, 129–130 Texas: Annual Texas Index Crime Rate of, 248; Combat Crime and Terrorism budget goal of, 248; strategic planning and performance monitoring systems of, 22fig–23; Texas Department of Public Safety, Fiscal Year 2014–2015 Budget, 249fig–251fig; Texas State Auditor reporting on the performance-informed budgeting system used by, 22fig–23, 244, 246–248 Texas Instruments, 345 Texas Tomorrow program, 386 Third-party implementation: challenges of government transition to, 305–308; considerations using performance management in grant programs, 327–329; distinguishing contracts from grants, 308–311; HealthCare.gov debacle example of problems with, 304–305; performance measurement and management in grants and contracts, 311–327 360-degree assessment model: description of, 364fig–365, 367; GDOT’s use of, 364fig–367; grades for highway safety, 366fig; survey results 
regarding selected process dimensions, 366fig Time Warner Cable’s Government Channel, 378 Toronto’s child care spaces (Canada), 389fig–390 Total Quality Management, 339, 341–344 Track DC website, 378 Transparency accountability dimension, 14 Transportation Research Board, 99 Treviicos-Soletanche, JV, 318–319 U Underreporting problem, 137–138 Uniform Crime Reporting System, 116 United Way of America, 79, 85, 376–377 United Way of Metropolitan Atlanta: five major goals established by, 102; performance measures of, 208–209 Urban Institute, 109 US Army Corps of Engineers, 318–319 US Centers for Disease Control and Prevention (CDC): National Breast and Cervical Cancer Early Detection Program (NBCCEDP), 323–324; program logic models from grantee and grantor perspectives, 324–326; STD prevention program, 65–66fig, 161, 162fig US Centers for Medicare and Medicaid Services, 368, 404 US Department of Health and Human Services (DHHS), clarifying mission, goals, objectives, and performance measures at, 89–92 US Department of Housing and Urban Development (HUD), 314, 316 US Department of Justice, 23 US Department of Transportation: mission of the, 216; outcome measures used by the, 216, 217t–218t US Department of Veterans Affairs (VA): disability compensation and patient expenditures (2000–2012), 335–337; monitoring service quality at the, 337–338; productivity analysis consideration of hospital standard hours by the, 334 US Diplomatic Service performance monitoring, 131 US Environmental Protection Agency: cost per investigation completed by the, 117; lean improvement tool used at, 348; process and organizational metrics used by the, 344t US Forest Service performance monitoring, 131 US Internal Revenue Service, 57 US Navy, 343 US Patent Office dashboard, 167–168, 169fig User-oriented outcomes, 380 Utilization measures: public transit system performance, 177–178e; River Valley Transit performance standards, 182t, 183 V VA Office of Inspector General, 334 
Validity: challenges related to tracking desired outcomes, 129–133; comparing problems of reliability and, 131; concurrent or correlational, 134; consensual, 133–134; content, 134; description and importance of, 126–127, 129–130; example of a metropolitan transit authority’s welfare-to-work initiative, 130; example of unemployment statistics, 130; face, 133; performance-based budgeting, 267 Value stream mapping (VSM), 348–349 Variable names See Performance measures Vehicle productivity (public transit system performance), 178e, 179 Virginia: performance-informed budgeting practice by, 252fig; VDOT highway construction projects (2002), 164–165, 164fig; Virginia Performs program of, 209–212fig, 252fig Virginia Performs program: framework of the, 210fig; scorecard of, 211–212fig; Virginia Department of Environmental Quality agency budget report by, 252fig; vision that guides the, 209–210 Vision: Council on Virginia’s Future objectives, measures, and goals role of, 209–216; Kenya Red Cross, 221fig; strategic planning to establish organizational, 206 Vocational rehabilitation logic model, 62–63fig W Washington, DC Office of the City Administrator (OCA), 362–365fig Washington, DC’s Track DC website, 378 Wolf Creek Dam contract (Kentucky), 318–319 Workers’ compensation programs: program logic model of, 106fig–107; service standards and management objectives of, 107t–108 Workload measures: description of, 74; examples of, 74–75 Z Zero-based budgeting (1970s), 237 WILEY END USER LICENSE AGREEMENT Go to www.wiley.com/go/eula to access Wiley’s ebook EULA
Cataloging-in-Publication Data Poister, Theodore H [Measuring performance in public and nonprofit organizations] Managing and measuring performance in public and nonprofit organizations: an integrated...
