
Handbook of Human Performance Technology, Third Edition



DOCUMENT INFORMATION

Basic information

Format
Pages: 1,410
Size: 4.73 MB

Contents

Handbook of Human Performance Technology, Third Edition
Principles, Practices, and Potential

James A. Pershing, Editor
Foreword by Harold D. Stolovitch and Erica J. Keeps

Praise for the Handbook of Human Performance Technology, Third Edition

"This third edition of the seminal Handbook weaves in two decades of applied HPT experience to provide even more relevant guidelines to today's performance improvement practitioners as they continue the important work of leveraging an organization's most precious capital—its people—toward verifiable, measurable, and valuable outcomes."
—Clare Marsch, senior principal, global learning consulting, Convergys Learning Solutions

"The Handbook of Human Performance Technology is a valued resource for professionals who lead learning and performance improvement efforts in organizations. In this edition, top thinkers in our field take on the tough issues, summarize current thinking, and offer valuable new insights."
—Catherine M. Sleezer, CPT, Ph.D., professor, human resource/adult education, Oklahoma State University

"This Handbook not only bridges the gap between European and American performance improvement strategies, it also includes key multicultural approaches for change agents that focus on business results."
—Steven J. Kelly, CPT, managing partner, KNO Worldwide

"Taking the helm with the third edition, James Pershing ensures that the Handbook of Human Performance Technology retains its leading role in the field. Two aspects particularly resonate: a new classification of interventions at the worker and team levels and workplace and organizational levels, and a superb section on measurement and assessment, which concisely applies a variety of research and evaluation techniques specifically for use in our field."
—Saul Carliner, assistant professor, graduate program in educational technology, Concordia University, Montreal, Canada

"The Handbook's clear and supportive structure and the high scientific and/or practical expertise of its authors make this excellent documentation of HPT's mission, values, processes, and tools very beneficial and credible for both managers and HPT practitioners in work or social settings, as well as academic readers with an interest in state-of-the-art HPT-related knowledge and experience."
—Verena Dziobaka-Spitzhorn, house of training/head of learning and communication, METRO Cash & Carry International GmbH, Germany

"The Handbook reflects the vast and diverse experience of the very best thinking and applications of HPT in the world today. It is an invaluable and comprehensive reference for anyone interested in improving human performance in the workplace."
—Christine Marsh, CPT, principal, Prime Objectives, United Kingdom

"As the knowledge revolution takes hold, victory will go to the smartest organizations and societies. This must-have reference handbook provides consultants and business leaders with visual models, practices, and case histories to achieve measurable improvements in human performance and business results."
—Geoffrey A. Amyot, CPT, CEO, Achievement Awards Group, South Africa

Copyright © 2006 by John Wiley & Sons, Inc.
Published by Pfeiffer, An Imprint of Wiley, 989 Market Street, San Francisco, CA 94103-1741; www.pfeiffer.com

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400, fax 978-646-8600, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, 201-748-6011, fax 201-748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. Readers should be aware that Internet websites offered as citations and/or sources for further information may have changed or disappeared between the time this was written and when it is read.

For additional copies/bulk purchases of this book in the U.S., please contact 800-274-4434. Pfeiffer books and products are available through most bookstores. To contact Pfeiffer directly, call our Customer Care Department within the U.S. at 800-274-4434, outside the U.S. at 317-572-3985, fax 317-572-4002, or visit www.pfeiffer.com. Pfeiffer also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books.

Copyright page continued on page 1,364. Cataloging-in-Publication Data on file with the Library of Congress.

Acquiring Editor: Matthew Davis; Director of Development: Kathleen Dolan Davies; Production Editors: Nina Kreiden and Liah Rose; Editor: David Horne; Manufacturing Supervisor: Becky Carreño; Editorial Assistant: Leota Higgins; Illustrations: Interactive Composition Corporation

Printed in the United States of America
Printing 10

To Patricia Lorena, James Frederick, and Dara Lynn

CONTENTS

Foreword to the Third Edition, xiii (Harold D. Stolovitch, Erica J. Keeps)
Preface, xxi
Acknowledgments, xxvii
The Editor and Editorial Advisory Board, xxix
Foreword to the First Edition, xxxi (Thomas F. Gilbert)
Foreword to the Second Edition, xxxvii (Robert F. Mager)

PART ONE: FOUNDATIONS OF HUMAN PERFORMANCE TECHNOLOGY (Monique Mueller, editor)
1. Human Performance Technology Fundamentals (James A. Pershing)
2. The Performance Architect's Essential Guide to the Performance Technology Landscape, 35 (Roger M. Addison, Carol Haig)
3. Business Perspectives for Performance Technologists, 55 (Kenneth H. Silber, Lynn Kearny)
4. Performance Improvement: Enabling Commitment to Changing Performance Requirements, 93 (William R. Daniels, Timm J. Esque)
5. Systemic Issues, 111 (Dale M. Brethower)
6. Mega Planning and Thinking: Defining and Achieving Measurable Success, 138 (Roger Kaufman)
7. The Origins and Evolution of Human Performance Technology, 155 (Camille Ferond)

PART TWO: THE PERFORMANCE TECHNOLOGY PROCESS, 189 (Jim Hill, editor)
8. Aligning Human Performance Technology Decisions with an Organization's Strategic Direction, 191 (Ryan Watkins)
9. Analysis and More, 208 (Allison Rossett)
10. Requirements: The Bridge Between Analysis and Design, 223 (Ray Svenson)
11. Modeling Mastery Performance and Systematically Deriving the Enablers for Performance Improvement, 238 (Guy W. Wallace)
12. Dimensions of Organizational Change, 262 (Larissa V. Malopinsky, Gihan Osman)
13. Using Evaluation to Measure and Improve the Effectiveness of Human Performance Technology Initiatives, 287 (Robert O. Brinkerhoff)
14. The Full Scoop on Full-Scope Evaluation, 312 (Joan C. Dessinger, James L. Moseley)

PART THREE: INTERVENTIONS AT THE WORKER AND WORK TEAM LEVELS, 331 (Karen L. Medsker, editor)
15. Instruction as an Intervention, 335 (Michael Molenda, James D. Russell)

SUBJECT INDEX

Level-based evaluation models: comprehensive
performance evaluation (CPE), 1168–1169 Leveled production, 731–732 Level of significance, 858 Levels of analysis, 1194–1195 Libraries, 62 Life cycle of project management, 945–947; evaluation of project, 961–962; execution phase, 960–961; implementation phase, 960–961; initiating phase, 947–951; organizing phase, 959–960; planning phase, 951–959; processes, 946; termination of project, 961 Likert scale, 770–773 Local focus, 1059 Logical processor (LP), 610 Logic models, 1172–1175, 1178–1180; Calhoun County Health Improvement Program model, 1174; comparisons, 1182; for comprehensive performance evaluation, 1179; generic model, 1173; improvements, 1182; measurement of key indicators, 1181; National Tobacco Control Program (NTCP), 1183, 1184; program-as-implemented, 1181–1182, 1185; program-as-intended, 1180–1181, 1183–1185; program redesign, 1186; telephone sales company example, 1183–1186 Logics of an organization, 59–60, 63–66, 74–75; economic logic, 59, 66, 67–69; external logic, 59, 63–66, 88; internal logic, 59, 76, 82–86; process logic, 59, 76, 79–81; product logic, 59, 76, 77–78; strategy logic, 59, 66, 70–73; worksheets, 89–90 London School of Economics, 528, 529 Long-term memory, 371–372 M Machinery ergonomics, 683 Malcolm Baldrige National Quality Award, 693 Management disciplines, 943–944 Management meetings, 108–109 Management practices, 47 Management systems, 19 Managing HPT projects See Project management Marketing plan, 120–121 Market research data usage, 1094–1095 Master performers, 255 Mastery performance models: areas of performance (AoPs), 246–247; development of, 238–239 Matrix performance models: analysis, 240–243, 249–251 Maximizing performance, 724–725 Measurable goals, 201 Measurement oriented practice, 837 Measurement phase: learning and performance functions, 914 Measurements: of key indicators, 1181; knowledge management (KM), 630; organizational change, 281–282; organizational development (OD), 586–588 Measurement theory, 839–842 Measures 
of central tendency, 846–847 Measures of dispersion, 847–848 Measures of relatedness, 850–853; coefficient of alienation, 852–853; coefficient of determination, 852, 864; examples, 851, 852; Pearson’s correlation coefficient, 852; types of co-relationships, 851 Mechanical determinism, 169–171 Media in instruction, 346–350 Medical ethics, 1050–1051 Mega planning and thinking: basic questions, 141; critical success factors (CSFs), 142–144; defined, 151; future change planning, 144–148; getting agreement, 144; historical origins of HPT, 175; job aid flowchart, 144–147; needs for desired results, 150; objectives before process, 148–149; objectives success criteria, 149–150; organizational elements model (OEM), 140–141; organizational planning, 138–139; organizational success, 138–139; SUBJECT INDEX planning levels, 140–141; proactive planning, 144; results levels, 149; results levels aligned with planning, 149; six-step problem-solving model, 142; strategic direction, 195–196 Mental models: definitions, 594; instructional design, 380–382; learning organizations, 594–595 Mentoring relationships, 455–456; abuses in, 461; analysis, 474; analysis in systematic design, 474; apprenticeship as mentoring, 460–461; banking organization, 465; circle mentoring, 462–463; coaching example, 459; communities of practice (CoP), 647; community organization, 465–466; computer business reengineering, 466; corporate programs, 456–458; counseling example, 459; definitions, 455–456; demonstrating example, 459–460; design and production, 474–475; diversity, valuing, 464; economic impact, 463; economic impact of skills loss, 463; effective mentoring, 465–467; evaluation of systematic design, 475; expanding the program, 475–476; facilitating desired performance example, 460; feedback and coaching example, 460; financial organization, 466; giving feedback example, 459; giving information example, 460; group mentoring, 462; guiding example, 460; Hawthorne effect, 468; health care product 
distribution, 466; highway engineering, 466–467; implementation of systematic design, 475; information systems business, 467; listening example, 459; MMHA Mentoring Coordinator Development Institute, 472; modeling example, 459; one-to-one pairings, 461; organizational benefits, 467, 470–471, 476; organizations, 456; performance system design, 474–475; pitfalls, 472–474; preventive actions to potential pitfalls, 472–474; process, roles and tasks, 458–460; reverse mentoring, 462; role of mentors, 458–459; as a strategy for performance improvement, 455, 457–458; success factors of corporate programs, 457; supervisor benefits, 471–472; systematic design, 474–475; tasks performed by mentors, 458–459; teaching example, 459; transfer of skills, 464–465, 476; tutoring, 459; tutoring example, 459 Meta-analysis, 867 Metaethics, 1026 Meta evaluation, 318–320, 324–326 Metamodel of performance improvement, 176 Methodological approaches: quantitative vs qualitative methods, 750–754 Methods of evaluation: analysis of performance issues, 302; design of solution, 304; goal setting, 300; implementation, 306; sustained impact, 309–310 Methods of instruction, 345–350; construction, 348; demonstration, 347; discovery-inquiry, 349; discussion, 348; drill and practice, 348; effectiveness, 350; expression, 348; learning methods, 346; presentation, 347; reading, 347; reflection, 348; sample, 346; teaching methods, 346; tutorial, 347 Metrics of organizational goals, 59 Millennium Project, 1112 Minimalism, 559–561 Mission analysis phase: navy aircraft design, 1071–1074 Mission-based performance hierarchy, 1073 Mission Essential Task List (METL), 1072 Mission statements, 194–195, 263, 264, 906–907 MMHA Mentoring Coordinator Development Institute, 472 Mobile computing technologies, 549–550 Modality effect, 877; interventions, 884; learners who benefit, 887; measurements of, 886; reproduction of 1345 1346 SUBJECT INDEX Modality effect (continued) experiments, 891; retention periods, 888–889; 
skill-building lessons, 887–888; summary of experiments, 878–880; theoretical context, 890–891 Modes: learning organizations, 609–610; organizational change, 265; of reflection, 1133, 1135 Monitoring, 960–961 Moral awareness, 1031–1032 Moral center, 1051–1052 Moral exemplars, 1064 Morality, 1025–1026, 1029–1030 Motivation: definitions, 478; performance maps, 44; performance support systems (PSS), 548 Motivational programs: appreciative inquiry (AI), 1157, 1158; arbitrary rules, 483; Big Five, 481; causes of motivation, 480–481; collaboration, 493–494; commitment to performance goals, 487; control, desires for, 480–481; criticism for errors, 484; dishonesty or unfairness, 482; disputes about systems, 490; financial incentives, 488–489; flat-rate incentive plans, 490; impossible performance goals, 483; individual member accountability, 494; interest values, 488; killers of motivation, 482; mutual respect among teammates, 491–492; Myers Briggs Types, 481; needs for, 478–479, 495–496; performance gaps, 479–480; piece-rate incentive plans, 489; positive emotional environments, 486–487; quota incentive plans, 489; self-confidence in work skills, 485–486; skill values, 488; successful motivators for individuals, 485; successful motivators for teams, 491; team motivation, 490–491; tournament incentive plans, 489; utility values, 488; values and work goals, 487–488 Motivation for improvement, 332 Motorola and six sigma, 693 Multi-generational (product) plan (MGP or MGPP), 701 Multiple regression, 866 Multivariate analysis of variance (MANOVA), 866 Multivariate techniques of data analysis, 865–867 Music in workplace design, 679 Myers Briggs Types, 481 Mystery caller audit, 1235–1236 N National Air and Space Model (NASM), 1075 National Organization for Competency Assurance, 1017–1018 National Society for Programmed Instruction (NSPI), 165 National Tobacco Control Program (NTCP), 1183, 1184 Navy aircraft design: analysis phases, 1070–1071; constraints and requirements, 
1076–1077; crew and team concepts and designs phase, 1080–1081; crew design, 1080–1081; feedback, 1081–1082; function allocation phase, 1076–1077; function analysis phase, 1074–1076; high drivers, 1082; human-centered approach, 1068; human performance improvement, 1068–1070; human systems performance goals analysis phase, 1074; interface concepts and designs phase, 1079–1080; International Society for Performance Improvement (ISPI) model, 1069; mission analysis phase, 1071–1074; mission-based performance hierarchy, 1073; Mission Essential Task List (METL), 1072; National Air and Space Model (NASM), 1075; performance estimation phase, 1081; principles of HPT in acquisition, 1067–1068; procurement process, 1068; systems model, 1069; task design and analysis phase, 1077–1079; team design, 1080–1081; Top-Down Function Analysis (TDFA), 1068, 1082–1083; training estimation phase, 1081; Universal Task List, 1073; user and requirements review phase, 1081–1082; SUBJECT INDEX workload estimation phase, 1081 Need hierarchy, 904–905 Need-payoff questions, 970 Needs assessment, 1099–1100 Network diagrams, 952, 954, 956 Network theory, 1261–1263 Nine-minute-teach, 939 No Child Left Behind Act of 2001, 874 Noise reduction, 676 Nonlinear dynamic systems (NDS), 1257 Nonplayer characters (NPCs), 420 Nonreactive measures, 802–804 Normal distributions: quantitative data analysis, 848 Normal excellence See Lean systems Norms: definitions, 505; standards and ethics, 1026; strategic alignment, 18; values alignment in teams, 505 Note-taking, 791 Nurse call centers, 1248–1249 O Objective observations, 1227–1228 Objectives Alignment Framework, 359–360 Objectives of a profession: community, 1049–1050; International Society for Performance Improvement (ISPI), 1049 Objectivism, 1027, 1029–1031 Obscuring variance, 1236–1237 Observation: algorithms, 799; artifact study, 800; category scheme sample, 813; checklists and category systems, 812–813; compiling data, 810–811; core participants, 
807–808; data analysis, 810; data collection, 795–796, 817; data review, 811; descriptive observations, 810, 811; direct-analysis observations, 800–801; errors of central tendency, 809; errors of leniency, 809; focused observations, 810; frequency count, 799; halo effect, 809; hermeneutical research tradition, 812; history of the method, 796; in human performance technology, 797; indirectanalysis observations, 800, 802; instruments for observation, 804–806; interpretational analysis, 812–814; key stakeholder identification, 808; legitimate peripheral participants, 807; nonreactive measures, 802–804; observation alignment matrix, 804–805; observer unreliability, 808–810; participant selection, 807–808; phenomenological analysis, 812–814; quantitative vs qualitative methods, 751; reflective analysis, 811–812; reliability, 806; stimulus response tables, 799; structural analysis, 811; structured observations, 798–800; successful observations, 815–817; task listing, 799; time and motion observations, 799; triangulation, 815; unobtrusive measures, 802–804; unreliability of observer, 808–810; unstructured observations, 798 Observation alignment matrix, 804–805 Observed behavior focus, 160 Observer unreliability, 808–810 Obstacles in games, 417 Obstacles to distance learning, 448 Obstacles to performance, 218, 219 Occupational Safety and Health Administration (OSHA), 671 Office of Educational Research and Improvement, 874 Official ignorance, 1229–1230 One-to-one pairings, 461 Open-ended questions, 768–769 Open systems, 1256 Operant, 1192 Operation adventure tools, 607 Operational levels, 525 Operations managers in stress, 42–43 Operations-process level model, 1204, 1205 Opinion leadership, 274 Opportunities See SWOT (strength, weakness, opportunity, and threat) analysis 1347 1348 SUBJECT INDEX Ordinal scales of measurement, 841 Organizational analysis of HPT, 1154 Organizational benefits of mentoring relationships, 467, 470–471, 476 Organizational change: adoption stages, 
277; appreciative inquiry (AI), 1159–1160; change agent, 274–275; changing stage, 267; coaching, 275; communication, 275; communication and training programs, 282–283; confirmation stages, 278–279; control of change, 265–266; decision stages, 278; defining organizational change, 263–264; diagnosis of organization, 270–271; diffusion, 277; early adoptors, 279; executive sponsorship, 273–274; force or coercion strategy, 271; forces driving change, 266–267; implementation of change, 276; implementation plan designs, 275–276; implementation stages, 278; incentive and recognition strategies, 283; incremental change, 264–265; incremental changes, 264–265; individuals’ needs, 276–277; individuals’ perceptions, 276–277; inertia, dangers of, 155–157; innovators, 279; knowledge stages, 278; laggards, 280; late majority, 279–280; leadership support, 275; managing change, 268, 269; measuring change, 281–282; model selection, 272; modes of change, 265; opinion leadership, 274; persuasion stages, 278; planned changes, 265–266; planning change, 268–270; proactive changes, 265; rational persuasion, 271–272; reactive changes, 265; refreezing stage, 268; resistance management, 275, 280–281; role definition, 273; role of performance technologists, 262–263; S.C.O.R.E (SymptomsCauses-Output-ResourcesEffects), 273; shared power, 272; strengths, weaknesses, opportunities, and threats (SWOT), 273; sustaining change efforts, 282; tool selection, 272–273; total quality management (TQM), 263; training, 276; transformational change, 264–265; unfreezing stage, 267; unplanned, 265–266; values of organization, 270; why organizations change, 264 Organizational Development Network: standards and ethics, 1034 Organizational development (OD): business indicators, auditing, 586; cultural factors, 574; cultural fit, 576; cultural indicators, auditing, 587; current state example, 577; definition of, 573–574; deliverables, 575, 577–578; description of problem for X-Press software developer, 572–573; 
designing OD solutions, 580–581; emergent influences, 587–588; evaluation of effectiveness, 590–591; evaluation of OD solutions, 586–588; external deliverables, 575; external resource deficiencies, 583; horizontal fit, 581; ideal state, 574–575; implementation of OD solutions, 583; implementation plan, 590; internal deliverables, 575; internal resource deficiencies, 582; knowledge management (KM), 619; lagging indicators, 586; leading indicators, 586; measurement of OD solutions, 586–588; operational indicators, auditing, 587; overcoming resistance, 585–586; overcontribution by employee, 579–580; ownership of deliverables, 584; performance gaps, 579–580; project management, 584–585; reasons for OD intervention, 571; talent fit, 576 Organizational ecology, 156–157 Organizational elements model (OEM): basic questions, 141; and critical success factors (CSFs), 148–149; mega planning and thinking, 140–141 Organizational engineering, 609–610 SUBJECT INDEX Organizational learning See Learning organizations Organizational level: iceberg model, 46–47; interventions, 567–569 Organizational performance, 1202–1206; operations level, 1204; operationsprocess level model, 1204, 1205; organizationaladministrative level model, 1206, 1207; organizational alignment, 1206–1208, 1219–1221; organizational level, 1204–1206; people-job level model, 1202, 1203; people level, 1202; sample probes, 1210–1219; subsystems of the organization, 1208; whole systems organizational performance, 1209 Organizational planning: basic questions, 141; critical success factors (CSFs), 142–151; job aid flowchart, 145–147; mega planning and thinking, 138–139; organizational elements model (OEM), 140–141, 148–149; performance technologists, 191–192; planning levels, 140–141; proactive planning, 144; results levels, 149; sixstep problem solving model, 142; strategic direction, 196–198 Organizational programs and processes, 627 Organizational relationships, 722–723 Organizational scan model, 173–174 
Organizational structure design, 118–120 Organizational success, 138–139 Organizational survival, 155–157 Organizational systems, 19 Organizations: blame of, 215–216; challenges facing HPT, 29; community or society contributions, 39; concerns, 1112–1113; definitions, 11–12; effectiveness of HPT, 28–29; General Motors example, 117–122; and the individual worker, 11; mentoring programs, 456; role set model, 98–100; subsystem maximization, 127–128, 130; as a system, 40; as systems, 117 See also Motivational programs; Values alignment; Work group performance Organizing phase, 959–960 Outbound call centers, 1231–1232 Overt alignment, 503–505 Ownership: perception analysis, 17 P Pace of production in lean systems, 730–731 Parametric scales of measurement, 841–842 Partnership phase, 911–913 Partnerships with clients: ACT (access, credibility, trust) approach, 912–913; failures, 915 Partnerships with management: American Society for Training and Development (ASTD), 925; As Is and To Be models, 931, 937–938; business-unit model, 931–933; cause analysis, 938–940; change-of-state analysis, 938–940; coreprocess model, 933–934; cultural due diligence, 936; current state of HPT profession, 924–927, 941; evaluation, 940–941; gap analysis, 937–938; human resources, 936; integrated human resource system (IHRS), 936–937; International Society for Performance Improvement (ISPI), 925; intervention selection, 938–940; job models, 934–935, 937; Language of Work Model, 927, 941; nine-minute-teach, 939; value proposition, 931; work environment, 935–937; worker analysis, 933–935; work execution improvement, 927–928; work groups, 935 Pathfinders of HPT: historical origins, 157 Patient discharge study, 1161 Pearson’s correlation coefficient, 852 Pebble-in-the-Pond model: constructivist framework, 356 People and mission alignment, 917–920 People-job level model, 1202, 1203 Perception analysis, 16–17 Perception management, 508–509 Performance: alternative for, 1092; definitions, 38; strategic 
alignment, 18 Performance analysis: action research, 21; case study, 977–984; causes and drivers, 218; 1349 1350 SUBJECT INDEX Performance analysis (continued) causes and solutions, 219; content analysis, 821–822; human and social systems, 19; of human performance technology, 1155–1156; management systems, 19; organizational systems, 19; performanceimprovement models, 18–21; physical and technical systems, 19–20; strategic direction, 203–204; SWOT (strength, weakness, opportunity, and threat) analysis, 1092–1093; technical systems, 19–20; worksheet, 984 Performance-centered design (PCD): attributes, 560; project development, 559, 560; winners, 557–558 Performance Consulting Guide: leadership skills, 967–969, 984–985 Performance consulting profession, 219–220 Performance design, 12 Performance development, 12 Performance equations, 1167–1168 Performance estimation phase, 1081 Performance feedback, 41 Performance gaps: motivational programs, 479–480; organizational development (OD), 579–580; performance support systems (PSS), 542 Performance-improvement initiatives: design, development, and implementation, 23–25; evaluation and feedback, 25–26; feasibility of interventions, 23; How questions, 17; samples, 16; strategic alignment with organization, 18; Who questions, 16–17; Why questions, 17 See also Evaluation; Rapid Reflection Model (RRM) Performance-improvement models: action research, 21; analysis team, 254–255; areas of performance (AoPs), 239, 246–247; change variables, 243–244; design, development, and implementation, 23–25; downstream teams, 255; enabler analysis, 248–249; enabler matrices, 239–243; environmental asset management systems (EAMS), 245, 259–260; environmental assets, 243–244; evaluation and feedback, 25–27; feasibility analysis, 23; feedback, 25; gap analysis, 239, 241, 247; getting started, 15; graphic representation, 15; How questions, 17; human and social systems, 19; human asset management systems (HAMS), 244–245, 256–258; human assets, 
243–244; individual master performers, 255; interventions, 21–23; key data sets, 239–240; management systems, 19; master performers, 255; multiplicity of models, 14–15; nonteam approach to analysis, 255; organizational systems, 19; perception analysis, 15–17; performance analysis, 18–21; process design and redesign, 244, 256; project steering team (PST), 252, 253–254; strategic alignment with organization, 17–18; team approach to analysis, 251; technical systems, 19–20; Who questions, 16–17; Why questions, 17 Performance improvement process (PIP), 172–173 See also Evaluation Performance learningsatisfaction (PLS) evaluation system: historical origins of HPT, 175 Performance maps, 44–46, 52 Performance measures, 884–886 Performance models: definitions, 239; enabler matrices, 240–243, 249–251; improvement efforts, 242; mastery, 238–240 Performance obstacles, 218, 219 Performance problems: intervention selection, 21; performance maps, 44–46; samples, 16 Performance support systems (PSS): American Society for Training and Development (ASTD), 541; benefits of, 555–556; best practices, 547; business process integration (BPI), 550; challenges for HPT practitioners, 539, 562–563; changing field of, 540, 562–563; chat rooms, 547; communities SUBJECT INDEX of practice (CoP), 547; composite applications, 551; data mining, 547; decision support systems (DSS), 551; definitions, 540–541; development teams, 556–559; DLS Group developed system for government, 552, 554–555; EASY system, 552, 554–555; enterprise application integration (EAI), 550–551; evaluation, 544; Factory Automation Support Technology, 550; guidance and tracking, 545; IBM developed system for utility company, 552, 553–554; implementation, 561; interventions, 333; knowledge management (KM), 547; minimalism, 559–561; mobile computing technologies, 549–550; motivation, 548; on-line external system example, 552, 553–554; on-line intrinsic system example, 552, 554–555; performance-centered design (PCD) winners, 
557–558; performance gaps, 542; The Performance Zone, 542; personal digital assistants (PDAs), 549–550; portals, 551; ProCarta, 550; return on investment (ROI), 561; scaffolding, 546; scope of systems, 542–544; taskstructuring support, 545–547; taxonomy of components, 545; tools, 548; user interface, 546–547; values alignment for project teams, 542–543; variations on, 548 Performance systems, 43–44 Performance systems, alignment with organizations: adaptive performance system, 1192–1193; analytical tools, 1193–1194; basic system model, 1197, 1198; culture factor: aligning practices, 1208, 1219–1221; dynamic systems, 1193; early models, 1192–1193; fundamental behavior model, 1192–1193; individual performer emphasis, 1191; individual performer system, 1199–1202; integrated framework, 1196; International Society for Performance Improvement (ISPI), 1195; levels of analysis, 1194–1195; models, 1190–1191; operant, 1192; operations-process level model, 1204, 1205; organizationaladministrative level model, 1206, 1207; organizational performance, 1202–1206; people-job level model, 1202, 1203; people level, 1202; practices alignment framework, 1220; reinforcing consequences, 1192; response, 1192; RULEG model, 1190; scalability, 1197–1199; stimulus, 1192; strategy factor: aligning processes, 1208, 1219; systems analysis, 1196–1197; systems framework, 1191–1192; taxonomic models, 1193–1194; whole systems organizational performance, 1209 Performance systems, design: ADDIE (analyze, design, develop, implement, evaluate) processes, 341–342; administrative requirements, 228, 229; analysis, 223–224; analysis, moving from, 223; analysis data, 230–232; best practices, 228, 229; cultural factors, 228, 230; cultural norms, 230; financial requirements, 228, 230; functional requirements, 228, 231; geographical requirements, 228, 229; instructional system, 233–234; interface requirements, 228–229, 231; legal and policy requirements, 228, 229–230; mentoring relationships, 474–475; 
multilevel solutions, 226–228; organization redesign examples, 234–235; qualification system design examples, 235–236; requirements, 223–224, 228; simple solutions, 224–225; solution development, 43; two-level solutions, 226; value of explicit requirements, 232–233, 236 Performance Systems Engineering Approach: International Society for Performance Improvement (ISPI), 224; model, 224; multilevel solutions, 226–228; simple solutions, 224–225; two-level solutions, 226 Performance technologists: blame of, 217–218; continuous improvement, 569; function in organizations, 191; organizational planning, 191–192; professional development, 899–901 See also Tools for performance technologists Performance technology landscape: critical learnings, 52–53; definitions, 41; landscape model, 51; performance, defined, 38; performance technology, defined, 38; principles, 39; resource materials, generally, 36–37; sample model, 37, 40; systemic approach to projects, 42; systems viewpoint, 40–42, 51–52; work environment, 39–40 Performance technology process, 189–190, 743 The Performance Zone, 542 Personal digital assistants (PDAs), 549–550 Personal ethics, 1058, 1063–1064 Personalization of workplace design, 671–672 Personal mastery, 594 Personal trust, 1060 Persuasion stages, 278 Pharmaceutical sales simulations case study, 423–424 Phase space, 1257 Phenomenological analysis, 812–814 Philosophical positions, 746–747 Philosophy of six sigma, 693–694 Pick up sticks game: as analogy for organization, 14 Piece-rate incentive plans, 489 Plan development, 321–322 Planning and analysis, 1041 Planning change, 268–270; change management strategy, 271–272; diagnosis of organization, 270–271; implementation plan designs, 275–276; model selection, 272; role definition, 273–275; tool selection, 272–273; values of organization, 270 Planning organizational change, 270 Planning phase of project management, 951; cost estimation, 958; fixed costs, 958; Gantt charts, 952, 956, 
958; network diagrams, 952, 954, 956; Project Evaluation and Review Technique (PERT), 954–956; project risks, 958–959; resource need estimation, 956–958; task lists, 952, 955, 957; time estimates, 954–956; variable costs, 958; work breakdown structures (WBSs), 951–954; work packages, 951 Planning tools for sustainable development, 1117 Plant design: Ford Motor Company, 1113–1114 Player characters (PCs), 420 Play time and complexity, 431 Poka-yoke, 733 Police academy communication example, 1261–1263 Political factors: feasibility analysis, 23 Population sampling, 748–749; interviewing, 787 Portals, 551 Post-positivism, 746, 758 Power law of learning, 371 Practical factors in feasibility analysis, 23 Practical guides for decision making, 192–193 Practical significance, 838–839 Practice of six sigma, 694 Practices alignment framework, 1220 Practitioners See Performance technologists Pragmatic philosophy: behavior analysis, 161–163; laboratory experiments in behavioral theory, 160–161; observed behavior focus, 160; programmed instruction, 163; response classes, 162; stimulus discrimination and generalization, 162; teaching machine, 163 Principles: performance technology landscape, 39; and practices, 516–518 Principles and Practices programs, 966–969 Principles of diversity, 180 Principles of evaluation, 315 Privacy and stimulation, 670–671 Proactive planning, 144, 265 Probability theory, 838 PROBE Model, 977, 978–979 Problem-based learning (PBL), 887 Problem questions, 970 Problem space: definitions, 385; work group performance, 531–532 ProCarta, 550 Procedural knowledge: cognitive approach, 372; sample lesson on running effective meetings, 402–410 Process: knowledge management (KM), 628; performance technology landscape, 41; quantitative data analysis, 837–838, 868–869 Process and mission alignment, 910–911; ACT (access, credibility, trust) approach, 912–913; assessment phase, 913; failures, 915; implementation phase, 914; measurement phase, 914; 
partnership phase, 911–913; phases of, 911; roles of other departments, 914, 915; workflow process, 914–915 Process focus, 107 Process-level analysis, 1000–1004 Process logic, 59, 76, 79–81 Process mapping, 702 Process model, 598–599 Process of content analysis, 825 Process of evaluation, 289–290 Process results, 993 Process step comparison, 698–699 Procurement process, 1068 Production of appearances, 1245–1246 Product logic, 59, 76, 77–78 Product-Market Matrix, 1091 Professional development: performance technologists, 899–901 Professional ethics: benefits to businesses, 1048–1049; certification, 1047; Certified Performance Technologist (CPT), 1048; China, 1053–1054; common terms, 1050; community objectives, 1049–1050; computer manufacturer example, 1060–1061; consistency of practice (CoP), 1057; core beliefs of different cultures, 1052; duty, 1059; East Asia, 1053; ethical environment, building an, 1062–1064; global duty, 1060; global impact of HPT professionals, 1059–1060; the Golden Rule, 1056–1057; integrity, 1051–1052; Islam, 1055–1056; Japan, 1054–1055; Korea, 1054; local focus, 1059; medical ethics, 1050–1051; moral center, 1051–1052; moral exemplars, 1064; organizational trustworthiness, 1061–1062; personal ethics, 1058, 1063–1064; personal trust, 1060; principles of, 1036–1037, 1057–1059, 1063–1064, 1067–1068; principles of HPT practice, 1047; principles of professional ethics, 1057, 1058; prosperity and trust, 1052; Shintoists, 1054; societal framework, 1058–1059; South Asia, 1055; spin and trust, 1062; the West, 1056–1057; West-Southwest Asia, 1055–1056; worthwhile lives, 1064 See also Codes of ethics; Standards and ethics Professionalism movement, 875 Program-as-implemented logic model, 1181–1185 Program-as-intended logic model, 1180–1185 Program evaluation models, 1171–1172 Programmed instruction, 5–6, 163 Project-based pilots, 921 Project development: collaboration between developers and users, 556; design, 559–561; development teams, 556–559; 
evaluation, 561–562; implementation, 561; minimalism, 559–561; performance-centered design (PCD), 559, 560; rapid application development (RAD), 556–559, 561–562; return on investment (ROI), 561 Project Evaluation and Review Technique (PERT), 954–956 Project management: ADDIE (analyze, design, develop, implement, evaluate) processes, 944, 960; analyzing stakeholder requirements, 947–948; benefits of, 944–945, 962–963; communication with stakeholders, 961; constraints and boundaries, 949–950; cost estimation, 958; credibility of project managers, 962; defined, 943; delegating the work, 959–960; dependencies, 952–956; essential elements, 951; evaluation of project, 961–962; fixed costs, 958; Gantt charts, 952, 956, 958; goals and deliverables, 949; initiating phase, 947; life cycle, 945–947; management disciplines, 943–944; monitoring, 960–961; network diagrams, 952, 954, 956; organizational development (OD), 584–585; planning phase, 951; problem definition, 947; Project Evaluation and Review Technique (PERT), 954–956; project risks, 958–959; resource need estimation, 956–958; roles and responsibilities, 959; scheduling, 952–956; scope documents, 950–951; scope enlargement, 922; self-assessment form, 945; six sigma, 703; stakeholder management worksheet, 948; stakeholder requirement analysis, 947–948; task lists, 952, 955, 957; team member selection, 959; termination of project, 961; time estimates, 954–956; triple constraint model, 949; use of project management by HPT practitioners, 944–945; variable costs, 958; work breakdown structures (WBSs), 951–952; worksheets, 948 Project risks, 958–959 Project steering team (PST), 252, 253–254 Psychological theory of needs: historical origins of HPT, 159 Psychomotor learning: goals and objectives, 345 Pull and flow, 725–726 Pyramid of quality, 709–710 Q Qualifications: certification, 1015; performance system design, 235–236 Quality functional deployment (QFD), 703 Quality 
improvements: samples, 16 Quality ratings: form, 1234; systems theory, 1232–1234 Quantitative data analysis: analysis of variance (ANOVA), 861, 866; canonical correlation, 866; categorical data, 842–843; central tendency, measures of, 846–847; chi-square tests, 859; coefficient of alienation, 852–853; coefficient of determination, 852, 864; continuous data, 842–843; correlation, 850–853; data types, 842–843; decision matrix, 857; decision theory, 838; descriptive statistics, 839; discrete data, 842–843; discriminant analysis, 866–867; dispersion, measures of, 847–848; effect size, 858–859; factor analysis, 867; graphs, 843; inferential statistics, 839; inferential techniques, 853–855; interval scales of measurement, 841; level of significance, 858; meaningfulness, 839–842, 867–869; measurement-oriented practice, 837; measurement theory, 839–842; measures of central tendency, 846–847; measures of dispersion, 847–848; measures of relatedness, 850–853; meta-analysis, 867; multiple regression, 866; multivariate analysis of variance (MANOVA), 866; multivariate techniques, 865–867; nominal scales of measurement, 841; nonparametric scales of measurement, 841–842; normal distributions, 848; numerical data, 842–843; ordinal scales of measurement, 841; parametric scales of measurement, 841–842; Pearson’s correlation coefficient, 852; practical significance, 838–839; probability theory, 838; process, 837–838, 868–869; ratio scales of measurement, 842; recommendations, 868–869; regression, 861–864; relatedness, measures of, 850–853; resources, 869–870; scales of measurement, 841–842; skewed distributions, 849–850; statistical significance, 838–839; test statistics, 839; t-tests, 859–861; Type I error, 857; Type II error, 857 Quantitative vs qualitative methods: ADDIE (analyze, design, develop, implement, evaluate) processes, 745, 757; case study: medical school reaction evaluations, 755–757; content analysis, 820–821; data analysis, 755; data gathering, 754–755; 
deductive reasoning, 746; inductive reasoning, 746; methodological approaches, 750–754; observation, 751; philosophical positions, 746–747; population sampling, 748–749; post-positivism, 746, 758; reactionnaires, 755–756; research designs and methodologies, 746, 757–758; rigorous use of, 747–748; study focus, 747–748; written questionnaires, 751, 752–754 Questioning, 969, 970, 972 Questionnaire construction: administrative guidelines for use, 775–776; anonymity, 763–764; assess questions, 762–763; closed-ended questions, 768–769; closing statement, 765; data analysis, 774–775; design considerations, 760–761; information requirements, 762; instructions, 764–765; introduction, 763–764; layout or format, 772–774; Likert scale, 770–773; open-ended questions, 768–769; process of construction, 761–762; quality questionnaires, 763, 776; question construction, 765–768; question formats, 768–769; rating scales, 769–772; semantic differential scale, 772, 774; Thurstone scaling, 770, 771 Quick changeover, 727–728 Quota incentive plans, 489 R Rapid application development (RAD), 556–559, 561–562 Rapid deployment, 1009–1010 Rapid Reflection Model (RRM), 1123, 1125–1128, 1130, 1136, 1138; After Action Review (AAR), 1129, 1138; American Society for Training and Development (ASTD), 1130; anticipatory reflection, 1128; challenges, 1139; coaches, 1130, 1136, 1138; critical reflection, 1127; double-loop learning, 1127; example of model in action, 1139–1140; focus of reflection, 1133; foundations of reflection, 1124–1125; framework for questions, 1135; HPT professional, role of, 1132; for HPT projects, 1126; idea cisterns, 1135; implementation, 1131–1138; increasing use of model, 1141–1142; individual skills and mind-set, 1130–1131; learning organization, 1124–1125; model summary, 1131; mode of reflection, 1133; nature of project, 1131; organizational culture, 1131; performance-improvement initiatives in HPT profession, 1122–1124; preparatory reflection, 1128; project, nature of, 1131; 
reflection-in-action, 1127–1128; reflection-on-action, 1124, 1125, 1128–1130; reflection-to-action, 1125, 1128; reflective moments, 1136; reflective planning, 1128; resources, 1142–1143; responses to reflection questions, 1135–1136; role of HPT professional, 1132; success strategies, 1138–1139; summary of model, 1131; support for reflection, 1131; support of reflection, 1132–1133; support of reflection within organization, 1132–1133; technology usage, 1136; template of guiding questions, 1133–1136; time constraints, 1123, 1124; timing of reflection, 1133; tips and techniques, 1136–1138; tools, 1137–1138; Universal Rapid Reflection Model, 1141–1142 Rating scales in questionnaire construction, 769–772; Likert scale, 770–773; semantic differential scale, 772, 774; Thurstone scaling, 770, 771 Rational persuasion, 271–272 Ratio scales of measurement, 842 Reactionnaires, 755–756 Reactive changes, 265 Reactive stimulator (RS), 610 Recognition of achievement, 1024–1025 Reductionism, 1253 Reflection, 348 Reflection-in-action, 1127–1128; framework for questions, 1135; and the HPT process, 1128; need for, 1139; responses to reflection questions, 1135; template of guiding questions, 1134; Universal Rapid Reflection Model, 1141–1142 Reflection-on-action, 1124, 1125, 1128–1130; coaches, 1136; framework for questions, 1135; and the HPT process, 1129; responses to reflection questions, 1135–1136; template of guiding questions, 1134; tips and techniques, 1136–1137; Universal Rapid Reflection Model, 1141–1142 Reflection-to-action, 1125, 1128; coaches, 1136; framework for questions, 1135; responses to reflection questions, 1135–1136; template of guiding questions, 1134; tips and techniques, 1136–1137; Universal Rapid Reflection Model, 1141–1142 Reflection tools, 606–607 Reflective analysis, 811–812 Reflective moments, 1136 Reflective planning, 1128 Refreezing stage, 268 Regression, 861–864, 866 Relatedness, measures of, 850–853; coefficient of alienation, 
852–853; coefficient of determination, 852, 864; examples, 851, 852; Pearson’s correlation coefficient, 852; types of co-relationships, 851 Relational innovator (RI), 610 Relativism, 1029–1031 Reliability, 806, 808–810, 817 Repair technician training, 111–112 Research: effectiveness of specific interventions, 9; evaluation of performance improvement initiatives, 25; quantitative vs qualitative methods, 746, 747, 757–758 Research study problems, 892, 893 Resistance management, 275, 280–281 Resistance to change, 1278–1279 Resource management simulations case study, 424–425 Resource materials, 36–37 Resource need estimation, 956–958 Respect for other views, 652–653, 662 Respect for people, 724 Response classes, 162 Responsibilities for employees, 629 Responsibility and accountability, 507–508 Results chain, 992–995 Results levels, 149 Results oriented, 13 Retention periods, 888–889 Return on investment (ROI): analysis of outcomes, 208–210; benefits to mentor in mentoring relationships, 469–470; benefits to the mentored in mentoring relationships, 467–469; cost of failure, 433; evaluation of improved performance, 49, 52, 315; evaluation of performance support systems (PSS), 561; games and simulations in training, 433–434; General Motors systemic issues, 120; organizational benefits of mentoring relationships, 467, 470–471; performance support systems (PSS), 561; supervisor benefits in mentoring relationships, 471–472; tools for performance technologists, 49 Reverse mentoring, 462 Role-set model, 95–100, 107–109 Routine work, 673–674 RULEG model, 1190 S Safety performance improvement, 833 Scaffolding, 546 Scalability, 1197–1199 Scenario planning, 613 Schema theory, 353 Scientific management, 158 Scope documents, 950–951 Scope of observation, 805 Scope of systems, 542–544 S.C.O.R.E. (Symptoms-Causes-Output-Resources-Effects), 273 Script preparation in interviewing, 789 Secondary adjustments, 1230 Self-assessment form, 945 Self-confidence in work skills, 
485–486 Self-management of teams, 501–502 Self-organizations, 1256 Self-reference in observation, 1282–1283 Semantic differential scale, 772, 774 Sensitivity analysis, 527, 533–534 Serious Performance Consulting According to Rummler, 910 Setup reduction, 727–728 The Seven Habits of Highly Effective People, 906 Shift work, 686–687 Shintoists, 1054 Short term memory, 371 Significance level, 858 Simulations (sims), 418–421, 429–430 Single-loop learning, 598–602 Single-piece flow, 726–727 Situational Leadership Model, 964–966 Situational questions, 969 Six sigma: background, 693; barriers to implementation, 695; benchmarking, 700; benefits of, 694–695; critical customer requirements (CCRs), 700; critical to quality (CTQ), 700; cultural impact on implementation, 709; definitions, 693; design of experiments (DOE), 701; Eckes’ formula, 709; failure modes and effects analysis (FMEA), 701; fault tree analysis, 701; flowcharting, 701–702; Ford Motor Company, 694–695; full-scope evaluation, 323; health care field, 710–711; integration with HPT, 711–713; International Society for Performance Improvement (ISPI) HPT Model, 696, 697, 712; major organizational change, 263; Malcolm Baldrige National Quality Award, 693; modeling and simulations, 702–703; and Motorola, 693; multigenerational (product) plan (MGP or MGPP), 701; philosophy, 693–694; practice, 694; process mapping, 702; process step comparison, 698–699; project management, 703; pyramid of quality, 709–710; quality functional deployment (QFD), 703; roles and responsibilities, 705, 707–708; selecting the right tool, 705, 706; similarities and differences with HPT, 692–693, 711–715; statistical analysis, 703–704; statistical measurement, 848; structure tree diagram, 704; techniques, 695–696; theory of inventive problem solving (TRIZ), 704; total quality management (TQM), 701; verification-validation plan, 705; voice of the customer (VOC), 704–705 Skewed distributions, 849–850; examples, 850 Skill-building lessons, 887–888 Skill 
development, 646 Skills loss: diversity, valuing, 464; economic impact, 463; transfer of skills, 464–465, 476 Skinner Box, 163 SMARTER goals, 174–175 Social dynamics, 523–524 Social responsibility: standards and ethics, 1033; sustainable development, 1112 Social science motivation theories, 158–160 Societal trends, 63–64 Societal value, 1114–1115 Socio-technical interaction networks (STIN), 649 Soft systems methodology (SSM), 1278 Software program team case study, 100–101; all hands program meetings, 101–103; program team meetings, 105–106; project chart, 102; project team meetings, 104–105 Solution development, 43 Solution space, 532–533 Sort and sift, 827 South Asia, 1055 Spin and trust, 1062 SPIN questions, 969, 972 SPIN Selling, 969 Stability principle, 723–724 Stages of capability, 448–449 Stages of organizational capability, 448–449 Stakeholders, 17; management worksheet, 948; requirement analysis, 947–948 Standardized work, 734–735 Standards and ethics: absolute, 1029–1031; Academy of Human Resource Development (AHRD), 1034; ADDIE (analyze, design, develop, implement, evaluate) processes, 1025; ambiguous situations, decision making in, 1031–1032; American Society for Training and Development (ASTD), 1034; Aristotle, 1028–1029; business ethics, defined, 1026; certification, 1014, 1016–1017, 1042–1043; Certified Performance Technologist (CPT), 1043; code of professional conduct, 1034–1036; codes of ethics, 1032; consequentialism, 1027–1028; consequentialist theories, 1026–1029; decision making in ambiguous situations, 1031–1032; deontology, 1028; descriptive ethics, 1026; design and development, 1041; ethical altruism, 1027; ethical egoism, 1027; ethical foundations and HPT, 1033–1034; ethical perspectives and HPT, 1032–1033; ethics, defined, 1026; ethics, described, 1025–1026; Ford Pinto, 1031–1032; and HPT professionals, 1024–1025; implementation and management, 1041; intentionality, 1028; International Society for Performance 
Improvement (ISPI), 10, 1034; Kitchen Cabinet, 1042; metaethics, 1026; moral awareness, 1031–1032; morality, absolute or relative, 1029–1030; morality, defined, 1025–1026; normative ethics, 1026; objectives of HPT professionals, 1036; objectivism, 1027, 1029–1031; Organizational Development Network, 1034; planning and analysis, 1041; principles of HPT professionals, 1036–1037; professional foundations, 1040; recognition of achievement, 1024–1025; relativism, 1029–1031; social responsibility, 1033; standards, 1039–1040; teleological theories, 1026–1029; training managers’ codes of ethics, 1038; utilitarianism, 1027; values, defined, 1026; virtue ethics, 1028–1029 See also Professional ethics Starbucks Corporation, 1094 Statistical analysis: evidence-based practice, 889; six sigma, 703–704 Statistical measurement, 848 Statistical significance, 838–839, 855–859 Stimulus discrimination and generalization, 162 Stimulus response tables, 799 Strange attractors, 1258 Strategic alignment, 12, 17–18, 50 Strategic Business Partner: Aligning People Strategies with Business Goals, 912 Strategic dialogue, 603–604 Strategic direction: analysis of, 196–198; annual reports, 198; assessing multiple perspectives, 201; cause analysis, 204; chain of results, 195; considerations for HPT, 203–206; decision making, 202; effective direction setting, 194–196; employee role, 192–193, 195; evaluation, 206; failure in planning, 192; goals of an organization, 12; ideal vision, 194; implementation of interventions, 205–206; intervention design, 205; measurable goals, 201; mega planning and thinking, 195–196; mission statement, 194–195; model for effective direction setting, 194–196; organizational planning, 196–198; organizational plan questions, 197; performance analysis, 203–204; practical guides for decision making, 192–193; shared goals and objectives, 193; strategic plans, supporting with, 206–207; unconventional wisdom, 193 Strategic Impact Model, 336–337 Strategic intent, 611–612 
Strategic planning, 174–175, 526–527, 610–613, 1099–1100 Strategic work, 906
