Practicing Organization Development (A Guide for Consultants), Part 36

TARGET GROUP: Interventions designed to improve the effectiveness of INTERGROUP RELATIONS
TYPES OF INTERVENTIONS:
• Intergroup activities (process directed and task directed)
• Organizational mirroring
• Partnering
• Process consultation
• Third-party peacemaking at the group level
• Grid OD phase 3
• Survey feedback

TARGET GROUP: Interventions designed to improve the effectiveness of the TOTAL ORGANIZATION
TYPES OF INTERVENTIONS:
• Sociotechnical systems (STS)
• Parallel learning structures
• MBO (participation forms)
• Cultural analysis
• Confrontation meetings
• Visioning
• Strategic planning/strategic management activities
• Grid OD phases 4, 5, 6
• Interdependency exercise
• Survey feedback
• Appreciative inquiry
• Future search conferences
• Quality of work life (QWL) programs
• Total quality management (TQM)
• Physical settings
• Large-scale systems change

From W.L. French & C.H. Bell, Jr. (1995). Organization Development: Behavioral Science Interventions for Organization Improvement. Englewood Cliffs, NJ: Prentice-Hall, p. 165. Reprinted with permission.

RESEARCH REGARDING IMPLEMENTATION

We will first consider some issues regarding research on organization change and development and, second, some of the major results of research in this arena.

Research Issues

The overriding issue regarding research in the arena of organization change and development is purpose: whether the research effort is for evaluation (Did it work?) or for knowledge generation (What is truth?). In other words, is the research for the benefit of the client or the researcher? If we rely on the traditional scientific method, then we control and manipulate some independent variables, make some interventions, and see whether any difference occurred in relation to certain dependent variables. For instance, we decide to use team building as an intervention and collect information (dependent variables) to see whether it made any difference. We might use a questionnaire to ask team members whether they feel more satisfied with and committed to the team, and we might determine whether the team's work performance increased after the team-building effort occurred. Even if our data showed increased satisfaction, commitment, and work performance, it would be difficult to demonstrate that the team-building intervention had caused these outcomes unless we had also collected data from a matched control group (for example, a similar team for which no team building had been done), unless we had collected data from both teams before and after the team-building effort, and unless we had collected these two sets of data at essentially the same time.

Another critical factor in this evaluation effort would be the people who collect and analyze the data. Numerous studies have shown that the researcher can affect the outcome (Rosenthal, 1976), which raises the question of objectivity. To be scientific, or objective, the researcher should be someone other than the team-building consultant or the organization members involved. Argyris has argued, however, that the more scientific the evaluation is, the less likely it is to be relevant to and therefore used by the client.
He states that traditional scientific methods of evaluation (his term is "mechanistic") "tend to create primarily dependent and submissive roles for the clients and provide them with little responsibility; therefore, the clients have low feelings of essentiality in the program (except when they fulfill the request of the professionals)" (Argyris, 1970, p. 105).

To illustrate, rather dramatically, the challenges of employing the scientific method in an organizational field setting, consider the study by Blumberg and Pringle (1983). Their study involved interventions of job redesign and participative management in a coal mine. Following the proper methods of science, these researchers had a control group and an experimental group of miners. The control group, however, found out about what was going on with the experimental group and resented it, feeling that the latter had the unfair advantage of improving their quality of work life. The control group of miners, being unionized, then proceeded to vote to stop the entire process. And they succeeded. The change effort stopped abruptly. We will never know whether the interventions worked, and we will never have the advantage of knowing the efficacy of these interventions from whatever learning did occur. The important lesson that the researchers learned was about attempting traditional research in a field setting.

Beer and Walton (1987) have suggested that moving in the direction of what is labeled "action science" (Argyris, Putnam, & Smith, 1985) is a better choice. As noted before:

"This means moving away from typical positivistic assumptions regarding research in organizations and toward a process that (a) involves the users in the study, (b) relies on self-corrective learning, say, trying certain assessment methods and then modifying them along the way as trial and error yields knowledge, and (c) occurs over time, not episodically. As Beer and Walton noted, the literature about this kind of choice regarding organization change research has begun to grow." (Burke, 2002, p. 125)

Work by Carnall (1982), Morgan (1983), and Legge (1984) represents some of this growth.

Research Results

There has been early evidence that organization change and development interventions do work. French and Bell (1978) selected nine studies that they considered to be supportive. These studies were conducted from 1964 to 1974 and included interventions such as grid OD, participative management, team building, and the use of multiple interventions. In a later and very careful review of seventy-two studies of OD interventions, Porras and Robertson (1992) found overall that (a) more than 40 percent of these studies had positive outcomes, that is, the interventions led to significant change in the intended direction; (b) about 50 percent of the studies showed little or no change; and (c) a small percentage, ranging from 7 to 14 percent, led to negative outcomes. Porras and Robertson classified the many different interventions they studied into four broad categories: (1) organizing arrangements, for example, establishing new committees, task forces, or quality circles; (2) social factors, for example, team building; (3) technology, for example, job redesign; and (4) physical settings, such as modifying an office layout from a closed to a more open space.
These researchers drew the following conclusions from their extensive review:

"The fact that lack of change in the dependent variables occurred more frequently than any other change can potentially be explained in one of three ways. First, and most pessimistically, it could simply be that OD, in general, is not very effective, with desired results being achieved less than half the time. Second, it could be that the interventions used in these studies cannot achieve results consistently. The problem may not be with any specific intervention, but that too frequently only one intervention, or interventions of one type, was used. Only six cases existed in which a multifaceted, multi-category program of intervention took place. The lack of positive change may have been the result of the lack of comprehensiveness of the change effort.

"Finally, it could be that beta change was involved in the measurement of many of the dependent variables. As previously discussed, beta change involves a psychological recalibration of the instrument used to measure a stable dimension of reality (Golembiewski, Billingsley, & Yeager, 1976, pp. 135–136). In other words, as a result of the OD intervention or interventions, organizational members' perceptions of various aspects of their work environment can be altered so that the measures used to assess these characteristics do not maintain their calibration over time. Consequently, while the characteristics being measured may in fact undergo change, such change may not be demonstrated because of the psychological recalibration of the measure. An apparent lack of change can thus mask an actual change in the variable measured." (Porras & Robertson, 1992, p. 786)

The good news from the Porras and Robertson study is that little or no harm was done by using the interventions that were investigated.

SUMMARY AND CONCLUSIONS

As stated at the outset, implementation is the difficult part of OD, but not the most difficult. Sustaining the change effort once it is under way is more difficult than implementation as such, certainly more difficult than early implementation in the overall change process. The difficult aspects of implementation include dealing with (a) resistance, remembering that there are different forms of resistance (blind, ideological, and political), and (b) unanticipated consequences, those reactions to interventions that are not in the plan because we simply did not think about them in advance. Also bear in mind that standards exist for the effectiveness of implementation in the form of an intervention (the three criteria from Argyris: valid information, choice, and commitment) and that no single intervention by itself is sufficient for effective OD.

Finally, as research demonstrates, understanding the outcomes of our interventions, and the effectiveness of those outcomes, is not a simple matter. The researcher's behavior can affect outcomes, and there is usually the question of exactly what was affected by our interventions: just perceptions, or actual attitudes and values? Research does show that interventions have organizational effects. We must be diligent about being as clear as we can about the depth of these effects.

References

Argyris, C. (1970). Intervention theory and method. Reading, MA: Addison-Wesley.

Argyris, C., Putnam, R., & Smith, D.M. (1985). Action science. San Francisco: Jossey-Bass.

Beckhard, R., & Harris, R.T. (1987). Organizational transitions: Managing complex change (2nd ed.). Reading, MA: Addison-Wesley.
Beer, M., & Walton, A.E. (1987). Organization change and development. Annual Review of Psychology, 38, 339–367.

Blake, R.R., & Mouton, J.S. (1976). Consultation. Reading, MA: Addison-Wesley.

Blumberg, M., & Pringle, C.D. (1983). How control groups can cause loss of control in action research: The case of Rushton Coal Mine. Journal of Applied Behavioral Science, 19, 409–425.

Brehm, J.W. (1966). A theory of psychological reactance. New York: Academic Press.

Burke, W.W. (1982). Organization development: Principles and practices. Boston, MA: Little, Brown.

Burke, W.W. (1994). Organization development: A process of learning and changing (2nd ed.). Reading, MA: Addison-Wesley.

Burke, W.W. (2001). The broadband of organization development and change: An introduction. In L. Carter, D. Giber, & M. Goldsmith (Eds.), Best practices in organization development and change (pp. 3–9). San Francisco: Pfeiffer.

Burke, W.W. (2002). Organization change: Theory and practice. Thousand Oaks, CA: Sage.

Burke, W.W., & Biggart, N.W. (1997). Interorganizational relations. In D. Druckman, J.E. Singer, & H. Van Cott (Eds.), Enhancing organizational performance (pp. 120–149). Washington, DC: National Academy Press.

Burke, W.W., Clark, L.P., & Koopman, C. (1984). Improve your OD project's chances for success. Training & Development, 38(8), 62–68.

Burke, W.W., & Hornstein, H.A. (Eds.). (1972). The social technology of organization development. San Diego, CA: Pfeiffer & Company.

Carnall, C.A. (1982). The evaluation of organizational change. Brookfield, VT: Gower.

Cummings, T.G., & Worley, C.G. (2005). Organization development and change (8th ed.). Cincinnati, OH: Southwestern.

Fagenson, E.A., & Burke, W.W. (1990). The activities of organization development practitioners at the turn of the decade of the 1990s. Group & Organization Studies, 15, 366–380.

French, W.L., & Bell, C.H., Jr. (1978). Organization development: Behavioral science interventions for organization improvement (2nd ed.). Englewood Cliffs, NJ: Prentice-Hall.

French, W.L., & Bell, C.H., Jr. (1995). Organization development: Behavioral science interventions for organization improvement (5th ed.). Englewood Cliffs, NJ: Prentice-Hall.

Golembiewski, R.T., Billingsley, K., & Yeager, S. (1976). Measuring change and persistence in human affairs: Types of change generated by OD designs. Journal of Applied Behavioral Science, 12, 133–157.

Hambrick, D.C., & Cannella, A.A., Jr. (1989). Strategy implementation as substance and selling. Academy of Management Executive, 3, 278–285.

Legge, K. (1984). Evaluating planned organizational change. Orlando, FL: Academic Press.

Lewin, K. (1947). Group decision and social change. In T.M. Newcomb, E.L. Hartley, et al. (Eds.), Readings in social psychology (pp. 330–344). New York: Henry Holt.

Massarik, F., & Pei-Carpenter, M. (2002). Organization development and consulting. San Francisco: Pfeiffer.

Morgan, G. (Ed.). (1983). Beyond method: Strategies for social research. Thousand Oaks, CA: Sage.

O'Toole, J. (1995). Leading change: Overcoming the ideology of comfort and the tyranny of custom. San Francisco: Jossey-Bass.

Pascale, R.T., Millemann, M., & Gioja, L. (2000). Surfing the edge of chaos: The laws of nature and the new laws of business. New York: Crown Business.
Porras, J.I., & Robertson, P.J. (1992). Organizational development: Theory, practice, and research. In M.D. Dunnette & L.M. Hough (Eds.), Handbook of industrial and organizational psychology (2nd ed., Vol. 3, pp. 719–822). Palo Alto, CA: Consulting Psychologists Press.

Rosenthal, R. (1976). Experimenter effects in behavioral research (enlarged ed.). New York: Halsted Press.

Schmuck, R.A., & Miles, M.B. (Eds.). (1971). Organization development in schools. Palo Alto, CA: National Press Books.

Smither, R.D., Houston, J.M., & McIntire, S.D. (1996). Organization development: Strategies for changing environments. New York: HarperCollins.

CHAPTER TWELVE

Evaluation

Gary N. McLean and Steven H. Cady

As described in Chapter Two, evaluation is a major step in OD; however, it is one piece of the model that is frequently omitted or cut short. This chapter clarifies what evaluation means in OD and describes the importance of evaluation for OD consultants and their clients, barriers to evaluation, issues to consider when planning an evaluation, and evaluation competencies. Instruments and techniques used for evaluation are explored, along with examples.

EVALUATION DEFINED WITHIN A SYSTEMS PERSPECTIVE

"Evaluation is a set of planned, information-gathering, and analytical activities undertaken to provide those responsible for the management of change with a satisfactory assessment of the effects and/or progress of the change effort." (Beckhard & Harris, 1977, p. 86)

A commitment to planned evaluation should be made early in the OD process, preferably in the pre-launch phase. Planned evaluation allows those involved to gather and examine data and to judge the value of the OD process on a continuing basis, with the purpose of improving the process or deciding whether to continue it.

Workplace realities rarely facilitate the application of pure research methods, so OD evaluation "is likely to be more action centered, value based, collaboratively contexted, experientially rooted, situationally responsive, praxis oriented, and self-reflective than the current image" of research (Evered, 1985, p. 439). Carefully planned evaluation pays attention to both soft (attitudinal) data, such as job satisfaction, and hard (quantitative) data, such as employee turnover rates.

The target of an OD evaluation may be the total organization or system, the organization's relationship with the external world and other organizations (transorganizational interaction), individual development, interpersonal development, intra-team and inter-team development, or role development. These targets for evaluation are an expansion of those shown by Schmuck and Miles (1976) on the z-axis of their OD Cube. (See Figure 12.1.) Evaluation may target either the processes in use during the change effort (see Figure 12.2, the x-axis of the OD Cube by Schmuck & Miles, 1976) or the outcomes of the change effort (see Figure 12.3, the y-axis of the OD Cube).

Figure 12.1. The OD Cube
From R. Schmuck & M. Miles (1971), Organization Development in Schools. In M. Miles & R. Schmuck (Eds.), Improving Schools Through OD: An Overview (pp. 1–28). Palo Alto, CA: National Press Books. Used by permission of the publisher. The OD Cube was originally published in Organization Development in Schools, p. 8, 1971.
San Diego, CA: Pfeiffer & Company. For subsequent development of the OD Cube, see R. Schmuck & P. Runkel (1985), The Handbook of Organization Development in Schools (3rd ed.). Prospect Heights, IL: Waveland Press.

[Figure 12.1 depicts the OD Cube along three dimensions: Diagnosed Problems (goals and plans; communication; culture and climate; leadership and authority; problem solving; decision making; conflict/cooperation; role definition; other), Focus of Attention (total organization; intergroup, two or more groups; team/group; dyad/triad; role; person; other), and Mode of Interventions (plan-making; confrontation; data feedback; problem solving; training/education; techno-structural activity; process consultation and coaching; OD task force establishment; other).]

Figure 12.2. OD Research Variables: Process
From J. Porras & P. Berg (1978, August). The Impact of Organization Development. Academy of Management Review, p. 252. Used by permission of Academy of Management Review.

[Figure 12.2 lists process variables at the individual, leader, group, and organization levels (for example, communication, trust, participation, goal setting and consensus, decision making, leader-subordinate relations, and organizational climate), grouped into people-oriented and task-oriented categories.]

In whatever context the evaluation occurs, the focus of the evaluation must be on how the OD process has impacted the total organizational system. When a continuing evaluation effort is made, the following outcomes are likely:

• When management requests information to prove the value of the expenditure for OD, quality data are available or are in the process of being collected.
• Participants are more likely to have a positive attitude about OD and about the organization because they have been involved in the evaluation.
• The OD endeavor is likely to be more efficient and effective.
• There is likely to be increased quality and productivity in accomplishing organizational objectives.
• Additional OD needs, whether they arise from a lack of organizational support for the changes created by the intervention or from any other source, may be identified and addressed.

Figure 12.4 depicts an evaluation model that may be followed when conducting a planned evaluation of an OD intervention.

Figure 12.3. OD Research Variables: Outcomes
From J. Porras & P. Berg (1978, August). The Impact of Organization Development. Academy of Management Review, p. 253. Used by permission of Academy of Management Review.

[Figure 12.3 lists outcome variables at the individual, group, and organization levels, including performance ratings, satisfaction (with the job, pay, work group, supervisor, and company), absenteeism, turnover, workforce characteristics, and economic outcomes such as production levels and efficiency.]
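To make the measurement ideas in this excerpt more concrete (the pre/post, matched-control design discussed under Research Issues in the previous chapter, and the pairing of soft and hard data noted above), the following is a minimal, hypothetical sketch of how an evaluator might tabulate such data and compute a simple difference-in-differences estimate. The team names, numbers, measures, and the choice of analysis are illustrative assumptions, not material from the book.

```python
# Hypothetical illustration: pre/post evaluation of a team-building intervention
# with a matched control group. All figures below are invented for the sketch.

from dataclasses import dataclass


@dataclass
class GroupMeasures:
    name: str
    satisfaction_pre: float   # mean survey score (1-5 scale) before the intervention (soft data)
    satisfaction_post: float  # mean survey score after the intervention
    turnover_pre: float       # annualized turnover rate before (e.g., 0.18 = 18%) (hard data)
    turnover_post: float      # annualized turnover rate after


def change(pre: float, post: float) -> float:
    """Simple pre-to-post change for one group."""
    return post - pre


def difference_in_differences(experimental: GroupMeasures,
                              control: GroupMeasures,
                              attr_pre: str, attr_post: str) -> float:
    """Change in the experimental group minus change in the control group.

    Subtracting the control group's change is a rough way to discount influences
    (market conditions, seasonality) that affect both teams alike.
    """
    exp_change = change(getattr(experimental, attr_pre), getattr(experimental, attr_post))
    ctl_change = change(getattr(control, attr_pre), getattr(control, attr_post))
    return exp_change - ctl_change


if __name__ == "__main__":
    # Both sets of "pre" data are assumed to have been collected at the same time,
    # as the text recommends, and likewise for the "post" data.
    team_a = GroupMeasures("Team A (team building)", 3.1, 3.8, 0.18, 0.12)
    team_b = GroupMeasures("Team B (matched control)", 3.2, 3.3, 0.17, 0.16)

    sat_effect = difference_in_differences(team_a, team_b,
                                           "satisfaction_pre", "satisfaction_post")
    turn_effect = difference_in_differences(team_a, team_b,
                                            "turnover_pre", "turnover_post")

    print(f"Soft measure (satisfaction) effect estimate: {sat_effect:+.2f} scale points")
    print(f"Hard measure (turnover) effect estimate: {turn_effect:+.2%}")
```

Even a clean-looking comparison like this remains subject to the cautions raised in these chapters: the control group may learn about and react to the intervention (as in the Blumberg and Pringle mine study), the researcher can influence the results, and an apparently stable survey score may reflect beta change rather than no change at all.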
