Problems with Computer Control 363

If an instrument reading is faulty, operators are sometimes able to override the instrument and type in an estimated reading. Sometimes they are right, and production continues; sometimes they are wrong, and an incident occurs. Operators are usually reluctant to believe unusual readings and rush to the conclusion that the instrument is faulty, whatever the type of control (see Section 3.3.2). Today it is usually harder than in the early days of computer control for operators to interfere with the software, override interlocks, or type in "correct" readings. However, many operators acquire keys or passwords that they should not have, in much the same way as operators have always unofficially acquired and secreted an assortment of tools and adaptors. On one plant an interlock was found to be illegally blocked: the password had been disclosed to 40 people, all of whom denied responsibility (see Section 14.5 d).

I have seen only one report of a virus in process control software and none of access by hackers. The virus was found on a Lithuanian nuclear reactor and is said to have been introduced by someone who wanted the credit for detecting and removing it. However, this does not mean virus infection or hacking will never occur, and their consequences could be much more serious than loss of accountancy data. As long as a control PES stands alone and is not connected to other systems, infection is impossible (unless a virus is present in the original software), but networking is becoming increasingly common. Computer viruses are rather like AIDS. To avoid infection, do not promiscuously share data or disks, and keep the covers on your disks in the presence of computers whose background is unknown.

20.6 NEW APPLICATIONS

Permits-to-work could be prepared and stored on a computer. The saving in effort would not be great, but additional functions are now possible.
For example:

• The computer could remind the user of any special hazards associated with this piece of equipment and its contents and the actions that should be taken.

• The computer could also remind the user of any problems encountered when the equipment was being prepared or maintained on earlier occasions.

• If a vessel is being prepared for entry, the computer could check that the number of slip-plates (blinds) to be fitted (or pipes disconnected) is the same as the number of connections shown on the drawing.

• If someone tries to take out a second permit on the same item of equipment, this would be instantly apparent, and the computer could refuse to issue it.

Suppose a fitter has to replace a gasket during a night shift. On some plants it is easy; only one sort is used, and all the fitter has to do is select the right size. On other plants many types are used. The fitter has to get out a line diagram, find the line number, and then look up the details in a bulky equipment list. It should be possible for him to view the line diagram on a computer screen, select the line with a cursor, and have details of the line displayed, including the location of spare parts and any distinguishing marks, such as the color of the gaskets. The line diagram and equipment list will have been prepared on a computer; all that is needed is a link between the design system and the maintenance system. (Of course, we should, if possible, reduce the number of types of gaskets, nuts, bolts, etc., required, even though we may use more expensive types than strictly necessary on some duties.)

Another new application under development is to give operators more information about approaching hazards.
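As a rough illustration, the permit checks and consequence-stating warnings described in this section could be sketched as follows. This is a minimal sketch, not a real permit system: all class names, equipment tags, and thresholds are invented, and a real implementation would be driven by the plant's own design and maintenance databases.

```python
class PermitSystem:
    """Sketch of computer-assisted permit-to-work checks (names invented)."""

    def __init__(self, drawing_connections):
        # drawing_connections: equipment tag -> number of connections
        # shown on the line diagram for that vessel
        self.drawing_connections = drawing_connections
        self.open_permits = {}  # equipment tag -> permit details

    def issue_entry_permit(self, tag, planned_blinds):
        """Refuse a second permit on the same equipment, and check that the
        number of slip-plates (blinds) to be fitted matches the drawing."""
        if tag in self.open_permits:
            raise ValueError(f"A permit is already open on {tag}; refusing a second one")
        expected = self.drawing_connections[tag]
        if planned_blinds != expected:
            raise ValueError(
                f"{tag}: {planned_blinds} blinds planned, but the drawing "
                f"shows {expected} connections")
        self.open_permits[tag] = {"blinds": planned_blinds}
        return f"Permit issued for entry into {tag}"


def tank_warning(oil_temp_c, has_water_layer, margin_c=10.0):
    """Warn, stating consequences rather than merely 'high temperature',
    when oil in a tank with a water layer approaches 100 deg C."""
    if has_water_layer and oil_temp_c >= 100.0 - margin_c:
        return ("WARNING: oil temperature is approaching 100 deg C and the tank "
                "contains a water layer; the water may vaporize with explosive "
                "violence and expel steam and oil through the vent "
                "(see Section 12.2)")
    return None
```

A duplicate permit on the same vessel, or a blind count that disagrees with the drawing, is refused outright rather than flagged, which matches the idea that the computer "could refuse to issue it."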
For example, if hot oil, over 100°C, is added to a storage tank containing a water layer, or the oil in the tank is heated above 100°C, the water may be vaporized with explosive violence; a mixture of steam and oil will be expelled through the tank vent and may even blow the roof off the tank (see Section 12.2). If the temperature of the incoming oil or the oil in the tank approaches 100°C, then the screen could display a warning message, not merely announcing a high temperature but reminding the operator of the consequences. The reminder message could also be displayed if the operator starts up or increases the heat supply to a tank that contains a water layer. On request the system could explain why the consequences may occur and refer the operator to a plant instruction, accident report, or other document, accessible on the screen, from which the operator could find more information. The number of possible incidents that might occur and warnings that might be given is enormous, and each plant would have to make a selection based on its own experience and that of the industry. The information would also be accessible to designers and hazop teams, though they will probably require access to the whole accident database [17].

20.7 CONCLUSIONS

If we can learn from the incidents that have occurred on process plants controlled by computers, we may be able to prevent them from happening again. Familiar errors caused the incidents that have occurred. Accidents or process upsets will occur in any plant, whatever the method of control, if we do not allow for foreseeable slips or equipment failures, if modifications are not controlled, if operators are overloaded by too much information, if information display is poor, if controllers are set incorrectly, if warnings are ignored, or if operators are not told of changes that have been made.
However, some of these errors are more likely to occur on plants controlled by computers than on conventional plants. This is because different departments may be responsible for operation of the plant and design and operation of the control system, and operating staff members may have exaggerated views of the power of the computer and a limited understanding of what it can and cannot do. One way of improving communication between chemical and software engineers would be to combine the jobs. There is a need for engineers who are equally at home in the two fields.

REFERENCES

1. I. Nimmo, S. R. Nunns, and B. W. Eddershaw, "Lessons Learned from the Failure of a Computer System Controlling a Nylon Polymer Plant," Paper presented at Safety and Reliability Society Symposium, Altrincham, UK, Nov. 1987.
2. B. W. Eddershaw, Loss Prevention Bulletin, No. 088, p. 3.
3. Chemical Safety Summary, Vol. 56, No. 221, Chemical Industries Association, London, UK, 1985, p. 6.
4. T. A. Kletz, Process Plants: A Handbook for Inherently Safer Design, 2nd edition, Taylor and Francis, Washington, D.C., 1998, Chapter 7.
5. N. G. Leveson, IEEE Software, Vol. 7, No. 6, Nov. 1990, p. 55.
6. S. M. Englund and D. J. Grinwis, Chemical Engineering Progress, Vol. 88, No. 10, Oct. 1992, p. 36.
7. A. M. Way, New Scientist, Vol. 119, Sept. 8, 1988, p. 61.
8. L. Bodsberg and O. Ingstad, "Technical and Human Implications of Automatic Safety Systems," Paper presented at Sixth International Symposium on Loss Prevention and Safety Promotion in the Process Industries, Oslo, Norway, 1989.
9. D. G. Mooney, "An Overview of the Shell Fluoroaromatics Explosion," Hazards XI: New Directions in Process Safety, Symposium Series No. 124, Institution of Chemical Engineers, Rugby, UK, 1991.
10. D. K. Lorenzo, A Manager's Guide to Reducing Human Errors, Chemical Manufacturers Association, Washington, D.C., 1990, p. 18.
11. P. Mahon, Verdict on Erebus, Collins, Auckland, New Zealand, 1984.
12. M. Shadbolt, Reader's Digest, Nov. 1984, p. 164.
13. R. E. Eberts, Chemical Engineering Progress, Vol. 82, No. 12, Dec. 1985, p. 30.
14. J. L. Lions, Ariane 5: Flight 501 Failure, European Space Agency, Paris, 1996.
15. N. G. Leveson, Safeware: System Safety and Computers, Addison-Wesley, Reading, Mass., 1996, Appendix A.
16. T. A. Kletz, et al., Computer Control and Human Error, Institution of Chemical Engineers, Rugby, UK, 1995, pp. 13 and 107.
17. M. Jefferson, P. W. H. Chung, and T. A. Kletz, "Learning the Lessons of Past Accidents," Hazards XIII: Process Safety: The Future, Symposium Series No. 141, Institution of Chemical Engineers, Rugby, UK, 1997.

ADDITIONAL READING

References 15 and 16.

Chapter 21
Inherently Safer Design

. . . all great controversies depend on both sides sharing one false premise.
-a 4th century theologian

Those who want to spend more money to make a plant safer and those who think enough has been spent share a false premise: they both assume more safety will cost more money.

Many of the incidents in this book were the result of leaks of hazardous materials, and the recommendations describe ways of preventing leaks by providing better equipment or procedures. As we have seen, equipment can fail or can be neglected, and procedures can lapse. The most effective methods, therefore, of preventing leaks of hazardous materials are to use so little that it hardly matters if it all leaks out (intensification or minimization) or to use a safer material instead (substitution). If we cannot do this and have to store or handle large amounts of hazardous material, we should store or handle it in the least hazardous form (attenuation or moderation). Plants in which this is done are said to be inherently safer because they are not dependent on added-on equipment or procedures that might fail; the hazard is avoided rather than controlled, and the safety is inherent in the design.
Because hazards are avoided, there is less need to add on protective equipment, such as interlocks, alarms, emergency isolation valves, fire insulation, water spray, etc., and the plants are therefore usually cheaper as well as safer.

The principles of inherently safer design may seem obvious, but until the explosion at Flixborough in 1974 (see Section 2.4), little thought was given to ways of reducing inventories of hazardous materials. We simply designed a plant and accepted whatever inventory was needed for that design, confident of our ability to keep it under control. Flixborough weakened our own and the public's confidence in this ability, and ten years later Bhopal almost destroyed it. The first incident described in this chapter on inherently safer design is therefore the toxic gas release at Bhopal.

My book Plant Design for Safety: A User-Friendly Approach [1] and References 12-15 describe many examples of ways in which plants can be made inherently safer. Note that we use the term inherently safer, not inherently safe, as we cannot avoid every hazard.

21.1 BHOPAL

The worst disaster in the history of the chemical industry occurred in Bhopal, in the state of Madhya Pradesh in central India, on December 3, 1984. A leak of methyl isocyanate (MIC) from a chemical plant, where it was used as an intermediate in the manufacture of the insecticide carbaryl, spread beyond the plant boundary and caused the death by poisoning of more than 2,000 people. The official figure was 2,153, but some unofficial estimates were much higher. In addition, about 200,000 people were injured. Most of the dead and injured were living in a shanty town that had grown up next to the plant.

The immediate cause of the disaster was the contamination of an MIC storage tank by several tons of water and chloroform. A runaway reaction occurred, and the temperature and pressure rose. The relief valve lifted, and MIC vapor was discharged to atmosphere.
The protective equipment, which should have prevented or minimized the release, was out of order or not in full working order: the refrigeration system that should have cooled the storage tank was shut down, the scrubbing system that should have absorbed the vapor was not immediately available, and the flare system that should have burned any vapor that got past the scrubbing system was out of use. The contamination of the MIC was probably the result of sabotage [2], but, as we shall see, the results would have been much less serious if less MIC had been stored, if a shanty town had not grown up close to the plant, and if the protective equipment had been kept in full working order.

21.1.1 "What You Don't Have Can't Leak"

The most important lesson to be learned from Bhopal was missed by most commentators: the material that leaked was not a product or raw material but an intermediate, and while it was convenient to store it, it was not essential to do so. Following Bhopal, the company concerned, Union Carbide, and other companies decided to greatly reduce their stocks of MIC and other hazardous intermediates. A year after the disaster, Union Carbide reported that stocks of hazardous intermediates had been reduced by 75% [3].

The product, carbaryl, was manufactured by reacting phosgene and methylamine to produce MIC, which was then reacted with alpha-naphthol. The same product can be made from the same raw materials by reacting them in a different order and avoiding the production of MIC. Phosgene is reacted with alpha-naphthol, and then the intermediate is reacted with methylamine.

21.1.2 Plant Location

If materials that are not there cannot leak, people who are not there cannot be killed. The death toll at Bhopal, and at Mexico City (see Section 8.1.4) and Sao Paulo (see Section 9.1.8), would have been lower if a shanty town had not been allowed to grow up near the plant.
It is, of course, much more difficult to prevent the spread of shanty towns than of permanent dwellings, but nevertheless we should try to do so by buying and fencing land if necessary (or removing the need to do so, as described above).

21.1.3 Keep Incompatible Materials Apart

The MIC storage tank was contaminated by substantial quantities of water and chloroform, up to a ton of water and 1½ tons of chloroform, and this led to a complex series of runaway reactions [4]. The precise route by which water entered the tank is unknown; several theories have been put forward, and sabotage seems the most likely [2], though whoever deliberately added the water may not have realized how serious the consequences would be. Hazard and operability studies (Section 18.7) are a powerful tool for identifying ways in which contamination and other unwanted deviations can occur, and since water was known to react violently with MIC, it should not have been allowed anywhere near it.

21.1.4 Keep Protective Equipment in Working Order, and Size It Correctly

As already stated, the refrigeration, flare, and scrubbing systems were not in full working order when the leak occurred. In addition, the high temperature and pressure on the MIC tank were at first ignored because the instruments were known to be unreliable. The high-temperature alarm did not operate, as the set-point had been raised and was too high. One of the main lessons of Bhopal is, therefore, the need to keep protective equipment in working order. Chapter 14 describes some other accidents that illustrate this theme.

It is easy to buy safety equipment. All we need is money, and if we make enough fuss we get the equipment in the end. It is much more difficult to make sure the equipment is kept in full working order when the initial enthusiasm has faded.
All procedures, including testing and maintenance procedures, are subject to a form of corrosion more rapid than that which affects the steelwork and can vanish without trace once managers lose interest. A continuous auditing effort is needed to make sure that procedures are maintained. Sometimes managers and supervisors lose interest, and unknown to them, operators stop carrying out procedures. However, shutting the flare system down for repair and taking the refrigeration system out of use were not decisions operators would make on their own. Managers must have made these decisions and thus showed a lack of understanding and/or commitment.

The refrigeration, scrubbing, and flare systems were probably not big enough to have prevented a discharge of MIC of the size that occurred, but they would have reduced the amount discharged to atmosphere. The relief valve was not big enough to handle the two-phase flow of liquid and vapor it was called upon to handle, and the tank was distorted by the rise in pressure, although it did not burst. Protective systems cannot be designed to handle every conceivable eventuality, but nevertheless Bhopal does show the need to consider a wide range of circumstances, including contamination, when highly toxic materials such as MIC are handled. It also shows the need, when sizing relief valves, to ask if two-phase flow will occur.

21.1.5 Joint Ventures

The Bhopal plant was half-owned by a U.S. company and half-owned locally. The local company was responsible for the operation of the plant as required by Indian law. In such joint ventures, it is important to be clear who is responsible for safety, in both design and operation. The technically more sophisticated partner has a special responsibility and should not go ahead unless it is sure that the operating partner has the knowledge, experience, commitment, and resources necessary for handling hazardous materials.
It cannot shrug off responsibility by saying that it is not in full control.

21.1.6 Training in Loss Prevention

Bhopal, and many of the other incidents described in this book, leads us to ask if those who designed and operated the plant received sufficient training in loss prevention, as students and from their employers. In the UK, all chemical engineering undergraduates get some training in loss prevention, but this is not the case in most other countries, including the United States. Loss prevention should be included in the training of all engineers; it should not be something added onto a plant after design, like a coat of paint, but an integral part of design. Whenever possible, hazards should be removed by a change in design, such as a reduction in inventory, rather than by adding on protective equipment. While we may never use some of the skill and knowledge we acquire as students, every engineer will have to make decisions about loss prevention, such as deciding how far to go in removing a hazard [5].

At Bhopal, there had been changes in staff and reductions in manning, and the new recruits may not have been as experienced as the original team. However, I do not think that this contributed significantly to the cause of the accident. The errors that were made, such as taking protective equipment out of commission, were basic ones that cannot be blamed on inexperience of a particular plant.

21.1.7 Public Response

Bhopal showed the need for companies to collaborate with local authorities and emergency services in drawing up plans for handling emergencies.

Inevitably, Bhopal produced a great deal of public reaction throughout the world but especially in India and the United States. There have been calls for greater control (a paper titled "A Field Day for the Legislators" [6] listed 32 U.S.
government proposals or activities and 35 international activities that had been started by the end of 1985) and attempts to show that the industry can put its own house in order (for example, the setting up of the Center for Chemical Process Safety by the American Institute of Chemical Engineers and of the Community Awareness and Response program by the Chemical Manufacturers Association).

Terrible though Bhopal was, we should beware of overreaction or suggestions that insecticides, or the whole chemical industry, are unnecessary. Insecticides, by increasing food production, have saved more lives than were lost at Bhopal. But Bhopal was not an inevitable result of insecticide manufacture. By better design and operations and by learning from experience, further Bhopals can be prevented. Accidents are not due to lack of knowledge but failure to use the knowledge we have. Perhaps this book will help spread some of that knowledge.

21.2 OTHER EXAMPLES OF INHERENTLY SAFER DESIGN

21.2.1 Intensification

The most effective way of designing inherently safer plants is by intensification, that is, using or storing smaller amounts of hazardous material so that the effects of a leak are less serious. When choosing designs for heat exchangers, distillation columns, reactors, and all other equipment, we should, whenever possible, choose designs with a small inventory or hold-up of hazardous material. References 1 and 12-15 describe some of the changes that are possible.

Intensification is easy to apply on a new plant, but its application to existing plants is limited unless we are prepared to replace existing equipment. However, as we have seen, stocks of hazardous intermediates can be reduced on existing plants. When the product of one plant is the raw material of another, stocks can be reduced by locating both plants on the same site, and this also reduces the amount of material in transit.
One company found that it could manage without 75% of its product storage tanks, though in this case the tanks, not the product, were hazardous (see Section 9.2.18).

[...] handled in plastic (or plastic-coated) tanks heated by electric immersion heaters. If the liquid level falls, exposing part of the heater, the tank wall may get so hot that it catches fire. One insurance company reported 36 such fires in two years, many of which spread to other parts of the plants. Five were due to failure of a low-level interlock. The inherently safer solution is to use a [...] addition temperature was wrong. Runaways have also occurred when operators added the wrong material to a reactor, often because different materials had similar names, were stored in similar drums, or were poorly labeled (see Chapter 4). A batch distillation column, used for distilling nitrotoluene, had not been cleaned for 30 years. A buildup of sludge caused some problems, or so it was believed, [...] the company had time to test all its chemicals. (b) As the result of a steam leak into a reactor jacket, some nitrobenzene sulfonic acid was held for 11 hours at 150°C. Decomposition occurs above 145°C, and a violent explosion expelled the reactor from the building. At the time, decomposition was believed to occur only above 200°C [3]. (c) A solution of ferric chloride in a solvent was [...] the temperature at the point of measurement, but it may be different in other parts of the bulk liquid. A reactor was provided with a quench water system; if the contents got too hot, water could be added from a hose. A power failure caused the stirrer to stop. The operator watched the temperature. As it was falling, he did nothing. After a while it started to rise; before he could connect [...]
• By making incorrect assembly impossible (for an example, see Section 9.1.3).

• By making the status of equipment clear. Thus, figure-8 plates are better than slip-plates, as the position of the former is obvious at a glance, and valves with rising spindles are better than valves in which the spindle does not rise. Ball valves are friendly if the handles cannot be replaced in the wrong [...]

[...] storage tank containing acrolein was kept cool by circulating the liquid through a water-cooled heat exchanger. Demineralized water was normally used, but the supply failed, and water from an underground borehole was used instead; it contained numerous minerals. There was a very slight leak in the heat exchanger, some water contaminated the acrolein, and the minerals catalyzed rapid polymerization [...] design, as the reduction in inventory results in a smaller and thus cheaper plant. This is in addition to the reduction in cost achieved by reducing the need for added-on protective equipment.

21.2.2 Substitution

If intensification is not possible, we should consider substitution, that is, replacing a hazardous material by a less hazardous one. For example, benzene, once widely used [...]

[...] Plant/Operations Progress, Vol. 9, No. 2, Apr. 1990, p. 131. 5. T. Kotoyori, Journal of Loss Prevention in the Process Industries, Vol. 4, No. 2, p. 120. 4. Loss Prevention Bulletin, No. 098, Apr. 1991, p. 7. 7. R. Grollier Baron, "Hazards Caused by Trace Substances," Seventh International Symposium on Loss Prevention and Safety Promotion in the Process Industries, Taormina, Italy, May 4-8, 1992. 8. [...]
[...] that occurred because ethers were kept for too long. A particularly tragic accident befell a research chemist. He tried to open a bottle of isopropyl ether by holding it against his stomach and twisting the cap. The bottle exploded, injuring him so severely that he died two hours later [12]. Nevertheless, according to a recent report from the U.S. Department of Energy [13], 21 containers of dimethyl ether more than 21 months old were found in one of its laboratories. The U.S. Department of Energy also points out that polyethylene bottles containing corrosive chemicals may deteriorate with prolonged use [14]. Other limited-life chemicals listed by Bretherick are bleaching powder ("Material which has been stored for a long time is liable to [...]