Universe: a grand tour of modern science, Part 4

Jacques Laskar of the Bureau des Longitudes in Paris was a pioneer in the study of planetary chaos. He found many fascinating effects, including the possibility that Mercury may one day collide with Venus, and he drew special attention to chaotic influences on the orientations of the planets. The giant planets are scarcely affected, but the tilt of Mars, for example, which at present is similar to the Earth's, can vary between 0 and 60 degrees. With a large tilt, summers on Mars would be much warmer than now, but the winters desperately cold. Some high-latitude gullies on that planet have been interpreted as the products of slurries of melt-water similar to those seen on Greenland in summer. 'All of the inner planets must have known a powerfully chaotic episode in the course of their history,' Laskar said. 'In the absence of the Moon, the orientation of the Earth would have been very unstable, which without doubt would have strongly frustrated the evolution of life.'

Also of relevance to the Earth's origin are Comets and asteroids and Minerals in space. For more on life-threatening events, see Chaos, Impacts, Extinctions and Flood basalts. Geophysical processes figure in Plate motions, Earthquakes and Continents and supercontinents. For surface processes and climate change, see the cross-references in Earth system.

Earthquakes

Ships that leave Tokyo Bay crammed with exports pass between two peninsulas: Izu to starboard and Boso to port. The cliffs of their headlands are terraced, like giant staircases. The flat part of each terrace is a former beach, carved by the sea when the land was lower. The vertical rise from terrace to terrace tells of an upward jerk of the land during a great earthquake. Sailors wishing for a happy return ought to cross their fingers and hope that the landmarks will be no taller when they get back.

On Boso, the first step up from sea level is about four metres, and corresponds with the uplifts in earthquakes afflicting the Tokyo region in 1703 and 1923. The interval between those two was too brief for a beach to form. The second step, five metres higher, dates from about 800 BC. Greater rises in the next two steps happened around 2100 BC and 4200 BC. The present elevations understate the rises, because of subsidence between quakes.

Only 20 kilometres offshore from Boso, three moving plates of the Earth's outer shell meet at a triple junction. The Eurasian Plate with Japan standing on it has the ocean floor of both the Pacific Plate and the Philippine Plate diving to destruction under its rim, east and west of Boso, respectively. The latter two have a quarrel of their own, with the Pacific Plate ducking under the Philippine Plate. All of which makes Japan an active zone. Friction of the descending plates creates Mount Fuji and other volcanoes. Small earthquakes are so commonplace that the Japanese may not even pause in their conversations during a jolt that sends tourists rushing for the street. And there, in a nutshell, is why the next big earthquake is unpredictable.

Too many false alarms

As a young geophysicist, Hiroo Kanamori was one of the first in Japan to embrace the theory of plate tectonics as an explanation for geological action. He was co-author of the earliest popular book on the subject, Debate about the Earth (1970). For him, the terraces of Izu and Boso were ample proof of an unstoppable process at work, such that the earthquake that devastated Tokyo and Yokohama in 1923, and killed 100,000 people, is certain to be repeated some day.
First at Tokyo University and then at Caltech, Kanamori devoted his career to fundamental research on earthquakes, especially the big ones. His special skill lay in extracting the fullest possible information about what happened in an earthquake, from the recordings of ground movements by seismometers lying in different directions from the scene. Kanamori developed the picture of a subducted tectonic plate pushing into the Earth with enormous force, becoming temporarily locked in its descent at its interface with the overriding plate, and then suddenly breaking the lock.

Looking back at the records of a big earthquake in Chile in 1960, for example, he figured out that a slab of rock 800 by 200 kilometres suddenly slipped by 21 metres, past the immediately adjacent rock. He could deduce this even though the fault line was hidden deep under the surface. That, by the way, was the largest earthquake that has been recorded since seismometers were invented. Its magnitude was 9.5.

When you hear the strength of an earthquake quoted as a figure on the Richter scale, it is really Kanamori's moment magnitude, which he introduced in 1977. He was careful to match it as closely as possible to the scale pioneered in the 1930s by Charles Richter of Caltech and others, so the old name sticks. The Kanamori scale is more directly related to the release of energy.
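Kanamori's scale can be made concrete with the Chile figures just quoted. The sketch below uses the standard Hanks–Kanamori relation between seismic moment and moment magnitude; the rock rigidity is an assumed value, not something given in the text.

```python
import math

def moment_magnitude(length_m, width_m, slip_m, rigidity_pa):
    """Moment magnitude from fault dimensions, slip and rock rigidity.

    Seismic moment M0 = rigidity * fault area * average slip (in N.m);
    Mw = (2/3) * log10(M0) - 6.07  (Hanks-Kanamori, M0 in N.m).
    """
    m0 = rigidity_pa * length_m * width_m * slip_m
    return (2.0 / 3.0) * math.log10(m0) - 6.07

# Chile 1960: an 800 km x 200 km slab slipping 21 m.
# The rigidity is an assumption; ~6e10 Pa suits deep crust and upper mantle.
mw = moment_magnitude(800e3, 200e3, 21.0, 6e10)
print(f"Mw ~ {mw:.1f}")   # ~9.5, in line with the recorded magnitude
```

Because the magnitude is two-thirds of a logarithm of the moment, each whole unit of magnitude corresponds to roughly a 32-fold step in released energy, which is the sense in which the scale tracks energy.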
Despite great scientific progress, the human toll of earthquakes continued, aggravated by population growth and urbanization. In Tangshan in China in 1976, a quarter of a million died. Earthquake prediction to save lives therefore became a major goal for the experts. The most concerted efforts were in Japan, and also in California, where the coastal strip slides north-westward on the Pacific Plate, along the San Andreas Fault and a swarm of related faults. Prediction was intended to mean not just a general declaration that a region is earthquake prone, but a practical early warning valid for the coming minutes or hours.

For quite a while, it looked as if diligence and patience might give the answers. Scatter seismometers across the land and the seabed to record even the smallest tremors. Watch for foreshocks that may precede big earthquakes. Check especially the portions of fault lines that seem to be ominously locked, without any small, stress-relieving earthquakes. The scientists pore over the seismic charts like investors trying to second-guess the stock markets. Other possible signs of an impending earthquake include electrical changes in the rocks, and motions and tilts of the ground detectable by laser beams or navigational satellites. Alterations in water levels in wells, and leaks of radon and other gases, speak of deep cracks developing. And as a last resort, you can observe animals, which supposedly have a sixth sense about earthquakes.

Despite all their hard work, the forecasters failed to give any warning of the Kobe earthquake in Japan in 1995, which caused more than 5000 deaths. That event seemed to many experts to draw a line under 30 years of effort in prediction. Kanamori regretfully pointed out that the task might be impossible. Micro-earthquakes, where the rock slippage or creep in a fault is measured in millimetres, rank at magnitude 2. They are imperceptible to people and to distant seismometers alike. And yet, Kanamori reasoned, many of them may have the potential to grow into a very big one, ranked at magnitude 7–9, with slippages of metres or tens of metres over long distances.

The outcome depends on the length of the eventual crack in the rocks. Crack prediction is a notoriously difficult problem in materials science, with the uncertainties of chaos theory coming into play. In most micro-earthquakes the rupture is halted in a short distance, so the scope for false alarms is unlimited. 'As there are 100,000 times more earthquakes of magnitude 2 than of magnitude 7, a short-term prediction is bound to be very uncertain,' Kanamori concluded in 1997. 'It might be useful where false alarms can be tolerated. However, in modern highly industrialized urban areas with complex lifelines, communication systems and financial networks, such uncertain predictions might damage local and global economies.'
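Kanamori's 100,000-fold figure follows from the Gutenberg–Richter law, under which earthquake counts fall by about a factor of ten per unit of magnitude. A minimal sketch, assuming the common b-value of 1 (the law itself is not named in the text):

```python
def gr_relative_count(m_small, m_large, b=1.0):
    """Relative number of earthquakes at two magnitudes under the
    Gutenberg-Richter law: log10 N(M) = a - b*M, so the ratio of
    counts is 10 ** (b * (m_large - m_small))."""
    return 10 ** (b * (m_large - m_small))

# ~100,000 magnitude-2 events for every magnitude-7 event
print(gr_relative_count(2, 7))  # 100000.0
```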
Earthquake control?

During the Cold War a geophysicist at UC Los Angeles, Gordon MacDonald, speculated about the use of earthquakes as a weapon. It would operate by the explosion of bombs in small faults, intended to trigger movement in a major fault. 'For example,' he explained, 'the San Andreas fault zone, passing near Los Angeles and San Francisco, is part of the great earthquake belt surrounding the Pacific. Good knowledge of the strain within this belt might permit the setting off of the San Andreas zone by timed explosions in the China Sea and the Philippines Sea.'

In 1969, soon after MacDonald wrote those words, Canada and Japan lodged protests against a US series of nuclear weapons tests at Amchitka in the Aleutian Islands, on the grounds that they might trigger a major natural earthquake. They didn't, and the question of whether a natural earthquake or an explosion, volcanic or man-made, can provoke another earthquake far away is still debated. If there is any such effect it is probably not quick, in the sense envisaged here.

MacDonald's idea nevertheless drew on his knowledge of actual man-made earthquakes that happened by accident. An underground H-bomb test in Nevada in 1968 caused many small earthquakes over a period of three weeks, along an ancient fault nearby. And there was a longer history of earthquakes associated with the creation of lakes behind high dams, in various parts of the world. Most thought-provoking was a series of small earthquakes in Denver, from 1963 to 1968, which were traced to an operation at the nearby Rocky Mountain Arsenal. Water contaminated with nerve gas was disposed of by pumping it down a borehole 3 kilometres deep. The first earthquake occurred six weeks after the pumping began, and activity more or less ceased two years after the operation ended.

Evidently human beings could switch earthquakes on or off by using water under pressure to reactivate and lubricate faults within reach of a borehole. This was confirmed by experiments in 1970–71 at an oilfield at Rangely, Colorado. They were conducted by scientists from the US National Center for Earthquake Research, where laboratory tests on dry and wet rocks under pressure showed that jerks along fractures become more frequent but much weaker in the presence of water.

From this research emerged a formal proposal to save San Francisco from its next big earthquake by stage-managing a lot of small ones. These would gently relieve the strain that had built up since 1906, when the last big one happened. About 500 boreholes 4000 metres deep, distributed along California's fault lines, would be needed. Everything was to be done in a controlled fashion, by pumping water out of two wells to lock the fault on either side of a third well where the quake-provoking water would be pumped in.

The idea was politically impossible. Since every earthquake in California would be blamed on the manipulators, whether they were really responsible or not, litigation against the government would continue for centuries. And it was all too credible that a small man-made earthquake might trigger exactly the major event that the scheme was intended to prevent. By the end of the century Kanamori's conclusion, that the growth of a small earthquake into a big one might be inherently unpredictable, carried the additional message: you'd better not pull the tiger's tail.

Outpacing the earthquake waves

Research efforts switched from prediction and prevention to mitigating the effects when an earthquake occurs. Japan leads the world in this respect, and a large part of the task is preparation, as if for a war. It begins with town planning, the design of earthquake-resistant buildings and bridges, reinforcements of hillsides against landslips, and improvements of sea defences against tsunamis—the great 'tidal waves' that often accompany earthquakes. City by city, district by district, experts calculate the risks of damage and casualties from shaking, fire, landslides and tsunamis. The entire Japanese population learns from infancy what to do in the event of an earthquake, and there are nationwide drills every 1 September, the anniversary of the 1923 Tokyo–Yokohama earthquake. Operations rooms like military bunkers stand ready to take charge of search and rescue, firefighting, traffic control and other emergency services, equipped with all the resources of information technology. Rooftops are painted with numbers, so that helicopter pilots will know where they are when streets are filled with rubble.

The challenge to earthquake scientists is now to feed real-time information about a big earthquake to societies ready and able to use it. A terrible irony in Kobe in 1995 was that the seismic networks and communications systems were themselves the first victims of the earthquake. The national government in Tokyo was unaware of the scale of the disaster until many hours after the event.

The provision for Japan's bullet trains is the epitome of what is needed. As soon as a strong quake begins to be felt in a region where they are running, the trains slow down or stop. They respond automatically to radio signals generated by a computer that processes data from seismometers near the epicentre. When tracks twist and bridges tumble, the life–death margin is reckoned in seconds. So the system's designers use the speed of light and radio waves to outpace the earthquake waves. Similar systems in use or under development, in Japan and California, alert the general public and close down power stations, supercomputers and the like. Especially valuable is the real-time warning of aftershocks, which endanger rescue and repair teams. A complication is that, in a very large earthquake, the idea of an epicentre is scarcely valid, because the great crack can run for a hundred or a thousand kilometres.
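The designers' margin can be estimated with simple arithmetic: the alert travels at essentially the speed of light, while the destructive shear waves travel at a few kilometres per second. The sketch below is illustrative only; the wave speed and processing delay are assumed values, and it presumes the simple case of a well-defined epicentre.

```python
def warning_time_s(distance_km, s_wave_kms=3.5, processing_delay_s=3.0):
    """Seconds of warning before strong S-wave shaking reaches a site,
    given an alert issued from seismometers near the epicentre.
    The radio alert is treated as instantaneous; the delay term covers
    detection and computation."""
    return distance_km / s_wave_kms - processing_delay_s

for d in (20, 50, 100, 200):
    print(f"{d:4d} km: ~{warning_time_s(d):5.1f} s of warning")
```

At 20 kilometres the margin is only a couple of seconds; at 200 kilometres it is nearly a minute, which is why such systems pay off best for hazards some distance from the epicentre.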
Squeezing out the water

There is much to learn about what happens underground at the sites of earthquakes. Simple theories about the sliding of one rock mass past another, and the radiation of shock waves, have now to take more complex processes into account. Especially enigmatic are very deep earthquakes, like one of magnitude 8 in Bolivia in 1994. It was located 600 kilometres below the surface, and Kanamori figured out that nearly all of the energy released in the event was in the form of heat rather than seismic waves. The event caused frictional melting of the rocks along the fault, which absorbed the energy.

In a way, it is surprising that deep earthquakes should occur at all, seeing that rocks are usually plastic rather than brittle under high temperatures and pressures. But the earthquakes are associated with pieces of tectonic plates that are descending at plate boundaries. Their diving is a crucial part of the process by which old oceanic basins are destroyed, while new ones grow, to operate the entire geological cycle of plate tectonics.

A possible explanation for deep earthquakes is that the descending rocks are made more rigid by changes in composition as temperatures and pressures increase. Olivine, a major constituent of the Earth, converts into serpentine by hydration if exposed to water near the surface. When carried back into the Earth on a descending tectonic plate, the serpentine could revert to olivine by having the water squeezed out of its crystals. Then it would suddenly become brittle. Although this behaviour of serpentine might explain earthquakes to a depth of 200 kilometres, dehydration of other minerals would be needed to account for others, deeper still.

A giant press at Universität Bayreuth enabled scientists from University College London to demonstrate the dehydration of serpentine under enormous pressure. In the process, they generated miniature earthquakes inside the apparatus. David Dobson commented, 'Understanding these deep earthquakes could be the key to unlocking the remaining secrets of plate tectonics.'

The changes at a glance

After an earthquake, experts traditionally tour the region to measure ground movements revealed by miniature scarps or crooked roads. Nowadays they can use satellites to do the job comprehensively, simply by comparing radar pictures obtained before and after an earthquake. The information contained within an image generated by synthetic-aperture radar is so precise that changes in relative positions by only a centimetre are detectable.

The technique was put to the test in 1999, when the Izmit earthquake occurred on Turkey's equivalent of California's San Andreas Fault. Along the North Anatolian Fault, the Anatolian Plate inches westwards relative to the Eurasian Plate, represented by the southern shoreline of the Black Sea. The quake killed 18,000 people. Europe's ERS-2 satellite had obtained a radar image of the Izmit region just a few days before the event, and within a few weeks it grabbed another. When scientists at the Delft University of Technology compared the images by an interference technique, they concluded that the northern shore of Izmit Gulf had moved at least 1.95 metres away from the satellite, compared with the southern shore of the Black Sea.

Among many other details perceptible was an ominous absence of change along the fault line west of Izmit. 'At that location there is no relative motion between the plates,' said Ramon Hanssen, who led the analysis. 'A large part of the strain is still apparent, which could indicate an increased risk for a future earthquake in the next section of the fault, which is close to the city of Istanbul.'
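The Delft measurement rests on counting interference fringes between the two radar images. A rough sketch of the arithmetic, assuming the standard repeat-pass relation and the usual quoted wavelength for ERS-2's C-band radar (neither figure is in the text):

```python
# Repeat-pass interferometry: the phase change between two passes maps to
# line-of-sight ground displacement, with one full fringe (2*pi of phase)
# corresponding to half a radar wavelength of motion along the line of sight.
WAVELENGTH_M = 0.0566  # assumed ERS-2 C-band wavelength

def fringes_for_displacement(d_los_m):
    """Number of full interference fringes produced by a given
    line-of-sight displacement."""
    return d_los_m / (WAVELENGTH_M / 2)

print(f"1 fringe = {WAVELENGTH_M / 2 * 100:.2f} cm of motion")
print(f"Izmit offset of 1.95 m ~ {fringes_for_displacement(1.95):.0f} fringes")
```

With each fringe worth under three centimetres, centimetre-scale sensitivity over a whole region comes almost for free, which is what makes the comparison of before-and-after images so telling.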
For the driving force of plate motions and the use of earthquake waves as a means of probing the Earth's interior, see Plate motions and Hotspots.

Earthshine

Among Leonardo da Vinci's many scientific intuitions that have stood the test of half a millennium is his suggestion that the Moon is lit by the Earth, as well as by the Sun. That was how he accounted for the faint glow visible from the dark portion of a crescent Moon. 'Some have believed that the Moon has some light of its own,' the artist noted in his distinctive back-to-front writing, 'but this opinion is false, for they have based it upon that glimmer visible in the middle between the horns of the new Moon.' With neat diagrams depicting relative positions of Sun, Earth and Moon, Leonardo reasoned that our planet 'performs the same office for the dark side of the Moon as the Moon when at the Full does for us'.

The Florentine polymath was wrong in one respect. Overimpressed by the glistening western sea at sunset, he thought that the earthshine falling on the Moon came mainly from sunlight returned into space by the Earth's oceans. In fact, seen from space, the oceans look quite dark. The brightest features are cloud tops, which modern air travellers know well but Leonardo did not.

If you could stand on the Moon's dark side and behold the Full Earth it would be a splendid sight, almost four times wider than the Full Moon seen from the Earth, and 50 times more luminous. The whole side of the Earth turned towards the Moon contributes to the lighting of each patch of the lunar surface, to varying degrees. Ice, snow, deserts and airborne dust appear bright. But the angles of illumination from Sun to Earth to Moon have a big effect too, so that the most important brightness is in the garland of cloud tops in the tropics. From your lunar vantage point you'd see the Earth rotating, which is a pleasure denied to Moon watchers, who only ever see one face. In the monsoon season, East and South Asia are covered with dense rain clouds. So when dawn breaks in Shanghai, and Asia swings out of darkness and into the sunlight, the earthshine can increase by as much as ten per cent from one hour to the next.

In the 21st century a network of stations in California, China, the Crimea and Tenerife is to measure Leonardo's earthshine routinely, as a way of monitoring climate change on our planet. Astronomers can detect small variations in the Earth's brightness. These relate directly to warming or cooling, because the 30 per cent or so of sunlight that the Earth reflects can play no part in keeping the planet warm. The rejected fraction is called the albedo, and the brighter the Earth is, the cooler it must be. What's more, the variations seen on the dark side of the Moon occur mainly because of changes in the Earth's cloudiness.

Weather satellites observe the clouds, region by region, and with due diligence NASA scientists combine data from all the world's satellites to build up global maps of cloudiness, month by month. But if you are interested in the total cloud cover, it is easier, cheaper and more reliably consistent just to look at the Moon. And you are then well on the way to testing theories about why the cloud cover changes.
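The claim that the brighter the Earth is, the cooler it must be is plain energy bookkeeping: reflected sunlight never enters the heat budget. A worked sketch with the textbook equilibrium-temperature formula; the solar-constant value is an assumption, not from the text.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant at Earth, W m^-2 (assumed value)

def equilibrium_temp_k(albedo):
    """Effective radiating temperature of a planet that absorbs
    S0 * (1 - albedo) / 4 per unit of surface area and re-emits
    that power as a black body."""
    return (S0 * (1 - albedo) / (4 * SIGMA)) ** 0.25

print(f"albedo 0.30 -> {equilibrium_temp_k(0.30):.0f} K")  # ~255 K
print(f"albedo 0.31 -> {equilibrium_temp_k(0.31):.0f} K")  # ~1 K cooler
```

Even a one-point rise in albedo trims roughly a degree from the effective temperature, which is why small, sustained changes in cloudiness are worth monitoring.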
The 'awfully clever' Frenchmen

The pioneer of earthshine measurements, beginning in 1925, was André-Louis Danjon of the Observatoire de Strasbourg, who later became director of the Observatoire de Paris. Danjon used a prism to put two simultaneous images of the Moon side by side. With a diaphragm like a camera stop he then made one image fainter until a selected patch on its sunlit part looked no brighter than a selected earthlit patch. By the stoppage needed, he could tell that the earthshine's intensity was only one five-thousandth of the sunshine's.

Danjon found that the intensity varied a lot, from hour to hour, season to season, and year to year. His student J. E. Dubois used the technique in systematic observations from 1940 to 1960 and came to suspect that changes in the intensity of earthshine were linked to the activity of the Sun in its 11-year sunspot cycle, with the strongest earthshine when the sunspots were fewest. But the measurements were not quite accurate enough for any firm conclusions to be drawn, in that respect.

'You realize that those old guys were awfully clever,' said Steven Koonin of Caltech in 1994. 'They didn't have the technology but they invented ways of getting around without it.' Koonin was a nuclear physicist who became concerned about global warming and saw in earthshine a way of using the Moon as a mirror in the sky, for tracking climate change. He re-examined the French theories for interpreting earthshine results, and improved on them. A reconstruction of Danjon's instrument had been made at the University of Arizona, but Koonin wanted modern electronic light detectors to do the job thoroughly. He persuaded astronomers at Caltech's Big Bear Solar Observatory to begin measurements of earthshine in 1993.

And by 2000 Philip Goode from Big Bear was able to report to a meeting on climate in Tenerife that the earthshine in that year was about two per cent fainter than it had been in 1994–95. As nothing else had changed, to affect the Earth's brightness so much, there must have been an overall reduction of the cloud cover. Goode noted two possible explanations. One had to do with the cycle of El Niño, affecting sea temperatures in the eastern Pacific, which were at a minimum in 1994 and a maximum in 1998. Conceivably that affected the cloud cover. The other explanation echoed Dubois in noting a possible link with the Sun's behaviour. Its activity was close to a minimum in 1994–95, as judged by the sunspot counts, and at maximum in 2000. Referring to a Danish idea, Goode declared: 'Our result is consistent with the hypothesis, based on cloud cover data, that the Earth's reflectance decreases with increasing solar activity.'
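Danjon's one-in-five-thousand ratio, and Goode's two per cent dip, translate neatly into the astronomers' magnitude scale. A minimal sketch of that conversion; the formula is standard, and the numbers come from the passage above.

```python
import math

def delta_magnitude(intensity_ratio):
    """Astronomical magnitude difference corresponding to a given
    intensity ratio: delta_m = 2.5 * log10(ratio)."""
    return 2.5 * math.log10(intensity_ratio)

# Danjon's result: the earthlit Moon is ~1/5000 as bright as the sunlit Moon.
print(f"{delta_magnitude(5000):.1f} mag fainter")      # ~9.2 magnitudes

# Goode's 2000 report: earthshine ~2 per cent fainter than in 1994-95.
print(f"{delta_magnitude(1 / 0.98):.3f} mag change")   # ~0.022 mag
```

Detecting a two-hundredth-of-a-magnitude drift in a target nine magnitudes fainter than its surroundings is exactly the kind of job that needed Koonin's modern electronic detectors.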
Clouds and cosmic rays

Two centuries earlier, when reading Adam Smith's The Wealth of Nations, the celebrated astronomer William Herschel of Slough noticed that dates given for high prices of wheat in England were also times when he knew there was a lack of dark sunspots on the Sun's bright face. 'It seems probable,' Herschel wrote in 1801, 'that some temporary scarcity or defect of vegetation has generally taken place, when the Sun has been without those appearances which we surmise to be symptoms of a copious emission of light and heat.' Thereafter solar variations always seemed to be a likely explanation of persistent cooling or warming of the Earth, from decade to decade and century to century, as seen throughout climate history since the end of the last ice age. They still are.

There was, though, a lapse of a few years in the early 1990s, when there was no satisfactory explanation for how the Sun could exert a significant effect on climate. The usual assumption, following Herschel, was that changes in the average intensity of the Sun's radiation would be responsible. Accurate gauging of sunshine became possible only with instruments on satellites, but by 1990 the space measurements covered a whole solar cycle, from spottiest to least spotty to spottiest again. Herschel was right in thinking that the spotty, active Sun was brightest, but the measured variations in solar radiation seemed far too small to account for important climatic changes.

During the 1990s other ways were suggested, whereby the Sun may exert a stronger influence on climate. One of them involved a direct effect on cloudiness, and therefore on earthshine. This was the Danish hypothesis to which Goode of Big Bear alluded. It arose before there was any accurate series of earthshine measurements; however, the compilations of global cloud cover from satellite observations spanned enough years for the variations in global cloudiness to be compared with possible causes.

Henrik Svensmark, a physicist at the Danmarks Meteorologiske Institut in Copenhagen, shared the general puzzlement about the solar effect on climate. Despite much historical evidence for it, there was no clear mechanism. He knew that the Sun's activity during a sunspot cycle affects the influx of cosmic rays. These energetic atomic particles rain down on the Earth from exploded stars in the Galaxy. They are fewest when the Sun is in an active state, with many sunspots and a strong solar wind that blows away many of the cosmic rays by its magnetic effect. Cosmic rays make telltale radioactive materials in the Earth's [...]

[...] constellation, and it first showed up as an exceptionally loud source of radio waves. NASA's Compton gamma-ray observatory, launched in 1991, carried a German instrument that charted evidence of element-making all around the sky. For this purpose, the German astronomers selected the characteristic gamma rays coming from aluminium-26, a radioactive element that decays away, losing half of all its nuclei in a [...]

[...] astronomers at the sites of the 1054, 1572 and 1604 supernovae. By a modern interpretation, Tycho's and Kepler's Stars may have been supernovae of Type Ia. That means a small, dense white dwarf star, the corpse of a defunct normal star, sucking in gas from a companion star until it becomes just hot enough to burn carbon atoms in a stupendous nuclear explosion, making radioactive nickel that soon decays into stable [...]

[...] younger than the well-known Vela object. With the Compton satellite, his colleagues recorded gamma-ray emissions from Aschenbach's object, due to radioactive titanium-44 made in the stellar explosion. As half of any titanium-44 disappears by decay every 90 years, this confirmed that the remnant was young. Estimated to date from around AD 1300, and to be only about 650 light-years away, it was the [...]

[...] system—see Volcanic explosions. Another source of headaches is the non-stop discovery of new linkages. An early example in the era of Earth system science was the role of marine algae as a source of sulphate grains in the air—see Global enzymes. Later came a suggestion that cosmic rays from the Galaxy are somehow involved in cloud formation—see Earthshine. The huge effort in the global-change programmes in [...]

[...] years that followed. Astronomers had the time of their lives. They had routinely watched supernovae in distant galaxies, but Supernova 1987A was the first occurring at fairly close range that could be examined with the panoply of modern telescopes. The multinational International Ultraviolet Explorer was the quickest satellite on the case and its data revealed exactly which star blew up. Sanduleak −69° 202 was [...]
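One number in the fragments above can be checked directly: the dating of Aschenbach's object by its surviving titanium-44. A small sketch using the 90-year half-life quoted in the fragment; the observation date is an assumption.

```python
def fraction_remaining(age_years, half_life_years=90.0):
    """Fraction of a radioactive isotope surviving after a given age:
    N/N0 = 0.5 ** (age / half_life)."""
    return 0.5 ** (age_years / half_life_years)

# Aschenbach's object: if it dates from around AD 1300, it was roughly
# 700 years old when observed in the 1990s.
print(f"titanium-44 left after ~700 years: {fraction_remaining(700):.2%}")

# A remnant as old as the 1054 guest star would retain far less:
print(f"titanium-44 left after ~940 years: {fraction_remaining(940):.4%}")
```

A detectable titanium-44 signal therefore really does pin the remnant down as young: a few more centuries of decay and the gamma rays would fade below notice.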
[...] as physicists call it—required a neutral Z as well as the Ws of positive and negative charge. There was a big snag. Although the W and Z particles of the weak force were supposedly similar to the photons of the electric force, they operated only over a very short range, on a scale smaller than an atomic nucleus. A principle of quantum theory relates a particle's sphere of influence inversely to its mass, [...]

[...] court related that the keeper of the calendar prostrated himself before the emperor and announced, 'I have observed the appearance of a guest star.' In the summer of 1054, the bright star that we would call Aldebaran in the Taurus constellation had acquired a much brighter neighbour. The guest was visible even in daylight for three weeks, and at night for more than a year. Half a millennium later in southern [...]

[...] from an unusual nuclear cataclysm in the sky. A few hours later, at Las Campanas in Chile, Ian Shelton from Toronto was observing the Large Magellanic Cloud, the closest galaxy to our own, and he saw a bright new speck of light. It was the first such event since Kepler's that was visible to the naked eye. Austerely designated as Supernova 1987A, it peaked in brightness 80 days after discovery and then faded [...]

[...] virtual photons. At first this idea seemed crazy, or at best unmanageable, because the calculations gave electrons an infinite mass and infinite charge. Nevertheless, in 1947 slight discrepancies in the wavelengths of light emitted by hydrogen atoms established the reality of the cloud of virtual photons. Sin-Itiro Tomonaga in Tokyo, Julian Schwinger at Harvard and Richard Feynman at Cornell then tamed [...]
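The 'principle of quantum theory' in the fragment above about the W and Z particles is the usual Compton-wavelength estimate: a force carrier of mass m reaches only about ħ/mc. A minimal sketch; the particle masses are assumed values, not from the text.

```python
# A force carried by a boson of mass m has a range of roughly the reduced
# Compton wavelength, range ~ hbar / (m * c): heavier carriers, shorter reach.
# Only the inverse relation itself comes from the text; the masses below
# are standard values quoted for illustration.
HBAR_C_MEV_FM = 197.327  # hbar * c in MeV * femtometres

def range_fm(mass_mev):
    """Approximate range, in femtometres, of a force carried by a
    boson of the given mass-energy."""
    return HBAR_C_MEV_FM / mass_mev

print(f"pion (~140 MeV):      {range_fm(140):.2f} fm (nuclear-force scale)")
print(f"W boson (~80,400 MeV): {range_fm(80400):.5f} fm (far inside a nucleus)")
```

A nucleus is a few femtometres across, so a carrier as heavy as the W confines the weak force to a region thousands of times smaller still, which was exactly the snag the fragment describes.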
