Are Volcanoes Slowing Global Warming?

In 2013, the world pumped 36 billion metric tons (40 billion US tons) of CO2 into the atmosphere through the burning of fossil fuels. Such emissions form a carbon dioxide blanket that lets sunlight in but prevents much of the surface heat from radiating back into space. As a result, the oceans are rapidly warming, Arctic sea ice is diminishing, and glaciers and the Greenland Ice Sheet are melting at a record rate.

The world’s surface temperature has risen steadily for the past 150 years, and it was assumed the curve would keep climbing at the same rate. But unexpectedly, global surface temperature peaked at what was then its highest recorded level in 1998, then flattened out and has remained about the same since, raising questions in the scientific community.

A Lawrence Livermore National Laboratory study that appeared in the Feb. 23, 2014, issue of the journal Nature Geoscience suggests one reason for this unexpected development: a higher than normal rate of volcanic activity over the past 15 years. Eruptions during that period included 17 ranked VEI 4 on the Volcanic Explosivity Index. A VEI 4 eruption is termed cataclysmic and sends an ash plume 10 to 25km (6 to 15 mi) into the air, high enough to penetrate the stratosphere, where its sulfur dioxide aerosols remain for months, even years.

“In the last decade the amount of volcanic aerosol in the stratosphere has increased, so more sunlight is being reflected back into space,” said Lawrence Livermore climate scientist Benjamin Santer, lead author of the study. “This has created a natural cooling of the planet and has partly offset the increase in surface and atmospheric temperatures due to human influence.” The paper reports significant correlations between volcanic aerosol observations, satellite-based estimates of lower-atmosphere temperatures, and the amount of sunlight reflected back into space by the aerosol particles.

Santer’s conclusions are supported by an earlier University of Saskatchewan study, in which researchers found that sulfur dioxide aerosols from a small African eruption had “hitchhiked” their way into the stratosphere. Warm air rising from the seasonal Asian monsoon lifted the volcano’s aerosols from the lower atmosphere into the stratosphere, where they were detected by OSIRIS, a Canadian Space Agency instrument aboard the Odin satellite that is specifically designed to measure atmospheric aerosols. Even though it came from a small eruption, the aerosol load was the largest OSIRIS had recorded in its 10 years of operation.

The Lawrence Livermore paper suggests that one other possible contributor to the temporary cooling effect is the unusually long and low minimum in the solar cycle. Don’t be surprised to see surface temperatures start climbing again when volcanic activity subsides and the cooler phase of the solar cycle concludes.

Why Chile Has So Many Earthquakes & Tsunamis

The Magnitude 8.2 quake that struck off the coast of Chile on April 1, 2014, was the latest in a series of major earthquakes and tsunamis to hit that area in recent years. The undersea quake and the resulting 7 ft. (2.1m) tsunami killed 7 people, toppled buildings, and severely damaged the Chilean fishing fleet. Earthquake/tsunami events in 2010 (M8.8), 2007 (M7.7), 2005 (M7.8), and 2001 (M8.4) killed more than 1,000 and inflicted billions of dollars in property damage.

The most powerful earthquake ever recorded, a Magnitude 9.5, hit the coast of Chile on May 22, 1960. The monster quake triggered an 82 ft (25m) tsunami that not only battered the west coast of South America, but rolled across the Pacific Basin, devastating Hilo, Hawaii, and damaging coastal villages as far away as Japan and the Philippines. Some sources estimate 6,000 dead and $800 million in property loss ($6 billion in 2014 dollars).

Why does this area of planet Earth spawn so many high-magnitude earthquakes and punishing tsunamis?

One explanation is that the collision of the two tectonic plates that meet off the South American west coast occurs, in geologic terms, at a very high rate of speed. The oceanic Nazca Plate and the continental South American Plate converge at the Peru-Chile Trench, which lies about 100 mi (160km) off the coast. The overriding South American Plate moves westward at 10cm a year, while the subducting Nazca Plate pushes east at 16cm/y, a closing velocity of 26cm/y (about 10 in.), one of the fastest convergence rates at any plate boundary. The Africa Plate, for example, moves at roughly one-seventh that speed.

This high closing velocity builds up strain on the fault much faster than it does where slower-moving plates converge. Every few years, tension on the Peru-Chile fault builds to the breaking point. In this latest earthquake on April 1, a 100 mi. (160km) section of the fault ruptured, allowing the Nazca Plate to ram under the South American Plate. This sudden violent slip 12.5 mi (20.1km) below the ocean floor produced the 8.2 earthquake and its tsunami, and at the same time wedged the South American Plate higher. Uplift from repeated fault failures continues to build the Andes into one of the highest mountain ranges in the world. During the 1960 M9.5 quake, some coastal areas were uplifted as much as 10 ft. (3m).
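
To put the numbers above in perspective, here is a back-of-envelope calculation (ours, not from any cited study) of how much slip a fully locked fault would store at the quoted closing velocity:

```python
# Illustrative slip-deficit estimate, using the convergence rate quoted above
# and assuming the fault stays fully locked between ruptures.

CLOSING_VELOCITY_CM_PER_YR = 26  # Nazca vs. South American Plate, per the text

def accumulated_slip_m(years_locked: float) -> float:
    """Meters of slip stored on a fully locked fault after the given time."""
    return CLOSING_VELOCITY_CM_PER_YR * years_locked / 100.0

for years in (10, 25, 50):
    print(f"{years:>3} years locked -> ~{accumulated_slip_m(years):.1f} m of stored slip")
```

Even a decade or two of locking stores several meters of potential slip, which is one way to see why large quakes recur so frequently along this margin.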

As long as the two tectonic plates that meet off the South American coast move geologically at such high speed, major earthquakes and tsunamis will keep happening. We hope the zoning laws and building codes put in place by the governments of Chile and Peru will keep the damage and loss of life to a minimum.  

Why Did the Hill Come Down?

As of this writing, 21 people have been confirmed dead and 30 are missing in the disastrous March 22, 2014, Oso, Washington mudslide. We send our condolences to all those affected by this terrible tragedy.

At the same time, we have to ask ourselves why a forest-covered mountainside would suddenly shear off and bury an entire community of 30 homes under a 1 square mile (2.6km²) mud and debris slide 40 ft (12m) deep.

Two main reasons have been given. One is that the hill had become saturated after weeks of heavy rainfall. The rainfall in that area during the month of March was 200% of normal. Although the soil there is compacted clay that tends to be impermeable, it is believed there were cracks at the top that allowed the rain to penetrate. The other reason for the failure is that the swollen Stillaguamish River at the base was undercutting the toe of the hill. With the base of the hill weakened and the slope heavy with soaked-in rain, the hill collapsed.

After a number of landslides had been reported in that area during the prior 40 years, the US Army Corps of Engineers did a survey there in 1999 and issued a report warning of “the potential for catastrophic failure.” In 2006, a section of that same hill collapsed and blocked the course of the river. Other state and local agencies had examined the hill at various times and all concluded it was unstable. Whether the permit-issuing authorities were aware of those findings is not known. What is known is that building permits for that location continued to be issued, even after the 2006 slide.

The most recent compilation of worldwide landslide statistics was posted by the American Geophysical Union for the year 2010. In that year, 6,211 people died in 494 landslide events worldwide, and 83,275 landslide deaths were reported for the period September 2002 to December 2010, an average of a little more than 10,000 a year. People living in the mountains of China, India, Central America, the Philippines, Taiwan, and Brazil were the most vulnerable during that period. Landslides and mudslides often occur when intense rainfall from tropical storms and monsoons saturates hillsides that have been compromised by logging, farming, and construction. Though less dramatic than earthquakes and tsunamis, landslides may be the most costly of all natural disasters in loss of life and property.

In the United States, landslide fatalities average between 25 and 50 a year, according to the Centers for Disease Control and Prevention. Using airborne Lidar, a laser-based mapping system, it is now possible to build a national data bank of areas throughout the US that are susceptible to hillside failure, but doing so would be a long and very costly project. Until such a survey is done, local jurisdictions will have to rely on other methods to identify landslide-prone areas. Even knowing the possible dangers, people will still build homes below unstable hillsides, in fire zones, and on flood plains. It is up to local zoning authorities to prohibit building in these hazardous places.

Offshore Wind Farms

Constant winds in coastal waters make offshore wind farms highly productive. Most offshore wind turbines are installed on pilings in shallow waters within a few miles of the shoreline, but there are some on floating platforms farther offshore.

The United Kingdom’s 20 offshore wind farms supplied 10% of that nation’s total electrical power production in January, 2014, and 11% in February. Britain is the world leader in both the number of wind farms located in coastal waters and the total amount of energy they produce. Germany, the Netherlands, Denmark, Belgium, and Sweden are close behind with another 58 offshore wind farms, and dozens more are under construction or in the planning stage. Offshore wind farms are projected to produce 4% of total European power by 2020, and 15% by 2030.

The US leads the world in the amount of energy produced by wind turbines: 120 billion kilowatt hours in 2013, representing more than 4% of US electricity production. However, all US wind farms are currently land based; the country has no offshore wind farms. Plans are on the drawing board and permits have been granted for offshore wind farms in Massachusetts, New Jersey, Rhode Island, and Oregon, but so far no construction work has started. The reasons given are reluctance to increase costs to ratepayers and NIMBY (not in my backyard) campaigns by homeowners and environmental groups.

The US Atlantic and Gulf coasts provide more suitable sites for offshore installations than the Pacific Coast because of the longer, shallower slope out to the edge of the continental shelf. In some areas on the Atlantic coast, shallow waters extend out as far as 200km (125 mi). The continental shelf drop-off to deep water on the Pacific coast is steeper and more abrupt, and less suitable for shallow-water farms. A Seattle company has obtained a lease from the Dept. of the Interior for 15 square miles of federal waters off Coos Bay, Oregon, for a wind farm on floating platforms anchored by cable to the ocean floor.

Could a massive offshore wind farm project also serve as a buffer against hurricanes and storm surges? Yes, according to a study by Mark Jacobson, professor of civil and environmental engineering at Stanford, and two co-authors, published in the journal Nature Climate Change. The researchers used computer simulations of Hurricanes Katrina, Sandy, and Isaac to determine the effect of massive offshore wind farms on wind speed and storm surge. In the case of Katrina, they found that an array of 78,000 turbines in coastal waters would have reduced wind speed at landfall by 65% to 78%, and storm surge by 79%. Similar results were obtained for Sandy and Isaac. It is not likely that 78,000 turbines will ever be installed offshore in one farm, but if they had been, and if the researchers’ conclusions are correct, they would have brought Katrina’s wind speed down from 125 mph to 28 to 44 mph, saving thousands of lives and $100 billion in Gulf Coast reconstruction. Also, that many turbines would represent hundreds of thousands of megawatts of clean generating capacity. It’s something to think about.
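
As a quick sanity check on those landfall figures, the percentages and the 125 mph starting speed quoted above can be combined directly (our arithmetic, not the study’s code):

```python
# Applying the study's simulated 65%-78% wind-speed reduction to Katrina's
# 125 mph landfall winds, as quoted above.

KATRINA_LANDFALL_MPH = 125

for reduction in (0.65, 0.78):
    remaining = KATRINA_LANDFALL_MPH * (1 - reduction)
    print(f"{reduction:.0%} reduction -> ~{remaining:.0f} mph at landfall")
# -> ~44 mph and ~28 mph, matching the 28 to 44 mph range quoted above.
```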

Sun, Wind, & Fresh Water

Converting ocean water into fresh water is energy intensive, and therefore expensive. Saudi Arabia is a desert kingdom with plenty of oil but very little fresh water. The Saudis burn 1 million barrels of oil a day to produce 60% (4 billion cubic meters) of the kingdom’s total fresh water supply through desalination. If exported onto the world market, those 1 million barrels of oil would bring Saudi Arabia $115 million a day, but it is worth it to them to forgo the profits and have the fresh water. From an environmental standpoint, burning 1 million barrels of oil a day sends close to a half million tons of CO2 emissions into the atmosphere every day, contributing greatly to the pace of global warming.
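
Those figures are easy to verify in rough terms. The sketch below uses the article’s 1 million barrels a day, the implied price of $115 a barrel, and a commonly used combustion factor of roughly 0.43 metric tons of CO2 per barrel (the factor is our assumption, not from the article):

```python
# Rough check of the oil-for-water arithmetic above.

BARRELS_PER_DAY = 1_000_000
PRICE_USD_PER_BARREL = 115   # implied by the $115 million/day figure above
CO2_T_PER_BARREL = 0.43      # assumed combustion factor for a barrel of crude

revenue = BARRELS_PER_DAY * PRICE_USD_PER_BARREL
co2 = BARRELS_PER_DAY * CO2_T_PER_BARREL
print(f"Forgone revenue: ${revenue / 1e6:.0f} million per day")
print(f"CO2 emitted:     ~{co2:,.0f} metric tons per day")
# -> $115 million and ~430,000 tons, i.e. "close to a half million tons".
```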

To deal with these problems, the Saudis have joined with IBM to build a series of solar-powered desalination plants that could by mid-century produce a large share of the kingdom’s water needs.

However, the largest solar-powered desalination plant yet designed will be built in the United Arab Emirates. The Ras Al Khaimah plant, scheduled to start production in 2015, will produce 100,000 cubic meters (approx. 26 million gallons) of fresh water a day and will also generate 20 megawatts of electrical power. The developers estimate they will be able to deliver water at a cost of $0.75 per cubic meter; the average cost per cubic meter of water delivered to households in the United States runs between $0.35 and $0.40. Most solar-powered desalination plants are situated in the Middle East, where there is an abundance of year-round sun and a scarcity of water.

The largest desalination plant run by wind power is near Perth in Western Australia. The Kwinana Desalination Plant produces 144,000 cubic meters of water a day (approx. 38 million gallons), about 17% of Perth’s water supply. The Kwinana plant is powered by the 80 Megawatt Emu Downs wind farm located 200 miles away. Because electrical power has to be supplied evenly 24/7, and because the wind stops blowing from time to time, the power from the wind farm goes into the grid on a trade-off basis: the farm contributes 270 Gigawatt hours a year to the grid, more than offsetting the 180 Gigawatt hours a year required to operate the desalination plant. A number of smaller desalination plants run on wind-generated power delivered directly from wind farm to plant, but Perth has opted for the offset arrangement.
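
The trade-off arithmetic is worth spelling out. From the figures above one can also infer the wind farm’s capacity factor, the fraction of its theoretical maximum output it actually delivers (the capacity-factor framing is ours):

```python
# Emu Downs / Kwinana offset arithmetic, from the figures quoted above.

FARM_MW = 80
FARM_OUTPUT_GWH_PER_YR = 270
PLANT_DEMAND_GWH_PER_YR = 180

theoretical_max_gwh = FARM_MW * 8760 / 1000   # output if the wind never stopped
capacity_factor = FARM_OUTPUT_GWH_PER_YR / theoretical_max_gwh
surplus = FARM_OUTPUT_GWH_PER_YR - PLANT_DEMAND_GWH_PER_YR

print(f"Capacity factor: {capacity_factor:.0%}")   # ~39%, healthy for wind
print(f"Surplus to grid: {surplus} GWh per year")  # 90 GWh/yr beyond plant needs
```

The 90 GWh annual surplus is why the offset arrangement "more than" covers the plant’s demand.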

Most desalination plants are still operated with grid power generated by coal, oil, or natural gas because it is less expensive than spending hundreds of millions to construct solar arrays or wind farms. For example, Australia’s other desalination plants providing fresh water to Sydney, Melbourne, Adelaide, and other coastal areas use fossil fuel power from the grid. But more and more, new desalination plants around the world are being planned to operate on alternative power. At some point in the future, all our electricity will have to come from those sources.

Crazy Weather & Global Warming

In the first 6 weeks of 2014, the world saw some of the most severe weather in hundreds of years, including record snowfall in the Midwest and Great Lakes, record cold in the US northeast, ice storms in the southeast, record drought in the southwest, record flooding and windstorms in the UK, unseasonal warmth in Scandinavia and Russia, record snowfall in the southern Alps, record flooding in Italy, and record heatwaves and wildfires in Australia, Argentina, and Brazil.

Despite the record snow, ice, and freezing temperatures in some areas, the world continued its long-term warming trend. NOAA reported that 2013 tied with 2003 as the fourth warmest year on record. What’s going on?

According to a paper presented this month at a meeting of the American Assn. for the Advancement of Science in Chicago, a weakening jet stream caused by Arctic warming is a possible cause. The polar jet stream is a high-altitude air current with wind speeds of 100 to 120 mph (160 to 200kph) that acts as a weather conveyor belt. When Arctic temperatures stay cold, the jet stream blows stronger and tends to stay in place, bringing normal winter weather to North America, Europe, and Asia.

In January, 2014, the air temperature over the Arctic Ocean was 2 to 4˚C (4 to 7˚F) higher than average, and 7 to 8˚C (13 to 14˚F) higher than average over Greenland and Alaska. As the Arctic warms, the jet stream weakens and begins dipping south of its normal polar route. At the same time, Arctic sea ice is melting at a record rate, exposing more ocean to the rays of the sun. The warmer ocean water in turn accelerates Arctic warming, and more rapid evaporation pumps extra moisture into the atmosphere.

The dipping jet stream carries cold, moisture-laden Arctic air south into the Midwest and southeast, and across the Atlantic to Europe. While southern Europe is experiencing record rains and snowfall, northern Europe, normally very cold in January and February, is basking in abnormally warm temperatures. With glaciers and polar ice caps melting at a record rate, sea ice contracting, and oceans warming, it seems clear that global warming is here and is, to some extent, driving the world’s current radical weather patterns. The weather will become more radical and storms more intense as the earth gets warmer.

But what is driving global warming? The UN’s Intergovernmental Panel on Climate Change (IPCC) has concluded from all available scientific evidence that it is 95% likely that most of the rise in global temperature since the middle of the 20th Century is due to emissions of greenhouse gases, deforestation, and other human activities.

If greenhouse gas emissions continue at their present rate, IPCC computer models predict our planet will warm 5˚C (9˚F) by 2100, and by 10˚C (18˚F) during the following century. The earth is now warmer than it has been since the end of the last ice age 11,300 years ago. If we don’t drastically reduce our carbon-based emissions and start relying more on alternative fuels, are we headed for another ice age? Or another age hot enough for dinosaurs?

Natural Disasters 2013 Review

According to figures released by the German reinsurer Munich Re, twice as many people died in natural disasters in 2013 as in the prior year, but property damage and insurance losses were significantly lower.

Munich Re reports 880 natural disaster events in 2013, costing $125 billion in total losses, compared to $173 billion in 2012, with insured losses of $31 billion, about half the insured costs of the year before. However, more than 20,000 people died in natural disasters in 2013, twice the number of deaths reported for 2012. Here are some of the most costly natural disasters of 2013, in lives or property losses.

Earthquakes: Magnitude 7.0 to 7.7 quakes struck China in April, Pakistan in September, and the island of Bohol in the Philippines in October, killing 1,300 and destroying tens of thousands of homes. Damage amounts were not available.

Tornadoes: On May 20, an EF-5 tornado with a wind speed of 210 mph (340 km/h) ripped through the town of Moore, Oklahoma. The tornado, 1.3 miles (2km) wide, stayed on the ground for 40 minutes along a 17-mile (27km) path of destruction. 1,150 homes were wiped out and 24 people died, including 7 children in a local school. Total damage was more than $2 billion.

Floods: Flooding in India, Central Europe, Canada, Mexico, and Colorado resulted in a combined death toll of 7,000 and damages exceeding $30 billion. The European flooding was called the worst since the Middle Ages. Most of the deaths occurred in flash floods and landslides in the mountains of northern India and Nepal.

Meteor strike: A 13,000 ton meteor traveling at 60 times the speed of sound streaked into earth’s atmosphere on Feb. 15 and exploded in a fireball over the Chelyabinsk region of Russia, in the Ural Mountains. The shock wave damaged 7,200 buildings and injured 1,500 people, mainly with flying glass from blown-out windows. Fortunately, there were no reported deaths.
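
The energy involved can be estimated from the figures above alone (an illustration of ours, using a standard TNT conversion constant; published estimates for the airburst are of the same order):

```python
# Kinetic energy implied by the mass and speed quoted above.

MASS_KG = 13_000 * 1_000        # 13,000 metric tons
SPEED_M_PER_S = 60 * 340        # ~60x the speed of sound, at ~340 m/s per Mach
J_PER_KILOTON_TNT = 4.184e12    # standard conversion constant

energy_j = 0.5 * MASS_KG * SPEED_M_PER_S ** 2
print(f"~{energy_j:.1e} J, or ~{energy_j / J_PER_KILOTON_TNT:.0f} kilotons of TNT")
# -> roughly 650 kilotons, dozens of times the yield of the Hiroshima bomb.
```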

Wildfires: Brush fires in Australia and California scorched hundreds of thousands of acres. In October, Australian firefighters fought 66 brush fires along a line that stretched for 1,000 miles (1,600km). In California’s Sierra Nevada, the Rim Fire that started in August was not put out until mid-October, after burning 257,000 acres of heavily forested watershed.

Typhoons: Super Typhoon Haiyan struck the Philippine island of Leyte on November 8 with a wind speed of 195 mph (315 km/h), the strongest ever recorded for a tropical cyclone making landfall. A 20-ft (6m) storm surge wiped out the city of Tacloban. More than 6,000 people lost their lives in the storm. Total cost has been estimated at up to $15 billion.

While the Pacific typhoon season was quite active, with 31 tropical storms, of which 13 were typhoons and 5 were super typhoons, the Atlantic hurricane season was much quieter than expected, with no major storms. The first few weeks of 2014 have also been relatively quiet, with the exception of the Mt. Sinabung volcano eruptions in Indonesia, in which 14 people have died and 20,000 have been evacuated. Inevitably, there will be more natural disasters in the months ahead. We will have to wait and see what the rest of 2014 will bring.

When Volcanoes Endanger Aircraft

According to a report issued by the U.S. Geological Survey, there were 94 confirmed ash-cloud encounters by aircraft between 1953 and 2009. Of those, 79 produced some degree of engine or airframe damage, 26 involved significant to very severe damage, and 9 caused engine shutdown during flight.

Two of the best known incidents involved passenger jets flown by KLM and British Airways. On June 24, 1982, British Airways Flight 9, flying at 37,000 ft. (11,000m) from London to Auckland, New Zealand, with 248 passengers and a crew of 15, entered an ash cloud rising from the erupting Mt. Galunggung volcano in Indonesia. All 4 engines flamed out as silica in the volcanic ash melted inside them and coated everything with glass. The plane dropped 23,500 ft. (7,200m) before the crew was able to restart 3 of the engines and make an emergency landing in Jakarta.

On December 15, 1989, KLM Flight 867 from Amsterdam to Tokyo flew through a thick ash cloud from Alaska’s Mt. Redoubt volcano as the 747 started its descent into Anchorage. All 4 engines failed, and the plane lost 14,000 ft. (4,300m) in altitude before the crew could restart the engines and make a safe landing. The ingested ash caused $80 million in damage to the aircraft, including the replacement of all 4 engines. The expertise of the air crews in both cases averted what could have been disastrous crashes.

The aviation industry learned from those incidents and began grounding flights whenever volcanic ash was present. That’s why most European and North Atlantic flights were cancelled between April 15 and April 20, 2010, when Iceland’s Eyjafjallajökull volcano erupted, ejecting 250 million cubic meters (330 million cubic yards) of volcanic ash into the atmosphere. The ash cloud drifted south and east, covering the sky over the North Atlantic and most of Europe. Many thousands of passengers were stranded in European airports for up to 5 days.

Ash clouds are hard to distinguish from moisture clouds, either visually or by radar. That’s why aircraft continue to wander into them, and why the International Civil Aviation Organization, a UN agency, has set up a network of Volcanic Ash Advisory Centers (VAACs). There are 9 centers located around the world, each covering a geographic region. When an eruption produces an ash cloud, the VAAC for that area uses a computer model to predict the path of the cloud at different flight levels and issues an international alert. VAACs are located in Alaska, Argentina, Australia, Canada, England, France, Japan, New Zealand, and Washington, DC. Fewer incidents have been reported since the centers went into full operation.

On average, 15 major explosive volcanic eruptions powerful enough to eject tons of ash into the stratosphere occur each year. A sudden Mt. St. Helens or Mt. Pinatubo type of super explosion can eject massive amounts of ash into the stratosphere in minutes, creating unexpected hazardous conditions. Air crews must stay ready to act immediately on VAAC ash alerts, and take the necessary evasive action to keep their flights safe and uneventful.  

The Next Tsunami — Where?

According to the USGS, two fault systems in the Americas are at a critical stage. In a December 29, 2013, news release, the USGS states that enough strain may currently be stored in an earthquake zone near the Caribbean island of Guadeloupe to cause a magnitude 8 or larger earthquake and a subsequent tsunami. The release goes on to say that USGS and French researchers studying the plate boundary where 20 of the 26 Caribbean islands are located estimate that enough unreleased strain may have accumulated to create a magnitude 8.0 to 8.4 earthquake. A 7.5-8.5 quake in the same area in 1843 killed thousands in Guadeloupe. A similar quake today could cause many hundreds of fatalities and hundreds of billions of US dollars in damages, and an accompanying tsunami could inflict an even higher toll.

The other fault zone considered to be due for a major failure lies off the northwestern US coastline. The Cascadia Subduction Zone runs 1,100km (700 mi) from Vancouver Island in British Columbia to Cape Mendocino in northern California. Recent studies indicate that a 60km (40 mi) segment of the fault off the coast of Washington is locked. In geological terms, locked means a point where the converging plates have been pressing together without releasing energy, perhaps for hundreds of years. The strain constantly builds until the fault’s frictional strength is exceeded and it finally ruptures.

The last major earthquake and tsunami on the Cascadia struck in 1700. That 9.0 quake triggered a tsunami that flattened trees many miles inland in Washington state and rolled across the Pacific to damage Japanese coastal villages. The Northwest was sparsely populated at the time, so no casualties were recorded. A similar earthquake and tsunami today could be catastrophic. A study commissioned by the Oregon legislature concluded that in Oregon alone a Cascadia 9.0 earthquake and tsunami could kill 10,000 people and cost $30 billion in damages.

Megathrust earthquakes and tsunamis have occurred on the Cascadia every 300 to 600 years, and it has been a little over 300 years since the last one. The Oregonian newspaper recently reported that some geologists put the probability of the Cascadia producing a magnitude 9.0 or greater earthquake within the next 50 years at 10% to 14%. An article in Science Daily suggests the risk could be as high as 37% for a magnitude 8.0 or greater in the same period.
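
One simple way to see where such numbers come from is to treat great quakes as random events with a known average spacing. The sketch below is our illustration of that reasoning, not the method the cited geologists actually used:

```python
# Chance of at least one great Cascadia quake in the next 50 years, assuming
# quakes arrive at random (Poisson) with the 300-600 year mean recurrence
# interval quoted above.

import math

def prob_at_least_one(mean_interval_yr: float, window_yr: float = 50.0) -> float:
    """Probability of one or more events in the window under a Poisson model."""
    return 1.0 - math.exp(-window_yr / mean_interval_yr)

for interval in (300, 450, 600):
    print(f"mean recurrence {interval} yr -> "
          f"{prob_at_least_one(interval):.0%} chance in 50 yr")
# -> roughly 8% to 15%, bracketing the 10%-14% figure reported above.
```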

Still, it’s impossible to say where or when the next big one will strike. Even though the Caribbean and Cascadia faults appear ready to go, the 4 ocean trench fault zones that have produced the biggest earthquakes and tsunamis of the recent past should not be ruled out. The Japan Trench off the northeastern coast of Honshu produced the 9.0 quake in 2011 that killed 20,000. The 2004 Indian Ocean 9.1 earthquake and tsunami that killed more than 200,000 started in the 2,600km (1,600 mi)-long Sunda Trench. The Great Alaska Earthquake, a magnitude 9.2 that struck on Good Friday in 1964, originated in the Aleutian Trench. And the Peru-Chile (Atacama) Trench off the coast of South America generated the largest earthquake on record, the magnitude 9.5 that struck Chile in 1960, killing thousands and sending a tsunami speeding thousands of miles across the Pacific Ocean. These 4 ocean trench fault zones mark the convergence of highly active tectonic plates. All are part of the Pacific Ring of Fire.

Will Yellowstone Erupt?

The magma chamber that powers Old Faithful and the other geysers, hot springs, fumaroles, and mud pots of Yellowstone National Park is considered by scientists to be the largest in the world. And a new study by researchers at the University of Utah finds that the chamber underlying Yellowstone is far larger than previously thought, in both size and amount of molten rock.

According to the study, the Yellowstone Volcano magma chamber is 2.5 times larger than earlier estimates. By using a network of seismometers situated around the park, the research team found that the magma cavern is 90km (55 mi) long, 30km (20 mi) wide, and up to 15km (10mi) deep, containing up to 600 cubic km (144 cubic mi) of hot gas and molten rock.

Geologic research indicates the Yellowstone volcano erupts roughly every 700,000 years. In the last three events – 2.1 million, 1.3 million, and 640,000 years ago – the magma chamber emptied out in a single violent volcanic blast. Millions of tons of rock, sulfur dioxide, and ash rocketed into the atmosphere, blocking sunlight around the world. The empty chamber collapsed, forming a geographic depression, or caldera, and the land for thousands of miles around was blanketed with a thick coat of ash.

The park floor has been rising as the magma chamber continues to swell. Between 2004 and 2009, Yellowstone’s ground uplifted 20cm (8 in); since 2010 the uplift has continued at a slower pace. The park experiences between 1,000 and 3,000 earthquakes a year as magma moves into the chamber. Most are less than Magnitude 3.0 and are seldom felt by park visitors. Scientists believe the next supereruption, if one comes, lies sometime within the next 40,000 years. If and when it blows, it will cause disastrous damage and loss of life across a wide area around the volcano.

Yellowstone sits atop a volcanic hotspot, a pocket deep in the earth that sends a plume of molten rock and hot gas rising into a magma chamber just below earth’s crust. Both the hotspot and the magma chamber are stationary, but the North American Plate, the section of crust upon which Yellowstone is situated, constantly moves southwesterly at 2.5cm (approx. 1 in) a year. Over the past 16.5 million years, as the North American Plate has slowly moved over the hotspot, 15 to 20 massive eruptions have left immense craters dotting the landscape from the Nevada-Oregon border through Idaho’s Snake River Plain. Plate movement eventually positioned the hotspot and magma chamber under Yellowstone. Over the next 16 million years, plate movement will progressively move the hotspot under Montana, North Dakota, and Canada. As the North American Plate moves Yellowstone away from the hotspot over the expanse of geologic time, the park’s geysers will gradually die.

But for now the park’s thermal features remain alive and well and will stay that way over the next few million years. Although the possibility of a blowout remains, USGS and National Park Service scientists with the Yellowstone Volcano Observatory state that they “see no evidence that another such cataclysmic eruption will occur in the foreseeable future.”

Tsunami & Earthquake Networks

Someplace on earth the ground is shaking. According to USGS estimates, there are an average of 1,300,000 earthquakes on our planet every year, or one every 24 seconds. 98% of those quakes are under magnitude 4.0 and many occur in remote locations, so most of us are unaware of the constant seismic activity, even when it happens close by.
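
The "one every 24 seconds" figure is simple division:

```python
# Checking the average interval between earthquakes quoted above.
QUAKES_PER_YEAR = 1_300_000
SECONDS_PER_YEAR = 365.25 * 24 * 3600
print(f"One quake every ~{SECONDS_PER_YEAR / QUAKES_PER_YEAR:.0f} seconds")  # ~24
```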

However, between 1,500 and 2,000 quakes a year fall in the magnitude 5.0 to 9.0 range. Those are the quakes that can do damage on land, and possibly trigger a tsunami if a strong one strikes beneath the seafloor where tectonic plates converge.

Where do USGS and other reporting centers get their real-time information? Two worldwide seismic hazard networks report earthquakes as they happen and provide early warning when a tsunami starts rolling toward land.

The Global Seismographic Network (GSN) is a permanent digital network of 150 land-based and ocean-bottom seismometers positioned in earthquake-prone locations around the world and connected by a telecommunications network. GSN is a partnership among USGS, the National Science Foundation, and the Incorporated Research Institutions for Seismology (IRIS), a consortium of 100 labs and universities worldwide. Although US based, GSN is fully coordinated with the international community, and its stations are operated by USGS and UC San Diego. The network determines the location and magnitude of earthquakes anywhere in the world as they happen. The data is used for emergency response, hazard mitigation, research, and tsunami early warning for seafloor events.

Deep Ocean Assessment and Reporting of Tsunamis (DART) is the main component of an international tsunami warning system. The DART system is based on instant detection and relay of ocean floor pressure changes. DART stations consist of an ocean bottom sensor that picks up changes in pressure as the tsunami wave passes and sends the data to a nearby communications buoy, which transmits it to a satellite, which in turn relays it within seconds to tsunami warning centers around the world.
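
The core idea is that a pressure sensor on the ocean bottom effectively weighs the column of water above it. Below is a minimal sketch of that logic under simplifying assumptions of ours; real DART processing uses NOAA’s own tide-prediction and detection algorithms, and the threshold here is purely illustrative:

```python
# Sketch of tsunami detection from ocean-bottom pressure readings.

SEAWATER_DENSITY = 1025.0   # kg/m^3, assumed average
GRAVITY = 9.81              # m/s^2
ANOMALY_THRESHOLD_M = 0.03  # assumed trigger level (~3 cm departure)

def water_column_height_m(bottom_pressure_pa: float) -> float:
    """Convert bottom pressure to the equivalent height of the water column."""
    return bottom_pressure_pa / (SEAWATER_DENSITY * GRAVITY)

def tsunami_suspected(bottom_pressure_pa: float, predicted_height_m: float) -> bool:
    """Flag a passing wave as a departure from the predicted tidal height."""
    anomaly = water_column_height_m(bottom_pressure_pa) - predicted_height_m
    return abs(anomaly) > ANOMALY_THRESHOLD_M
```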

The US has deployed 39 DART stations in the Pacific, Atlantic, and Caribbean. Australia and Peru have also installed DART systems, and since the 2004 Indian Ocean tsunami that killed over 200,000 people, the nations bordering the Indian Ocean have cooperated in the installation of 6 Indian Ocean DART stations, along with 17 seismic satellite stations. The DART data, along with GSN and satellite data, flow into two major tsunami warning centers: the Pacific Tsunami Warning Center in Ewa Beach, Hawaii, and the West Coast and Alaska Tsunami Warning Center in Palmer, Alaska. It is the job of the tsunami warning centers to issue alerts and warnings to population centers in the path of a developing tsunami.

Although the GSN and DART systems have proved effective, NASA is testing a GPS-based system that can pinpoint an epicenter and determine earthquake magnitude 10 times faster, giving those in peril extra seconds and minutes to evacuate before a tsunami strikes land.

Storm Surge — the Big Killer

When a hurricane strikes land, the storm surge can be more deadly than the storm’s violent wind. Tropical cyclones – called hurricanes in the Atlantic, typhoons in the Pacific, and cyclones in Australia and India – have killed over 1 million people in the past hundred years. The majority of those deaths are attributed to the surge component of the storm.

Typhoon Haiyan hit the Philippine city of Tacloban on November 8, 2013, with a wind speed of 195 mph (315 km/h), the strongest landfall speed ever recorded. More than 6,000 died and the city was leveled. The savage wind took its toll, but it was the 20 ft. (6m) wall of ocean water surging more than a mile (1.6 km) inland that took most of the lives.

When Superstorm Sandy came ashore in New Jersey and New York in late October, 2012, the wind speed was only 115 mph (185 km/h), but the storm was so massive it pushed a 14 ft. (4.3m) storm surge far inland, killing more than 100 people and wiping out or badly damaging thousands of homes. Reconstruction costs have reached $70 billion.

In August, 2005, Hurricane Katrina, a Category 3 with a wind speed of 120 mph (193 km/h), struck New Orleans and Gulf Coast cities in Louisiana, Mississippi, and Alabama. Although the wind did plenty of damage, it was the storm surge, with waves as high as 28 ft. (8.5m), that wiped out shoreline communities, breached New Orleans’ levees, flooded the city, and caused most of the 1,800 deaths.

Some of the most destructive storm surges have occurred in Bangladesh and India. The northern end of the Bay of Bengal is funnel shaped, and storm surges become tidal bores that sweep many miles inland. The Bhola cyclone in 1970 produced a storm surge of 35 ft. (11m), taking 500,000 lives in Bangladesh. The largest storm surges ever recorded took place in India in 1839 when a 40 ft. (12.2m) surge killed 300,000; and in Bathurst Bay, Queensland, Australia, where a 42 ft (12.8m) surge killed 400 in 1899. It was reported at the time that dolphins and fish were found atop the cliffs surrounding the bay.

A storm surge is created by the storm’s high wind piling the ocean’s surface up above ordinary sea level. Low pressure at the center of the weather system adds a lifting effect of its own, contributing to the buildup of the sea and the energy of the surge.
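
The two contributions can be separated with a textbook rule of thumb: the "inverse barometer" effect raises sea level about 1 cm for every hectopascal of pressure drop. The numbers below are our own illustration, not figures from the article:

```python
# Decomposing a storm surge into pressure setup and wind-driven piling.

AMBIENT_PRESSURE_HPA = 1013
M_PER_HPA = 0.01                 # inverse-barometer rule of thumb, ~1 cm per hPa

def pressure_setup_m(central_pressure_hpa: float) -> float:
    """Sea-level rise due to the storm's low central pressure alone."""
    return (AMBIENT_PRESSURE_HPA - central_pressure_hpa) * M_PER_HPA

total_surge_m = 8.5              # e.g. Katrina's peak surge, per the text above
setup = pressure_setup_m(920)    # assumed very deep central pressure
print(f"Pressure setup: ~{setup:.1f} m; "
      f"wind-driven share: ~{total_surge_m - setup:.1f} m")
# Even a very deep low contributes only ~1 m; the wind does most of the piling.
```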

People living near the shoreline in tropical storm-prone areas should be prepared not only to protect property against the high wind, but also to heed storm surge warnings and evacuate before the storm makes landfall.

Will Nuclear Fusion Power the World?

Nuclear fusion holds great potential as a clean power source that might someday power the world. Unlike nuclear fission, fusion produces no long-lived radioactive waste and carries no risk of a runaway meltdown. The aim of fusion research is to create an artificial sun — a superhot plasma that replicates the composition and heat of the sun — and to use the harnessed heat to operate steam generators that make electricity.

Scientists at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory have taken a step closer to achieving ignition, the point at which fusion reactions release enough energy to sustain themselves, the stage needed to create the sun-like plasma.

By focusing 192 powerful laser beams on a tiny fuel pellet made of the hydrogen isotopes deuterium and tritium, the NIF researchers have, for the first time, achieved a stage of fusion in which the energy released by the reaction exceeded the energy deposited in the fuel. Achieving ignition, the final step, will require an ultra-high level of precision in every phase of the process, from the aiming of the laser beams to the fabrication of the fuel pellet. By continually refining the process, the research team is confident it will reach ignition.

When ignition is achieved, a way must be found to contain a plasma far hotter than the core of the sun, on the order of 100 million degrees Celsius (180 million degrees Fahrenheit). Since no material container can withstand such temperatures, other means have to be used. One solution is to keep the hot plasma out of contact with the container walls by holding it on a circular path with magnetic force, a process called magnetic confinement. A magnetic confinement test reactor constructed at Princeton University uses a combination of two magnetic fields to confine and control the plasma. Since an ignition-level fusion plasma has not yet been created, the Princeton reactor has not been fully tested.

Assuming ignition is realized at some point in the near future, it will still be many years before nuclear fusion moves from the lab to commercial application. But when it does, it might very well be the breakthrough that brings clean, plentiful, inexpensive power to the world.

Solar & Wind — Making a Difference?

Depending on where you live, you might pass by miles of wind farms producing electrical power, or arrays of solar panels on rooftops and in open fields. If so, you may have wondered how big a dent renewable energy is making in total power production, and whether it is helping to cut down the use of fossil fuels. 

According to industry estimates, wind farms worldwide have about 300 billion watts (300 gigawatts) of generating capacity, and solar installations about 70 billion (70 GW). As big as those numbers appear, they account for only about 3% of global electricity production. Coal and oil still fuel more than 80% of world power generation, and still pump 10 billion tons of carbon emissions into the atmosphere every year.
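
Installed watts and delivered energy are different things: a watt of capacity yields energy only when the wind blows or the sun shines. The sketch below converts the capacity figures above into yearly energy using typical capacity factors (the factors and the world-demand figure are our assumptions, not the article’s):

```python
# From installed capacity to annual energy, with assumed capacity factors.

HOURS_PER_YEAR = 8760
WORLD_DEMAND_TWH = 22_000   # assumed rough world electricity use per year

for name, gw, cap_factor in (("Wind", 300, 0.25), ("Solar", 70, 0.15)):
    twh = gw * cap_factor * HOURS_PER_YEAR / 1000   # GW-years -> TWh
    print(f"{name}: {gw} GW installed -> ~{twh:,.0f} TWh/yr "
          f"({twh / WORLD_DEMAND_TWH:.1%} of world demand)")
# Together roughly 3% of world electricity, consistent with the figure above.
```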

Wind power production has been increasing 20% a year for the past 10 years. In the United States, wind produces 4% of total national power, but the figure is higher in several states. Wind contributes more than 20% of the power in Iowa, South Dakota, and Kansas. The US Dept. of Energy (DOE) has set a goal of 20% of national power to be produced by wind by 2030.

Wind provides 7% of the power consumed by the countries of the European Union, and Denmark derives more than a quarter of its power from wind. The International Energy Agency projects wind delivering 18% of world electrical power by 2050, which, if achieved, would reduce carbon fuel emissions by approximately 2 billion tons a year.

Solar power is starting from a smaller base, but is now the fastest growing source of renewable energy, having grown 40% a year since 2000. According to the June, 2013, issue of the MIT Technology Review, DOE has set a goal of less than $1 per watt for complete installed systems by the year 2020. If the solar industry hits that target, the direct cost of solar power would come down to 6 cents per kilowatt hour, making it competitive with power now delivered by the grid. The key to cost reduction lies in further advances in solar panel materials, efficiency of installation, ease of connecting to the grid, and battery storage capacity.
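
How does $1 per installed watt become roughly 6 cents per kilowatt-hour? A simplified amortization shows the connection; the lifetime, capacity factor, and overhead multiplier below are assumptions of ours, not DOE’s actual methodology:

```python
# From installed cost per watt to cents per kilowatt-hour.

COST_PER_WATT = 1.00     # DOE target quoted above, $ per installed watt
LIFETIME_YEARS = 25      # assumed system life
CAPACITY_FACTOR = 0.20   # assumed for a sunny location
OVERHEAD = 2.5           # assumed multiplier for financing, inverters, upkeep

lifetime_kwh_per_watt = CAPACITY_FACTOR * 8760 * LIFETIME_YEARS / 1000
cents_per_kwh = 100 * COST_PER_WATT * OVERHEAD / lifetime_kwh_per_watt
print(f"~{cents_per_kwh:.0f} cents per kWh")   # ~6 cents, as quoted above
```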

Some of the solar advances currently being tested are (1) a two-sided panel that potentially will produce 20% more power; (2) flexible solar cells on a new Corning product called Willow Glass, thin enough to roll up; and (3) a cheaper alternative to silicon made of light-absorbing calcium titanium oxide (perovskite) compounds. Modern solar cells achieve 10% to 20% efficiency in converting sunlight to power. Some of the materials under test are showing up to 30% efficiency, a German laboratory has announced a 44% conversion rate, and one laboratory founded by a Caltech professor expects to produce a panel reaching a 50% conversion rate.

When all the solar research comes out of the lab into commercial production, solar energy will be highly cost competitive with all other power sources.

As wind and solar production grows, it is only a matter of time before the two renewable sources will account for half or more of global energy production. It will be a time of cleaner air and cleaner oceans, and might help us slow the galloping advance of global warming.  

Islands That Rise From the Sea

From time to time, a new island will suddenly or gradually appear somewhere in the vast oceans of Planet Earth. Some become permanent islands that attract plant and animal life. Others, eroded by wave and storm action, quickly disappear.

The most recent example of a temporary island materialized off the coast of Pakistan on September 24, 2013, following the magnitude 7.7 earthquake that struck Balochistan province in southwestern Pakistan, killing 850 people and injuring thousands. The seismic waves from the earthquake, whose epicenter lay hundreds of miles away, loosened pockets of methane gas under the seafloor, creating a mud volcano. The eruption blew a pile of seafloor mud and rocks to the surface, forming an island the size of a football field and 60 ft. (18m) high. Because the island is made of soft material, geologists believe it will soon be washed away by waves, tides, and storms.

Other emerging volcanic islands are the tops of undersea mountains built up by repeated volcanic eruptions until they finally emerge above sea level. Surtsey, which rose from the sea off the southern tip of Iceland between 1963 and 1967, is one example of this kind of new volcanic island. The island was originally 1 sq. mi. (2.7km²), but has since lost half its size to erosion. Whether new volcanic activity will rebuild the island is not known.

Anak Krakatau, another example, rose from the submerged caldera of the famous Indonesian volcano Krakatau (Krakatoa) in 1930. The original Krakatau exploded in 1883, in a blast heard thousands of miles away, killing an estimated 36,000 or more and darkening skies with ash for months. Constant volcanic activity has built the new Anak Krakatau to 1,000 ft. (300m) at the summit. The volcano erupts frequently, and the new island continues to grow in size.

A different kind of emerging island is caused by land uplift. This occurs along the coastlines of Sweden and Finland, where a glacier 1 mi. (1.6km) thick began melting away 10,000 years ago. As the weight of the ice subsided, the land it had been pressing down began slowly to rise, and over the 10,000 years since, the freed-up land has gradually emerged from the sea. A number of these uplift islands have appeared along the Scandinavian coastlines.

A new island appeared off the coast of Greenland in 2005 as the Greenland Ice Sheet continued to retreat. There is controversy over whether the island was already there and was merely revealed by the melting of the ice sheet, or whether the retreat of the ice allowed the land to rise from the sea. Either way, Uunartoq Qeqertaq (Greenlandic for “The Warming Island”) is now shown as a new island on world maps.

As new islands appear, many older islands around the world face the prospect of being flooded by rising sea levels. A sea level rise of up to 3.2 ft. (1m) by 2100 is projected by the UN’s Intergovernmental Panel on Climate Change (IPCC) and other scientific organizations. The projections are based on the rate of glacier and ice sheet melt, temperature increases from global warming, and the expansion of ocean water as it warms. Can global warming be slowed and sea level rise reduced? Possibly, if the world cuts back carbon emissions in time and switches to renewable energy.

Is Desalination the Answer?

The widely held view in the scientific community that climate change is bringing much longer periods of drought to wider areas of the world is now supported by the United Nations Intergovernmental Panel on Climate Change. The panel’s recently released Fifth Assessment Report indicates a definite increase in the number of drought days over large portions of the globe.

Farming is especially hard hit during periods of drought. Lack of water stunts plant growth, reduces yields, and even wipes out entire crops in dry farming areas. The 2011 Midwest drought cost $12 billion in crop losses, and 2012 drought damage may exceed that when finally calculated. Texas has suffered a crippling drought over the past few years. And the Colorado River Basin is in the grip of a 14-year drought that threatens to drastically cut water supplies to California, Arizona, Nevada, Colorado, Utah, Wyoming, and New Mexico.

A number of proposals have been put forth for supplementing the Colorado River water supply, including a pipeline from either the Missouri or Mississippi Rivers, but now officials in the region are taking a serious look at desalinating ocean water and pumping it to areas in need.

Advances in technology have reduced the electrical energy needed for processing and pumping desalinated water from a cost of about $1 per m³ (a cubic meter is 264 gallons, or 1,000 liters) to approximately $0.50 per m³. In the United States, water from natural sources such as reservoirs fed by rain and snow runoff, pumped into water distribution networks, now costs the end user between $0.35 and $0.40 per m³, or 20% to 30% less.
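
Electricity dominates the cost of desalinated water, which makes the link between energy use and the per-cubic-meter figures above easy to sketch. The energy intensity and power price below are typical values we are assuming, not numbers from the article:

```python
# Energy share of desalinated water cost, with assumed inputs.

KWH_PER_M3 = 3.5      # assumed energy to desalinate and deliver 1 cubic meter
USD_PER_KWH = 0.10    # assumed industrial electricity price

print(f"Energy cost: ~${KWH_PER_M3 * USD_PER_KWH:.2f} per cubic meter")
# ~$0.35/m3, broadly consistent with the ~$0.50/m3 figure above; it shows why
# cutting kWh per cubic meter is the main lever for closing the gap with
# $0.35-0.40 natural-source water.
```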

New research projects at Lawrence Livermore National Laboratory and Lockheed Martin show promise of making desalination cost competitive with natural-source water once rolled out commercially. The Lawrence Livermore technology is based on carbon nanotubes, tiny cylindrical molecules of carbon atoms. Water flows through the nanotubes, while their tiny pore size blocks the larger salt ions, making for a cheaper way to remove salt from water. This process uses less power to filter more water than any of the methods presently in use. According to Lawrence Livermore researcher Martin Suss, the new method removes salt 5 to 10 times faster than previous systems.

Countries in the Middle East are among the biggest users of desalination. Saudi Arabia, the United Arab Emirates, Bahrain, and Israel depend on desalination for between 70% and 80% of their fresh water. In oil-producing countries such as the Emirates, oil and power are cheap, so the cost of desalination is lower. In Israel and other areas without access to cheap oil, carbon fuels are supplemented with wind and solar power to run the desalination plants.

The world consumes about 25 billion cubic meters of water a day from natural sources. By 2020, the world will have more than 15,000 desalination plants in operation, projected to produce 120 million cubic meters of fresh water a day, less than one half of one percent of total water usage. But desalination will continue to grow, and when it eventually becomes price competitive with natural water, it will grow much faster. Many believe it will prove to be the best way to bring fresh water to the drought-stricken areas of the world.

Drought, Fire, & Flash Floods

In a 3-day period starting September 9, 2013, 18 inches (46cm) of rain drenched the Rocky Mountain Front Range, setting off flash floods that roared through Boulder, Lyons, Estes Park, and other Colorado foothill communities. A dozen dams overflowed and six blew out. Walls of water 20 ft. (6m) high raced down canyons, sweeping away houses and stranding thousands of area residents. As of this writing, the flooding had taken 8 lives and destroyed 1,500 homes.

Average rainfall for the month of September in Boulder is 1.63 in. (4.1cm). So what were the conditions that caused 11 times that amount to fall in 3 days? Many scientists believe that climate change, forest fires, and the severe drought that has gripped the U.S. Southwest for 14 years all played a part.

To begin, a low-pressure center settled over the Great Basin and was held in place by a high-pressure ridge over the Pacific Northwest. The stationary low tapped into a plume of monsoonal moisture coming up from the Pacific Ocean off Mexico and kept drawing it in, as if on a conveyor belt. The storm dumped its deluge on the drought-dried Front Range, where steep canyons run downhill from peaks exceeding 14,000 ft (4,300m).

Professor Brad Udall, director of the University of Colorado’s Wilkinson Center for Natural Resources, said that while current science can’t tie any particular extreme weather event to climate change, this flooding is likely a reflection of global warming. Scientists have warned that as the planet warms, drought and flash flooding will become more prevalent.

According to Sandra Postel, National Geographic’s Freshwater Fellow and a noted authority on water use, the drought that has parched the area and gripped the Colorado River Basin for the past 14 years may be partly to blame for the severity of the floods. She said that drought hardens the soil, so when rains do come, the ground absorbs less water and more of it runs quickly off the land.

Postel added that fires lead to worse flooding because they remove vegetation that can slow and trap rainfall. Hundreds of acres of Front Range forest were scorched by the Fourmile Canyon fire in 2010 and the Flagstaff fire in 2012. The burn area from those fires lies directly above the communities hit by the flash floods in September, 2013.

As our climate continues to warm, this same scenario will most likely be repeated in coming years in areas all over the world. Lengthy droughts, severe wildfires, record flooding, and more intense tropical storms are all expected to be part of our future climate menu.  

Hidden Danger Under the Sea

Our planet’s oceans cover the world’s largest volcano and longest mountain range, and hide seamount peaks and erupting volcanoes lying close enough to the surface to pose a danger to shipping. Tamu Massif, the world’s largest volcano, lies 1,600km (1,000 mi) east of Japan. Covering an area the size of New Mexico, the massive mountain rises 4,400m (14,400 ft) from the ocean floor. Its summit, 1,980m (6,500 ft) below the surface, is not a threat.

The world’s longest continuous mountain range, the Mid-Ocean Ridge (MOR), stretches unbroken for 65,000km (40,000 mi) along our planet’s ocean floor. This long stretch of mountainous terrain, which threads around the globe like the raised seams of a baseball, marks the boundaries of the dozen or so major tectonic plates that make up the earth’s crust. The average height of the MOR’s peaks is 1,500m (5,000 ft), and they are not considered hazardous.

But danger lurks in the tops of a few of the estimated 30,000 seamounts, self-standing undersea mountains not associated with the Mid-Ocean Ridge. Seamounts are usually extinct volcanoes rising up to 5,000m (16,000 ft) from the ocean bottom. Most leave a wide clearance between summit and surface, but some can spell danger.

In 2005, the submarine USS San Francisco ran into an uncharted seamount southeast of Guam at a speed of 35 knots, killing one seaman and causing extensive damage to the vessel. In 1973, the merchant vessel MV Muirfield badly damaged its keel when it struck an uncharted seamount near the Cocos Islands in the Indian Ocean. The undersea peak was later charted and named Muirfield Seamount.

In 1985, the aircraft carrier USS Enterprise struck the Cortes Bank reef 100 mi (160km) southeast of San Diego, California. The Cortes Bank is a submerged mountaintop that was once the outermost island in Southern California’s Channel Islands chain. Its shoals range from 3 to 30m (30 to 100 ft) in depth and are marked as a hazard to shipping. The Bowie Seamount off the coast of British Columbia is a 3,000m (9,600 ft) undersea peak with a summit to surface clearance of only 24m (75 ft). Bowie is charted and avoided by shipping.

Erupting undersea volcanoes can also pose a shipping hazard. The Myojin-Sho undersea volcano, lying 450km (280 mi) south of Tokyo, rises to within 50m (164 ft) of the surface. In September, 1953, the Japanese scientific vessel Kaiyo Maru No. 5 was conducting research in the area when the volcano erupted. The ship was destroyed and its crew of 31 lost.

Kick-’em-Jenny is another dangerous undersea volcano, located 8km (5 mi) north of Grenada in the Caribbean. Rising to within 180m (590 ft) of the surface, it had a massive blowout in 1939 and has erupted intermittently since, most recently in December, 2001. Kick-’em-Jenny is still considered active and dangerous; it is charted and lies within a navigation exclusion zone.

Considering the thousands of ships that depart from ports all over the world every day, deep water shipping is statistically a very safe pursuit. But from time to time a storm, navigation error, or undersea hazard takes its toll.

Will Hurricane Season Slow Sandy Recovery?

As the $50 billion that Congress appropriated for Superstorm Sandy aid funnels into reconstruction projects along New York and New Jersey beaches, NOAA continues to predict a very active, above-normal 2013 Atlantic hurricane season.

According to NOAA, oceanic and atmospheric conditions in the tropical Atlantic and Caribbean so far in the summer of 2013 are conducive to hurricane formation. These conditions include above-average sea surface temperatures; an enhanced African monsoon season; the presence of 2 named tropical storms in June and July; and an ENSO (El Niño Southern Oscillation) Neutral condition in the equatorial Pacific.

Atlantic hurricane activity is suppressed during an El Niño year, when the warmer water in the Pacific weakens Atlantic wind patterns, and enhanced during a La Niña year, when cooler tropical water strengthens them. ENSO Neutral means water temperature along the equator between Asia and South America remains normal and has little effect on wind patterns in other parts of the globe. Historically, years with early season named storms have a high likelihood of an above-normal hurricane season.

NOAA is currently staying with its pre-season estimate of 13 to 19 named storms, 6 to 9 hurricanes, 3 to 5 major hurricanes, and an Accumulated Cyclone Energy range of 120% to 190% of the median, meaning wind speeds and storm surge are expected to be above average. NOAA makes no predictions on how many, if any, of the expected storms will make landfall. Many hurricanes stay out to sea or lose power as they approach land.
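
Accumulated Cyclone Energy (ACE) itself is a simple formula: square each storm’s maximum sustained wind (in knots) at six-hour intervals while it is at tropical-storm strength or better, sum the squares, and scale by 10^-4. A minimal sketch with a made-up wind history:

```python
# ACE for one hypothetical storm; a season's ACE sums this over all storms.

def ace(six_hourly_winds_kt):
    """Sum of squared 6-hourly max winds (knots, >= 35 kt only), scaled by 1e-4."""
    return sum(w ** 2 for w in six_hourly_winds_kt if w >= 35) * 1e-4

storm = [35, 45, 60, 80, 95, 90, 70, 50]      # hypothetical 2-day wind history
print(f"ACE contribution: {ace(storm):.1f}")  # ~3.8 units for this one storm
```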

Shoreline reconstruction is moving ahead in New York and New Jersey, but much of it is still in the early stages. An example is a planned 16 ft. steel wall to be covered with sand for the protection of beachfront homes in 2 New Jersey townships. Funding has been approved, and the Corps of Engineers will manage the construction, but the necessary homeowner easements have not yet been received, so the project is on hold.

We hope that the storm-damaged areas have a chance to rebuild completely, observing storm-resistant building codes, before the next Sandy comes to visit.

Sobering Climate Change Report

Every six or seven years, under the auspices of the United Nations, the world’s leading climate scientists come together to write a climate change status report. This latest report of the UN’s Intergovernmental Panel on Climate Change (IPCC) is being produced by 830 international scientists from universities, laboratories, and government agencies such as NOAA and USGS.

The scientists involved are participating in three working groups: The Physical Science Basis; Impacts, Adaptation & Vulnerability; and Mitigation of Climate Change. The IPCC has completed the first draft of its Fifth Assessment Report, to be published in final form in 2014. However, a summary draft of the report was leaked to the media ahead of schedule.

Several key points in the leaked summary probably will not change in the final document. One of the most important is the increase in probability that global warming is caused by humans. The 2001 Assessment gave a 66% probability that most of the observed warming is caused by humans. The next report, published in 2007, raised that probability to 90%. The latest report to be published in 2014 ups that probability to 95%. The continued burning of fossil fuels is considered the major cause of the rapid warming of the planet.

A second point is that sea level rise is now labeled unequivocal. In the opinion of the world’s climate scientists, sea level rise WILL take place, no ifs or maybes. The actual amount of rise between now and the end of the century is still being debated. It depends on how fast the world’s glaciers and ice sheets melt, and how fast the ocean warms. When ocean water warms, it expands and adds to sea level rise. Estimates run from a 3 ft. (0.91m) rise to a 6 ft. (1.83m) rise.
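
The thermal-expansion contribution is easy to approximate with a back-of-the-envelope calculation. In the sketch below, the expansion coefficient and the depth of the warming layer are illustrative assumptions (the real coefficient varies with temperature, salinity, and pressure):

```python
# Rough estimate of thermosteric (thermal-expansion) sea level rise:
# rise = alpha * warming * layer depth.

ALPHA = 2.0e-4       # 1/degC, assumed near-surface expansion coefficient
LAYER_DEPTH_M = 700  # assumed depth of the warming layer, meters
WARMING_DEGC = 1.0   # assumed average warming of that layer

rise_m = ALPHA * WARMING_DEGC * LAYER_DEPTH_M
print(f"Thermosteric rise: {rise_m * 100:.0f} cm")  # ~14 cm
# One degree of warming in the upper ocean alone adds roughly 14 cm,
# before any meltwater from glaciers and ice sheets is counted.
```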

Another finding in the summary is that CO2 in the atmosphere has increased by 20% since 1958, and 40% since 1750. To quote the summary, “virtually all (of the increase) is due to burning of fossil fuels and deforestation, and a small contribution from cement production.”
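
Those percentages can be sanity-checked in a few lines, assuming the widely cited baselines of roughly 280 ppm CO2 in 1750 and 315 ppm at the start of the Mauna Loa record in 1958 (both baselines are assumptions for this sketch, not figures from the report):

```python
# Cross-check the report's percentage increases against standard baselines.

PPM_1750 = 280.0  # assumed pre-industrial CO2 concentration
PPM_1958 = 315.0  # assumed concentration at the start of the Keeling record

print(f"40% above 1750: {PPM_1750 * 1.40:.0f} ppm")  # ~392 ppm
print(f"20% above 1958: {PPM_1958 * 1.20:.0f} ppm")  # ~378 ppm
# Both land in the neighborhood of the 380-400 ppm measured in recent
# years; the rounding in the summary accounts for the spread.
```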

And finally, it is stated that climate change will continue for centuries even if emissions of greenhouse gases are stopped totally. In the words of the report, “This represents a substantial multi-century commitment created by past, present, and future emissions of CO2.”

To summarize the leaked summary: the world’s climate will get increasingly warmer; sea levels will continue to rise; the main driver of these changes is the continued burning of fossil fuels and the resulting CO2 emissions; and even if we stopped burning fossil fuels today, global warming would continue for centuries to come.

One of the IPCC panels working on the Fifth Assessment is concerned with mitigation. Even though a lot of long-lasting damage has already been done, it’s not too late to replace carbon-based energy with wind, solar, bio-fuels, and storage battery technology to help us cut CO2 emissions and slow the pace of global warming. Reversing global warming seems to be a lost cause, but we hope the mitigation panel will present ideas that will help us cope with the changes to come.

Landslides Take a Deadly Toll

When most people hear “Natural Disaster” they think first of highly dramatic events such as earthquakes, volcanoes, and tsunamis. But it turns out that landslides and mudslides, though seldom in the headlines, are among the world’s most costly natural disasters in loss of life and property damage.

According to a 2012 study published in the journal Geology, 32,000 people died in landslides between 2004 and 2010. The lead author of the study, Dave Petley of the International Landslide Centre at Durham University, said, “Total fatalities are even higher than that, as my analysis only considered landslides triggered by rainfall. If other landslides are taken into account, especially those triggered by earthquakes, the death toll rises to a remarkable 80,000.”

Combined statistics for the years 2011 to 2013 have not yet been compiled, but landslides go on month after month, year after year in dozens of countries around the world. Brazil, China, Colombia, Guatemala, Pakistan, Bangladesh, India, Mexico, the Philippines, Japan, Italy, and Uganda are among the places where collapsing hillsides have caused multiple deaths. In the years since the study’s data window closed in 2010, additional thousands have died in landslides and mudslides. One example: 40 people were buried in a mudslide when monsoon rains struck the Sichuan province of China on July 10, 2013. The costs in property loss, evacuations, and restoration amount to many billions of dollars a year.

The main cause of landslides is prolonged or heavy rain that saturates and destabilizes a hillside, causing a section to detach and slide to the bottom. Many landslides start during the monsoons and tropical storms of mountainous South and East Asia, and during hurricanes and tropical rainstorms in the mountains of Central and South America. Landslides occur in the mountains of North America and Europe, too, but those hillsides are not as heavily populated as hillsides in developing countries, and more mitigation measures are in place.

Natural hillsides are inherently stable. What destabilizes them and makes them vulnerable to collapse under heavy rainfall is deforestation, or loss of vegetation. Vegetation absorbs water and keeps a hillside dry, and root systems strengthen and stabilize the slope. Wildfires, clear-cutting of timber, and stripping natural ground cover to create farmland all leave a hillside susceptible to failure.

Grading, adding fill, road building, and irrigating a slope for farming all add weight and loosen and destabilize hillside soil. All of these actions can contribute to a landslide when soaking rainstorms saturate the hill. Water saturation reduces the friction between soil particles, and the water itself adds weight to the slope. Without enough friction to hold the soil in place, and with many tons of water added, a hillside becomes a heavy mass of mud and a segment can detach and slide to the bottom.
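
Geotechnical engineers capture this friction-versus-weight balance in the classic “infinite slope” model. The sketch below is a simplified cohesionless version; the slope angle, friction angle, and unit weights are illustrative assumptions, and a factor of safety below 1.0 means the slope can fail.

```python
# Infinite-slope factor of safety (FS) for a cohesionless soil, showing how
# rising water saturation pushes a stable hillside toward failure.
import math

def factor_of_safety(slope_deg, friction_deg, saturation,
                     soil_unit_weight=18.0, water_unit_weight=9.81):
    """FS = (1 - pore-pressure ratio) * tan(friction) / tan(slope).

    saturation: fraction of the soil column filled with water (0 dry, 1 full).
    Unit weights are in kN/m^3; 18 is a typical assumed value for soil.
    """
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    pore_pressure_ratio = saturation * water_unit_weight / soil_unit_weight
    return (1.0 - pore_pressure_ratio) * math.tan(phi) / math.tan(beta)

# A 30-degree hillside with a 35-degree friction angle: stable when dry,
# past the point of failure when a storm fully saturates it.
for sat in (0.0, 0.5, 1.0):
    print(f"saturation {sat:.0%}: FS = {factor_of_safety(30, 35, sat):.2f}")
```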

Landslides occur hundreds of times a year all over the world. All it takes is a hillside and heavy rain. As long as people build homes on and below hillsides and mountainsides, there will be casualties and damage to property. Mitigation measures such as drainage systems and retaining walls help, but even with those in place, hillsides still come down.               

Disasters – Natural & Manmade

Disasters come in all shapes and sizes. People die in earthquakes, tsunamis, floods, fires, hurricanes, and volcanic eruptions. Then there are the not-so-natural disasters that happen as a result of human failings. In July, 2013, more people met their deaths in manmade disasters than in the natural kind.

July’s major natural disaster occurred in the Gansu province of China. A magnitude 5.9 earthquake struck this mountainous area of western China on July 22, killing 89, demolishing 1,200 homes, and badly damaging 21,000 more. Some 500 people were injured and 27,000 were left homeless. Gansu borders Sichuan province, where a magnitude 6.6 quake in April killed 166 people. This part of China sits on the edge of the Tibetan Plateau, where the Indo-Australian Plate converges with the Eurasian Plate, one of the world’s most seismically active areas.

While 89 people perished in the Gansu earthquake, 132 people died in three disasters caused by human error during the month of July, 2013. On July 6, a train of 72 full oil tanker cars made a shift-change stop uphill from the small town of Lac-Mégantic in Quebec. The engineer said he set the train’s brakes before leaving, but the unattended train started rolling downhill, gathered speed, and derailed in the middle of the small Canadian town. At least 5 of the tanker cars exploded, erupting into a wall of flame that destroyed downtown Lac-Mégantic. At least 50 people died in the fire.

On the same day, July 6, an Asiana Airlines plane flying from Seoul, South Korea, crashed while landing at San Francisco International Airport. Witnesses reported that the Boeing 777, carrying 291 passengers and 16 crew, came in too low and too slow, losing flying speed before striking the seawall short of the runway. 3 people died in the accident, and 182 were injured and taken to local hospitals. The NTSB’s investigation is ongoing, but aviation observers say pilot error is a possible cause.

A passenger train travelling from Madrid to Santiago de Compostela derailed in northwestern Spain on July 24, killing 79 people.  The train was travelling at 190 kilometers per hour (120 mph) when it entered a sharp bend in the track. The speed limit for that section of track is 80 km/h (50 mph). Why the engineer took the curve at full speed is not known. What is known is that the entire train flew off the track and smashed into a concrete retaining wall. Spanish authorities are holding the engineer in custody and considering charging him with 79 counts of homicide.

With rare exceptions, freight trains are safe, airline flights are safe, and passenger trains are safe. But from time to time, human error will unfortunately contribute to a major disaster. Murphy’s Law states that anything that can go wrong will go wrong, often at the worst possible time. The Peter Principle holds that members of an organization tend to rise to their level of incompetence. Did either or both of these time-honored rules apply in any of these cases? No one really knows.

Pacific Typhoon Season

While residents of the East and Gulf Coasts of North America prepare for hurricane season, people in Japan, Taiwan, the Philippines, and the coast of China get ready for typhoon season. Hurricane and typhoon are different names for the same type of tropical storm — what meteorologists refer to as tropical cyclones. Both originate in warm tropical waters as tropical waves that evolve into tropical depressions. Both typhoons and hurricanes are rated on the same Saffir-Simpson Hurricane Wind Scale, graduating from tropical depression to tropical storm to Category One typhoon or hurricane when wind speed reaches 74 mph (119 km/h).
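
The category boundaries are fixed wind-speed thresholds, so the whole scale fits in a few lines (the values below are the scale as revised in 2012):

```python
# Saffir-Simpson classification from maximum sustained winds in mph.

def saffir_simpson_category(wind_mph):
    if wind_mph < 39:
        return "Tropical depression"
    if wind_mph < 74:
        return "Tropical storm"
    if wind_mph < 96:
        return "Category 1"
    if wind_mph < 111:
        return "Category 2"
    if wind_mph < 130:
        return "Category 3"
    if wind_mph < 157:
        return "Category 4"
    return "Category 5"

print(saffir_simpson_category(74))   # Category 1
print(saffir_simpson_category(115))  # Category 3 (Typhoon Soulik, below)
```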

The 2013 typhoon forecast issued by the Tropical Storm Risk consortium at University College London predicted between 23 and 27 tropical storms in the Western Pacific during 2013, an average year. The historical average is 25.7 tropical storms per season. Seven have already occurred this year, including Tropical Storm Rumbia, which struck the Philippines and China in late June and early July, causing widespread flooding and mountain landslides that killed 55 and made thousands homeless.

On July 13, 2013, Typhoon Soulik, the first Pacific storm of the season to strengthen to Category 3 status, battered Taiwan with 115 mph (185 km/h) winds and torrential rain. One mountain village reported 24-hour rainfall of 35 inches (900mm). Taiwan authorities had evacuated a number of mountain villages to prevent landslide deaths. Even so, 2 people were killed and over 100 injured in Taiwan by the storm.

Halsey’s Typhoon. One of the most notable typhoons in history took place during World War II. On December 17, 1944, Admiral William “Bull” Halsey inadvertently sailed the U.S. Third Fleet into the heart of Typhoon Cobra, with winds gusting to 145 mph (230 km/h), extremely high seas, and torrential rain. The fleet, consisting of 13 aircraft carriers, 8 battleships, 15 cruisers, and 50 destroyers, had been refueling after a series of air raids on Japanese installations on Luzon, and was caught unprepared. 3 destroyers capsized in the storm, with a loss of over 800 lives; 93 sailors from the capsized ships were rescued by other ships in the fleet. Nine other ships were badly damaged, more than 100 aircraft were wrecked or washed overboard, and almost every ship in the fleet sustained some damage. A court of inquiry found that Admiral Halsey had committed an error of judgment in sailing the fleet into the storm, but no sanctions were recommended. Many believe the novel The Caine Mutiny was based on the experience of its author, Herman Wouk, who was serving aboard one of the fleet’s ships during Typhoon Cobra.

Dead Zones Plague World’s Oceans

Chemical fertilizer may grow bumper crops, but it also poisons large areas of the world’s oceans. Spring rains wash the residue out of the soil and into tributaries and streams that feed major rivers, sending tons of nitrogen and phosphorus spilling into the ocean. The nutrients feed massive algae blooms; when the algae die and sink, their decomposition consumes the dissolved oxygen in the surrounding water. Fish, shellfish, and marine life of all kinds cannot exist in water without oxygen. The affected area becomes what is called a dead zone.
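
“Without oxygen” has a working definition: water is generally considered hypoxic when dissolved oxygen falls below about 2 mg/L, the level at which most marine animals cannot survive. A minimal sketch using that commonly cited threshold (the sample readings are hypothetical):

```python
# Flag hypoxic (dead-zone) water from dissolved-oxygen readings in mg/L.

HYPOXIA_THRESHOLD = 2.0  # commonly used dissolved-oxygen cutoff, mg/L

def classify_oxygen(do_mg_per_l):
    if do_mg_per_l < HYPOXIA_THRESHOLD:
        return "hypoxic (dead zone)"
    if do_mg_per_l < 5.0:  # below ~5 mg/L many species show stress
        return "low oxygen"
    return "healthy"

# Hypothetical bottom-water readings from a summer survey:
for reading in (7.1, 4.3, 1.2, 0.4):
    print(f"{reading} mg/L -> {classify_oxygen(reading)}")
```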

According to a 2008 UN study, there are more than 400 dead zones around the world. One of the largest begins at the mouth of the Mississippi River in the Gulf of Mexico. This past year, a combination of thousands of acres of heavily fertilized corn grown for ethanol production and severe spring flooding in the American Midwest has flushed unusually large amounts of chemicals into the gulf. The USGS estimates that in May, 2013, alone, 153,000 metric tons of chemical nutrients flowed into the Gulf of Mexico. As a result, NOAA forecasts that the 2013 Gulf of Mexico hypoxic dead zone will grow to as much as 8,500 square miles (22,000 sq km), an area the size of New Jersey. The expanding dead zone is decimating the Louisiana shrimp industry and damaging other valuable Gulf fisheries.

Other notable dead zones are the Black Sea, between southeastern Europe and Asia Minor, and the Baltic Sea in northern Europe. Both are inland seas connecting to larger oceans through narrow straits. Salt water exchange with the open ocean is restricted, and many rivers and streams empty into them, reducing the salinity and keeping them brackish. More than 250 rivers and streams, many carrying agricultural waste, flow into the Baltic, and the 9 countries bordering the sea discharge sewage into it. The Black Sea deals with the same set of problems: restricted salt water exchange and a constant inflow of chemicals and sewage, producing algae blooms and dead-zone water with little or no oxygen.

The good news is that dead zones are reversible. During the Soviet era, large amounts of chemical fertilizers were used to increase crop production, the waste flowing into the Black Sea. After the Soviet collapse, farms in the Black Sea area stopped using chemical fertilizers. By the 1990s, a large area of the Black Sea had recovered, and marine life was once again thriving. Countries along the Rhine River have cooperated in cutting agricultural discharge, resulting in a 35% improvement in the North Sea dead zone. Dead zones have also been reduced in San Francisco Bay, Chesapeake Bay, and the Hudson River.

Even though some dead zones are being cleaned up through community action, others keep popping up. As world population increases, the need to produce more food increases the use of chemical fertilizer, much of which finds its way into the ocean. The UN is working with member countries to find ways to mitigate the problem. The question is: can we have bumper crops and clean oceans at the same time? We hope a solution can be found.

Emissions, Thunderstorms, & Climate Change

Smoke from wildfires in California and Colorado can make it hotter in Budapest. Isoprene, a naturally occurring hydrocarbon rising in a summer mist from the forests of the Southeastern U.S., can damage the ozone layer that protects earth from harmful ultraviolet radiation.

NASA’s Earth Science Division will employ three different aircraft and a fleet of satellites over the Southeastern U.S. this summer to collect data for a project called SEAC4RS, Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys. Quite a mouthful; in plainer terms, the researchers will examine how localized pollutants change the composition of thunderstorms and invade the upper atmosphere, and how that affects the global climate cycle.

In a prior study conducted in June, 2012, scientists at the Department of Energy’s Pacific Northwest National Laboratory found that locally occurring contaminants such as car exhaust, factory smoke, and methane from cow manure rise with heat updrafts into the clouds of summer thunderstorms. Sucked up in the form of tiny aerosol particles, the local pollutants mix with the water in the cloud as it condenses to form a thunderhead. The pollution then acts to divide the water droplets inside the storm cloud, making them too small to create rain. Polluted storm clouds, instead of bringing cool rain, become heat traps, gradually radiating stored heat back toward the surface.
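
The droplet-splitting effect follows from simple geometry: a cloud holds a roughly fixed amount of liquid water, so the more aerosol particles there are to seed droplets, the smaller each droplet must be. A sketch with illustrative textbook values for water content and droplet counts:

```python
# Mean cloud-droplet radius when a fixed amount of liquid water is shared
# equally among N droplets: r = (3*LWC / (4*pi*N*rho_water))^(1/3).
import math

def mean_droplet_radius_um(lwc_g_per_m3, droplets_per_cm3):
    lwc = lwc_g_per_m3 * 1e-3   # kg of water per m^3 of cloud
    n = droplets_per_cm3 * 1e6  # droplets per m^3
    rho_water = 1000.0          # kg/m^3
    return (3 * lwc / (4 * math.pi * n * rho_water)) ** (1 / 3) * 1e6

# Same water, ten times the droplets in the polluted case:
print(f"clean cloud:    {mean_droplet_radius_um(0.3, 100):.1f} microns")   # ~8.9
print(f"polluted cloud: {mean_droplet_radius_um(0.3, 1000):.1f} microns")  # ~4.1
# Droplets this small are slow to collide and merge into raindrops, which
# is how pollution can keep a thunderhead from raining.
```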

In the new SEAC4RS study, earth’s atmosphere will be probed from top to bottom with aircraft, satellite, and ground-based sensors at the critical time of year when regional air pollution and natural emissions pump gases and particles high into the atmosphere, with potentially global consequences for Earth’s climate.

Brian Toon of the University of Colorado, the study’s lead scientist, states, “In summertime across the United States, emissions from seasonal fires, metropolitan areas, and vegetation are moved upward by thunderstorms and the North American Monsoon. When these chemicals get into the stratosphere, they can affect the whole Earth. They may also influence thunderstorm behavior. We hope to better understand how all these things interact.”

Data will be collected from specialized instruments on (1) a fleet of formation-flying satellites known as NASA’s A-Train, (2) an ER-2 high-altitude aircraft that flies to the edge of space, (3) a DC-8 flying at lower levels, and (4) a specialty aircraft that measures cloud properties. A network of ground-based sensors will also be used.

By analyzing the collected data, the scientists expect to achieve a more precise understanding of how manmade and natural pollutants affect global climate:  What happens when polluted clouds travel with a weather system? Which pollution particles are absorbed by clouds, and which go directly into the stratosphere, and in what amounts? What happens when the jet stream carries stratospheric pollution around the world? Knowing more about this process should help us find more and better ways to reduce emissions and slow the pace of global warming.