July, 2014 – Bad & Not So Bad

Four natural disasters struck different parts of the world in the first half of July, 2014. One was quite destructive, causing multiple fatalities, injuries, population displacement, and considerable property damage. The other three, though serious, with some injuries and property loss, could have been worse. But all served as reminders that major catastrophes have struck these areas in the past, and will do so again in the future.

July 15 – Typhoon Rammasun, a Category 3 Tropical Cyclone with winds gusting to 170 km/h (106 mph), swept across the island of Luzon in the Philippines. 38 people died in the storm, 25,000 homes were damaged or destroyed, over a half million people took refuge in evacuation centers, and 2 million homes lost electrical power. Rice, corn, and other crops suffered $15 million in losses due to flooding. Typhoons causing much greater devastation have hit the Philippines many times in the past, including Super Typhoon Haiyan that struck the central Philippines in November, 2013, killing more than 6,000.

July 11 – Japan Earthquake. At 4:22 am local time, a magnitude 6.8 earthquake struck off Japan’s northeast coast near Fukushima, site of the devastating 9.0 megathrust quake and tsunami of March, 2011, that wiped out villages, killed 19,000, and knocked out the Fukushima nuclear power plant. 100,000 people who were evacuated at the time are still unable to return to their homes because of radiation contamination. Authorities reported only one injury and no significant damage from the July 11 quake. 8 coastal towns in the area issued evacuation advisories, causing thousands of people to move to higher ground. The advisories were cancelled 2 hours later when the tsunami wave created by the earthquake turned out to be only 20cm (8 in) high.

July 7 – Mexico/Guatemala Earthquake. At 6:23 am local time, a magnitude 6.9 quake rattled southern Mexico and Guatemala, killing 3, injuring 35, and causing widespread property damage. The quake’s epicenter was on the Pacific Coast in a seismically active area that has spawned 12 quakes of magnitude 7.0 or higher in the past 100 years. In 1985, a quake registering 8.1 with its epicenter off the Pacific Coast in the same general area caused extensive damage and loss of life in Mexico City 220 miles (350km) away. The official death toll for the 1985 quake stands at 10,000, but other sources estimate fatalities could have been as high as 40,000.

July 3 – Hurricane Arthur, the first named storm of the Atlantic hurricane season, made landfall in North Carolina with a sustained wind speed of 100 mph (155km/h). Classified a Category 2 storm, Arthur weakened as it travelled north, moving ashore again in New England as a tropical storm, bringing flooding and power outages. No deaths or injuries directly related to the storm were reported. However, Arthur was a reminder that far deadlier hurricanes have hit the US East Coast in the past, such as Sandy in 2012, and will again at some future time.

The motto for all these areas vulnerable to natural disasters should be the same as that of the Boy Scouts: Be Prepared.        


Can We Feed 9 Billion?

One billion people on this planet suffer from chronic hunger. With world population projected to increase from the present 7 billion to 9 billion by 2050, will there be enough food to go around, or will even more human beings go chronically hungry? Chronic hunger means a basic lack of calories and protein to sustain human health. One-third of the children in developing countries now experience stunted growth, and malnourished people are far more susceptible to disease.

According to the Food and Agriculture Organization of the UN (FAO), if population increases to 9 billion, food production will need to rise by 70%, and in the developing world it will need to double. The projected increases in food production will have to overcome rising energy prices, growing depletion of ground water, loss of farmland to urbanization, and increased drought and flooding due to climate change.

Other challenges include a rising middle class in China, India, and other parts of the developing world. As disposable income increases, demand for meat products goes up. Raising cows, pigs, and chickens requires multiple pounds of feed for each pound of meat produced. That means a huge increase in demand for grain, water, and land.
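
To put rough numbers on that multiplier effect, the sketch below converts a hypothetical increase in meat demand into the extra feed grain it would require. The feed-conversion ratios are illustrative round numbers, not figures from this article.

```python
# Back-of-envelope: extra grain demand implied by rising meat consumption.
# Feed-conversion ratios (kg of feed per kg of meat) are illustrative
# round numbers; real values vary with breed, feed, and farming practice.
FEED_CONVERSION = {"beef": 7.0, "pork": 4.0, "chicken": 2.0}

def extra_feed_tonnes(extra_meat_tonnes: dict) -> float:
    """Additional feed (tonnes) implied by extra meat demand."""
    return sum(FEED_CONVERSION[meat] * tonnes
               for meat, tonnes in extra_meat_tonnes.items())

# Hypothetical example: 10 million extra tonnes of each meat per year.
demand = {"beef": 10e6, "pork": 10e6, "chicken": 10e6}
print(f"Extra feed required: {extra_feed_tonnes(demand) / 1e6:.0f} million tonnes")
# Prints: Extra feed required: 130 million tonnes
```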

Agriculture is a major emitter of CO2, methane, and nitrous oxide, pumping more greenhouse gases into the air than all our cars, trucks, trains, and airplanes combined. With ramped-up food production, those emissions will increase even more just as the world is trying to reduce the volume of greenhouse gases going into our atmosphere.

More efficient farming will be needed to overcome these problems. To Big Ag (companies such as DuPont, Monsanto, John Deere, and Archer Daniels Midland), efficiency means using advanced farming techniques with the latest innovations in fertilizer, equipment, and genetically modified seed to produce more food per acre or hectare. However, the UN’s FAO believes the answer lies in helping small farmers in developing countries improve production locally by preserving natural resources and practicing better organic farming methods. This includes reduced tillage to save soil (wind blows tilled topsoil away), crop rotation to save soil nutrients, and improved seeds to save water. Rather than use genetically modified seeds, FAO recommends using traditional breeding methods to develop seeds that need less water, produce more, and resist pests and disease.

It seems that both approaches will be needed if the world is to meet the challenge of feeding 9 billion people a healthy diet, including those who now suffer from chronic malnutrition.  


Rising Seas, Sinking Land, & Flooded Cities

Oceans are warming and expanding in volume. Glaciers are melting at a rapid pace around the world. The Greenland Ice Sheet is losing mass at an alarming rate. The West Antarctica Ice Sheet is eroding and could eventually collapse from ocean water heated by thermal vents recently discovered underneath it. This all adds up to a projected sea level rise of up to 3.3 ft. (1m) by 2100, and possibly more, depending on how fast the Antarctic Ice Sheet melts.

What does this mean for low-lying cities along the US coastlines? Based on studies by USGS, NOAA, National Climate Assessment, and several university research teams, the cities most at risk are located along the Atlantic seaboard from Boston to Florida.

A 2012 study by USGS concludes that sea levels along the US east coast will rise 3 to 4 times faster than the global average during the balance of the 21st Century. While sea levels worldwide are projected to rise 2 to 3.3 ft. (0.6m to 1m) by 2100, they are expected to surge more than 6 ft. (1.8m) along the Atlantic coast. The USGS study and the 2013 National Climate Assessment named Boston, New York, Norfolk, and Miami as the large population centers most vulnerable to sea level rise flooding.

Boston. A University of Massachusetts Boston study indicates that a 2 ft. (0.6m) sea level rise by 2060 would mean twice-a-day flooding in the lower parts of Boston. During hurricanes, storm surges could flood 30% of the city, including Back Bay and the Harvard campus. The design firm Sasaki Assoc. concludes that approximately 200,000 residents, 89,000 housing units, and $8 billion in property values are vulnerable to flooding during a major storm surge.

New York/New Jersey. The flooding that occurred during Hurricane Sandy is an example of what can happen when rising sea levels combine with a major storm surge. Lower Manhattan and parts of the New Jersey coast suffered severe flood damage when the storm surge overtopped Manhattan’s seawall and powerful waves washed far inland along the Jersey Shore.

An additional problem for most of the mid-Atlantic coast is land subsidence. As sea levels continue to rise, the land is gradually sinking, so that even small sea level changes can result in major damage. When the glaciers that once covered the area retreated after the last Ice Age, the land that had been compressed by the weight of the glacial ice gradually rose, while adjacent land such as the seacoast that had been squeezed higher started to sink. That process has continued for thousands of years and shows no sign of stopping. Islands in Chesapeake Bay that once stood well above the water line have disappeared over the past 50 years as the bottom sinks and the water level rises.

Norfolk. The land in Norfolk and Virginia’s Tidewater region is sinking especially fast, causing streets throughout the area to flood even during normal high tides. The city is spending millions to raise streets and improve drainage. A long-term fix could cost $1 billion, money the city doesn’t have. The mayor acknowledged that some areas might have to be abandoned.

Miami. The low-lying greater Miami area, with a population of 5.7 million, is one of the worldwide communities most at risk from sea level rise flooding. Miami Beach, at an elevation of 4.4 ft. (1.3m), is already seeing frequent salt water street flooding at high tide. Miami is exceptionally vulnerable because it is built on top of porous limestone, which is allowing the rising sea level to soak into the city’s foundation, bubble up through pipes and drains, encroach on fresh water supplies, and saturate infrastructure. According to the US Government’s National Climate Assessment, the sea level around Miami could rise up to 2 ft. (0.6m) by 2060. Broward County officials estimate a 1 ft. (0.3m) sea level rise will threaten $4 billion of south Florida’s property base.

Most low-lying east coast communities are working on flood mitigation plans. Some are more advanced than others. We hope they complete those plans and put them in place in time to keep their cities dry.


Is El Niño Back?

For the past few years La Niña has been the dominant weather driver, bringing cold, wet winters to the northern tier of states in the US, and drought to much of the southwest, including Texas, Oklahoma, and Colorado. In 2011, the drought in Texas and the Southwest expanded into the southern portion of the Midwest, greatly reducing the production of corn and soybeans.

Except for 2013, the Atlantic and Gulf Coast hurricane season was active during the La Niña years, including the devastating Hurricane Katrina in 2005 and the highly destructive Superstorm Sandy in 2012.

Now, according to the scientists at NOAA, there is a 65% chance that La Niña will recede, and El Niño will be back with us this summer. NOAA’s forecast is based on the observed warming of the surface waters in the equatorial Pacific Ocean. At the moment, the surface waters are cool enough to be declared ENSO neutral, meaning neither too hot nor too cold. But the subsurface water temperatures are warming rapidly. The warm subsurface water is expected to rise to the surface, creating an increase of at least 0.5°C (0.9°F) in the surface temperature. This small rise in the surface temperature of the tropical Pacific is called El Niño, and it sets in motion a whole new set of global weather patterns.
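
The ±0.5°C anomaly threshold can be expressed as a simple classifier. This is a deliberately simplified sketch: real NOAA declarations use three-month running means of sea-surface temperatures in the Niño 3.4 region, sustained over several overlapping seasons.

```python
# Simplified ENSO classification from a sea-surface-temperature anomaly
# in the equatorial Pacific. Real declarations use three-month running
# means sustained across several seasons; only the 0.5 C thresholds
# described above are applied here.
def enso_phase(sst_anomaly_c: float) -> str:
    if sst_anomaly_c >= 0.5:
        return "El Nino"
    if sst_anomaly_c <= -0.5:
        return "La Nina"
    return "ENSO neutral"

for anomaly in (-0.8, 0.1, 0.6):
    print(f"{anomaly:+.1f} C  ->  {enso_phase(anomaly)}")
```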

If El Niño arrives as expected, the northern tier of US states will become warmer and drier. The Southwestern states and the southeast will be cooler and wetter. The extra rain will help alleviate the long standing drought that has plagued Texas and the Southwest for the past several years, but will probably not be enough to end it.

El Niño will also have an impact on the 2014 hurricane season. The Atlantic hurricane season will become quieter, with fewer hurricanes and even fewer making landfall. El Niño causes the surface water in the Atlantic to cool and develops strong wind currents off Africa, making it harder for hurricanes to form. For the season which starts on June 1, NOAA predicts a 70% chance of 8 to 13 named storms with winds of 39 mph (63 km/h) or higher, of which 3 to 6 could become hurricanes with winds of 74 mph (119 km/h) or higher, including 1 or 2 major Category 3, 4, or 5 hurricanes with winds of 111 mph (178 km/h) or higher. All are below seasonal norms.

Keep in mind that these forecasts, both for El Niño and for the Atlantic hurricane season, are based on current observations and computer modeling. Computer models sometimes get it wrong, as happened when NOAA predicted an active 2013 hurricane season that turned out to be very quiet. At this point it looks like El Niño is coming back and it looks like a quiet hurricane season, but we will have to wait and see.

Tornadoes — Storms of Mystery

An outbreak of 69 confirmed tornadoes during the last four days of April, 2014, took 35 lives and caused over $1 billion in property damage. Two Arkansas towns north of Little Rock – Vilonia and Mayflower — were the hardest hit. The rash of tornadoes also devastated communities in Oklahoma, Kansas, and Texas.

Tornadoes kill an average of 60 people a year in the US, according to NOAA. This varies greatly by year. 2011 was one of the most destructive and deadly on record. An EF5 tornado with wind speeds of 200 mph (322 km/h) wiped out much of Joplin, Missouri, killing 162. Earlier that year, an EF4 struck Tuscaloosa, Alabama, killing 65 and leveling a wide path through part of the city. Total fatalities for all tornadoes that year were 551, with damages estimated at $28 billion.

Tornadoes are a product of high-energy cloud formations called supercells. In the spring, when warm, moist air flows into the Midwest and Southeast from the Gulf of Mexico, it rises and mingles with layers of cooler, drier air coming in from Canada and the mountain west. The moisture in the warm air condenses when it meets the cool air, forming cumulus clouds. Rising convection currents create energy and instability inside the cumulus formation. When the energy level climbs high enough, a rotating updraft, or mesocyclone, develops and the storm formation becomes a supercell. In some cases, the energy moves vertically down from the base of the supercell to the ground in the form of a spinning vortex.

There are several mysteries about tornadoes. Scientists know generally how they form and what happens once they do, but do not know why some storm clouds morph into supercells and most do not. Also, once a cumulus buildup turns into a supercell, why do 30% produce tornadoes, and 70% only rain or hail? Even though the National Weather Service has gotten quite good at forecasting tornadoes in a specific area, the behavior of a tornado once it touches down is not always predictable. Tornado paths range in width from 100 yards (91m) to 2.6 mi (4.3km), and in length from 10 miles (16km) to hundreds of miles. They can last from a few seconds to more than an hour. They generally move across the land in a northeasterly direction at between 30 mph and 70 mph (48 to 113 km/h).

Not all states in “Tornado Alley” have building codes that require storm shelters in schools and hospitals, where many of the casualties have occurred in past tornadoes. If more states included that in their building codes, it would undoubtedly save lives in the future.


Are Volcanoes Slowing Global Warming?

In 2013, the world pumped 36 billion metric tons (40 billion US tons) of CO2 into the atmosphere through the burning of fossil fuels. Such emissions form a carbon dioxide blanket that allows sunlight to penetrate, but prevents much of the surface heat from radiating back into space. As a result, the oceans are rapidly warming, arctic sea ice is diminishing, and glaciers and the Greenland Ice Sheet are melting at a record rate.

The world’s surface temperature has steadily increased for the past 150 years, and it was assumed the curve would keep climbing at the same rate. But unexpectedly, global surface temperature peaked at a record high in 1998, then flattened out and has remained about the same since, raising questions in the scientific community.

A Lawrence Livermore National Laboratory study that appeared in the Feb. 23, 2014, issue of the journal Nature Geoscience suggests one reason for this unexpected development is the higher than normal rate of volcanic activity over the past 15 years. Eruptions during that period included 17 ranked VEI 4 on the Volcanic Explosivity Index. A VEI 4 is termed cataclysmic and sends an ash plume 10 to 25km (6 to 15 mi) into the air, sufficient to penetrate the stratosphere with sulfur dioxide aerosols that remain there for months, even years.

“In the last decade the amount of volcanic aerosol in the stratosphere has increased, so more sunlight is being reflected back into space,” said Lawrence Livermore Climate Scientist Benjamin Santer, lead author on the study. “This has created a natural cooling of the planet and has partly offset the increase in surface and atmospheric temperatures due to human influence.” The paper states the research team found evidence for significant correlations between volcanic aerosol observations and satellite-based estimates of lower tropospheric temperatures, as well as sunlight reflected back into space by the aerosol particles.

Santer’s conclusions seem to be supported by an earlier study by the University of Saskatchewan. In this study, the researchers found that sulfur dioxide aerosols from a very small African eruption had “hitchhiked” their way into the stratosphere. Warm air rising from the seasonal Asian Monsoon lifted the volcano’s aerosols from the lower atmosphere into the stratosphere, where they were detected by OSIRIS, a Canadian Space Agency instrument aboard the Odin satellite designed specifically to measure atmospheric aerosols. Even though it came from a small eruption, the concentration of particles was the largest load of SO2 aerosol ever recorded by OSIRIS in its 10 years of operation.

The Lawrence Livermore paper suggests that one other possible contributor to the temporary cooling effect is the unusually long and low minimum in the solar cycle. Don’t be surprised to see surface temperatures start climbing again when volcanic activity subsides and the cooler phase of the solar cycle concludes.


Why Chile Has So Many Earthquakes & Tsunamis

The Magnitude 8.2 quake that struck off the coast of Chile on April 1, 2014, was the latest in a series of major earthquakes and tsunamis to hit that area in recent years. The undersea quake and resulting 7 ft. (2.1m) tsunami killed 7, toppled buildings, and severely damaged the Chilean fishing fleet. Earthquake/tsunami events in 2010 (M8.8), 2007 (M7.7), 2005 (M7.8), and 2001 (M8.4) killed more than 1,000 and inflicted billions of dollars in property damage.

The most powerful earthquake ever recorded, a Magnitude 9.5, hit the coast of Chile on May 22, 1960. The monster quake triggered an 82 ft (25m) tsunami that not only battered the west coast of South America, but rolled across the Pacific Basin, devastating Hilo, Hawaii, and damaging coastal villages as far away as Japan and the Philippines. Some sources estimate 6,000 dead and $800 million in property loss ($6 billion in 2014 dollars).

Why does this area of planet earth spawn so many high-magnitude earthquakes and punishing tsunamis?

One explanation is that the collision of the two tectonic plates that meet off the South American west coast occurs, in geologic terms, at a very high rate of speed. The oceanic Nazca Plate and the continental South American Plate converge in the Peru-Chile trench that lies about 100 mi (160km) off the coast. The overriding South American Plate moves westward at 10cm a year, while the subducting Nazca Plate pushes east at 16cm/y, a closing velocity of 26cm/y (about 10 in.), one of the fastest plate convergence rates on earth. The Africa Plate, for example, moves at roughly one-seventh that speed.

This high closing velocity builds up fault line strain much faster than it does when slower-moving plates converge. Every few years, tension on the Peru-Chile fault line builds up to a breaking point. In this latest earthquake on April 1, a 100 mi. (160km) section of the fault line ruptured, allowing the Nazca Plate to ram under the South American Plate. This sudden violent action 12.5 mi (20.1km) below the ocean floor triggered the 8.2 earthquake and its tsunami, and at the same time wedged the South American Plate higher. Uplift from frequent fault line failures continues to build the Andes Mountain Range into one of the highest in the world. During the 1960 M9.5 quake, some coastal areas uplifted as much as 10 ft. (3m).
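
The connection between a fast closing velocity and frequent large quakes can be made concrete with simple slip-deficit arithmetic: a locked fault accumulates a slip deficit equal to the convergence rate times the time since the last rupture. The sketch below uses the 26cm/y figure quoted above; the elapsed-time input is illustrative.

```python
# Slip-deficit arithmetic for a locked subduction fault:
# accumulated slip deficit = convergence rate x time since last rupture.
# The 26 cm/yr closing velocity is the figure quoted above; the number
# of locked years is an illustrative input, not data from this article.
def slip_deficit_m(closing_cm_per_yr: float, years_locked: float) -> float:
    return closing_cm_per_yr / 100.0 * years_locked

# Example: four years of accumulation at 26 cm/yr.
print(f"Accumulated slip deficit: {slip_deficit_m(26, 4):.2f} m")
# Prints: Accumulated slip deficit: 1.04 m
```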

As long as the two tectonic plates that meet off the South American coast move geologically at such high speed, major earthquakes and tsunamis will keep happening. We hope the zoning laws and building codes put in place by the governments of Chile and Peru will keep the damage and loss of life to a minimum.  

Why Did the Hill Come Down?

As of this writing, 21 people have been confirmed dead and 30 are missing in the disastrous March 22, 2014, Oso, Washington mudslide. We send our condolences to all those affected by this terrible tragedy.

At the same time, we have to ask ourselves why a forest-covered mountainside would suddenly shear off and bury an entire community of 30 homes under a 1 square mile (2.6km²) mud and debris slide 40 ft (12m) deep.

Two main reasons have been given. One is that the hill had become saturated after weeks of heavy rainfall. The rainfall in that area during the month of March was 200% of normal. Although the soil there is compacted clay that tends to be impermeable, it is believed there were cracks at the top that allowed the rain to penetrate. The other reason for the failure is that the swollen Stillaguamish River at the base was undercutting the toe of the hill. With the base of the hill weakened and the slope heavy with soaked-in rain, the hill collapsed.

After a number of landslides had been reported in that area during the prior 40 years, the US Army Corps of Engineers did a survey there in 1999 and issued a report warning of “the potential for catastrophic failure.” In 2006, a section of that same hill collapsed and blocked the course of the river. Other state and local agencies had examined the hill at various times and all concluded it was unstable. Whether the permit-issuing authorities were aware of those findings is not known. What is known is that building permits for that location continued to be issued, even after the 2006 slide.

The last compilation of world landslide statistics was posted by the American Geophysical Union for the year 2010. In that year, 6,211 people died in 494 landslide events worldwide. 83,275 landslide deaths were reported for the period September, 2002 to December 2010, an average of a little more than 10,000 a year. People living in the mountains of China, India, Central America, the Philippines, Taiwan, and Brazil were the most vulnerable during that period. Landslides and mudslides often occur when intense rainfall from tropical storms and monsoons saturates hillsides that have been compromised by logging, farming, and construction. Although not as dramatic as earthquakes and tsunamis, landslides may be the most costly of all natural disasters in loss of life and property.

In the United States, landslide fatalities average between 25 and 50 a year, according to the Centers for Disease Control and Prevention. Using airborne Lidar, a laser-based mapping system, it is now possible to set up a national data bank on areas throughout the US that are susceptible to hillside failure, but it would be a long and very costly project. Until such a survey is done, local jurisdictions will have to rely on other methods to determine landslide-prone areas. Even knowing the possible dangers, people will still build homes below unstable hillsides, in fire areas, and flood plains. It is up to local zoning authorities to prohibit building in these hazardous places.


Offshore Wind Farms

Constant winds in coastal waters make offshore wind farms highly productive. Most offshore wind turbines are installed on pilings in shallow waters within a few miles of the shoreline, but there are some on floating platforms farther offshore.

The United Kingdom’s 20 offshore wind farms supplied 10% of that nation’s total electrical power production in January, 2014, and 11% in February. Britain is the world leader in the number of wind farms located in coastal waters, and in the total amount of energy they produce. Germany, the Netherlands, Denmark, Belgium, and Sweden are close behind with another 58 offshore wind farms, and dozens more under construction or in the planning stage. Offshore wind farms are projected to produce 4% of total European power by 2020, and 15% by 2030.

The US leads the world in the amount of energy produced by wind turbines: 120 billion kilowatt hours in 2013, representing more than 4% of US electricity generation. However, all US wind farms are currently land based; at this time, the US has no offshore wind farms. Plans are on the drawing board and permits have been granted for offshore wind farms in Massachusetts, New Jersey, Rhode Island, and Oregon, but so far no construction work has started. Reasons given are reluctance to increase the cost to the rate payer, and NIMBY (not in my backyard) campaigns by homeowners and environmental groups.

The US Atlantic and Gulf coasts provide more suitable sites for offshore installations than the Pacific Coast, because of a longer and shallower slope out to the edge of the continental shelf. In some areas of the Atlantic coast, shallow waters extend out as far as 200km (125 mi). The continental shelf drop-off to deep water on the Pacific coast is steeper and more abrupt, and not as suitable for shallow water farms. A Seattle company has obtained a lease from the Dept. of the Interior for 15 square miles of federal waters off Coos Bay, Oregon, for a wind farm on floating platforms anchored by cable to the ocean floor.

Could a massive offshore wind farm project also serve as a buffer against hurricanes and storm surges? Yes, according to a study by Mark Jacobson, professor of civil and environmental engineering at Stanford, and two co-authors, published in the journal Nature Climate Change. In the study, the researchers used computer simulations of Hurricanes Katrina, Sandy, and Isaac to determine the effect of massive offshore wind farms on wind speed and storm surge. In the case of Katrina, the researchers found that an array of 78,000 turbines in coastal waters would have reduced wind speed at landfall by 65% to 78%, and storm surge by 79%. Similar results were obtained for Sandy and Isaac. It is not likely that 78,000 turbines will ever be installed offshore in one farm, but if that had been the case, and if the researchers’ conclusions are correct, it would have brought Katrina’s wind speed down to 28 to 44 mph from 125 mph, saving thousands of lives and $100 billion in Gulf Coast reconstruction. Also, that many turbines would produce hundreds of thousands of megawatts of clean power. It’s something to think about.
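
For a sense of scale, the sketch below works out the nameplate capacity and rough annual output of such an array. The per-turbine rating and capacity factor are assumed round numbers, not figures from the study.

```python
# Rough scale of a hypothetical 78,000-turbine offshore array.
# The per-turbine rating and capacity factor are assumed values,
# not numbers taken from the Jacobson study.
N_TURBINES = 78_000
RATING_MW = 5.0          # assumed nameplate rating per offshore turbine
CAPACITY_FACTOR = 0.40   # assumed average output vs. nameplate

nameplate_gw = N_TURBINES * RATING_MW / 1000
annual_twh = nameplate_gw * CAPACITY_FACTOR * 8760 / 1000  # 8,760 h/yr

print(f"Nameplate capacity: {nameplate_gw:.0f} GW")        # 390 GW
print(f"Approximate annual output: {annual_twh:.0f} TWh")  # about 1,367 TWh
```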


Sun, Wind, & Fresh Water

Converting ocean water into fresh water is energy intensive, and therefore expensive. Saudi Arabia is a desert kingdom with plenty of oil but very little fresh water. The Saudis burn 1 million barrels of oil a day to produce 60% (4 billion cubic meters) of its total fresh water supply through desalination. If exported onto the world market, those 1 million barrels of oil would bring Saudi Arabia $115 million a day, but it is worth it to them to forgo the profits and have the fresh water. From an environmental standpoint, burning 1 million barrels of oil a day sends close to a half million tons of CO2 emissions into the atmosphere every day, contributing greatly to the pace of global warming.

To deal with these problems, the Saudis have joined with IBM to build a series of solar-powered desalination plants that could by mid-century produce a large share of the kingdom’s water needs.

However, the largest solar-powered desalination plant yet designed will be built in the United Arab Emirates. The Ras Al Khaimah plant, scheduled to start production in 2015, will produce 100,000 cubic meters (approx. 22 million gallons) of fresh water a day, and in addition provide 20 megawatts of electrical power. The developers estimate they will be able to deliver water at a cost of $0.75 per cubic meter. The average cost per cubic meter of water delivered to households in the United States runs between $0.35 and $0.40. Most of the desalination plants run by solar energy are situated in the Middle East, where there is an abundance of year-round sun and a scarcity of water.

The largest desalination plant run by wind power is near Perth in Western Australia. The Kwinana Desalination Plant produces 144,000 cubic meters of water a day (approx. 38 million gallons), about 17% of Perth’s water supply. The Kwinana plant is powered by the 80-megawatt Emu Downs wind farm located 200 miles away. Because electrical power has to be supplied evenly 24/7, and because the wind stops blowing from time to time, the power from the wind farm goes into the grid on a trade-off basis. The wind farm contributes 270 gigawatt-hours a year to the power grid, more than offsetting the 180 gigawatt-hours a year required to operate the desalination plant. There are a number of smaller desalination plants run by wind-generated electrical power that goes directly from the wind farm to the plant, but Perth has opted for the offset arrangement.
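
Those two figures imply the plant's energy intensity, which can be checked with a few lines of arithmetic. Only numbers from the paragraph above are used; the observation that the result is typical of seawater reverse osmosis is general background, not a claim from this article.

```python
# Implied energy intensity of the Kwinana plant, from the figures above:
# 180 GWh of electricity per year for 144,000 m^3 of fresh water per day.
annual_energy_kwh = 180e9 / 1000    # 180 GWh expressed in kWh
annual_water_m3 = 144_000 * 365     # cubic meters produced per year

intensity = annual_energy_kwh / annual_water_m3
print(f"Energy intensity: {intensity:.1f} kWh per cubic meter")
# Prints about 3.4 kWh/m^3, a typical figure for seawater reverse osmosis.
```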

Most desalination plants are still operated with grid power generated by coal, oil, or natural gas because it is less expensive than spending hundreds of millions to construct solar arrays or wind farms. For example, Australia’s other desalination plants providing fresh water to Sydney, Melbourne, Adelaide, and other coastal areas use fossil fuel power from the grid. But more and more, new desalination plants around the world are being planned to operate on alternative power. At some point in the future, all our electricity will have to come from those sources.


Crazy Weather & Global Warming

In the first 6 weeks of 2014, the world spawned some of the most severe weather in hundreds of years, including record snowfall in the Midwest and Great Lakes, record cold in the US northeast, ice storms in the southeast, record drought in the southwest, record flooding and windstorms in the UK, unseasonal warming in Scandinavia and Russia, record snowfall in the southern Alps, record flooding in Italy, and record heatwaves and wildfires in Australia, Argentina, and Brazil.

Despite the record snow, ice, and freezing temperatures in some areas, the world continued its long term upward warming trend. NOAA reported that 2013 tied with 2003 as the fourth warmest year on record. What’s going on?

According to a paper presented this month at a meeting of the American Assn. for the Advancement of Science in Chicago, a weakening jet stream caused by Arctic warming is a possible cause. The polar jet stream is a high-altitude air current with wind speeds of 100 to 120 mph (160 to 200kph) that acts as a weather conveyor belt. When Arctic temperatures stay cold, the jet stream blows stronger and tends to stay in place, bringing normal winter weather to North America, Europe, and Asia.

In January, 2014, the air temperature over the Arctic Ocean was 2 to 4˚C (4 to 7˚ F) higher than average, and 7 to 8˚C (13 to 14˚ F) higher than average over Greenland and Alaska. As the Arctic warms, the jet stream weakens and begins sinking south of its polar route. At the same time, Arctic sea ice is melting at a record rate, exposing more ocean to the rays of the sun. The warmer ocean water in turn accelerates Arctic warming. More rapid evaporation pumps extra moisture into the atmosphere.

A sinking jet stream carries the moisture-laden high-altitude cold Arctic air south into the Midwest and southeast, and across the Atlantic to Europe. While southern Europe is experiencing record rains and snowfall, northern Europe, normally very cold in January and February, is basking in abnormally warm temperatures. With the glaciers and polar ice caps melting at a record rate, sea ice contracting, and oceans warming, it seems clear that global warming is here, and to some extent is driving the world’s current extreme weather patterns. The weather will become more erratic and storms more intense as the earth gets warmer.

But what is driving global warming? The UN’s Intergovernmental Panel on Climate Change (IPCC) has concluded from all available scientific evidence that it is 95% likely that most of the rise in global temperature since the middle of the 20th Century is due to emissions of greenhouse gases, deforestation, and other human activities.

If greenhouse emissions continue at their present rate, the IPCC computer models predict our planet will warm 5˚C (9˚F) by 2100, and by 10˚C (18˚F) during the following century. The earth is now warmer than it has been since the end of the last ice age 11,300 years ago. If we don’t drastically reduce our carbon-based emissions and start relying more on alternative fuels, are we headed for another age hot enough for dinosaurs?


Natural Disasters 2013 Review

According to figures released by the German reinsurer Munich Re, twice as many people died in natural disasters in 2013 as in the prior year, but property damage and insurance losses were significantly less.

Munich Re reports 880 natural disaster events in 2013, costing $125 billion in total losses, compared to $173 billion in 2012, and insured losses of $31 billion, about half the insured costs of the year before. However, more than 20,000 people died in natural disasters in 2013, twice the number of deaths reported for 2012. Here are some of the most costly natural disasters of 2013, in either lives or property losses.

Earthquakes: Magnitude 7.0 to 7.7 quakes struck China in April, Pakistan in September, and the island of Bohol in the Philippines in October, killing 1,300 and destroying tens of thousands of homes. Damage amounts were not available.

Tornadoes: On May 20, an EF-5 tornado with a wind speed of 210 mph (340 km/h) ripped through the town of Moore, Oklahoma. The tornado, 1.3 miles (2km) wide, stayed on the ground for 40 minutes on a 17-mile (27km) path of destruction. 1,150 homes were wiped out and 24 people died, including 7 children in a local school. Total damage was more than $2 billion.

Floods: Flooding in India, Central Europe, Canada, Mexico, and Colorado resulted in a combined death toll of 7,000 and damages exceeding $30 billion. The European flooding was called the worst since the Middle Ages. Most of the deaths occurred in flash floods and landslides in the mountains of northern India and Nepal.

Meteor strike: A 13,000 ton meteor traveling at 60 times the speed of sound streaked into earth’s atmosphere on Feb. 15 and exploded in a fireball over Russia’s Chelyabinsk region, in the Ural Mountains. The shock wave damaged 7,200 buildings and injured 1,500 people. The injuries were mainly from flying glass from blown-out windows. Fortunately, there were no reported deaths.

Wildfires: Brush fires in Australia and California scorched hundreds of thousands of acres. In October, Australian firefighters fought 66 brush fires along a line that stretched for 1,000 miles (1,600km). In California’s Sierra Nevada Mountains, the Rim Fire that started in August was not put out until mid-October, after burning 257,000 acres of heavily forested watershed.

Typhoons: Super Typhoon Haiyan struck the Philippines island of Leyte on November 8 with wind speed of 195 mph (320km/h), the strongest ever recorded for a tropical cyclone making landfall. A 20-ft (6m) tidal surge wiped out the city of Tacloban. More than 6,000 people lost their lives in the storm. Total cost has been estimated at up to $15 billion.

While the Pacific typhoon season was quite active, with 31 tropical storms, of which 13 were typhoons and 5 were super typhoons, the Atlantic hurricane season was much quieter than expected, with no major storms. The first few weeks of 2014 have also been relatively quiet, with the exception of the Mt. Sinabung volcano eruptions in Indonesia, during which 14 people have died and 20,000 have been evacuated.  Inevitably, there will be more natural disasters in the months ahead. We will have to wait and see what the rest of 2014 will bring.


When Volcanoes Endanger Aircraft

According to a report issued by the U.S. Geological Survey, there were 94 confirmed ash-cloud encounters by aircraft between 1953 and 2009. 79 of those produced various degrees of engine or airframe damage, 26 involved significant to very severe damage, and 9 caused engine shutdown during flight.

Two of the best known incidents involved passenger jets flown by KLM and British Airways. On June 24, 1982, British Airways Flight 9, flying at 37,000 ft. (11,000m) from London to Auckland, New Zealand, with 248 passengers and a crew of 15, entered an ash cloud rising from the erupting Mt. Galunggung volcano in Indonesia. All 4 engines flamed out as the silica in the volcanic ash melted inside the engines and coated everything with glass. The plane dropped 23,500 ft. (7,200m) before the crew was able to restart 3 of the engines and make an emergency landing in Jakarta.

On December 15, 1989, KLM Flight 867 from Amsterdam to Tokyo flew through a thick ash cloud from Alaska’s Mt. Redoubt volcano as the 747 started its descent into Anchorage. All 4 engines failed, and the plane lost 14,000 ft. (4,300m) in altitude before the crew could restart the engines and make a safe landing. The ingested ash caused $80 million in damage to the aircraft, including replacement of all 4 engines. The expertise of the air crews in both cases averted what could have been disastrous crashes.

The aviation industry learned from those incidents and started grounding all flights when volcanic ash was present. That’s why most European and North Atlantic flights were cancelled between April 15 and April 20, 2010, when Iceland’s Eyjafjallajökull volcano erupted, ejecting 250 million cubic meters (330 million cubic yards) of volcanic ash into the atmosphere. The ash cloud drifted south and east, covering the sky over the North Atlantic and most of Europe. Many thousands of passengers were stranded in European airports for up to 5 days.

Ash clouds are hard to distinguish from moisture clouds either visually or by radar. That’s why aircraft continue to wander into them, and why the United Nations has set up a network of Volcanic Ash Advisory Centers (VAAC). There are 9 centers located around the world, each covering a geographic region: Alaska, Argentina, Australia, Canada, England, France, Japan, New Zealand, and Washington, DC. When an eruption produces an ash cloud, the VAAC in that area uses a computer model to predict the path of the cloud at different flight levels and issues an international alert. Fewer incidents have been reported since the centers have been in full operation.

On average, 15 major explosive volcanic eruptions powerful enough to eject tons of ash into the stratosphere occur each year. A sudden Mt. St. Helens or Mt. Pinatubo type of super explosion can eject massive amounts of ash into the stratosphere in minutes, creating unexpected hazardous conditions. Air crews must stay ready to act immediately on VAAC ash alerts, and take the necessary evasive action to keep their flights safe and uneventful.  


The Next Tsunami — Where?

According to USGS, two North American fault line systems are at a critical stage. In a December 29, 2013, news release, USGS states that enough strain may be currently stored in an earthquake zone near the Caribbean island of Guadeloupe to cause a magnitude 8 or larger earthquake and subsequent tsunami. The release goes on to say that USGS and French researchers studying the plate boundary where 20 of the 26 Caribbean islands are located estimate that enough unreleased strain may have accumulated to create a magnitude 8.0 to 8.4 earthquake. A magnitude 7.5-8.5 quake in the same area in 1843 killed thousands in Guadeloupe. A similar quake in the future could cause many hundreds of fatalities and hundreds of billions of US dollars in damages. An accompanying tsunami could inflict an even higher toll.

The other fault zone considered to be due for a major failure lies off the northwestern US coastline. The Cascadia Subduction Zone runs 1,100km (700 mi) from Vancouver Island in British Columbia to Cape Mendocino in northern California. Recent studies indicate that a 60km (40 mi) segment of the fault off the coast of Washington is locked. In geological terms, locked means a point where the converging plates have been pressing together without releasing energy, perhaps for hundreds of years. The strain constantly builds until the fault’s frictional strength is exceeded and it finally ruptures.

The last major earthquake and tsunami on the Cascadia struck in 1700. That 9.0 quake triggered a tsunami that flattened trees many miles inland in Washington state, and rolled across the Pacific to inflict damage on Japanese coastal villages. The northwest was sparsely inhabited at that time, so there were no known casualties. A similar earthquake and tsunami today could be catastrophic. A study commissioned by the Oregon legislature concluded that in Oregon alone a Cascadia 9.0 earthquake and tsunami could kill 10,000 and cost $30 billion in damages.

Megathrust earthquakes and tsunamis have occurred on the Cascadia every 300 to 600 years, and it has been a little over 300 years since the last one. The Oregonian newspaper recently reported that some geologists are predicting a 10% to 14% probability that the Cascadia will produce a magnitude 9.0 or greater earthquake within the next 50 years. An article in Science Daily suggests that the risk could be as high as 37% for a magnitude 8.0 or greater in the same period.

Still, it’s impossible to say where or when the next big one will strike. Even though the Caribbean and Cascadia faults appear ready to go, the 4 ocean trench fault zones that have produced the biggest earthquakes and tsunamis of the recent past should not be ruled out. The Japan Trench off the northeastern coast of Honshu produced the 9.0 quake in 2011 that killed 20,000. The 2004 Indian Ocean 9.1 earthquake and tsunami that killed more than 200,000 started in the 2,600km (1,600 mi)-long Sunda Trench. The Great Alaska Earthquake, a magnitude 9.2 that struck on Good Friday in 1964, originated in the Aleutian Trench. The Atacama Trench off the coast of South America generated the largest earthquake on record, a magnitude 9.5 that struck off the coast of Chile in 1960, killing 5,000 and sending a tsunami speeding thousands of miles across the Pacific Ocean. These 4 ocean trench fault zones mark the convergence of highly active tectonic plates. All are part of the Pacific Ring of Fire.

Will Yellowstone Erupt?

The magma chamber that powers Old Faithful and the other geysers, hot springs, fumaroles, and mud pots of Yellowstone National Park is considered by scientists to be the largest in the world. And a new study by researchers at the University of Utah finds that the chamber underlying Yellowstone is far larger than originally thought, in terms of both size and the amount of molten rock it contains.

According to the study, the Yellowstone Volcano magma chamber is 2.5 times larger than earlier estimates. By using a network of seismometers situated around the park, the research team found that the magma cavern is 90km (55 mi) long, 30km (20 mi) wide, and up to 15km (10mi) deep, containing up to 600 cubic km (144 cubic mi) of hot gas and molten rock.

Geologic research indicates the Yellowstone Volcano erupts roughly every 700,000 years. In the last three events – 2.1 million, 1.3 million, and 640,000 years ago — the magma chamber emptied out in a single violent volcanic blast. Millions of tons of rock, sulfur dioxide, and ash rocketed into the atmosphere, blocking sunlight around the world. The empty chamber collapsed, forming a geographic depression or caldera, and the land for thousands of miles around was blanketed with a thick coat of ash.

The park floor has been rising as the magma chamber continues to swell. Between 2004 and 2009, Yellowstone’s ground uplifted 20cm (8 in), but since 2010 the uplift has continued at a slower pace. The park experiences between 1,000 and 3,000 earthquakes a year as the magma moves into the chamber. Most are less than Magnitude 3.0 and are seldom felt by park visitors. Scientists believe the next supereruption will occur sometime in the next 40,000 years. When and if it blows, it will cause disastrous damage and loss of life in a wide area around the volcano.

Yellowstone sits atop a volcanic hotspot, a pocket deep in the earth that sends a plume of molten rock and hot gas rising into a magma chamber just below earth’s crust. Both the hotspot and the magma chamber are stationary, but the North American Plate, the section of crust upon which Yellowstone is situated, constantly moves southwesterly at 2.5cm (approx. 1 in) a year. Over the past 16.5 million years, as the North American Plate has slowly moved over the hotspot, 15 to 20 massive eruptions have left immense craters dotting the landscape from the Nevada-Oregon border through Idaho’s Snake River Plain. Plate movement eventually positioned the hotspot and magma chamber under Yellowstone. Over the next 16 million years, plate movement will progressively move the hotspot under Montana, North Dakota, and Canada. As the North American Plate moves Yellowstone away from the hotspot over the expanse of geologic time, the park’s geysers will gradually die.

But for now the park’s thermal features remain alive and well and will stay that way over the next few million years. Although the possibility of a blowout remains, USGS and National Park Service scientists with the Yellowstone Volcano Observatory state that they “see no evidence that another such cataclysmic eruption will occur in the foreseeable future.”

Tsunami & Earthquake Networks

Someplace on earth the ground is shaking. According to USGS estimates, there are an average of 1,300,000 earthquakes on our planet every year, or one every 24 seconds. 98% of those quakes are under magnitude 4.0 and many occur in remote locations, so most of us are unaware of the constant seismic activity, even when it happens close by.
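
The "one every 24 seconds" figure follows directly from the annual count, as this short check shows.

```python
# Sanity check on the "one every 24 seconds" figure quoted above.
SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000 seconds
QUAKES_PER_YEAR = 1_300_000          # USGS estimate cited above

print(f"One earthquake every {SECONDS_PER_YEAR / QUAKES_PER_YEAR:.0f} seconds")
# Prints: One earthquake every 24 seconds
```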

However, between 1,500 and 2,000 quakes a year are in the magnitude 5.0 to 9.0 range. Those are the quakes that can do damage on land, and possibly trigger a tsunami if one strong enough strikes the seafloor where tectonic plates converge.

Where do USGS and other reporting centers get their real time information? Two worldwide seismic hazard networks report earthquakes as they happen, and provide early warning when a tsunami starts rolling toward land.

Global Seismographic Network (GSN) is a permanent digital network of 150 land-based and ocean-bottom seismometers positioned in earthquake prone locations around the world, and connected by a telecommunications network. GSN is a partnership among USGS, the National Science Foundation, and Incorporated Research Institutions for Seismology (IRIS), a consortium of 100 worldwide labs and universities. Although US based, GSN is fully coordinated with the international community. GSN stations are operated by USGS and UC San Diego. The network determines location and magnitude of earthquakes anywhere in the world as they happen. The data is used for emergency response, hazard mitigation, research, and tsunami early warning for seafloor locations.

Deep Ocean Assessment and Reporting of Tsunamis (DART) is the main component of an international tsunami warning system. The DART system is based on instant detection and relay of ocean floor pressure changes. DART stations consist of an ocean bottom sensor that picks up changes in pressure as the tsunami wave passes and sends the data to a nearby communications buoy, which transmits it to a satellite, which in turn relays it within seconds to tsunami warning centers around the world.
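
In outline, the detection logic is a comparison of observed bottom pressure against a predicted tidal baseline. The sketch below is a heavily simplified illustration of that idea; the 3cm threshold, the two-sample persistence rule, and the sample readings are all invented for the example, not DART's actual parameters.

```python
# Heavily simplified sketch of DART-style detection: compare bottom-pressure
# readings (expressed as water-column height in meters) against a predicted
# tidal baseline and flag sustained anomalies. The threshold, persistence
# rule, and sample data are illustrative, not DART's real values.
THRESHOLD_M = 0.03  # flag deviations larger than 3 cm of water column

def tsunami_detected(observed_m, predicted_tide_m):
    anomalies = [abs(o - p) for o, p in zip(observed_m, predicted_tide_m)]
    # Require two consecutive out-of-range samples to reduce false alarms.
    return any(a > THRESHOLD_M and b > THRESHOLD_M
               for a, b in zip(anomalies, anomalies[1:]))

observed  = [4000.00, 4000.01, 4000.06, 4000.09, 4000.02]
predicted = [4000.00, 4000.01, 4000.01, 4000.02, 4000.02]
print("Tsunami alert!" if tsunami_detected(observed, predicted) else "All clear")
```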

The US has deployed 39 DART stations in the Pacific, Atlantic, and Caribbean. Australia and Peru have also installed DART systems, and since the 2004 Indian Ocean tsunami that killed over 200,000 people, the nations bordering the Indian Ocean have cooperated in the installation of 6 Indian Ocean DART stations, along with 17 seismic satellite stations. The DART data, along with GSN and satellite data, flow into two major tsunami warning centers: the Pacific Tsunami Warning Center in Ewa Beach, Hawaii, and the West Coast and Alaska Tsunami Warning Center in Palmer, Alaska. It is the job of the tsunami warning centers to issue alerts and warnings to population centers in the path of a developing tsunami.

Although the GSN and DART systems have proved effective, NASA is testing a GPS-based system that can pinpoint an earthquake’s epicenter and magnitude 10 times faster, giving those in peril extra seconds and minutes to evacuate before the tsunami strikes land.


Storm Surge — the Big Killer

When a hurricane strikes land, the storm surge can be more deadly than the storm’s violent wind. Tropical cyclones – called hurricanes in the Atlantic, typhoons in the Pacific, and cyclones in Australia and India – have killed over 1 million people in the past hundred years. The majority of those deaths are attributed to the surge component of the storm.

Typhoon Haiyan hit the Philippine city of Tacloban on November 8, 2013, with a wind speed of 195 mph (315 km/h), the strongest landfall speed ever recorded. Over 5,000 died and the city was leveled. The savage wind took its toll, but it was the 20 ft. (6.1m) wall of ocean water surging more than a mile (1.6 km) inland that took most of the lives.

When Superstorm Sandy came ashore in New Jersey and New York in late October, 2012, the wind speed was only 115 mph (185 km/h), but the storm was so massive it pushed a 14 ft. (4.3m) storm surge far inland, killing more than 100 and wiping out or badly damaging thousands of homes. Reconstruction costs have reached $70 billion.

In August, 2005, Hurricane Katrina, a Category 3 with a wind speed of 120 mph (193 km/h), struck New Orleans and Gulf Coast cities in Louisiana, Mississippi, and Alabama. Although the wind did some damage, it was the storm surge, with waves as high as 28 ft. (7.5m), that wiped out shoreline communities, breached New Orleans’ levees, flooded the city, and caused most of the 1,800 deaths.

Some of the most destructive storm surges have occurred in Bangladesh and India. The northern end of the Bay of Bengal is funnel shaped, and storm surges become tidal bores that sweep many miles inland. The Bhola cyclone in 1970 produced a storm surge of 35 ft. (11m), taking 500,000 lives in Bangladesh. The largest storm surges ever recorded took place in India in 1839 when a 40 ft. (12.2m) surge killed 300,000; and in Bathurst Bay, Queensland, Australia, where a 42 ft (12.8m) surge killed 400 in 1899. It was reported at the time that dolphins and fish were found atop the cliffs surrounding the bay.

A storm surge is created by the storm’s high wind piling the ocean’s surface higher than ordinary sea level. Low pressure at the center of the weather system has a lifting effect and aids in the buildup of the sea and the energy of the surge.
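
The pressure component of that buildup can be estimated with the standard inverse-barometer relation: sea level rises roughly 1cm for every hectopascal the barometric pressure drops. The sketch below applies that relation; the storm and ambient pressures are illustrative, and the usually larger wind-driven component is ignored.

```python
# The "lifting effect" of low central pressure, via the standard
# inverse-barometer relation: dh = dp / (rho * g). Wind-driven setup,
# which usually dominates a storm surge, is not modeled here.
RHO_SEAWATER = 1025.0   # kg per cubic meter
G = 9.81                # m per second squared

def pressure_rise_m(ambient_hpa, storm_center_hpa):
    dp_pascals = (ambient_hpa - storm_center_hpa) * 100.0  # hPa -> Pa
    return dp_pascals / (RHO_SEAWATER * G)

# Illustrative example: a 940 hPa storm center under 1013 hPa ambient pressure.
print(f"Pressure-driven sea level rise: {pressure_rise_m(1013, 940):.2f} m")
# Prints about 0.73 m, roughly 1 cm of rise per hPa of pressure drop.
```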

People living near the shoreline in tropical storm-prone areas should be prepared not only to protect property against the high wind, but also to heed storm surge warnings and evacuate before the storm makes landfall.

Will Nuclear Fusion Power the World?

Nuclear fusion holds great potential as a clean power source that might someday power the world. Unlike nuclear fission, fusion poses far fewer radiation dangers and no long-term waste storage problems. The aim of fusion is to create an artificial sun — a superhot plasma that replicates the composition and heat of the sun — and to use the harnessed heat to operate steam generators that make electricity.

Scientists at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory have taken a step closer to the goal of achieving ignition, the point at which fusion reactions release enough energy to sustain themselves, the stage needed to create the sun-like plasma.

By focusing 192 powerful laser beams on a tiny fuel pellet made of the hydrogen isotopes deuterium and tritium, the NIF researchers, for the first time, have achieved a stage of fusion in which the amount of energy released by the nuclear fusion reaction was greater than the amount of energy that went into the pellet. Achieving ignition, the final step, will require an ultra high level of precision in every phase of the process, including pinpointing the laser beams and perfecting the fuel pellet. By continually refining the process, the research team is confident they will reach ignition.
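
The milestone is easiest to see as a simple energy ratio, often called fuel gain: energy released by fusion divided by energy delivered to the fuel. The sample numbers below are illustrative, not NIF's published figures, and the lasers themselves consume far more energy upstream of the fuel.

```python
# "Fuel gain" bookkeeping for an inertial-confinement shot:
# gain = fusion energy released / energy delivered to the fuel.
# Sample numbers are illustrative, not NIF's published figures; the
# lasers consume far more energy upstream than reaches the fuel.
def fuel_gain(fusion_out_kj, energy_into_fuel_kj):
    return fusion_out_kj / energy_into_fuel_kj

gain = fuel_gain(fusion_out_kj=17.0, energy_into_fuel_kj=10.0)
print(f"Fuel gain: {gain:.1f} ({'above' if gain > 1 else 'below'} break-even)")
# Prints: Fuel gain: 1.7 (above break-even)
```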

When ignition is achieved, a way must be found to contain a plasma mass far hotter than the core of the sun, on the order of 100 million degrees Celsius (180 million degrees Fahrenheit). Since there is no material container capable of withstanding such temperatures, other means have to be developed. One solution is to keep the hot plasma out of contact with the walls of the container by keeping it moving in a circular path by means of magnetic force. The process is called magnetic confinement. A magnetic confinement test reactor has been constructed at Princeton University. It uses a combination of two magnetic fields to confine and control the plasma. Since an ignition-level fusion plasma has not yet been created, the Princeton reactor has not been fully tested.

Assuming ignition is realized at some point in the near future, it will still be many years before nuclear fusion moves from the lab to commercial application. But when it does, it might very well be the breakthrough that brings clean, plentiful, inexpensive power to the world.


Solar & Wind — Making a Difference?

Depending on where you live, you might pass by miles of wind farms producing electrical power, or arrays of solar panels on rooftops and in open fields. If so, you may have wondered how big a dent renewable energy is making in total power production, and whether it is helping to cut down the use of fossil fuels. 

According to industry estimates, wind farms worldwide now have about 300 billion watts (300 gigawatts) of generating capacity, and solar about 70 billion (70 gigawatts). As big as those numbers appear, they account for only 3% of global energy production. Coal and oil still fuel more than 80% of world power generation, and still pump 10 billion tons of carbon emissions into the atmosphere every year.

Wind power production has been increasing 20% a year for the past 10 years. In the United States, wind produces 4% of total national power, but the figure is higher in several states. Wind contributes more than 20% of the power in Iowa, South Dakota, and Kansas. The US Dept. of Energy (DOE) has set a goal of 20% of national power to be produced by wind by 2030.

Wind provides 7% of the power consumed by the countries of the European Union. Denmark derives more than a quarter of its power from wind. The International Energy Agency projects wind delivering 18% of world electrical power by 2050, which if achieved will reduce carbon fuel emissions by approximately 2 billion tons a year.

Solar power is starting from a smaller base, but is now the fastest growing source of renewable energy, having grown 40% a year since 2000. According to the June, 2013, issue of the MIT Technology Review, DOE has set a goal of less than $1 per watt for complete installed systems by year 2020. If the solar industry hits that target, that would bring the direct cost of solar power down to 6 cents per kilowatt hour, and make it competitive with power now delivered by the grid. The key to cost reduction lies in further technological advances in solar panel materials, efficiency in installation, ease of connecting to the grid, and battery storage capacity.
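
The jump from "$1 per installed watt" to "about 6 cents per kilowatt-hour" can be reproduced with a simple fixed-charge-rate levelized cost calculation. The capacity factor and fixed charge rate below are assumed round numbers, not DOE's actual inputs.

```python
# From dollars per installed watt to cents per kilowatt-hour, using a
# simple fixed-charge-rate levelized cost. Capacity factor and fixed
# charge rate are assumed round numbers, not DOE's actual inputs.
CAPEX_PER_W = 1.00        # the DOE target: $1 per installed watt
FIXED_CHARGE_RATE = 0.10  # assumed annual capital + O&M cost fraction
CAPACITY_FACTOR = 0.18    # assumed average for utility-scale solar

annual_kwh_per_watt = CAPACITY_FACTOR * 8760 / 1000
lcoe_dollars_per_kwh = CAPEX_PER_W * FIXED_CHARGE_RATE / annual_kwh_per_watt

print(f"Levelized cost: {lcoe_dollars_per_kwh * 100:.1f} cents per kWh")
# Prints about 6.3 cents/kWh, in line with the 6-cent figure above.
```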

Some of the solar advances currently being tested are (1) a two-sided panel that potentially will produce 20% more power; (2) flexible solar cells on a new Corning product called Willow Glass, thin enough to roll up; and (3) a cheaper alternative to silicon made up of light-absorbing calcium titanium oxide compounds. Modern solar cells now achieve 10% to 20% efficiency in converting sunlight to power. Some of the materials under test are showing up to 30% efficiency, and a German laboratory announced that it had achieved a 44% conversion rate. One laboratory founded by a Caltech professor expects to produce a panel reaching a 50% conversion rate.

If these advances make it out of the lab and into commercial production, solar energy could become highly cost competitive with all other power sources.

As wind and solar production grows, it may be only a matter of time before the two renewable sources account for half or more of global energy production. That would mean cleaner air and cleaner oceans, and might help us slow the galloping advance of global warming.

Islands That Rise From the Sea

From time to time, a new island will suddenly or gradually appear somewhere in the vast oceans of Planet Earth. Some become permanent islands that attract plant and animal life. Others, eroded by wave and storm action, quickly disappear.

The most recent example of a temporary island materialized off the coast of Pakistan on September 23, 2013, following the magnitude 7.7 earthquake that struck Balochistan province in southwestern Pakistan, killing 850 people and injuring thousands. Seismic waves from the earthquake, whose epicenter lay hundreds of miles away, loosened pockets of gas trapped under the seafloor, creating a mud volcano. The eruption blew a pile of seafloor mud and rocks to the surface, forming an island the size of a football field and 60 ft. (18m) high. Because the island is made of soft material, geologists believe it will soon be washed away by waves, tides, and storms.

Other emerging volcanic islands are the tops of undersea mountains, built up by repeated volcanic eruptions until they finally emerge above sea level. Surtsey, which rose from the sea off the southern tip of Iceland between 1963 and 1967, is one example of this kind of new volcanic island. At its largest the island measured about 1 sq. mi. (2.7km²), but it has since lost half its area to erosion. Whether new volcanic activity will rebuild the island is not known.

Anak Krakatau, another example, rose from the submerged caldera of the famous Indonesian volcano Krakatau (Krakatoa) in 1930. The original Krakatau exploded in 1883, in a blast heard thousands of miles away, killing an estimated 36,000 people (some estimates run much higher), its ash cloud dimming skies around the world for months. Constant volcanic activity has built the new Anak Krakatau to 1,000 ft. (300m) at the summit. The volcano erupts frequently and the new island continues to grow in size.

A different kind of emerging island is caused by land uplift. This occurs along the coastlines of Sweden and Finland, where a glacier 1 mi. (1.6km) thick began melting away 10,000 years ago. As the glacier's weight was removed, the land it had been pressing down began slowly to rise, a process known as post-glacial rebound. It has taken all of that 10,000-year span for the freed-up land to finally emerge from the sea, and a number of these uplift islands have appeared along the Scandinavian coastlines.

A new island appeared off the coast of Greenland in 2005 as the Greenland Ice Sheet continued to retreat. There is some controversy over whether the island was already there, merely revealed by the melting ice, or whether the ice sheet's retreat allowed the land beneath to rise from the sea. Either way, Uunartoq Qeqertaq (Greenlandic for The Warming Island) now appears as a new island on world maps.

As new islands appear, many older islands around the world face the prospect of being flooded by rising sea levels. A sea level rise of up to 3.2 ft. (1m) by 2100 is projected by the UN's Intergovernmental Panel on Climate Change (IPCC) and other scientific organizations. The projections are based on the rate of glacier and ice sheet melt, temperature increases from global warming, and the expansion of ocean water as it warms. Can global warming be slowed and sea level rise reduced? Possibly, if the world cuts back carbon emissions in time and switches to renewable energy.

Is Desalination the Answer?

The widely held view in the scientific community that climate change is bringing longer periods of drought to wider areas of the world is now supported by the United Nations Intergovernmental Panel on Climate Change. The panel's recently released Fifth Assessment Report indicates a definite increase in the number of drought days over large portions of the globe.

Farming is especially hard hit during periods of drought. Lack of water stunts plant growth, reduces yields, and can wipe out entire crops in dry farming areas. The 2011 drought cost the US an estimated $12 billion in crop losses, and 2012 drought damage may exceed that when finally calculated. Texas has suffered a crippling drought over the past few years, and the Colorado River Basin is in the grip of a 14-year drought that threatens to drastically cut water supplies to California, Arizona, Nevada, Colorado, Utah, Wyoming, and New Mexico.

A number of proposals have been put forth for supplementing the Colorado River water supply, including a pipeline from either the Missouri or Mississippi Rivers, but now officials in the region are taking a serious look at desalinating ocean water and pumping it to areas in need.

Advances in technology have cut the electrical energy needed for processing and pumping desalinated water from a cost of about $1 per m³ (a cubic meter is 264 gallons, or 1,000 liters) to approximately $0.50 per m³. In the United States, water from natural sources such as reservoirs fed by rain and snow runoff, pumped into water distribution networks, now costs the end user between $0.35 and $0.40 per m³, or 20% to 30% less than desalinated water.
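
To put those per-cubic-meter figures in household terms, here is a small comparison. The daily usage figure, about one cubic meter per household per day, is an assumed round number for illustration, not something taken from the article, and the natural-water price is the midpoint of the range quoted above.

    # Compare desalinated vs. natural-source water costs for one household.
    # Daily usage (~1 m^3, roughly 264 gallons) is an assumed round figure.
    desal_cost   = 0.50                  # dollars per cubic meter
    natural_cost = (0.35 + 0.40) / 2     # midpoint of the quoted range
    daily_use_m3 = 1.0

    discount = (desal_cost - natural_cost) / desal_cost
    print(f"natural water is about {discount:.0%} cheaper")               # ~25%
    print(f"extra cost of desal: ${(desal_cost - natural_cost) * daily_use_m3 * 365:.0f}/yr")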

New research projects at Lawrence Livermore National Laboratory and Lockheed Martin show promise of making desalination cost competitive with natural source water once the technology is rolled out commercially. The Lawrence Livermore approach is based on carbon nanotubes, tiny tube-shaped molecules of carbon atoms. Water flows freely through the tubes, while their narrow pores block the larger salt ions, offering a cheaper way to remove salt from water. The process uses less power to filter more water than any of the methods presently in use. According to Lawrence Livermore researcher Martin Suss, the new method removes salt 5 to 10 times faster than previous systems.

Countries in the Middle East are among the biggest users of desalination. Saudi Arabia, Dubai, Bahrain, and Israel depend on desalination for between 70% and 80% of their fresh water. In oil producing countries such as the Emirates, oil and power are cheap, and desalination is therefore less expensive. In Israel and other areas without access to cheap oil, carbon fuels are supplemented with wind and solar power to run the desalination plants.

The world consumes about 25 billion cubic meters of water a day from natural sources. By 2020, the world is projected to have more than 15,000 desalination plants in operation, producing 120 million cubic meters of fresh water a day, still less than one half of one percent of total water usage. But desalination will continue to grow, and should grow much faster once it becomes price competitive with natural water. Many believe it will prove to be the best way to bring fresh water to the drought-stricken areas of the world.

 

Drought, Fire, & Flash Floods

In a 3-day period starting September 9, 2013, 18 inches (46cm) of rain drenched the Rocky Mountains Front Range, setting off flash floods that roared through Boulder, Lyons, Estes Park, and other Colorado foothill communities. A dozen dams overflowed and six blew out. Walls of water 20 ft. (6m) high raced down canyons, sweeping away houses and stranding thousands of area residents. As of this writing, the flooding had taken 8 lives and destroyed 1,500 homes.

Average rainfall for the month of September in Boulder is 1.63 in. (4.1cm). So what were the conditions that caused 11 times that amount to fall in 3 days? Many scientists believe that climate change, forest fires, and the severe drought that has gripped the U.S. Southwest for 14 years all played a part.

To begin, a low-pressure center settled over the Great Basin and was held in place by a high-pressure ridge over the Pacific Northwest. The stationary low tapped into a plume of monsoonal moisture coming up from the Pacific Ocean off Mexico and kept drawing it in, as if the moisture were arriving on a conveyor belt. The storm dumped its deluge on the drought-dried Front Range, where steep canyons drop from peaks exceeding 14,000 ft (4,300m).

Professor Brad Udall, director of the University of Colorado's Wilkinson Center for Natural Resources, said that while current science can't attribute any particular extreme weather event to climate change, this flooding is likely a reflection of global warming. Scientists have warned that as the planet warms, both drought and flash flooding will become more prevalent.

According to Sandra Postel, National Geographic's Freshwater Fellow and a noted authority on water use, the drought that has parched the area and gripped the Colorado River Basin for the past 14 years may be partly to blame for the severity of the floods. Drought hardens the soil, she explained, so when rains do come, the ground absorbs less water and more of it runs off the land quickly.

Postel added that fires lead to worse flooding because they remove vegetation that would otherwise slow and trap rainfall. Thousands of acres of Front Range forest were scorched by the Fourmile Canyon fire in 2010 and the Flagstaff fire in 2012, and the burn area from those fires lies directly above the communities hit by the flash floods in September, 2013.

As our climate continues to warm, this same scenario will most likely be repeated in coming years in areas all over the world. Lengthy droughts, severe wildfires, record flooding, and more intense tropical storms are all expected to be part of our future climate menu.  

Hidden Danger Under the Sea

Our planet’s oceans cover the world’s largest volcano and longest mountain range, and hide seamount peaks and erupting volcanoes lying close enough to the surface to pose a danger to shipping. Tamu Massif, the world’s largest volcano, lies 1,600km (1,000 mi) east of Japan. Covering an area the size of New Mexico, the massive mountain rises 4,400m (14,400 ft) from the ocean floor. Its summit, 1,980m (6,500 ft) below the surface, is not a threat.

The world's longest continuous mountain range, the Mid-Ocean Ridge (MOR), stretches unbroken for 65,000km (40,000 mi) along our planet's ocean floor. This long band of mountainous terrain, which threads around the globe like the raised seams of a baseball, marks boundaries between the dozen or so major tectonic plates that make up the earth's crust. The average height of MOR's peaks is 1,500m (5,000 ft), and they are not considered hazardous to shipping.

But danger lurks in the tops of a few of the estimated 30,000 seamounts, free-standing undersea mountains not associated with the Mid-Ocean Ridge. Seamounts are usually extinct volcanoes rising as much as 5,000m (16,000 ft) from the ocean bottom. Most leave a wide clearance between summit and surface, but some can spell danger.

 In 2005, the submarine USS San Francisco ran into an uncharted seamount southeast of Guam at a speed of 35 knots, killing one seaman and causing extensive damage to the vessel. In 1973, the merchant vessel MV Muirfield badly damaged its keel when it struck an uncharted seamount near the Cocos Islands in the Indian Ocean. The undersea peak was later charted and named Muirfield Seamount.

In 1985, the aircraft carrier USS Enterprise struck the Cortes Bank reef 100 mi (160km) southeast of San Diego, California. The Cortes Bank is a submerged mountaintop that was once the outermost island in Southern California's Channel Islands chain. Its shoals range from 3 to 30m (10 to 100 ft) in depth and are marked as a hazard to shipping. The Bowie Seamount off the coast of British Columbia is a 3,000m (9,800 ft) undersea peak with a summit-to-surface clearance of only 24m (79 ft). Bowie is charted and avoided by shipping.

Erupting undersea volcanoes can also pose a shipping hazard. The Myojin-Sho undersea volcano, lying 450km (280 mi) south of Tokyo, rises to within 50m (164 ft) of the surface. In September, 1952, the Japanese scientific vessel Kaiyo Maru No. 5 was conducting research in the area when the volcano erupted; the ship was destroyed and all 31 crew members were lost.

Kick-'em-Jenny is another dangerous undersea volcano, located 8km (5 mi) north of Grenada in the Caribbean. Rising to within 180m (590 ft) of the surface, it had a massive blowout in 1939 and has erupted intermittently ever since, most recently in December, 2001. Kick-'em-Jenny is still considered active and dangerous; it is charted and lies within a navigation exclusion zone.

Considering the thousands of ships that depart from ports all over the world every day, deep water shipping is statistically a very safe pursuit. But from time to time a storm, navigation error, or undersea hazard takes its toll.

Will Hurricane Season Slow Sandy Recovery?

As the $50 billion that Congress appropriated for Superstorm Sandy aid funnels into reconstruction projects along New York and New Jersey beaches, NOAA continues to predict a very active, above-normal 2013 Atlantic hurricane season.

According to NOAA, oceanic and atmospheric conditions in the tropical Atlantic and Caribbean so far in the summer of 2013 are conducive to hurricane formation. These conditions include above-average sea surface temperatures; an enhanced African monsoon season; the presence of 2 named tropical storms in June and July; and an ENSO (El Niño Southern Oscillation) Neutral condition in the equatorial Pacific.

Atlantic hurricane activity is suppressed during an El Niño year, when warmer water in the Pacific weakens Atlantic wind patterns, and enhanced during a La Niña year, when cooler tropical water strengthens them. ENSO Neutral means water temperature along the equator between Asia and South America remains near normal and has little effect on wind patterns in other parts of the globe. Historically, years with early season named storms have a high likelihood of an above-normal hurricane season.
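
For readers curious how "ENSO Neutral" is actually determined: NOAA classifies conditions by the sea surface temperature anomaly in the Niño 3.4 region of the equatorial Pacific, using a threshold of plus or minus 0.5°C. The sketch below simply encodes that rule; the sample anomaly value is invented for illustration.

    # Classify ENSO phase from a Nino 3.4 sea-surface-temperature anomaly,
    # using NOAA's +/- 0.5 C operational threshold.
    def enso_phase(nino34_anomaly_c):
        if nino34_anomaly_c >= 0.5:
            return "El Nino"
        if nino34_anomaly_c <= -0.5:
            return "La Nina"
        return "Neutral"

    print(enso_phase(0.1))   # "Neutral" (illustrative anomaly value)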

NOAA is currently staying with its pre-season estimate of 13 to 19 named storms, 6 to 9 hurricanes, 3 to 5 major hurricanes, and an Accumulated Cyclone Energy range of 120% to 190% of the median, meaning the season's total storm activity is expected to be well above average. NOAA makes no predictions on how many, if any, of the expected storms will make landfall. Many hurricanes stay out to sea or lose power as they approach land.
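
Accumulated Cyclone Energy itself has a simple definition: sum the squares of each storm's maximum sustained wind in knots, sampled every six hours while the storm is at least tropical-storm strength (winds of roughly 35 knots), then divide by 10,000. A brief sketch, with invented wind readings for illustration:

    # Accumulated Cyclone Energy: sum of squared 6-hourly max winds (knots)
    # at tropical-storm strength or above, scaled by 1e-4.
    def ace(six_hourly_winds_kt):
        return sum(v ** 2 for v in six_hourly_winds_kt if v >= 35) / 10_000

    # Invented readings for a short-lived hurricane, six hours apart:
    storm = [30, 40, 55, 70, 85, 80, 60, 45, 30]
    print(f"{ace(storm):.2f}")   # ~2.9 units from this one storm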

Shoreline reconstruction is moving ahead in New York and New Jersey, but much of it is still in the early stages. An example is a planned 16 ft. steel wall to be covered with sand for the protection of beachfront homes in 2 New Jersey townships. Funding has been approved, and the Corps of Engineers will manage the construction, but the necessary homeowner easements have not yet been received, so the project is on hold.

We hope that the storm-damaged areas have a chance to rebuild completely, observing storm-resistant building codes, before the next Sandy comes to visit.


Sobering Climate Change Report

Every six or seven years, under the auspices of the United Nations, the world’s leading climate scientists come together to write a climate change status report. This latest report of the UN’s Intergovernmental Panel on Climate Change (IPCC) is being produced by 830 international scientists from universities, laboratories, and government agencies such as NOAA and USGS.

The scientists involved are participating in three working groups: The Physical Science Basis; Impacts, Adaptation & Vulnerability; and Mitigation of Climate Change. The IPCC has completed the first draft of its Fifth Assessment Report, to be published in final form in 2014. However, a summary draft of the report was leaked to the media ahead of schedule.

Several key points in the leaked summary probably will not change in the final document. One of the most important is the increase in probability that global warming is caused by humans. The 2001 Assessment gave a 66% probability that most of the observed warming is caused by humans. The next report, published in 2007, raised that probability to 90%. The latest report to be published in 2014 ups that probability to 95%. The continued burning of fossil fuels is considered the major cause of the rapid warming of the planet.

A second point is that sea level rise is now labeled unequivocal. In the opinion of the world's climate scientists, sea level rise WILL take place, no ifs or maybes. The actual amount of rise between now and the end of the century is still being debated. It depends on how fast the world's glaciers and ice sheets melt, and how fast the ocean warms; when ocean water warms, it expands and adds to sea level rise. Estimates run from a 3 ft. (0.91m) rise to a 6 ft. (1.83m) rise.

Another finding in the summary is that CO2 in the atmosphere has increased by 20% since 1958, and 40% since 1750. To quote the summary, “virtually all (of the increase) is due to burning of fossil fuels and deforestation, and a small contribution from cement production.”
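
Those percentages line up with the commonly cited CO2 concentrations: about 280 parts per million in 1750, about 315 ppm when the Mauna Loa record began in 1958, and roughly 395 ppm around the time the draft leaked. These round values are assumptions on my part rather than numbers quoted from the report, but the arithmetic below shows they reproduce its figures reasonably well.

    # Check the reported CO2 increases against commonly cited concentrations.
    ppm_1750, ppm_1958, ppm_now = 280, 315, 395   # assumed round values

    print(f"since 1958: {(ppm_now - ppm_1958) / ppm_1958:.0%}")  # ~25%, vs. the report's 20%
    print(f"since 1750: {(ppm_now - ppm_1750) / ppm_1750:.0%}")  # ~41%, vs. the report's 40%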

And finally, it is stated that climate change will continue for centuries even if emissions of greenhouse gases are stopped totally. In the words of the report, “This represents a substantial multi-century commitment created by past, present, and future emissions of CO2.”

To summarize the leaked summary: the world's climate will get increasingly warmer; sea levels will continue to rise; the main driver of these changes is the continued burning of fossil fuels and the resulting emissions of CO2 into the atmosphere; and even if we stopped burning fossil fuels today, global warming would continue for centuries to come.

One of the IPCC panels working on the Fifth Assessment Report is concerned with mitigation. Even though a lot of long-lasting damage has already been done, it's not too late to replace carbon energy with wind, solar, bio-fuels, and storage battery technology, cutting CO2 emissions and slowing the pace of global warming. Reversing global warming seems to be a lost cause, but we hope the mitigation panel will present ideas that will help us cope with the changes to come.