The Typhoon & The Big Chill

How could a warm-water Pacific typhoon that started near the Philippines travel 7,000 miles (12,000km) and drop temperatures to minus 27°F in Casper, Wyoming, one week later?

Super Typhoon Nuri started on October 31, 2014, as a tropical depression in the warm waters of the Pacific east of the Philippine Islands. By November 2 it had blossomed into one of the most powerful typhoons of 2014 with sustained winds of 180 mph (300km/h). Fortunately, Nuri did not make landfall in either the Philippines or Japan as it traveled northeast, but did kick up high surf. Waves up to 16 ft (5m) hit Japan’s eastern shoreline.

Three days later, on November 5, 2014, Nuri had been downgraded to a tropical storm, phasing from a warm-core tropical system to a cold-core post-tropical system. It had traveled far enough north to be picked up by the jet stream and regenerated into a powerful post-tropical cyclone over the Bering Sea southwest of Alaska, with winds gusting to 100 mph (160km/h), waves cresting at 50 ft (15m), and atmospheric pressure dropping to a record low 924 millibars.

When the storm was at its height on November 7, the jet stream weakened and super-cold air from Siberia and the North Pole flooded into the remnants of Nuri. At the same time, high pressure set in over Alaska on one side and northwest Canada on the other, creating a direct pipeline for the newly formed Polar Express to sweep through Canada and down into the Midwestern section of the US, bringing freezing weather from the Rocky Mountains to the Ohio Valley.

Livingston, Montana dipped to minus 21°F and Denver to minus 14°F, the coldest November day since 1880. Texas and the Plains states saw daytime highs in the twenties (F), and in the Ohio Valley daytime highs dropped into the thirties. A new blast of polar air is expected in the same area the week of November 17.

Often we think of weather as a localized phenomenon. Will it rain or snow here? Will the sun shine tomorrow? But Typhoon Nuri’s journey demonstrates quite clearly that weather is a product of massive global forces that have no regard for manmade borders or schedules.

October Disasters – 2014

October 2014 was not a month of devastating disasters. No big earthquakes or tsunamis were reported. But like most months, October produced its share of disasters of different kinds.

Space Explosions. On October 28, an unmanned rocket carrying 5,055 pounds of supplies and scientific material for the International Space Station blew up seconds after liftoff. The Antares rocket, manufactured by Orbital Sciences, was powered by engines built in the Soviet Union in the 1960s. A formal investigation is under way, but many believe the failure occurred in the 50-year-old engines. The costs were $200 million for the rocket and $237 million for the cargo, a total loss of $437 million, plus a setback for commercial spacecraft development.

Another commercial space explosion occurred on October 31 in the stratosphere above the Mojave Desert in California, killing one test pilot and injuring a second. Virgin Galactic was test flying a rocket ship designed to carry passengers into space. The spaceship was dropped from its mothership 10 miles above the earth. Moments later, shortly after ignition, the spacecraft exploded, scattering debris over a 2-square-mile area of the desert. Further development of civilian space travel is on hold pending an investigation into the cause of the failure.

Ebola. As of October 29, 13,567 cases of the virus had been reported, virtually all in West Africa, and 4,951 people had died of the disease. One Liberian visitor died of the disease in Dallas, Texas, and two nurses who had treated him contracted the virus but have recovered. One doctor who treated Ebola patients in Africa returned to the US with the disease; he has been treated and has recovered.

Hillside collapse. On October 28, heavy rains caused a hillside in Sri Lanka to shear off and bury hundreds of tea plantation homes in a river of mud. At first it was thought that over 300 tea workers had died, but updated reports indicate 38 confirmed deaths. Every year, monsoon rains soak the Sri Lankan hillsides, making them spongy and unstable. Mudslide tragedies are not uncommon.

Colorado Earthquake. The USGS reported on Oct. 29 that observations from a radar satellite showed major ground subsidence in the Raton Basin area of Colorado and New Mexico. Analysis indicates that the sinking ground and a magnitude 5.3 earthquake in that area in 2011 were likely caused by natural gas extraction and wastewater disposal from fracking operations. In fracking, high-pressure fluid is injected into deep wells to free the gas, then withdrawn and re-injected into the ground for disposal. The wastewater seeps into fault lines, triggering the slippage that produces an earthquake. Having abundant natural gas supplies is good for the consumer, but we pay a high price for it in the degradation of our natural landscape.

On the plus side, the flaws that caused the two space explosions eventually will be tracked down and fixed, and before long people and cargo will be flying safely in space. Thousands of health care workers from around the world are working in West Africa to treat those with Ebola, limit new infections, and keep the disease from spreading.     

Storms, Satellites, & Methane

Storms. Hurricane Gonzalo was downgraded from a Category 4 to a Category 3, and finally to a Category 2 storm when it struck Bermuda on Friday, Oct. 17, 2014, with sustained winds of 110 mph (175km/h). No deaths or injuries were reported, but the island suffered heavy wind and rain damage, power outages, and storm surge flooding from 40 ft. (12m) waves. Gonzalo is expected to track north from Bermuda, grazing Newfoundland, and then curve east toward the British Isles.

Also on October 17, out in the Pacific, Tropical Storm Ana, which was headed for Hawaii with sustained winds of 80 mph (130km/h), veered south and missed the Big Island by 155 miles (250km). Ana’s outer bands brought some wind and rain to the islands.

New Satellites. On Sept. 20, 2014, NASA launched JPL’s RapidScat to help meteorologists spot hurricanes developing in their earliest stages. RapidScat, lodged on the exterior of the International Space Station (ISS), measures surface wind speed and direction over the ocean. Using this early information, weather forecasters are able to make much more accurate predictions about a storm’s eventual path and strength.

NASA’s SMAP satellite, scheduled for launch in January 2015, will provide high-resolution global measurements of soil moisture, which is critical for plant growth and recharging underground water supplies. SMAP data will aid in predictions of agricultural productivity, improve weather and climate forecasts, and help gauge the severity of droughts and where floods might occur.

Methane. Hydraulic fracturing, or fracking, is making it possible for energy producers in the US to tap into new underground gas supplies. As a result, natural gas, which is 80% to 95% methane, is plentiful and relatively cheap. More and more, industry is switching from coal to natural gas to generate power and run factories. Good news and bad news. Natural gas is less polluting than coal, but burning it still pumps roughly 7 billion tons of carbon dioxide into the atmosphere every year. Also, many believe that fracking itself is bad for the environment, tainting aquifers, causing earthquakes, and releasing methane into the air. Energy producers say such charges are yet to be proven.

According to a study conducted by DOE’s Pacific Northwest National Laboratory and five teams of international climate scientists, published in Nature’s Advance Online Publication on October 14, 2014, the switch to natural gas is doing nothing to slow global warming. The low price and plentiful supply encourage industry to burn gas and continue polluting, and postpone for many years any serious efforts by industry to fully develop non-polluting energy sources such as wind, geothermal, and solar. The trade-off is temporarily good for the pocketbook, but continues to heat the planet and pollute the air we breathe.

Japan Volcano Disaster

How did it happen? In the days leading up to Sept. 27, 2014, twelve seismic sensors installed on Japan’s Mt. Ontake were reporting swarms of tremors under the mountain, a sign that magma was moving up into the volcano. There were no other indications that the volcano was about to erupt, so no warnings were issued until 11:53 am on the 27th, when smoke was seen rising from the caldera. Japanese authorities closed the mountain to hikers at 12:36 pm. By then it was too late. The volcano had erupted 43 minutes earlier, and more than 200 hikers had already started up the trails leading to the top. Choking clouds of ash, mud, and rock blanketed the slopes. Forty-seven people died, and 70 were injured.

One lesson to be learned is that volcanoes are unpredictable. Seismologists usually rely on three different observations in predicting an eruption. In addition to seismometer readings showing magma movement, seismologists look for ground deformation on the volcano’s sides as the swelling gas pressure inside the volcano pushes the ground outward. The third predictor is an increase in the temperature of gases escaping from the caldera.
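
That three-signal checklist lends itself to a short illustration. The sketch below is purely hypothetical: the thresholds and the eruption_warning_level function are invented for this example and are not any observatory’s actual alert criteria. It simply shows why a single indicator, as was effectively the case at Ontake, gives forecasters so little to act on.

```python
# Illustrative sketch only: hypothetical thresholds, not any agency's
# real volcano-alerting criteria.

def eruption_warning_level(tremors_per_day: int,
                           ground_uplift_mm: float,
                           gas_temp_rise_c: float) -> str:
    """Combine the three classic precursors into a rough warning level."""
    signals = 0
    if tremors_per_day > 50:      # swarm of shallow tremors (magma movement)
        signals += 1
    if ground_uplift_mm > 10:     # flank deformation from rising gas pressure
        signals += 1
    if gas_temp_rise_c > 20:      # hotter gases escaping the caldera
        signals += 1
    return {0: "normal", 1: "watch", 2: "advisory", 3: "warning"}[signals]


# Mt. Ontake in late September 2014: tremor swarms, but little deformation
# and no reported jump in gas temperature -- only one of three signals.
print(eruption_warning_level(tremors_per_day=80,
                             ground_uplift_mm=0.0,
                             gas_temp_rise_c=0.0))   # -> "watch"
```

With only the tremor signal firing, a scheme like this never climbs past its lowest alert level, which mirrors why no warning was issued before the mountain blew.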

In the Mt. Ontake eruption, two of the three warning signs were absent. Tremor swarms had occurred before and nothing had happened. One theory is this was an explosion caused by water, presumably from melting snow or a recent typhoon, surging into the volcano. When the cold water hit the hot magma, what is called a phreatic eruption occurred: an explosion of hot steam blew ash, mud, and rocks out the top and sent them spilling down the mountainside. No magma was expelled.

In most cases, enough advance warning is given for people near a volcano to evacuate before the mountain erupts. Three examples: (1) In 1991, 30,000 people living on the slopes of Mt. Pinatubo in the Philippines were relocated before the volcano exploded with a Volcanic Explosivity Index of 6, one of the most violent eruptions of the 20th century; (2) in 1980, evacuation of all hazard zones around the Mt. St. Helens volcano in Washington was ordered before the historic eruption blew the top off the mountain; and (3) in 1996, a valley in Iceland was evacuated before the eruption of a volcano that melted through the thick ice cap above it and sent a wall of water, ice, and rock sweeping through the valley.

Sometimes, unfortunately, the authorities fail to order evacuations despite knowing of the eruption warning signs in advance. In 1985, the Nevado del Ruiz volcano in Colombia showed all the signs of imminent eruption. The authorities were advised but took no action. When the volcano erupted, it melted the glacial ice on the mountain and sent rivers of mud, ice, rock, and pumice speeding down the slopes at 60 km/h (35 mph). Villages in the valleys below the mountain were swept away or buried in mud. Some 22,000 people died.

Volcanoes can go dormant for hundreds of years between eruptions. Those who live on the Pacific Rim, or wherever tectonic plates converge, should be aware that some of the nearby mountain peaks might be active volcanoes taking a nap between eruptions.

Do Earthquake Warning Systems Really Work?

On August 24, 2014, a warning horn blared in a lab at UC Berkeley 10 seconds before the shaking started. The ShakeAlert system had picked up the magnitude 6.0 Napa Valley earthquake and sounded the alarm 10 seconds before arrival of the main shock. The lab’s instruments accurately displayed the arrival time and strength of the coming quake. But is 10 seconds enough time to react and seek safety? Maybe enough time to scramble under a table to avoid falling objects. Or not to step into an elevator.

The amount of advance warning depends on distance from the epicenter. The farther from the epicenter, the more warning time is given. Berkeley is about 25 miles (40km) from the Napa Valley fault line rupture site.

The ShakeAlert system is currently under development by a team of seismologists from UC Berkeley, Caltech, U. of Washington, and the USGS. It is undergoing testing in California, where 400 sensors have been installed along the state’s fault lines. The developers have requested funding to add more sensors before the system is put into full operation.

ShakeAlert is based on an earthquake early warning system that has been operating in Japan since 2007, employing automatic warning signals for mobile phones, radio, TV, and the internet. When the magnitude 9.0 earthquake and tsunami hit Japan in 2011, a teacher in a school in Sendai, the largest city near the epicenter, received a 32-second advance warning signal on his mobile phone. He immediately had the class take shelter under their desks. After the severe shaking was over, he evacuated the class to safer ground and no one was injured. Unfortunately, there is not yet an early warning for a tsunami, and that is what did most of the damage.

Earthquake early warning systems are based on the lag time between the three seismic waves that occur when a fault line ruptures. The first is a compression wave called the Primary Wave, or P Wave. The P Wave is fast, traveling through granite at about 5,000 meters per second. The second, the Secondary or S Wave, is slower, traveling at about half the speed of the P Wave. The even slower Surface Wave arrives last. The S Wave and the Surface Wave are the big shakers that do all the damage.

To quote a USGS article posted on Sept. 11, 2014, “When an earthquake occurs, both compressional P Waves and transverse S Waves radiate outward from the epicenter. The P Wave, which travels fastest, trips sensors causing alert signals to be sent ahead, giving people and automated electronic systems time (seconds to minutes) to take precautionary actions before damage can begin with the arrival of the slower but stronger S Waves and late-arriving surface waves. Computers and mobile phones receiving the alert message calculate the expected arrival time and intensity of the shaking at your location.”
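
The arithmetic behind those alert times is simple enough to sketch. The snippet below is illustrative only, not ShakeAlert or JMA code; it uses the rough wave speeds quoted above and ignores the detection, processing, and transmission delays that shave a few seconds off real alerts.

```python
# Back-of-the-envelope sketch of earthquake early-warning lead time.
# Not ShakeAlert code; speeds are the approximate figures quoted above
# (P wave ~5,000 m/s in granite, S wave about half that).

P_WAVE_SPEED = 5000.0   # m/s, approximate speed through granite
S_WAVE_SPEED = 2500.0   # m/s, roughly half the P wave speed


def warning_time(distance_m: float) -> float:
    """Seconds between the P wave arriving at a site `distance_m` from the
    rupture and the damaging S wave arriving at the same site.

    Real systems deliver somewhat less lead time because of detection,
    processing, and transmission delays.
    """
    p_arrival = distance_m / P_WAVE_SPEED
    s_arrival = distance_m / S_WAVE_SPEED
    return s_arrival - p_arrival


if __name__ == "__main__":
    # Berkeley is roughly 40 km (25 miles) from the Napa rupture site.
    for km in (10, 25, 40, 100):
        print(f"{km:>4} km from epicenter: ~{warning_time(km * 1000):.0f} s of warning")
```

At the roughly 40 km separating Berkeley from the Napa rupture, the lag works out to about 8 seconds, in the same ballpark as the 10 seconds the lab actually received.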

Once the ShakeAlert system is operating successfully in California, the USGS plans to install the system in all earthquake-prone areas across the US. Since this author lives less than 5 miles (8km) from the southern San Andreas Fault where the Big One is expected, the sooner the system is rolled out the better.

Water For a Dry Southwest

For generations the Colorado River has supplied water to homes, farms, and industry in Arizona, California, Colorado, Utah, Nevada, New Mexico, and Wyoming. Now a 14-year drought, climate change, a growing population, and overuse are drying up the Colorado. The water levels in Lake Powell, the reservoir behind Glen Canyon Dam on the upper Colorado, and Lake Mead, the reservoir behind Hoover Dam on the lower Colorado, have dropped to all-time lows.

In California, the snowpack in the Sierra Nevada was 10% of normal in the winter of 2013-14. Water deliveries to farmers in the San Joaquin Valley, where 65% of the nation’s fruits and nuts are grown, were drastically reduced. Ground water supplies are also declining due to over-pumping. Water shortages are threatening to curtail the region’s $44 billion annual agricultural production.  

Water use restrictions have been imposed throughout the region, but even the strictest rationing can’t make up for the enormous loss of basic water supply. A number of ideas have been proposed for supplying additional water to the parched Southwest. Some appear less practical than others, but all are receiving new consideration.

Alaska to California Pipeline. In the 1990s, the then-governor of Alaska proposed construction of a 2,000 mile (3,218km) undersea pipeline from river sources in southern Alaska to the Shasta reservoir in Northern California. The congressional Office of Technology Assessment estimated the cost of construction at $150 billion in 1990 dollars. Most experts consider the plan unfeasible due to cost and engineering challenges.

Missouri River Pipeline. A proposal to run a 600 mile (965km) pipeline from the Missouri River to Denver has been considered by Interior’s Bureau of Reclamation. Ken Salazar, Interior Secretary at the time of the proposal, opposed the idea because of the high construction cost, the need to keep water levels on the Missouri and Mississippi high enough for navigation, and political opposition from environmental groups.

Converted Oil Tankers. Single-hulled oil tankers were mothballed when new laws mandated double-hulled vessels for transporting oil in 1993. A proposal to sanitize the tanks on the single-hulled tankers and use them for transporting water from Alaska to California has been floated, so to speak. The expense of bringing the ships out of mothballs and sandblasting the tanks, plus fuel, crew, and maintenance costs, would make it impractical to deliver fresh water by this method at an affordable price, according to those who have studied the idea.

Giant Waterbags. A California company is building flexible fabric barges designed to carry more than a million US gallons of fresh water. Since fresh water is lighter than salt water, these huge waterbags will float when full, and a train of 4 or 5 of them can be towed by a vessel the size of a tug. They are built to withstand almost any weather. A train of such bags can deliver 4 to 5 million gallons of water at a cost lower than water delivered by pipeline or aqueduct, according to the builders. The bags have been used for delivering water successfully from Turkey to points in the Mediterranean, but have not yet been used on the US west coast.
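
The claim that a full bag floats follows from the small density difference between fresh water and seawater. The quick check below is a back-of-the-envelope sketch using typical textbook densities, not the builder’s engineering figures.

```python
# Back-of-the-envelope check that a bag full of fresh water floats in
# seawater. Densities are typical textbook values, not the builder's specs.

FRESH_WATER_DENSITY = 1000.0   # kg per cubic meter
SEA_WATER_DENSITY = 1025.0     # kg per cubic meter
GALLON_M3 = 0.003785           # one US gallon in cubic meters

volume_m3 = 1_000_000 * GALLON_M3                 # a one-million-gallon bag
cargo_mass = volume_m3 * FRESH_WATER_DENSITY      # mass of the fresh water
displaced_mass = volume_m3 * SEA_WATER_DENSITY    # seawater displaced if fully submerged

# A positive margin means the full bag is buoyant even before accounting
# for the part of it that rides above the surface.
margin = displaced_mass - cargo_mass
print(f"Bag volume: {volume_m3:,.0f} cubic meters")
print(f"Net buoyant margin: {margin:,.0f} kg ({margin / cargo_mass:.1%} of cargo mass)")
```

A margin of a couple of percent of the cargo mass is enough to keep a full bag at the surface, which is why the bags can be towed behind a tug rather than carried on a ship.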

Desalination. This process uses large amounts of energy to force seawater through membranes that remove the salt. However, several of the US national labs have been researching ways to make desalination more efficient and cost effective. Fresh water supplied by the Israeli-designed plant under construction near San Diego will cost about twice as much as water from the California Aqueduct, but supplies of aqueduct water are being cut back.

The three most practical approaches at this time seem to be more efficient desalination, towing waterbags from surplus-water areas to places that need the water, or living with water rationing. People living in drought areas will have to decide.