A Blog by Jonathan Low


Apr 19, 2019

How 5G Interference May Put Accurate Weather Reporting At Risk

At the intersection of convenience, speed and safety. JL

Dan Maloney reports in Hackaday:

Three-day outlooks are right 90% of the time. What made this accuracy possible is supercomputers running weather modeling software. But models are only as good as the raw data they use as input, and increasingly that data comes from satellites with sensitive sensors detecting changes in winds and water vapor in real time. The people tasked with running these systems believe the quality of that data faces a threat from 5G cellular networks. Microwave radiometry can tell us what’s going on within a vertical column of the atmosphere, and the 23.8 GHz water vapor signal is in danger of picking up interference from 5G, which will use frequencies very close to that.
If the great Samuel Clemens were alive today, he might modify the famous meteorological quip often attributed to him to read, “Everyone complains about weather forecasts, but I can’t for the life of me see why!” In his day, weather forecasting was as much guesswork as anything else, reading the clouds and the winds to see what was likely to happen in the next few hours, and being wrong as often as right. Telegraphy and better instrumentation made forecasting more scientific and improved accuracy steadily over the decades, to the point where we now enjoy 10-day forecasts that are at least good for planning purposes and three-day outlooks that are right about 90% of the time.
What made this increase in accuracy possible is supercomputers running sophisticated weather modeling software. But models are only as good as the raw data that they use as input, and increasingly that data comes from on high. A constellation of satellites with extremely sensitive sensors watches the planet, detecting changes in winds and water vapor in near real-time. But if the people tasked with running these systems are to be believed, the quality of that data faces a mortal threat from an unlikely foe: the rollout of 5G cellular networks.

Where’s the Water?

To understand how a new generation of wireless technology can deleteriously impact weather forecasting, it helps to take a look at exactly what powers the weather, and what these satellites are looking at. Our weather is largely the result of differences between air masses. Pressure, temperature, and moisture, each determined by energy inputs from the Sun, all team up in a complex manner to determine where and when clouds will form and which direction the winds will come from. Remotely sensing these differences is the key to accurately forecasting the weather.
The satellites that watch our weather are largely passive sensor platforms that measure the energy reflected or emitted by objects below them. They gather data on temperature and moisture — pressure is still gauged chiefly by surface stations and by radiosondes — by looking at the planet in different wavelengths. Temperature is measured mainly in the optical wavelengths, both visible and infrared, but water vapor is a bit harder to pin down. That’s where microwaves come in, and where weather prediction stands to run afoul of the 5G rollout.

NASA’s Advanced Microwave Sounding Unit (AMSU-A1). Source: ESA
Everything on Earth – the plants, the soil, the surface water, and particularly the gases in the atmosphere – both absorbs and, to a lesser degree, emits microwave radiation. Measuring those signals from space is the business of satellites carrying microwave radiometers, essentially sensitive radio receivers tuned to microwave frequencies. By looking at the signals received at different wavelengths, and by adding in information about the polarization of the signal, microwave radiometry can tell us what’s going on within a vertical column of the atmosphere.
For water vapor, 23.8 GHz turns out to be very useful, and very much in danger of picking up interference from 5G, which will use frequencies very close to that. Since microwave radiometers are passive receivers, they’ll see pretty much everything that emits microwave energy in that range, like the thousands of cell sites that will be needed to support a full 5G rollout. Losing faint but reliable water vapor signals in a sea of 5G noise is the essential problem facing weather forecasters, and it’s one they’ve faced before.
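To get a feel for just how close these frequencies sit, here is a back-of-the-envelope sketch. The 24.25 GHz figure below is the lower edge of the FCC’s 24 GHz UMFUS band plan as commonly reported; treat it as an assumption for illustration rather than an authoritative band edge.

```python
# Rough look at the spectral gap between the 23.8 GHz water vapor
# line that passive radiometers rely on and the (assumed) lower edge
# of the 24 GHz UMFUS band auctioned for 5G.

WATER_VAPOR_LINE_GHZ = 23.8   # passive sensing frequency for water vapor
UMFUS_LOWER_EDGE_GHZ = 24.25  # assumed lower edge of the 24 GHz UMFUS band

# Guard band in MHz between the passive band and the 5G allocation
guard_band_mhz = (UMFUS_LOWER_EDGE_GHZ - WATER_VAPOR_LINE_GHZ) * 1000

# Separation as a fraction of the sensing frequency itself
fractional_sep = guard_band_mhz / (WATER_VAPOR_LINE_GHZ * 1000)

print(f"Guard band: {guard_band_mhz:.0f} MHz")
print(f"Fractional separation: {fractional_sep:.2%}")
```

Under those assumptions the gap works out to roughly 450 MHz, under 2% of the sensing frequency — which is why out-of-band emission limits on 5G transmitters, rather than the allocation itself, become the critical question.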

Real World Consequences

At the 2019 annual meeting of the American Meteorological Society, Sidharth Misra, a research engineer at NASA’s Jet Propulsion Laboratory, presented data showing how commercial enterprises can have unintended consequences for the scientific community. Between 2004 and 2007, satellite-based microwave radiometers detected an increase in noise in a curious arc across the top of the United States. A similar signal was detected by another satellite, along with huge signals returned from the waters off each coast and from the Great Lakes. The signals turned out to be reflections from geosynchronous direct-broadcast TV satellites, bouncing off the surface and swamping the water vapor signals the weather satellites were trying to measure.

Reflections from DTV satellites can effectively blind microwave radiometers. Source: AMS meeting panel discussion, “The Wizard Behind the Curtain?—The Important, Diverse, and Often Hidden Role of Spectrum Allocation for Current and Future Environmental Satellites and Water, Weather, and Climate”
But surely the scientists are overreacting, right? Can losing one piece of data from as complex a puzzle as weather prediction really have that much of an impact? Probably, yes. The water vapor data returned by microwave radiometers like the Advanced Microwave Sounding Unit (AMSU) aboard a number of weather satellites is estimated to reduce the error of weather forecasts by 17%, by far the largest contribution among dozens of other data sources.
The loss of microwave water vapor data could have catastrophic real-world consequences. In late October of 2012, as Hurricane Sandy barreled up the East Coast of the United States, forecasts showed that the storm would take a late turn to the northwest and make landfall in New Jersey. A retrospective analysis of the forecast without the microwave radiometer data showed the storm continuing in a wide arc and coming ashore in the Gulf of Maine. The availability of AMSU data five days in advance of the storm’s landfall bought civil authorities the time needed to prepare, and probably reduced the casualties caused by the “Storm of the Century”, still the deadliest storm of the 2012 season.

Superstorm Sandy would have been predicted to track into the Gulf of Maine (red) without microwave water vapor data. It actually landed in New Jersey, as predicted five days out with the satellite data (black).

Auction Time

So exactly where are we with this process? The FCC auction of licenses for the Upper Microwave Flexible Use Service (UMFUS), which offers almost 3000 licenses in the 24-GHz band, began on March 14, 2019, despite a letter from NASA Administrator Jim Bridenstine and Secretary of Commerce Wilbur Ross requesting that it be delayed. FCC Chairman Ajit Pai rejected the request, stating that there was an “absence of any technical basis for the objection.”
Will the 5G rollout negatively impact weather forecasts? It’s not clear. Licensees are required to limit out-of-band emissions, but with so many 5G sites needed to cover the intended service areas, and with the critical 23.8-GHz water vapor frequency so close to the UMFUS band, there’s not much room for error. And once the 5G cat is out of the bag, it’ll be difficult to protect that crucial slice of the microwave spectrum.
Whatever happens, it doesn’t look good for weather forecasting. The UMFUS auction proceeds apace, and has raised almost $2 billion so far. Companies willing to spend that much on spectrum will certainly do whatever it takes to realize their investment, and in the end, not only will science likely suffer, but lives may be put at risk for the sake of 5G as our toolset for predicting dangerous weather faces this new data-gathering challenge.

