Will it rain, or shine?

Weather forecasting is a scientific technique for predicting the conditions of the atmosphere at a given location and time through the application of the principles of physics, supplemented by a variety of statistical and empirical methods. More broadly, it also encompasses the study of changes on the Earth's surface caused by ice cover, snow, tides and floods. Weather forecasts serve a variety of end uses. Weather warnings are important because they protect life and property. Forecasts of temperature and precipitation matter to agriculture, and therefore to traders in commodity markets. Utility companies use temperature forecasts to estimate demand over the coming days. On an everyday basis, people use forecasts to decide what to wear. Since outdoor activities are severely curtailed by heavy rain, snow and wind chill, forecasts help people plan around such events and prepare for them. It is no wonder, then, that in 2009 the US spent $5.1 billion on weather forecasting.

The two men credited with the birth of forecasting as a science were Royal Navy officer Francis Beaufort and his protégé Robert FitzRoy, both influential figures in British naval and governmental circles. Though ridiculed in the press at the time, their work gained scientific credence, was accepted by the Royal Navy, and forms the basis of today's weather forecasting.

The process of weather forecasting is still carried out in essentially the same way as it was by our forebears: by making observations and predicting changes. Modern instruments for measuring temperature, pressure, wind velocity and humidity are obviously far better and yield remarkable results, yet even the most sophisticated numerical calculations run on a supercomputer still require a set of measurements of the initial conditions of the atmosphere.

The assertion that "He alone knows" is absolute, because uncertainty is an inherent law of nature. Since one cannot have thunder without lightning, polytheistic religions presumably welded the "thunder god" and the "storm god" into a single god of weather.

One cannot know things exactly, even in one's chosen field of expertise. Observations are critical to weather prediction, and yet the atmosphere is truly a random laboratory. The inaccuracy of forecasting is due to the chaotic nature of the atmosphere. Forecasts also become less accurate as the difference between the current time and the time for which the forecast is being made (the range of the forecast) increases: a forecast made close to a given day is more reliable than one made long in advance.

There has, however, been a significant improvement in precision since the mid 20th century. Digital computers have made it possible to calculate changes in atmospheric conditions mathematically and objectively, in such a way that anyone can obtain the same result from the same initial conditions. The widespread adoption of numerical weather prediction models has brought a whole new group of players, computer specialists and experts in numerical processing and statistics, to work alongside atmospheric scientists and meteorologists. The basic idea of numerical weather prediction is to sample the state of the fluid at a given time and use the equations of fluid dynamics and thermodynamics to estimate the state of the fluid at some time in the future. The main inputs from country-based weather services are surface observations from automated weather stations at ground level over land and from weather buoys at sea.

Moreover, the enhanced capability to process and analyze weather data has stimulated the long-standing interest of meteorologists in securing more observations of greater accuracy. Technological advances since the 1960s have led to a growing reliance on remote sensing, particularly the gathering of data with specially instrumented Earth-orbiting satellites. By the late 1980s, weather forecasts were largely based on the output of numerical models integrated on high-speed supercomputers, except for some shorter-range predictions, particularly those related to local thunderstorm activity, which were made by specialists directly interpreting radar and satellite measurements. By the early 1990s a network of next-generation Doppler weather radar (NEXRAD) was largely in place in the United States, allowing meteorologists to predict severe weather events with additional lead time. During the late 1990s and early 21st century, computer processing power also increased, which allowed weather bureaus to produce more sophisticated ensemble forecasts.
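
To make the idea of numerical prediction concrete, here is a minimal sketch in Python. It is not an operational forecast model; instead it steps forward the Lorenz (1963) convection equations, a standard toy stand-in for the atmosphere, and the function names (lorenz_step, forecast) and parameter values are illustrative assumptions only. Running the same integration from two almost identical initial states shows why forecast accuracy decays with range, and why ensembles of slightly perturbed runs are useful.

# A minimal sketch of numerical prediction: sample an initial state,
# then step the governing equations forward in time. The Lorenz (1963)
# system is used here as a toy stand-in for the real atmosphere.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one simple Euler time step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def forecast(state, steps):
    """Integrate forward from an observed initial state."""
    x, y, z = state
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

truth = forecast((1.0, 1.0, 1.0), 2000)           # "perfect" initial observation
perturbed = forecast((1.000001, 1.0, 1.0), 2000)  # tiny measurement error

print("truth:    ", truth)
print("perturbed:", perturbed)   # the two trajectories diverge markedly

The tiny difference of one part in a million in the starting point grows until the two "forecasts" bear little resemblance to each other, which is exactly the loss of skill with forecast range described above.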

Qudsia Gani is Faculty (Physics), Cluster University Srinagar