It comes down to chaos theory, which came about thanks to early weather simulations.
In the early 1960s, meteorologist Edward Lorenz was using a computer to run a simple mathematical model of weather patterns. At one point he restarted a simulation partway through by typing in values from an earlier printout, and the new results diverged wildly from the original run. The reason: the computer carried its calculations to six digits past the decimal point (1.000000), but the printout rounded them to three (1.000). Minute as it was, that difference was enough to throw off the model.
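You can see the same effect with a few lines of Python. The sketch below is not Lorenz's original program; it integrates the simpler three-variable system he later published (with the standard parameters) twice, once from a starting point carried to six decimal places and once from the same point rounded to three. The specific starting values here are made up for illustration.

# A minimal sketch of the kind of experiment described above: two runs of the
# three-variable Lorenz system from starting points that differ only in the
# digits beyond the third decimal place.

def lorenz_step(x, y, z, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz equations (crude, but enough to show divergence).
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(x, y, z, steps, dt):
    trajectory = [(x, y, z)]
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z, dt)
        trajectory.append((x, y, z))
    return trajectory

steps, dt = 8000, 0.005
full = run(1.000127, 1.000127, 1.000127, steps, dt)   # "six digits past the decimal"
rounded = run(1.000, 1.000, 1.000, steps, dt)         # same values rounded to three

for i in range(0, steps + 1, 2000):
    gap = sum((a - b) ** 2 for a, b in zip(full[i], rounded[i])) ** 0.5
    print(f"t = {i * dt:4.0f}: separation between runs = {gap:.6f}")

At first the two runs are practically indistinguishable; before long they bear no resemblance to each other, which is exactly what Lorenz saw on his printouts.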
Lorenz later presented his discovery in a 1972 talk titled "Does the flap of a butterfly's wings in Brazil set off a tornado in Texas?", which popularized the term "butterfly effect." Although his model was relatively simple,* even the most minute change in the starting conditions would eventually have major consequences. This meant that accurate long-range weather prediction would be impossible, because there is no way to take every contributing factor into account with perfect precision.
Computer technology has improved tremendously since then, and some of the most powerful computers in the world are now used for weather modeling. This has improved forecasting, but no model can take everything into account, so accuracy still can't be guaranteed.
*His model used twelve variables to predict overall weather patterns. For comparison, the formula currently used for calculating the heat index uses eleven variables.