A Weather Modeling Project/Ecosystem

in #boinc

Hurricanes are in the news these days, and they've got everyone talking about the weather.

The US media loves to go on about the "European Model" and the "American Model" as the go-to hurricane models. As I understand it, there are many more atmospheric modeling algorithms, differing in focus and mathematics. The most reliable of these are the data-driven models, and their accuracy is limited primarily by the meteorologists' access to processing power. In other words, if you want a highly accurate model you have to process a large volume of data, and that requires either a large supply of processing power or an extended length of time. Even with today's most powerful supercomputers, a highly accurate model might take longer to run than the life of the storm itself.

To make actionable models, meteorologists must trade data, and therefore accuracy, against the processing power available to them and the length of time before action is required. Some models I've come across seem to target 4 or 12 hours as an appropriate run-time. If you want a practical model, you must sacrifice data for speed.
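To make that trade-off concrete, here's a rough back-of-envelope sketch in Python. The baseline numbers are made up, and it leans on the common rule of thumb that halving the horizontal grid spacing roughly quadruples the number of cells and doubles the number of time steps -- about an 8x cost per halving -- so only the shape of the curve matters here.

```python
# Rough sketch: run-time vs. grid resolution against an "actionable" deadline.
# The baseline numbers are illustrative assumptions, not measurements.

BASELINE_SPACING_KM = 25.0   # assumed baseline horizontal grid spacing
BASELINE_HOURS = 0.5         # assumed run-time at the baseline spacing
DEADLINE_HOURS = 4.0         # run-time target mentioned above

def estimated_runtime_hours(spacing_km: float) -> float:
    """Estimate run-time when the grid is refined from the baseline.

    Refining spacing by a factor r costs ~r**2 more cells (x and y)
    and ~r more time steps (the step must shrink with the grid),
    so ~r**3 overall.
    """
    refinement = BASELINE_SPACING_KM / spacing_km
    return BASELINE_HOURS * refinement ** 3

for spacing in (25.0, 12.5, 6.25, 3.0):
    hours = estimated_runtime_hours(spacing)
    verdict = "fits" if hours <= DEADLINE_HOURS else "misses"
    print(f"{spacing:>5.2f} km grid -> ~{hours:6.1f} h ({verdict} the {DEADLINE_HOURS:.0f} h deadline)")
```

Each halving of the grid spacing costs roughly an order of magnitude of compute, which is exactly why run-time deadlines force you to throw away data.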


So in the above models we consider:

  • Various types of data input
  • Algorithms which analyze the data
  • A time-frame of action.

Imagine an ecosystem which uses grid computing to form atmospheric prediction models.

Assumptions:

  • People consider all weather modeling up to 5 days ahead important enough to volunteer their spare IPP, provided they are at least partially compensated for electricity costs.
  • The importance an individual ascribes to accurate weather modeling increases as their local atmosphere becomes more dynamic.
    • Further, as an atmosphere becomes more dynamic, individuals ascribe more importance to models that are accurate more days ahead.
  • Modeling a more static atmosphere requires relatively less data processing than modeling a more dynamic one.

Let us start with a homogeneous atmosphere -- clear skies; not a molecule out of place. Clearly, modeling this atmosphere would require little data.

Let us now assume an atmosphere of relative calmness -- day-to-day weather; a Bob Ross painting. Let's call it a Basic Atmosphere.

This atmosphere is dynamic, but not terribly so. It requires some processing power, but nothing extreme. Its accuracy is important, but not crucial. In other words, we need a low level of processing power to produce models accurate enough to plan our day-to-day activities.

Let us now assume an above-average thunderstorm -- rain, maybe some low-level flash flooding when the leaves gather under the overpass. A branch falls on your neighbor's car. Maybe it brings down a power line or two.

This atmosphere is a Dynamic Atmosphere. Its accuracy is more important than that of a Basic Atmosphere. This increases the processing required for this type of atmosphere in three ways:

  1. Models must be produced more quickly.
  2. Models must be accurate more days ahead.
  3. To increase accuracy, a meteorologist must input more data.

In other words, we need a mid-level of processing power to produce models accurate enough to plan our actions with regard to storms which might leave us without access to basic necessities or mobility for a night or two.

Let us now assume a category 5 hurricane -- Catastrophic.

This atmosphere is Hyper-Dynamic. Its accuracy is critical -- on the level of life and livelihood. Its accurate modeling requires a level of processing power that is only just becoming accessible.
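To sketch how an ecosystem might translate these regimes into demand, here's a hypothetical mapping in Python. The tier names, forecast horizons, priorities, and thresholds are all mine and purely illustrative, not anything BOINC defines.

```python
from dataclasses import dataclass

@dataclass
class Regime:
    name: str
    processing_tier: str   # how much volunteer compute the project would request
    forecast_days: int     # how far ahead accuracy is considered important
    priority: int          # scheduling priority for work units (higher = sooner)

# Hypothetical tiers matching the Basic / Dynamic / Hyper-Dynamic narrative above.
REGIMES = [
    Regime("Basic",         processing_tier="low",  forecast_days=2, priority=1),
    Regime("Dynamic",       processing_tier="mid",  forecast_days=3, priority=5),
    Regime("Hyper-Dynamic", processing_tier="high", forecast_days=5, priority=10),
]

def regime_for(dynamism: float) -> Regime:
    """Pick a regime from a 0-1 'dynamism' score (thresholds are arbitrary)."""
    if dynamism < 0.3:
        return REGIMES[0]
    if dynamism < 0.7:
        return REGIMES[1]
    return REGIMES[2]

print(regime_for(0.9))  # a Category 5 situation lands in Hyper-Dynamic
```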

So,

Atmospheric models, or weather predictions, are important because they range from helping us plan our week to saving our lives. The more dynamic the atmosphere, the more dangerous the weather. The more dangerous the weather, the more importance is given to actionable models.

The ability of the project (or network of projects) to model the real atmosphere in actionable time is determined by the importance given to weather modeling at a given moment.

For example: Irma becoming a Category 3 storm with the possibility of impacting Florida might have led to a surge in processing power to the project. This surge in processing power might have led to greater confidence in the path of the storm, which might have led to earlier evacuation warnings and given more time to prepare. Etc.

I think there are several ways something like this could be built. I think I'm going to look into what it would take to make a BOINC project which accepts data from meteorological institutions and any sensors someone might have hooked up.
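As a sketch of the ingestion side, here's roughly what a single observation record from an institution or a hobbyist sensor might look like before being packaged into work units. The field names and sanity checks are assumptions on my part, not an existing BOINC or institutional format.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class Observation:
    station_id: str        # institution code or a volunteer's sensor ID
    latitude: float
    longitude: float
    observed_at: str       # ISO 8601 timestamp, UTC
    pressure_hpa: float
    temperature_c: float
    humidity_pct: float
    wind_speed_ms: float
    wind_dir_deg: float

def validate(obs: Observation) -> bool:
    """Very loose sanity checks before an observation enters the queue."""
    return (
        -90 <= obs.latitude <= 90
        and -180 <= obs.longitude <= 180
        and 800 <= obs.pressure_hpa <= 1100
        and 0 <= obs.humidity_pct <= 100
    )

obs = Observation(
    station_id="hobbyist-0042",
    latitude=25.76, longitude=-80.19,
    observed_at=datetime.now(timezone.utc).isoformat(),
    pressure_hpa=1002.3, temperature_c=29.5,
    humidity_pct=78.0, wind_speed_ms=12.4, wind_dir_deg=110.0,
)
if validate(obs):
    print(json.dumps(asdict(obs), indent=2))
```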

What do you think?


Feel free to reach out to me on Slack if you think this is something you could help with, or if you have any specific questions.


The idea is nice, but weather prediction is a rather complex process. Let's assume you are able to get all the raw data from the weather stations, which might be a challenge on its own and a considerable amount of data. After each model run you have a data assimilation process which introduces the new measurements taken since the last run. And it is not that easy to just throw more processing power at it. The model parameters have to be set carefully. For example, you have to fulfill the CFL criterion to get a numerically stable model. That means that once you change the space resolution you have to change the time resolution accordingly. There are also parameterisations in weather models which replace otherwise very expensive computations. So if you have more processing power available you might want to replace these with a more accurate calculation, but that would mean changing the code. And the main problem might be that I am not sure how you would parallelize the model without clients waiting for other clients to finish their computations.
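To illustrate the CFL point, a small sketch with illustrative numbers (not from any real model): the maximum stable time step shrinks in step with the grid spacing, so refining the grid multiplies both the number of cells and the number of time steps.

```python
def max_stable_dt(dx_m: float, u_max_ms: float, courant: float = 0.8) -> float:
    """CFL-style bound for an explicit advection scheme: dt <= C * dx / u_max."""
    return courant * dx_m / u_max_ms

U_MAX = 60.0  # assumed maximum wind speed in m/s (hurricane-strength, illustrative)

for dx_km in (25.0, 12.5, 6.25):
    dt = max_stable_dt(dx_km * 1000.0, U_MAX)
    print(f"dx = {dx_km:5.2f} km -> dt <= {dt:6.1f} s")
```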
I think implementing weather modeling in BOINC is a big challenge.
But there might be areas other than numerical weather prediction better suited for BOINC.
Climate prediction (as climateprediction.net already does), for example, is much easier to do on BOINC. The difference with climate modeling is that the state of the atmosphere you start your computation with is not important (as it is with weather prediction). In climate modeling you run a model with a certain set of parameters until it is in an equilibrium state. Each host calculates the model with a different set of parameters. Way easier to handle in a BOINC environment than weather forecasting.
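A quick sketch of why that parameter-sweep structure maps so well onto BOINC: every parameter combination is an independent work unit, so no host has to wait on another. The parameter names and ranges below are invented for illustration.

```python
from itertools import product

# Invented example parameters for a climate-style sweep; each combination
# becomes one independent work unit that a single host runs to equilibrium.
PARAM_GRID = {
    "cloud_fraction_scale": [0.9, 1.0, 1.1],
    "ocean_mixing_depth_m": [30, 50, 70],
    "co2_ppm": [400, 560],
}

work_units = [
    dict(zip(PARAM_GRID.keys(), combo))
    for combo in product(*PARAM_GRID.values())
]

print(f"{len(work_units)} independent work units, e.g. {work_units[0]}")
```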

Over at www.electricchain.org we are trying to build a system that injects data from solar PV systems into the blockchain; we can then extract the data to build huge models of solar insolation at many locations.


We hope data like this will be useful to add many more data points to climate science models.

Love it! Using mathematical algorithms to predict weather at a faster rate would definitely be useful. And if something changes, there's enough compute power to read and decipher it; it would be applicable, but you'd just need access to that data and for it to be used by weatherologists [I know that's not the right word lol].
