(Un)certainty in hydrological modeling: Analysis tools
16 important questions on (Un)certainty in hydrological modeling: Analysis tools
Assume we could know all states and all fluxes of everything: would we then be able to exactly predict what happens in the next time fraction? Is the world deterministic or probabilistic?
Deterministic: if everything is known, we can predict everything (min 5)
Currently the world seems probabilistic (Lieke): we do not know everything (and never will), and multiple outcomes appear possible from the same state. So we cannot model everything deterministically.
What is Heisenberg's uncertainty principle?
One cannot know both the position and the momentum of a particle with arbitrary precision at the same time: below a certain scale, uncertainty is fundamental rather than a lack of knowledge.
Name 3 main sources of uncertainty in hydrological sciences?
- Data uncertainty
- Parameter uncertainty
- Model structural uncertainty
Parameter uncertainty: name several important concepts/definitions occurring here
- Equifinality
- Generalized Likelihood Uncertainty Estimation (GLUE)
Causes: measurement device issues, scaling issues, parameters without a direct physical representation
What is equifinality, part of parameter uncertainty?
Model parameters: several parameter combinations can lead to the same model performance (and then how do you know which combination is right?); also called overparameterization, non-uniqueness, or an ill-defined problem.
Equifinality also applies to model structures, but in hydrology it is more often used for parameter uncertainty.
What do overparameterization/non-uniqueness/ill-defined problem mean? (the same concept, under different names in different fields)
How problematic is equifinality?
- The VIC model has > 30 parameters
- Usually around 5 parameters are highly sensitive; those are the ones you calibrate
- For 5,000 grid cells: 25,000 parameters (5 calibrated parameters x 5,000 grid cells)
What are the 3 ways of dealing with equifinality?
- Pareto-optimization or multiple objective functions
- Add multiple observations to calibrate/validate
- Use transfer functions (for distributed modelling)
Explain how you could use multiple observations to calibrate/validate, to deal with equifinality?
First calibrate on Q; this leaves a limited number of parameter sets, and you then check how well they do on e.g. ET, soil moisture, or tracers. By adding more information/requirements you can further determine which parameter sets make the most sense and thereby constrain your model.
Problem: model performance on Q can decrease quite a bit when adding more information. Because models are built to calculate Q, adding data on e.g. soil moisture can reveal that the internal consistency of your model (the processes it simulates) may not make sense at all.
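This constraining effect can be illustrated with a minimal sketch. The toy model below is entirely hypothetical (not from the lecture): two parameters a and b produce the same discharge Q for many combinations (equifinality), but a second observed variable (ET) responds differently, so adding it shrinks the acceptable parameter space.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-parameter toy model returning discharge Q and ET.
def model(a, b):
    Q = a * b    # discharge depends only on the product a*b: many (a, b) pairs give the same Q
    ET = a - b   # evapotranspiration responds differently and separates those pairs
    return Q, ET

q_obs, et_obs = 0.5, 0.3                # synthetic "observations"
a = rng.uniform(0, 2, 10000)
b = rng.uniform(0, 2, 10000)
Q, ET = model(a, b)

fit_q = np.abs(Q - q_obs) < 0.05                   # parameter sets that reproduce Q
fit_both = fit_q & (np.abs(ET - et_obs) < 0.05)    # ... that also reproduce ET
# Adding ET as a second observation leaves far fewer acceptable parameter sets.
```

With only Q, a whole band of (a, b) combinations is acceptable; requiring a good fit on ET as well reduces this to a small region, which is exactly how extra observations constrain an equifinal model.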
How to read the GLUE parameter uncertainty graph, and what are the 4 steps taken to get to a probability distribution?
- Dotted line: uniform prior distribution from which we sample parameter values
- NSE: calculated for each model run
- Threshold: only runs with NSE > 0.5 are considered; all runs below are discarded as non-behavioural
- A probability distribution is built from the parameter values of the runs with NSE > 0.5
The area under the curve (y-axis times x-axis bin width, summed) should equal 1, since it is a probability distribution.
The highest point is the most probable value, and the spread gives an idea of the uncertainty.
NSE, KGE, etc. are called objective functions.
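The four GLUE steps can be sketched in code. This is a minimal illustration with an assumed toy one-parameter decay model and synthetic observations, not the lecture's actual setup; the NSE formula and the behavioural threshold of 0.5 follow the steps above.

```python
import numpy as np

rng = np.random.default_rng(42)

def nse(q_obs, q_sim):
    """Nash-Sutcliffe efficiency: 1 = perfect, < 0 = worse than the observed mean."""
    return 1 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)

# Toy "model": discharge decays at a rate set by a single parameter k (assumption).
t = np.arange(50)
def model(k):
    return 10 * np.exp(-k * t)

q_obs = model(0.1) + rng.normal(0, 0.2, t.size)  # synthetic observations, true k = 0.1

# Step 1: sample parameter values from a uniform prior.
k_samples = rng.uniform(0.01, 0.5, 5000)
# Step 2: compute the objective function (NSE) for each run.
scores = np.array([nse(q_obs, model(k)) for k in k_samples])
# Step 3: keep only behavioural runs (NSE > 0.5); discard the rest.
behavioural = scores > 0.5
# Step 4: build a likelihood-weighted probability distribution over the kept values.
weights = scores[behavioural] / scores[behavioural].sum()
k_best = k_samples[behavioural][np.argmax(scores[behavioural])]
```

The weights sum to 1 (a probability distribution over the behavioural parameter values), and the best-scoring value should lie close to the true k of 0.1, with the spread of behavioural samples indicating the parameter uncertainty.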
What are the two main sources of model structural uncertainty?
- Assumptions and simplifications
- Process representation
How to deal with model structural uncertainty?
The ensemble mean generally outperforms individual ensemble members. Why is not entirely clear, but the hypothesis is that every model has processes it does(n't) account for, or some bugs, and that these errors partly cancel each other out in the ensemble.
Con: it is very time-consuming to build a model ensemble from different models.
You should be very careful about how you select the ensemble members!
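The error-cancellation hypothesis can be demonstrated with synthetic data. The "models" below are a pure assumption: a shared truth plus independent random bias and noise per member, which is the situation in which averaging cancels errors.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 6, 100))   # the (unknown) true signal

# Five hypothetical models: each has its own random noise and its own constant bias,
# mimicking members that each get some things wrong in different ways.
members = np.array([
    truth + rng.normal(0, 0.3, truth.size) + rng.uniform(-0.2, 0.2)
    for _ in range(5)
])

rmse = lambda sim: np.sqrt(np.mean((sim - truth) ** 2))
member_rmse = [rmse(m) for m in members]
ensemble_rmse = rmse(members.mean(axis=0))
# Averaging shrinks the independent errors, so the ensemble mean's RMSE
# is lower than that of the individual members.
```

Note this only works because the member errors are independent; if all models shared the same structural error, averaging would not remove it, which is one reason careful ensemble selection matters.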
Explain modular modelling frameworks.
With a modular framework we can test which part of the model structure causes a difference (instead of working with off-the-shelf models, where you don't know which hypotheses you are testing). New problems arise, such as whether you need to recalibrate after swapping a component.
You can add new building blocks to such a framework more easily than to an already existing model.
What are the main sources of uncertainty?
- Model structure uncertainty
- Parameter uncertainty
- Input data uncertainty
- Observation uncertainty
As modellers we also make many decisions on forcing data, resolution, calibration period, etc. These choices influence the sources of uncertainty within the model.
Question: predict a flood-event ....
It is really hard to make the 'right' choice!
What are the four main conclusions of this lecture series on model uncertainty?
- Models are never complete and always prone to uncertainty
- Estimating uncertainty puts results into perspective
- Models are not 'value-free' objective tools, but social constructs
- Models are prone to deep uncertainty: unknown unknowns