Abstract

The formulation and calibration of models are vital tools for probing and predicting the behavior of marine ecosystems. This ability may be compromised, however, if the calibration data set is subject to significant spatial variability between samples that is not resolved in the model. We propose that some of this variability might be accounted for by variable time lags between sampled water masses that are otherwise assumed to follow a common pattern of ecosystem variability (a dynamical trajectory). Using twin tests, in which models are fitted to simulated data sets, we show that realistic levels of meso/sub-mesoscale variability in time lags may significantly distort the parameter fits obtained by standard methods that do not account for it. The distortion tends to 'smooth out', or underestimate, the magnitude of temporal variability within sampled water masses, reducing the accuracy and robustness of biological parameter estimates and of functions thereof (e.g. gross primary production). A new method of model fitting is shown to avoid these effects, yielding improved estimates over a broad range of spatial time-lag variability and measurement noise levels, provided the time-lag variance can be accurately estimated; we also suggest a method for this estimation.
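
As an illustration of the distortion described above, the following minimal Python sketch simulates a twin-test data set in which stations sample a common trajectory offset by random time lags, then compares a standard least-squares fit at the nominal sampling times with a simple lag-aware fit. The Gaussian pulse trajectory, its parameter values, and the lag and noise levels are assumptions for this sketch only; it is not the model or fitting method used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Illustrative "dynamical trajectory": a Gaussian, bloom-like pulse.
# Functional form and parameter values are assumptions for this sketch.
def model(t, peak_time, width, amplitude, background):
    return background + amplitude * np.exp(-0.5 * ((t - peak_time) / width) ** 2)

true_params = (30.0, 8.0, 2.0, 0.5)

# Twin-test data: every station samples the same trajectory, but each water
# mass is offset by a random time lag (meso/sub-mesoscale variability) and
# each measurement carries observation noise.
n_stations = 40
nominal_times = np.linspace(0.0, 60.0, n_stations)
lag_sd = 5.0      # assumed time-lag standard deviation (days)
noise_sd = 0.1    # assumed measurement noise level
lags = rng.normal(0.0, lag_sd, n_stations)
obs = model(nominal_times + lags, *true_params) \
    + rng.normal(0.0, noise_sd, n_stations)

# Standard fit: least squares against the nominal sampling times, ignoring
# lags. This tends to recover a wider, lower-amplitude ("smoothed") pulse.
p0 = [25.0, 10.0, 1.5, 0.4]
popt_std, _ = curve_fit(model, nominal_times, obs, p0=p0)

# One simple lag-aware alternative: fit the expected value of the model under
# the assumed Gaussian lag distribution. For a Gaussian pulse this expectation
# has a closed form (lag variance adds to the pulse width; amplitude rescales).
def lag_aware_model(t, peak_time, width, amplitude, background):
    eff_width = np.sqrt(width ** 2 + lag_sd ** 2)
    return background + amplitude * (width / eff_width) * np.exp(
        -0.5 * ((t - peak_time) / eff_width) ** 2
    )

popt_lag, _ = curve_fit(lag_aware_model, nominal_times, obs, p0=p0)

print("true parameters:", true_params)
print("ignoring lags  :", np.round(popt_std, 2))
print("lag-aware fit  :", np.round(popt_lag, 2))
```

In this sketch the standard fit recovers a wider, lower-amplitude pulse, mirroring the 'smoothing' bias described above, while the lag-aware fit recovers parameters closer to the truth when the lag variance is well estimated.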
