I thought it’d be useful at this point to describe how data are screened at BODC. We quality control data using BODC’s in-house software, Edserplo.
With sea level data, we look at both the recorded data and the residual values. The residual is the measured value minus the predicted tide value and so should largely be a reflection of the meteorological conditions. The residuals are useful for detecting instrumental faults such as timing errors, datum shifts and spikes.
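The residual calculation itself is trivial; here is a minimal sketch with made-up numbers (the real series come from the gauge record and the tidal predictions):

```python
import numpy as np

# Hypothetical hourly sea levels (metres) and the matching tidal
# predictions; values are illustrative only.
observed = np.array([2.10, 2.85, 3.40, 3.55, 3.20, 2.50])
predicted = np.array([2.05, 2.80, 3.45, 3.50, 3.25, 2.45])

# Residual = measured minus predicted. With a good prediction it should
# mostly reflect the weather (surge), so spikes, steps or drift in the
# residual point to instrumental problems rather than the tide.
residual = observed - predicted
print(residual)
```

A clean residual series varies smoothly by tens of centimetres with the weather; a sudden step suggests a datum shift, and a sawtooth pattern often betrays a timing error.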
Tidal predictions are produced at the National Oceanography Centre (NOC) in Liverpool. They’re generated from harmonic constants using NOC’s tidal analysis software.
Sometimes the available harmonic constants give poor predictions: the site may have highly nonlinear tides or unusual geography, or may be influenced by nearby estuaries. In such cases it is better to compute fresh tidal constants from recent data than to rely on historic values. This is done using Doodson harmonic analysis.
The standard procedure at BODC for the quality control of sea level data includes, where possible:
- Producing a tidal analysis and comparing the major tidal constituents (M2, S2, N2, K1, O1, Z0) with previous data series, adjacent sites and the Admiralty Tide Tables for the closest site
- Screening the series, looking for spikes, gaps, timing errors and datum shifts
- Comparing the series with previous series from the same site
- Comparing the series with data covering the same period from neighbouring sites
- Looking at other parameters, such as sea temperature and atmospheric pressure
- Comparing summary statistics, such as mean sea level, with those from previous years
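The spike screening in the steps above can be sketched as a simple robust test: flag any residual that sits far from the median of its neighbours. The window width and threshold here are illustrative, not BODC's actual criteria.

```python
import numpy as np

def flag_spikes(residual, window=5, n_mad=6.0):
    """Flag points far from the running median of their neighbours."""
    resid = np.asarray(residual, dtype=float)
    half = window // 2
    flags = np.zeros(resid.size, dtype=bool)
    for i in range(resid.size):
        lo, hi = max(0, i - half), min(resid.size, i + half + 1)
        neighbours = np.delete(resid[lo:hi], i - lo)  # exclude the point itself
        centre = np.median(neighbours)
        # Median absolute deviation, with a small floor so a flat
        # neighbourhood cannot make the threshold zero.
        mad = np.median(np.abs(neighbours - centre)) or 0.01
        flags[i] = abs(resid[i] - centre) > n_mad * mad
    return flags

resid = np.array([0.02, 0.01, 0.03, 1.50, 0.02, 0.01, 0.02])
print(flag_spikes(resid))   # only the 1.50 m jump is flagged
```

An automated pass like this only draws attention to candidate points; at BODC the decision to flag still rests with the person screening the series.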
We mark suspicious data points (and null values) with flags and note any timing errors or datum shifts. No data values are changed. The data quality is noted in accompanying documentation.
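The flag-don't-edit principle can be shown in a few lines. The flag letters below ('M' for suspect, 'N' for null) are illustrative, not BODC's actual flag scheme:

```python
import numpy as np

# Recorded values are never altered; suspect points and nulls simply
# carry a quality flag alongside the series.
values = np.array([3.21, 3.45, 9.99, 3.40, np.nan])
flags = np.array([" ", " ", "M", " ", "N"])

# Downstream users choose how to treat flagged points; the series
# itself is passed on exactly as recorded.
good = values[flags == " "]
print(good)   # the unflagged values only
```

Keeping the original values intact means a user who disagrees with a flag can always recover the measurement as recorded.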