
MADIS Meteorological Surface Quality Control Checks

The level 1 validity checks require each observation to fall within a set of tolerance limits specified in the Technique Specification Package (TSP; see References), while the level 2 temporal consistency checks restrict the temporal rate of change of each observation to a separate set of TSP-specified tolerance limits. In both cases, observations falling outside the limits are flagged as failing the respective QC check. The following tables list the tolerance limits; a sketch of how these range and rate-of-change checks might be applied follows the tables.


  -----------------------------------------
  Validity Checks
  -----------------------------------------
  Dewpoint temperature       -90 -   90   F
  Relative humidity            0 -  100  %
  Relative humidity 1hr chng -50 -   50  %
  Altimeter                  568 - 1100  mb
  Altimeter 1hr change       -10 -   10  mb
  Pressure change              0 - 30.5  mb
  Sea level pressure         846 - 1100  mb
  Station pressure           568 - 1100  mb
  Air temperature            -60 -  130   F
  Air temperature 1hr change -35 -   35   F
  Wind Direction               0 -  360  deg
  Wind Speed                   0 -  250  kts
  Visibility                   0 -  100 miles
  Accumulated precip - *h      0 -   44  in
  Precipitation rate           0 -   44  in
  Soil moisture percent        0 -  100  %
  Soil temperature           -40 -  150   F
  Wind dir at gust             0 -  360  deg
  Wind gust                    0 -  287  mph
  24 hour min temperature    -60 -  130   F
  24 hour max temperature    -60 -  130   F
  Wind dir at hourly max       0 -  360  deg
  Wind speed                   0 -  287  mph
  Hourly maximum wind speed    0 -  287  mph
  Snow cover                   0 -   25  ft
  Snow fall - 6h               0 -   50  in
  Snow fall - 24h              0 -  300  in
  Sea surface temperature   28.4 -  104   F

  ---------------------------------------------
  Temporal Consistency Checks
  ---------------------------------------------
  Dewpoint temperature             35  F/hour
  Sea level pressure               15  mb/hour
  Air temperature                  35  F/hour
  Wind speed                       20 kts/hour
  Soil temperature                  5  F/hour
  Sea surface temperature           9  F/hour
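
As an illustration only, the range and rate-of-change tests described above might be coded along the following lines; the limits are copied from the tables, but the function and variable names are hypothetical and not part of MADIS:

  # Hypothetical sketch of the level 1 validity and level 2 temporal
  # consistency checks, using a few of the tolerance limits listed above.

  # (min, max) validity limits, in the units shown in the table
  VALIDITY_LIMITS = {
      "air_temperature_F": (-60.0, 130.0),
      "dewpoint_F":        (-90.0,  90.0),
      "relative_humidity": (0.0,   100.0),
      "wind_speed_kts":    (0.0,   250.0),
  }

  # maximum allowed absolute change per hour
  TEMPORAL_LIMITS = {
      "air_temperature_F": 35.0,   # F/hour
      "dewpoint_F":        35.0,   # F/hour
      "wind_speed_kts":    20.0,   # kts/hour
  }

  def validity_check(name, value):
      """Return True if the observation passes the level 1 range check."""
      lo, hi = VALIDITY_LIMITS[name]
      return lo <= value <= hi

  def temporal_consistency_check(name, value, previous_value, hours_elapsed):
      """Return True if the hourly rate of change passes the level 2 check."""
      rate = abs(value - previous_value) / hours_elapsed
      return rate <= TEMPORAL_LIMITS[name]

  # Example: a 40 F rise in one hour fails the temporal consistency check.
  print(validity_check("air_temperature_F", 72.0))                         # True
  print(temporal_consistency_check("air_temperature_F", 72.0, 32.0, 1.0))  # False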

The level 2 internal consistency checks enforce reasonable meteorological relationships among observations measured at a single station. For example, a dewpoint temperature observation must not exceed the air temperature observation made at the same station; if it does, both the dewpoint and the temperature observation are flagged as failing their internal consistency check. The pressure internal consistency checks include a comparison of the reported 3-hour pressure change at each station with the difference between the current station pressure and the station pressure from three hours earlier, and a comparison of the reported sea-level pressure with a sea-level pressure estimated from the station pressure and the 12-hour mean surface temperature. In the former check, if the reported 3-hour pressure change does not match the calculated value, only the reported observation is flagged as bad. In the latter check, however, if the reported sea-level pressure does not match the calculated value, both the sea-level pressure and station pressure observations are flagged as failing.
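
A minimal sketch of these internal consistency tests is given below; the tolerance values, function names, and sign conventions are illustrative assumptions rather than MADIS specifications:

  # Illustrative sketch of the internal consistency checks described above.
  # Flag values: True = passed, False = failed.

  def dewpoint_consistency(temp_f, dewpoint_f):
      """Dewpoint must not exceed air temperature; both obs are flagged on failure."""
      ok = dewpoint_f <= temp_f
      return {"air_temperature": ok, "dewpoint": ok}

  def pressure_change_consistency(reported_change_mb, p_now_mb, p_3h_ago_mb,
                                  tolerance_mb=0.5):
      """Compare the reported 3-hour pressure change against the computed change.

      Only the reported pressure-change observation is flagged on failure.
      The tolerance and sign convention here are assumptions.
      """
      computed_change = p_now_mb - p_3h_ago_mb
      ok = abs(reported_change_mb - computed_change) <= tolerance_mb
      return {"pressure_change": ok}

  def slp_consistency(reported_slp_mb, estimated_slp_mb, tolerance_mb=2.0):
      """Compare the reported sea-level pressure with the value estimated from
      the station pressure and 12-hour mean temperature (estimation not shown).
      Both the sea-level and station pressure observations are flagged on failure.
      """
      ok = abs(reported_slp_mb - estimated_slp_mb) <= tolerance_mb
      return {"sea_level_pressure": ok, "station_pressure": ok}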

The level 2 statistical spatial consistency check uses weekly QC statistics to mark observations as failed if they failed any QC check 75% of the time during the previous 7 days. These observations continue to be marked as failed by this check until the failure rate falls below 25% in the weekly statistics. This check is performed only on observation types that also go through the level 3 spatial consistency check.
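
The 75%/25% thresholds act as a simple hysteresis. A hypothetical sketch of that logic follows; the state tracking and names are assumptions, not the MADIS implementation:

  # Hypothetical sketch of the statistical spatial consistency (hysteresis) logic.

  def statistical_spatial_flag(currently_failed, failures_last_7_days,
                               checks_last_7_days):
      """Return True if the observation should be marked as failed by this check.

      A station enters the failed state when its 7-day failure rate reaches 75%,
      and leaves it only when that rate drops below 25%.
      """
      if checks_last_7_days == 0:
          return currently_failed
      rate = failures_last_7_days / checks_last_7_days
      if currently_failed:
          return rate >= 0.25   # stay failed until the rate falls below 25%
      return rate >= 0.75       # enter the failed state at 75%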

The level 3 spatial consistency (or "buddy") check is performed using an Optimal Interpolation (OI) technique developed by Belousov et al. (1968). At each observation location, the difference between the measured value and the value analyzed by OI is computed. If the magnitude of the difference is small, the observation agrees with its neighbors and is considered correct. If the difference is large, however, either the observation being checked or one of the observations used in the analysis is bad. To determine which is the case, a reanalysis at the observation location is performed, eliminating one neighboring observation at a time. If successively eliminating each neighbor never produces an analysis that agrees with the target observation (the observation being checked), the target observation is flagged as bad. If eliminating one of the neighboring observations does produce an analysis that agrees with the target observation, the target observation is flagged as "good" and that neighbor is flagged as "suspect." Suspect observations are not used in subsequent OI analyses. A sketch of this reanalysis procedure is given below.
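
A hypothetical leave-one-out sketch of the reanalysis logic is shown below; the analyze() placeholder stands in for the OI analysis, and the threshold handling and all names are assumptions rather than the MADIS code:

  # Hypothetical sketch of the level 3 "buddy" check reanalysis procedure.
  # analyze(neighbors) stands in for an OI analysis to the target location;
  # a real implementation would weight neighbors by distance, elevation,
  # and error statistics as described in the text.

  def analyze(neighbors):
      """Placeholder OI analysis: here, simply the mean of the neighbor values."""
      return sum(value for _, value in neighbors) / len(neighbors)

  def buddy_check(target_value, neighbors, threshold):
      """Return (target_flag, suspect_station) for one target observation.

      neighbors: list of (station_id, value) pairs chosen by the sector search.
      threshold: error limit for the analyzed-minus-observed difference.
      """
      if abs(analyze(neighbors) - target_value) <= threshold:
          return "good", None                    # agrees with its neighbors

      # Re-analyze, withholding one neighbor at a time.
      for i, (station_id, _) in enumerate(neighbors):
          subset = neighbors[:i] + neighbors[i + 1:]
          if subset and abs(analyze(subset) - target_value) <= threshold:
              return "good", station_id          # that neighbor is suspect

      return "bad", None                         # no single neighbor explains the failure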

To improve the performance of the OI, RSAS analysis fields from the previous hour are used as background grids. The analyses provide an accurate 1-h persistence forecast and allow the incorporation of previous surface observations, thus improving temporal continuity near stations that report less frequently than hourly. The differences between the observations and the background are calculated and then interpolated to each observation point before the OI analysis is performed. In addition, uniform distribution of the neighboring observations used in the spatial consistency check is guaranteed (whenever possible) by a search algorithm which locates the nearest observation in each of eight directional sectors distributed around the target observation.
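
As an illustration of the sector search, the sketch below picks the nearest observation in each of eight 45-degree sectors around the target; the planar geometry and function names are simplifying assumptions based on the description above:

  import math

  # Hypothetical sketch of the eight-sector neighbor search. Planar x/y
  # coordinates are used here for simplicity; a real implementation would
  # work with geographic coordinates.

  def sector_search(target_xy, candidates):
      """candidates: list of (station_id, (x, y)). Returns up to 8 neighbor IDs."""
      tx, ty = target_xy
      nearest = {}                      # sector index -> (distance, station_id)
      for station_id, (x, y) in candidates:
          dx, dy = x - tx, y - ty
          if dx == 0 and dy == 0:
              continue                  # skip the target itself
          sector = int(((math.degrees(math.atan2(dy, dx)) + 360.0) % 360.0) // 45)
          dist = math.hypot(dx, dy)
          if sector not in nearest or dist < nearest[sector][0]:
              nearest[sector] = (dist, station_id)
      return [station_id for _, station_id in nearest.values()]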

Temperature observations are converted to potential temperature before application of the spatial consistency check. Potential temperature varies more smoothly over mountainous terrain when the boundary layer is relatively deep and well mixed, a marked advantage during daytime hours. For example, potential temperature gradients associated with fronts tend to be well defined during the day even in mountainous terrain (Sanders and Doswell 1995). Unfortunately, this advantage often disappears at night when cool air pools in valleys. To improve the efficacy of the spatial consistency check in these circumstances, elevation differences are incorporated to help model the horizontal correlation between mountain stations (Miller and Benjamin 1992). The error threshold (to which the absolute value of the difference between analyzed and observed values is compared) is a function of the forecast error, the observational measurement error, and the expected analysis error (Belousov et al. 1968, p. 128).
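
For reference, the standard Poisson relation used to convert temperature to potential temperature is sketched below; the 1000 mb reference pressure and the dry-air exponent of roughly 0.2857 are standard values, and the function itself is an illustration rather than the MADIS routine:

  # Standard conversion of temperature to potential temperature (Poisson's equation):
  #   theta = T * (p0 / p) ** (Rd / cp), with T in kelvin and pressures in mb.

  P0_MB = 1000.0          # reference pressure
  RD_OVER_CP = 0.2857     # dry-air gas constant over specific heat at constant pressure

  def potential_temperature_k(temp_k, pressure_mb):
      """Potential temperature (K) from temperature (K) and station pressure (mb)."""
      return temp_k * (P0_MB / pressure_mb) ** RD_OVER_CP

  # Example: 288.15 K (59 F) at 850 mb gives roughly 302 K of potential temperature.
  print(round(potential_temperature_k(288.15, 850.0), 1))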

*It should be noted that while the QC checks discussed here are generally applied to the form of the variable stored in the database, the QC results will also be applied to any forms of the variable that are requested by the user and are derived from the primary variable. For example, specific humidity will get the QC results from the checks applied to dewpoint temperature.

Station Monitoring

For surface data only (at this time), the MADIS processing also keeps statistics on the frequency and magnitude of the observational errors encountered for NWS sea-level pressure, potential temperature, dewpoint, and surface wind. At the completion of each hourly analysis, the system provides the total number of observations for each variable, the number of observations that failed the QC check, the station names for the failed observations, and the error and threshold values for each of the failed observations. The error is defined as the difference between the QC analysis value and the observed value, as computed in the spatial consistency check described above.

Statistics are calculated for all stations, and stations from different networks are kept statistically separate. Specifically, the following stratifications are currently maintained: "ASOS," "SAO" (METAR manual), "AUTO" (METAR automated, but not ASOS), "BUOY," and "NPN" (NOAA Profiler Network). Local mesonets are stratified by provider; for example, "CDOT" for the Colorado Department of Transportation.

Current hourly, daily, weekly, and monthly QC messages are available for the various surface observing networks.

Subjective Intervention

Two text files, a "reject" list and an "accept" list, provide the capability to subjectively override the results of the automated QC checks. The reject list is a list of stations and associated input observations that will be labeled as bad, regardless of the outcome of the QC checks; the accept list is the corresponding list of stations that will be labeled as good, regardless of the outcome of the QC checks. In both cases, observations associated with the stations in the lists can be individually flagged. For example, the wind observations at a particular station may be added to the reject list, but not the temperature observations.

QC and station monitoring procedures are not affected by subjective intervention lists, with the sole exception that observations on the reject list will be labeled as "suspect" and not used to check the spatial consistency of neighboring observations. This will allow MADIS personnel to continue to monitor the performance of the stations contained in the lists. For example, a station with wind observations that fail the QC checks a large percentage of the time may be added to the reject list. However, once the observation failure rate at the station falls back to near zero (possibly due to an anemometer that has been repaired), the station will likely be deleted from the list.
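
A hypothetical sketch of how such an override might be applied to the QC data descriptor is shown below; the list format, station and variable names, and descriptor handling are assumptions for illustration, not the MADIS file formats:

  # Hypothetical reject/accept override applied after the automated checks.
  # Each list maps a station ID to the set of variables it covers
  # ("ALL" covers every variable at that station). Station IDs are made up.

  REJECT_LIST = {"KXYZ": {"wind_speed", "wind_direction"}}
  ACCEPT_LIST = {"KABC": {"ALL"}}

  def apply_subjective_lists(station_id, variable, descriptor):
      """Override the automated QC data descriptor for one observation.

      Returns "B" (subjective bad) or "G" (subjective good) when the
      observation is covered by a list; otherwise the automated descriptor.
      """
      rejected = REJECT_LIST.get(station_id, set())
      accepted = ACCEPT_LIST.get(station_id, set())
      if variable in rejected or "ALL" in rejected:
          return "B"
      if variable in accepted or "ALL" in accepted:
          return "G"
      return descriptor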

QC Data Structures

The MADIS QC information available for each variable includes three QC structures: a single-character "data descriptor," which gives an overall opinion of the quality of each observation by combining the information from the various QC checks; a "QC applied" bitmap, which indicates which QC checks were applied to each observation; and a "QC results" bitmap, which indicates the results of those checks. The two bitmaps are intended for users desiring detailed information.

The following table provides a complete list of the data descriptors and the bits used in the bitmaps:

  ------------------------------
  MADIS QC Information - Surface
  ------------------------------

  QC Data Descriptor Values
  -------------------------

  No QC available:

   Z - Preliminary, no QC

  Automated QC checks:

   C - Coarse pass, passed level 1
   S - Screened, passed levels 1 and 2
   V - Verified, passed levels 1, 2, and 3
   X - Rejected/erroneous, failed level 1
   Q - Questioned, passed level 1, failed 2 or 3

       where level 1 = validity
             level 2 = internal consistency, temporal consistency,
                       statistical spatial consistency checks
             level 3 = spatial consistency check

  Subjective intervention:

   G - Subjective good
   B - Subjective bad

  Interpolated/Corrected observations:

   T - Virtual temperature could not be calculated, air temperature passing all QC
       checks has been returned

  Bitmask for QC Applied and QC Results
  -------------------------------------

   Bit       QC Check                              Decimal Value
   ---       --------                              -------------
    1        Master Check                                 1
    2        Validity Check                               2
    3        Reserved                                     4
    4        Internal Consistency Check                   8
    5        Temporal Consistency Check                  16
    6        Statistical Spatial Consistency Check       32
    7        Spatial Consistency Check                   64
    8        Reserved                                   128
    9        Reserved                                   256
   10        Reserved                                   512

The QC bitmask is used in the QC applied and QC result "words" returned along with the QC data descriptor. By examining the individual bits, the user can determine which checks were actually applied, and the pass/fail status of each check that was applied.

In the QC applied word, a bit value of 1 means the corresponding check was applied, a bit value of 0 indicates the check wasn't applied.

In the QC results word, a bit value of 1 means the corresponding check was applied and failed, a bit value of 0 indicates the check passed (given that the check was applied).

The "Master Check" is used to summarize all of the checks in a single bit. If any check at all was applied, this bit will be set in the QC applied word. If the observation failed any QC check, it will be set in the QC results word.

When read as a decimal number, the bitmask is the sum of the decimal values of the bits that are set. For example, a QC applied value of 67 should be interpreted as 1 + 2 + 64, meaning the master check bit is set and the validity and spatial consistency checks were applied.
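
A minimal sketch of decoding these words is given below; the bit names and decimal values follow the table above, while the function names are illustrative:

  # Decode the QC applied / QC results words using the bit values listed above.

  QC_BITS = {
      1:  "Master Check",
      2:  "Validity Check",
      8:  "Internal Consistency Check",
      16: "Temporal Consistency Check",
      32: "Statistical Spatial Consistency Check",
      64: "Spatial Consistency Check",
  }

  def decode_qc_word(word):
      """Return the names of the checks whose bits are set in a QC word."""
      return [name for value, name in QC_BITS.items() if word & value]

  def qc_summary(qc_applied, qc_results):
      """Pair each applied check with its pass/fail status."""
      return {name: ("failed" if (qc_results & value) else "passed")
              for value, name in QC_BITS.items() if qc_applied & value}

  # Example from the text: 67 = 1 + 2 + 64.
  print(decode_qc_word(67))   # ['Master Check', 'Validity Check', 'Spatial Consistency Check']
  print(qc_summary(67, 0))    # all three applied checks passed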


References

Belousov, S.L., L.S. Gandin, and S.A. Mashkovich, 1968: Computer Processing of Current Meteorological Data. Ed. V. Bugaev. Meteorological Translation No. 18, 1972, Atmospheric Environment Service, Downsview, Ontario, Canada, 227 pp.

Miller, P.A., and S.G. Benjamin, 1992: A system for the hourly assimilation of surface observations in mountainous and flat terrain. Mon. Wea. Rev., 120, 2342-2359.

Sanders, F., and C.A. Doswell III, 1995: A case for detailed surface analysis. Bull. Amer. Meteor. Soc., 76, 505-521.

Technique Specification Package 88-21-R2 For AWIPS-90 RFP Appendix G Requirements Numbers: Quality Control Incoming Data, 1994. AWIPS Document Number TSP-032-1992R2, NOAA, National Weather Service, Office of Systems Development.


Last updated 15 March 2017