GSN Data Quality

A key GSN goal has always been recording high-quality seismic waveforms. However, aging GSN equipment and the discovery of GSN data problems from 2004-2014 motivated a more focused and holistic approach to GSN data quality than previously implemented. The discovery of these problems re-energized a focus on data quality metrics, and their continuing resolution places the GSN at the forefront of data quality best practices. This effort spans all major operational aspects: instrumentation; routine monitoring and analysis of waveform data to detect and track problems; action plans that address station quality issues as they are found; and operational transparency of data quality issues for users. The following pages present a brief history of the issues that motivated the GSN's current focus on quality and detail the GSN's approach to achieving improved and assured data quality.

Recent History

In the mid-2000s, there were indications of problems with the long-period response of the STS-1 sensors used in the GSN. These issues were identified by the Waveform Quality Center (WQC) at the Lamont-Doherty Earth Observatory (LDEO) and were observed as scaling problems when performing their ongoing global centroid moment tensor analysis (Ekström et al., 2006). In 2010, the WQC issued a series of 10 reports detailing performance issues observed at 10 different GSN stations, each spanning the period from station installation to the time of the report. Problems associated with degradation of the STS-1 sensors were observed, along with maintenance issues such as high noise levels and errors in channel polarity, orientation, and metadata accuracy. Two studies published by the IDA group in 2005 and 2007 also indicated there were ongoing issues related to metadata inaccuracies. This latter work focused primarily on observations of radial normal modes and tides (Davis et al., 2007).

EarthScope Consortium (then IRIS) responded to the quality issues by organizing a Quality Assessment Team (QAT) to review the state of data quality policies, practices, and procedures across all of EarthScope's observing activities. EarthScope also appointed a GSN Data Quality Panel, with outside membership, to review GSN QC issues and the GSN's response. The QAT assembled a comprehensive set of materials documenting QC practices and procedures, and the Data Quality Panel reviewed these materials and provided feedback. The GSN went on to develop a Concept of Operations as well as an Implementation Plan for the GSN Quality Assurance System that is currently in place. The GSN has also developed a Data Quality Goals document that guides the current GSN quality objectives and is intended to be updated at regular intervals. For more information, please see the GSN Data Quality Initiative page.

As a result of the GSN Data Quality Initiative, EarthScope, in conjunction with the USGS and IDA:

  • Deployed new STS-1 feedback electronics boxes (FBEs)
  • Implemented a new calibration policy, including guidelines for when metadata are updated based on calibration results
  • Deployed Next Generation Systems (NGS) to overcome limitations and aging issues of the legacy GSN station hardware
  • Developed a guide to sensor orientation best practices and onsite “absolute” calibrations
  • Developed and implemented the Data Quality Assessment (DQA) tool at the ASL Data Collection Center
  • Implemented the MUSTANG tool at the DMC for computing quality metrics
  • Implemented the LASSO tool for aggregating MUSTANG data metrics into actionable reports
  • Implemented a problem-tracking system at ASL to integrate QC and field-engineering efforts
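The MUSTANG metrics mentioned above are exposed through a public web-service interface at the DMC. As a minimal sketch of how a user might retrieve one metric for one GSN channel, the snippet below assembles a query URL for the MUSTANG measurements service (the endpoint and parameter names reflect the publicly documented service; the specific metric, station, and time window are illustrative assumptions, not values from this page):

```python
# Sketch: building a query against the MUSTANG measurements web service
# hosted at the EarthScope/IRIS DMC. Endpoint and parameter names are
# based on the public service documentation; values are illustrative.
from urllib.parse import urlencode

MUSTANG_BASE = "https://service.iris.edu/mustang/measurements/1/query"

def build_metric_query(metric, network, station, channel, start, end):
    """Return a MUSTANG measurements URL for one metric on one channel."""
    params = {
        "metric": metric,        # e.g. "percent_availability", "sample_rms"
        "network": network,      # e.g. "IU" for GSN stations operated by ASL
        "station": station,
        "channel": channel,
        "timewindow": f"{start},{end}",
        "format": "text",        # plain-text output for easy parsing
    }
    return f"{MUSTANG_BASE}?{urlencode(params)}"

url = build_metric_query("percent_availability", "IU", "ANMO", "BHZ",
                         "2020-01-01", "2020-02-01")
print(url)
# The resulting URL can be fetched with urllib.request.urlopen(url),
# returning one measurement row per channel-day in the requested window.
```

Tools such as DQA and LASSO aggregate many such per-channel measurements into station-level reports, which is what makes the metrics actionable for network operators.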

These efforts have had a positive impact on both data and metadata quality and have improved disclosure of quality-related data metrics to the user community. In a follow-up paper in 2012, the IDA group employed the same methodology as their earlier studies, and the results indicated a measurable improvement in metadata accuracy for those sensors whose response was checked using the new quality assessment tools. Furthermore, Gee et al. (2014) showed the positive impact on data quality realized by recent upgrades of seismometers, feedback electronics boxes, and data loggers, as well as by improved calibration procedures and policies. They also showed that, with careful analysis of the data and station records, it is possible to correct metadata for some historical station epochs.

Station by Station Data Quality Snapshot Tables