Seismic Positioning Data Management (SPDM)

The company’s signature service is the provision of quality assurance / quality control (QA/QC) services to the seismic data management community, with particular emphasis placed upon the QA/QC of positioning and navigation data.  The company has been actively involved in this market sector for over 20 years, and the experience gained has resulted in the creation of the Geo Suite Tool Kit, which is used by our analysts to assess the quality and integrity of positioning data.

The services offered are broadly categorized as follows:

Seis Line – QC services

Seis Line QC refers to the services the company provides to determine the usefulness and correctness of seismic positioning data acquired on both land and marine seismic surveys. The process involves the application of our workflows and procedures to both 2D and 3D seismic positioning data.

Positioning data is frequently delivered in a range of files that are either compliant with one of the various industry data formats or entirely non-compliant, often depending upon the vintage of the data and the authority from which it originates.  Regardless, all seismic positioning data is subjected to the same general workflow using the functionality provided within the Geo Suite Tool Kit:

  • Import – Uses the comprehensive inventory of decoders provided to enable both compliant and non-compliant positioning data to be imported to the Geo Suite project.  Even the most esoteric .txt files can be imported given enough time and patience!
  • View – Next, the data is viewed to confirm that its general appearance is as expected. Viewing is best achieved when the data can be visualized against other available data sets, such as block boundaries, digital imagery and asset data (platforms and pipelines), to name but a few.  Does the data fall within a concession block that the company has a license to operate within?
  • Analyze – The positioning data is subjected to a broad range of statistical analyses, tested against a series of user-defined tolerance levels (see the sketch after this list).  Data failing a tolerance test is colour coded to indicate which of the tests it failed.  A set of time series graphs and data tables is also provided to assist the assessment. Reports of the data analysis are produced and recorded within the audit trail provided as part of the process.
  • Edit – Data failing the tests can be corrected to remove gross and systematic errors. The Tool Kit provides a number of editing functions that enable errors to be rectified, guided by the results of the tests and the analyst’s experience.  All edits conducted on the data are recorded within the audit trail associated with the process.
  • Publish – Regardless of the initial format and quality of the data, it can be published to a set of new files to be submitted back to the project team or added to the corporate data store. Data can be published to any of the common seismic positioning data exchange formats, referenced to whichever coordinate reference system is required.
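
To illustrate the kind of tolerance testing performed during the Analyze step, the short sketch below flags consecutive shotpoints whose along-line spacing deviates from a nominal interval by more than a user-defined tolerance. It is a generic illustration only, not the Geo Suite Tool Kit API; the field names, nominal interval and tolerance values are assumptions chosen for the example.

    import math
    from dataclasses import dataclass

    @dataclass
    class ShotPoint:
        line: str        # seismic line name
        sp: int          # shotpoint number
        easting: float   # projected easting, metres
        northing: float  # projected northing, metres

    def qc_interval(points, nominal=25.0, tolerance=2.0):
        """Flag consecutive shotpoints whose along-line spacing deviates
        from the nominal interval by more than the tolerance (metres).
        Returns a list of ((sp_a, sp_b), measured_interval) failures."""
        failures = []
        for a, b in zip(points, points[1:]):
            d = math.hypot(b.easting - a.easting, b.northing - a.northing)
            if abs(d - nominal) > tolerance:
                failures.append(((a.sp, b.sp), round(d, 2)))
        return failures

    # Example: three shots at a 25 m nominal interval, one gross error
    line = [
        ShotPoint("L-1001", 101, 500000.0, 6200000.0),
        ShotPoint("L-1001", 102, 500025.1, 6200000.0),
        ShotPoint("L-1001", 103, 500093.0, 6200000.0),  # ~68 m jump: fails
    ]
    print(qc_interval(line))   # [((102, 103), 67.9)]

In practice such a test would be one of many; the same pattern of comparing a measured quantity against a nominal value and a tolerance applies to the other statistical checks described above.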

Positioning data associated with 3D seismic surveys undergoes additional QC checks because of the multiple sources and streamers used during data acquisition.  These checks cover the crossline separations of the sources and streamers as well as the depths at which the equipment is towed.
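
A minimal sketch of these 3D-specific checks is shown below, assuming per-shot streamer records carrying a crossline offset and a tow depth. The record layout, nominal values and tolerances are illustrative assumptions, not the Tool Kit’s implementation.

    def qc_streamers(streamers, nominal_sep=100.0, sep_tol=10.0,
                     nominal_depth=8.0, depth_tol=1.0):
        """streamers: list of (crossline_offset_m, tow_depth_m), ordered
        across the spread. Flags adjacent pairs whose separation deviates
        from the nominal value, and cables towed outside the depth window."""
        issues = []
        for i, ((off_a, _), (off_b, _)) in enumerate(zip(streamers, streamers[1:])):
            sep = abs(off_b - off_a)
            if abs(sep - nominal_sep) > sep_tol:
                issues.append(f"cables {i}-{i + 1}: separation {sep:.1f} m")
        for i, (_, depth) in enumerate(streamers):
            if abs(depth - nominal_depth) > depth_tol:
                issues.append(f"cable {i}: tow depth {depth:.1f} m")
        return issues

    # Four cables at ~100 m separation; cable 2 is shallow and has
    # drifted, so both separations adjacent to it are flagged
    print(qc_streamers([(-150.0, 8.1), (-50.0, 7.9), (62.0, 6.5), (150.0, 8.0)]))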

Seis Grid – QC services

Seis Grid QC is a set of workflows and procedures specifically developed to determine the usefulness and correctness of the full fold bin grids associated with 3D seismic surveys. The analysis is performed to ensure that a bin grid is fit for purpose, correctly describing its attributes and the geographical extents over which the seismic survey was conducted.  Again, the Geo Suite Tool Kit provides the principal tool within which the workflows are conducted:

  • Import – Using the file decoders (P6, .txt, .csv) or the manual grid table, the bin grid definition is captured within a separate thematic layer of the Geo Suite project.  Care is taken to ensure that the correct CRS is assigned to the bin grid as part of the load process.
  • View – Next, the bin grid is viewed to confirm that its general appearance is as expected. Viewing is best achieved when it can be visualized against other available data sets, such as pre-plot lines, block boundaries, digital imagery and asset data (platforms and pipelines), to name but a few. Does the bin grid fall within a concession block that the company has a license to operate within?
  • Analyze – The bin grid is subjected to a range of tests (see the sketch after this list), and the results are written to the results panel, an associated report and the audit trail. The analysis checks a number of attributes associated with the bin grid, including orthogonality, orientation, bin increments, bin dimensions and CRS details.
  • Edit – A bin grid failing any of the tests is corrected either manually or using the re-adjust and/or re-project functions.  All edits are reported and captured within the audit trail accompanying the editing.
  • Publish – After the bin grid has been analysed and edited, it is published to a number of different formats, typically including OGP P6/98 (P6/11), .shp file and loadsheet.  Additionally, the bin centres of the grid are published to OGP P1 format using the Q records.  Data written to the P1 file is typically decimated to every 10th event to reduce file size.
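
As an illustration of the Analyze checks, the sketch below derives bin dimensions, grid orientation and orthogonality from three defining grid nodes and compares them with declared values. This is a generic sketch, not the Tool Kit’s implementation; the corner-point parameterisation, node counts and tolerances are assumptions made for the example.

    import math

    def qc_bin_grid(origin, i_end, j_end, n_i, n_j,
                    decl_bin_i, decl_bin_j, tol_m=0.01, tol_deg=0.01):
        """origin, i_end, j_end: (easting, northing) of the grid origin
        node, the last node along the I axis and the last node along the
        J axis. n_i, n_j: node counts along each axis. Checks derived bin
        dimensions against the declared values and the angle between the
        axes against 90 degrees."""
        vi = (i_end[0] - origin[0], i_end[1] - origin[1])
        vj = (j_end[0] - origin[0], j_end[1] - origin[1])
        bin_i = math.hypot(*vi) / (n_i - 1)
        bin_j = math.hypot(*vj) / (n_j - 1)
        angle = math.degrees(
            math.atan2(vj[1], vj[0]) - math.atan2(vi[1], vi[0])) % 360
        return {
            "bin_i_ok": abs(bin_i - decl_bin_i) <= tol_m,
            "bin_j_ok": abs(bin_j - decl_bin_j) <= tol_m,
            "orthogonal": min(abs(angle - 90), abs(angle - 270)) <= tol_deg,
            "bin_i": round(bin_i, 3), "bin_j": round(bin_j, 3),
            "axis_angle_deg": round(angle, 3),
        }

    # A 1001 x 401 node grid of 12.5 m x 25 m bins, rotated 30 degrees
    o = (450000.0, 6100000.0)
    i_end = (o[0] + 12500 * math.cos(math.radians(30)),
             o[1] + 12500 * math.sin(math.radians(30)))
    j_end = (o[0] + 10000 * math.cos(math.radians(120)),
             o[1] + 10000 * math.sin(math.radians(120)))
    print(qc_bin_grid(o, i_end, j_end, 1001, 401, 12.5, 25.0))

The same derived quantities would then be reported alongside the declared P6 attributes and recorded in the audit trail, with any mismatch triggering the Edit step described above.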