Data acquisition and processing technologies produce considerable quantities of data, so it is essential to establish good data management practices that describe how the data are acquired and processed. Today this notion of quality is indispensable for ensuring data interoperability and traceability, especially as the notions of quality and level of validation are not "universal": they can vary from one project to another according to how the data are exploited.
Generally speaking, metadata, or "data about data", compile all the information required to use the data effectively. There are three levels of metadata:
- The first allows the data to appear in the inventory of available data (within a data catalogue such as Sextant)
- The second tells potential users who owns the data, where to find them and how to obtain them (listing conditions of use where relevant)
- The third and final level allows users to assess the quality and adequacy of the data before accessing them (objectives, resolution, quality, limits of use, etc.).
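As a rough illustration, the three levels above might be captured in a single record. This is only a sketch: the field names are hypothetical and do not come from any particular standard (real catalogues such as Sextant use ISO/INSPIRE-compliant schemas).

```python
from dataclasses import dataclass


@dataclass
class MetadataRecord:
    # Level 1: discovery -- lets the dataset appear in a catalogue inventory
    title: str
    keywords: list

    # Level 2: access -- who owns the data, where and how to obtain them
    owner: str
    access_url: str
    conditions_of_use: str

    # Level 3: fitness for use -- enough detail to judge adequacy before access
    objectives: str
    resolution: str
    quality_level: str
    limits_of_use: str


# A hypothetical example record (all values invented for illustration)
record = MetadataRecord(
    title="CTD profiles, coastal survey 2021",
    keywords=["temperature", "salinity"],
    owner="Example Institute",
    access_url="https://example.org/data/ctd-2021",
    conditions_of_use="Attribution required",
    objectives="Seasonal hydrography survey",
    resolution="1 dbar vertical bins",
    quality_level="delayed-mode validated",
    limits_of_use="Not suitable for navigation",
)
print(record.title)
```

Grouping the three levels in one structure mirrors the idea that discovery, access, and fitness-for-use information travel together with the data.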
As soon as a processing chain is applied to data, the precise configuration used at each stage must be recorded. This makes it possible to reproduce the processing chain on another dataset, at another site or at a later date. Above all, it is important to understand that data have a life beyond map production: they end up in different places (organisations, databases, web portals) and can serve as a basis for other studies. All the relevant metadata must accompany the data on their journey from one stage and one place to another.
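One way to record the configuration of each stage is to log the stage name and its exact parameters alongside the processing itself. The sketch below is a minimal illustration under invented assumptions: the stages (`despike`, `average`) and their parameters are toy placeholders, not any real processing chain.

```python
import hashlib
import json


def run_stage(name, params, func, data, log):
    """Apply one processing stage and record its exact configuration."""
    result = func(data, **params)
    log.append({
        "stage": name,
        "params": params,
        # fingerprint of the output so a later re-run can be verified
        "output_digest": hashlib.sha256(
            json.dumps(result, sort_keys=True).encode()
        ).hexdigest(),
    })
    return result


# Two toy stages standing in for real processing steps
def despike(values, threshold):
    return [v for v in values if abs(v) <= threshold]


def average(values, window):
    return [sum(values[i:i + window]) / window
            for i in range(0, len(values) - window + 1, window)]


provenance = []
data = [1.0, 2.0, 99.0, 3.0, 4.0]
data = run_stage("despike", {"threshold": 10.0}, despike, data, provenance)
data = run_stage("average", {"window": 2}, average, data, provenance)
print(json.dumps(provenance, indent=2))
```

Because the provenance log holds every parameter used, the same chain can be replayed on another dataset, and the output digests allow a re-run to be checked against the original result.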
Data without associated metadata are simply unusable.
For several years, data producers have become increasingly aware of the need to complete metadata according to international standards, taking into account European regulations, most notably the INSPIRE Directive (Infrastructure for Spatial Information in the European Community).
Beyond the initiatives described above, which are designed to ensure that data management systems function correctly and are accessible to users, emphasis is also placed on the quality of banked data, in close association with the scientific teams coordinating observations at sea and in accordance with the major international programmes (ARGO, IODE/IOC, etc.).
The aim of this quality control is to enhance the reliability of data delivered to users, but also to allow exchanges of consistent and comparable data at European and international levels.