Ever-increasing quantities of data in an expanding number of fields
With the first deployments of Argo floats in the 2000s, in situ observation of the ocean reached a turning point: the onset of automation. This approach quickly branched out to cover coastal areas, offshore waters and the seabed. The sensors mounted on these automated observatories, limited to a few parameters some ten years ago, can now measure a wide range of physical, chemical and biological parameters at high frequency. Routine identification and counting of micro-organisms appears achievable in the medium term, as does acoustic and optical imagery.
Thanks to their miniaturisation and increased robustness, sensors can now potentially be used in a wide variety of conditions: on automated platforms (floats, gliders, etc.), by sea professionals (volunteer fishermen, for example) or by associations.
Meanwhile, the increasing bandwidth of remote transmission systems means that observations can be transmitted at full native resolution.
Increasing data: a question of organisation
The role of data centres is reinforced by these automatically transmitted data streams. Such data flows call for an operational organisation in order to:
- organise routine data reception,
- check the data and provide rapid feedback to the teams that deployed the sensors,
- transmit the data in real time,
- harmonise and integrate datasets from different sources.
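The operational chain above (reception, automatic checks, feedback to deployment teams, real-time dissemination) can be sketched in a few lines. This is a schematic illustration only: the class and function names, and the Argo-style quality flags, are assumptions for the example, not a real data-centre API.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One measurement received from an automated platform (illustrative)."""
    platform_id: str
    parameter: str
    value: float
    qc_flag: int = 0  # 0 = not checked, 1 = good, 4 = bad (Argo-style flags)

def quality_check(obs: Observation) -> Observation:
    """Automatic range test: flag implausible sea temperatures."""
    if obs.parameter == "TEMP" and not (-2.5 <= obs.value <= 40.0):
        obs.qc_flag = 4  # suspect datum, to be reported back to the team
    else:
        obs.qc_flag = 1  # passed the automatic test
    return obs

def process_stream(stream):
    """Receive, check, raise alerts for the deployment team, forward good data."""
    alerts, realtime_feed = [], []
    for obs in stream:
        obs = quality_check(obs)
        if obs.qc_flag == 4:
            alerts.append(f"{obs.platform_id}: suspect {obs.parameter}={obs.value}")
        else:
            realtime_feed.append(obs)  # passed on in (near) real time
    return realtime_feed, alerts

feed, alerts = process_stream([
    Observation("float-6901234", "TEMP", 12.7),
    Observation("float-6901234", "TEMP", 87.0),  # e.g. a sensor fault
])
print(len(feed), len(alerts))  # → 1 1
```

A production system would of course apply many more tests (spike, gradient, climatology) and route the alerts to the relevant deployment team, but the division of responsibilities is the same.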
To implement these new tasks, data centres must:
- define the procedures to be applied, the feedback and alerts to be issued, in collaboration with the deployment teams and teams using the data,
- organise data and metadata flows in compliance with standards (e.g. the OGC's Sensor Web Enablement), programmes and international bodies (e.g. IOC, WMO, JCOMM),
- acquire the hardware and organisational capacity needed to manage these new data.
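To make the harmonisation task concrete, the sketch below emits a small standards-style metadata record for a float sensor. The element names are loosely inspired by OGC SWE/SensorML but are simplified for illustration; they are not the official schema, which a real data centre would follow.

```python
import xml.etree.ElementTree as ET

# Build a simplified, SWE-inspired metadata record (illustrative only:
# element names are NOT the official SensorML schema).
root = ET.Element("ObservationMetadata")
ET.SubElement(root, "platform", type="profiling_float").text = "6901234"
sensor = ET.SubElement(root, "sensor")
ET.SubElement(sensor, "parameter").text = "sea_water_temperature"
ET.SubElement(sensor, "units").text = "degC"
ET.SubElement(root, "programme").text = "Argo"

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Describing every incoming stream with the same self-describing structure is what lets a data centre merge floats, gliders and fixed stations into one coherent dataset.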