-
Carpological study of the medieval (14th-15th century) infill of the River Senne in Brussels – preliminary results
-
Located in Library / RBINS Staff Publications 2021
-
An example of interdisciplinarity in the Brussels Region: the latrines of the Café Greenwich in Brussels
-
Located in Library / RBINS Staff Publications 2021
-
SeaDataCloud temperature and salinity data collections
-
Two versions of temperature and salinity historical data collections for each European marginal sea (Arctic Sea, Baltic Sea, Black Sea, North Sea, North Atlantic Ocean, and Mediterranean Sea) have been published within the framework of the SeaDataNet2 Project. They represent snapshots of the SeaDataNet database content at two different times: V1.1 (January 2014, Simoncelli et al., 2014) and V2 (March 2015, Simoncelli et al., 2015 and 2016). A Quality Control Strategy (QCS) was developed in SeaDataNet2 and continuously refined in order to improve the quality of the data and create the best data products. The iterative QCS approach facilitates the upgrade of the data and allows versioning of data products. A newer version of the temperature and salinity historical data collections was released within the SeaDataCloud Project in June 2018.
The objective of this presentation is to briefly review the existing SeaDataNet products and to present the first release of the SeaDataCloud temperature and salinity historical data collections (SDC_DATA_TS_V1), spanning the period 1900-2017, their characteristics in terms of space-time data distribution, and their usability. A particular focus will be dedicated to the Mediterranean Sea collection. Temperature and salinity data sets were analyzed at the regional level to assess and report on their quality. A common basic QC analysis was performed using the ODV software (5.0.0) and following common QC guidelines. Product Information Documents (PIDocs) contain all specifications about the general product characteristics (space-time coverage, resolution, format, usability) and quality (validation methodology and results). Fig. 1 shows, for the Mediterranean Sea, an example of the data density map and time distribution histogram produced for each European basin. Fig. 2 is an example of the scatter diagrams produced for each region and contained in the PIDocs.
Statistics on the SeaDataNet infrastructure population in terms of temperature and salinity data per sea basin show a progressive increase in available data. Data quality also improved thanks to the introduction of additional checks by regional experts, exploiting the complete metadata description. The statistics on the quality flags after the quality assessment show very high percentages of good (QF=1) or probably good (QF=2) data: ~99% for the Mediterranean Sea, 98-99% for the Black Sea, ~99% for the Arctic Sea, ~99% for the Baltic Sea, 98-99% for the North Sea, and 96% (salinity) to 99% for the North Atlantic Ocean. The analysis could also be performed per instrument type, to verify data set completeness and consistency, and per data originator, to identify systematic data anomalies. The derived metadata statistics per sea basin allow monitoring of the European data sharing landscape and of the advent of new sensors, which require particular efforts in data management and quality assessment.
Conclusions and Developments
All SeaDataCloud products are available as ODV collections through a web catalog (https://www.seadatanet.org/Products) together with their associated Digital Object Identifier (DOI) and Product Information Document (PIDoc), which contains the specifications about product generation, quality assessment and technical details to facilitate user uptake.
The progressive automation of the QCS in the SeaDataCloud Virtual Research Environment will speed up the basic quality check process of the data and further improve the quality of the SeaDataNet infrastructure content and the derived products, which could be delivered on a regular time schedule.
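As an illustration of the kind of per-basin quality flag statistics summarised above, the following minimal sketch (Python with pandas, not the project's actual tooling; the column names and toy values are hypothetical) computes the percentage of good (QF=1) or probably good (QF=2) values per sea basin from a flat table of observations.

```python
# Minimal sketch, assuming a flat export with one row per measured value and its
# SeaDataNet-style quality flag; column names and values are illustrative only.
import pandas as pd

obs = pd.DataFrame({
    "basin": ["Mediterranean Sea", "Mediterranean Sea", "Black Sea",
              "Black Sea", "Baltic Sea"],
    "parameter": ["TEMP", "PSAL", "TEMP", "PSAL", "TEMP"],
    "qf": [1, 2, 1, 4, 1],  # 1 = good, 2 = probably good, 4 = bad
})

# Share of values flagged good or probably good, per sea basin.
good = obs["qf"].isin([1, 2])
summary = good.groupby(obs["basin"]).mean().mul(100).round(1)
print(summary)  # percentage of QF 1/2 values per basin
```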
Located in Library / RBINS Staff Publications 2018
-
SeaDataCloud quality control of data collections
-
During the SeaDataNet II (SDN) EU project, the Quality Control Strategy (QCS) was implemented and continuously reviewed with the aim of improving the quality of the global dataset and creating the best products. This QCS has also been used for the first aggregated dataset provided in SeaDataCloud (SDC). New regional temperature and salinity data collections covering the period 1900-2017 were released within the SeaDataCloud project in 2018. A general description of these datasets, their data quality assessment procedure and the results are presented.
The specific procedure implemented during SDN II makes it possible to assure and certify the best quality for the datasets (Fig. 1). After data harvesting from the central CDI catalogue, QC was performed at the regional level in a coordinated way, using the ODV software (5.0.0) as the common basic QC analysis tool. In SDC, additional checks were performed per basin to account for the specific water mass characteristics, per instrument type to investigate data completeness and consistency, and per data provider to better identify data anomalies. This QCS allowed doubtful data to be highlighted and the data anomalies to be organized in lists, which were sent to each concerned data originator together with guidelines explaining the expected corrections. The National Oceanographic Data Centers (NODCs) were asked, on the basis of those lists, to check and, where necessary, correct the original data and resubmit them into the SDC data flow. The iterative procedure has been designed to facilitate the update and improvement of the SDC database content.
A detailed description of each regional dataset (Fig. 2) is contained in a Product Information Document (PIDoc): the general product characteristics (space-time coverage, resolution, format) and its quality (validation methodology and results), together with experts’ recommendations for its usability. ODV qualified dataset collections and PIDocs are available at https://www.seadatanet.org/Products.
Within SeaDataCloud, the implementation of a cloud environment (Virtual Research Environment, VRE; Figure 3) aims to optimize and automate the QCS at the central level, assuring continuous monitoring of the database content and its quality. The VRE makes it possible to generate database snapshots on a regular basis, facilitates data product versioning and allows data to be combined with subsets from external sources. The VRE will offer users the opportunity to access SDC data and services in the cloud, thus providing the possibility of generating their own temperature and salinity data products as well as products for other parameters.
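The per-originator anomaly lists described above can be pictured with a short sketch; this is a toy illustration under assumed field names (originator, cdi_id, qf), not the actual SDC workflow or schema.

```python
# Toy sketch: group suspect records (flags 3/4) per data originator so that each
# centre can receive the list of its own anomalies for checking and resubmission.
from collections import defaultdict

records = [
    {"cdi_id": "XX-001", "originator": "NODC-A", "param": "PSAL", "value": 52.1, "qf": 4},
    {"cdi_id": "XX-002", "originator": "NODC-A", "param": "TEMP", "value": 13.2, "qf": 1},
    {"cdi_id": "YY-010", "originator": "NODC-B", "param": "TEMP", "value": -9.0, "qf": 3},
]

anomaly_lists = defaultdict(list)
for rec in records:
    if rec["qf"] in (3, 4):  # probably bad or bad
        anomaly_lists[rec["originator"]].append(rec["cdi_id"])

for originator, cdi_ids in anomaly_lists.items():
    print(f"{originator}: {len(cdi_ids)} suspect record(s) -> {cdi_ids}")
```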
Located in Library / RBINS Staff Publications 2018
-
MSFD: an opportunity for harmonised data management
-
MSFD: INSPIRE used as the reporting standard for metadata and data
The Marine Strategy Framework Directive strives for Good Environmental Status of marine waters by 2020 and requires the Member States to report a wide array of criteria for eleven themes or descriptors. For Belgium, the criteria cover biodiversity, habitats, population health, eutrophication, seafloor morphology, hydrology, contaminants in the environment and in seafood, macrolitter and the introduction of energy (noise). It is the first time that the (meta)data has to be reported according to the INSPIRE requirements. For Belgium, MUMM (Management Unit of the Mathematical Model of the North Sea, OD Nature, RBINS) is coordinating the monitoring activities and collaborates with experts from different scientific institutes to prepare the second assessment of the status of the Belgian marine waters. The monitoring data is managed and disseminated to the EC and the public by the Belgian Marine Data Centre (BMDC). The primary data has been collected by monitoring activities or collated from other sources by several marine specialists.
Harmonised monitoring reporting impossible without a transversal approach
The very diverse array of data types (in situ or track, polygon or gridded; many data themes), the INSPIRE requirements and the necessity to maximize the reuse of the collected data have led to the need for a streamlined data flow that creates new processes and incorporates existing ones. The codebase of BMDC’s Data and Inventory Tracking System (DITS) (Lagring et al., 2014) was modified to allow the derivation of facets, which can be used to fulfill specific reporting needs and abstract away some of the functionality and metadata fields that are common to a specific reporting theme. Such facets are pluggable into the new BMDC website. The MSFD facet allows the primary submission of data files and serves three purposes: providing an anchor for the data file(s) during the MSFD reporting by MUMM, providing an entry point for in-situ or track data to be ingested and managed in the central oceanographic database (IDOD) of BMDC, and fulfilling the obligation Belgium has with regard to INSPIRE. The in-situ data falls within the INSPIRE theme ‘Oceanographic Geographical Features’, which makes use of the Observations and Measurements (O&M) schema to describe the data. In IDOD, mappings are made to the NERC vocabularies, e.g. P02 or P01, that describe the observedProperty in O&M. Surface-based data is represented as shapefiles in a GeoServer instance; the shapefiles’ attributes are enriched in order to make the INSPIRE transformations as easy as possible. The metadata of a DITS dataset is exposed in ISO 19115:2003 through an API, which allows harvesting by systems such as GeoNetwork and propagation to the Belgian National Spatial Data Infrastructure. Specific data transformations have been written to extract data into the INSPIRE-compliant GML format according to the recommendations of the INSPIRE maintenance and implementation group (MIG) and the TG DATA of the MSFD; the transformed data is hosted at the Belgian National Geographic Institute.
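To make the O&M/NERC mapping idea above concrete, the sketch below represents a single in-situ measurement with an observedProperty pointing to NERC vocabulary concepts. It is a simplified, hypothetical structure, not BMDC’s actual encoding; the concept URIs and station fields are examples and should be checked against the NERC vocabulary server (http://vocab.nerc.ac.uk/).

```python
# Hedged sketch of the mapping idea only: an in-situ measurement expressed as a
# minimal O&M-style observation whose observedProperty references NERC P02/P01
# concepts. URIs and identifiers below are illustrative, not authoritative.
observation = {
    "type": "OM_Measurement",
    "phenomenonTime": "2017-05-12T10:30:00Z",
    "featureOfInterest": {"station": "W05", "lat": 51.42, "lon": 2.81},  # hypothetical station
    "observedProperty": {
        # Broader discovery term (P02) and precise parameter usage (P01); example IDs.
        "P02": "http://vocab.nerc.ac.uk/collection/P02/current/TEMP/",
        "P01": "http://vocab.nerc.ac.uk/collection/P01/current/TEMPPR01/",
    },
    "result": {"value": 11.8, "uom": "degC"},
}

print(observation["observedProperty"]["P02"])
```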
Located in Library / RBINS Staff Publications 2018
-
EMODnet Data Ingestion: ‘Wake up your data’
-
The ‘EMODnet Ingestion and safe-keeping of marine data’ project, started in mid-2016, seeks to identify and reach out to organisations from the research, public and private sectors that hold marine datasets and are not yet connected to and contributing to the existing marine data management infrastructures that drive EMODnet. Those potential data providers should be motivated and supported to release their datasets for safekeeping and subsequent free distribution and publication through EMODnet. The EMODnet Data Ingestion portal facilitates the submission of their sleeping marine datasets for further processing, Open Data publishing and contribution to applications for society.
The activities are undertaken by a large European network that is geographically anchored in the countries bordering all European marine basins and covers all EMODnet data themes. The EMODnet Data Ingestion members are national and regional marine and oceanographic data repositories and data management experts. The coordinators of the EMODnet thematic portals are also part of this new initiative. Moreover, the data centres work together on pan-European and international scales in organisations such as IODE, ICES, EuroGeoSurveys, EuroGOOS, and IHO, and in pan-European marine data management infrastructures such as SeaDataCloud, EurOBIS and EGDI. The latter feed into several EMODnet thematic portals.
The emphasis of the activities in the first year has been on developing the EMODnet Data Ingestion portal and its services for ingesting and publishing data sets, developing the pathways for processing and elaborating data submissions, laying a basis for promotion and marketing activities, and making an initial inventory of potential data sources and their providers. The EMODnet Data Ingestion portal was launched in early February 2017. It encourages data providers to share marine data, gives marine data management guidance, and provides a range of services such as:
■ a submission service for easy ingestion of marine data packages
■ a view submissions service to oversee submitted data sets ‘as is’
■ a data wanted service to post requests for specific data types
Submission forms with data packages are assigned to qualified data centres depending on the country of the data provider and the EMODnet theme. This group includes not only the EMODnet Ingestion consortium but also the groups of data centres involved in each of the EMODnet thematic portals. A distinction is made between two phases in the life cycle of a data submission:
■ Phase I: from submission to publishing of the submitted data package ‘as is’
■ Phase II: further elaboration of the data sets and integration (of subsets) in national, European and EMODnet thematic portals.
This split allows the original data package with high-quality metadata to be published at an early stage. For operational oceanography, close cooperation takes place with EMODnet Physics, aimed at identifying additional stations and arranging their inclusion in Near Real Time (NRT) data exchange. The Data Ingestion portal explains how the NRT exchange is organised with EuroGOOS and Copernicus and gives guidance on how to connect in practice. Furthermore, a Sensor Web Enablement (SWE) pilot has been set up for Real Time data exchange. A client service to locate stations and retrieve data streams in a time series viewer is hosted at the EMODnet Physics portal and ‘advertised’ at the EMODnet Data Ingestion portal.
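As a rough illustration of the SWE pilot mentioned above, the snippet below issues a standard SOS 2.0 GetObservation request to retrieve a time series; the endpoint URL, offering and observed property identifiers are placeholders, not the actual EMODnet Physics service.

```python
# Sketch of a Sensor Observation Service (SWE/SOS 2.0) time-series request.
# The endpoint and identifiers are hypothetical placeholders.
import requests

SOS_ENDPOINT = "https://example.org/sos"  # placeholder endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "station-XYZ",                    # hypothetical station offering
    "observedProperty": "sea_water_temperature",  # hypothetical property identifier
    "temporalFilter": "om:phenomenonTime,2018-01-01T00:00:00Z/2018-01-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # O&M XML document containing the requested observations
```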
Promotion and outreach activities are just as important as the technical developments. In the first year, they focused on establishing cooperation and synergy within the EMODnet community. A portfolio of promotional items has been developed, including leaflets, posters, presentations, stickers and a wonderful animation. These are part of the promotion and marketing strategy that was designed to reach out to potential data providers. In the second year this plan has been put into motion at full scale for wider outreach and marketing to potential data providers in government, science and industry. This has so far resulted in many submissions and also in the development of special use cases, such as monitoring data from offshore renewable energy projects or minting DOIs for research data to support data citation by data submitters.
Located in Library / RBINS Staff Publications 2018
-
From SeaDataNet to SeaDataCloud: historical data collections and new data products
-
Temperature and Salinity historical data collections covering the period 1900-2013/2014 were created for each European marginal sea (Arctic Sea, Baltic Sea, Black Sea, North Sea, North Atlantic Ocean, and Mediterranean Sea) within the framework of the SeaDataNet2 Project, and they are available as ODV collections through a web catalog (https://www.seadatanet.org/Products/Aggregated-datasets). Two versions have been published, representing snapshots of the SeaDataNet database content at two different times: V1.1 (January 2014) and V2 (March 2015). A Quality Control Strategy (QCS) was developed and continuously refined in order to improve the quality of the database content and create the best data products. The QCS consists of four main phases: 1) data harvesting from the data infrastructure; 2) file and parameter aggregation; 3) secondary quality check analysis; 4) correction of data anomalies. The approach is iterative, facilitating the upgrade of the database content and allowing versioning of data products. Regional temperature and salinity monthly climatologies have been produced from the V1.1 historical data collections and are also available (https://www.seadatanet.org/Products/Climatologies). Within the new SeaDataCloud Project, the release of updated historical data collections and new climatologies is planned. SeaDataCloud novelties are the introduction of decadal climatologies at various resolutions, the development of climatologies for the Global Ocean, and a task dedicated to new data products, such as Mixed Layer Depth climatologies, Ocean Heat Content estimates and coastal climatologies from HF radar data. All SeaDataCloud products are available through a dedicated web catalogue together with their respective Digital Object Identifier (DOI) and Product Information Document (PIDoc) containing all specifications about product generation, quality assessment and technical details to facilitate user uptake. The presentation will briefly review the existing SeaDataNet products and introduce the SeaDataCloud product plan, but the main focus will be on the first release (February 2018) of the SeaDataCloud Temperature and Salinity historical data collections, spanning the period 1900-2017, their characteristics in terms of space-time data distribution and their usability.
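The four-phase QCS described above can be summarised schematically as in the sketch below; every function here is a toy placeholder standing in for the real harvesting, aggregation, secondary QC and correction steps, not SeaDataNet tooling.

```python
# Schematic sketch of the iterative four-phase QCS; placeholder logic only.

def harvest_from_infrastructure():
    # 1) Data harvesting: in reality, data are pulled from the SeaDataNet infrastructure.
    return [{"param": "TEMP", "value": 12.3, "qf": 1},
            {"param": "PSAL", "value": 99.9, "qf": 1}]  # second value is clearly suspect

def aggregate(records):
    # 2) File and parameter aggregation into one regional collection.
    return list(records)

def secondary_qc(collection):
    # 3) Secondary quality check: flag out-of-range salinities (toy threshold).
    return [r for r in collection if r["param"] == "PSAL" and r["value"] > 42]

def run_qcs_iteration(version):
    collection = aggregate(harvest_from_infrastructure())
    anomalies = secondary_qc(collection)
    if anomalies:
        # 4) Anomaly lists go back to data originators for correction and resubmission,
        #    after which a new, versioned snapshot of the collection can be produced.
        print(f"{version}: {len(anomalies)} anomaly(ies) reported for correction")
    return collection

run_qcs_iteration("V_example")
```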
Located in Library / RBINS Staff Publications 2018
-
Geological evidence for extreme wave events on a coastal lowland facing the Tokai segment of the Nankai-Suruga Trough
-
Located close to Japan’s densest concentrations of people and industry, the easternmost region of the Nankai-Suruga subduction zone has long been the focus of attempts to forecast and even precisely predict future earthquakes. While historical records attest to the occurrence of great earthquakes and subsequent tsunamis that may have originated from the Tōkai segment, past rupture zone extents and recurrence intervals remain poorly understood. Coastal stratigraphy has the potential to record the occurrence of both tsunami inundation and coseismic vertical land-level change over timescales far exceeding the historical record, with important implications for refining understanding of future hazards (Garrett et al., 2016). Here we present initial results from an extensive coring survey of the lower reaches of the floodplain of the Sagara River, close to the town of Sagara, Shizuoka Prefecture. The site lies at an altitude of ~1-5 m and is within the anticipated inundation zone of future worst-case tsunami scenarios. Typhoon-driven storm surges and river floods are also likely to have inundated the site, complicating the interpretation of potential tsunami deposits. Using CT scans, multi-sensor core logs, diatom assemblages and radiocarbon dates, we evaluate sedimentary processes and make the distinction between extreme wave events and fluvial deposits. Where possible, we assess methods to differentiate between storm surges and tsunami deposits. Finally, we evaluate the potential for the site to provide a long and continuous record of extreme wave events and highlight the probable influence of changing thresholds of evidence creation and preservation over time.
Located in Library / RBINS Staff Publications 2017
-
Multi-scale ocean colour synergy products for coastal water quality monitoring
-
Located in Library / RBINS Staff Publications 2020
-
The peat deposits from Brussels (Belgium): the Holocene evolution of the landscape in the Senne valley
-
Whereas the evolution of the land cover of the Holocene landscape is rather well documented for the main basin of the Scheldt river (Verbruggen et al., 1996), the vegetation history of the Senne valley remains poorly documented. Over the last decade, during the systematic archaeological survey conducted by the Direction of Monuments and Sites of the Brussels-Capital Region, several exceptionally well-preserved, metres-thick peat deposits have been discovered in the historical centre of Brussels and its surroundings. The first results of the palynological, paleofire and geoarchaeological studies reveal a nearly continuous sequence throughout the Holocene. The interdisciplinary study of these deposits offers huge potential to explore the evolution of the paleoenvironment in the river valley and, further, to contribute to a spatial reconstruction of the landscape development of the area through time. As the sites are situated both in the historical city centre and in the surrounding area, this will also allow us to reconstruct the impact of urbanisation on the natural vegetation and the transformation of the peatland ecosystem into urban and cultivated areas in Brussels and its immediate surroundings.
Located in Library / RBINS Staff Publications 2017