The CODE-DE marketplace presents an overview of all activities featured by CODE-DE. Main categories are DATASETS, SERVICES, PROJECTS, TOOLS, and PROCESSORS.

Technically speaking, the marketplace is a summary of the content available from the CODE-DE portal, managed by its Content Management System (CMS). The overview is intended to give the reader quick and simple information on all relevant CODE-DE components. The individual components provide their own web-based management interfaces, and the service marketplace links to them. Since single sign-on is used, users can continue their tasks without interruption.

In addition to the data available on the CODE-DE platform, the Spatial Data Infrastructure Germany (GDI-DE) and Geoportal.DE provide further relevant geodata. More information on Copernicus, the Sentinels, Earth observation missions and sensors, and related topics can be found on Copernicus.EU.

In the next version, projects realized in the frame of CODE-DE, as well as processor modules available for new services, will be featured on the marketplace. At the moment these pages have no content and return the message "Your search yielded no results".

Datasets

The Service Marketplace includes a JavaScript-based CSW client which allows creating CSW queries for the Discovery Service of the CODE-DE infrastructure. The Discovery Service returns INSPIRE-conformant metadata entries about products from the collaborative platform as well as from third-party missions.
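
Such a catalogue can also be queried programmatically. The following is a minimal sketch using the OWSLib Python library; the endpoint URL and the search term are assumptions for illustration, not the confirmed address of the CODE-DE Discovery Service.

# Minimal CSW GetRecords sketch using OWSLib (pip install owslib).
# NOTE: the endpoint URL below is an assumption for illustration only;
# consult the CODE-DE portal for the actual Discovery Service address.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb('https://catalog.code-de.org/csw')  # hypothetical endpoint
query = PropertyIsLike('csw:AnyText', '%Sentinel-2%')         # free-text filter
csw.getrecords2(constraints=[query], maxrecords=10)

for identifier, record in csw.records.items():
    print(identifier, record.title)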

The Global Urban Footprint® (GUF®) dataset is based on radar (SAR) satellite imagery of the German satellites TerraSAR-X and TanDEM-X. By creating the GUF database, scientists at the German Remote Sensing Data Center (DFD) of the German Aerospace Center (DLR) have succeeded in using a newly developed method to generate a global raster map of the world’s built-up pattern at an unprecedented spatial resolution of about 12 m per raster cell.

Using a fully automated processing system, a global coverage of more than 180,000 very high resolution SAR images (3 m ground resolution), acquired between 2010 and 2013, has been analyzed. The backscatter amplitudes of the SAR data have been used in combination with derived textural information to delineate human settlements in a highly automated, complex decision-making process. The evaluation procedure, based mainly on radar signals, detects the characteristic vertical structures of human habitations – primarily built-up areas. In addition, auxiliary data such as digital elevation models have been included to improve the classification process. In total, over 20 million datasets were processed with a combined volume of about 320 terabytes. The final global maps show three coverage categories (e.g. in a B&W representation): built-up areas (vertical structures only) in black, non-built-up surfaces in white, and areas without coverage by the TSX/TDX satellites (NoData), such as most parts of the oceans, in grey.

The final product has been optimized for fast online access through web services by merging the 5° x 5° GUF tiles into a single global mosaic. Furthermore, reduced-resolution overviews have been generated with an interpolation algorithm that computes the average value of all contributing pixels. The global mosaic uses PackBits compression to reduce file size.
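
A comparable preparation step can be reproduced with the GDAL Python bindings. The sketch below assumes a local GeoTIFF mosaic file name; it illustrates average-resampled overviews and PackBits compression and is not the exact CODE-DE production chain.

# Sketch: PackBits-compressed copy plus average-resampled overviews with GDAL.
# 'guf_mosaic.tif' is a hypothetical input file name.
from osgeo import gdal

gdal.Translate('guf_mosaic_packbits.tif', 'guf_mosaic.tif',
               creationOptions=['COMPRESS=PACKBITS'])   # lossless PackBits compression

ds = gdal.Open('guf_mosaic_packbits.tif', gdal.GA_Update)
ds.BuildOverviews('AVERAGE', [2, 4, 8, 16, 32, 64])     # reduced-resolution overviews
ds = None                                               # flush and close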

(GUF® and Global Urban Footprint® are protected as trademarks.)

Date of last change
01-12-2017 09:23:10

The Global Urban Footprint® (GUF®) dataset is based on radar (SAR) satellite imagery of the German satellites TerraSAR-X and TanDEM-X. Using a fully automated processing system, the so-called Urban Footprint Processor, a global coverage of more than 180,000 very high resolution SAR images (3 m ground resolution), mainly acquired between 2010 and 2013, has been analyzed. The backscatter amplitudes of the SAR data have been used in combination with derived textural information to delineate human settlements in a highly automated, complex decision-making process. In addition, auxiliary data such as digital elevation models have been included to improve the classification process. The satellite imagery was collected mainly between 2011 and 2012 (93 %), with single scenes with more recent acquisition dates (2013/2014) used to fill data gaps.
For more details see:
Esch, T., Marconcini, M., Felbier, A., Roth, A., Heldens, W., Huber, M., Schwinger, M., Taubenböck, H., Müller, A., Dech, S. (2013): Urban Footprint Processor – Fully Automated Processing Chain Generating Settlement Masks from Global Data of the TanDEM-X Mission. IEEE Geoscience and Remote Sensing Letters, Vol. 10, No. 6, November 2013, pp. 1617-1621. ISSN 1545-598X, DOI 10.1109/LGRS.2013.2272953 (see http://elib.dlr.de/83318/)

The SRTM X-SAR Elevation Mosaic is an aggregation of DLR's SRTM X-SAR DTED files.

The DTED Level-2 files have been generated from Synthetic Aperture Radar (SAR) data acquired by the German-Italian X-band interferometric SAR system during the Shuttle Radar Topography Mission (SRTM) between February 11 and 22, 2000. The X-band system was flown and operated onboard the Space Shuttle Endeavour, along with a NASA C-band SAR system. The SRTM project page at DLR provides additional information on the SRTM X-band mission (http://www.dlr.de/eoc/Portaldata/60/Resources/dokumente/7_sat_miss/SRTM-...). Further details on the mission in general, the technology, accuracies, and applications are available at http://www2.jpl.nasa.gov/srtm/SRTM_paper.pdf .

The original DTED files have been grouped and mosaicked into 30 x 30 degree tiles. Six of the total of 48 tiles are empty since they do not contain any DTED files. The resulting 42 tiles are stored as uncompressed GeoTIFF files and have been supplemented with nine cubic-convolution-resampled overviews for fast web delivery.
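
A similar tiling step can be sketched with the GDAL Python bindings. The file names and the choice of overview levels below are assumptions for illustration; the actual CODE-DE production setup may differ.

# Sketch: mosaic a set of DTED Level-2 cells into one GeoTIFF tile and add
# cubic-convolution overviews. The file names are hypothetical.
import glob
from osgeo import gdal

dted_files = glob.glob('dted/*.dt2')                        # DTED Level-2 cells
vrt = gdal.BuildVRT('tile.vrt', dted_files)                 # virtual mosaic
gdal.Translate('srtm_x_sar_tile.tif', vrt)                  # uncompressed GeoTIFF
vrt = None

ds = gdal.Open('srtm_x_sar_tile.tif', gdal.GA_Update)
ds.BuildOverviews('CUBIC', [2 ** i for i in range(1, 10)])  # nine overview levels
ds = None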

Date of last change
24-04-2018 04:20:43

Horizontal accuracy (absolute): ±20m 90% circular error
Horizontal accuracy (relative): ±15m 90% circular error
Vertical accuracy (absolute): ±16m 90% linear error
Vertical accuracy (relative): ±6m 90% linear error

The SRTM X-SAR Error Mosaic is based on the height error map (HEM, see the SRTM PDF http://www.dlr.de/eoc/Portaldata/60/Resources/dokumente/7_sat_miss/SRTM-...) and provides a local measure of the achieved accuracy. It is statistically determined from a neighborhood of image cells, mainly considering the phase and baseline stability. It therefore describes the precision relative to the surroundings; determining the absolute accuracy requires reference measurements.

Date of last change
24-04-2018 04:21:21

Horizontal accuracy (absolute): ±20m 90% circular error
Horizontal accuracy (relative): ±15m 90% circular error
Vertical accuracy (absolute): ±16m 90% linear error
Vertical accuracy (relative): ±6m 90% linear error

The SRTM X-SAR Hillshade Mosaic is a greyscale shaded relief based on the SRTM X-SAR Elevation Mosaic. Combined with the latter, it can be used to add a 3D effect and enhance the visual impression by emphasizing peaks and valleys.
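
A comparable shaded relief can be derived from any elevation raster with the GDAL Python bindings; the file names and lighting parameters below are placeholders, not the original production inputs.

# Sketch: derive a greyscale hillshade from an elevation mosaic with GDAL.
# 'elevation_mosaic.tif' is a hypothetical input file name.
from osgeo import gdal

gdal.DEMProcessing('hillshade.tif', 'elevation_mosaic.tif', 'hillshade',
                   zFactor=1.0, azimuth=315.0, altitude=45.0)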

Date of last change
24-04-2018 04:21:02

Horizontal accuracy (absolute): ±20m 90% circular error
Horizontal accuracy (relative): ±15m 90% circular error
Vertical accuracy (absolute): ±16m 90% linear error
Vertical accuracy (relative): ±6m 90% linear error

Services

These services include search and access (see also DATASETS), imagery browsing, downloading, and processing (available from mid-2017).

The Copernicus Services Portfolios database is a user-friendly tool designed to facilitate understanding of, and ease access to, Copernicus products and services for intermediate and final users and research communities.

The Copernicus Services Portfolios database offers three different search functionalities:
– Search by Keywords, based on users’ needs
– Search by Alphabetical Index, based on the Copernicus Services Portfolios
– Search by Theme, based on the Copernicus domains of application
In addition, support functionalities such as the following help enhance the user experience:
– Subscription, for top-down communication
– Feedback and Queries
– Print-Out and Download in PDF format
– Analytics and Statistics
– Automated daily updates

Date of last change
08-05-2018 09:27:01

EY, on behalf of the European Union

Copernicus is a European system for monitoring the Earth. Data are collected from different sources, including Earth observation satellites and in-situ sensors, and processed to provide reliable and up-to-date information about six thematic areas: land, marine, atmosphere, climate change, emergency management and security. The land theme is divided into four main components:

Global
The Global Land Service provides a series of bio-geophysical products on the status and evolution of the land surface at global scale at mid and low spatial resolution. The products are used to monitor the vegetation, the water cycle and the energy budget.

Pan-European
The pan-European component provides information about the land cover and land use (LC/LU), land cover and land use changes and land cover characteristics. The latter includes information about imperviousness, forests, natural grasslands, wetlands, and permanent water bodies.

Local
The local component focuses on different hotspots, i.e. areas that are prone to specific environmental challenges and problems. This includes detailed LC/LU information for the larger EU cities (Urban Atlas), riparian zones along European river networks and NATURA 2000 sites. It will also include maps of coastal areas.

In-situ
All of the Copernicus services need access to in-situ data in order to ensure an efficient and effective use of Copernicus space-borne data. In addition to data provided by participating countries, Earth observation from space also yields pan-European reference datasets, such as a Digital Elevation Model.

The Copernicus Marine Environment Monitoring Service (CMEMS) provides regular and systematic reference information on the physical state, variability and dynamics of the ocean and marine ecosystems for the global ocean and the European regional seas.

The observations and forecasts produced by the service support all marine applications. For instance, the provision of data on currents, winds and sea ice help to improve ship routing services, offshore operations or search and rescue operations, thus contributing to marine safety.
The service also contributes to the protection and the sustainable management of living marine resources in particular for aquaculture, fishery research or regional fishery organisations.
Physical and marine biogeochemical components are useful for water quality monitoring and pollution control. Sea level rise helps to assess coastal erosion. Sea surface temperature is one of the primary physical impacts of climate change and has direct consequences on marine ecosystems.

As a result, the service supports a wide range of coastal and marine environment applications. Many of the variables delivered by the service (e.g. temperature, salinity, sea level, currents, wind and sea ice) also play a crucial role in weather, climate and seasonal forecasting.

Some of today’s most important environmental concerns relate to the composition of the atmosphere. The increasing concentration of greenhouse gases and the cooling effect of aerosols are prominent drivers of a changing climate, but the extent of their impact is often still uncertain.

At the Earth’s surface, aerosols, ozone and other reactive gases such as nitrogen dioxide determine the quality of the air around us, affecting human health and life expectancy, the health of ecosystems and the fabric of the built environment. Ozone distributions in the stratosphere influence the amount of ultraviolet radiation reaching the surface. Dust, sand, smoke and volcanic aerosols affect the safe operation of transport systems, the availability of power from solar generation, the formation of clouds and rainfall, and satellite remote sensing of land, ocean and atmosphere.

To address these environmental concerns, there is a need for data and processed information. The Copernicus Atmosphere Monitoring Service (CAMS) has been developed to meet these needs, aiming to support policymakers, businesses and citizens with enhanced atmospheric environmental information.

Projects

These application projects are approved by the CODE-DE overall management to demonstrate the capabilities of the infrastructure. The projects are supported by the BMVI under special terms and conditions and are of limited duration.

AGRO-DE in CODE-DE

The AGRO-DE project will create a data and analysis cluster that enables farms, agricultural advisors, contractors and service providers to use pre-processed remote sensing information promptly and to integrate it into their operational workflows. The information will be provided in different forms (e.g. as a map or data service), at different levels of detail (e.g. as image data, information products, or dynamic modelling results) and under different billing models (free of charge and fee-based). For the first time, all farmers in Germany can benefit from up-to-date satellite information. AGRO-DE creates open access to usable information products, which is intended to stimulate the use of precision farming technologies. These technologies are also intended to appeal to small-structured farms and organic farms. In addition, information products from AGRO-DE can in future also support research institutions, federal and state authorities and non-governmental organisations (NGOs) in their work, since methodically consistent datasets for Germany will be available.

Date of last change
23-04-2018 10:52:44

Bundesministerium für Ernährung und Landwirtschaft

Tools

CODE-DE provides useful tools such as the Sentinel toolboxes. Tools that are externally hosted are linked; other tools can be downloaded directly from the marketplace. For information on how to use these tools in the frame of CODE-DE, check the user manual or contact the HelpDesk.

TIMESAT is a software package for analysing time-series of satellite sensor data.

TIMESAT has been developed to investigate the seasonality of satellite time-series data and their relationship with dynamic properties of vegetation, such as phenology and temporal development. The temporal domain holds important information about short- and long-term vegetation changes. TIMESAT was originally intended for handling noisy time-series of AVHRR NDVI data and for extracting seasonality information from them. The program now has the capability to handle different types of remotely sensed time-series, e.g. data from Terra/MODIS at different time resolutions. It has also been tested with eddy covariance data and moisture data, although these applications are not the main target.

Date of last change
07-03-2017 02:29:51

Per Jönsson, Malmö University; Lars Eklundh, Lund University

Sen2Cor is a processor for Sentinel-2 Level 2A product generation and formatting.

The processor performs the atmospheric, terrain and cirrus correction of Top-Of-Atmosphere Level 1C input data. Sen2Cor creates Bottom-Of-Atmosphere, optionally terrain- and cirrus-corrected reflectance images, as well as Aerosol Optical Thickness, Water Vapour and Scene Classification maps and Quality Indicators for cloud and snow probabilities. Its output product format is equivalent to the Level 1C User Product: JPEG 2000 images at three different resolutions (60, 20 and 10 m).
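
On a system where Sen2Cor is installed, the processor is typically invoked through its L2A_Process entry point. The sketch below wraps that call from Python; the product path is a placeholder, and the resolution option should be checked against the release notes of the installed Sen2Cor version.

# Sketch: run Sen2Cor on an unpacked Level 1C product directory.
# The path below is hypothetical; '--resolution' selects the output resolution.
import subprocess

l1c_product = '/data/S2A_MSIL1C_20170101T103432_N0204_R108_T32UNE.SAFE'  # placeholder path
subprocess.run(['L2A_Process', '--resolution', '10', l1c_product], check=True)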

Date of last change
07-03-2017 02:31:30

European Space Agency (ESA)

The software Fmask (Function of mask) is used for automated cloud, cloud shadow, and snow masking for Landsat (4, 5, 7, and 8) and Sentinel-2 data.

This package implements the Fmask algorithm as a Python module. It is intended to be wrapped in a variety of main programs which handle the local details of how the image files are named and organised, providing maximum flexibility; it is not tied to the imagery being laid out in a particular manner. This modular design also simplifies the use of the same core algorithm on either Landsat or Sentinel imagery: the wrapper programs take care of differences in file organisation and metadata formats, while the core algorithm is the same for both.
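
The package ships command-line wrappers around the core module; a minimal way to drive one of them from Python is sketched below. The wrapper name, its options and the input path are assumptions based on the python-fmask documentation and may differ between versions.

# Sketch: call a python-fmask command-line wrapper on a Sentinel-2 SAFE product.
# Script name, options and paths are assumptions; check the installed python-fmask
# version for the exact interface.
import subprocess

safe_dir = '/data/S2A_MSIL1C_20170101T103432_N0204_R108_T32UNE.SAFE'  # placeholder path
subprocess.run(['fmask_sentinel2Stacked.py',
                '--safedir', safe_dir,
                '-o', 'cloud_mask.img'],
               check=True)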

Date of last change
07-03-2017 02:30:44

Neil Flood, Sam Gillingham

The Automated Radiative Transfer Models Operator (ARTMO) Graphic User Interface (GUI) is a software package that provides essential tools for running and inverting a suite of plant RTMs, both at the leaf and at the canopy level.

ARTMO facilitates consistent and intuitive user interaction, thereby streamlining model setup, execution, storage and plotting of output spectra for any kind of optical sensor operating in the visible, near-infrared and shortwave infrared range (400-2500 nm).

Date of last change
07-03-2017 02:33:16

Universitat de València, IPL Image Processing Laboratory

Processors

Certain processing tools, such as seasonal cloud-free mosaicking or temporal feature generation, are available to registered users and can be applied to their preferred sensor or region of interest.

Processor for band math with Sentinel Products in CODE-DE

The SNAP generic BandMaths operator allows creating a product with multiple bands based on mathematical expressions. The geo-coding information and metadata for the target product are taken from the source product.
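
A minimal sketch of such a band-maths call through SNAP's Python bindings (snappy) is shown below; the input file name and the NDVI expression are assumptions for illustration, and the CODE-DE processor may invoke the operator differently.

# Sketch: compute an NDVI band from a Sentinel-2 product with the SNAP
# BandMaths operator via snappy. The input file name is hypothetical.
import snappy
from snappy import GPF, ProductIO, jpy

product = ProductIO.readProduct('S2A_MSIL2A_example.SAFE/MTD_MSIL2A.xml')

BandDescriptor = jpy.get_type('org.esa.snap.core.gpf.common.BandMathsOp$BandDescriptor')
ndvi = BandDescriptor()
ndvi.name = 'ndvi'
ndvi.type = 'float32'
ndvi.expression = '(B8 - B4) / (B8 + B4)'

targetBands = jpy.array('org.esa.snap.core.gpf.common.BandMathsOp$BandDescriptor', 1)
targetBands[0] = ndvi

HashMap = jpy.get_type('java.util.HashMap')
parameters = HashMap()
parameters.put('targetBands', targetBands)

result = GPF.createProduct('BandMaths', parameters, product)
ProductIO.writeProduct(result, 'ndvi_product.dim', 'BEAM-DIMAP')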

Sentinel-2 Atmospheric Correction (Sen2Cor) Processor in CODE-DE

Sen2Cor is a processor for Sentinel-2 Level 2A product generation and formatting; it performs the atmospheric, terrain and cirrus correction of Top-Of-Atmosphere Level 1C input data. Sen2Cor creates Bottom-Of-Atmosphere, optionally terrain- and cirrus-corrected reflectance images, as well as Aerosol Optical Thickness, Water Vapour and Scene Classification maps and Quality Indicators for cloud and snow probabilities. The output product format is equivalent to the Level 1C User Product: JPEG 2000 images at three different resolutions (60, 20 and 10 m).

Copernicus Data Access and Exploitation Platform for Germany (CODE-DE) - Processors

The CODE-DE processing subsystem is based on existing software components such as Hadoop, Calvalus and the Sentinel Toolbox. The processing services offered provide an environment to generate higher-level data products from the satellite data hosted on the CODE-DE platform.

Date of last change
30-11-2017 01:47:04

Brockmann Consult GmbH

VM Template for CODE-DE

This is the VM Template for CODE-DE. It consists of tools which allow users to search for data in the CODE-DE data storage, retrieve data from it, and interact with the CODE-DE processing system. The tools are accompanied by documentation and a sample processing request file.

Documentation: https://code-de.org/tools/code-de-external-tools.pdf

Sample processing file: https://code-de.org/tools/code-de-external-tools-1.1.tar.gz