Earth Observation – Copernicus Programme

Earth observation is the gathering of information about planet Earth’s physical, chemical and biological systems via remote sensing technologies, usually involving satellites carrying imaging devices. Earth observation is used to monitor and assess the status of, and changes in, the natural and manmade environment.

Earth observation | EU Science Hub, CC BY 4.0


Lecture slides

Earth Observation – Copernicus I.

The lecture gives a general description of the Copernicus programme and of the platforms for data access.

Earth Observation – Copernicus II.

The lecture focuses on basic concepts and data-analysis methods.

Earth Observation – Copernicus III.

The lecture describes the data formats used in the field of Earth Observation and gives an overview of the software and libraries used in the analysis of Earth Observation data.

Files for this lab

  • All files are available in the git repository stored on the faculty’s GitLab server.
  • The address of the project is the following:
  • PIP environment specifications in the repository:
    • requirements.txt
    • requirements_OCEA01.txt
  • Jupyter notebooks with tasks:
    • OCEA01.ipynb
    • NetCDF – C3S Fire Warning Indicator – multiple files.ipynb
    • NetCDF – C3S Fire Warning Indicator – single file.ipynb
    • NetCDF from Sentinel 3 OLCI.ipynb
    • WMS s2cloudless.ipynb
    • Per-pixel classification – 1. preparation of datasets – PART1.ipynb
    • Per-pixel classification – 1. preparation of datasets – PART2.ipynb
    • Per-pixel classification – 1. preparation of datasets – PART3.ipynb
    • Per-pixel classification – 2. training.ipynb
    • Per-pixel classification – 3. application.ipynb
  • Demonstrative Jupyter notebooks:
    • GRIB – C3S Essential climate variables.ipynb
    • Affine example.ipynb
    • Fiona example.ipynb
    • Per-pixel classification – misc – cloud masks.ipynb
    • Read shapefile in Geopandas.ipynb
  • Script for Google Earth Engine:
    • ndvi_ndwi_google_earth_engine.js

Datasets used in the lab

C3S Climate Data Store

  • Fire danger indicators for Europe from 1970 to 2098 derived from climate projections
  • Essential climate variables for assessment of climate variability from 1979 to present
    • Variable: Sea-ice cover, Surface air temperature
    • Product type: Monthly mean
    • Time aggregation: 1 month
    • Year: 1983, 1984, 1989, 1990, 1995, 1996, 2001, 2002, 2007, 2008, 2013, 2014, 2019, 2020
    • Month: January, February, March, April, May, June, July, August, September, October, November, December
    • Origin: ERA5
    • Format: Compressed tar file (.tar.gz)
    • Link to the dataset: https://cds.climate.copernicus.eu/cdsapp#!/dataset/ecv-for-climate-change

Copernicus Open Access Data Hub

  • SENTINEL-1
    • Identifier: S1A_IW_GRDH_1SDV_20191217T165819_20191217T165844_030391_037A3A_08F2
    • Sensing start: 2019-12-17T16:58:19.996Z
    • Platform S1A
    • Product level: L1
    • Instrument: SAR-C
    • Mode: IW
    • Polarisation: VV VH
    • Pass direction: ASCENDING
    • Presently an “Offline” product (wait period before download)
  • SENTINEL-2 MSI L1C products
    • Data are downloaded via OData API using SentinelSat library
    • https://scihub.copernicus.eu/dhus/search?format=json&rows=100&start=0, query: beginPosition:[2020-04-11T00:00:00Z TO 2020-05-21T00:00:00Z] cloudcoverpercentage:[0 TO 30] producttype:S2MSI1C footprint:"Intersects(POLYGON ((21.54731900099266 48.85782718839408, 21.54731900099266 49.14725103658307, 20.83750699412776 49.14725103658307, 20.83750699412776 48.85782718839408, 21.54731900099266 48.85782718839408)))"
  • SENTINEL-3 OLCI Level 2 Land Product
    • Identifier: S3A_OL_2_LRR____20200722T085024_20200722T093443_20200723T123309_2659_060_378______LN1_O_NT_002
    • Satellite Platform: S3A
    • Product Type: OL_2_LRR__ (level-2, reduced resolution)
    • Instrument: OLCI (Ocean Land Colour Instrument)
    • Acquisition Period Start: 2020-07-22T08:50:23.774429Z
    • Acquisition Period End: 2020-07-22T08:50:23.774429Z
    • Relative Orbit Number: 378

Land Monitoring Service

Download of the Prešov dataset from Urban Atlas 2018

Setup of the environment

Installation of dependencies

  • sudo apt install gdal-bin libeccodes0 libgeos-dev python3.8
    • gdal-bin – geopandas, rasterio
    • libeccodes0 – GRIB support in xarray
    • libgeos-dev (maybe libgeos-c1v5 is sufficient) – shapely, cartopy
  • Notebooks focused on geopandas, xarray, rasterio were tested with python 3.8
  • There are two environments used in this lab (due to compatibility issues with SNAPPY and dask)
    • SNAPPY-focused environment (requirements_OCEA01.txt)
    • Everything else (requirements.txt)

Applications

Python environment

  • Setup of the python environment
    • Python and the necessary libraries can be installed locally, or the examined Jupyter notebooks can be run in cloud-based solutions such as Google Colaboratory.
    • Environment setup:
      • Virtual environments (venv, conda) are recommended for the local installation.
        • A virtual environment can be set up with the following commands:
          python3 -m virtualenv -p python3 venv
          # activation
          . venv/bin/activate
      • Required non-standard libraries are listed in requirements.txt file
      • With pip package installer, all required packages can be installed via:
        pip install -r requirements.txt
        • To resolve Shapely issues:
          pip uninstall shapely -y
          pip install shapely --no-binary shapely
      • With the conda package installer, all required packages can be installed via:
        conda env create -f environment.yml
    • To use files in Google Drive (such as requirements.txt), the Google Drive directories can be mounted in a notebook using the google.colab.drive library (see the example notebook). Shell commands can be executed from a Jupyter notebook by putting an exclamation point as the first character of a line (example of shell commands in IPython).

Summary of the lab activities

Lab 1

Loading, visualizing and analysing NetCDF file using xarray – data from Climate Data Store

Demonstration of manipulation of a NetCDF file with the xarray library. In this case the coordinates of data points are provided as 2D grids, with a data point for each point of the grid. The activity also includes a demonstration of the Climate Data Store provided by the Copernicus Climate Change Service (C3S).

  • NetCDF – C3S Fire Warning Indicator – single file.ipynb

Loading, visualizing and analysing multiple NetCDF files using xarray – data from  Climate Data Store

Demonstration of manipulation of a NetCDF file with the xarray library. In this case the coordinates of data points are provided as 2D grids, with a data point for each point of the grid. The activity also includes a demonstration of the Climate Data Store provided by the Copernicus Climate Change Service (C3S). The activity extends the previous one by concatenating NetCDF files and demonstrating vectorized application of a fitting procedure.

  • NetCDF – C3S Fire Warning Indicator – multiple files.ipynb

Loading and visualization of NetCDF data from Sentinel-3 (OLCI User products) in xarray

Demonstration of manipulation of a NetCDF file with the xarray library. In this case the coordinates of data points are provided as 2D tie points that are not arranged into a regular grid. The activity also includes a demonstration of the Copernicus Open Access Hub. It further demonstrates loading multiple NetCDF files and merging xarray datasets.

  • NetCDF from Sentinel 3 OLCI.ipynb

Loading and visualization of GRIB files in xarray package

The aim of the activity is to demonstrate the capability of xarray to work with GRIB files downloaded from the Climate Data Store of C3S. This is demonstrated for a single file and for a multi-file dataset. Three methods of reading multi-file datasets are presented.

  • GRIB – C3S Essential climate variables.ipynb

Downloading of Sentinel-2 cloudless data via Web Map Service protocol

A brief demonstration of the WMS protocol and the Sentinel-2 cloudless dataset provided by EOX. Two approaches are presented: one uses Python, and the other realizes WMS without any abstraction, using the wget utility in a short bash script.

  • WMS s2cloudless.ipynb

Normalized Satellite Indexes in Google Earth Engine

The activity demonstrates Copernicus data access in Google Earth Engine (GEE). It also includes a demonstration of the Google Earth Engine API and the principle of its operation. The activity shows a different paradigm than the previous lab activities. The task is to calculate the NDVI and NDWI indexes using the GEE API.

  • ndvi_ndwi_google_earth_engine.js
  • Copy the source code into your GEE environment

Lab 2

Ship detection in Sentinel-1 data – tutorial OCEA01 by RUS Copernicus

The goal of the activity is to introduce students to Research and User Support (RUS) Copernicus and the library of tutorials provided by the portal. In this activity, a student should gain experience with the ESA SNAP application.

Ship detection in Sentinel-1 data using SNAPPY – inspired by tutorial OCEA01 by RUS Copernicus

The activity is a reproduction of the previous tutorial, but realized using the SNAPPY package provided by SNAP. The images are handled using the NumPy library. Visualization is realized using the Matplotlib and Cartopy libraries. The activities include application of GPT operators from Python, acquisition of vector data in an instance of SNAP’s product object, dealing with interoperability between SNAPPY and Cartopy, and data visualization in Matplotlib and Cartopy, including reprojections.

  • OCEA01.ipynb

Lab 3

Per-pixel classification of Sentinel-2 data (inspired by tutorial LAND01 by RUS)

An extensive activity that leads from the construction of a labeled dataset to the application of a trained classifier to unlabeled data of choice. A classifier is trained on an image of Prešov, and the trained model is applied to an image of Košice. The data labels are provided through Urban Atlas 2018, which at the time of preparing this tutorial included data for Prešov but not for Košice.

Preview of Sentinel-2 cloud mask using Geopandas

A supplementary activity focused on visualization of cloud masks and demonstration of the cloud types stored in GML files alongside the mask shapes. This activity should be helpful in understanding the extensive product-reading procedure used in the previous activity, and can be done alongside it if the explanation of the procedure requires more experimentation.

  • Per-pixel classification – misc – cloud masks.ipynb

Other Examples

  • Read shapefile in Geopandas.ipynb
  • Fiona example.ipynb


More detailed description of lab activities

Ship detection in Sentinel-1 data – tutorial OCEA01 by RUS Copernicus

The goal of the activity is to introduce students to Research and User Support (RUS) Copernicus and the library of tutorials provided by the portal. In this activity, a student should gain experience with the ESA SNAP application.

The activity is based on tutorial OCEA01 by Copernicus Research and User Support (RUS)

General steps of the activity

  1. Manual download of the products from Open Hub
  2. Loading of the data in SNAP GUI
  3. Analysis of the data using the GUI 
  4. Visualization of data in QGIS
  5. Visualization of the data in Google Earth

Detailed steps of the activity in SNAP

  1. Input file
    • S1A_IW_GRDH_1SDV_20191217T165819_20191217T165844_030391_037A3A_08F2.zip
  2. Subset: click on different widget (Product Explorer), Subset
  3. Orbit file
    • Radar → Apply Orbit file
  4. Vector mask
  5. Ship detection
    • Radar → SAR Applications → Ocean Applications → Ocean Object Detection
      • Land-Sea-Mask
      • Calibration
        • The radiometric correction is necessary for the pixel values to truly represent the radar backscatter of the reflecting surface
      • Adaptive Thresholding
      • Object Discrimination
  6. Export the results to Shapefile
    • Vector Data, ShipDetections, Geometry as a Shapefile
      • Error in SNAP
      • Processing/subset_S1A_IW_GRDH_1SDV_20161009T165807_Orb_Cal_THR_SHP.data/vector_data/ShipDetections.csv
        • The data is still not projected to a known projection such as WGS84
      • However, a temporary file with coordinates is located here
        • ~/.snap/var/log/subset_S1A_IW_GRDH_1SDV_20161009T165807_4550_Orb_Cal_THR_object_detection_report.xml
        • xmllint --xpath "//target/@*[name()='Detected_lat' or name()='Detected_lon' or name()='Detected_length']" "$f" | sed -r 's/ ?Detected_lat="([^"]+)" Detected_lon="([^"]+)" Detected_length="([^"]+)"/\1;\2;\3\n/g' > "$outdir/object_detection_report_lat_lon_length.csv"
          • Consider extracting other fields
      • Load the Shapefile mask into QGIS via browser 
      • Add Delimited Text Layer (the UI differs between QGIS 3 and QGIS 2)
        • X field: longitude (field_2), Y field: latitude (field_1)
        • Find Coordinate Reference System WGS84 with ID EPSG:4326
      • Layers: Save as (shp) / Export>Export features as
      • Background map 
        • via MapTiler  (requires registration and API key)
        • Via QuickMapServices,  search QMS “world”
          • ESRI Satellite, EOX::Maps Sentinel-2 cloudless
          • ESRI World topo, Waze World – notice deformations
          • Maps marked with a red dot do not work; with a yellow dot, nothing is visible
      • Layers: detections layer >  Labels  > Single labels (QGIS 3) / Show labels for this layer (QGIS 2)
      • Save Feature As > Format: KML
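The xmllint/sed pipeline above can also be expressed with Python’s standard library. A minimal sketch, assuming the report stores Detected_lat, Detected_lon and Detected_length as attributes of <target> elements (the sample XML here is illustrative, not real SNAP output):

```python
import io
import xml.etree.ElementTree as ET

# Illustrative stand-in for the SNAP object-detection report; the real
# file lives under ~/.snap/var/log/ (see the path above)
report = io.StringIO("""<report>
  <target Detected_lat="45.61" Detected_lon="13.58" Detected_length="120.5"/>
  <target Detected_lat="45.70" Detected_lon="13.42" Detected_length="85.0"/>
</report>""")

# Collect (lat, lon, length) per detected target
rows = []
for target in ET.parse(report).iter("target"):
    rows.append((float(target.get("Detected_lat")),
                 float(target.get("Detected_lon")),
                 float(target.get("Detected_length"))))

# Emit the same lat;lon;length CSV lines the shell pipeline produces
csv_lines = ["%s;%s;%s" % row for row in rows]
print(csv_lines[0])  # → 45.61;13.58;120.5
```

Unlike the sed approach, this is robust to attribute reordering and makes it easy to extract additional fields from each target.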

Ship detection in Sentinel-1 data using SNAPPY – inspired by tutorial OCEA01 by RUS Copernicus

The activity is a reproduction of the previous tutorial, but realized using the SNAPPY package provided by SNAP. The images are handled using the NumPy library. Visualization is realized using the Matplotlib and Cartopy libraries. The activities include application of GPT operators from Python, acquisition of vector data in an instance of SNAP’s product object, dealing with interoperability between SNAPPY and Cartopy, and data visualization in Matplotlib and Cartopy, including reprojections.

Before running this notebook:

  • Install SNAP
    • SNAP_INSTALL variable is used as a placeholder representing installation directory
  • Symlink gpt utility into working directory of this notebook
    • ln -s $SNAP_INSTALL/bin/gpt
  • Run snappy-conf utility
    • At this time (January 2021), SNAPPY does not work with Python 3.8; Python 3.6.9 works.
    • "$SNAP_INSTALL/bin/snappy-conf" "$(realpath venv_snappy/bin/python)" "$(realpath .)"
    • If the SNAP GUI is not open, the program gets stuck after directory creation (seems to be an IPC error). The process can be interrupted via Ctrl+C.

Guides to setting up SNAPPY:

Steps

  1. Reading the product file via SNAPPY API
  2. Visualization of bands using matplotlib
  3. Application of GPT operators
    • Subset
    • Apply Orbit File
    • Application of land-sea mask
    • Calibration
    • Object detection
      • Adaptive thresholding
      • Object discrimination
        • Forced loading of raster data
        • Retrieval of a vector data group from a product
          • Datatype casting operation via jpy
          • Acquisition of a list of GeoTools SimpleFeature objects
          • Reading of SimpleFeature object attributes
          • Creating a Geopandas dataframe (GeoDataFrame)
    • Recalculation of pixel grid points from centers to edges
    • Visualization using Cartopy, including reprojection into the EuroPP CRS
  4. Preview of an alternative approach using GPT utility
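The recalculation of pixel grid points from centers to edges is needed because matplotlib’s pcolormesh expects cell edges, while the product provides pixel-center coordinates. A hedged sketch of one way to compute the edges, assuming approximately uniform spacing at the endpoints:

```python
import numpy as np

def centers_to_edges(c):
    """Convert 1D pixel-center coordinates to cell-edge coordinates.

    Interior edges are midpoints between neighbouring centers; the two
    outer edges are extrapolated by half the end spacing.
    """
    c = np.asarray(c, dtype=float)
    mid = (c[:-1] + c[1:]) / 2.0
    first = c[0] - (c[1] - c[0]) / 2.0
    last = c[-1] + (c[-1] - c[-2]) / 2.0
    return np.concatenate([[first], mid, [last]])

edges = centers_to_edges([10.0, 20.0, 30.0])  # edges: 5, 15, 25, 35
```

Note that n centers yield n+1 edges, which is exactly the shape pcolormesh wants for its X/Y arguments.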

Tasks

TASK #1: Complete implementation of the visualization function output_view

TASK #2: Apply the Land-Sea mask to the product from the previous operation (orbit_applied_product)

  • First, use the Import-Vector operator to import the vector file Gulf_of_Trieste_seamask_UTM33.shp
  • Second, use the Land-Sea-Mask operator to apply the mask to the product.

TASK #3: Create a GeoDataFrame of detected ships

Per-pixel classification of Sentinel-2 data (inspired by tutorial LAND01 by RUS)

An extensive activity that leads from the construction of a labeled dataset to the application of a trained classifier to unlabeled data of choice. A classifier is trained on an image of Prešov, and the trained model is applied to an image of Košice. The data labels are provided through Urban Atlas 2018, which at the time of preparing this tutorial included data for Prešov but not for Košice.

Preparation of the labeled data (products capturing Prešov)

Steps
  1. Manual download of the data from Copernicus Land Monitoring Service (Urban Atlas 2018)
  2. Loading Land Monitoring Service Data in Geopackage format
  3. Comparison of map projections available in Cartopy
  4. Inverse transformation of coordinates using Proj (Pyproj) library
  5. Downloading administrative region shapefiles from Natural Earth Data using cartopy
  6. Visualization of GeoDataFrame alongside Shapely polygon data
  7. Download of specific product files (quicklook images) using OData protocol directly (without any abstraction)
  8. Downloading the data from Copernicus Open Hub – OpenSearchAPI, OData using sentinelsat package
  9. Extracting the downloaded product data
  10. Navigating Sentinel-2 product data
  11. Saving GeoDataFrame as GeoJSON
  12. Reading clouds mask in GML format using Geopandas
  13. Preparation of cloud and footprint raster masks
  14. Windowed reading raster band data using rasterio
  15. Masking of band image data
  16. Reprojection in rasterio
  17. Visualization of raster data alongside vector data
  18. Saving band images in npz
  19. Saving band images in NetCDF
Tasks

TASK #1: Use the appropriate Geopandas function to load the GeoPackage file into a GeoDataFrame

TASK #2: Use the appropriate parameters of the GeoDataFrame.plot() function to categorize data by the column class_2018

TASK #3: Filter-out LMS data with specific codes

  • Create a view of the GeoDataFrame lms_po_all_gdf which contains only entries with the following codes (use code_2018 column):
    • 31000 – Forests
    • 11100 – Continuous urban fabric (S.L. : > 80%)
    • 11210 – Discontinuous dense urban fabric (S.L. : 50% – 80%)
    • 11220 – Discontinuous medium density urban fabric (S.L. : 30% – 50%)
    • 21000 – Arable land (annual crops)

TASK #4: Transform LMS data bounds into longitude/latitude

TASK #5: Construct bounding box polygon

  • Use appropriate Shapely geometry function

TASK #6: Use appropriate SentinelAPI functions to download all products in a list

TASK #7: Rasterize the Clouds/Clear-sky mask

  • The Clouds/Clear-sky mask is loaded completely
  • No cropping is applied.
  • The mask is inverted
  • Thus, the returned window should be None and the returned value is ignored.

TASK #8: Load the first (and only) band from the file

TASK #9: Apply the reprojection to the band data

  • Use the windowed transform of the data of interest
  • Provide the appropriate CRSs
  • Use dst_crs as the destination CRS

BONUS TASK: Save the output data as NetCDF

  • Save the produced dataset in NetCDF files, one file per product
  • Provide proper coordinates in the CRS of the band images

Training of a random forest classifier on the labeled data

Steps
  1. Loading npz files
  2. Flattening of band image data
  3. Preparation of training and testing sets
  4. Training of a random forest classifier (using the first downloaded product)
  5. Cross-validation
  6. Visualization of the classification results
  7. Application of the classifier to the other products
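Step 2 (flattening) is a pure reshape. A short NumPy sketch of turning a (bands, height, width) stack into the (n_pixels, n_features) matrix that scikit-learn classifiers expect:

```python
import numpy as np

# A 3-band, 4x5-pixel image stack (values are just a running index)
bands = np.arange(3 * 4 * 5).reshape(3, 4, 5)

# Flatten: one row per pixel, one column per band
X = bands.reshape(3, -1).T
assert X.shape == (20, 3)

# The inverse reshape restores the per-band images, e.g. for plotting
# a classifier's per-pixel predictions back as a map
restored = X.T.reshape(3, 4, 5)
```

The same inverse reshape is what turns a flat vector of predicted class labels back into a raster for visualization.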

Tasks

TASK #1: Load GeoJSON file 'products_of_interest_gdf.json' into a GeoDataFrame

TASK #2: Load CSV file 'lms_class_names.csv' into DataFrame

TASK #3: Create an appropriate supervised classifier available in Scikit-learn API

TASK #4: Run classification for all products in sorted_products_of_interest_df

  • You can base the entire source code for this task on the previous examples.
  • Observe the classification results.
  • Think about possible explanations for the varying accuracy between the products.

TASK #5: Duplicate this notebook and modify this condition to include more products

  • After the activities in this notebook are complete, continue with this step.
  • Perform the modification and run the updated notebook. Then compare results between the two models.
  • A more sophisticated approach would use several images in precisely selected temporal windows, each serving as an input to the classifier, and not just as another data point of the training set.

Application of a trained classifier to unlabeled data (products capturing Košice)

Steps
  1. Geolocation of “Kosice, Slovakia”
  2. Downloading the data from Copernicus Open Hub – OpenSearchAPI, OData
  3. Extracting the downloaded product data
  4. Reading clouds mask in GML format using Geopandas
  5. Windowed reading raster band data using rasterio
  6. Masking of band image data
  7. Reprojection in rasterio
  8. Loading trained model
  9. Flattening of band image data
  10. Application of a trained classifier
  11. Visualization of the classification results
Tasks

TASK #1: Use the appropriate Geopy function to retrieve the longitude and latitude of Kosice, Slovakia, then construct a WKT string denoting a point.

TASK #2: Use appropriate SentinelAPI functions to search for products matching the criteria

  • Search for Sentinel-2 MSI Level 1C products ('S2MSI1C') that intersect Kosice (ke_point_wkt) and whose sensing time (date) is within 20 days before or after May 1, 2020 (product_mean_datetime).
  • See: https://sentinelsat.readthedocs.io/en/stable/api.html
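The ±20-day window corresponds to the beginPosition range used in the Open Hub query earlier. A quick standard-library sketch:

```python
from datetime import datetime, timedelta

# Center of the sensing window and its half-width
product_mean_datetime = datetime(2020, 5, 1)
window = timedelta(days=20)

# The (from, to) pair to pass as the date range of the product search
date_from = product_mean_datetime - window
date_to = product_mean_datetime + window

print(date_from.date(), date_to.date())  # → 2020-04-11 2020-05-21
```

These endpoints match the `beginPosition:[2020-04-11T00:00:00Z TO 2020-05-21T00:00:00Z]` range shown in the query for the Prešov products above.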

TASK #3: Load the existing classifier created in the previous notebook

Preview of Sentinel-2 cloud mask using Geopandas

A supplementary activity focused on visualization of cloud masks and demonstration of the cloud types stored in GML files alongside the mask shapes. This activity should be helpful in understanding the extensive product-reading procedure used in the previous activity, and can be done alongside it if the explanation of the procedure requires more experimentation.

Steps
  1. Navigating Sentinel-2 product data
  2. Reading clouds mask in GML format using Geopandas
  3. Visualization of cloud masks 

Loading, visualizing and analysing NetCDF file using xarray – data from Climate Data Store

Single NetCDF File

Demonstration of manipulation of a NetCDF file with the xarray library. In this case the coordinates of data points are provided as 2D grids, with a data point for each point of the grid. The activity also includes a demonstration of the Climate Data Store provided by the Copernicus Climate Change Service (C3S).

Steps
  1. Manual download of the data from Copernicus Climate Change Service (data from fire danger indicator models)
  2. Overview of the data in Panoply
  3. Usage of latitude and longitude grids provided in the data
  4. Visualization of data
  5. Analysis of latitude and temporal dependence of variables (fire danger indicator)
    • Binning values by latitude, and calculation of a mean fire danger indicator value per latitude bin
    • Visualization of the data in line plots
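The latitude-binning step above can be sketched on a synthetic dataset. A minimal, hedged example assuming xarray is installed; the variable and coordinate names are hypothetical, not those of the real C3S files:

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for a fire-danger field given on a 2D lat/lon grid,
# with one data point per grid point (names are hypothetical)
lat = np.linspace(35.0, 70.0, 8)
lon = np.linspace(-10.0, 40.0, 10)
fwi = xr.DataArray(
    np.random.default_rng(0).random((lat.size, lon.size)),
    coords={"lat": lat, "lon": lon},
    dims=("lat", "lon"),
    name="fire_danger_indicator",
)

# Bin values by latitude and compute a mean indicator value per bin
bins = np.arange(30.0, 80.0, 10.0)  # bin edges: 30, 40, 50, 60, 70
per_lat_bin = fwi.groupby_bins("lat", bins).mean(...)
print(dict(per_lat_bin.sizes))      # one mean value per latitude bin
```

The resulting 1D series per latitude bin is what gets drawn in the line plots of the final step.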
Tasks

Multiple NetCDF Files

Demonstration of manipulation of a NetCDF file with the xarray library. In this case the coordinates of data points are provided as 2D grids, with a data point for each point of the grid. The activity also includes a demonstration of the Climate Data Store provided by the Copernicus Climate Change Service (C3S). The activity extends the previous one by concatenating NetCDF files and demonstrating vectorized application of a fitting procedure.

Steps
  • Concatenation of xarray datasets
    1. Visualization of the data 
    2. Analysis of latitude and temporal dependence of variables (fire danger indicator)
    3. Fitting fire danger indicator value in time to polynomials for a single latitude bin
    4. Analysis of the data for all latitude bins along the time dimension
      • Vectorized application of the fitting procedure
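The concatenation and vectorized-fitting steps can be illustrated on synthetic data. A hedged sketch (array and coordinate names are hypothetical), assuming a recent xarray with DataArray.polyfit:

```python
import numpy as np
import xarray as xr

# Two synthetic decade-long pieces standing in for separate NetCDF files,
# with a known linear trend of 0.1 per year plus noise (names hypothetical)
rng = np.random.default_rng(1)
parts = [
    xr.DataArray(
        0.1 * years + rng.normal(0.0, 0.5, (4, years.size)),
        coords={"lat_bin": [40, 50, 60, 70], "year": years},
        dims=("lat_bin", "year"),
        name="fwi",
    )
    for years in (np.arange(1980, 1990), np.arange(1990, 2000))
]

# Concatenation of the datasets along the time dimension
fwi = xr.concat(parts, dim="year")

# Vectorized fitting: one linear fit per latitude bin in a single call
coeffs = fwi.polyfit(dim="year", deg=1)
slopes = coeffs.polyfit_coefficients.sel(degree=1)
print(slopes.values)  # each slope should be close to the true trend 0.1
```

Because polyfit broadcasts over all non-fitted dimensions, no explicit loop over latitude bins is needed.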
Tasks

Loading and visualization of NetCDF data from Sentinel-3 (OLCI User products) in xarray

Demonstration of manipulation of a NetCDF file with the xarray library. In this case the coordinates of data points are provided as 2D tie points that are not arranged into a regular grid. The activity also includes a demonstration of the Copernicus Open Access Hub. It further demonstrates loading multiple NetCDF files and merging xarray datasets.

Steps

  1. Loading multiple NetCDF files in xarray, usage of the dask library
  2. Loaded files include geo coordinates (tie_geoCoordinates)
  3. Merging of xarray datasets
  4. Visualization of a lat/lon-dependent variable (total ozone)
  5. Optimization of visualization performance
  6. Visualization of a lat/lon-dependent variable (global vegetation index) using Cartopy
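The merging step can be sketched in memory. A hedged example assuming xarray is installed; the variable names mimic, but are not guaranteed to match, the per-file variables of the OLCI product:

```python
import numpy as np
import xarray as xr

# OLCI user products ship each variable in its own NetCDF file; once
# loaded, the per-file datasets can be merged into a single dataset
coords = {"rows": np.arange(3), "columns": np.arange(4)}
ozone = xr.Dataset(
    {"total_ozone": (("rows", "columns"), np.random.rand(3, 4))},
    coords=coords,
)
geo = xr.Dataset(
    {"latitude": (("rows", "columns"), np.random.rand(3, 4)),
     "longitude": (("rows", "columns"), np.random.rand(3, 4))},
    coords=coords,
)

# Merge the geolocation and measurement variables into one dataset
merged = xr.merge([ozone, geo])
print(sorted(merged.data_vars))  # → ['latitude', 'longitude', 'total_ozone']
```

In the real notebook the per-file datasets come from xr.open_dataset (or open_mfdataset with dask), but the merge call is the same.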

Tasks

  • TASK #1: Visualize Global Vegetation Index in dataset referenced by variable ogvi_ds
    • Base your code on the examples above.
    • Recommended colormap is "brg"

Loading and visualization of GRIB files in xarray package

The notebook for this activity demonstrates reading GRIB files using xarray. This is demonstrated for a single file and for a multi-file dataset. Three methods of reading multi-file datasets are presented.

Downloading of Sentinel-2 cloudless data via Web Map Service protocol

A brief demonstration of the WMS protocol and the Sentinel-2 cloudless dataset provided by EOX. Two approaches are presented: one uses Python, and the other realizes WMS without any abstraction, using the wget utility in a short bash script.

Steps

  1. Access the data using OWSLib package
  2. Review of WMS metadata
    • Download of the data, storing the data in a file, efficient visualization in a Jupyter notebook
  3. Access the data using a bash script

Tasks

  • TASK #1: Specify bounding box parameters to download a map for the whole planet
    • You can utilize the bounds accessible through wms.contents['s2cloudless_3857'].boundingBox
    • See https://epsg.io/900913
  • TASK #2: Select a 20×20 km box around Kosice, Slovakia
  • TASK #3: Downloading data via WMTS protocol
    • This can be implemented as a bash script outside of this notebook, or it can be implemented in the cell below
    • Manually review https://tiles.maps.eox.at/wmts/1.0.0/WMTSCapabilities.xml
      • Find layer s2cloudless-2019
      • Find TileMatrixSet for this layer
      • Within this TileMatrixSet choose TileMatrix with Identifier value equal 1
      • Identify values for ServiceTypeVersion, the layer’s style, MIME format
      • Either use resource URL from the WMTSCapabilities.xml or do a proper WMTS GetTile request
        • ${SERVER_URL}?SERVICE=WMTS&REQUEST=GetTile&VERSION=${VERSION}&LAYER=${LAYER}&STYLE=${STYLE}&FORMAT=${FORMAT}&TILEMATRIXSET=${TILEMATRIXSET}&TILEMATRIX=${TILEMATRIX}&TILEROW=${i}&TILECOL=${j}
      • Save the output into ows_data_download/WMTS directory
      • To download an image you can use wget utility
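The GetTile template from the last step can be filled in with plain string formatting. A hedged sketch: the style, format, and TileMatrixSet values below are assumptions; the real values must be read from WMTSCapabilities.xml as the task describes:

```python
# GetTile URL template from the task above
server_url = "https://tiles.maps.eox.at/wmts"
template = ("{server}?SERVICE=WMTS&REQUEST=GetTile&VERSION={version}"
            "&LAYER={layer}&STYLE={style}&FORMAT={format}"
            "&TILEMATRIXSET={tms}&TILEMATRIX={tm}&TILEROW={i}&TILECOL={j}")

# In the WGS84 tile pyramid, TileMatrix 1 has 2 rows x 4 columns;
# style/format/TileMatrixSet identifiers here are placeholders
urls = [
    template.format(server=server_url, version="1.0.0",
                    layer="s2cloudless-2019", style="default",
                    format="image/jpeg", tms="WGS84", tm="1", i=i, j=j)
    for i in range(2) for j in range(4)
]
print(len(urls))  # → 8 tile URLs covering the whole matrix
```

Each URL can then be fetched with wget into the ows_data_download/WMTS directory, as the task suggests.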

Normalized Satellite Indexes in Google Earth Engine

The activity demonstrates Copernicus data access in Google Earth Engine (GEE). It also includes a demonstration of the Google Earth Engine API and the principle of its operation. The activity shows a different paradigm than the previous lab activities. The task is to calculate the NDVI and NDWI indexes using the GEE API.
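Both indexes are instances of the same normalized-difference formula, (a − b) / (a + b); in GEE this is what ee.Image.normalizedDifference computes. A small NumPy illustration with made-up reflectance values:

```python
import numpy as np

def normalized_difference(a, b):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return (a - b) / (a + b)

# Illustrative reflectances (made-up values): NDVI pairs NIR with red,
# NDWI pairs green with NIR
nir, red, green = 0.5, 0.1, 0.3
ndvi = normalized_difference(nir, red)    # (0.5 - 0.1) / (0.5 + 0.1)
ndwi = normalized_difference(green, nir)  # (0.3 - 0.5) / (0.3 + 0.5)
```

The function applies elementwise to whole band arrays, so the same code works on full images, just as the GEE band-math expression does server-side.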

Tasks

  • TASK #1: Provide the equation for NDVI and rename the band to “NDVI”
  • TASK #2: Provide the equation for NDWI and rename the band to “NDWI”
