Generate 5 m super-resolved images from Sentinel-2 L2A (Theia) products (bands B02, B03, B04, B05, B06, B07, B08, B8A, B11 and B12) using a Single Image Super-Resolution (SISR) model trained in the frame of the EVOLAND Horizon Europe project.
The default network has been trained in the course of the project using the Sen2Venµs dataset, complemented with B11 and B12 patches.
Michel, J.; Vinasco-Salinas, J.; Inglada, J.; Hagolle, O. SEN2VENµS, a Dataset for the Training of Sentinel-2 Super-Resolution Algorithms. Data 2022, 7, 96. https://doi.org/10.3390/data7070096
Additional models trained in the following work are also included (see the Additional models section).
Julien Michel, Ekaterina Kalinicheva, Jordi Inglada. Revisiting remote sensing cross-sensor Single Image Super-Resolution: the overlooked impact of geometric and radiometric distortion. 2024. https://hal.science/hal-04723225v1
- Installation
- Usage
- Additional models
- Changelog
- Frequently Asked Questions
- Inference time for full products
- Credits
Installing the tool should be straightforward with pip:
$ pip install git+https://framagit.org/jmichel-otb/sentinel2_superresolution.git
In order to perform GPU inference (using the `--gpu` command-line switch), first make sure that the following package is installed:
cuda-python
You can then install GPU support with:
$ pip install "sentinel2_superresolution[gpu] @ git+https://github.com/Evoland-Land-Monitoring-Evolution/sentinel2_superresolution.git"
Example of use with a pixel ROI (rows/columns of the 10 m input image):
$ sentinel2_superresolution -v -i SENTINEL2A_20200808-105933-164_L2A_T31TCG_C_V2-2/ -o results/ --bicubic -roip 2000 2000 2500 2500
Note the `--bicubic` flag, which also generates a bicubic-upsampled image at 5 meters so that it can be compared with the SISR result.
Example of use with a UTM (coordinates of the product) ROI:
$ sentinel2_superresolution -v -i SENTINEL2A_20200808-105933-164_L2A_T31TCG_C_V2-2/ -o results/ --bicubic -roi 300160.000 4590400.000 304000.000 4594240.000
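If you are unsure which UTM coordinates fall inside the product, a small sketch like the one below can print the tile extent (the `*FRE_B2.tif` file pattern is an assumption about the Theia product layout, not something the tool requires):

```python
# Minimal sketch to find valid UTM coordinates for -roi.
# Assumption: the Theia product directory contains a 10 m band matching "*FRE_B2.tif".
import glob

import rasterio

band_path = glob.glob("SENTINEL2A_20200808-105933-164_L2A_T31TCG_C_V2-2/*FRE_B2.tif")[0]
with rasterio.open(band_path) as src:
    print(src.crs)     # UTM zone of the tile, e.g. EPSG:32631
    print(src.bounds)  # (left, bottom, right, top) in UTM metres
# Pick -roi xmin ymin xmax ymax inside these bounds.
```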
Example of use with a Level-1C (SAFE) product:
$ sentinel2_superresolution -i S2B_MSIL1C_20231126T105309_N0509_R051_T31TCJ_20231126T113937.SAFE/ -roip 0 0 512 512 --l1c -o results/
Note: do not forget the `--l1c` flag, as the tool will not detect the product format automatically.
sentinel2_superresolution -i SENTINEL2B_20230219-105857-687_L2A_T31TCJ_C_V3-1 -o results/ --gpu
Add the `--gpu` switch. If no GPU is available or correctly configured, the tool will fall back to CPU inference.
Starting with v2.0.0, the tool allows plugging in additional models for comparison. The following models are provided with the tool in `src/sentinel2_superresolution/models`:
| Yaml file | Model | Dataset for training | Bands | Source resolution | Target resolution | Comment |
|---|---|---|---|---|---|---|
| carn_3x3x64g4sw_bootstrap.yaml | CARN | Sen2Venµs | B02, B03, B04, B05, B06, B07, B08, B8A, B11, B12 | 10 meters | 5 meters | Default model |
| s2v2x2_spatrad.yaml | ESRGAN | Sen2Venµs | B02, B03, B04, B08 | 10 meters | 5 meters | s2v2x2 model from hal-04723225 |
| s2v2x4_spatrad.yaml | ESRGAN | Sen2Venµs | B05, B06, B07, B8A | 20 meters | 5 meters | s2v2x4 model from hal-04723225 |
| wsx2_spatrad.yaml | ESRGAN | WorldStrat | B02, B03, B04, B08 | 10 meters | 5 meters | wsx2 model from hal-04723225 |
| wsx4_spatrad.yaml | ESRGAN | WorldStrat | B02, B03, B04, B08 | 10 meters | 2.5 meters | wsx4 model from hal-04723225 |
Additional models can be run with the following command:
sentinel2_superresolution -i SENTINEL2B_20230219-105857-687_L2A_T31TCJ_C_V3-1 -o results/ -m src/sentinel2_superresolution/models/wsx4_spatrad.yaml --gpu
It is also possible to plug external super-resolution models into the tool. They should first be exported to an `onnx` model that accepts a tensor of shape `[b,c,w,h]` and outputs a tensor of shape `[b,c,w*f,h*f]`, where `b` is the batch dimension, `c` the number of channels, `w` and `h` the patch spatial dimensions, and `f` the model super-resolution factor.
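As an illustration, here is a minimal export sketch using a toy PixelShuffle network that stands in for a real super-resolution model (the network, file name and tensor names are placeholders, not part of the tool):

```python
# Toy x2 network standing in for a real super-resolution model.
# Everything here (network, file name, tensor names) is a placeholder.
import torch


class ToySRNet(torch.nn.Module):
    def __init__(self, channels: int = 4, factor: int = 2):
        super().__init__()
        self.conv = torch.nn.Conv2d(channels, channels * factor**2, 3, padding=1)
        self.shuffle = torch.nn.PixelShuffle(factor)

    def forward(self, x):  # [b, c, w, h] -> [b, c, w*f, h*f]
        return self.shuffle(self.conv(x))


model = ToySRNet().eval()
dummy = torch.zeros(1, 4, 128, 128)  # [b, c, w, h]
torch.onnx.export(
    model,
    dummy,
    "my_model.onnx",
    input_names=["input"],
    output_names=["output"],
    # Keep batch and spatial dimensions dynamic so the tool can choose its own tiling.
    dynamic_axes={
        "input": {0: "b", 2: "w", 3: "h"},
        "output": {0: "b", 2: "w_out", 3: "h_out"},
    },
)
```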
The `onnx` model should then be documented by a small `yaml` file as follows:
bands: # List the bands that are processed by the model, in the correct order
- B2
- B3
- B4
- B8
factor: 2.0 # The super-resolution factor of the model
margin: 66 # The margin that should be applied to avoid tile artifacts (e.g. receptive field of the model)
model: wsx2_spatrad.onnx # relative path to the model's onnx parameters
The `yaml` file can then be passed to the `-m` switch of `sentinel2_superresolution`.
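Before writing the `yaml` file, it can be worth sanity-checking that the exported model honours the expected shape contract (the file name and the 4-band, x2 setup below are the same placeholders as in the export sketch above):

```python
# Check that the exported ONNX model maps [b, c, w, h] to [b, c, w*f, h*f].
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("my_model.onnx")
input_name = sess.get_inputs()[0].name
patch = np.zeros((1, 4, 128, 128), dtype=np.float32)  # [b, c, w, h]
out = sess.run(None, {input_name: patch})[0]
print(patch.shape, "->", out.shape)  # expect (1, 4, 128, 128) -> (1, 4, 256, 256) for f = 2
```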
Merge Requests are welcome if you want to include your own model in the models distributed with the tool.
- The `-ov` switch has been removed, since the amount of overlap depends on the model and should not be changed by the user
- The `-m` switch now requires a path to the `yaml` file documenting the `onnx` exported model
- Several additional models have been included in `src/sentinel2_superresolution/models/`
Here is a list of frequently asked questions.
Bands in the output image are in the following order: B2, B3, B4, B8, B5, B6, B7, B8A, B11, B12.
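For example, with rasterio the band indices map to this order as follows (the output file name below is a placeholder, not the exact name written by the tool):

```python
# Map 1-indexed raster bands to the band order listed above.
# "results/super_resolved.tif" is a placeholder file name.
import rasterio

band_order = ["B2", "B3", "B4", "B8", "B5", "B6", "B7", "B8A", "B11", "B12"]
with rasterio.open("results/super_resolved.tif") as src:
    red = src.read(band_order.index("B4") + 1)  # rasterio bands are 1-indexed
    nir = src.read(band_order.index("B8") + 1)
```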
Processed bands are the FRE variants (Flat Reflectance).
When reading the product, sensorsio applies the -1000 radiometric offset depending on the product version.
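In practice this amounts to the usual Sentinel-2 convention (a sketch of the convention, not of sensorsio's actual code; the quantification value of 10000 is the standard one):

```python
# Illustration of the -1000 radiometric offset convention (not sensorsio's code).
dn = 2500                            # digital number read from the product
offset = -1000                       # 0 for product versions without the offset
reflectance = (dn + offset) / 10000
print(reflectance)                   # 0.15
```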
If you encounter the following:
raise FileNotFoundError(
FileNotFoundError: Could not find root XML file in product directory data/S2B_MSIL2A_20240424T054639_N0510_R048_T43SCS_20240424T080948.SAFE
It is likely that you are trying to process a Sen2Cor L2A product. Currently, sentinel2_superresolution only supports Theia L2A products from https://theia.cnes.fr.
Here are orders of magnitude of the inference time for full products with the default model:
| Product | CPU (1 core) | CPU (8 cores) | GPU (A100) |
|---|---|---|---|
| L1C | 6 hours | 1 hour | 6 minutes |
| L2A | 5 hours | 50 minutes | 5 minutes |
- This work was partly performed using HPC resources from GENCI-IDRIS (Grant 2023-AD010114835)
- This work was partly performed using HPC resources from CNES.