Processing Sentinel-2 Data With Python

Sentinel 2-a.png

South Andros Island, Bahamas (09-25-2018, 15:55 UTC), shown by Sentinel-2 MSI (10 m resolution), processed with Python (Satpy). Data downloaded from the Copernicus Open Access Hub.

Below, some zoomed-in views of the image above!

Sentinel 2b.png Sentinel 2c.png Sentinel 2d.png Sentinel 2e.png

As a follow-up to the previous blog post, where we showed how to process Sentinel-3 OLCI data, let’s see how to process Sentinel-2 MSI data with Python. The process is very similar. Let’s see an example:

ACCESSING SENTINEL-2 DATA USING THE COPERNICUS OPEN ACCESS HUB

Access the Copernicus Open Access Hub at the following link:

scihub.copernicus.eu/dhus/

Sentinel 2f.png

Create an account by clicking on the “Sign up” link in the LOGIN menu.

Sentinel 2g.png Sentinel 2h.png

Sentinel 2i

Awesome!

After creating an account, just as with CODA in the previous blog post, navigate to the region of interest. In this example, the Bahamas.

Sentinel 2j.png

Now click on the following icon to select your region of interest:

Sentinel 2l.png

And select the area:

Sentinel 2m.png

Expand the “Insert Search Criteria” menu. Under “Mission: Sentinel-2”, in “Satellite Platform” select “S2A_*”, and in “Product Type” choose “S2MSI1C”. Click on the magnifier icon to search for data over the selected region.

Sentinel 2n.png

You should see the available passes for that region:

Sentinel 2o.png

Let’s select this one:

Sentinel 2p.png

Click on the following icon to download the L1 data:

Sentinel 2q.png

After the download, extract the data to the directory of your preference. In this example, we extracted it to C:\MSI

Sentinel 2r
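One small aside for the Python step later in this post: backslashes in Windows paths must be escaped in Python string literals, which is why the script below writes the directory as "C:\\MSI". A quick sketch of the equivalent spellings:

```python
from pathlib import PureWindowsPath

# Three equivalent ways to write the Windows path C:\MSI in Python
escaped = "C:\\MSI"   # backslash escaped inside a normal string
raw = r"C:\MSI"       # raw string literal, no escaping needed
forward = "C:/MSI"    # forward slashes also work on Windows

assert escaped == raw
# pathlib treats both separators as the same path
assert PureWindowsPath(escaped) == PureWindowsPath(forward)
```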

PROCESSING THE SENTINEL-2 DATA WITH SATPY

You should be familiar with Anaconda if you followed the GOES-16 and Python tutorials from this blog. Let’s make a quick overview.

Download the Anaconda Distribution from the following link:

http://www.anaconda.com/download/

After installing it, execute the Anaconda Prompt as an Admin:

CODA_10

Install SatPy in a new env using Anaconda and execute the Spyder IDE. Here are the commands we used:

conda create --name satellite
activate satellite

conda install -c conda-forge satpy
conda install -c conda-forge matplotlib
conda install -c conda-forge Pillow
conda install -c conda-forge pyorbital
conda install -c sunpy glymur

Use the following script to generate the True Color composite from that pass:

from satpy.scene import Scene
from satpy import find_files_and_readers

# Find the Sentinel-2 MSI (SAFE format) files under the extraction directory
files = find_files_and_readers(base_dir="C:\\MSI",
                               reader='safe_msi')

# Create the scene, load the True Color composite and save it as a PNG
scn = Scene(filenames=files)
scn.load(['true_color'])
scn.save_dataset('true_color', filename='true_color_S2_gnc_tutorial.png')

IMPORTANT NOTE: Given the high resolution of this image (10 m), this step requires a good amount of RAM, and it generates a huge file (almost 200 MB)!
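To see where the RAM goes, assuming a single standard Sentinel-2 tile (10980 × 10980 pixels per 10 m band), a rough back-of-the-envelope estimate for just one float32 copy of the three true-color bands:

```python
# Rough memory estimate for one float32 copy of a Sentinel-2 true-color array.
# A standard tile is 10980 x 10980 pixels at 10 m resolution.
width = height = 10980   # pixels per side of a 10 m band
bands = 3                # red, green, blue
bytes_per_value = 4      # 32-bit float

total_bytes = width * height * bands * bytes_per_value
print(f"{total_bytes / 2**30:.2f} GiB")  # ~1.35 GiB for a single copy
```

SatPy typically needs several intermediate arrays of this size during compositing, hence the warning.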

And that’s it! This is what we get plotting this dataset!

Sentinel 2-a Sentinel 2s

Beautiful!

You can do many other things with the features provided by Python / SatPy, like reprojecting, exporting to other formats, and overlaying maps!

You. Can. Do. Anything. With. Python.

Processing Sentinel-3 Data With Python

Typhoon-Trami-Diego.png

Super Typhoon Trami shown by Sentinel-3 OLCI (300 m resolution), processed with Python (SatPy). Level 1B Data Downloaded from CODA (Copernicus Online Data Access). Click to enlarge!

Hi all! As seen in this blog post, we may use Python / SatPy to generate a very nice color composite using GOES-16 data. One of the nicest things about SatPy is that it can be used to process data from GOES-16, METEOSAT, Himawari, Sentinel-2, Sentinel-3, AQUA/TERRA, NPP and others. We actually found it very easy to plot data from other satellites. We used SatPy to plot the image above (Typhoon Trami) using data downloaded from CODA (Copernicus Online Data Access). Below, another example plot:

true_color_florida_caribbean.png true_color_florida_caribbean_2.png

And below, a plot for the Brazilian northeast coast:

true_color_brazilian_northeast_a.png true_color_brazilian_northeast.png

Let’s see how to do it!

ACCESSING SENTINEL-3 DATA USING CODA

Create a free account on the EUMETSAT Earth Observation portal. Click on “New User – Create New Account” and fill out the requested data. You will receive a confirmation e-mail to complete your registration.

EUMETSAT_account.png

After accessing your EUMETSAT EO portal account, access CODA, the Copernicus Online Data Access webpage at the following link:

coda.eumetsat.int

CODA_1.png

Click on the following icon to navigate on the map (or press your mouse scroll button):
CODA_2.png

Let’s suppose we want an image from the Pacific coast of South America. Navigate to that region:

CODA_3.png

Now click on the following icon to select your region of interest:

CODA_4.png

And select the area:

CODA_5.png

Expand the “Insert Search Criteria” menu. In “Product Type” select “OL_1_EFR___”, in “Instrument” choose “OLCI”, and in “Product Level” choose “L1”. Click on the magnifier icon to search for data over the selected region.

CODA_6.png

You should see the available passes for that region:

CODA_7.png

Let’s select this one:

CODA_8.png

Click on the following icon to download the L1B data:

CODA_9.png

After the download, extract the data to the directory of your preference. In this example, we extracted it to C:\OLCI

CODA_11

Note three things in the folder name: the date (in red below), the start time (in blue below), and the end time (in green below):

CODA_11b

You will use these in the Python code.
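If you prefer, those timestamps can be pulled out of the folder name programmatically instead of typed by hand. A sketch, assuming the standard Sentinel-3 SAFE naming convention (the folder name below is a made-up example, not the exact one from this download):

```python
import re
from datetime import datetime

# Hypothetical SAFE folder name: the first two yyyymmddThhmmss fields are
# the sensing start and end times.
folder = ("S3A_OL_1_EFR____20180924T141900_20180924T142200_"
          "20180925T000000_0179_036_010_3060_MAR_O_NT_002.SEN3")

start_str, end_str = re.findall(r"\d{8}T\d{6}", folder)[:2]
start = datetime.strptime(start_str, "%Y%m%dT%H%M%S")
end = datetime.strptime(end_str, "%Y%m%dT%H%M%S")

print(start, end)  # 2018-09-24 14:19:00 2018-09-24 14:22:00
```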

PROCESSING THE SENTINEL-3 DATA WITH SATPY

You should be familiar with Anaconda if you followed the GOES-16 and Python tutorials from this blog. Let’s make a quick overview.

Download the Anaconda Distribution from the following link:

http://www.anaconda.com/download/

After installing it, execute the Anaconda Prompt as an Admin:

CODA_10

Install SatPy in a new env using Anaconda and execute the Spyder IDE. Here are the commands we used:

conda create --name satellite
activate satellite

conda install -c conda-forge satpy
conda install -c conda-forge matplotlib
conda install -c conda-forge Pillow
conda install -c conda-forge pyorbital
conda install -c sunpy glymur

Use the following script to generate the True Color composite from that pass:

from satpy.scene import Scene
from satpy import find_files_and_readers
from datetime import datetime

# Find the OLCI Level 1B files matching the pass's date/time window
# (taken from the folder name, as shown above)
files = find_files_and_readers(sensor='olci',
                               start_time=datetime(2018, 9, 24, 14, 19),
                               end_time=datetime(2018, 9, 24, 14, 22),
                               base_dir="C:\\OLCI",
                               reader='nc_olci_l1b')

# Create the scene, load the True Color composite and save it as a PNG
scn = Scene(filenames=files)
scn.load(['true_color'])
scn.save_dataset('true_color', filename='true_color_gnc_tutorial.png')

And that’s it! This is what we got plotting this dataset!

CODA_12.png

Ocean, desert and rainforest! 🙂

CODA_13.png

Salar de Uyuni

CODA_14.png CODA_15.png

You can do many other things with the features provided by Python / SatPy, like reprojecting, exporting to other formats, and overlaying maps!

You. Can. Do. Anything. With. Python.

Reminder: Script to Download GOES-16 NetCDF’s from Amazon S3

Amazon-GOES-R.png

We have already shown on this blog post how to download GOES-16 data from the NOAA’s Amazon Simple Storage Service (S3) GOES Archive using this great web interface developed by Brian Blaylock, from the University of Utah.

Now we’re going to show you how to do the same, using a Shell Script developed and shared by Dr. Marcial Garbanzo from the University of Costa Rica (Thanks Marcial!).

This procedure was tested on Ubuntu 16.04.

1-) Download and install the AWS Command Line Interface using the following command:

sudo apt install awscli

Tutorialaws1

2-) Save the script below at the folder of your preference. In our example it was saved at /home/GOES-16_AWS/downloadGOES16.sh:

#!/bin/bash

#########################################
# LICENSE
#Copyright (C) 2012 Dr. Marcial Garbanzo Salas
#This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
#This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.
#########################################

#########################################
# AUTHOR
# This program was created at the University of Costa Rica (UCR)
# It is intended as a tool for meteorology students to obtain data from GOES16
# but it can be used by operational and research meteorology.
#########################################

#########################################
# Warning: This program can download a LARGE amount of information
# and this can cause problems with limited bandwidth networks or
# computers with low storage capabilities.
#########################################

#########################################
# CLEANING FROM PREVIOUS RUNS
#
rm DesiredData.txt
rm FullList.txt
#########################################

echo "GOES16 ABI data downloader"

#########################################
# CONFIGURATION
#
# YEAR OF INTEREST
YEARS='2017'

# DAYS OF THE YEAR
# Can use this link to find out: https://www.esrl.noaa.gov/gmd/grad/neubrew/Calendar.jsp
# Example: 275 for October 2nd, 2017
# NOTE: There is only about 60 days previous to the current date available
DAYS="342 343 344 345 346"

# CHANNELS
# Example: CHANNELS='C01 C02 C03 C04 C05 C06 C07 C08 C09 C10 C11 C12 C13 C14 C15 C16'
CHANNELS='C06'

# ABI PRODUCTS
# For a description look into:
# https://aws.amazon.com/public-datasets/goes/
# and
# http://edc.occ-data.org/goes16/getdata/
# Example: PRODUCTS='L1b-RadC L1b-RadF L1b-RadM L2-CMIPC L2-CMIPF L2-CMIPM L2-MCMIPC L2-MCMIPF L2-MCMIPM'
PRODUCTS='L1b-RadF L2-CMIPF'
#########################################

#########################################
# Get list of remote files available
# PART 1. Obtain full list of files
#
for PRODUCT in $PRODUCTS; do
for YEAR in $YEARS; do
for DAY in $DAYS; do

aws s3 --no-sign-request ls --recursive noaa-goes16/ABI-$PRODUCT/$YEAR/$DAY/ | awk '{print $3";"$4}' >> FullList.txt

done
done
done

#
# PART 2. Select only desired channels
for CHANNEL in $CHANNELS; do
grep $CHANNEL FullList.txt >> DesiredData.txt
done
#########################################

#########################################
# DOWNLOAD
#

for x in $(cat DesiredData.txt);
do
SIZE=$(echo $x | cut -d";" -f1)
FULLNAME=$(echo $x | cut -d";" -f2)
NAME=$(echo $x | cut -d"/" -f5)

echo "Processing file $NAME of size $SIZE"
if [ -f $NAME ]; then
 echo "This file exists locally"
 LOCALSIZE=$(du -sb $NAME | awk '{ print $1 }')
 if [ $LOCALSIZE -ne $SIZE ]; then
 echo "The size of the file is not the same as the remote file. Downloading again..."
 aws s3 --no-sign-request cp s3://noaa-goes16/$FULLNAME ./
 else
 echo "The size of the file matches the remote file. Not downloading it again."
 fi
else
 echo "This file does not exist locally, downloading..."
 aws s3 --no-sign-request cp s3://noaa-goes16/$FULLNAME ./
fi

done
#########################################

echo Program ending.

3-) Change the file permissions so that the script is executable (you may need administrator/root privileges), for example with the chmod +x command:

chmod +x *

Tutorialaws2

4-) Change the script according to your needs (Year, Julian Day, Channels and Products).

gedit downloadGOES16.sh &

You may change the following:

  • Year of interest: Line 38
  • Julian day (three digits!): Line 44 (as mentioned on the script, you may check the julian days at this link)
  • ABI Channel: Line 48 (C01 to C16)
  • ABI Products to Download: Line 56

The ABI Products (Line 56) may be:

  • L1b-RadC: Level 1b Radiances (CONUS)
  • L1b-RadF: Level 1b Radiances (Full-Disk)
  • L1b-RadM: Level 1b Radiances (Mesoscale)
  • L2-CMIPC: Level 2 CMI (CONUS)
  • L2-CMIPF: Level 2 CMI (Full-Disk)
  • L2-CMIPM: Level 2 CMI (Mesoscale)
  • L2-MCMIPC: Level 2 CMI (CONUS) – All 16 bands [2 km] in a single NetCDF file.
  • L2-MCMIPF: Level 2 CMI (Full-Disk) – All 16 bands [2 km] in a single NetCDF  file.
  • L2-MCMIPM: Level 2 CMI (Mesoscale) – All 16 bands [2 km] in a single NetCDF file.
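If you’d rather not look the Julian day up on a calendar, Python’s standard library can produce the three-digit string the script expects:

```python
from datetime import date

def julian_day(d: date) -> str:
    """Day of year as a zero-padded three-digit string."""
    return f"{d.timetuple().tm_yday:03d}"

print(julian_day(date(2017, 10, 2)))  # "275", the example from the script
print(julian_day(date(2018, 1, 10)))  # "010"
```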

5-) Test it! For our test we chose:

  • Year of interest (Line 38): ‘2018’
  • Julian day (three digits!) (Line 44): ‘010’ (day of this blog post :))
  • ABI Channel (Line 48): ‘C13’
  • ABI Products to Download (Line 56): ‘L2-CMIPF’

Tutorialaws3

Execute the script:

./downloadGOES16.sh

If it’s the first time you’re running the script, you should see messages saying that the ‘DesiredData.txt’ and ‘FullList.txt’ log files couldn’t be removed. That’s normal, since they haven’t been created yet.

Then the script will start downloading all the chosen files available for that day. If you already have a given file in this directory, the script will not download it again.

Tutorialaws4
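The skip check above (download only if the file is missing locally or its size differs from the size reported by `aws s3 ls`) can be sketched in Python; the file used below is a throwaway temporary file, not real GOES-16 data:

```python
import os
import tempfile

def needs_download(local_path: str, remote_size: int) -> bool:
    """True if the file is absent locally or its size differs from remote."""
    if not os.path.isfile(local_path):
        return True
    return os.path.getsize(local_path) != remote_size

# Demonstrate with a throwaway 5-byte file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"12345")
    path = f.name

match = needs_download(path, 5)       # False: sizes agree, skip download
mismatch = needs_download(path, 999)  # True: size differs, re-download
os.remove(path)
print(match, mismatch)  # False True
```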

After the script finishes, you’ll see the “Program ending.” message, and all the GOES-16 files you chose to download will be available in the directory where you ran the script.

We started running the script at 14:15 UTC. The last file downloaded was from 14:11 UTC:

Tutorialaws5 Tutorialaws6

IMPORTANT: As noted in the script, this program can download a LARGE amount of information, which can cause problems on limited-bandwidth networks or computers with low storage capacity.

Note: Using AWS, you may download up to 60 days of historical data from GOES-16.

All right! Now you may use our Python tutorials to plot the data downloaded from AWS too! The image below shows the plot of the file mentioned above:

G16_C13_10012018_140041.png

You may use this script operationally by putting it on cron and passing the year, Julian day, channel and product as parameters to the script. Great!

Python and GOES-16 RGB’s (Tutorials Coming Soon)

True_Color_SatPy.png

True Color RGB created with Python (SatPy) – 09/14/2018 15:02 UTC

Hi all!

Just sharing a nice step on the quest to create GOES-16 RGB’s using Python. We plan to publish tutorials on RGB’s in the near future (as a follow-up to the Python + GOES-16 tutorial series) and to use this in capacity building sessions.

We already have Python scripts to create the following RGB’s (click to access the RAMMB and NASA SPoRT Quick Guides, which greatly help in understanding them):

But we created these without SatPy (part of the Pytroll project). To create this nice True Color composite with SatPy, these are the steps we followed:

1 – Download some data from Amazon using scripts or this web interface. For the example below we used CONUS L1b data. We are working on getting SatPy to work with GOES-16 Level 2 data too.

2 – Install SatPy in a new env using anaconda. Here are the commands we used:

conda create --name satellite
activate satellite

conda install -c conda-forge satpy
conda install -c conda-forge matplotlib
conda install -c conda-forge Pillow
conda install -c conda-forge pyorbital
conda install -c conda-forge spyder
conda install -c anaconda spyder ipykernel=4.8.2

spyder

3 – Run the suuuuper complicated script below 🙂

from satpy import Scene
from glob import glob

# Read the ABI L1b NetCDF files from a single scan time
scn = Scene(reader='abi_l1b', filenames=glob('*20182571502128*.nc'))
scn.load(['true_color'])

# Resample all bands to a common grid and save the composite
new_scn = scn.resample(scn.min_area(), resampler='native')
new_scn.save_dataset('true_color', filename='true_color.png')

And… Ta-daaaa:

True_Color_SatPy_CONUS.png

Not bad for 6 lines of code…. 🙂

True_Color_SatPy_Caribbean.png True_Color_SatPy_Mexico.png

One nice thing about SatPy is how easily you can create other composites. Just changing “true_color” to “airmass”:

scn.load(['airmass'])
new_scn = scn.resample(scn.min_area(), resampler='native')
new_scn.save_dataset('airmass', filename='airmass.png')

… will give you the Airmass RGB:

Airmass_SatPy.png

Or changing to “day_microphysics” would give you this:

DMP_SatPy.png

Or changing to “dust”:

Dust_SatPy.png

Or changing to “natural”:

Natural_SatPy.png

Great, isn’t it?


A Nice GOES-16 Composite Made With Python [2]

Day Cloud Convection RGB - subset.png

We have seen in this blog post a GOES-16 composition using channels 1, 2, 3 (day) and 13 (night) + night lights.

On GNC-A we have channels 2, 7, 8, 9, 13, 14 and 15, so we can’t create the True Color composition (on the other hand, you could automatically download channels 1 and 3 from Amazon as seen in this blog post!).

If we are using only the data from GNC-A, for the day composite we could create the “Day Cloud Convection” RGB using only channels 2 and 13, as seen in the following Quick Guide from RAMMB:

Day Cloud Convection RGB.png
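The general shape of such a recipe is simple: stretch each input channel to the 0–1 range, inverting the infrared one so cold (high) cloud tops come out bright. The sketch below uses placeholder ranges and a hypothetical pixel; the real ranges and gamma values are the ones in the RAMMB Quick Guide above:

```python
def stretch(value, vmin, vmax, inverse=False, gamma=1.0):
    """Scale value from [vmin, vmax] to [0, 1], clip, invert, apply gamma."""
    x = (value - vmin) / (vmax - vmin)
    x = min(max(x, 0.0), 1.0)
    if inverse:
        x = 1.0 - x
    return x ** (1.0 / gamma)

# Hypothetical pixel: 60% visible reflectance, 210 K cloud-top temperature.
# Ranges below are placeholders, not the official Quick Guide values.
red = green = stretch(60.0, 0.0, 100.0)            # channel 2 reflectance
blue = stretch(210.0, 200.0, 320.0, inverse=True)  # channel 13 BT, inverted

print(round(red, 2), round(blue, 2))  # 0.6 0.92
```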

This is the result for the Full Disk (click to enlarge):

teste61.png

Very nice!

Day Cloud Convection RGB - subset 2.png

A Nice GOES-16 Composite Made With Python

RGB subset.png RGB subset2.png

Check out below the nice GOES-16 composites we produced with Python. We used channels 1, 2 and 3 (downloaded from Amazon) to produce the day composite (sun zenith angle < 90°), and for the night we made some adjustments to channel 13 and also overlaid a GeoTIFF with the night lights! Click to enlarge!

teste50.png teste51.png

We are making some adjustments and will post the script soon. Stay tuned!

Possible GNC-A data interruptions due to hurricane Florence

5pmweds.png
Topic:  Possible GNC-A data interruptions due to hurricane Florence
Date/Time Issued:  September 13, 2018 1800 UTC
Product(s) or Data Impacted:  Signal quality and products being disseminated through GEONETCast-Americas
Date/Time of Initial Impact: September 14, 2018
Date/Time of Expected End: September 17, 2018
Length of Event:  TBD
Details/Specifics of Change:  Hurricane Florence is approaching US East Coast.  We do not anticipate any GEONETCast America (GNC-A) service outages; however, major weather events like hurricanes can cause some interference or temporary degradation in radio frequency communications due to heavy rain and rain fade on the C band.
Contact Information for Further Information: [email protected] or (301) 817-3880 for any operational concerns, including outages and administrative information.
[email protected] for information on GNC-A Program
Web Site(s) for applicable information: