Accessing Cloud Optimized GeoTIFF (COG) - S3 Direct Access
Summary
In this notebook, we will access data for the Harmonized Landsat Sentinel-2 (HLS) Operational Land Imager Surface Reflectance and TOA Brightness Daily Global 30m v2.0 (L30) data product (DOI: 10.5067/HLS/HLSL30.002). These data are archived and distributed as Cloud Optimized GeoTIFF (COG) files, one file for each spectral band.
We will access a single COG file, the L30 red band (0.64–0.67 μm), from inside the AWS cloud (specifically, the us-west-2 region) and load it into Python as an xarray DataArray. This approach leverages S3-native protocols for efficient access to the data.
Requirements
1. AWS instance running in us-west-2
NASA Earthdata Cloud data in S3 can be directly accessed via temporary credentials; this access is limited to requests made from within the US West (Oregon) AWS region (us-west-2). A quick region check is sketched below.
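As a sanity check, you can confirm the region of your compute environment before requesting credentials. The sketch below queries the EC2 instance metadata service, which is only reachable from AWS compute; instances configured for IMDSv2-only will additionally require a session token first, so treat this as illustrative.

```python
import requests

# Sketch: ask the EC2 instance metadata service for this instance's region.
# Reachable only from AWS compute with IMDSv1 enabled; not from a local machine.
region = requests.get(
    'http://169.254.169.254/latest/meta-data/placement/region', timeout=2
).text
assert region == 'us-west-2', f'Running in {region}, not us-west-2'
```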
2. Earthdata Login
An Earthdata Login account is required to access data, as well as to discover restricted data, from the NASA Earthdata system. Please visit https://urs.earthdata.nasa.gov to register and manage your Earthdata Login account. The account is free to create and only takes a moment to set up.
3. netrc File
You will need a netrc file containing your NASA Earthdata Login credentials in order to execute the notebooks. A netrc file can be created manually within a text editor and saved to your home directory; an example is shown below. For additional information see: Authentication for NASA Earthdata.
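For reference, a minimal netrc entry for Earthdata Login looks like the following, saved as ~/.netrc on Linux/macOS (or _netrc on Windows) with the placeholders replaced by your own credentials:

```
machine urs.earthdata.nasa.gov
    login <your_earthdata_username>
    password <your_earthdata_password>
```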
Learning Objectives
- How to retrieve temporary S3 credentials for in-region direct S3 bucket access
- How to perform in-region direct access of HLS Cloud Optimized GeoTIFF (COG) files in S3
- How to plot the data
Import Packages
```python
import os
import requests
import boto3
from osgeo import gdal
import rasterio as rio
from rasterio.session import AWSSession
import rioxarray
import hvplot.xarray
import holoviews as hv
```
Get Temporary AWS Credentials
Direct S3 access is achieved by passing NASA-supplied temporary credentials to AWS so we can interact with S3 objects from applicable Earthdata Cloud buckets. For now, each NASA DAAC has its own AWS credentials endpoint. Below are the credential endpoints for several DAACs:
```python
s3_cred_endpoint = {
    'podaac': 'https://archive.podaac.earthdata.nasa.gov/s3credentials',
    'gesdisc': 'https://data.gesdisc.earthdata.nasa.gov/s3credentials',
    'lpdaac': 'https://data.lpdaac.earthdatacloud.nasa.gov/s3credentials',
    'ornldaac': 'https://data.ornldaac.earthdata.nasa.gov/s3credentials',
    'ghrcdaac': 'https://data.ghrc.earthdata.nasa.gov/s3credentials'
}
```
Create a function to make a request to an endpoint for temporary credentials. Remember, each DAAC has its own endpoint, and credentials from one DAAC are not usable for cloud data from other DAACs.
```python
def get_temp_creds(provider):
    return requests.get(s3_cred_endpoint[provider]).json()
```

```python
temp_creds_req = get_temp_creds('lpdaac')
# temp_creds_req
```
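The response is a JSON object containing the temporary credentials. The three keys used below are confirmed by the code that follows; the expiration field and its format are an assumption, typical of temporary-credential endpoints, so long-running workflows should plan to refresh credentials periodically.

```python
# Approximate shape of the response (values redacted):
# {
#     'accessKeyId': 'ASIA...',
#     'secretAccessKey': '...',
#     'sessionToken': '...',
#     'expiration': '...'   # temporary credentials expire (assumed field)
# }
```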
Workspace Environment Setup
For this exercise, we are going to open a context manager for the notebook using the rasterio.env module to store the GDAL and AWS configurations we need to access the data in Earthdata Cloud. While the context manager is open (rio_env.__enter__()), we can run the open and read commands that would typically be executed within a with statement, allowing us to interact with the data more freely. We'll close the context (rio_env.__exit__()) at the end of the notebook.
Create a boto3 Session object using your temporary credentials. This Session is used to pass credentials and configuration to AWS so we can interact with S3 objects from applicable buckets.
```python
session = boto3.Session(aws_access_key_id=temp_creds_req['accessKeyId'],
                        aws_secret_access_key=temp_creds_req['secretAccessKey'],
                        aws_session_token=temp_creds_req['sessionToken'],
                        region_name='us-west-2')
```
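Optionally, you can verify that the session can reach the LP DAAC bucket before configuring GDAL. This sketch requests object metadata for the granule used later in this notebook; it assumes only that the temporary credentials grant read access to the lp-prod-protected bucket.

```python
# Optional sanity check (sketch): request metadata for the COG we open below.
s3 = session.client('s3')
head = s3.head_object(
    Bucket='lp-prod-protected',
    Key='HLSL30.020/HLS.L30.T11SQA.2021333T181532.v2.0/'
        'HLS.L30.T11SQA.2021333T181532.v2.0.B04.tif'
)
print(head['ContentLength'], 'bytes')
```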
GDAL environment variables must be configured to access COGs in Earthdata Cloud. Geospatial data access Python packages like rasterio and rioxarray depend on GDAL, leveraging GDAL's "Virtual File Systems" to read remote files. GDAL has many environment variables that control its behavior. Changing these settings can mean the difference between being able to access a file or not, and they can also have an impact on performance.
```python
rio_env = rio.Env(AWSSession(session),
                  GDAL_DISABLE_READDIR_ON_OPEN='TRUE',
                  GDAL_HTTP_COOKIEFILE=os.path.expanduser('~/cookies.txt'),
                  GDAL_HTTP_COOKIEJAR=os.path.expanduser('~/cookies.txt'))
rio_env.__enter__()
```
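For comparison, the same configuration in a conventional with statement would look like the sketch below; in that pattern all reads must happen inside the block, which is why this notebook enters the context manually instead.

```python
# Conventional alternative (not run here): the environment is entered and
# exited automatically, so all S3 reads must occur inside the block.
with rio.Env(AWSSession(session),
             GDAL_DISABLE_READDIR_ON_OPEN='TRUE',
             GDAL_HTTP_COOKIEFILE=os.path.expanduser('~/cookies.txt'),
             GDAL_HTTP_COOKIEJAR=os.path.expanduser('~/cookies.txt')):
    pass  # e.g., rioxarray.open_rasterio(s3_url) and downstream processing
```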
In this example we're interested in the HLS L30 data collection from NASA's LP DAAC in Earthdata Cloud. Below we specify the S3 URL to the data asset in Earthdata Cloud. This URL can be found via Earthdata Search or programmatically through the CMR and CMR-STAC APIs.
```python
s3_url = 's3://lp-prod-protected/HLSL30.020/HLS.L30.T11SQA.2021333T181532.v2.0/HLS.L30.T11SQA.2021333T181532.v2.0.B04.tif'
```
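As an illustration of the programmatic route, the sketch below searches the CMR-STAC LPCLOUD catalog for HLS L30 items. The collection id ('HLSL30.v2.0') and the 'B04' asset key are assumptions to verify against the API response, and the returned asset hrefs may be HTTPS URLs rather than s3:// URLs.

```python
# Sketch: discover HLS L30 granules via the CMR-STAC search endpoint.
stac_search = 'https://cmr.earthdata.nasa.gov/stac/LPCLOUD/search'
query = {
    'collections': ['HLSL30.v2.0'],                # assumed collection id
    'bbox': [-118.0, 35.5, -117.0, 36.5],          # lon/lat area of interest
    'datetime': '2021-11-01T00:00:00Z/2021-12-01T00:00:00Z',
    'limit': 10,
}
results = requests.post(stac_search, json=query).json()
for item in results.get('features', []):
    print(item['id'], item['assets']['B04']['href'])  # 'B04' = red band (assumed key)
```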
Direct In-region Access
Read the HLS COG for the L30 red band (0.64–0.67 μm) at the S3 URL into our workspace using rioxarray, an extension of xarray for reading geospatial data.
```python
da = rioxarray.open_rasterio(s3_url)
da
```
The file is read into Python as an xarray DataArray with band, x, and y dimensions. In this example the band dimension is meaningless, so we'll use the squeeze() method to remove band as a dimension.
```python
da_red = da.squeeze('band', drop=True)
da_red
```
Plot the DataArray, representing the L30 red band, using hvplot.
```python
da_red.hvplot.image(x='x', y='y', cmap='gray', aspect='equal')
```
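The band is plotted as raw digital numbers. If you want reflectance values instead, HLS v2.0 reflectance bands are typically stored with a fill value of -9999 and a scale factor of 0.0001; confirm both against the file's metadata (da_red.attrs) before relying on them.

```python
# Sketch: mask the assumed fill value and apply the assumed scale factor,
# then plot reflectance instead of raw digital numbers.
reflectance = da_red.where(da_red != -9999) * 0.0001
reflectance.hvplot.image(x='x', y='y', cmap='gray', aspect='equal')
```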
Exit the context manager.
```python
rio_env.__exit__()
```
Resources
- Direct S3 Data Access with rioxarray
- Direct_S3_Access__rioxarray_clipping
- Getting Started with Cloud-Native Harmonized Landsat Sentinel-2 (HLS) Data in R