Time Series Visualization

Exploring drought and vegetation response with Google Earth Engine and Altair in Google Colab

Justin Braaten
Geo for Good Summit, 2019

bit.ly/ts-vis








Introduction

What this is not:

  • an introduction to Earth Engine.
  • an introduction to Pandas.
  • an introduction to Altair.
  • an introduction to Colab.

What this is:

This is an applied example of a workflow that combines the strengths of four technologies to interactively explore environmental time series data and to prototype more extensive analyses.








Turning data into insights

...and ultimately into a narrative









Data reduction

  • Earth Engine - geospatial data access and processing
  • Pandas - dataframe structure
  • Altair - data visualization
  • Colab - integrative Python environment








Drought and vegetation response

Sierra Nevada ecoregion


Source: Google | Earth Engine


Source: Flickr | Matt McGrath








Datasets








General workflow

  1. Filter the dataset
  2. Reduce region by a statistic
  3. Convert data format from list to dataframe
  4. Alter the dataframe
  5. Plot the dataframe
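Steps 3-5 can be sketched with plain Pandas on a mocked server-side result, before any Earth Engine code is involved (the values below are hypothetical):

```python
import pandas as pd

# Mocked Earth Engine output (hypothetical values): one inner list per
# property, mirroring what the region reduction helper returns later.
eeList = [
    [2015, 2016, 2017],  # Year
    [182, 182, 182],     # DOY
    [-3.1, -1.2, 0.8],   # PDSI
]

# Step 3: convert the list of lists to a dataframe (one row per property),
# then transpose so each row is an observation.
df = pd.DataFrame(eeList).transpose()

# Step 4: alter the dataframe - name the columns and set sensible dtypes.
df.columns = ['Year', 'DOY', 'PDSI']
df = df.astype({'Year': int, 'DOY': int, 'PDSI': float})

print(df.shape)  # (3, 3)
```

With the dataframe tidied this way, step 5 is a single Altair chart call, as shown throughout the sections below.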








Python setup

Install the Earth Engine API

  1. Use pip to install the latest version of the Earth Engine API.
  2. Import the Earth Engine library.
  3. Authenticate access (registration verification and Google account access).
  4. Initialize the API.

In [2]:
!pip install earthengine-api
import ee

ee.Authenticate()
ee.Initialize()


To authorize access needed by Earth Engine, open the following URL in a web browser and follow the instructions. If the web browser does not start automatically, please manually browse the URL below.

    (one-time authorization URL omitted)

The authorization workflow will generate a code, which you should paste in the box below. 
Enter verification code: (one-time code omitted)

Successfully saved authorization token.

Import other libraries


In [15]:
import numpy as np
import pandas as pd
import altair as alt

!pip install geopandas
import geopandas

!pip install eeconvert
import eeconvert


Installing collected packages: geojson, eeconvert
Successfully installed eeconvert-0.1.22 geojson-2.5.0

Define region reduction function

Reduction arguments

Define a global dictionary that provides arguments to the Earth Engine reduceRegion image method.


In [0]:
reReArgs = {
  'reducer': ee.Reducer.mean(),
  'geometry': ee.Geometry.Point([0, 0]),
  'scale': 1000,
  'crs': 'EPSG:5070',
  'bestEffort': True,
  'maxPixels': 1e14,
  'tileScale': 4}

Reduction function

This function reduces the pixels intersecting a region to a statistic. It operates on a single ee.Image and returns an ee.Feature (after the reduction, the pixel values and geometry no longer matter, so an aspatial feature is returned). The plan is to map this over an ee.ImageCollection.

  1. Derive a series of date properties that will be used in charting.
  2. Apply the reduceRegion method to the image using arguments provided by the global reReArgs dictionary.
  3. Return an ee.Feature without geometry that contains the region reduction result (an ee.Dictionary), all image properties, and the derived date properties.

In [0]:
def regionReduce(img):
  eeDate = img.date()
  year = eeDate.get('year')
  month = eeDate.getRelative('month', 'year')
  doy = eeDate.getRelative('day', 'year')
  date = eeDate.format('YYYY-MM-dd')
  
  stat = img.reduceRegion(
    reducer = reReArgs['reducer'],
    geometry = reReArgs['geometry'],
    scale = reReArgs['scale'],
    crs = reReArgs['crs'],
    bestEffort = reReArgs['bestEffort'],
    maxPixels = reReArgs['maxPixels'],
    tileScale = reReArgs['tileScale'])

  return(ee.Feature(None, stat) \
    .copyProperties(img, img.propertyNames())
    .set({
      'DOY': doy,
      'Month': month,
      'Year': year,
      'Date': date}))

Reduction to list

The result of mapping the above region reduction function over a collection is an ee.FeatureCollection, but a list is easier to work with. The following function takes a collection and a list of properties to aggregate into a list of lists.

  1. Apply the above defined regionReduce function to all images in the passed collection.
  2. Remove any features that are null for the requested properties (ensures equal-length property lists in the following step).
  3. Aggregate the requested properties into a list of lists.
  4. Extract the list of lists from the resulting dictionary and transfer the data to the Colab VM client with getInfo().

In [0]:
def getReReList(col, props):
  dict = col.map(regionReduce) \
    .filter(ee.Filter.notNull(props)) \
    .reduceColumns(
      reducer = ee.Reducer.toList().repeat(len(props)),
      selectors = props)

  return(ee.List(dict.get('list')).getInfo())

Reduce drought severity by region

Load PDSI and AOI

  1. Load gridded Palmer Drought Severity Index (PDSI) data as an ee.ImageCollection.
  2. Define the area of interest (AOI) as an ee.FeatureCollection. The original example filters the EPA Level-3 ecoregion boundaries to the Sierra Nevada; this adaptation instead loads a GeoJSON of Bhutan test sites and filters it to a single site.

In [0]:
def gdfToFc(gdf):
    """Converts a GeoDataFrame to an ee.FeatureCollection.

    Args:
        gdf (geopandas.GeoDataFrame): the input geodataframe.

    Returns:
        ee.FeatureCollection: the equivalent feature collection (server side).
    """
    gdfCopy = gdf.copy()
    gdfCopy["eeFeature"] = gdfCopy.apply(eeconvert.shapelyToEEFeature, 1)
    featureList = gdfCopy["eeFeature"].tolist()
    return ee.FeatureCollection(featureList)

In [0]:
pdsi = ee.ImageCollection('IDAHO_EPSCOR/PDSI')


geojson_url = 'https://gist.githubusercontent.com/jirikadlec2/c503a718d16dfb2f65dbf4e5a6627d32/raw/a817c5629c10df73d83ebf4e30e46fc3f0f5a204/bhutan_test_sites.geojson'
aoi_geodataframe = geopandas.read_file(geojson_url)
aoi_geodataframe.head()

bhutan_sites = gdfToFc(aoi_geodataframe)
#sn = ee.FeatureCollection(geojson_url)

sn = bhutan_sites.filter(ee.Filter.eq('description', 'site3'))

#sn = ee.FeatureCollection('EPA/Ecoregions/2013/L3') \
#  .filter(ee.Filter.eq('na_l3name', 'Sierra Nevada'))

Reduce PDSI by AOI

  1. Set the reduce region arguments.
  2. Reduce PDSI over the AOI and return the results as a list (one inner list per property in the provided list).

In [0]:
# Reset global arguments.
reReArgs['scale'] = 5000
reReArgs['reducer'] = ee.Reducer.mean()
reReArgs['geometry'] = sn
reReArgs['crs'] = 'EPSG:3310'

# Get a list containing a series of lists, one for each property given. 
snPdsiList = getReReList(pdsi, ['Year', 'Month', 'DOY', 'Date', 'pdsi'])

Format the table

Check the format

Print the list dimensions. These data are in memory on the Colab VM.


In [22]:
# Print the dimensions of the returned list.
print('n variables:', len(snPdsiList))
print('n observations:', len(snPdsiList[0]))


n variables: 5
n observations: 0

(Note: zero observations here likely means the gridded PDSI dataset does not cover the AOI, so every reduced value is null and is filtered out.)

Convert list of lists to dataframe

  1. Convert the list to a Pandas dataframe.
  2. Print the shape of the new dataframe; shape returns (n rows, n columns).

In [23]:
snPdsiDf = pd.DataFrame(snPdsiList)
print(snPdsiDf.shape)


(5, 0)

Transpose the dataframe

  1. Transpose the dataframe.
  2. Print the shape again.

In [24]:
snPdsiDf = snPdsiDf.transpose()
print(snPdsiDf.shape)


(0, 5)

Preview the first 5 rows of the dataframe.


In [25]:
snPdsiDf.head(5)


Out[25]:
0 1 2 3 4

It's missing column names - add some and preview again.


In [26]:
snPdsiDf.columns = ['Year', 'Month', 'DOY', 'Date', 'PDSI']
snPdsiDf.head(5)


Out[26]:
Year Month DOY Date PDSI

Now check the data type of each column.


In [27]:
snPdsiDf.dtypes


Out[27]:
Year     float64
Month    float64
DOY      float64
Date     float64
PDSI     float64
dtype: object

In Pandas, the object dtype generally indicates strings - set the intended data types and preview again.


In [28]:
snPdsiDf = snPdsiDf.astype({'Year': int, 'Month': int, 'DOY': int, 'Date': str, 'PDSI': float})
snPdsiDf.dtypes


Out[28]:
Year       int64
Month      int64
DOY        int64
Date      object
PDSI     float64
dtype: object

Correct the month: ee.Date.getRelative returns zero-based months, so add 1 to make them calendar months.


In [29]:
snPdsiDf['Month'] = snPdsiDf['Month'] + 1
snPdsiDf.head(5)


Out[29]:
Year Month DOY Date PDSI

Define a data structure translator

The above data structure transformation will be used multiple times, so make it a function.


In [0]:
cols = {'Year': int, 'Month': int, 'DOY': int, 'Date': str, 'PDSI': float}

def eeList2Df(eeList, cols):
  df = pd.DataFrame(eeList).transpose()
  df.columns = list(cols.keys())
  return(df.astype(cols))
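As a quick check, the helper reproduces the manual steps above on a toy list of lists (the helper is re-stated here so the snippet is self-contained; values are hypothetical):

```python
import pandas as pd

# Re-stated helper so this snippet stands alone.
def eeList2Df(eeList, cols):
  df = pd.DataFrame(eeList).transpose()
  df.columns = list(cols.keys())
  return df.astype(cols)

# Toy server-side result (hypothetical values): one inner list per property.
toy = [
    [2015, 2016],
    [6, 7],
    [180, 200],
    ['2015-06-29', '2016-07-18'],
    [-2.5, 1.3],
]
cols = {'Year': int, 'Month': int, 'DOY': int, 'Date': str, 'PDSI': float}

df = eeList2Df(toy, cols)
print(df.shape)  # (2, 5)
```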

Calendar heatmap


In [31]:
alt.Chart(snPdsiDf).mark_rect().encode(
  x = 'Year:O',
  y = 'Month:O',
  color = alt.Color('PDSI:Q', scale=alt.Scale(scheme="redblue", domain=(-5, 5))),
  tooltip = [
    alt.Tooltip('Year:O', title='Year'),
    alt.Tooltip('Month:O', title='Month'),
    alt.Tooltip('PDSI:Q', title='PDSI')
  ]).properties(width=700, height=300)


Out[31]:

Bar chart


In [32]:
alt.Chart(snPdsiDf).mark_bar(size=1).encode(
  x = 'Date:T',
  y = 'PDSI:Q',
  color = alt.Color('PDSI:Q', scale=alt.Scale(scheme="redblue", domain=(-5, 5))),
  tooltip = [
    alt.Tooltip('Date:T', title='Date'),
    alt.Tooltip('PDSI:Q', title='PDSI')
  ]
).properties(width=700, height=300)


Out[32]:

Reduce NDVI by region

A measure of photosynthetic capacity from MODIS.

Get data and reduce it

  1. Load MODIS NDVI data as an Earth Engine ee.ImageCollection.
  2. Set the region reduce scale argument to 1000 (meters).
  3. Get the data from Earth Engine to the Colab VM as a list of lists.

In [0]:
# Note: this helper is not used below; it computes NDVI from Sentinel-2
# bands B8 (NIR) and B4 (red).
def calculate_ndvi(image):
  ndvi = image.normalizedDifference(['B8', 'B4']).rename('NDVI')
  return image.addBands(ndvi)

In [0]:
ndvi = ee.ImageCollection('MODIS/006/MOD13A2').filterDate('2017-01-01', '2020-03-25').select('NDVI')

reReArgs['scale'] = 1000
snNdviList = getReReList(ndvi, ['Year', 'DOY', 'NDVI'])

Convert the list to a tidy dataframe and preview it.


In [47]:
cols = {'Year': int, 'DOY': int, 'NDVI': float}
snNdviDf = eeList2Df(snNdviList, cols)
snNdviDf.head(5)


Out[47]:
Year DOY NDVI
0 2017 0 5196.0
1 2017 16 4657.0
2 2017 32 4515.0
3 2017 48 5131.0
4 2017 64 4973.0

Remove the NDVI scaling (MOD13A2 stores NDVI scaled by 10,000) and preview again.


In [48]:
snNdviDf['NDVI'] = snNdviDf['NDVI']/10000
snNdviDf.head(5)


Out[48]:
Year DOY NDVI
0 2017 0 0.5196
1 2017 16 0.4657
2 2017 32 0.4515
3 2017 48 0.5131
4 2017 64 0.4973

DOY line chart


In [49]:
alt.Chart(snNdviDf).mark_line().encode(
  alt.X('DOY:O'),
  alt.Y('NDVI:Q', scale=alt.Scale(domain=(0.1, 0.88))),
  alt.Color('Year:O', scale=alt.Scale(scheme="magma")),
  tooltip=[
    alt.Tooltip('Year:O', title='Year'),  
    alt.Tooltip('DOY:O', title='DOY'),
    alt.Tooltip('NDVI:Q', title='NDVI')
  ]
).interactive().properties(width=700, height=300)


Out[49]:

Reduce the plot to median and interquartile range.


In [50]:
line = alt.Chart(snNdviDf).mark_line().encode(
  x='DOY:O',
  y='median(NDVI):Q',
).interactive()

band = alt.Chart(snNdviDf).mark_errorband(extent='iqr').encode(
  x='DOY:O',
  y=alt.Y('NDVI:Q', scale=alt.Scale(domain=(0.1, 0.7))),
).interactive().properties(width=700, height=300)

band + line


Out[50]:

Relationship between PDSI and NDVI

A scatterplot is a good way to visualize the relationship between two variables; here the x variable will be PDSI and the y variable will be NDVI. To create this plot, both variables must exist in the same dataframe, with each row an observation in time and two columns holding the corresponding PDSI and NDVI values. Since PDSI and NDVI are currently in two different dataframes, the first step is to merge them.

  1. Subset the data in the above plot where summer NDVI declines and then increases. Use the cursor tooltip to identify start and end day-of-year for this period.
  2. Reduce the values within the year by the minimum observation.
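The join performed a few cells below follows the standard Pandas merge pattern; a minimal sketch with toy yearly summaries (hypothetical values) looks like:

```python
import pandas as pd

# Toy yearly summaries (hypothetical values) standing in for the NDVI and
# PDSI subsets built in this section.
ndviYearly = pd.DataFrame({'Year': [2015, 2016], 'NDVI': [0.45, 0.52]})
pdsiYearly = pd.DataFrame({'Year': [2015, 2016], 'PDSI': [-3.0, 1.1]})

# A left join on 'Year' pairs each year's NDVI with its PDSI observation.
combo = pd.merge(ndviYearly, pdsiYearly, how='left', on='Year')
print(combo.columns.tolist())  # ['Year', 'NDVI', 'PDSI']
```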


In [0]:
ndviDoy = [224, 272] 

snNdviDfSub = snNdviDf[
  (snNdviDf['DOY'] >= ndviDoy[0]) & (snNdviDf['DOY'] <= ndviDoy[1])
]

snNdviDfSub = snNdviDfSub.groupby('Year').agg('min')

  1. Subset the PDSI data to include all days prior to the end of the above defined NDVI DOY range.
  2. Reduce the values within a year by the mean of observations.

In [0]:
pdsiDoy = [1, 272]

snPdsiDfSub = snPdsiDf[
  (snPdsiDf['DOY'] >= pdsiDoy[0]) & (snPdsiDf['DOY'] <= pdsiDoy[1])
]

snPdsiDfSub = snPdsiDfSub.groupby('Year').agg('mean')

  1. Perform a join on 'Year' to combine the two dataframes.

In [0]:
ndviPdsiCombo = pd.merge(snNdviDfSub, snPdsiDfSub, how='left', on='Year') \
  .drop(columns=['DOY_x', 'DOY_y', 'Month']) \
  .reset_index()

ndviPdsiCombo.head(5)


Out[0]:
Year NDVI PDSI
0 2000 0.503490 -0.895635
1 2001 0.493462 -1.544887
2 2002 0.495906 -2.062868
3 2003 0.515995 -0.067230
4 2004 0.494445 -1.930910

  1. Add a line of best fit between PDSI and NDVI. A line of best fit can be a helpful visual aid. Here, a 1st-degree polynomial is fit through the xy point cloud defined by corresponding NDVI and PDSI observations, and the resulting fit is added to the dataframe as a new column, Fit.

In [0]:
ndviPdsiCombo['Fit'] = np.poly1d(np.polyfit(ndviPdsiCombo['PDSI'], ndviPdsiCombo['NDVI'], 1))(ndviPdsiCombo['PDSI'])
ndviPdsiCombo.head(5)


Out[0]:
Year NDVI PDSI Fit
0 2000 0.503490 -0.895635 0.503880
1 2001 0.493462 -1.544887 0.500244
2 2002 0.495906 -2.062868 0.497344
3 2003 0.515995 -0.067230 0.508520
4 2004 0.494445 -1.930910 0.498083

The dataframe is ready for plotting. Since this plot includes both points and a line of best fit, two charts need to be created, one for each, and then combined.


In [0]:
# Point chart.
points = alt.Chart(ndviPdsiCombo).mark_circle(size=60).encode(
  x=alt.X('PDSI:Q', scale=alt.Scale(domain=(-5, 5))),
  y=alt.Y('NDVI:Q', scale=alt.Scale(domain=(0.4, 0.6))),
  color=alt.Color('Year:O', scale=alt.Scale(scheme='magma')),
  tooltip=['Year', 'PDSI', 'NDVI']
).interactive()

# Line chart.
fit = alt.Chart(ndviPdsiCombo).mark_line().encode(
  x=alt.X('PDSI:Q', scale=alt.Scale(domain=(-5, 5))),
  y=alt.Y('Fit:Q', scale=alt.Scale(domain=(0.4, 0.6))),
  color=alt.value('#808080')
).properties(width=700, height=300)

# Combine charts.
fit + points


Out[0]:

There appears to be some degree of positive correlation between PDSI and NDVI. Some of the greatest outliers are 2016, 2017, and 2018 - the three years following recovery from the long drought.
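As a rough check on that visual impression, the Pearson correlation can be computed directly; using just the five preview rows shown earlier (the full series would give a different value):

```python
import numpy as np

# The five preview rows of ndviPdsiCombo shown above.
ndvi = np.array([0.503490, 0.493462, 0.495906, 0.515995, 0.494445])
pdsi = np.array([-0.895635, -1.544887, -2.062868, -0.067230, -1.930910])

# Pearson correlation coefficient between PDSI and NDVI.
r = np.corrcoef(pdsi, ndvi)[0, 1]
print(round(r, 2))  # strongly positive for this small subset
```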

Patch-level vegetation change

At a coarse regional scale there appears to be a relationship between drought and NDVI. This section looks more closely at the effects of drought on vegetation at the patch level. Here, a combined Landsat and Sentinel-2 time series collection (filtered to 2017-present in the code below) is created to provide greater temporal context for change.

Find a point of interest

Zoom to a location and click the map to list the latitude and longitude.


In [184]:
sn = bhutan_sites.filter(ee.Filter.eq('description', 'site3'))


# Import the Folium library.
import folium

# Set visualization parameters.
visParams = {'min':0, 'max':7000}

# Create a folium map object.
site_centroid = sn.geometry().centroid().getInfo()
site_lon = site_centroid['coordinates'][0]
site_lat = site_centroid['coordinates'][1]
fMap = folium.Map(
  location=[site_lat, site_lon],
  tiles='https://mt.google.com/vt/lyrs=s&x={x}&y={y}&z={z}',
  attr='Map Data: Google',
  zoom_start=17,
  height=500,
  width=700
)

# Add AOI to map.
folium.GeoJson(
  sn.geometry().getInfo(),
  name='geojson',
  style_function=lambda x: {'fillColor': '#00000000', 'color': '#FFFFFF'},
).add_to(fMap)

# Add a lat lon popup.
folium.LatLngPopup().add_to(fMap)

# Display the map.
display(fMap)


Prepare a Landsat surface reflectance collection

Edit the date window inputs based on the NDVI curve plotted previously. In this adaptation, the AOI geometry comes from the site feature collection rather than from a clicked latitude and longitude.


In [0]:
# Start and end days of year (DOY window).
startDay = 1
endDay = 365

Prepare a Landsat surface reflectance collection 2017-present.


In [0]:
# Use the site geometry as the AOI; its centroid serves as the point for
# filtering the collections.
point = sn.geometry().centroid()
polygon = sn.geometry()
#point = ee.Geometry.Point([Longitude, Latitude])

# Define function to get and rename bands of interest from OLI.
def renameOLI(img):
  return(img.select(
    ee.List(['B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'pixel_qa']),
    ee.List(['Blue', 'Green', 'Red', 'NIR', 'SWIR1', 'SWIR2', 'pixel_qa'])))

# Define function to get and rename bands of interest from ETM+.
def renameETM(img):
  return(img.select(
    ee.List(['B1', 'B2', 'B3', 'B4', 'B5', 'B7', 'pixel_qa']),
    ee.List(['Blue', 'Green', 'Red', 'NIR', 'SWIR1', 'SWIR2', 'pixel_qa'])))

# Define function to get and rename bands of interest from Sentinel-2.
def renameS2(img):
  return(img.select(
    ee.List(['B2', 'B3', 'B4', 'B8A', 'B11', 'B12', 'SCL']),
    ee.List(['Blue', 'Green', 'Red', 'NIR', 'SWIR1', 'SWIR2', 'pixel_qa'])))

# Define function to mask out clouds and cloud shadows.
def fmask(img):
  cloudShadowBitMask = 1 << 3
  cloudsBitMask = 1 << 5
  qa = img.select('pixel_qa')
  mask = qa.bitwiseAnd(cloudShadowBitMask).eq(0)\
    .And(qa.bitwiseAnd(cloudsBitMask).eq(0))
  return(img.updateMask(mask))

# Define function to mask Sentinel-2 pixels by SCL class, keeping vegetation
# (4), bare (5), water (6), unclassified (7), and snow/ice (11).
def sclMask(img):
  qa = img.select('pixel_qa')
  mask = qa.eq(4)\
    .Or(qa.eq(5))\
    .Or(qa.eq(6))\
    .Or(qa.eq(7))\
    .Or(qa.eq(11))
  return(img.updateMask(mask))

# Define function to add year.
def setYear(img):
  year = ee.Image(img).date().get('year')
  return(img.set('Year', year))

def setMonth(img):
  month = ee.Image(img).date().get('month')
  return img.set('Month', month)

# Define function to calculate NBR.
def calcNBR(img):
  return(img.normalizedDifference(ee.List(['NIR', 'SWIR2'])).rename('NBR'))

# Define function to calculate NDVI.
def calcNDVI(img):
  return(img.normalizedDifference(ee.List(['NIR', 'Red'])).rename('NDVI'))

# Define function to calculate MSAVI2.
def calcMSAVI2(img):
  return img.expression(
    '(2 * NIR + 1 - sqrt(pow((2 * NIR + 1), 2) - 8 * (NIR - RED))) / 2',
    {
      'NIR': img.select('NIR'),
      'RED': img.select('Red')
    }
  ).rename('MSAVI2')

# Define function to prepare OLI images.
def prepOLI(img):
  orig = img
  img = renameOLI(img)
  img = fmask(img)
  #img = calcNBR(img)
  #img = calcNDVI(img)
  img = calcMSAVI2(img)
  img = img.copyProperties(orig, orig.propertyNames())
  return(setYear(img))

# Define function to prepare ETM+ images.
def prepETM(img):
  orig = img
  img = renameETM(img)
  img = fmask(img)
  #img = calcNBR(img)
  #img = calcNDVI(img)
  img = calcMSAVI2(img)
  img = img.copyProperties(orig, orig.propertyNames())
  return(setYear(img))

# Define function to prepare S2 images.
def prepS2(img):
  orig = img
  img = renameS2(img)
  img = sclMask(img)
  #img = calcNBR(img)
  #img = calcNDVI(img)
  img = calcMSAVI2(img)
  img = img.copyProperties(orig, orig.propertyNames())
  return(setYear(img))

# Get the source collections.
tmCol = ee.ImageCollection("LANDSAT/LT05/C01/T1_SR")
etmCol = ee.ImageCollection("LANDSAT/LE07/C01/T1_SR")
oliCol = ee.ImageCollection("LANDSAT/LC08/C01/T1_SR")
s2Col = ee.ImageCollection("COPERNICUS/S2_SR")

# Filter collections and prepare them for merging.
oliCol = oliCol.filterBounds(point).filter(ee.Filter.calendarRange(startDay, endDay, 'day_of_year')).map(prepOLI)
etmCol = etmCol.filterBounds(point).filter(ee.Filter.calendarRange(startDay, endDay, 'day_of_year')).map(prepETM)
tmCol = tmCol.filterBounds(point).filter(ee.Filter.calendarRange(startDay, endDay, 'day_of_year')).map(prepETM)  # TM shares ETM+ band names.
s2Col = s2Col.filterBounds(point).filter(ee.Filter.calendarRange(startDay, endDay, 'day_of_year')).map(prepS2)


# Merge the collections.
landsatCol = oliCol \
  .merge(etmCol) \
  .merge(tmCol) \
  .merge(s2Col) \
  .filterDate('2017-01-01', '2020-03-30')

# Get a distinct year-month collection. NOTE: this optional aggregation
# expects a 'YearMonth' property on each image, which would need to be set
# first (e.g. by combining the setYear and setMonth helpers above); the
# final mapping call is left commented out.
distinctYearMonthCol = landsatCol.distinct('YearMonth')

# Define a filter that identifies which images from the complete collection
# match the year-month of the distinct year-month collection.
joinFilter = ee.Filter.equals(leftField='YearMonth', rightField='YearMonth')

# Define a join.
join = ee.Join.saveAll('year_month_matches')

# Apply the join and convert the resulting FeatureCollection to an
# ImageCollection.
joinCol = ee.ImageCollection(join.apply(distinctYearMonthCol, landsatCol, joinFilter))

# Apply mean reduction among matching year-month collections.
def reduceByJoin(img):
  yearCol = ee.ImageCollection.fromImages(ee.Image(img).get('year_month_matches'))
  return(yearCol.reduce(ee.Reducer.mean()) \
    .rename('MSAVI2') \
    .set('system:time_start', ee.Image(img).date().update(day=1).millis()))

#landsatCol = joinCol.map(reduceByJoin)
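
The MSAVI2 expression used above can be sanity-checked client side with NumPy (the reflectance values here are hypothetical, not drawn from the imagery):

```python
import numpy as np

# Client-side version of the MSAVI2 formula from calcMSAVI2 above.
def msavi2(nir, red):
  return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

# Hypothetical surface reflectance values for a vegetated pixel.
print(round(msavi2(0.5, 0.2), 3))  # 0.368
```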

Reduce the region.


In [0]:
# Reset global arguments.
reReArgs['reducer'] = ee.Reducer.mean()
reReArgs['scale'] = 30
reReArgs['geometry'] = polygon

# Get a list containing a series of lists, one for each property given. 
pointNDVIList = getReReList(landsatCol, ['Date', 'MSAVI2'])

#pointNbrList = getReReList(landsatCol, ['Date', 'NBR'])

Transform and tidy the table.


In [182]:
cols = {'Date': str, 'MSAVI2': float}
pointNDVIDf = eeList2Df(pointNDVIList, cols)
pointNDVIDf.tail(5)


Out[182]:
Date MSAVI2
224 2020-03-10 0.580526
225 2020-03-15 0.564045
226 2020-03-15 0.573339
227 2020-03-20 0.288790
228 2020-03-20 0.434820

Display the intra-annual reduction as a line plot.


In [183]:
alt.Chart(pointNDVIDf).mark_line().encode(
  x='Date:T',
  y='MSAVI2:Q',
  tooltip=[
    alt.Tooltip('Date:T', title='Date'),
    alt.Tooltip('MSAVI2:Q', title='MSAVI2')
  ]
).interactive().properties(width=700, height=300)


Out[183]: