These are some more advanced worked examples demonstrating features, generally for members of the Dept. of Geosciences. Each one is intended to be stand-alone, i.e. it does not depend on anything done in the other examples.

Timeseries Example: plotting winter-average SAT, with a decadal average over the top


In [1]:
# modules we'll need...
import iris
import iris.analysis.cartography # <-- for grid-cell weighting
import iris.coord_categorisation # <-- for creating new season coords
import iris.quickplot # <-- wrapper for easy plotting of Iris cubes
import matplotlib.dates # <-- tools for generating lists of dates easily
import matplotlib.pyplot # <-- general plotting tools

# load..
sat = iris.load_cube('~/data/HadCRUT*.nc', \
                     'near_surface_temperature_anomaly')

# Average in space..
def area_avg(cube):
    """
    Simple function to calculate the area-average.
    Makes lots of assumptions!
    """
    # --> add bounds
    cube.coord('latitude').guess_bounds()
    cube.coord('longitude').guess_bounds()
    # --> calculate weight of each cell
    cell_size = iris.analysis.cartography.area_weights(cube)
    # --> perform average
    return cube.collapsed(['latitude','longitude'], \
                          iris.analysis.MEAN, weights=cell_size)

sat_ts = area_avg(sat)
    
# Calculate the winter average..
# --> add 'season' coordinates
iris.coord_categorisation.add_season(sat_ts,'time')
iris.coord_categorisation.add_season_year(sat_ts,'time')
# --> group data by season and season-year, and average
sat_ts_seasons = sat_ts.aggregated_by(['season','season_year'],\
                                      iris.analysis.MEAN)
# --> get just the winters
sat_ts_djf = sat_ts_seasons.extract(iris.Constraint(season='djf'))

# Calculate a decadally-smoothed form of the data
#     At the moment rolling_window has a bug, so we need to remove
#     any dependent string coordinate (e.g. season) first
# --> remove string coordinate
sat_ts_djf.remove_coord('season')
# --> apply a rolling-window, averaging, in chunks of 10
smooth = sat_ts_djf.rolling_window('time', iris.analysis.MEAN, 10)

# Make our plot..
# --> make a larger canvas
matplotlib.pyplot.figure(figsize=(8,8))
# --> plot the raw data in grey
iris.quickplot.plot(sat_ts_djf, 'grey')
# --> add a straight-line at 0
matplotlib.pyplot.axhline(0, linestyle='--', color='grey')
# --> add our smoothed data, in red, and thicker
iris.quickplot.plot(smooth, 'r', linewidth=2.0)

# We don't really like the position of those x-axis ticks, so let's
# change those up.
# --> get a handle to the x-axis..
xaxis = matplotlib.pyplot.gca().xaxis
# --> use matplotlib's generator to make date ticks every 20 years
xaxis.set_major_locator(matplotlib.dates.YearLocator(20))

# Display..
matplotlib.pyplot.show()


/opt/anaconda/envs/SciTools/lib/python2.7/site-packages/Iris-1.7.0_dev-py2.7-linux-x86_64.egg/iris/analysis/cartography.py:316: UserWarning: Using DEFAULT_SPHERICAL_EARTH_RADIUS.
  warnings.warn("Using DEFAULT_SPHERICAL_EARTH_RADIUS.")
/opt/anaconda/envs/SciTools/lib/python2.7/site-packages/Iris-1.7.0_dev-py2.7-linux-x86_64.egg/iris/cube.py:3101: UserWarning: The bounds of coordinate u'time' were ignored in the rolling window operation.
  'the rolling window operation.' % coord_.name())
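The area_weights call above computes the surface area of each grid cell, which on a regular latitude-longitude grid is proportional to the cosine of latitude (hence the spherical-Earth warning). The same idea, sketched with plain numpy on made-up data:

```python
import numpy as np

# Hypothetical 1-D example: temperatures on a coarse latitude grid.
lats = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])   # degrees
temps = np.array([1.0, 2.0, 3.0, 2.0, 1.0])

# Grid cells shrink towards the poles roughly as cos(latitude),
# so weight each cell accordingly before averaging.
weights = np.cos(np.deg2rad(lats))
weighted_mean = np.average(temps, weights=weights)

print(weighted_mean)   # pulled towards the (warmer) equatorial values
print(temps.mean())    # naive, unweighted mean for comparison
```

Note how the weighted mean sits above the naive mean, because the high-latitude cells (which hold the cooler values here) contribute less area.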

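Similarly, the rolling_window call above is just a moving average along the time dimension; on a bare 1-D array the equivalent operation is a convolution with a uniform kernel. A minimal sketch with made-up data:

```python
import numpy as np

# Hypothetical yearly values standing in for winter-mean SAT; a
# 10-point rolling mean mimics rolling_window('time', MEAN, 10).
values = np.arange(20, dtype=float)
window = 10
smooth = np.convolve(values, np.ones(window) / window, mode='valid')

print(len(smooth))   # 20 - 10 + 1 = 11 points
print(smooth[0])     # mean of values[0:10] = 4.5
```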
How to compute pentad (5-day) averages

Here we..

  • load the monthly data again
    • Not the best, but I happen to have it to hand.
  • average it in lat/lon to create a timeseries
  • add a new coordinate ('pentad')
    • this will represent which pentad-of-the-year the given data falls into
  • create a 'pentad climatology' for each of two periods, 1850--1880 and 1980--2010
  • plot both on line-graphs to show the change

In [10]:
# modules we'll need
import iris
import iris.coord_categorisation
import iris.quickplot
import matplotlib.pyplot

# load data..
sat = iris.load_cube('~/data/HadCRUT*.nc',\
                    'near_surface_temperature_anomaly')

# get a timeseries by averaging in space..
# (for ease I've just re-used the function from the first example)
sat_ts = area_avg(sat)

# Add a 'pentad' coordinate.
# --> start by adding the day-of-year..
iris.coord_categorisation.add_day_of_year(sat_ts,'time')
# --> define a helpful translation function..
def pentad_of_year(coord, point):
    # floor-divide the day-of-year into 5-day chunks
    return point // 5
# --> use that function to create a new 'pentad' coordinate.
iris.coord_categorisation.add_categorised_coord( \
                     sat_ts, "pentad", sat_ts.coord('day_of_year'), \
                    pentad_of_year,units='1')

# show that worked..
print "Day of year:",sat_ts.coord('day_of_year').points
print "Pentad nos:",sat_ts.coord('pentad').points
print '--'*50

# Calculate a climatology for 1850--1880, and 1980--2010..
# --> add a 'year' label, to make it easier to select data
iris.coord_categorisation.add_year(sat_ts,'time')
sat_ts_start = sat_ts.extract(iris.Constraint(year=lambda yr:1850<yr<1880))
sat_ts_end = sat_ts.extract(iris.Constraint(year=lambda yr:1980<yr<2010))
# --> aggregate for each...
sat_ts_start = sat_ts_start.aggregated_by('pentad',iris.analysis.MEAN)
sat_ts_end = sat_ts_end.aggregated_by('pentad',iris.analysis.MEAN)

# Plot!

# --> create an axis in the first position of a 1 row * 2 column figure
#matplotlib.pyplot.subplot2grid((1,2),(0,0))
# --> plot the climatologies..
iris.quickplot.plot(sat_ts_start,'b',label='1850--1880')
iris.quickplot.plot(sat_ts_end,'r',label='1980--2010')

# Display..
matplotlib.pyplot.show()



Day of year: [ 16  46  75 ..., 167 197 228]
Pentad nos: [  3.   9.  15. ...,  33.  39.  45.]
----------------------------------------------------------------------------------------------------
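As a standalone sanity check, the categorisation above amounts to floor-dividing the day-of-year by 5 (plain Python, no iris needed; the day values are taken from the output above):

```python
def pentad_of_year(day_of_year):
    # Same mapping as the categorisation function above:
    # days 0-4 -> pentad 0, days 5-9 -> pentad 1, and so on.
    return int(day_of_year // 5)

days = [16, 46, 75, 167, 197, 228]
print([pentad_of_year(d) for d in days])   # [3, 9, 15, 33, 39, 45]
```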

Note that the 'x' axis is wrong: it hasn't plotted against the coordinate, instead each value has been plotted against its position in the list. Here's how we specify what to plot against...


In [11]:
iris.quickplot.plot(sat_ts_start.coord('pentad'),sat_ts_start,'b',\
                    label='1850--1880')
iris.quickplot.plot(sat_ts_end.coord('pentad'),sat_ts_end,'r',\
                    label='1980--2010')
matplotlib.pyplot.legend(loc='center right')
matplotlib.pyplot.show()


Well that's weird. There must be something up with the coord dimension...


In [4]:
print sat_ts_start
print '--'*50
print sat_ts_start.coord('day_of_year').points


near_surface_temperature_anomaly / (K) (-- : 14)
     Auxiliary coordinates:
          day_of_year                      x
          pentad                           x
          time                             x
          year                             x
     Scalar coordinates:
          latitude: 0.0 degrees, bound=(-90.0, 90.0) degrees
          longitude: 0.0 degrees, bound=(-180.0, 180.0) degrees
     Attributes:
          Conventions: CF-1.0
          comment: 
          ensemble_member_index: 0
          ensemble_members: 100
          history: Updated at 30/09/2013 09:38:38
          institution: Met Office Hadley Centre / Climatic Research Unit, University of East ...
          reference: Morice, C. P., J. J. Kennedy, N. A. Rayner, and P. D. Jones (2012), Quantifying...
          reference_period: [1961 1990]
          source: CRUTEM.4.2.0.0, HadSST.3.1.0.0
          title: HadCRUT4 near-surface temperature ensemble data - ensemble median
          version: HadCRUT.4.2.0.0
     Cell methods:
          mean: latitude, longitude
          mean: pentad
----------------------------------------------------------------------------------------------------
[  16.   46.   75.  106.  136.  167.  197.  228.  259.  289.  320.  350.
  260.  290.]

Hmm, what's going on with the underlying data? Re-extract..


In [7]:
tmp = sat_ts.extract(iris.Constraint(year=lambda yr:1850<yr<1855))
for i in range(4):
    print tmp.coord('pentad').points[i*12:(i+1)*12]
print '--'*50
import numpy
print numpy.unique(tmp.coord('pentad').points)
print len(numpy.unique(tmp.coord('pentad').points))


[  3.   9.  15.  21.  27.  33.  39.  45.  51.  57.  64.  70.]
[  3.   9.  15.  21.  27.  33.  39.  45.  52.  58.  64.  70.]
[  3.   9.  15.  21.  27.  33.  39.  45.  51.  57.  64.  70.]
[  3.   9.  15.  21.  27.  33.  39.  45.  51.  57.  64.  70.]
----------------------------------------------------------------------------------------------------
[  3.   9.  15.  21.  27.  33.  39.  45.  51.  52.  57.  58.  64.  70.]
14

That seems largely fine. There are a few years where the pentad numbers are shifted by one (52 and 58 rather than 51 and 57), presumably because leap years shift the day-of-year. Maybe we should try removing the other coordinates before aggregating??
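The off-by-one pentads can be demonstrated directly with the standard library: in a leap year the day-of-year of any date after February is shifted by one, which can push it into the next 5-day chunk. (The specific dates below are illustrative, not taken from the data.)

```python
import datetime

# Day-of-year of the same mid-month date differs in leap years,
# which nudges it into a neighbouring pentad (day // 5).
pentads = {}
for year in (1851, 1852):   # 1852 is a leap year
    day = datetime.date(year, 9, 16).timetuple().tm_yday
    pentads[year] = day // 5
    print('%d: day %d -> pentad %d' % (year, day, day // 5))
```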


In [9]:
# remove all other auxiliary coordinates..
tmp2 = tmp.copy()
tmp2.remove_coord('time')
tmp2.remove_coord('day_of_year')
tmp2.remove_coord('year')
# check it worked..
print tmp2
# do the aggregation..
print tmp2.aggregated_by('pentad',iris.analysis.MEAN).coord('pentad').points


near_surface_temperature_anomaly / (K) (-- : 48)
     Auxiliary coordinates:
          pentad                           x
     Scalar coordinates:
          latitude: 0.0 degrees, bound=(-90.0, 90.0) degrees
          longitude: 0.0 degrees, bound=(-180.0, 180.0) degrees
     Attributes:
          Conventions: CF-1.0
          comment: 
          ensemble_member_index: 0
          ensemble_members: 100
          history: Updated at 30/09/2013 09:38:38
          institution: Met Office Hadley Centre / Climatic Research Unit, University of East ...
          reference: Morice, C. P., J. J. Kennedy, N. A. Rayner, and P. D. Jones (2012), Quantifying...
          reference_period: [1961 1990]
          source: CRUTEM.4.2.0.0, HadSST.3.1.0.0
          title: HadCRUT4 near-surface temperature ensemble data - ensemble median
          version: HadCRUT.4.2.0.0
     Cell methods:
          mean: latitude, longitude
[  3.   9.  15.  21.  27.  33.  39.  45.  51.  57.  64.  70.  52.  58.]

In [8]:
print sat_ts.aggregated_by('pentad',iris.analysis.MEAN).coord('pentad')


AuxCoord(array([  3.,   9.,  15.,  21.,  27.,  33.,  39.,  45.,  51.,  57.,  64.,
        70.,  52.,  58.]), standard_name=None, units=Unit('5 days'), long_name=u'pentad', attributes={'start_month': 1, 'end_month': 8, 'start_year': 1850, 'end_year': 2013})

Examples to build next...

  • Control of colours / the colourbar
    • position / orientation
    • with a specific number of segments
    • pointy ends
  • Use of an 'aggregator'
    • e.g. for nos of months above a certain temperature in each decade
  • A map of the trend in temperature
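For the 'aggregator' idea, the underlying counting logic can be sketched on plain numpy arrays (entirely hypothetical data, with the decade grouping done by hand rather than via an iris coordinate):

```python
import numpy as np

# Hypothetical monthly anomalies: a slow warming trend plus a seasonal cycle.
months = np.arange(240)                       # 20 years of monthly data
anoms = 0.002 * months + 0.3 * np.sin(2 * np.pi * months / 12)

decade = months // 120                        # 0 = first decade, 1 = second
threshold = 0.25
counts = [int(np.sum(anoms[decade == d] > threshold)) for d in (0, 1)]
print(counts)   # the warming trend means more warm months in the second decade
```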

ASIDE: This is how to get 'true' division (i.e. division always returns a float rather than truncating), which is the default in Python 3 and is available in Python 2 now via a __future__ import..


In [2]:
from __future__ import division
2/3


Out[2]:
0.6666666666666666