First let's set things up for business.
In [1]:
import pmagpy.ipmag as ipmag
import pmagpy.pmag as pmag
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from IPython.display import Image # needed for the screenshots below
%matplotlib inline
We are supposed to use ipmag.igrf() to calculate the inclination values in Sicily at a latitude of 38$^{\circ}$N and longitude of 14$^{\circ}$E from 1600 to 1945.
In [3]:
help(ipmag.igrf)
In [11]:
# make a list of desired dates
dates=range(1600,1950,10) # dates in 10-year increments, from 1600 to 1940
mod = 'cals10k_2' # choose the most recent model
lat,lon,alt=38,14,0 # latitude, longitude and altitude for Sicily
Vecs=[] # list for Dec,Inc,Int outputs
for date in dates: # step through the dates
    Vecs.append(ipmag.igrf([date,alt,lat,lon],mod=mod)) # append to list
vector_df = pd.DataFrame(Vecs) # make it into a Pandas dataframe
vector_df.columns=['dec','inc','int']
vector_df['vadms']=pmag.b_vdm(vector_df.int.values*1e-9, lat) # calculate the VADMs
vector_df['dec_adj']=vector_df['dec']
vector_df.loc[vector_df.dec>180,['dec_adj']]=vector_df.dec-360 # adjust declinations to be -180 => 180
fig=plt.figure(1,figsize=(7,9)) # set up the figure
fig.add_subplot(411) # make 4 rows of plots, this is the first
plt.plot(dates,vector_df.dec_adj) # plot the adjusted declinations
plt.ylabel('Declination ($^{\circ}$)')
plt.title('Geomagnetic field evaluated at Lat: '+str(lat)+' / Lon: '+str(lon))
fig.add_subplot(412) # this is the second
plt.plot(dates,vector_df.inc) # plot the inclinations
plt.ylabel('Inclination ($^{\circ}$)')
fig.add_subplot(413)
plt.plot(dates,vector_df.int*1e-3) # plot the intensities (in $\mu$T instead of nT)
plt.ylabel('Intensity ($\mu$T)')
fig.add_subplot(414) # plot the VADMs
plt.plot(dates,vector_df.vadms*1e-21) # plot as ZAm^2
plt.ylabel('VADM (ZAm$^2$)')
plt.xlabel('Dates (CE)');
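As an aside, pmag.b_vdm converts a field value $B$ (in T) observed at latitude $\lambda$ into a virtual axial dipole moment with the dipole formula VADM $=\frac{4\pi R^3}{\mu_0}\frac{B}{\sqrt{1+3\sin^2\lambda}}$, where $R$ is the radius of the Earth. That is why the intensities (in nT) are multiplied by 1e-9 before the call, and why we multiply by 1e-21 above to plot the moments as ZAm$^2$.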
We need to read in the data from geomagia_sel.txt. Take a peek and you will see that it is a comma-delimited file with a header in the second line. We read the data in with pandas.
In [12]:
geomagia=pd.read_csv('Chapter_14/geomagia_sel.txt',header=1)
geomagia.head()
Out[12]:
In [13]:
geomagia_incs=geomagia[geomagia['Inc[deg.]']>-90]
In [14]:
geomagia_incs['Inc[deg.]']
Out[14]:
Let's replot the model inclinations and decorate the plot with the geomagia data.
In [16]:
plt.plot(dates,vector_df.inc)
plt.plot(geomagia_incs['Age[yr.AD]'],geomagia_incs['Inc[deg.]'],'ro');
Next, let's go to the GEOMAGIA website to track down the sources of these data. Here is a screenshot of the query form:
In [18]:
Image(filename="geomagia_screenshot.png")
Out[18]:
I clicked on the 'Perform Query' button and scrolled down to the References part:
In [44]:
Image(filename="geomagia_refs.png")
Out[44]:
The next step is to look at the original paper to see whether the samples were demagnetized properly, etc. So I did that; here is the PDF: http://ac.els-cdn.com/0305440387900082/1-s2.0-0305440387900082-main.pdf?_tid=6f840d8c-c90e-11e4-a1c8-00000aacb360&acdnat=1426202636_67453570f83c4126d6c152fbb69fd35d
The direction was determined using AF demagnetization in 5 mT steps up to 50 mT (not very high), and the data from each specimen were averaged with Fisher statistics, apparently after looking at orthogonal plots to determine the stable portion of each demagnetization experiment. (This is of course not the way we do things nowadays; one should use the principal component technique of Kirschvink, 1980.) After getting a direction for each sample, they averaged the data by flow, also using Fisher statistics.
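For contrast, here is a minimal sketch of the Kirschvink-style principal component fit for a single specimen. The demagnetization data below are made up purely for illustration; in practice one would use pmag.domean on real measurement records, which also returns the MAD angle.
import numpy as np
import pmagpy.pmag as pmag
# hypothetical demag steps for one specimen: [dec, inc, intensity]
demag_steps = np.array([[350., 55., 1.0e-5],
                        [352., 53., 8.0e-6],
                        [351., 52., 6.0e-6],
                        [353., 52., 4.0e-6]])
carts = pmag.dir2cart(demag_steps)             # convert to Cartesian coordinates
carts = carts - carts.mean(axis=0)             # center the data (free line fit)
T = np.dot(carts.T, carts)                     # orientation tensor
vals, vecs = np.linalg.eigh(T)
pca_vec = vecs[:, np.argmax(vals)]             # principal eigenvector = best-fit line
if np.dot(pca_vec, carts[0] - carts[-1]) < 0:  # point it in the direction of decay
    pca_vec = -pca_vec
dec, inc, _ = pmag.cart2dir(pca_vec)
print('PCA direction: %7.1f %7.1f' % (dec, inc))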
Anyway, you get the idea. This procedure would have to be repeated for each reference to really assess the data quality.
Now let's do the equal area plot using the well-worn ipmag functions (from Chapter 2).
In [19]:
ipmag.plot_net(1) # make an equal area net
ipmag.plot_di(geomagia_incs['Dec[deg.]'].values,geomagia_incs['Inc[deg.]'].values,color='blue') # put on the dots
So... the data fall into two clumps. Is this because of secular variation? Or is there tilting? Or temporal aliasing? Hmmm.
To make this problem easier, we can read the PINT.xls file directly with pandas' read_excel and find out what the columns are:
In [4]:
pint=pd.read_excel('Chapter_14/PINT.xls')
pint.columns
Out[4]:
Before we do anything else, let's clean out the records with no VDM/VADM data:
In [5]:
pint=pint.dropna(subset=['VDM/VADM'])
pint['VDM/VADM'].head()
Out[5]:
To filter by age, we use the 'AGE' column, and by method, the 'IntM' column. First let's do age:
In [6]:
pint_last_10=pint[pint.AGE <10]
pint_last_10.AGE.head()
Out[6]:
Now let's fish out the 'T+' and 'S' methods separately, starting with T+ (which means Thellier-Thellier plus pTRM checks).
In [7]:
T_plus=pint_last_10[pint_last_10.IntM=='T+']
T_plus.IntM.unique()
Out[7]:
OK that did it. And now for the Shaw method data:
In [8]:
Shaw=pint_last_10[pint_last_10.IntM=='S']
Shaw.IntM.unique()
Out[8]:
Now we can plot the two data sets versus age:
In [9]:
plt.plot(T_plus['AGE'],T_plus['VDM/VADM'],'bs',label='T+')
plt.plot(Shaw['AGE'],Shaw['VDM/VADM'],'ro',label='Shaw')
plt.legend();
The Shaw data seem to be more scattered, with higher values at the same ages....
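To put a number on that impression (a quick check, not part of the original problem), we can compare summary statistics for the two method groups:
print(T_plus['VDM/VADM'].describe()) # mean and spread of the Thellier+ data
print(Shaw['VDM/VADM'].describe())   # same for the Shaw data
Next let's split the T+ data by polarity using the 'P' column.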
In [10]:
T_plus_trans=T_plus[T_plus.P=='T'] # pull out the transitional data
T_plus_norm=T_plus[T_plus.P=='N'] # the normal data
T_plus_rev=T_plus[T_plus.P=='R'] # the reverse data
In [11]:
plt.plot(T_plus_trans['AGE'],T_plus_trans['VDM/VADM'],'g^',label='Transitional')
plt.plot(T_plus_norm['AGE'],T_plus_norm['VDM/VADM'],'ro',label='Normal')
plt.plot(T_plus_rev['AGE'],T_plus_rev['VDM/VADM'],'bs',label='Reverse')
plt.xlabel('Age (Ma)')
plt.ylabel('V[A]DM (10$^{22}$ Am$^2$)')
plt.legend();
Very interesting. The 'transitional' data are perhaps lower in general, but there are a lot of high values. The highest values are all normal, but I'm not sure there is a big difference.
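For a more quantitative look (again just a quick check), pandas' groupby summarizes all three polarity states at once:
T_plus.groupby('P')['VDM/VADM'].describe() # summary statistics by polarity state (N/R/T)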
I went to the website for the paper Lawrence et al. (2009) here: http://earthref.org/MagIC/doi/10.1029/2008GC002072 and clicked on the icon under the heading 'Download'. This downloaded a text file which I can unpack with ipmag.download_magic.
In [12]:
help(ipmag.download_magic)
In [13]:
ipmag.download_magic('magic_contribution_13436.txt',input_dir_path='Chapter_14/Problem_3',\
dir_path='Chapter_14/Problem_3')
Out[13]:
In [14]:
site_means=pd.read_csv('Chapter_14/Problem_3/sites.txt',sep='\t',header=1) # header=1 skips the first line of the MagIC-format file
site_means.head()
Out[14]:
To plot them, I first fish out the directions that are not blank (some of the records are just intensity values).
In [15]:
site_means=site_means.dropna(subset = ['dir_dec','dir_inc'])
Now plot the net with ipmag.plot_net and the directions with ipmag.plot_di. But first I set up a figure object.
In [16]:
plt.figure(num=1)
ipmag.plot_net(1)
ipmag.plot_di(site_means['dir_dec'].values,site_means['dir_inc'].values,color='blue')
Now for the VGPs from the same records (vgp_lat and vgp_lon are the keys). First we have to drop the records without VGPs.
In [26]:
site_means=site_means.dropna(subset=['vgp_lat','vgp_lon'])
And then take the antipodes of the reverse VGPs using pmag.flip().
In [27]:
help(pmag.flip)
In [33]:
# need to make a nested array of values
vgp_block=site_means[['vgp_lon','vgp_lat']].values
norm_vgps,rev_vgps=pmag.flip(vgp_block)
Let's have fun with ipmag.plot_vgp(). There are two ways to make maps in Python: the old way, using matplotlib's Basemap toolkit, and the new way, using cartopy. We are shifting to cartopy, but the Basemap version is also shown below in case you have it installed and prefer it. The ipmag.plot_vgp() help page will walk us through this.
In [35]:
help(ipmag.plot_vgp)
In [36]:
has_basemap, Basemap = pmag.import_basemap()
has_cartopy, Cartopy = pmag.import_cartopy()
In [38]:
if not has_cartopy:
    print('You will need to correctly install cartopy in order to continue this cell')
else:
    map_axis=ipmag.make_orthographic_map(central_latitude=60,figsize=(6,6),land_color='bisque')
    ipmag.plot_vgp(map_axis, di_block=norm_vgps, color='red',\
                   markersize=30, legend='no')
    ipmag.plot_vgp(map_axis, di_block=rev_vgps, color='blue',\
                   markersize=30, legend='no')
In [ ]:
if not has_basemap:
    print('You will need to correctly install basemap in order to continue this cell')
else:
    m=Basemap(projection='ortho',lat_0=70,lon_0=230,resolution='c')
    plt.figure(num=1,figsize=(6,6))
    m.drawcoastlines()
    m.fillcontinents(color='bisque')
    m.drawmeridians(np.arange(0,360,30))
    m.drawparallels(np.arange(-60,90,30))
    ipmag.plot_vgp_basemap(m,di_block=norm_vgps, color='r',label='normal VGPs')
    ipmag.plot_vgp_basemap(m,di_block=rev_vgps, color='b',label='reverse VGPs')
In [69]:
Antarctic_combo=np.array(pmag.flip(vgp_block,combine=True)).transpose() # flip the reverse VGPs, combine with the normals, and transpose to [lons,lats]
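The statistic computed next is the standard VGP scatter, $S_p^2 = \frac{1}{N-1}\sum_{i=1}^{N}\Delta_i^2$, where $\Delta_i$ is the angle between the $i^{th}$ VGP and the spin axis, i.e., its colatitude.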
In [70]:
def Scalc(vgps):
    # vgps is a 2 x N array of [VGP longitudes, VGP latitudes]
    N=len(vgps[1])
    colats=90.-vgps[1] # angular distance of each VGP from the spin axis
    Sp2=np.sum(colats**2)/(N-1.)
    return '%7.1f'%(np.sqrt(Sp2))
In [71]:
print (Scalc(Antarctic_combo))
In [52]:
def Scalc_w_cutoff(vgps,c):
    colats=90.-vgps[1]
    Good=[]
    for colat in colats:
        if colat<c:Good.append(colat) # keep only VGPs within the colatitude cutoff
    sel_colats=np.array(Good)
    N=len(sel_colats) # count only the VGPs that survive the cutoff
    Sp2=np.sum(sel_colats**2)/(N-1.)
    return '%7.1f'%(np.sqrt(Sp2))
In [72]:
for c in range(90,20,-5):
    print (c,Scalc_w_cutoff(Antarctic_combo,c))
Well, the values of Sp with different cutoff values just look arbitrary to me. How am I supposed to know what the cutoff should be? That question seems to put the cart before the horse.
OK, now we repeat this with the data in Chapter_14/Problem_3/hawaii.txt.
In [73]:
hw_site_means=pd.read_csv('Chapter_14/Problem_3/hawaii.txt',sep='\t',header=0)
hw_site_means.head()
Out[73]:
It looks like the columns we want are model_vgp_lon and model_vgp_lat. We flip the reverse VGPs the same way as before, with combine=True:
In [74]:
vgp_block=hw_site_means[['model_vgp_lon','model_vgp_lat']].values
hawaii_combo=np.array(pmag.flip(vgp_block,combine=True)).transpose()
print (Scalc(hawaii_combo))
Wow, so there is a BIG latitudinal dependence. Now let's calculate Model G for the latitudes of Hawaii and Antarctica.
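Recall that Model G (McFadden et al., 1988) treats the scatter as the quadrature sum of an antisymmetric-family term that grows linearly with latitude $\lambda$ and a latitude-independent symmetric-family term: $S(\lambda)=\sqrt{(a\lambda)^2+b^2}$, with $a=0.26$ and $b=11.9$ being the values used in the function below.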
In [75]:
def ModelG(lat):
    a,b=0.26,11.9
    S2=(a*lat)**2+b**2
    return np.sqrt(S2)
In [76]:
antarctica_lat=np.average(site_means['lat'])
hawaii_lat=np.average(hw_site_means['model_lat'])
print (antarctica_lat,ModelG(antarctica_lat))
print (hawaii_lat,ModelG(hawaii_lat))
So the values we get are much larger than Model G predicts. But what about using a cutoff of, say, 45$^{\circ}$?
In [77]:
print (Scalc_w_cutoff(Antarctic_combo,45))
print (Scalc_w_cutoff(hawaii_combo,45))
So, Model G works well for data with a 45$^{\circ}$ cutoff. And now just to show off:
In [78]:
lats=range(91) # latitudes from 0 to 90
Ss=[]
for lat in lats:
    Ss.append(ModelG(lat))
plt.plot(lats,Ss);
First we will generate 100 directions with the function ipmag.tk03() at a latitude of 20$^{\circ}$.
In [79]:
help(ipmag.tk03)
In [88]:
hawaii_tk03=np.array(ipmag.tk03(lat=20)) # get a nested array of vectors
In [89]:
decs=hawaii_tk03.transpose()[0]
incs=hawaii_tk03.transpose()[1]
In [90]:
ipmag.plot_net(1)
ipmag.plot_di(dec=decs,inc=incs,color='red',edge='black')
Let's find the IGRF value at lat=20, lon=-156.
In [84]:
d,i,f=ipmag.igrf([2010,0,20,-156])
print (d,i)
Now let's find the GAD inclination at this latitude with pmag.pinc.
In [96]:
GAD_inc=pmag.pinc(20)
print (GAD_inc)
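As a check, the GAD formula $\tan I = 2\tan\lambda$ gives $I = \arctan(2\tan 20^{\circ}) \approx 36^{\circ}$, which is just what pmag.pinc returns.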
Now we rotate the hawaii_tk03 data to this expected GAD direction using pmag.dodirot_V.
In [91]:
help(pmag.dodirot_V)
In [94]:
tk03_DIs=np.column_stack((decs,incs))
In [97]:
tk03_rot=pmag.dodirot_V(tk03_DIs,0,GAD_inc)
And plot them.
In [99]:
ipmag.plot_net(1)
ipmag.plot_di(di_block=tk03_rot,color='blue',edge='black')
Now we do the same for the Hawaiian data we extracted from hawaii.txt, pulling out the declinations and inclinations.
In [105]:
hw_DIs=hw_site_means[['dec','inc']].values
hw_rot=pmag.dodirot_V(hw_DIs,0,GAD_inc)
In [106]:
ipmag.plot_net(1)
ipmag.plot_di(di_block=hw_rot,color='red',edge='black')
The data from Hawaii are dual polarity and apparently have a different mean direction from the GAD field. They also look stretched in a weird way, and I'm beginning to think that some part of Hawaii is tilted... Maybe a project for someone in here.
And now let's calculate the eigenparameters for the Hawaii data set.
In [107]:
help(pmag.doprinc)
In [108]:
pmag.doprinc(hw_DIs)
Out[108]:
And bingo! The average inclination is about 28$^{\circ}$, not 36$^{\circ}$! This is what you would get if there were a non-zero g$_2^0$ term.
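As a quick sanity check on the size of that anomaly: plugging 28$^{\circ}$ into the dipole formula gives an apparent latitude of $\arctan(\tan 28^{\circ}/2)\approx 15^{\circ}$, well south of the true 20$^{\circ}$.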