SEG Machine Learning (Well Log Facies Prediction) Contest

Entry by Justin Gosses of team Pet_Stromatolite

This is an "open science" contest designed to introduce people to machine learning with well logs and brainstorm different methods through collaboration with others, so this notebook is based heavily on the introductary notebook with my own modifications.

more information at https://github.com/seg/2016-ml-contest

and even more information at http://library.seg.org/doi/abs/10.1190/tle35100906.1

This data is from the Council Grove gas reservoir in Southwest Kansas. The Panoma Council Grove Field is predominantly a carbonate gas reservoir encompassing 2700 square miles in Southwestern Kansas. This dataset is from nine wells (with 4149 examples), consisting of a set of seven predictor variables and a rock facies (class) for each example vector and validation (test) data (830 examples from two wells) having the same seven predictor variables in the feature vector. Facies are based on examination of cores from nine wells taken vertically at half-foot intervals. Predictor variables include five from wireline log measurements and two geologic constraining variables that are derived from geologic knowledge. These are essentially continuous variables sampled at a half-foot sample rate.

The seven predictor variables are:

The nine discrete facies (classes of rocks) are:

  1. Nonmarine sandstone
  2. Nonmarine coarse siltstone
  3. Nonmarine fine siltstone
  4. Marine siltstone and shale
  5. Mudstone (limestone)
  6. Wackestone (limestone)
  7. Dolomite
  8. Packstone-grainstone (limestone)
  9. Phylloid-algal bafflestone (limestone)

These facies aren't discrete, and gradually blend into one another. Some have neighboring facies that are rather close. Mislabeling within these neighboring facies can be expected to occur. The following table lists the facies, their abbreviated labels and their approximate neighbors.

Facies Label Adjacent Facies
1 SS 2
2 CSiS 1,3
3 FSiS 2
4 SiSh 5
5 MS 4,6
6 WS 5,7
7 D 6,8
8 PS 6,7,9
9 BS 7,8

Let's clean up this dataset. The 'Well Name' and 'Formation' columns can be turned into a categorical data type.

=================================================================================================================

Notes:

Early Ideas for feature engineering

  • take out any points in individual wells where not all the logs are present
  • test whether error increases around the depths where PE is absent?
  • test whether using formation, depth, or depth&formation as variables impacts prediction
  • examine well logs & facies logs (including prediction wells) to see if there aren't trends that might be dealt with by increasing the population of certain wells over others in the training set?
  • explore effect size of using/not using marine or non-marine flags
  • explore making 'likely to predict wrong' flags based on first-pass results with thin facies surrounded by thicker facies such that you might expand a 'blended' response due to the measured response of the tool being thicker than predicted facies
  • explore doing the same above but before prediction using range of thickness in predicted facies flags vs. range of thickness in known facies flags
  • explore using multiple prediction loops, in order words, predict errors not just facies.
  • Explore error distribution: adjacent vs. non-adjacent facies, by thickness, marine vs. non-marine, by formation, and possible human judgement patterns that influence interpreted facies.

In [251]:
## import modules

from __future__ import division
%matplotlib inline
import matplotlib as mpl
import matplotlib.pyplot as plt
mpl.rcParams['figure.figsize']=(20.0,10.0)
inline_rc = dict(mpl.rcParams)

import pandas as pd
import numpy as np
import seaborn as sns

from sklearn import preprocessing
from sklearn.model_selection import LeavePGroupsOut
from sklearn.metrics import f1_score
from sklearn.multiclass import OneVsOneClassifier
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

from scipy.signal import medfilt

from pandas.tools.plotting import scatter_matrix

import matplotlib.colors as colors

import xgboost as xgb
from sklearn.metrics import confusion_matrix, f1_score, accuracy_score
from classification_utilities import display_cm, display_adj_cm
from sklearn.model_selection import GridSearchCV


from sklearn.model_selection import validation_curve
from sklearn.datasets import load_svmlight_files

from xgboost.sklearn import XGBClassifier
from scipy.sparse import vstack

seed = 123
np.random.seed(seed)

In [291]:
# Parameters
feature_names = ['GR', 'ILD_log10', 'DeltaPHI', 'PHIND', 'PE', 'NM_M', 'RELPOS']
featurePlusFacies_names = ['Facies', 'GR', 'ILD_log10', 'DeltaPHI', 'PHIND', 'PE', 'NM_M', 'RELPOS']
# featureIncForm_names = ['Formation', 'GR', 'ILD_log10', 'DeltaPHI', 'PHIND', 'PE', 'NM_M', 'RELPOS']
featureIncForm_names = ['Formation', 'GR', 'ILD_log10', 'DeltaPHI', 'PHIND', 'PE', 'RELPOS']
facies_names = ['SS', 'CSiS', 'FSiS', 'SiSh', 'MS', 'WS', 'D', 'PS', 'BS']
facies_colors = ['#F4D03F', '#F5B041','#DC7633','#6E2C00', '#1B4F72','#2E86C1', '#AED6F1', '#A569BD', '#196F3D']
formation_names = ['A1 LM', 'A1 SH', 'B1 LM', 'B1 SH', 'B2 LM', 'B2 SH', 'B3 LM','B3 SH', 'B4 LM', 'B4 SH', 'B5 LM', 'B5 SH', 'C LM', 'C SH']
## formation_colors = ['#F4D03F', '#F5B041','#DC7633','#6E2C00', '#1B4F72','#2E86C1', '#AED6F1', '#A569BD', '#196F3D','#FAA03F', '#FAA041','#DCAA33','#6AAC00', '#1BAA72']
formation_colors = ['#000000','#00FF00','#0000FF','#FF0000','#01FFFE','#FFA6FE','#FFDB66','#006401','#010067','#95003A','#007DB5','#FF00F6','#FFEEE8','#774D00']

In [292]:
### setting up options in pandas
from pandas import set_option
set_option("display.max_rows", 100)
pd.options.mode.chained_assignment = None

In [293]:
### taking a look at the training dataset 
filename = 'training_data.csv'
training_data = pd.read_csv(filename)
training_data


Out[293]:
Facies Formation Well Name Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
0 3 A1 SH SHRIMPLIN 2793.0 77.450 0.664 9.900 11.915 4.600 1 1.000
1 3 A1 SH SHRIMPLIN 2793.5 78.260 0.661 14.200 12.565 4.100 1 0.979
2 3 A1 SH SHRIMPLIN 2794.0 79.050 0.658 14.800 13.050 3.600 1 0.957
3 3 A1 SH SHRIMPLIN 2794.5 86.100 0.655 13.900 13.115 3.500 1 0.936
4 3 A1 SH SHRIMPLIN 2795.0 74.580 0.647 13.500 13.300 3.400 1 0.915
... ... ... ... ... ... ... ... ... ... ... ...
3227 5 C LM CHURCHMAN BIBLE 3120.5 46.719 0.947 1.828 7.254 3.617 2 0.685
3228 5 C LM CHURCHMAN BIBLE 3121.0 44.563 0.953 2.241 8.013 3.344 2 0.677
3229 5 C LM CHURCHMAN BIBLE 3121.5 49.719 0.964 2.925 8.013 3.190 2 0.669
3230 5 C LM CHURCHMAN BIBLE 3122.0 51.469 0.965 3.083 7.708 3.152 2 0.661
3231 5 C LM CHURCHMAN BIBLE 3122.5 50.031 0.970 2.609 6.668 3.295 2 0.653

3232 rows × 11 columns


In [294]:
# Load data from file
data = pd.read_csv('facies_vectors.csv')

In [302]:
data


Out[302]:
Facies Formation Well Name Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
0 3 A1 SH SHRIMPLIN 2793.0 77.450 0.664 9.900 11.915 4.600 1 1.000
1 3 A1 SH SHRIMPLIN 2793.5 78.260 0.661 14.200 12.565 4.100 1 0.979
2 3 A1 SH SHRIMPLIN 2794.0 79.050 0.658 14.800 13.050 3.600 1 0.957
3 3 A1 SH SHRIMPLIN 2794.5 86.100 0.655 13.900 13.115 3.500 1 0.936
4 3 A1 SH SHRIMPLIN 2795.0 74.580 0.647 13.500 13.300 3.400 1 0.915
... ... ... ... ... ... ... ... ... ... ... ...
4144 5 C LM CHURCHMAN BIBLE 3120.5 46.719 0.947 1.828 7.254 3.617 2 0.685
4145 5 C LM CHURCHMAN BIBLE 3121.0 44.563 0.953 2.241 8.013 3.344 2 0.677
4146 5 C LM CHURCHMAN BIBLE 3121.5 49.719 0.964 2.925 8.013 3.190 2 0.669
4147 5 C LM CHURCHMAN BIBLE 3122.0 51.469 0.965 3.083 7.708 3.152 2 0.661
4148 5 C LM CHURCHMAN BIBLE 3122.5 50.031 0.970 2.609 6.668 3.295 2 0.653

4149 rows × 11 columns

facies_vectors.csv and training_data.csv appear to be the same?


Creating mappings to translate formation text to integers so that all datatypes are the same. This makes later processing easier.

code below gives each formation individual integers starting with A1 LM and going to C SH


In [296]:
mappingFormationIntegers = {'A1 LM':31.0, 'A1 SH':2.0, 'B1 LM':33.0, 'B1 SH':4.0, 'B2 LM':35.0, 'B2 SH':6.0, 'B3 LM':37.0,
       'B3 SH':8.0, 'B4 LM':39.0, 'B4 SH':10.0, 'B5 LM':41.0, 'B5 SH':12.0, 'C LM':43.0, 'C SH':14.0}

This mapping merely separate carbonate dominate formations from siliclastic domiante formations using a 1 or 2 scheme. Looking below at the bar charts for each formamtion SH vs. LM has a strong effect on distribution. My thinking is maybe the direction or type of mistakes might correlate.


In [297]:
mappingFormationLMvSh = {'A1 LM':1, 'A1 SH':2, 'B1 LM':1, 'B1 SH':2, 'B2 LM':1, 'B2 SH':2, 'B3 LM':1,
       'B3 SH':2, 'B4 LM':1, 'B4 SH':2, 'B5 LM':1, 'B5 SH':2, 'C LM':1, 'C SH':2}

replacing formation string with integers 1-14


In [306]:
dataFormIsInt = data
dataFormIsInt.describe()


Out[306]:
Facies Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
count 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 3232.000000 4149.000000 4149.000000
mean 4.503254 2906.867438 64.933985 0.659566 4.402484 13.201066 3.725014 1.518438 0.521852
std 2.474324 133.300164 30.302530 0.252703 5.274947 7.132846 0.896152 0.499720 0.286644
min 1.000000 2573.500000 10.149000 -0.025949 -21.832000 0.550000 0.200000 1.000000 0.000000
25% 2.000000 2821.500000 44.730000 0.498000 1.600000 8.500000 3.100000 1.000000 0.277000
50% 4.000000 2932.500000 64.990000 0.639000 4.300000 12.020000 3.551500 2.000000 0.528000
75% 6.000000 3007.000000 79.438000 0.822000 7.500000 16.050000 4.300000 2.000000 0.769000
max 9.000000 3138.000000 361.150000 1.800000 19.312000 84.400000 8.094000 2.000000 1.000000

In [305]:
data.describe()


Out[305]:
Facies Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
count 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 3232.000000 4149.000000 4149.000000
mean 4.503254 2906.867438 64.933985 0.659566 4.402484 13.201066 3.725014 1.518438 0.521852
std 2.474324 133.300164 30.302530 0.252703 5.274947 7.132846 0.896152 0.499720 0.286644
min 1.000000 2573.500000 10.149000 -0.025949 -21.832000 0.550000 0.200000 1.000000 0.000000
25% 2.000000 2821.500000 44.730000 0.498000 1.600000 8.500000 3.100000 1.000000 0.277000
50% 4.000000 2932.500000 64.990000 0.639000 4.300000 12.020000 3.551500 2.000000 0.528000
75% 6.000000 3007.000000 79.438000 0.822000 7.500000 16.050000 4.300000 2.000000 0.769000
max 9.000000 3138.000000 361.150000 1.800000 19.312000 84.400000 8.094000 2.000000 1.000000

In [317]:
data[feature_names]


Out[317]:
GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
0 77.450 0.664 9.900 11.915 4.600 1 1.000
1 78.260 0.661 14.200 12.565 4.100 1 0.979
2 79.050 0.658 14.800 13.050 3.600 1 0.957
3 86.100 0.655 13.900 13.115 3.500 1 0.936
4 74.580 0.647 13.500 13.300 3.400 1 0.915
... ... ... ... ... ... ... ...
4144 46.719 0.947 1.828 7.254 3.617 2 0.685
4145 44.563 0.953 2.241 8.013 3.344 2 0.677
4146 49.719 0.964 2.925 8.013 3.190 2 0.669
4147 51.469 0.965 3.083 7.708 3.152 2 0.661
4148 50.031 0.970 2.609 6.668 3.295 2 0.653

4149 rows × 7 columns


In [ ]:


In [307]:
dataFormIsInt = dataFormIsInt.replace({'Formation': mappingFormationIntegers})
#converting integers to float64
pd.to_numeric(dataFormIsInt['Formation'])  
dataFormIsInt['Formation'].dtype


Out[307]:
dtype('float64')

In [308]:
data.describe()


Out[308]:
Facies Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
count 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 3232.000000 4149.000000 4149.000000
mean 4.503254 2906.867438 64.933985 0.659566 4.402484 13.201066 3.725014 1.518438 0.521852
std 2.474324 133.300164 30.302530 0.252703 5.274947 7.132846 0.896152 0.499720 0.286644
min 1.000000 2573.500000 10.149000 -0.025949 -21.832000 0.550000 0.200000 1.000000 0.000000
25% 2.000000 2821.500000 44.730000 0.498000 1.600000 8.500000 3.100000 1.000000 0.277000
50% 4.000000 2932.500000 64.990000 0.639000 4.300000 12.020000 3.551500 2.000000 0.528000
75% 6.000000 3007.000000 79.438000 0.822000 7.500000 16.050000 4.300000 2.000000 0.769000
max 9.000000 3138.000000 361.150000 1.800000 19.312000 84.400000 8.094000 2.000000 1.000000

In [309]:
dataFormIsInt.describe()


Out[309]:
Facies Formation Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS
count 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 3232.000000 4149.000000 4149.000000
mean 4.503254 23.074235 2906.867438 64.933985 0.659566 4.402484 13.201066 3.725014 1.518438 0.521852
std 2.474324 15.677093 133.300164 30.302530 0.252703 5.274947 7.132846 0.896152 0.499720 0.286644
min 1.000000 2.000000 2573.500000 10.149000 -0.025949 -21.832000 0.550000 0.200000 1.000000 0.000000
25% 2.000000 8.000000 2821.500000 44.730000 0.498000 1.600000 8.500000 3.100000 1.000000 0.277000
50% 4.000000 31.000000 2932.500000 64.990000 0.639000 4.300000 12.020000 3.551500 2.000000 0.528000
75% 6.000000 39.000000 3007.000000 79.438000 0.822000 7.500000 16.050000 4.300000 2.000000 0.769000
max 9.000000 43.000000 3138.000000 361.150000 1.800000 19.312000 84.400000 8.094000 2.000000 1.000000

In [310]:
del dataFormIsInt['NM_M']
dataFormIsInt.describe()


Out[310]:
Facies Formation Depth GR ILD_log10 DeltaPHI PHIND PE RELPOS
count 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 3232.000000 4149.000000
mean 4.503254 23.074235 2906.867438 64.933985 0.659566 4.402484 13.201066 3.725014 0.521852
std 2.474324 15.677093 133.300164 30.302530 0.252703 5.274947 7.132846 0.896152 0.286644
min 1.000000 2.000000 2573.500000 10.149000 -0.025949 -21.832000 0.550000 0.200000 0.000000
25% 2.000000 8.000000 2821.500000 44.730000 0.498000 1.600000 8.500000 3.100000 0.277000
50% 4.000000 31.000000 2932.500000 64.990000 0.639000 4.300000 12.020000 3.551500 0.528000
75% 6.000000 39.000000 3007.000000 79.438000 0.822000 7.500000 16.050000 4.300000 0.769000
max 9.000000 43.000000 3138.000000 361.150000 1.800000 19.312000 84.400000 8.094000 1.000000

In [342]:
set_option("display.max_rows", 10000)
dataFormIsInt[featureIncForm_names].describe()


Out[342]:
Formation GR ILD_log10 DeltaPHI PHIND PE RELPOS
count 4149.000000 4149.000000 4149.000000 4149.000000 4149.000000 3232.000000 4149.000000
mean 23.074235 64.933985 0.659566 4.402484 13.201066 3.725014 0.521852
std 15.677093 30.302530 0.252703 5.274947 7.132846 0.896152 0.286644
min 2.000000 10.149000 -0.025949 -21.832000 0.550000 0.200000 0.000000
25% 8.000000 44.730000 0.498000 1.600000 8.500000 3.100000 0.277000
50% 31.000000 64.990000 0.639000 4.300000 12.020000 3.551500 0.528000
75% 39.000000 79.438000 0.822000 7.500000 16.050000 4.300000 0.769000
max 43.000000 361.150000 1.800000 19.312000 84.400000 8.094000 1.000000

getting list of formations


In [343]:
formation = dataFormIsInt['Formation'].values
#np.unique(formation)
dataFormIsInt['Formation'].dtypes


Out[343]:
dtype('float64')

In [344]:
dataFormIsInt['Formation']


Out[344]:
0        2.0
1        2.0
2        2.0
3        2.0
4        2.0
5        2.0
6        2.0
7        2.0
8        2.0
9        2.0
10       2.0
11       2.0
12       2.0
13       2.0
14       2.0
15       2.0
16       2.0
17       2.0
18       2.0
19       2.0
20       2.0
21       2.0
22       2.0
23       2.0
24       2.0
25       2.0
26       2.0
27       2.0
28       2.0
29       2.0
30       2.0
31       2.0
32       2.0
33       2.0
34       2.0
35       2.0
36       2.0
37       2.0
38       2.0
39       2.0
40       2.0
41       2.0
42       2.0
43      31.0
44      31.0
45      31.0
46      31.0
47      31.0
48      31.0
49      31.0
50      31.0
51      31.0
52      31.0
53      31.0
54      31.0
55      31.0
56      31.0
57      31.0
58      31.0
59      31.0
60      31.0
61      31.0
62      31.0
63      31.0
64      31.0
65      31.0
66      31.0
67      31.0
68      31.0
69      31.0
70      31.0
71      31.0
72      31.0
73      31.0
74      31.0
75      31.0
76      31.0
77      31.0
78      31.0
79      31.0
80      31.0
81      31.0
82      31.0
83      31.0
84      31.0
85      31.0
86      31.0
87      31.0
88      31.0
89      31.0
90      31.0
91      31.0
92      31.0
93      31.0
94       4.0
95       4.0
96       4.0
97       4.0
98       4.0
99       4.0
100      4.0
101      4.0
102      4.0
103      4.0
104      4.0
105      4.0
106      4.0
107      4.0
108      4.0
109      4.0
110      4.0
111      4.0
112      4.0
113      4.0
114      4.0
115      4.0
116      4.0
117      4.0
118      4.0
119      4.0
120      4.0
121      4.0
122      4.0
123      4.0
124      4.0
125      4.0
126      4.0
127      4.0
128      4.0
129      4.0
130      4.0
131      4.0
132     33.0
133     33.0
134     33.0
135     33.0
136     33.0
137     33.0
138     33.0
139     33.0
140     33.0
141     33.0
142     33.0
143     33.0
144     33.0
145     33.0
146     33.0
147     33.0
148     33.0
149     33.0
150      6.0
151      6.0
152      6.0
153      6.0
154      6.0
155      6.0
156      6.0
157      6.0
158      6.0
159      6.0
160      6.0
161      6.0
162      6.0
163      6.0
164      6.0
165      6.0
166      6.0
167      6.0
168      6.0
169      6.0
170      6.0
171      6.0
172      6.0
173      6.0
174      6.0
175      6.0
176      6.0
177      6.0
178     35.0
179     35.0
180     35.0
181     35.0
182     35.0
183     35.0
184     35.0
185     35.0
186     35.0
187     35.0
188     35.0
189     35.0
190     35.0
191     35.0
192     35.0
193     35.0
194      8.0
195      8.0
196      8.0
197      8.0
198      8.0
199      8.0
200      8.0
201      8.0
202      8.0
203      8.0
204      8.0
205      8.0
206      8.0
207      8.0
208      8.0
209      8.0
210      8.0
211      8.0
212      8.0
213      8.0
214      8.0
215      8.0
216      8.0
217      8.0
218      8.0
219      8.0
220      8.0
221      8.0
222      8.0
223      8.0
224     37.0
225     37.0
226     37.0
227     37.0
228     37.0
229     37.0
230     37.0
231     37.0
232     37.0
233     37.0
234     37.0
235     37.0
236     10.0
237     10.0
238     10.0
239     10.0
240     10.0
241     10.0
242     10.0
243     10.0
244     10.0
245     10.0
246     10.0
247     10.0
248     10.0
249     10.0
250     10.0
251     10.0
252     10.0
253     10.0
254     10.0
255     10.0
256     10.0
257     10.0
258     10.0
259     10.0
260     10.0
261     10.0
262     10.0
263     10.0
264     10.0
265     39.0
266     39.0
267     39.0
268     39.0
269     39.0
270     39.0
271     39.0
272     39.0
273     39.0
274     12.0
275     12.0
276     12.0
277     12.0
278     12.0
279     12.0
280     12.0
281     12.0
282     12.0
283     12.0
284     12.0
285     12.0
286     12.0
287     12.0
288     12.0
289     12.0
290     41.0
291     41.0
292     41.0
293     41.0
294     41.0
295     41.0
296     41.0
297     41.0
298     41.0
299     41.0
300     41.0
301     41.0
302     41.0
303     41.0
304     41.0
305     41.0
306     41.0
307     41.0
308     41.0
309     41.0
310     41.0
311     14.0
312     14.0
313     14.0
314     14.0
315     14.0
316     14.0
317     14.0
318     14.0
319     14.0
320     14.0
321     14.0
322     14.0
323     14.0
324     14.0
325     14.0
326     14.0
327     14.0
328     14.0
329     14.0
330     14.0
331     14.0
332     14.0
333     14.0
334     14.0
335     14.0
336     14.0
337     14.0
338     14.0
339     14.0
340     14.0
341     14.0
342     14.0
343     14.0
344     14.0
345     14.0
346     14.0
347     14.0
348     14.0
349     14.0
350     14.0
351     14.0
352     14.0
353     14.0
354     14.0
355     14.0
356     14.0
357     14.0
358     14.0
359     14.0
360     14.0
361     14.0
362     14.0
363     14.0
364     14.0
365     14.0
366     14.0
367     14.0
368     43.0
369     43.0
370     43.0
371     43.0
372     43.0
373     43.0
374     43.0
375     43.0
376     43.0
377     43.0
378     43.0
379     43.0
380     43.0
381     43.0
382     43.0
383     43.0
384     43.0
385     43.0
386     43.0
387     43.0
388     43.0
389     43.0
390     43.0
391     43.0
392     43.0
393     43.0
394     43.0
395     43.0
396     43.0
397     43.0
398     43.0
399     43.0
400     43.0
401     43.0
402     43.0
403     43.0
404     43.0
405     43.0
406     43.0
407     43.0
408     43.0
409     43.0
410     43.0
411     43.0
412     43.0
413     43.0
414     43.0
415     43.0
416     43.0
417     43.0
418     43.0
419     43.0
420     43.0
421     43.0
422     43.0
423     43.0
424     43.0
425     43.0
426     43.0
427     43.0
428     43.0
429     43.0
430     43.0
431     43.0
432     43.0
433     43.0
434     43.0
435     43.0
436     43.0
437     43.0
438     43.0
439     43.0
440     43.0
441     43.0
442     43.0
443     43.0
444     43.0
445     43.0
446     43.0
447     43.0
448     43.0
449     43.0
450     43.0
451     43.0
452     43.0
453     43.0
454     43.0
455     43.0
456     43.0
457     43.0
458     43.0
459     43.0
460     43.0
461     43.0
462     43.0
463     43.0
464     43.0
465     43.0
466     43.0
467     43.0
468     43.0
469     43.0
470     43.0
471      2.0
472      2.0
473      2.0
474      2.0
475      2.0
476      2.0
477      2.0
478      2.0
479      2.0
480      2.0
481      2.0
482      2.0
483      2.0
484      2.0
485      2.0
486      2.0
487      2.0
488      2.0
489      2.0
490      2.0
491      2.0
492      2.0
493      2.0
494      2.0
495      2.0
496      2.0
497      2.0
498      2.0
499      2.0
500      2.0
501      2.0
502      2.0
503      2.0
504      2.0
505      2.0
506      2.0
507      2.0
508      2.0
509     31.0
510     31.0
511     31.0
512     31.0
513     31.0
514     31.0
515     31.0
516     31.0
517     31.0
518     31.0
519     31.0
520     31.0
521     31.0
522     31.0
523     31.0
524     31.0
525     31.0
526     31.0
527     31.0
528     31.0
529     31.0
530     31.0
531     31.0
532     31.0
533     31.0
534     31.0
535     31.0
536     31.0
537     31.0
538     31.0
539     31.0
540     31.0
541     31.0
542     31.0
543     31.0
544     31.0
545     31.0
546     31.0
547     31.0
548     31.0
549     31.0
550     31.0
551     31.0
552     31.0
553     31.0
554     31.0
555     31.0
556     31.0
557     31.0
558     31.0
559     31.0
560     31.0
561     31.0
562     31.0
563     31.0
564     31.0
565     31.0
566     31.0
567     31.0
568     31.0
569     31.0
570     31.0
571     31.0
572     31.0
573      4.0
574      4.0
575      4.0
576      4.0
577      4.0
578      4.0
579      4.0
580      4.0
581      4.0
582      4.0
583      4.0
584      4.0
585      4.0
586      4.0
587      4.0
588      4.0
589      4.0
590      4.0
591      4.0
592      4.0
593      4.0
594      4.0
595      4.0
596      4.0
597      4.0
598      4.0
599      4.0
600      4.0
601      4.0
602      4.0
603      4.0
604      4.0
605      4.0
606     33.0
607     33.0
608     33.0
609     33.0
610     33.0
611     33.0
612     33.0
613     33.0
614     33.0
615     33.0
616     33.0
617     33.0
618     33.0
619     33.0
620     33.0
621     33.0
622     33.0
623     33.0
624     33.0
625     33.0
626     33.0
627      6.0
628      6.0
629      6.0
630      6.0
631      6.0
632      6.0
633      6.0
634      6.0
635      6.0
636      6.0
637      6.0
638      6.0
639      6.0
640      6.0
641      6.0
642      6.0
643      6.0
644      6.0
645      6.0
646      6.0
647      6.0
648      6.0
649      6.0
650     35.0
651     35.0
652     35.0
653     35.0
654     35.0
655     35.0
656     35.0
657     35.0
658     35.0
659     35.0
660     35.0
661     35.0
662     35.0
663     35.0
664     35.0
665     35.0
666     35.0
667     35.0
668     35.0
669     35.0
670      8.0
671      8.0
672      8.0
673      8.0
674      8.0
675      8.0
676      8.0
677      8.0
678      8.0
679      8.0
680      8.0
681      8.0
682      8.0
683      8.0
684      8.0
685      8.0
686      8.0
687      8.0
688      8.0
689      8.0
690      8.0
691      8.0
692      8.0
693      8.0
694      8.0
695      8.0
696      8.0
697      8.0
698     37.0
699     37.0
700     37.0
701     10.0
702     10.0
703     10.0
704     10.0
705     10.0
706     10.0
707     10.0
708     10.0
709     10.0
710     10.0
711     10.0
712     10.0
713     10.0
714     10.0
715     10.0
716     10.0
717     10.0
718     10.0
719     10.0
720     10.0
721     10.0
722     10.0
723     10.0
724     10.0
725     39.0
726     39.0
727     39.0
728     39.0
729     39.0
730     39.0
731     12.0
732     12.0
733     12.0
734     12.0
735     12.0
736     12.0
737     12.0
738     12.0
739     41.0
740     41.0
741     41.0
742     41.0
743     41.0
744     41.0
745     41.0
746     41.0
747     41.0
748     41.0
749     41.0
750     41.0
751     41.0
752     41.0
753     41.0
754     41.0
755     41.0
756     41.0
757     41.0
758     41.0
759     41.0
760     41.0
761     41.0
762     41.0
763     41.0
764     41.0
765     41.0
766     41.0
767     41.0
768     41.0
769     14.0
770     14.0
771     14.0
772     14.0
773     14.0
774     14.0
775     14.0
776     14.0
777     14.0
778     14.0
779     14.0
780     14.0
781     14.0
782     14.0
783     14.0
784     14.0
785     14.0
786     14.0
787     14.0
788     14.0
789     14.0
790     14.0
791     14.0
792     14.0
793     14.0
794     14.0
795     14.0
796     14.0
797     14.0
798     14.0
799     14.0
800     14.0
801     14.0
802     14.0
803     14.0
804     14.0
805     14.0
806     14.0
807     14.0
808     14.0
809     14.0
810     14.0
811     14.0
812     14.0
813     14.0
814     14.0
815     14.0
816     14.0
817     14.0
818     14.0
819     14.0
820     14.0
821     43.0
822     43.0
823     43.0
824     43.0
825     43.0
826     43.0
827     43.0
828     43.0
829     43.0
830     43.0
831     43.0
832     43.0
833     43.0
834     43.0
835     43.0
836     43.0
837     43.0
838     43.0
839     43.0
840     43.0
841     43.0
842     43.0
843     43.0
844     43.0
845     43.0
846     43.0
847     43.0
848     43.0
849     43.0
850     43.0
851     43.0
852     43.0
853     43.0
854     43.0
855     43.0
856     43.0
857     43.0
858     43.0
859     43.0
860     43.0
861     43.0
862     43.0
863     43.0
864     43.0
865     43.0
866     43.0
867     43.0
868     43.0
869     43.0
870     43.0
871     43.0
872     43.0
873     43.0
874     43.0
875     43.0
876     43.0
877     43.0
878     43.0
879     43.0
880     43.0
881     43.0
882     43.0
883     43.0
884     43.0
885     43.0
886     43.0
887     43.0
888     43.0
889     43.0
890     43.0
891     43.0
892     43.0
893     43.0
894     43.0
895     43.0
896     43.0
897     43.0
898     43.0
899     43.0
900     43.0
901     43.0
902     43.0
903     43.0
904     43.0
905     43.0
906     43.0
907     43.0
908     43.0
909     43.0
910     43.0
911     43.0
912     43.0
913     43.0
914     43.0
915     43.0
916     43.0
917     43.0
918     43.0
919     43.0
920     43.0
921     43.0
922     43.0
923     43.0
924     43.0
925     43.0
926     43.0
927     43.0
928     43.0
929     43.0
930     43.0
931     43.0
932     43.0
933     43.0
934     43.0
935     43.0
936     43.0
937      2.0
938      2.0
939      2.0
940      2.0
941      2.0
942      2.0
943      2.0
944      2.0
945      2.0
946      2.0
947      2.0
948      2.0
949      2.0
950      2.0
951      2.0
952      2.0
953      2.0
954      2.0
955      2.0
956      2.0
957      2.0
958      2.0
959      2.0
960      2.0
961      2.0
962      2.0
963      2.0
964      2.0
965      2.0
966      2.0
967      2.0
968      2.0
969      2.0
970      2.0
971      2.0
972      2.0
973      2.0
974      2.0
975      2.0
976      2.0
977      2.0
978      2.0
979      2.0
980      2.0
981      2.0
982      2.0
983      2.0
984      2.0
985      2.0
986      2.0
987      2.0
988      2.0
989      2.0
990      2.0
991      2.0
992      2.0
993      2.0
994      2.0
995      2.0
996      2.0
997      2.0
998      2.0
999      2.0
1000    31.0
1001    31.0
1002    31.0
1003    31.0
1004    31.0
1005    31.0
1006    31.0
1007    31.0
1008    31.0
1009    31.0
1010    31.0
1011    31.0
1012    31.0
1013    31.0
1014    31.0
1015    31.0
1016    31.0
1017    31.0
1018    31.0
1019    31.0
1020    31.0
1021    31.0
1022    31.0
1023    31.0
1024    31.0
1025    31.0
1026    31.0
1027    31.0
1028    31.0
1029    31.0
1030    31.0
1031    31.0
1032    31.0
1033    31.0
1034    31.0
1035    31.0
1036    31.0
1037    31.0
1038    31.0
1039    31.0
1040     4.0
1041     4.0
1042     4.0
1043     4.0
1044     4.0
1045     4.0
1046     4.0
1047     4.0
1048     4.0
1049     4.0
1050     4.0
1051     4.0
1052     4.0
1053     4.0
1054     4.0
1055     4.0
1056     4.0
1057     4.0
1058     4.0
1059     4.0
1060     4.0
1061     4.0
1062     4.0
1063     4.0
1064     4.0
1065     4.0
1066     4.0
1067     4.0
1068     4.0
1069     4.0
1070     4.0
1071     4.0
1072     4.0
1073     4.0
1074     4.0
1075     4.0
1076     4.0
1077     4.0
1078     4.0
1079     4.0
1080     4.0
1081     4.0
1082     4.0
1083     4.0
1084     4.0
1085     4.0
1086     4.0
1087     4.0
1088     4.0
1089    33.0
1090    33.0
1091    33.0
1092    33.0
1093    33.0
1094    33.0
1095    33.0
1096    33.0
1097    33.0
1098    33.0
1099     6.0
1100     6.0
1101     6.0
1102     6.0
1103     6.0
1104     6.0
1105     6.0
1106     6.0
1107     6.0
1108     6.0
1109     6.0
1110     6.0
1111     6.0
1112     6.0
1113     6.0
1114     6.0
1115     6.0
1116     6.0
1117     6.0
1118     6.0
1119     6.0
1120     6.0
1121     6.0
1122     6.0
1123     6.0
1124     6.0
1125     6.0
1126     6.0
1127     6.0
1128     6.0
1129    35.0
1130    35.0
1131    35.0
1132    35.0
1133    35.0
1134    35.0
1135    35.0
1136     8.0
1137     8.0
1138     8.0
1139     8.0
1140     8.0
1141     8.0
1142     8.0
1143     8.0
1144     8.0
1145     8.0
1146     8.0
1147     8.0
1148     8.0
1149     8.0
1150     8.0
1151     8.0
1152     8.0
1153     8.0
1154     8.0
1155     8.0
1156     8.0
1157     8.0
1158     8.0
1159     8.0
1160     8.0
1161     8.0
1162     8.0
1163     8.0
1164     8.0
1165     8.0
1166     8.0
1167     8.0
1168     8.0
1169     8.0
1170     8.0
1171     8.0
1172     8.0
1173     8.0
1174     8.0
1175     8.0
1176     8.0
1177     8.0
1178     8.0
1179    10.0
1180    10.0
1181    10.0
1182    10.0
1183    10.0
1184    10.0
1185    10.0
1186    10.0
1187    10.0
1188    10.0
1189    10.0
1190    10.0
1191    10.0
1192    10.0
1193    10.0
1194    10.0
1195    10.0
1196    10.0
1197    10.0
1198    10.0
1199    10.0
1200    10.0
1201    10.0
1202    10.0
1203    10.0
1204    10.0
1205    10.0
1206    10.0
1207    10.0
1208    10.0
1209    10.0
1210    10.0
1211    10.0
1212    10.0
1213    10.0
1214    10.0
1215    10.0
1216    10.0
1217    10.0
1218    10.0
1219    10.0
1220    10.0
1221    10.0
1222    10.0
1223    10.0
1224    10.0
1225    10.0
1226    10.0
1227    39.0
1228    39.0
1229    39.0
1230    39.0
1231    39.0
1232    39.0
1233    12.0
1234    12.0
1235    12.0
1236    12.0
1237    12.0
1238    12.0
1239    12.0
1240    12.0
1241    12.0
1242    12.0
1243    41.0
1244    41.0
1245    41.0
1246    41.0
1247    41.0
1248    41.0
1249    41.0
1250    41.0
1251    41.0
1252    41.0
1253    41.0
1254    41.0
1255    41.0
1256    41.0
1257    41.0
1258    41.0
1259    41.0
1260    41.0
1261    41.0
1262    41.0
1263    41.0
1264    41.0
1265    41.0
1266    14.0
1267    14.0
1268    14.0
1269    14.0
1270    14.0
1271    14.0
1272    14.0
1273    14.0
1274    14.0
1275    14.0
1276    14.0
1277    14.0
1278    14.0
1279    14.0
1280    14.0
1281    14.0
1282    14.0
1283    14.0
1284    14.0
1285    14.0
1286    14.0
1287    14.0
1288    14.0
1289    14.0
1290    14.0
1291    14.0
1292    14.0
1293    14.0
1294    14.0
1295    14.0
1296    14.0
1297    14.0
1298    14.0
1299    14.0
1300    14.0
1301    14.0
1302    14.0
1303    14.0
1304    14.0
1305    14.0
1306    14.0
1307    14.0
1308    14.0
1309    14.0
1310    14.0
1311    14.0
1312    14.0
1313    14.0
1314    14.0
1315    14.0
1316    14.0
1317    14.0
1318    43.0
1319    43.0
1320    43.0
1321    43.0
1322    43.0
1323    43.0
1324    43.0
1325    43.0
1326    43.0
1327    43.0
1328    43.0
1329    43.0
1330    43.0
1331    43.0
1332    43.0
1333    43.0
1334    43.0
1335    43.0
1336    43.0
1337    43.0
1338    43.0
1339    43.0
1340    43.0
1341    43.0
1342    43.0
1343    43.0
1344    43.0
1345    43.0
1346    43.0
1347    43.0
1348    43.0
1349    43.0
1350    43.0
1351    43.0
1352    43.0
1353    43.0
1354    43.0
1355    43.0
1356    43.0
1357    43.0
1358    43.0
1359    43.0
1360    43.0
1361    43.0
1362    43.0
1363    43.0
1364    43.0
1365    43.0
1366    43.0
1367    43.0
1368    43.0
1369    43.0
1370    43.0
1371    43.0
1372    43.0
1373    43.0
1374    43.0
1375    43.0
1376    43.0
1377    43.0
1378    43.0
1379    43.0
1380    43.0
1381    43.0
1382    43.0
1383    43.0
1384    43.0
1385    43.0
1386     2.0
1387     2.0
1388     2.0
1389     2.0
1390     2.0
1391     2.0
1392     2.0
1393     2.0
1394     2.0
1395     2.0
1396     2.0
1397     2.0
1398     2.0
1399     2.0
1400     2.0
1401     2.0
1402     2.0
1403     2.0
1404     2.0
1405     2.0
1406     2.0
1407     2.0
1408     2.0
1409     2.0
1410     2.0
1411     2.0
1412     2.0
1413     2.0
1414     2.0
1415     2.0
1416     2.0
1417     2.0
1418     2.0
1419     2.0
1420     2.0
1421     2.0
1422     2.0
1423     2.0
1424     2.0
1425     2.0
1426     2.0
1427     2.0
1428     2.0
1429     2.0
1430     2.0
1431     2.0
1432     2.0
1433     2.0
1434     2.0
1435     2.0
1436     2.0
1437     2.0
1438     2.0
1439    31.0
1440    31.0
1441    31.0
1442    31.0
1443    31.0
1444    31.0
1445    31.0
1446    31.0
1447    31.0
1448    31.0
1449    31.0
1450    31.0
1451    31.0
1452    31.0
1453    31.0
1454    31.0
1455    31.0
1456    31.0
1457    31.0
1458    31.0
1459    31.0
1460    31.0
1461    31.0
1462    31.0
1463    31.0
1464    31.0
1465    31.0
1466    31.0
1467    31.0
1468    31.0
1469    31.0
1470    31.0
1471    31.0
1472    31.0
1473    31.0
1474    31.0
1475    31.0
1476    31.0
1477    31.0
1478    31.0
1479    31.0
1480    31.0
1481    31.0
1482    31.0
1483    31.0
1484    31.0
1485    31.0
1486    31.0
1487    31.0
1488    31.0
1489    31.0
1490    31.0
1491    31.0
1492    31.0
1493    31.0
1494    31.0
1495    31.0
1496    31.0
1497    31.0
1498    31.0
1499    31.0
1500    31.0
1501    31.0
1502    31.0
1503    31.0
1504    31.0
1505    31.0
1506     4.0
1507     4.0
1508     4.0
1509     4.0
1510     4.0
1511     4.0
1512     4.0
1513     4.0
1514     4.0
1515     4.0
1516     4.0
1517     4.0
1518     4.0
1519     4.0
1520     4.0
1521     4.0
1522     4.0
1523     4.0
1524     4.0
1525     4.0
1526     4.0
1527     4.0
1528     4.0
1529     4.0
1530     4.0
1531     4.0
1532     4.0
1533     4.0
1534     4.0
1535     4.0
1536     4.0
1537     4.0
1538     4.0
1539     4.0
1540     4.0
1541     4.0
1542     4.0
1543     4.0
1544     4.0
1545    33.0
1546    33.0
1547    33.0
1548    33.0
1549    33.0
1550    33.0
1551    33.0
1552    33.0
1553    33.0
1554    33.0
1555    33.0
1556    33.0
1557    33.0
1558    33.0
1559    33.0
1560    33.0
1561    33.0
1562    33.0
1563    33.0
1564    33.0
1565    33.0
1566    33.0
1567    33.0
1568     6.0
1569     6.0
1570     6.0
1571     6.0
1572     6.0
1573     6.0
1574     6.0
1575     6.0
1576     6.0
1577     6.0
1578     6.0
1579     6.0
1580     6.0
1581     6.0
1582     6.0
1583     6.0
1584     6.0
1585     6.0
1586     6.0
1587     6.0
1588     6.0
1589    35.0
1590    35.0
1591    35.0
1592    35.0
1593    35.0
1594    35.0
1595    35.0
1596    35.0
1597    35.0
1598    35.0
1599    35.0
1600    35.0
1601    35.0
1602    35.0
1603    35.0
1604    35.0
1605    35.0
1606    35.0
1607    35.0
1608    35.0
1609     8.0
1610     8.0
1611     8.0
1612     8.0
1613     8.0
1614     8.0
1615     8.0
1616     8.0
1617     8.0
1618     8.0
1619     8.0
1620     8.0
1621     8.0
1622     8.0
1623     8.0
1624     8.0
1625     8.0
1626     8.0
1627     8.0
1628     8.0
1629     8.0
1630     8.0
1631     8.0
1632     8.0
1633     8.0
1634     8.0
1635     8.0
1636     8.0
1637     8.0
1638     8.0
1639     8.0
1640     8.0
1641    37.0
1642    37.0
1643    37.0
1644    37.0
1645    37.0
1646    37.0
1647    37.0
1648    37.0
1649    37.0
1650    10.0
1651    10.0
1652    10.0
1653    10.0
1654    10.0
1655    10.0
1656    10.0
1657    10.0
1658    10.0
1659    10.0
1660    10.0
1661    10.0
1662    10.0
1663    10.0
1664    10.0
1665    10.0
1666    10.0
1667    10.0
1668    10.0
1669    10.0
1670    10.0
1671    10.0
1672    10.0
1673    10.0
1674    39.0
1675    39.0
1676    39.0
1677    39.0
1678    39.0
1679    39.0
1680    12.0
1681    12.0
1682    12.0
1683    12.0
1684    12.0
1685    12.0
1686    12.0
1687    12.0
1688    12.0
1689    12.0
1690    12.0
1691    12.0
1692    41.0
1693    41.0
1694    41.0
1695    41.0
1696    41.0
1697    41.0
1698    41.0
1699    41.0
1700    41.0
1701    41.0
1702    41.0
1703    41.0
1704    41.0
1705    41.0
1706    41.0
1707    41.0
1708    41.0
1709    41.0
1710    41.0
1711    41.0
1712    41.0
1713    41.0
1714    41.0
1715    41.0
1716    41.0
1717    41.0
1718    41.0
1719    41.0
1720    41.0
1721    41.0
1722    41.0
1723    41.0
1724    41.0
1725    41.0
1726    41.0
1727    41.0
1728    14.0
1729    14.0
1730    14.0
1731    14.0
1732    14.0
1733    14.0
1734    14.0
1735    14.0
1736    14.0
1737    14.0
1738    14.0
1739    14.0
1740    14.0
1741    14.0
1742    14.0
1743    14.0
1744    14.0
1745    14.0
1746    14.0
1747    14.0
1748    14.0
1749    14.0
1750    14.0
1751    14.0
1752    14.0
1753    14.0
1754    14.0
1755    14.0
1756    14.0
1757    14.0
1758    14.0
1759    14.0
1760    14.0
1761    14.0
1762    14.0
1763    14.0
1764    14.0
1765    14.0
1766    14.0
1767    14.0
1768    14.0
1769    14.0
1770    14.0
1771    14.0
1772    14.0
1773    14.0
1774    14.0
1775    14.0
1776    14.0
1777    14.0
1778    14.0
1779    14.0
1780    14.0
1781    14.0
1782    14.0
1783    43.0
1784    43.0
1785    43.0
1786    43.0
1787    43.0
1788    43.0
1789    43.0
1790    43.0
1791    43.0
1792    43.0
1793    43.0
1794    43.0
1795    43.0
1796    43.0
1797    43.0
1798    43.0
1799    43.0
1800    43.0
1801    43.0
1802    43.0
1803    43.0
1804    43.0
1805    43.0
1806    43.0
1807    43.0
1808    43.0
1809    43.0
1810    43.0
1811    43.0
1812    43.0
1813    43.0
1814    43.0
1815    43.0
1816    43.0
1817    43.0
1818    43.0
1819    43.0
1820    43.0
1821    43.0
1822    43.0
1823    43.0
1824    43.0
1825    43.0
1826    43.0
1827    43.0
1828    43.0
1829    43.0
1830    43.0
1831    43.0
1832    43.0
1833    43.0
1834    43.0
1835    43.0
1836    43.0
1837    43.0
1838    43.0
1839    43.0
1840    43.0
1841    43.0
1842    43.0
1843    43.0
1844    43.0
1845    43.0
1846    43.0
1847     2.0
1848     2.0
1849     2.0
1850     2.0
1851     2.0
1852     2.0
1853     2.0
1854     2.0
1855     2.0
1856     2.0
1857     2.0
1858     2.0
1859     2.0
1860     2.0
1861     2.0
1862     2.0
1863     2.0
1864     2.0
1865     2.0
1866     2.0
1867     2.0
1868     2.0
1869     2.0
1870     2.0
1871     2.0
1872     2.0
1873     2.0
1874     2.0
1875     2.0
1876     2.0
1877     2.0
1878     2.0
1879     2.0
1880     2.0
1881     2.0
1882     2.0
1883     2.0
1884     2.0
1885     2.0
1886     2.0
1887     2.0
1888     2.0
1889     2.0
1890     2.0
1891     2.0
1892    31.0
1893    31.0
1894    31.0
1895    31.0
1896    31.0
1897    31.0
1898    31.0
1899    31.0
1900    31.0
1901    31.0
1902    31.0
1903    31.0
1904    31.0
1905    31.0
1906    31.0
1907    31.0
1908    31.0
1909    31.0
1910    31.0
1911    31.0
1912    31.0
1913    31.0
1914    31.0
1915    31.0
1916    31.0
1917    31.0
1918    31.0
1919    31.0
1920    31.0
1921    31.0
1922    31.0
1923    31.0
1924    31.0
1925    31.0
1926    31.0
1927    31.0
1928    31.0
1929    31.0
1930    31.0
1931    31.0
1932    31.0
1933    31.0
1934    31.0
1935    31.0
1936    31.0
1937    31.0
1938    31.0
1939    31.0
1940    31.0
1941    31.0
1942    31.0
1943    31.0
1944    31.0
1945    31.0
1946    31.0
1947    31.0
1948    31.0
1949    31.0
1950    31.0
1951    31.0
1952    31.0
1953    31.0
1954    31.0
1955    31.0
1956    31.0
1957    31.0
1958    31.0
1959    31.0
1960    31.0
1961    31.0
1962    31.0
1963    31.0
1964    31.0
1965    31.0
1966    31.0
1967    31.0
1968    31.0
1969    31.0
1970    31.0
1971    31.0
1972    31.0
1973    31.0
1974    31.0
1975    31.0
1976     4.0
1977     4.0
1978     4.0
1979     4.0
1980     4.0
1981     4.0
1982     4.0
1983     4.0
1984     4.0
1985     4.0
1986     4.0
1987     4.0
1988     4.0
1989     4.0
1990     4.0
1991     4.0
1992     4.0
1993     4.0
1994     4.0
1995     4.0
1996     4.0
1997     4.0
1998     4.0
1999     4.0
2000     4.0
2001    33.0
2002    33.0
2003    33.0
2004    33.0
2005    33.0
2006    33.0
2007    33.0
2008    33.0
2009    33.0
2010    33.0
2011    33.0
2012    33.0
2013    33.0
2014    33.0
2015    33.0
2016    33.0
2017    33.0
2018    33.0
2019    33.0
2020    33.0
2021    33.0
2022    33.0
2023    33.0
2024    33.0
2025    33.0
2026    33.0
2027    33.0
2028    33.0
2029    33.0
2030    33.0
2031    33.0
2032    33.0
2033    33.0
2034    33.0
2035    33.0
2036    33.0
2037    33.0
2038     6.0
2039     6.0
2040     6.0
2041     6.0
2042     6.0
2043     6.0
2044     6.0
2045     6.0
2046     6.0
2047     6.0
2048     6.0
2049     6.0
2050     6.0
2051     6.0
2052    35.0
2053    35.0
2054    35.0
2055    35.0
2056    35.0
2057    35.0
2058    35.0
2059    35.0
2060    35.0
2061    35.0
2062    35.0
2063    35.0
2064    35.0
2065    35.0
2066    35.0
2067    35.0
2068    35.0
2069    35.0
2070    35.0
2071    35.0
2072    35.0
2073    35.0
2074    35.0
2075     8.0
2076     8.0
2077     8.0
2078     8.0
2079     8.0
2080     8.0
2081     8.0
2082     8.0
2083     8.0
2084     8.0
2085     8.0
2086     8.0
2087     8.0
2088     8.0
2089     8.0
2090     8.0
2091     8.0
2092     8.0
2093     8.0
2094     8.0
2095     8.0
2096     8.0
2097    37.0
2098    37.0
2099    37.0
2100    37.0
2101    37.0
2102    37.0
2103    37.0
2104    37.0
2105    37.0
2106    37.0
2107    37.0
2108    37.0
2109    37.0
2110    37.0
2111    10.0
2112    10.0
2113    10.0
2114    10.0
2115    10.0
2116    10.0
2117    10.0
2118    10.0
2119    10.0
2120    10.0
2121    10.0
2122    10.0
2123    10.0
2124    10.0
2125    10.0
2126    10.0
2127    10.0
2128    10.0
2129    39.0
2130    39.0
2131    39.0
2132    39.0
2133    39.0
2134    39.0
2135    39.0
2136    39.0
2137    39.0
2138    39.0
2139    39.0
2140    12.0
2141    12.0
2142    12.0
2143    12.0
2144    12.0
2145    12.0
2146    12.0
2147    12.0
2148    12.0
2149    41.0
2150    41.0
2151    41.0
2152    41.0
2153    41.0
2154    41.0
2155    41.0
2156    41.0
2157    41.0
2158    41.0
2159    41.0
2160    41.0
2161    41.0
2162    41.0
2163    41.0
2164    41.0
2165    41.0
2166    41.0
2167    41.0
2168    41.0
2169    41.0
2170    41.0
2171    41.0
2172    41.0
2173    41.0
2174    41.0
2175    41.0
2176    41.0
2177    41.0
2178    41.0
2179    41.0
2180    41.0
2181    41.0
2182    41.0
2183    41.0
2184    41.0
2185    41.0
2186    41.0
2187    41.0
2188    41.0
2189    41.0
2190    41.0
2191    41.0
2192    41.0
2193    41.0
2194    41.0
2195    14.0
2196    14.0
2197    14.0
2198    14.0
2199    14.0
2200    14.0
2201    14.0
2202    14.0
2203    14.0
2204    14.0
2205    14.0
2206    14.0
2207    14.0
2208    14.0
2209    14.0
2210    14.0
2211    14.0
2212    14.0
2213    14.0
2214    14.0
2215    14.0
2216    14.0
2217    14.0
2218    14.0
2219    14.0
2220    14.0
2221    14.0
2222    14.0
2223    14.0
2224    14.0
2225    14.0
2226    14.0
2227    14.0
2228    14.0
2229    14.0
2230    43.0
2231    43.0
2232    43.0
2233    43.0
2234    43.0
2235    43.0
2236    43.0
2237    43.0
2238    43.0
2239    43.0
2240    43.0
2241    43.0
2242    43.0
2243    43.0
2244    43.0
2245    43.0
2246    43.0
2247    43.0
2248    43.0
2249    43.0
2250    43.0
2251    43.0
2252    43.0
2253    43.0
2254    43.0
2255    43.0
2256    43.0
2257    43.0
2258    43.0
2259    43.0
2260    43.0
2261    43.0
2262    43.0
2263    43.0
2264    43.0
2265    43.0
2266    43.0
2267    43.0
2268    43.0
2269    43.0
2270    43.0
2271    43.0
2272    43.0
2273    43.0
2274    43.0
2275    43.0
2276    43.0
2277    43.0
2278    43.0
2279    43.0
2280    43.0
2281    43.0
2282    43.0
2283    43.0
2284    43.0
2285    43.0
2286     2.0
2287     2.0
2288     2.0
2289     2.0
2290     2.0
2291     2.0
2292     2.0
2293     2.0
2294     2.0
2295     2.0
2296     2.0
2297     2.0
2298     2.0
2299     2.0
2300     2.0
2301     2.0
2302     2.0
2303     2.0
2304     2.0
2305     2.0
2306     2.0
2307     2.0
2308     2.0
2309     2.0
2310     2.0
2311     2.0
2312     2.0
2313     2.0
2314     2.0
2315     2.0
2316     2.0
2317     2.0
2318     2.0
2319     2.0
2320     2.0
2321     2.0
2322     2.0
2323     2.0
2324     2.0
2325     2.0
2326     2.0
2327     2.0
2328     2.0
2329     2.0
2330     2.0
2331     2.0
2332     2.0
2333     2.0
2334     2.0
2335     2.0
2336     2.0
2337     2.0
2338     2.0
2339     2.0
2340     2.0
2341     2.0
2342     2.0
2343     2.0
2344     2.0
2345     2.0
2346     2.0
2347     2.0
2348     2.0
2349     2.0
2350     2.0
2351     2.0
2352     2.0
2353     2.0
2354     2.0
2355     2.0
2356     2.0
2357    31.0
2358    31.0
2359    31.0
2360    31.0
2361    31.0
2362    31.0
2363    31.0
2364    31.0
2365    31.0
2366    31.0
2367    31.0
2368    31.0
2369    31.0
2370    31.0
2371    31.0
2372    31.0
2373    31.0
2374    31.0
2375    31.0
2376    31.0
2377    31.0
2378    31.0
2379    31.0
2380    31.0
2381    31.0
2382    31.0
2383    31.0
2384    31.0
2385    31.0
2386    31.0
2387    31.0
2388    31.0
2389    31.0
2390    31.0
2391    31.0
2392    31.0
2393    31.0
2394    31.0
2395    31.0
2396    31.0
2397    31.0
2398    31.0
2399    31.0
2400    31.0
2401     4.0
2402     4.0
2403     4.0
2404     4.0
2405     4.0
2406     4.0
2407     4.0
2408     4.0
2409     4.0
2410     4.0
2411     4.0
2412     4.0
2413     4.0
2414     4.0
2415     4.0
2416     4.0
2417     4.0
2418     4.0
2419     4.0
2420     4.0
2421     4.0
2422     4.0
2423     4.0
2424     4.0
2425     4.0
2426     4.0
2427     4.0
2428     4.0
2429     4.0
2430     4.0
2431     4.0
2432     4.0
2433     4.0
2434     4.0
2435     4.0
2436     4.0
2437     4.0
2438     4.0
2439     4.0
2440     4.0
2441     4.0
2442     4.0
2443     4.0
2444     4.0
2445     4.0
2446     4.0
2447     4.0
2448     4.0
2449     4.0
2450     4.0
2451     4.0
2452     4.0
2453     4.0
2454     4.0
2455     4.0
2456     4.0
2457     4.0
2458     4.0
2459     4.0
2460     4.0
2461     4.0
2462     4.0
2463     4.0
2464     4.0
2465     4.0
2466    33.0
2467    33.0
2468    33.0
2469    33.0
2470    33.0
2471    33.0
2472    33.0
2473    33.0
2474    33.0
2475    33.0
2476    33.0
2477     6.0
2478     6.0
2479     6.0
2480     6.0
2481     6.0
2482     6.0
2483     6.0
2484     6.0
2485     6.0
2486     6.0
2487     6.0
2488     6.0
2489     6.0
2490     6.0
2491     6.0
2492     6.0
2493     6.0
2494     6.0
2495     6.0
2496     6.0
2497     6.0
2498     6.0
2499     6.0
2500     6.0
2501     6.0
2502     6.0
2503     6.0
2504     6.0
2505     6.0
2506     6.0
2507     6.0
2508     6.0
2509     6.0
2510     6.0
2511     6.0
2512     6.0
2513     6.0
2514     6.0
2515     6.0
2516     6.0
2517     6.0
2518     6.0
2519     6.0
2520     6.0
2521     6.0
2522     6.0
2523     6.0
2524     6.0
2525     6.0
2526     8.0
2527     8.0
2528     8.0
2529     8.0
2530     8.0
2531     8.0
2532     8.0
2533     8.0
2534     8.0
2535     8.0
2536     8.0
2537     8.0
2538     8.0
2539     8.0
2540     8.0
2541     8.0
2542     8.0
2543     8.0
2544     8.0
2545     8.0
2546     8.0
2547     8.0
2548     8.0
2549     8.0
2550     8.0
2551     8.0
2552     8.0
2553     8.0
2554     8.0
2555     8.0
2556     8.0
2557     8.0
2558     8.0
2559     8.0
2560     8.0
2561     8.0
2562     8.0
2563     8.0
2564     8.0
2565     8.0
2566     8.0
2567     8.0
2568     8.0
2569     8.0
2570     8.0
2571     8.0
2572     8.0
2573     8.0
2574     8.0
2575     8.0
2576    10.0
2577    10.0
2578    10.0
2579    10.0
2580    10.0
2581    10.0
2582    10.0
2583    10.0
2584    10.0
2585    10.0
2586    10.0
2587    10.0
2588    10.0
2589    10.0
2590    10.0
2591    10.0
2592    10.0
2593    10.0
2594    10.0
2595    10.0
2596    10.0
2597    10.0
2598    10.0
2599    10.0
2600    10.0
2601    10.0
2602    10.0
2603    10.0
2604    10.0
2605    10.0
2606    10.0
2607    10.0
2608    10.0
2609    10.0
2610    10.0
2611    10.0
2612    10.0
2613    10.0
2614    10.0
2615    10.0
2616    10.0
2617    10.0
2618    10.0
2619    10.0
2620    10.0
2621    10.0
2622    10.0
2623    10.0
2624    10.0
2625    10.0
2626    10.0
2627    10.0
2628    10.0
2629    10.0
2630    10.0
2631    10.0
2632    10.0
2633    10.0
2634    10.0
2635    10.0
2636    39.0
2637    39.0
2638    39.0
2639    39.0
2640    39.0
2641    39.0
2642    39.0
2643    39.0
2644    39.0
2645    12.0
2646    12.0
2647    12.0
2648    12.0
2649    12.0
2650    12.0
2651    12.0
2652    12.0
2653    12.0
2654    12.0
2655    12.0
2656    12.0
2657    12.0
2658    12.0
2659    12.0
2660    12.0
2661    12.0
2662    12.0
2663    12.0
2664    12.0
2665    12.0
2666    41.0
2667    41.0
2668    41.0
2669    41.0
2670    41.0
2671    41.0
2672    41.0
2673    41.0
2674    41.0
2675    41.0
2676    41.0
2677    41.0
2678    41.0
2679    41.0
2680    41.0
2681    41.0
2682    41.0
2683    41.0
2684    41.0
2685    41.0
2686    41.0
2687    41.0
2688    41.0
2689    14.0
2690    14.0
2691    14.0
2692    14.0
2693    14.0
2694    14.0
2695    14.0
2696    14.0
2697    14.0
2698    14.0
2699    14.0
2700    14.0
2701    14.0
2702    14.0
2703    14.0
2704    14.0
2705    14.0
2706    14.0
2707    14.0
2708    14.0
2709    14.0
2710    14.0
2711    14.0
2712    14.0
2713    14.0
2714    14.0
2715    14.0
2716    14.0
2717    14.0
2718    14.0
2719    14.0
2720    14.0
2721    14.0
2722    14.0
2723    14.0
2724    43.0
2725    43.0
2726    43.0
2727    43.0
2728    43.0
2729    43.0
2730    43.0
2731    43.0
2732    43.0
2733    43.0
2734    43.0
2735    43.0
2736    43.0
2737    43.0
2738    43.0
2739    43.0
2740    43.0
2741    43.0
2742    43.0
2743    43.0
2744    43.0
2745    43.0
2746    43.0
2747    43.0
2748    43.0
2749    43.0
2750    43.0
2751    43.0
2752    43.0
2753    43.0
2754    43.0
2755    43.0
2756    43.0
2757    43.0
2758    43.0
2759    43.0
2760    43.0
2761    43.0
2762    43.0
2763    43.0
2764    43.0
2765    43.0
2766    43.0
2767    43.0
2768    43.0
2769    43.0
2770    43.0
2771    43.0
2772    43.0
2773    43.0
2774    43.0
2775    43.0
2776    43.0
2777    43.0
2778    43.0
2779    43.0
2780    43.0
2781    43.0
2782    43.0
2783    43.0
2784    43.0
2785    43.0
2786    43.0
2787     2.0
2788     2.0
2789     2.0
2790     2.0
2791     2.0
2792     2.0
2793     2.0
2794     2.0
2795     2.0
2796     2.0
2797     2.0
2798     2.0
2799     2.0
2800     2.0
2801     2.0
2802     2.0
2803     2.0
2804     2.0
2805     2.0
2806     2.0
2807     2.0
2808     2.0
2809     2.0
2810     2.0
2811     2.0
2812     2.0
2813     2.0
2814     2.0
2815     2.0
2816     2.0
2817     2.0
2818     2.0
2819     2.0
2820     2.0
2821     2.0
2822     2.0
2823     2.0
2824     2.0
2825     2.0
2826     2.0
2827     2.0
2828     2.0
2829     2.0
2830     2.0
2831    31.0
2832    31.0
2833    31.0
2834    31.0
2835    31.0
2836    31.0
2837    31.0
2838    31.0
2839    31.0
2840    31.0
2841    31.0
2842    31.0
2843    31.0
2844    31.0
2845    31.0
2846    31.0
2847    31.0
2848    31.0
2849    31.0
2850    31.0
2851    31.0
2852    31.0
2853    31.0
2854    31.0
2855    31.0
2856    31.0
2857    31.0
2858    31.0
2859    31.0
2860    31.0
2861    31.0
2862    31.0
2863    31.0
2864    31.0
2865    31.0
2866    31.0
2867    31.0
2868    31.0
2869    31.0
2870    31.0
2871    31.0
2872    31.0
2873    31.0
2874    31.0
2875    31.0
2876    31.0
2877    31.0
2878    31.0
2879    31.0
2880    31.0
2881    31.0
2882    31.0
2883    31.0
2884    31.0
2885    31.0
2886    31.0
2887    31.0
2888    31.0
2889    31.0
2890    31.0
2891    31.0
2892     4.0
2893     4.0
2894     4.0
2895     4.0
2896     4.0
2897     4.0
2898     4.0
2899     4.0
2900     4.0
2901     4.0
2902     4.0
2903     4.0
2904     4.0
2905     4.0
2906     4.0
2907     4.0
2908     4.0
2909     4.0
2910     4.0
2911     4.0
2912     4.0
2913     4.0
2914     4.0
2915     4.0
2916     4.0
2917     4.0
2918     4.0
2919     4.0
2920     4.0
2921     4.0
2922     4.0
2923     4.0
2924    33.0
2925    33.0
2926    33.0
2927    33.0
2928    33.0
2929    33.0
2930    33.0
2931    33.0
2932    33.0
2933    33.0
2934    33.0
2935    33.0
2936    33.0
2937    33.0
2938    33.0
2939    33.0
2940    33.0
2941    33.0
2942    33.0
2943    33.0
2944     6.0
2945     6.0
2946     6.0
2947     6.0
2948     6.0
2949     6.0
2950     6.0
2951     6.0
2952     6.0
2953     6.0
2954     6.0
2955     6.0
2956     6.0
2957     6.0
2958     6.0
2959     6.0
2960     6.0
2961     6.0
2962     6.0
2963     6.0
2964     6.0
2965     6.0
2966     6.0
2967     6.0
2968    35.0
2969    35.0
2970    35.0
2971    35.0
2972    35.0
2973    35.0
2974    35.0
2975    35.0
2976    35.0
2977    35.0
2978    35.0
2979    35.0
2980    35.0
2981    35.0
2982    35.0
2983    35.0
2984    35.0
2985    35.0
2986    35.0
2987    35.0
2988    35.0
2989     8.0
2990     8.0
2991     8.0
2992     8.0
2993     8.0
2994     8.0
2995     8.0
2996     8.0
2997     8.0
2998     8.0
2999     8.0
3000     8.0
3001     8.0
3002     8.0
3003     8.0
3004     8.0
3005     8.0
3006     8.0
3007     8.0
3008     8.0
3009     8.0
3010     8.0
3011     8.0
3012     8.0
3013     8.0
3014     8.0
3015     8.0
3016     8.0
3017    37.0
3018    37.0
3019    37.0
3020    37.0
3021    37.0
3022    37.0
3023    37.0
3024    37.0
3025    37.0
3026    37.0
3027    37.0
3028    37.0
3029    10.0
3030    10.0
3031    10.0
3032    10.0
3033    10.0
3034    10.0
3035    10.0
3036    10.0
3037    10.0
3038    10.0
3039    10.0
3040    10.0
3041    10.0
3042    10.0
3043    10.0
3044    10.0
3045    10.0
3046    10.0
3047    10.0
3048    10.0
3049    10.0
3050    10.0
3051    39.0
3052    39.0
3053    39.0
3054    39.0
3055    39.0
3056    39.0
3057    39.0
3058    39.0
3059    39.0
3060    39.0
3061    39.0
3062    39.0
3063    39.0
3064    12.0
3065    12.0
3066    12.0
3067    12.0
3068    12.0
3069    12.0
3070    41.0
3071    41.0
3072    41.0
3073    41.0
3074    41.0
3075    41.0
3076    41.0
3077    41.0
3078    41.0
3079    41.0
3080    41.0
3081    41.0
3082    41.0
3083    41.0
3084    41.0
3085    41.0
3086    41.0
3087    41.0
3088    41.0
3089    41.0
3090    41.0
3091    41.0
3092    41.0
3093    41.0
3094    41.0
3095    41.0
3096    41.0
3097    41.0
3098    41.0
3099    41.0
3100    41.0
3101    41.0
3102    41.0
3103    41.0
3104    41.0
3105    41.0
3106    41.0
3107    14.0
3108    14.0
3109    14.0
3110    14.0
3111    14.0
3112    14.0
3113    14.0
3114    14.0
3115    14.0
3116    14.0
3117    14.0
3118    14.0
3119    14.0
3120    14.0
3121    14.0
3122    14.0
3123    14.0
3124    14.0
3125    14.0
3126    14.0
3127    14.0
3128    14.0
3129    14.0
3130    14.0
3131    14.0
3132    14.0
3133    14.0
3134    14.0
3135    14.0
3136    14.0
3137    14.0
3138    14.0
3139    14.0
3140    14.0
3141    14.0
3142    14.0
3143    43.0
3144    43.0
3145    43.0
3146    43.0
3147    43.0
3148    43.0
3149    43.0
3150    43.0
3151    43.0
3152    43.0
3153    43.0
3154    43.0
3155    43.0
3156    43.0
3157    43.0
3158    43.0
3159    43.0
3160    43.0
3161    43.0
3162    43.0
3163    43.0
3164    43.0
3165    43.0
3166    43.0
3167    43.0
3168    43.0
3169    43.0
3170    43.0
3171    43.0
3172    43.0
3173    43.0
3174    43.0
3175    43.0
3176    43.0
3177    43.0
3178    43.0
3179    43.0
3180    43.0
3181    43.0
3182    43.0
3183    43.0
3184    43.0
3185    43.0
3186    43.0
3187    43.0
3188    43.0
3189    43.0
3190    43.0
3191    43.0
3192    43.0
3193    43.0
3194    43.0
3195    43.0
3196    43.0
3197    43.0
3198    43.0
3199    43.0
3200    43.0
3201    43.0
3202    43.0
3203    43.0
3204    43.0
3205    43.0
3206    43.0
3207    43.0
3208    43.0
3209    43.0
3210    43.0
3211    43.0
3212    43.0
3213    43.0
3214     2.0
3215     2.0
3216     2.0
3217    31.0
3218    31.0
3219    31.0
3220    31.0
3221    35.0
3222    35.0
3223    35.0
3224    35.0
3225    43.0
3226    43.0
3227    43.0
3228    43.0
3229    43.0
3230    43.0
3231    43.0
3232    35.0
3233    35.0
3234    35.0
3235    35.0
3236    35.0
3237    35.0
3238    35.0
3239    35.0
3240    35.0
3241    35.0
3242    41.0
3243    41.0
3244    41.0
3245    41.0
3246    41.0
3247    41.0
3248    41.0
3249    41.0
3250    41.0
3251    41.0
3252    41.0
3253    41.0
3254    41.0
3255    41.0
3256    41.0
3257    41.0
3258    41.0
3259    41.0
3260    41.0
3261    41.0
3262    41.0
3263    41.0
3264    41.0
3265    41.0
3266    41.0
3267    41.0
3268    41.0
3269    41.0
3270    41.0
3271    41.0
3272    41.0
3273    41.0
3274    41.0
3275    41.0
3276    41.0
3277    41.0
3278    41.0
3279    41.0
3280    43.0
3281    43.0
3282     2.0
3283     2.0
3284     2.0
3285     2.0
3286     2.0
3287     2.0
3288     2.0
3289     2.0
3290     2.0
3291     2.0
3292     2.0
3293     2.0
3294     2.0
3295     2.0
3296     2.0
3297     2.0
3298     2.0
3299     2.0
3300     2.0
3301     2.0
3302     2.0
3303     2.0
3304     2.0
3305     2.0
3306     2.0
3307     2.0
3308     2.0
3309     2.0
3310     2.0
3311     2.0
3312     2.0
3313     2.0
3314     2.0
3315     2.0
3316     2.0
3317     2.0
3318     2.0
3319     2.0
3320     2.0
3321     2.0
3322     2.0
3323     2.0
3324     2.0
3325    31.0
3326    31.0
3327    31.0
3328    31.0
3329    31.0
3330    31.0
3331    31.0
3332    31.0
3333    31.0
3334    31.0
3335    31.0
3336    31.0
3337    31.0
3338    31.0
3339    31.0
3340    31.0
3341    31.0
3342    31.0
3343    31.0
3344    31.0
3345    31.0
3346    31.0
3347    31.0
3348    31.0
3349    31.0
3350    31.0
3351    31.0
3352    31.0
3353    31.0
3354    31.0
3355    31.0
3356    31.0
3357    31.0
3358    31.0
3359    31.0
3360    31.0
3361    31.0
3362    31.0
3363    31.0
3364    31.0
3365    31.0
3366    31.0
3367    31.0
3368    31.0
3369    31.0
3370    31.0
3371    31.0
3372    31.0
3373    31.0
3374    31.0
3375    31.0
3376    31.0
3377    31.0
3378    31.0
3379    31.0
3380    31.0
3381    31.0
3382    31.0
3383    31.0
3384    31.0
3385    31.0
3386    31.0
3387    31.0
3388    31.0
3389    31.0
3390    31.0
3391    31.0
3392    31.0
3393    31.0
3394    31.0
3395    31.0
3396    31.0
3397    31.0
3398    31.0
3399    31.0
3400    31.0
3401    31.0
3402    31.0
3403    31.0
3404    31.0
3405    31.0
3406    31.0
3407     4.0
3408     4.0
3409     4.0
3410     4.0
3411     4.0
3412     4.0
3413     4.0
3414     4.0
3415     4.0
3416     4.0
3417     4.0
3418     4.0
3419     4.0
3420     4.0
3421     4.0
3422     4.0
3423     4.0
3424     4.0
3425     4.0
3426     4.0
3427     4.0
3428     4.0
3429     4.0
3430     4.0
3431     4.0
3432     4.0
3433     4.0
3434     4.0
3435     4.0
3436     4.0
3437     4.0
3438     4.0
3439    33.0
3440    33.0
3441    33.0
3442    33.0
3443    33.0
3444    33.0
3445    33.0
3446    33.0
3447    33.0
3448    33.0
3449    33.0
3450    33.0
3451    33.0
3452    33.0
3453    33.0
3454    33.0
3455    33.0
3456    33.0
3457    33.0
3458    33.0
3459    33.0
3460    33.0
3461    33.0
3462    33.0
3463    33.0
3464    33.0
3465    33.0
3466    33.0
3467    33.0
3468    33.0
3469    33.0
3470    33.0
3471    33.0
3472    33.0
3473    33.0
3474    33.0
3475    33.0
3476    33.0
3477    33.0
3478     6.0
3479     6.0
3480     6.0
3481     6.0
3482     6.0
3483     6.0
3484     6.0
3485     6.0
3486     6.0
3487     6.0
3488    35.0
3489    35.0
3490    35.0
3491    35.0
3492    35.0
3493    35.0
3494    35.0
3495    35.0
3496    35.0
3497    35.0
3498    35.0
3499    35.0
3500    35.0
3501    35.0
3502    35.0
3503    35.0
3504    35.0
3505    35.0
3506    35.0
3507    35.0
3508    35.0
3509    35.0
3510    35.0
3511    35.0
3512    35.0
3513    35.0
3514     8.0
3515     8.0
3516     8.0
3517     8.0
3518     8.0
3519     8.0
3520     8.0
3521     8.0
3522     8.0
3523     8.0
3524     8.0
3525     8.0
3526     8.0
3527     8.0
3528     8.0
3529     8.0
3530     8.0
3531     8.0
3532     8.0
3533     8.0
3534     8.0
3535     8.0
3536    37.0
3537    37.0
3538    37.0
3539    37.0
3540    37.0
3541    37.0
3542    37.0
3543    37.0
3544    37.0
3545    37.0
3546    10.0
3547    10.0
3548    10.0
3549    10.0
3550    10.0
3551    10.0
3552    10.0
3553    10.0
3554    10.0
3555    10.0
3556    10.0
3557    10.0
3558    10.0
3559    10.0
3560    10.0
3561    10.0
3562    10.0
3563    10.0
3564    10.0
3565    10.0
3566    39.0
3567    39.0
3568    39.0
3569    39.0
3570    39.0
3571    39.0
3572    39.0
3573    39.0
3574    39.0
3575    39.0
3576    39.0
3577    39.0
3578    39.0
3579    12.0
3580    12.0
3581    12.0
3582    12.0
3583    12.0
3584    12.0
3585    12.0
3586    12.0
3587    12.0
3588    12.0
3589    12.0
3590    12.0
3591    12.0
3592    12.0
3593    41.0
3594    41.0
3595    41.0
3596    41.0
3597    41.0
3598    41.0
3599    41.0
3600    41.0
3601    41.0
3602    41.0
3603    41.0
3604    41.0
3605    41.0
3606    41.0
3607    41.0
3608    41.0
3609    41.0
3610    41.0
3611    41.0
3612    41.0
3613    41.0
3614    41.0
3615    41.0
3616    41.0
3617    41.0
3618    41.0
3619    41.0
3620    41.0
3621    41.0
3622    41.0
3623    41.0
3624    41.0
3625    41.0
3626    41.0
3627    41.0
3628    41.0
3629    41.0
3630    41.0
3631    41.0
3632    41.0
3633    41.0
3634    41.0
3635    41.0
3636    14.0
3637    14.0
3638    14.0
3639    14.0
3640    14.0
3641    14.0
3642    14.0
3643    14.0
3644    14.0
3645    14.0
3646    14.0
3647    14.0
3648    14.0
3649    14.0
3650    14.0
3651    14.0
3652    14.0
3653    14.0
3654    14.0
3655    14.0
3656    14.0
3657    14.0
3658    14.0
3659    14.0
3660    14.0
3661    14.0
3662    14.0
3663    14.0
3664    14.0
3665    14.0
3666    14.0
3667    14.0
3668    14.0
3669    14.0
3670    14.0
3671    14.0
3672    14.0
3673    14.0
3674    14.0
3675    14.0
3676    14.0
3677    43.0
3678    43.0
3679    43.0
3680    43.0
3681    43.0
3682    43.0
3683    43.0
3684    43.0
3685    43.0
3686    43.0
3687    43.0
3688    43.0
3689    43.0
3690    43.0
3691    43.0
3692    43.0
3693    43.0
3694    43.0
3695    43.0
3696    43.0
3697    43.0
3698    43.0
3699    43.0
3700    43.0
3701    43.0
3702    43.0
3703    43.0
3704    43.0
3705    43.0
3706    43.0
3707    43.0
3708    43.0
3709    43.0
3710    43.0
3711    43.0
3712    43.0
3713    43.0
3714    43.0
3715    43.0
3716    43.0
3717    43.0
3718    43.0
3719    43.0
3720    43.0
3721    43.0
3722    43.0
3723    43.0
3724    43.0
3725    43.0
3726    43.0
3727    43.0
3728    43.0
3729    43.0
3730    43.0
3731    43.0
3732    43.0
3733    43.0
3734    43.0
3735    43.0
3736    43.0
3737    43.0
3738    43.0
3739    43.0
3740    43.0
3741    43.0
3742    43.0
3743    43.0
3744    43.0
3745     2.0
3746     2.0
3747     2.0
3748     2.0
3749     2.0
3750     2.0
3751     2.0
3752     2.0
3753     2.0
3754     2.0
3755     2.0
3756     2.0
3757     2.0
3758     2.0
3759     2.0
3760     2.0
3761     2.0
3762     2.0
3763     2.0
3764    31.0
3765    31.0
3766    31.0
3767    31.0
3768    31.0
3769    31.0
3770    31.0
3771    31.0
3772    31.0
3773    31.0
3774    31.0
3775    31.0
3776    31.0
3777    31.0
3778    31.0
3779    31.0
3780    31.0
3781    31.0
3782    31.0
3783    31.0
3784    31.0
3785    31.0
3786    31.0
3787    31.0
3788    31.0
3789    31.0
3790    31.0
3791    31.0
3792    31.0
3793    31.0
3794    31.0
3795    31.0
3796    31.0
3797    31.0
3798    31.0
3799    31.0
3800    31.0
3801    31.0
3802    31.0
3803    31.0
3804    31.0
3805    31.0
3806    31.0
3807    31.0
3808    31.0
3809    31.0
3810    31.0
3811    31.0
3812    31.0
3813    31.0
3814    31.0
3815    31.0
3816    31.0
3817    31.0
3818    31.0
3819    31.0
3820    31.0
3821    31.0
3822    31.0
3823    31.0
3824    31.0
3825    31.0
3826    31.0
3827    31.0
3828    31.0
3829    31.0
3830    31.0
3831    31.0
3832    31.0
3833    31.0
3834    31.0
3835    31.0
3836    31.0
3837    31.0
3838    31.0
3839    31.0
3840    31.0
3841    31.0
3842    31.0
3843    31.0
3844    31.0
3845    31.0
3846    31.0
3847    31.0
3848    31.0
3849    31.0
3850    31.0
3851    31.0
3852    31.0
3853    31.0
3854    31.0
3855    31.0
3856    31.0
3857    31.0
3858     4.0
3859     4.0
3860     4.0
3861     4.0
3862     4.0
3863     4.0
3864     4.0
3865     4.0
3866     4.0
3867     4.0
3868     4.0
3869     4.0
3870     4.0
3871     4.0
3872     4.0
3873     4.0
3874     4.0
3875     4.0
3876     4.0
3877     4.0
3878     4.0
3879     4.0
3880    33.0
3881    33.0
3882    33.0
3883    33.0
3884    33.0
3885    33.0
3886    33.0
3887    33.0
3888    33.0
3889    33.0
3890    33.0
3891    33.0
3892    33.0
3893    33.0
3894    33.0
3895    33.0
3896    33.0
3897    33.0
3898    33.0
3899    33.0
3900    33.0
3901    33.0
3902    33.0
3903    33.0
3904    33.0
3905    33.0
3906    33.0
3907    33.0
3908    33.0
3909    33.0
3910    33.0
3911     6.0
3912     6.0
3913     6.0
3914     6.0
3915     6.0
3916     6.0
3917     6.0
3918     6.0
3919     6.0
3920     6.0
3921     6.0
3922     6.0
3923     6.0
3924    35.0
3925    35.0
3926    35.0
3927    35.0
3928    35.0
3929    35.0
3930    35.0
3931    35.0
3932    35.0
3933    35.0
3934    35.0
3935    35.0
3936    35.0
3937    35.0
3938    35.0
3939    35.0
3940    35.0
3941    35.0
3942    35.0
3943    35.0
3944    35.0
3945    35.0
3946    35.0
3947    35.0
3948    35.0
3949    35.0
3950    35.0
3951    35.0
3952    35.0
3953     8.0
3954     8.0
3955     8.0
3956     8.0
3957     8.0
3958     8.0
3959     8.0
3960     8.0
3961     8.0
3962     8.0
3963     8.0
3964     8.0
3965     8.0
3966     8.0
3967     8.0
3968    37.0
3969    37.0
3970    37.0
3971    37.0
3972    37.0
3973    37.0
3974    37.0
3975    37.0
3976    37.0
3977    37.0
3978    37.0
3979    37.0
3980    37.0
3981    37.0
3982    10.0
3983    10.0
3984    10.0
3985    10.0
3986    10.0
3987    10.0
3988    10.0
3989    10.0
3990    10.0
3991    10.0
3992    10.0
3993    10.0
3994    10.0
3995    10.0
3996    39.0
3997    39.0
3998    39.0
3999    39.0
4000    39.0
4001    39.0
4002    39.0
4003    39.0
4004    39.0
4005    39.0
4006    39.0
4007    39.0
4008    39.0
4009    39.0
4010    39.0
4011    41.0
4012    41.0
4013    41.0
4014    41.0
4015    41.0
4016    41.0
4017    41.0
4018    41.0
4019    41.0
4020    41.0
4021    41.0
4022    41.0
4023    41.0
4024    41.0
4025    41.0
4026    41.0
4027    41.0
4028    41.0
4029    41.0
4030    41.0
4031    41.0
4032    41.0
4033    41.0
4034    41.0
4035    41.0
4036    41.0
4037    41.0
4038    41.0
4039    41.0
4040    41.0
4041    41.0
4042    41.0
4043    41.0
4044    41.0
4045    41.0
4046    41.0
4047    41.0
4048    41.0
4049    41.0
4050    41.0
4051    41.0
4052    41.0
4053    41.0
4054    41.0
4055    41.0
4056    41.0
4057    41.0
4058    41.0
4059    41.0
4060    41.0
4061    41.0
4062    41.0
4063    41.0
4064    41.0
4065    41.0
4066    41.0
4067    41.0
4068    41.0
4069    41.0
4070    41.0
4071    41.0
4072    41.0
4073    41.0
4074    41.0
4075    41.0
4076    14.0
4077    14.0
4078    14.0
4079    14.0
4080    14.0
4081    14.0
4082    14.0
4083    14.0
4084    14.0
4085    14.0
4086    14.0
4087    14.0
4088    14.0
4089    14.0
4090    14.0
4091    14.0
4092    14.0
4093    14.0
4094    14.0
4095    14.0
4096    14.0
4097    14.0
4098    14.0
4099    14.0
4100    14.0
4101    14.0
4102    14.0
4103    14.0
4104    14.0
4105    43.0
4106    43.0
4107    43.0
4108    43.0
4109    43.0
4110    43.0
4111    43.0
4112    43.0
4113    43.0
4114    43.0
4115    43.0
4116    43.0
4117    43.0
4118    43.0
4119    43.0
4120    43.0
4121    43.0
4122    43.0
4123    43.0
4124    43.0
4125    43.0
4126    43.0
4127    43.0
4128    43.0
4129    43.0
4130    43.0
4131    43.0
4132    43.0
4133    43.0
4134    43.0
4135    43.0
4136    43.0
4137    43.0
4138    43.0
4139    43.0
4140    43.0
4141    43.0
4142    43.0
4143    43.0
4144    43.0
4145    43.0
4146    43.0
4147    43.0
4148    43.0
Name: Formation, dtype: float64

Creating X and Y


In [345]:
X = dataFormIsInt[featureIncForm_names].values
# dataFormIsInt
#  used to be X = data[feature_names].values (sp?)
# XplusFacies = data[featurePlusFacies_names].values
y = dataFormIsInt['Facies'].values

In [ ]:

getting well, depth, facies which are list of all the individual instances of each column (well, depth, facies)


In [346]:
well = dataFormIsInt['Well Name'].values
depth = dataFormIsInt['Depth'].values
facies = dataFormIsInt['Facies'].values

In [ ]:

Plotting descriptive visual statistics


In [347]:
# Define function for plotting feature statistics
def plot_feature_stats(X, y, featureIncForm_names, facies_colors, facies_names):
    
    # Remove NaN
    nan_idx = np.any(np.isnan(X), axis=1)
    X = X[np.logical_not(nan_idx), :]
    y = y[np.logical_not(nan_idx)]
    
    # Merge features and labels into a single DataFrame
    features = pd.DataFrame(X, columns=featureIncForm_names)
    labels = pd.DataFrame(y, columns=['Facies'])
    for f_idx, facies in enumerate(facies_names):
        labels[labels[:] == f_idx] = facies
    data = pd.concat((labels, features), axis=1)

    # Plot features statistics
    facies_color_map = {}
    for ind, label in enumerate(facies_names):
        facies_color_map[label] = facies_colors[ind]

    sns.pairplot(data, hue='Facies', palette=facies_color_map, hue_order=list(reversed(facies_names)))

In [348]:
# Feature distribution
plot_feature_stats(X, y, featureIncForm_names, facies_colors, facies_names)
mpl.rcParams.update(inline_rc)


The same as above but with hue as formation instead of facies (note: legend still says 'facies')

using formation_names & formation_colors


In [349]:
# Feature distribution
plot_feature_stats(X, y, featureIncForm_names, formation_colors, formation_names)
mpl.rcParams.update(inline_rc)


Bar Charts of Number of datapoints of each Facies by well


In [350]:
# Facies per well
for w_idx, w in enumerate(np.unique(well)):
    ax = plt.subplot(3, 4, w_idx+1)
    hist = np.histogram(y[well == w], bins=np.arange(len(facies_names)+1)+.5)
    plt.bar(np.arange(len(hist[0])), hist[0], color=facies_colors, align='center')
    ax.set_xticks(np.arange(len(hist[0])))
    ax.set_xticklabels(facies_names)
    ax.set_title(w)


Bar Charts of Number of datapoints of each Facies by well


In [351]:
# Facies per formation
for w_idx, w in enumerate(np.unique(formation)):
    ax = plt.subplot(4, 4, w_idx+1)
    hist = np.histogram(y[formation == w], bins=np.arange(len(facies_names)+1)+.5)
    plt.bar(np.arange(len(hist[0])), hist[0], color=facies_colors, align='center')
    ax.set_xticks(np.arange(len(hist[0])))
    ax.set_xticklabels(facies_names)
    ax.set_title(w)


Checks if every feature is in each well


In [352]:
# Features per well
for w_idx, w in enumerate(np.unique(well)):
    ax = plt.subplot(3, 4, w_idx+1)
    hist = np.logical_not(np.any(np.isnan(X[well == w, :]), axis=0))
    plt.bar(np.arange(len(hist)), hist, color=facies_colors, align='center')
    ax.set_xticks(np.arange(len(hist)))
    ax.set_xticklabels(feature_names)
    ax.set_yticks([0, 1])
    ax.set_yticklabels(['miss', 'hit'])
    ax.set_title(w)


PE is missing in several of the wells: Alexander D, Kinzey A, and Recruit F9


In [353]:
reg = RandomForestRegressor(max_features='sqrt', n_estimators=50)
### was DataImpAll = data[feature_names].copy() now is below
DataImpAll = dataFormIsInt[featureIncForm_names].copy()
DataImp = DataImpAll.dropna(axis = 0, inplace=False)
Ximp=DataImp.loc[:, DataImp.columns != 'PE']
Yimp=DataImp.loc[:, 'PE']
reg.fit(Ximp, Yimp)
X[np.array(DataImpAll.PE.isnull()),4] = reg.predict(DataImpAll.loc[DataImpAll.PE.isnull(),:].drop('PE',axis=1,inplace=False))

Now I'll run Paolo Bestagini's routine to include a small window of values to acount for the spatial component in the log analysis, as well as the gradient information with respect to depth. This will be our prepared training dataset.


In [354]:
# Feature windows concatenation function
def augment_features_window(X, N_neig):
    
    # Parameters
    N_row = X.shape[0]
    N_feat = X.shape[1]

    # Zero padding
    X = np.vstack((np.zeros((N_neig, N_feat)), X, (np.zeros((N_neig, N_feat)))))

    # Loop over windows
    X_aug = np.zeros((N_row, N_feat*(2*N_neig+1)))
    for r in np.arange(N_row)+N_neig:
        this_row = []
        for c in np.arange(-N_neig,N_neig+1):
            this_row = np.hstack((this_row, X[r+c]))
        X_aug[r-N_neig] = this_row

    return X_aug

In [355]:
# Feature gradient computation function
def augment_features_gradient(X, depth):
    
    # Compute features gradient
    d_diff = np.diff(depth).reshape((-1, 1))
    d_diff[d_diff==0] = 0.001
    X_diff = np.diff(X, axis=0)
    X_grad = X_diff / d_diff
        
    # Compensate for last missing value
    X_grad = np.concatenate((X_grad, np.zeros((1, X_grad.shape[1]))))
    
    return X_grad

In [356]:
# Feature augmentation function
def augment_features(X, well, depth, N_neig=1):
    
    # Augment features
    X_aug = np.zeros((X.shape[0], X.shape[1]*(N_neig*2+2)))
    for w in np.unique(well):
        w_idx = np.where(well == w)[0]
        X_aug_win = augment_features_window(X[w_idx, :], N_neig)
        X_aug_grad = augment_features_gradient(X[w_idx, :], depth[w_idx])
        X_aug[w_idx, :] = np.concatenate((X_aug_win, X_aug_grad), axis=1)
    
    # Find padded rows
    padded_rows = np.unique(np.where(X_aug[:, 0:7] == np.zeros((1, 7)))[0])
    
    return X_aug, padded_rows

In [357]:
X_aug, padded_rows = augment_features(X, well, depth)

In [360]:
# Initialize model selection methods
lpgo = LeavePGroupsOut(2)

# Generate splits
split_list = []
# changed data to dataFormIsInt in line below
for train, val in lpgo.split(X, y, groups=dataFormIsInt['Well Name']):
    hist_tr = np.histogram(y[train], bins=np.arange(len(facies_names)+1)+.5)
    hist_val = np.histogram(y[val], bins=np.arange(len(facies_names)+1)+.5)
    if np.all(hist_tr[0] != 0) & np.all(hist_val[0] != 0):
        split_list.append({'train':train, 'val':val})
            
# Print splits
# for s, split in enumerate(split_list):
    print('Split %d' % s)
    print('    training:   %s' % (dataFormIsInt['Well Name'][split['train']].unique()))
    print('    validation: %s' % (dataFormIsInt['Well Name'][split['val']].unique()))


Split 0
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-360-bd6245b81b70> in <module>()
     14 # for s, split in enumerate(split_list):
     15     print('Split %d' % s)
---> 16     print('    training:   %s' % (dataFormIsInt['Well Name'][split['train']].unique()))
     17     print('    validation: %s' % (dataFormIsInt['Well Name'][split['val']].unique()))

/Users/justingosses/anaconda/lib/python3.5/site-packages/numpy/core/numeric.py in array_str(a, max_line_width, precision, suppress_small)
   1867 
   1868     """
-> 1869     return array2string(a, max_line_width, precision, suppress_small, ' ', "", str)
   1870 
   1871 def set_string_function(f, repr=True):

/Users/justingosses/anaconda/lib/python3.5/site-packages/numpy/core/arrayprint.py in array2string(a, max_line_width, precision, suppress_small, separator, prefix, style, formatter)
    445     else:
    446         lst = _array2string(a, max_line_width, precision, suppress_small,
--> 447                             separator, prefix, formatter=formatter)
    448     return lst
    449 

/Users/justingosses/anaconda/lib/python3.5/site-packages/numpy/core/arrayprint.py in _array2string(a, max_line_width, precision, suppress_small, separator, prefix, formatter)
    250         formatter = _formatter
    251 
--> 252     if a.size > _summaryThreshold:
    253         summary_insert = "..., "
    254         data = _leading_trailing(a)

TypeError: unorderable types: int() > str()

In [286]:
# Parameters search grid (uncomment parameters for full grid search... may take a lot of time)
md_grid = [3]
mcw_grid = [1]
gamma_grid = [0]
# changed gamma_grid to 0 from 0.3
ss_grid = [0.7]
# changed ss_grid to 0.7 from 1
csb_grid = [0.8]
alpha_grid =[0.2]
lr_grid = [0.05]
ne_grid = [200]
param_grid = []
for N in md_grid:
    for M in mcw_grid:
        for S in gamma_grid:
            for L in ss_grid:
                for K in csb_grid:
                    for P in alpha_grid:
                        for R in lr_grid:
                            for E in ne_grid:
                                param_grid.append({'maxdepth':N, 
                                                   'minchildweight':M, 
                                                   'gamma':S, 
                                                   'subsample':L,
                                                   'colsamplebytree':K,
                                                   'alpha':P,
                                                   'learningrate':R,
                                                   'n_estimators':E})

In [287]:
# Train and test a classifier
def train_and_test(X_tr, y_tr, X_v,well_v):
    
    #Feature normalization
    scaler = preprocessing.RobustScaler(quantile_range=(25.0, 75.0)).fit(X_tr)
    X_tr = scaler.transform(X_tr)
    X_v = scaler.transform(X_v)


    clf.fit(X_tr, y_tr)
    
    # Test classifier
    y_v_hat = clf.predict(X_v)
    
    # Clean isolated facies for each well
    for w in np.unique(well_v):
        y_v_hat[well_v==w] = medfilt(y_v_hat[well_v==w], kernel_size=5)
    
    return y_v_hat

In [288]:
# For each set of parameters
score_param = []
for param in param_grid:
    
    clf = OneVsOneClassifier(XGBClassifier(
            learning_rate = param['learningrate'],
            n_estimators=param['n_estimators'],
            max_depth=param['maxdepth'],
            min_child_weight=param['minchildweight'],
            gamma = param['gamma'],
            subsample=param['subsample'],
            colsample_bytree=param['colsamplebytree'],
            reg_alpha = param['alpha'],
            nthread = 1, 
            # changed nthread to 1 instead of 4 due to running on mac without NVIDA
            seed = seed,
        ) , n_jobs=-1)
    # For each data split
    score_split = []
    for split in split_list:
    
        # Remove padded rows
        split_train_no_pad = np.setdiff1d(split['train'], padded_rows)
        
        # Select training and validation data from current split
        X_tr = X_aug[split_train_no_pad, :]
        X_v = X_aug[split['val'], :]
        y_tr = y[split_train_no_pad]
        y_v = y[split['val']]
        
        # Select well labels for validation data
        well_v = well[split['val']]

        # Train and test
        y_v_hat = train_and_test(X_tr, y_tr, X_v, well_v)
        
        # Score
        score = f1_score(y_v, y_v_hat, average='micro')
        score_split.append(score)
        
    # Average score for this param
    score_param.append(np.mean(score_split))
    print('F1 score = %.3f %s' % (score_param[-1], param))
          
# Best set of parameters
best_idx = np.argmax(score_param)
param_best = param_grid[best_idx]
score_best = score_param[best_idx]
print('\nBest F1 score = %.3f %s' % (score_best, param_best))


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-288-fca8bb6387b6> in <module>()
     33 
     34         # Train and test
---> 35         y_v_hat = train_and_test(X_tr, y_tr, X_v, well_v)
     36 
     37         # Score

<ipython-input-287-9d3d3d421298> in train_and_test(X_tr, y_tr, X_v, well_v)
      3 
      4     #Feature normalization
----> 5     scaler = preprocessing.RobustScaler(quantile_range=(25.0, 75.0)).fit(X_tr)
      6     X_tr = scaler.transform(X_tr)
      7     X_v = scaler.transform(X_v)

/Users/justingosses/anaconda/lib/python3.5/site-packages/sklearn/preprocessing/data.py in fit(self, X, y)
   1004         if sparse.issparse(X):
   1005             raise TypeError("RobustScaler cannot be fitted on sparse inputs")
-> 1006         X = self._check_array(X, self.copy)
   1007         if X.ndim == 1:
   1008             warnings.warn(DEPRECATION_MSG_1D, DeprecationWarning)

/Users/justingosses/anaconda/lib/python3.5/site-packages/sklearn/preprocessing/data.py in _check_array(self, X, copy)
    981         """Makes sure centering is not enabled for sparse matrices."""
    982         X = check_array(X, accept_sparse=('csr', 'csc'), copy=self.copy,
--> 983                         ensure_2d=False, estimator=self, dtype=FLOAT_DTYPES)
    984 
    985         if X.ndim == 1:

/Users/justingosses/anaconda/lib/python3.5/site-packages/sklearn/utils/validation.py in check_array(array, accept_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, ensure_min_samples, ensure_min_features, warn_on_dtype, estimator)
    405                              % (array.ndim, estimator_name))
    406         if force_all_finite:
--> 407             _assert_all_finite(array)
    408 
    409     shape_repr = _shape_repr(array.shape)

/Users/justingosses/anaconda/lib/python3.5/site-packages/sklearn/utils/validation.py in _assert_all_finite(X)
     56             and not np.isfinite(X).all()):
     57         raise ValueError("Input contains NaN, infinity"
---> 58                          " or a value too large for %r." % X.dtype)
     59 
     60 

ValueError: Input contains NaN, infinity or a value too large for dtype('float64').

In [289]:
for param in param_grid:
    
    clf = OneVsOneClassifier(XGBClassifier(
            learning_rate = param['learningrate'],
            n_estimators=param['n_estimators'],
            max_depth=param['maxdepth'],
            min_child_weight=param['minchildweight'],
            gamma = param['gamma'],
            subsample=param['subsample'],
            colsample_bytree=param['colsamplebytree'],
            reg_alpha = param['alpha'],
            nthread = 1,
#             changed nthread to not be listed, it was 4
            seed = seed,
        ) , n_jobs=-1)

In [290]:
clf


Out[290]:
OneVsOneClassifier(estimator=XGBClassifier(base_score=0.5, colsample_bylevel=1, colsample_bytree=0.8,
       gamma=0, learning_rate=0.05, max_delta_step=0, max_depth=3,
       min_child_weight=1, missing=None, n_estimators=200, nthread=1,
       objective='binary:logistic', reg_alpha=0.2, reg_lambda=1,
       scale_pos_weight=1, seed=123, silent=True, subsample=0.7),
          n_jobs=-1)

Loading test data, running model on it, and exporting predictions


In [198]:
# Load data from file
test_data = pd.read_csv('validation_data_nofacies.csv')

In [199]:
# Prepare training data
X_tr = X
y_tr = y

# Augment features
X_tr, padded_rows = augment_features(X_tr, well, depth)


# Removed padded rows
X_tr = np.delete(X_tr, padded_rows, axis=0)
y_tr = np.delete(y_tr, padded_rows, axis=0)

In [200]:
# Prepare test data 2
well_ts = test_data['Well Name'].values
depth_ts = test_data['Depth'].values
X_ts = test_data[feature_names].values

# Augment features
X_ts, padded_rows = augment_features(X_ts, well_ts, depth_ts)

In [201]:
X_ts.shape


Out[201]:
(830, 28)

In [202]:
X_tr.shape


Out[202]:
(4131, 28)

In [203]:
y_tr.shape


Out[203]:
(4131,)

In [204]:
well_ts.shape


Out[204]:
(830,)

In [205]:
# Predict test labels
y_ts_hat = train_and_test(X_tr, y_tr, X_ts, well_ts)

In [206]:
# Save predicted labels
test_data['Facies'] = y_ts_hat
test_data.to_csv('Prediction4a.csv')

In [207]:
# print test data
test_data


Out[207]:
Formation Well Name Depth GR ILD_log10 DeltaPHI PHIND PE NM_M RELPOS Facies
0 A1 SH STUART 2808.0 66.276 0.630 3.300 10.650 3.591 1 1.000 3
1 A1 SH STUART 2808.5 77.252 0.585 6.500 11.950 3.341 1 0.978 3
2 A1 SH STUART 2809.0 82.899 0.566 9.400 13.600 3.064 1 0.956 3
3 A1 SH STUART 2809.5 80.671 0.593 9.500 13.250 2.977 1 0.933 3
4 A1 SH STUART 2810.0 75.971 0.638 8.700 12.350 3.020 1 0.911 3
... ... ... ... ... ... ... ... ... ... ... ...
825 C SH CRAWFORD 3158.5 86.078 0.554 5.040 16.150 3.161 1 0.639 2
826 C SH CRAWFORD 3159.0 88.855 0.539 5.560 16.750 3.118 1 0.611 3
827 C SH CRAWFORD 3159.5 90.490 0.530 6.360 16.780 3.168 1 0.583 3
828 C SH CRAWFORD 3160.0 90.975 0.522 7.035 16.995 3.154 1 0.556 3
829 C SH CRAWFORD 3160.5 90.108 0.513 7.505 17.595 3.125 1 0.528 2

830 rows × 11 columns

Display test data in log form


In [208]:
def make_facies_log_plot(logs, facies_colors):
    #make sure logs are sorted by depth
    logs = logs.sort_values(by='Depth')
    cmap_facies = colors.ListedColormap(
            facies_colors[0:len(facies_colors)], 'indexed')
    
    ztop=logs.Depth.min(); zbot=logs.Depth.max()
    
    cluster=np.repeat(np.expand_dims(logs['Facies'].values,1), 100, 1)
    
    f, ax = plt.subplots(nrows=1, ncols=6, figsize=(8, 12))
    ax[0].plot(logs.GR, logs.Depth, '-g')
    ax[1].plot(logs.ILD_log10, logs.Depth, '-')
    ax[2].plot(logs.DeltaPHI, logs.Depth, '-', color='0.5')
    ax[3].plot(logs.PHIND, logs.Depth, '-', color='r')
    ax[4].plot(logs.PE, logs.Depth, '-', color='black')
    im=ax[5].imshow(cluster, interpolation='none', aspect='auto',
                    cmap=cmap_facies,vmin=1,vmax=9)
    
    divider = make_axes_locatable(ax[5])
    cax = divider.append_axes("right", size="20%", pad=0.05)
    cbar=plt.colorbar(im, cax=cax)
    cbar.set_label((17*' ').join([' SS ', 'CSiS', 'FSiS', 
                                'SiSh', ' MS ', ' WS ', ' D  ', 
                                ' PS ', ' BS ']))
    cbar.set_ticks(range(0,1)); cbar.set_ticklabels('')
    
    for i in range(len(ax)-1):
        ax[i].set_ylim(ztop,zbot)
        ax[i].invert_yaxis()
        ax[i].grid()
        ax[i].locator_params(axis='x', nbins=3)
    
    ax[0].set_xlabel("GR")
    ax[0].set_xlim(logs.GR.min(),logs.GR.max())
    ax[1].set_xlabel("ILD_log10")
    ax[1].set_xlim(logs.ILD_log10.min(),logs.ILD_log10.max())
    ax[2].set_xlabel("DeltaPHI")
    ax[2].set_xlim(logs.DeltaPHI.min(),logs.DeltaPHI.max())
    ax[3].set_xlabel("PHIND")
    ax[3].set_xlim(logs.PHIND.min(),logs.PHIND.max())
    ax[4].set_xlabel("PE")
    ax[4].set_xlim(logs.PE.min(),logs.PE.max())
    ax[5].set_xlabel('Facies')
    
    ax[1].set_yticklabels([]); ax[2].set_yticklabels([]); ax[3].set_yticklabels([])
    ax[4].set_yticklabels([]); ax[5].set_yticklabels([])
    ax[5].set_xticklabels([])
    f.suptitle('Well: %s'%logs.iloc[0]['Well Name'], fontsize=14,y=0.94)

In [209]:
from mpl_toolkits.axes_grid1 import make_axes_locatable

In [210]:
# Plot predicted labels
make_facies_log_plot(
    test_data[test_data['Well Name'] == 'STUART'],
    facies_colors=facies_colors)

make_facies_log_plot(
    test_data[test_data['Well Name'] == 'CRAWFORD'],
    facies_colors=facies_colors)
mpl.rcParams.update(inline_rc)



In [ ]:


In [ ]:


In [ ]: