This notebook demonstrates how to use LIME with recurrent neural networks.

This focuses on Keras-style "stateless" recurrent neural networks. These networks expect input of shape (n_samples, n_timesteps, n_features), as opposed to the (n_samples, n_features) input that most other machine learning algorithms expect.
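
For concreteness, here is a minimal sketch (with made-up dimensions) of the two shapes:

import numpy as np

X_flat = np.zeros((500, 2))     # 500 samples x 2 features: typical tabular input
X_seq = np.zeros((500, 12, 2))  # 500 samples, each a 12-timestep window of 2 features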

To explain these models, we use the RecurrentTabularExplainer, a variant of the LimeTabularExplainer that takes care of reshaping the three-dimensional input appropriately.


In [1]:
# Imports
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense
from keras.optimizers import Adam
from keras.utils import to_categorical

from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import classification_report

from lime import lime_tabular

%matplotlib inline


Using Theano backend.

Data

We will use the $CO_2$ dataset, which measures the concentration of $CO_2$ above Mauna Loa every week since about 1960. The classification task will be deciding whether the concentration is rising. This is a problem that needs recurrence to solve (since the answer comes from the derivative), and it is less trivial than it sounds because there is noise in the data.

The data is included in the data subdirectory here, where I've added a column for the detrended data that ends up being useful for the network, as we shall see shortly.
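
The 'rising' target column also ships with the CSV. Its exact construction isn't shown here, but a plausible recipe (a hypothetical reconstruction, not necessarily how the shipped column was built) is the sign of the week-to-week change of a smoothed series:

# Hypothetical reconstruction of the 'rising' label: flag a positive
# week-to-week change in a smoothed CO2 series.
smoothed = df['co2'].rolling(window=8, center=True).mean()
rising = (smoothed.diff() > 0).astype(int)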


In [2]:
df = pd.read_csv('data/co2_data.csv', index_col=0, parse_dates=True)

In [3]:
fig, (left, right) = plt.subplots(nrows=1, ncols=2, figsize=(13, 5))
df[['co2']].plot(ax=left)
df[['co2_detrended']].plot(ax=right)


Out[3]:
<matplotlib.axes._subplots.AxesSubplot at 0x114a3d5c0>

Reshaping the dataset to be appropriate for the model


In [4]:
def reshape_data(seq, n_timesteps):
    """Slide a window of length n_timesteps over seq.

    Turns a (n_samples, n_features) array into the
    (n_windows, n_timesteps, n_features) array the LSTM expects.
    """
    N = len(seq) - n_timesteps - 1  # number of windows that still have a label after them
    nf = seq.shape[1]
    if N <= 0:
        raise ValueError('I need more data!')
    new_seq = np.zeros((N, n_timesteps, nf))
    for i in range(N):
        new_seq[i, :, :] = seq[i:i + n_timesteps]
    return new_seq
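
As a quick sanity check (a toy example), a 5-row, 2-feature array windowed with n_timesteps=3 yields exactly one window:

toy = np.arange(10).reshape(5, 2)        # 5 timesteps, 2 features
windows = reshape_data(toy, n_timesteps=3)
print(windows.shape)                      # (1, 3, 2): N = 5 - 3 - 1 = 1
print(windows[0])                         # rows 0..2 of toy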

In [5]:
N_TIMESTEPS = 12  # Look back 12 timesteps (12 weeks of weekly data)
data_columns = ['co2', 'co2_detrended']
target_columns = ['rising']

scaler = MinMaxScaler(feature_range=(-1, 1))
X_original = scaler.fit_transform(df[data_columns].values)
X = reshape_data(X_original, n_timesteps=N_TIMESTEPS)
y = to_categorical((df[target_columns].values[N_TIMESTEPS:-1]).astype(int))

# Train on the first 2000 samples, and test on the remaining 270
X_train = X[:2000]
y_train = y[:2000]
X_test = X[2000:]
y_test = y[2000:]
print(X.shape, y.shape)


(2270, 12, 2) (2270, 2)
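
Note how windows and labels line up: window i covers rows i through i + N_TIMESTEPS - 1 of the scaled data, and its label is the 'rising' flag at row i + N_TIMESTEPS, the step immediately after the window. A quick check of the first window:

# The first window should be exactly the first N_TIMESTEPS rows of the scaled data
assert np.allclose(X[0], X_original[:N_TIMESTEPS])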

Define the model


In [6]:
model = Sequential()
model.add(LSTM(32, input_shape=(N_TIMESTEPS, len(data_columns))))  # one 32-unit LSTM reads the (12, 2) windows
model.add(Dropout(0.2))
model.add(Dense(2, activation='softmax'))  # two-class output: P(falling), P(rising)

optimizer = Adam(lr=1e-4)
model.compile(loss='binary_crossentropy', optimizer=optimizer)

In [7]:
model.fit(X_train, y_train, batch_size=100, epochs=500,
          validation_data=(X_test, y_test),
          verbose=2)


Train on 2000 samples, validate on 270 samples
Epoch 1/500
0s - loss: 0.7026 - val_loss: 0.6739
Epoch 2/500
0s - loss: 0.6883 - val_loss: 0.6651
Epoch 3/500
0s - loss: 0.6748 - val_loss: 0.6558
Epoch 4/500
0s - loss: 0.6636 - val_loss: 0.6457
Epoch 5/500
0s - loss: 0.6485 - val_loss: 0.6352
...
[epochs 6-498 elided: loss falls steadily to about 0.185, while val_loss levels off around 0.227-0.23 from roughly epoch 200 onward]
...
Epoch 499/500
0s - loss: 0.1852 - val_loss: 0.2301
Epoch 500/500
0s - loss: 0.1830 - val_loss: 0.2298
Out[7]:
<keras.callbacks.History at 0x118efd4a8>
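
The validation loss clearly plateaus long before epoch 500, so on a re-run training could be cut short. A minimal sketch using Keras's EarlyStopping callback (the patience value here is a guess, not tuned):

from keras.callbacks import EarlyStopping

# Stop once val_loss has failed to improve for 25 consecutive epochs
early_stopping = EarlyStopping(monitor='val_loss', patience=25)
model.fit(X_train, y_train, batch_size=100, epochs=500,
          validation_data=(X_test, y_test),
          callbacks=[early_stopping], verbose=2)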

In [8]:
y_pred = np.argmax(model.predict(X_test), axis=1)
y_true = np.argmax(y_test, axis=1)
print(classification_report(y_true, y_pred))


             precision    recall  f1-score   support

          0       0.83      0.92      0.87       100
          1       0.95      0.89      0.92       170

avg / total       0.90      0.90      0.90       270


In [9]:
plt.plot(y_true, lw=3, alpha=0.3, label='Truth')
plt.plot(y_pred, '--', label='Predictions')
plt.legend(loc='best')


Out[9]:
<matplotlib.legend.Legend at 0x1192da2e8>

Explain the model with LIME


In [10]:
explainer = lime_tabular.RecurrentTabularExplainer(X_train, training_labels=y_train, feature_names=data_columns,
                                                   discretize_continuous=True,
                                                   class_names=['Falling', 'Rising'],
                                                   discretizer='decile')

In [12]:
exp = explainer.explain_instance(X_test[50], model.predict, num_features=10, labels=(1,))
exp.show_in_notebook()
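
The notebook widget is handy interactively, but the raw weights are also accessible from the Explanation object; a short sketch:

# Feature names combine a column with how many timesteps back it sits,
# e.g. 'co2_detrended_t-5'. Weights here are for the 'Rising' class (label 1).
for feature, weight in exp.as_list(label=1):
    print('{:>30s}  {:+.3f}'.format(feature, weight))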


We can see that the most important features are the detrended $CO_2$ concentration several timesteps in the past. In particular, if that feature was low in the recent past, the concentration is now probably rising.

