Loading data...
20000 train sequences
5000 test sequences
Pad sequences (samples x time)
X_train shape: (20000, 100)
X_test shape: (5000, 100)
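
The counts and shapes above match the standard 25,000-review Keras IMDB dataset split 20,000/5,000, with every review padded or truncated to 100 word indices. A minimal sketch of the loading and padding step, assuming the old Keras 0.x API that matches the YAML dump further down (parameter names such as nb_words and test_split belong to that API generation):

from __future__ import print_function
from keras.preprocessing import sequence
from keras.datasets import imdb

max_features = 20000   # vocabulary size; matches input_dim of the Embedding layer below
maxlen = 100           # pad/truncate every review to 100 word indices

print('Loading data...')
# test_split=0.2 on the 25,000-review IMDB set yields 20,000 train / 5,000 test sequences
(X_train, y_train), (X_test, y_test) = imdb.load_data(nb_words=max_features,
                                                      test_split=0.2)
print(len(X_train), 'train sequences')
print(len(X_test), 'test sequences')

print('Pad sequences (samples x time)')
X_train = sequence.pad_sequences(X_train, maxlen=maxlen)  # -> (20000, 100)
X_test = sequence.pad_sequences(X_test, maxlen=maxlen)    # -> (5000, 100)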
Build model...
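
The "Build model..." step corresponds to the Embedding -> LSTM -> Dropout -> Dense -> sigmoid stack described in the exported YAML below. A sketch in the same API, with the layer sizes (20,000-word vocabulary, 128-dimensional embedding and LSTM, dropout 0.5, single sigmoid output) and the Adam/binary_crossentropy compile settings read off that YAML; everything else is left at the defaults:

from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.embeddings import Embedding
from keras.layers.recurrent import LSTM

print('Build model...')
model = Sequential()
model.add(Embedding(max_features, 128, input_length=maxlen))  # 20,000-word vocab -> 128-dim vectors
model.add(LSTM(128))                                          # 128 recurrent units, only the last output is returned
model.add(Dropout(0.5))
model.add(Dense(1))                                           # linear projection to a single score
model.add(Activation('sigmoid'))                              # squash to a probability for binary sentiment

model.compile(loss='binary_crossentropy', optimizer='adam', class_mode='binary')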
Train...
Train on 20000 samples, validate on 5000 samples
Epoch 1/3
20000/20000 [==============================] - 832s - loss: 0.4704 - acc: 0.7756 - val_loss: 0.4304 - val_acc: 0.8060
Epoch 2/3
20000/20000 [==============================] - 837s - loss: 0.2697 - acc: 0.8921 - val_loss: 0.3656 - val_acc: 0.8292
Epoch 3/3
20000/20000 [==============================] - 860s - loss: 0.1641 - acc: 0.9401 - val_loss: 0.4020 - val_acc: 0.8388
5000/5000 [==============================] - 27s
Test score: 0.401985421205
Test accuracy: 0.8388
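
The progress bars and the final score/accuracy lines come from fit() and evaluate() on the compiled model. A sketch of those two calls; batch_size=32 is an assumption (the log does not show the batch size), and show_accuracy=True is the old-API switch that makes Keras report acc/val_acc alongside the loss:

print('Train...')
model.fit(X_train, y_train, batch_size=32, nb_epoch=3,
          validation_data=(X_test, y_test), show_accuracy=True)

score, acc = model.evaluate(X_test, y_test, batch_size=32, show_accuracy=True)
print('Test score:', score)
print('Test accuracy:', acc)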
Exporting model to YAML:
class_mode: binary
layers:
- W_constraint: null
  W_regularizer: null
  activity_regularizer: null
  cache_enabled: true
  init: uniform
  input_dim: 20000
  input_length: 100
  input_shape: !!python/tuple [20000]
  mask_zero: false
  name: Embedding
  output_dim: 128
- {activation: tanh, cache_enabled: true, forget_bias_init: one, go_backwards: false,
  init: glorot_uniform, inner_activation: hard_sigmoid, inner_init: orthogonal, input_dim: 128,
  input_length: null, name: LSTM, output_dim: 128, return_sequences: false, stateful: false}
- {cache_enabled: true, name: Dropout, p: 0.5}
- {W_constraint: null, W_regularizer: null, activation: linear, activity_regularizer: null,
  b_constraint: null, b_regularizer: null, cache_enabled: true, init: glorot_uniform,
  input_dim: null, name: Dense, output_dim: 1}
- {activation: sigmoid, cache_enabled: true, name: Activation}
loss: binary_crossentropy
name: Sequential
optimizer: {beta_1: 0.8999999761581421, beta_2: 0.9990000128746033, epsilon: 1.0e-08,
  lr: 0.0010000000474974513, name: Adam}
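
The dump above is produced by Keras's built-in serializer, which records the architecture and compile settings (loss, optimizer and its hyperparameters, class_mode) but not the trained weights. The export step itself is a single call:

print('Exporting model to YAML:')
yaml_string = model.to_yaml()
print(yaml_string)

Weights can be persisted separately with model.save_weights('imdb_lstm_weights.h5') if they are needed later (the file name here is just an example).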
/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_perform_ext.py:133: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility
from scan_perform.scan_perform import *