CNN 多通道情感分析

一个有三个通道,分别是word embedding,POS 标签 embedding, 词的情感极性强度embedding


In [1]:
import keras 
from  os.path import join
from keras.preprocessing import sequence
from keras.models import Sequential
from keras.layers import Dense, Dropout,Activation, Lambda,Input
from keras.layers import Embedding
from keras.layers import Convolution1D
from keras.datasets import imdb
from keras import backend as K
from keras.layers import Convolution1D, GlobalMaxPooling1D,Convolution2D,Merge,merge
from keras.utils import np_utils
from keras.models import Model
import nltk
from nltk.tag import pos_tag
import numpy as np
from keras.regularizers import l2
import theano


Using Theano backend.
Using gpu device 0: GeForce GT 630 (CNMeM is disabled, cuDNN not available)

POS当作一个通道。

Tag word 的方法: http://www.nltk.org/book/ch05.html


In [2]:
file_names = ['stsa.fine.test','stsa.fine.train','stsa.fine.dev']
file_path = '/home/bruce/data/sentiment/citai_process'
def read_file(fname=''):
    with open(join(file_path,fname)) as fr:
        lines = fr.readlines()
    lines = [line.strip().lower() for line in lines]
    lables = [int(line[0:1]) for line in lines]
    words = [line[2:].split() for line in lines]
    return words,lables       
train_X,train_y = read_file(fname='stsa.fine.train')
test_X,test_y = read_file(fname='stsa.fine.test')
dev_X,dev_y = read_file(fname='stsa.fine.dev')
print(len(train_X))
print(len(test_X))
print(len(dev_X))
print(train_X[0:2])
print(train_y[0:2])


8544
2210
1101
[['a', 'stirring', ',', 'funny', 'and', 'finally', 'transport', 're-imagining', 'of', 'beauty', 'and', 'the', 'beast', 'and', '1930s', 'horror', 'film'], ['apparently', 'reassemble', 'from', 'the', 'cutting-room', 'floor', 'of', 'any', 'give', 'daytime', 'soap', '.']]
[4, 1]

In [3]:
def tag_sentence(X=[]):
    tag_X=[]
    for line in X:
        word_tag = pos_tag(line,tagset='universal')
        tag = [i[1] for i in word_tag]
        tag_X.append(tag)
    return tag_X
train_tag_X = tag_sentence(X=train_X)
dev_tag_X = tag_sentence(X=dev_X)
test_tag_X = tag_sentence(X=test_X)
print(train_X[0])
print(train_tag_X[0])


['a', 'stirring', ',', 'funny', 'and', 'finally', 'transport', 're-imagining', 'of', 'beauty', 'and', 'the', 'beast', 'and', '1930s', 'horror', 'film']
['DET', 'NOUN', '.', 'ADJ', 'CONJ', 'ADV', 'VERB', 'NOUN', 'ADP', 'NOUN', 'CONJ', 'DET', 'NOUN', 'CONJ', 'NUM', 'NOUN', 'NOUN']

情感极性当作一个通道。

读取情感强度文件,构建字典


In [4]:
senti_file = '/home/bruce/data/sentiment/sentiment_diction/wordwithStrength.txt'
def construct_senti_dict(senti_file=''):
    with open(senti_file) as fr:
        lines = fr.readlines()
    lines = [line.strip().split() for line in lines]
    lines = [(i[0],float(i[1])) for i in lines]
    return dict(lines)
sentiment_dict=construct_senti_dict(senti_file)
print('sentiment number =',len(sentiment_dict))


sentiment number = 18540

构建情感极性强度通道


In [5]:
def sentiment_strength(X=[],sentiment_dict=sentiment_dict):
    sentiment_X = [[sentiment_dict[w] if w in sentiment_dict else 0 for w in line ]for line in X]
    sentiment_X = [[ str(int(val*10)) if val <=0 else  '+'+str(int(val*10)) for val in line] for line in sentiment_X]
    return sentiment_X
train_sentiment_X = sentiment_strength(X=train_X,sentiment_dict=sentiment_dict)
dev_sentiment_X = sentiment_strength(X=dev_X,sentiment_dict=sentiment_dict)
test_sentiment_X = sentiment_strength(X=test_X,sentiment_dict=sentiment_dict)

assert len(train_sentiment_X) == len(train_X) 
print(train_sentiment_X[0:5])
print(train_X[0:5])    
print(train_y[0:5])


[['0', '+4', '0', '0', '0', '0', '0', '0', '0', '+2', '0', '0', '-5', '0', '0', '-2', '0'], ['+5', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0'], ['0', '-5', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '+6', '-2', '0', '+2', '0', '0', '-3', '0', '0', '0', '-5', '0', '0', '0', '0', '0', '0', '0', '-2', '0', '0'], ['0', '0', '0', '0', '0', '0', '0', '0', '0', '0'], ['0', '0', '0', '+5', '-2', '0', '+2', '+3', '0', '0', '0', '0', '0', '0', '-3', '0', '+2', '0', '0', '0']]
[['a', 'stirring', ',', 'funny', 'and', 'finally', 'transport', 're-imagining', 'of', 'beauty', 'and', 'the', 'beast', 'and', '1930s', 'horror', 'film'], ['apparently', 'reassemble', 'from', 'the', 'cutting-room', 'floor', 'of', 'any', 'give', 'daytime', 'soap', '.'], ['they', 'presume', 'their', 'audience', 'wo', "n't", 'sit', 'still', 'for', 'a', 'sociology', 'lesson', ',', 'however', 'entertainingly', 'present', ',', 'so', 'they', 'trot', 'out', 'the', 'conventional', 'science-fiction', 'element', 'of', 'bug-eyed', 'monster', 'and', 'futuristic', 'woman', 'in', 'skimpy', 'clothes', '.'], ['the', 'entire', 'movie', 'be', 'fill', 'with', 'deja', 'vu', 'moment', '.'], ['this', 'be', 'a', 'visually', 'stunning', 'rumination', 'on', 'love', ',', 'memory', ',', 'history', 'and', 'the', 'war', 'between', 'art', 'and', 'commerce', '.']]
[4, 1, 1, 2, 3]

否定词。

数据预处理


In [6]:
def token_to_index(datas=[]):
    word_index={}
    count=1
    for data in datas:
        for list_ in data:
            for w in list_:
                if w not in word_index:
                    word_index[w] = count
                    count = count + 1
    print('leng of word_index =',len(word_index))
    for i in range(len(datas)):
        datas[i] = [[ word_index[w] for w in line ] for line in datas[i]] 
    return datas,word_index
X,word_index = token_to_index(datas=[train_X,dev_X,train_sentiment_X,train_tag_X,dev_sentiment_X,dev_tag_X])
train_X,dev_X,train_sentiment_X,train_tag_X,dev_sentiment_X,dev_tag_X = X

print('length of dict_index = ',len(word_index))


leng of word_index = 14525
length of dict_index =  14525

In [7]:
print(train_sentiment_X[0:2])
print(train_X[0:2])    
print(train_y[0:2])


[[14498, 14499, 14498, 14498, 14498, 14498, 14498, 14498, 14498, 14500, 14498, 14498, 14501, 14498, 14498, 14502, 14498], [14503, 14498, 14498, 14498, 14498, 14498, 14498, 14498, 14498, 14498, 14498, 14498]]
[[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 5, 11, 12, 5, 13, 14, 15], [16, 17, 18, 11, 19, 20, 9, 21, 22, 23, 24, 25]]
[4, 1]

Glove训练好的词向量

利用glove基于twitter训练公开的数据


In [8]:
embedding_dim = 100
we_file = '/home/bruce/data/glove/twitter/glove.twitter.27B.{0}d.txt'.format(embedding_dim)
def get_index_wordembedding(we_file='',word_index={}):
    index_wordembedding ={}
    zeros = np.zeros(embedding_dim)
    for line in open(we_file):
        elements = line.strip().split()
        if elements[0] in  word_index:
            index = word_index[elements[0]]
            wordembedding = [float(i) for i in elements[1:]]
            index_wordembedding[index] = wordembedding
    print('总word的数目= ',len(word_index))
    print('总word embedding 的数目 = ',len(index_wordembedding))
    
    for word,index in word_index.items():
        if index not in index_wordembedding:
            index_wordembedding[index] = zeros
    assert len(index_wordembedding) == len(word_index)
    return index_wordembedding
index_wordembedding = get_index_wordembedding(we_file=we_file,word_index=word_index)


总word的数目=  14525
总word embedding 的数目 =  11850

获取训练好的word embedding 数组,用来初始化 Embedding


In [9]:
def get_trained_embedding(index_wordembedding=None):
    index_we = sorted(index_wordembedding.items())
    print('index_we[0] =',index_we[0])
    trained_embedding = [t[1] for t in index_we]
    zeros = np.zeros(embedding_dim)
    trained_embedding = np.vstack((zeros,trained_embedding))
    return np.array(trained_embedding)

将一个batch大小的index数据,利用index_wordembedding进行embedding


In [10]:
def batch_indexData_embedding(X=None,index_wordembedding={}):
    zeros = np.zeros(embedding_dim)
    return [ [ index_wordembedding[w] if w in index_wordembedding else zeros  for w in line ] for line in X ]

构建模型

模型参数


In [11]:
max_len = 36
batch_size=50

max_features= 14526
#embedding_dims=50

nb_filter = 100
filter_length1 = 3
filter_length2 = 4
filter_length3 = 5
dense1_hindden = 150*2
nb_classes = 5

In [ ]:

错误记录

1.输入的变量和后面同名

CNN -Rand 模型


In [29]:
print('Build model...')
input_random = Input(shape=(max_len,), dtype='int32', name='main_input1')
embedding = Embedding(output_dim=embedding_dim, input_dim=max_features)(input_random)
# 卷积层
conv1 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding)
conv2 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding)

conv3 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding)
conv1 =GlobalMaxPooling1D(conv1)
conv2 =GlobalMaxPooling1D()(conv2)
conv3 =GlobalMaxPooling1D()(conv3)
merged_vector = merge([conv1,conv2,conv3], mode='concat')
# 全连接层
dense_layer = Dense(dense1_hindden)
dens1 = dense_layer(merged_vector)
print('dense_layer input_shape should == (300,)')
print(dense_layer.input_shape)
dens1 = Activation('relu')(dens1)

# softmax层
dens2 = Dense(nb_classes)(dens1)
output_random = Activation('softmax')(dens2)

model = Model(input=input_random,output=output_random)
print('finish build model')
model.compile(optimizer='adadelta',
              loss='categorical_crossentropy',
              metrics=['accuracy'])


Build model...
(None, 100)
dense_layer input_shape should == (300,)
(None, 300)
finish build model

CNN-static 模型


In [12]:
input_static = Input(shape=(max_len,embedding_dim), name='main_input2')
# 卷积层
conv1 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(input_static)

conv2 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(input_static)

conv3 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(input_static)
conv1 =GlobalMaxPooling1D()(conv1)
conv2 =GlobalMaxPooling1D()(conv2)
conv3 =GlobalMaxPooling1D()(conv3)
merged_vector = merge([conv1,conv2,conv3], mode='concat')

# 全连接层
dens1 = Dense(dense1_hindden)(merged_vector)
dens1 = Activation('relu')(dens1)

# softmax层
dens2 = Dense(nb_classes)(dens1)
output_static = Activation('softmax')(dens2)

model = Model(input=input_static,output=output_static)
print('finish build model')
model.compile(optimizer='adadelta',
              loss='categorical_crossentropy',
              metrics=['accuracy'])


finish build model

CNN-non-static 模型


In [32]:
print('Build model...')
input_non_static = Input(shape=(max_len,), dtype='int32', name='main_input1')
#初始化Embedding层
trained_embedding = get_trained_embedding(index_wordembedding=index_wordembedding)

embedding_layer = Embedding(max_features,
                            embedding_dim,
                            weights=[trained_embedding]
                            )

embedding = embedding_layer(input_non_static)

conv1 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding)

conv2 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding)

conv3 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding)
dropout = Dropout(0.5)

conv1 =GlobalMaxPooling1D()(conv1)
conv2 =GlobalMaxPooling1D()(conv2)
conv3 =GlobalMaxPooling1D()(conv3)
#conv1 = dropout(conv1)
#conv2 = dropout(conv2)
#conv3 = dropout(conv3)

merged_vector = merge([conv1,conv2,conv3], mode='concat')
# 全连接层
dense_layer = Dense(dense1_hindden)
dens1 = dense_layer(merged_vector)
print('dense_layer input shpae = ',dense_layer.input_shape)
dens1 = Activation('relu')(dens1)
dens1 = dropout(dens1)

# softmax层
dens2 = Dense(nb_classes)(dens1)
output_non_static = Activation('softmax')(dens2)

model = Model(input=input_non_static,output=output_non_static)
print('finish build model')
model.compile(optimizer='adadelta',
              loss='categorical_crossentropy',
              metrics=['accuracy'])


Build model...
index_we[0] = (1, [0.86323, 0.031356, 0.10169, 0.26639, 0.19313, -0.076727, -0.22647, -0.69596, -0.63946, -0.8632, -0.29465, -0.31175, -4.4257, -0.16769, 0.23197, -0.0085179, -0.063032, -0.044064, -0.23138, 0.59465, -0.1334, -0.61637, -0.019008, -0.31235, -0.2403, -3.112, 0.22267, -0.046524, -0.046095, 1.1434, 0.60818, 0.34767, 0.36155, 0.35258, -0.16617, 0.82837, 0.35088, -0.23608, -0.25425, 0.55587, -1.4276, 0.06918, 0.015027, -0.45487, 0.63978, -0.16407, 0.14985, 0.94771, 0.23274, -0.51445, 0.70982, 0.60018, 0.047234, -0.39084, -0.14794, 0.68263, -0.12995, -0.22846, 0.43185, -0.10681, 0.06544, 0.34506, 0.089428, 0.19983, 1.1775, -0.33236, -0.60181, 0.38324, -0.090755, -0.15759, -0.23093, -0.88441, 0.07837, 0.19774, -0.10609, 0.28091, 0.14899, -0.224, 0.20039, -0.23564, 1.5186, 0.3518, -0.10327, -0.14035, 0.084164, 0.76701, -0.54544, 0.17372, -0.02784, 0.4905, 0.45353, 0.13881, 0.091135, 0.31961, -0.077948, 0.045671, -0.55133, -0.28853, -0.50833, -0.31382])
dense_layer input shpae =  (None, 300)
finish build model

CNN-multichannel 模型


In [ ]:
print('Build model...')
input1 = Input(shape=(max_len,), dtype='int32', name='main_input1')
input2 = Input(shape=(max_len,embedding_dim), name='main_input2')
#input3 = Input(shape=(max_len,), dtype='int32', name='main_input3')

embedding = Embedding(output_dim=embedding_dim, input_dim=max_features)
embedding1 = embedding(input1)
#embedding2 = embedding(input2)
#embedding3 = embedding(input3)
#---------------------------------------------------------------------------
#卷积方法一:每个通道,用不同的卷积核
'''
cov1_out1 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding1)
cov1_out2 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding2)
cov1_out3 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length,
                        border_mode = 'valid',
                        activation='relu'
                       )(embedding3)
'''
# 卷积方法二:每个通道用相同的卷积核
conv11 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length1,
                        border_mode = 'valid',
                        activation='relu',
                        W_regularizer=l2(3)
                       )
conv12 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length2,
                        border_mode = 'valid',
                        activation='relu',
                        W_regularizer=l2(3)
                       )
conv13 = Convolution1D(nb_filter = nb_filter,
                        filter_length = filter_length3,
                        border_mode = 'valid',
                        activation='relu',
                        W_regularizer=l2(3)
                       )
dropout = Dropout(0.5)
#第一个通道
cov1_out11  = conv11(embedding1)
cov1_out12  = conv12(embedding1)
cov1_out13  = conv13(embedding1)
cov1_out11 = dropout(cov1_out11)
cov1_out12 = dropout(cov1_out12)
cov1_out13 = dropout(cov1_out13)

#第二个通道
cov1_out14 = conv11(input2)
cov1_out15 = conv12(input2)
cov1_out16 = conv13(input2)
cov1_out14 = dropout(cov1_out14)
cov1_out15 = dropout(cov1_out15)
cov1_out16 = dropout(cov1_out16)
#cov1_out2 = conv(embedding2)
#cov1_out3 = conv(embedding3)

#------------------------------------------------------------------------------
maxpooling = GlobalMaxPooling1D()
conv11 = maxpooling(cov1_out11)
conv12 = maxpooling(cov1_out12)
conv13 = maxpooling(cov1_out13)
conv14 = maxpooling(cov1_out14)
conv15 = maxpooling(cov1_out15)
conv16 = maxpooling(cov1_out16)

merged_vector = merge([conv11,conv12,conv13,conv14,conv15,conv16], mode='concat')

#dropout = Dropout(0.5)
#merged_vector = dropout(merged_vector)

dens1 = Dense(dense1_hindden)(merged_vector)
dens1 = Activation('relu')(dens1)


dens2 = Dense(nb_classes)(dens1)
output = Activation('softmax')(dens2)
model = Model(input=[input1,input2],output=output)

print('finish build model')
model.compile(optimizer='adadelta',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

模型图


In [33]:
from IPython.display import SVG
from keras.utils.visualize_util import model_to_dot
SVG(model_to_dot(model).create(prog='dot', format='svg'))


Out[33]:
G 140400143125080 main_input1 (InputLayer) 140400154600784 embedding_9 (Embedding) 140400143125080->140400154600784 140400143125416 convolution1d_25 (Convolution1D) 140400154600784->140400143125416 140400142248648 convolution1d_26 (Convolution1D) 140400154600784->140400142248648 140400143091584 convolution1d_27 (Convolution1D) 140400154600784->140400143091584 140400142790104 globalmaxpooling1d_25 (GlobalMaxPooling1D) 140400143125416->140400142790104 140400142790216 globalmaxpooling1d_26 (GlobalMaxPooling1D) 140400142248648->140400142790216 140400142787976 globalmaxpooling1d_27 (GlobalMaxPooling1D) 140400143091584->140400142787976 140400142788144 merge_9 (Merge) 140400142790104->140400142788144 140400142790216->140400142788144 140400142787976->140400142788144 140400142786688 dense_14 (Dense) 140400142788144->140400142786688 140400143248072 activation_13 (Activation) 140400142786688->140400143248072 140400143221424 dropout_7 (Dropout) 140400143248072->140400143221424 140400142819512 dense_15 (Dense) 140400143221424->140400142819512 140400142846328 activation_14 (Activation) 140400142819512->140400142846328

模型输入


In [14]:
print(type(train_y[0]))
train_y_model = np_utils.to_categorical(train_y, nb_classes)
dev_y_model = np_utils.to_categorical(dev_y, nb_classes)
train_X_model = sequence.pad_sequences(train_X, maxlen=max_len)
dev_X_model = sequence.pad_sequences(dev_X, maxlen=max_len)
train_sentiment_X_model = sequence.pad_sequences(train_sentiment_X,maxlen=max_len)
train_tag_X_model= sequence.pad_sequences(train_tag_X,maxlen=max_len)
dev_sentiment_X_model = sequence.pad_sequences(dev_sentiment_X,maxlen=max_len)
dev_tag_X_model = sequence.pad_sequences(dev_tag_X,maxlen=max_len)
#train_embedding_X_model = batch_indexData_embedding(X=train_X_model,index_wordembedding=index_wordembedding)
dev_embedding_X_model = batch_indexData_embedding(X=dev_X_model,index_wordembedding=index_wordembedding)
dev_embedding_X_model = np.array(dev_embedding_X_model)


<class 'int'>

测试数据


In [15]:
#转为index 
def to_index(word_index={},data=[]):
    return [[word_index[w] if w in word_index else 0  for w in sentence] for sentence in data]
test_index_X = to_index(word_index,test_X)
#删补
test_index_X = sequence.pad_sequences(test_index_X, maxlen=max_len)
#embedding
test_embedding_X = batch_indexData_embedding(X=test_index_X,index_wordembedding=index_wordembedding)
test_y = np_utils.to_categorical(test_y, nb_classes)

In [16]:
def my_generator4(X1=None,X2=None,X3=None,x4=None,y=None):
    i = 0
    max_i = int(len(X1)/batch_size)
    while True:
        i = i % max_i
        x1_batch = X1[i*batch_size:(i+1)*batch_size]
        x2_batch = X2[i*batch_size:(i+1)*batch_size]
        x3_batch = X3[i*batch_size:(i+1)*batch_size]
       
        y_batch = y[i*batch_size:(i+1)*batch_size]
        yield ([x1_batch,x2_batch,x3_batch],y_batch)
        i = i + 1
def my_generator3(X1=None,y=None):
    i = 0
    max_i = int(len(X1)/batch_size)
    while True:
        i = i % max_i
        x1_batch = X1[i*batch_size:(i+1)*batch_size]
        x2_batch = batch_indexData_embedding(X=x1_batch,index_wordembedding=index_wordembedding)
        x2_batch = np.array(x2_batch)
       
        y_batch = y[i*batch_size:(i+1)*batch_size]
        yield ([x1_batch,x2_batch],y_batch)
        i = i + 1
def my_generator1(X1=None,y=None):
    i = 0
    max_i = int(len(X1)/batch_size)
    while True:
        i = i % max_i
        x1_batch = X1[i*batch_size:(i+1)*batch_size]
        y_batch = y[i*batch_size:(i+1)*batch_size]
        yield (x1_batch,y_batch)
        i = i + 1
def my_generator2(X1=None,y=None):
    i = 0
    max_i = int(len(X1)/batch_size)
    while True:
        i = i % max_i
        x1_batch = X1[i*batch_size:(i+1)*batch_size]
        x1_batch = batch_indexData_embedding(X=x1_batch,index_wordembedding=index_wordembedding)
        x1_batch = np.array(x1_batch)
       
        y_batch = y[i*batch_size:(i+1)*batch_size]
        yield (x1_batch,y_batch)
        i = i + 1

训练模型

cnn random 模型


In [26]:
model.fit_generator(my_generator1(train_X_model,train_y_model),samples_per_epoch = 32*100,nb_epoch=100,verbose=1,validation_data=(dev_X_model,dev_y_model))


Epoch 1/100
3200/3200 [==============================] - 141s - loss: 1.5767 - acc: 0.2703 - val_loss: 1.5712 - val_acc: 0.2525
Epoch 2/100
3200/3200 [==============================] - 141s - loss: 1.5596 - acc: 0.2778 - val_loss: 1.5728 - val_acc: 0.2598
Epoch 3/100
3200/3200 [==============================] - 141s - loss: 1.5658 - acc: 0.2894 - val_loss: 1.5655 - val_acc: 0.3061
Epoch 4/100
3200/3200 [==============================] - 141s - loss: 1.5466 - acc: 0.2975 - val_loss: 1.5611 - val_acc: 0.3025
Epoch 5/100
3200/3200 [==============================] - 141s - loss: 1.5434 - acc: 0.3028 - val_loss: 1.5446 - val_acc: 0.3052
Epoch 6/100
3200/3200 [==============================] - 141s - loss: 1.5209 - acc: 0.3100 - val_loss: 1.5319 - val_acc: 0.3252
Epoch 7/100
3200/3200 [==============================] - 141s - loss: 1.5031 - acc: 0.3316 - val_loss: 1.5104 - val_acc: 0.3442
Epoch 8/100
3200/3200 [==============================] - 141s - loss: 1.4846 - acc: 0.3466 - val_loss: 1.4928 - val_acc: 0.3415
Epoch 9/100
3200/3200 [==============================] - 141s - loss: 1.4453 - acc: 0.3744 - val_loss: 1.4612 - val_acc: 0.3597
Epoch 10/100
3200/3200 [==============================] - 141s - loss: 1.4050 - acc: 0.3887 - val_loss: 1.4622 - val_acc: 0.3470
Epoch 11/100
3200/3200 [==============================] - 141s - loss: 1.3714 - acc: 0.4122 - val_loss: 1.4054 - val_acc: 0.3787
Epoch 12/100
3200/3200 [==============================] - 141s - loss: 1.3243 - acc: 0.4419 - val_loss: 1.3851 - val_acc: 0.3896
Epoch 13/100
3200/3200 [==============================] - 141s - loss: 1.2690 - acc: 0.4550 - val_loss: 1.3682 - val_acc: 0.3960
Epoch 14/100
3200/3200 [==============================] - 141s - loss: 1.2332 - acc: 0.4875 - val_loss: 1.3567 - val_acc: 0.4069
Epoch 15/100
3200/3200 [==============================] - 141s - loss: 1.1868 - acc: 0.4997 - val_loss: 1.3515 - val_acc: 0.3969
Epoch 16/100
3200/3200 [==============================] - 141s - loss: 1.1480 - acc: 0.5141 - val_loss: 1.3631 - val_acc: 0.4015
Epoch 17/100
3200/3200 [==============================] - 141s - loss: 1.0822 - acc: 0.5591 - val_loss: 1.3894 - val_acc: 0.3851
Epoch 18/100
3200/3200 [==============================] - 141s - loss: 1.0563 - acc: 0.5625 - val_loss: 1.3679 - val_acc: 0.4105
Epoch 19/100
3200/3200 [==============================] - 141s - loss: 1.0052 - acc: 0.6003 - val_loss: 1.3666 - val_acc: 0.4060
Epoch 20/100
3200/3200 [==============================] - 141s - loss: 0.9510 - acc: 0.6266 - val_loss: 1.3650 - val_acc: 0.4051
Epoch 21/100
3200/3200 [==============================] - 141s - loss: 0.9126 - acc: 0.6431 - val_loss: 1.3916 - val_acc: 0.3942
Epoch 22/100
3200/3200 [==============================] - 141s - loss: 0.8600 - acc: 0.6825 - val_loss: 1.3978 - val_acc: 0.4169
Epoch 23/100
3200/3200 [==============================] - 141s - loss: 0.8188 - acc: 0.6981 - val_loss: 1.4001 - val_acc: 0.4142
Epoch 24/100
3200/3200 [==============================] - 141s - loss: 0.7773 - acc: 0.7191 - val_loss: 1.4086 - val_acc: 0.4033
Epoch 25/100
3200/3200 [==============================] - 141s - loss: 0.7109 - acc: 0.7609 - val_loss: 1.4367 - val_acc: 0.3806
Epoch 26/100
3200/3200 [==============================] - 141s - loss: 0.6782 - acc: 0.7694 - val_loss: 1.4602 - val_acc: 0.4051
Epoch 27/100
3200/3200 [==============================] - 141s - loss: 0.6215 - acc: 0.8016 - val_loss: 1.4900 - val_acc: 0.3760
Epoch 28/100
3200/3200 [==============================] - 141s - loss: 0.5772 - acc: 0.8241 - val_loss: 1.5510 - val_acc: 0.3951
Epoch 29/100
3200/3200 [==============================] - 141s - loss: 0.5304 - acc: 0.8406 - val_loss: 1.5368 - val_acc: 0.3942
Epoch 30/100
2950/3200 [==========================>...] - ETA: 9s - loss: 0.4810 - acc: 0.8664 
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-26-1deed1343cda> in <module>()
----> 1 model.fit_generator(my_generator1(train_X_model,train_y_model),samples_per_epoch = 32*100,nb_epoch=100,verbose=1,validation_data=(dev_X_model,dev_y_model))

/home/bruce/anaconda3/lib/python3.5/site-packages/keras/engine/training.py in fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose, callbacks, validation_data, nb_val_samples, class_weight, max_q_size, nb_worker, pickle_safe)
   1441                     outs = self.train_on_batch(x, y,
   1442                                                sample_weight=sample_weight,
-> 1443                                                class_weight=class_weight)
   1444                 except:
   1445                     _stop.set()

/home/bruce/anaconda3/lib/python3.5/site-packages/keras/engine/training.py in train_on_batch(self, x, y, sample_weight, class_weight)
   1219             ins = x + y + sample_weights
   1220         self._make_train_function()
-> 1221         outputs = self.train_function(ins)
   1222         if len(outputs) == 1:
   1223             return outputs[0]

/home/bruce/anaconda3/lib/python3.5/site-packages/keras/backend/theano_backend.py in __call__(self, inputs)
    715     def __call__(self, inputs):
    716         assert type(inputs) in {list, tuple}
--> 717         return self.function(*inputs)
    718 
    719 

/home/bruce/anaconda3/lib/python3.5/site-packages/theano/compile/function_module.py in __call__(self, *args, **kwargs)
    857         t0_fn = time.time()
    858         try:
--> 859             outputs = self.fn()
    860         except Exception:
    861             if hasattr(self.fn, 'position_of_error'):

/home/bruce/anaconda3/lib/python3.5/site-packages/theano/gof/op.py in rval(p, i, o, n)
    909         if params is graph.NoParams:
    910             # default arguments are stored in the closure of `rval`
--> 911             def rval(p=p, i=node_input_storage, o=node_output_storage, n=node):
    912                 r = p(n, [x[0] for x in i], o)
    913                 for o in node.outputs:

KeyboardInterrupt: 

cnn random 结果

time max_len batch_size max_features embedding_dims nb_filter filter_length dense1_hindden val_acc
2016-11-25 9:52 36 50 14526 100 各100 3,4,5 300 0.4169

cnn static 模型


In [17]:
model.fit_generator(my_generator2(train_X_model,train_y_model),samples_per_epoch = 32*100,nb_epoch=100,verbose=1,validation_data=(test_embedding_X,test_y))


Epoch 1/100
3200/3200 [==============================] - 7s - loss: 1.5584 - acc: 0.2828 - val_loss: 1.5444 - val_acc: 0.2643
Epoch 2/100
3200/3200 [==============================] - 7s - loss: 1.4847 - acc: 0.3403 - val_loss: 1.4650 - val_acc: 0.3371
Epoch 3/100
3200/3200 [==============================] - 7s - loss: 1.4243 - acc: 0.3666 - val_loss: 1.3837 - val_acc: 0.3986
Epoch 4/100
3200/3200 [==============================] - 7s - loss: 1.3394 - acc: 0.4072 - val_loss: 1.3730 - val_acc: 0.4000
Epoch 5/100
3200/3200 [==============================] - 7s - loss: 1.3169 - acc: 0.4359 - val_loss: 1.3482 - val_acc: 0.3896
Epoch 6/100
3200/3200 [==============================] - 7s - loss: 1.2558 - acc: 0.4484 - val_loss: 1.3557 - val_acc: 0.3792
Epoch 7/100
3200/3200 [==============================] - 7s - loss: 1.2113 - acc: 0.4834 - val_loss: 1.3742 - val_acc: 0.3471
Epoch 8/100
3200/3200 [==============================] - 7s - loss: 1.2069 - acc: 0.4903 - val_loss: 1.5117 - val_acc: 0.3394
Epoch 9/100
3200/3200 [==============================] - 7s - loss: 1.1149 - acc: 0.5384 - val_loss: 1.3993 - val_acc: 0.3787
Epoch 10/100
3200/3200 [==============================] - 7s - loss: 1.0970 - acc: 0.5488 - val_loss: 1.3774 - val_acc: 0.3900
Epoch 11/100
3200/3200 [==============================] - 7s - loss: 1.0631 - acc: 0.5684 - val_loss: 1.3594 - val_acc: 0.3787
Epoch 12/100
3200/3200 [==============================] - 7s - loss: 0.9805 - acc: 0.6175 - val_loss: 1.3565 - val_acc: 0.4253
Epoch 13/100
3200/3200 [==============================] - 7s - loss: 0.9682 - acc: 0.6194 - val_loss: 1.3978 - val_acc: 0.3665
Epoch 14/100
3200/3200 [==============================] - 7s - loss: 0.9007 - acc: 0.6450 - val_loss: 1.3462 - val_acc: 0.4249
Epoch 15/100
3200/3200 [==============================] - 7s - loss: 0.8420 - acc: 0.6863 - val_loss: 1.3573 - val_acc: 0.4244
Epoch 16/100
3200/3200 [==============================] - 7s - loss: 0.8240 - acc: 0.6922 - val_loss: 1.5582 - val_acc: 0.3471
Epoch 17/100
3200/3200 [==============================] - 7s - loss: 0.7362 - acc: 0.7537 - val_loss: 1.6107 - val_acc: 0.3579
Epoch 18/100
3200/3200 [==============================] - 7s - loss: 0.7029 - acc: 0.7566 - val_loss: 1.6109 - val_acc: 0.3529
Epoch 19/100
3200/3200 [==============================] - 7s - loss: 0.6702 - acc: 0.7709 - val_loss: 1.4396 - val_acc: 0.4195
Epoch 20/100
3200/3200 [==============================] - 7s - loss: 0.5816 - acc: 0.8253 - val_loss: 1.5092 - val_acc: 0.4045
Epoch 21/100
3200/3200 [==============================] - 7s - loss: 0.5794 - acc: 0.8078 - val_loss: 1.6137 - val_acc: 0.4014
Epoch 22/100
3200/3200 [==============================] - 7s - loss: 0.4893 - acc: 0.8612 - val_loss: 1.5950 - val_acc: 0.3751
Epoch 23/100
3200/3200 [==============================] - 7s - loss: 0.4498 - acc: 0.8812 - val_loss: 1.6020 - val_acc: 0.4231
Epoch 24/100
3200/3200 [==============================] - 7s - loss: 0.4459 - acc: 0.8769 - val_loss: 1.5865 - val_acc: 0.4172
Epoch 25/100
3200/3200 [==============================] - 7s - loss: 0.3555 - acc: 0.9181 - val_loss: 1.6858 - val_acc: 0.3869
Epoch 26/100
3200/3200 [==============================] - 7s - loss: 0.3580 - acc: 0.9050 - val_loss: 1.6441 - val_acc: 0.4032
Epoch 27/100
3200/3200 [==============================] - 7s - loss: 0.3220 - acc: 0.9225 - val_loss: 1.7096 - val_acc: 0.3842
Epoch 28/100
3200/3200 [==============================] - 7s - loss: 0.2680 - acc: 0.9494 - val_loss: 1.8914 - val_acc: 0.3656
Epoch 29/100
3200/3200 [==============================] - 7s - loss: 0.2819 - acc: 0.9331 - val_loss: 1.7921 - val_acc: 0.4050
Epoch 30/100
3200/3200 [==============================] - 7s - loss: 0.2098 - acc: 0.9706 - val_loss: 1.8323 - val_acc: 0.3869
Epoch 31/100
3200/3200 [==============================] - 7s - loss: 0.1901 - acc: 0.9741 - val_loss: 1.9775 - val_acc: 0.3751
Epoch 32/100
3200/3200 [==============================] - 7s - loss: 0.2023 - acc: 0.9628 - val_loss: 1.9483 - val_acc: 0.3733
Epoch 33/100
3200/3200 [==============================] - 7s - loss: 0.1446 - acc: 0.9872 - val_loss: 1.9446 - val_acc: 0.3878
Epoch 34/100
3200/3200 [==============================] - 7s - loss: 0.1630 - acc: 0.9709 - val_loss: 1.9490 - val_acc: 0.4027
Epoch 35/100
3200/3200 [==============================] - 7s - loss: 0.1191 - acc: 0.9906 - val_loss: 1.9762 - val_acc: 0.3851
Epoch 36/100
3200/3200 [==============================] - 7s - loss: 0.1015 - acc: 0.9925 - val_loss: 2.0109 - val_acc: 0.3995
Epoch 37/100
3200/3200 [==============================] - 7s - loss: 0.1595 - acc: 0.9669 - val_loss: 2.0891 - val_acc: 0.3896
Epoch 38/100
3200/3200 [==============================] - 7s - loss: 0.0829 - acc: 0.9944 - val_loss: 2.2068 - val_acc: 0.3796
Epoch 39/100
3200/3200 [==============================] - 7s - loss: 0.0943 - acc: 0.9838 - val_loss: 2.1990 - val_acc: 0.3982
Epoch 40/100
3200/3200 [==============================] - 7s - loss: 0.0690 - acc: 0.9953 - val_loss: 2.2403 - val_acc: 0.3760
Epoch 41/100
3200/3200 [==============================] - 7s - loss: 0.0705 - acc: 0.9922 - val_loss: 2.2084 - val_acc: 0.3819
Epoch 42/100
3200/3200 [==============================] - 7s - loss: 0.0502 - acc: 0.9969 - val_loss: 2.2662 - val_acc: 0.3751
Epoch 43/100
3200/3200 [==============================] - 7s - loss: 0.0976 - acc: 0.9809 - val_loss: 2.4123 - val_acc: 0.3683
Epoch 44/100
3200/3200 [==============================] - 7s - loss: 0.0428 - acc: 0.9975 - val_loss: 2.3838 - val_acc: 0.3792
Epoch 45/100
3200/3200 [==============================] - 7s - loss: 0.0403 - acc: 0.9984 - val_loss: 2.4797 - val_acc: 0.3751
Epoch 46/100
3200/3200 [==============================] - 7s - loss: 0.0328 - acc: 0.9988 - val_loss: 2.4417 - val_acc: 0.3851
Epoch 47/100
3200/3200 [==============================] - 7s - loss: 0.1214 - acc: 0.9656 - val_loss: 2.4314 - val_acc: 0.3937
Epoch 48/100
3200/3200 [==============================] - 7s - loss: 0.0284 - acc: 0.9994 - val_loss: 2.5106 - val_acc: 0.3715
Epoch 49/100
3200/3200 [==============================] - 7s - loss: 0.0294 - acc: 0.9975 - val_loss: 3.4529 - val_acc: 0.4104
Epoch 50/100
3200/3200 [==============================] - 7s - loss: 0.1825 - acc: 0.9653 - val_loss: 2.4843 - val_acc: 0.3765
Epoch 51/100
3200/3200 [==============================] - 7s - loss: 0.0213 - acc: 0.9997 - val_loss: 2.5460 - val_acc: 0.3742
Epoch 52/100
3200/3200 [==============================] - 7s - loss: 0.0236 - acc: 0.9988 - val_loss: 2.6172 - val_acc: 0.3833
Epoch 53/100
3200/3200 [==============================] - 7s - loss: 0.0893 - acc: 0.9787 - val_loss: 2.6454 - val_acc: 0.3738
Epoch 54/100
3200/3200 [==============================] - 7s - loss: 0.0186 - acc: 0.9994 - val_loss: 2.6241 - val_acc: 0.3851
Epoch 55/100
3200/3200 [==============================] - 7s - loss: 0.0150 - acc: 0.9991 - val_loss: 2.7681 - val_acc: 0.3937
Epoch 56/100
3200/3200 [==============================] - 7s - loss: 0.0655 - acc: 0.9800 - val_loss: 2.6864 - val_acc: 0.3824
Epoch 57/100
3200/3200 [==============================] - 7s - loss: 0.0149 - acc: 0.9994 - val_loss: 2.7132 - val_acc: 0.3774
Epoch 58/100
3200/3200 [==============================] - 7s - loss: 0.0254 - acc: 0.9931 - val_loss: 3.9512 - val_acc: 0.3706
Epoch 59/100
3200/3200 [==============================] - 7s - loss: 0.0295 - acc: 0.9953 - val_loss: 2.7956 - val_acc: 0.3851
Epoch 60/100
3200/3200 [==============================] - 7s - loss: 0.0095 - acc: 0.9997 - val_loss: 2.8772 - val_acc: 0.3787
Epoch 61/100
3200/3200 [==============================] - 7s - loss: 0.4310 - acc: 0.9150 - val_loss: 4.1280 - val_acc: 0.3240
Epoch 62/100
3200/3200 [==============================] - 7s - loss: 0.0692 - acc: 0.9903 - val_loss: 2.8018 - val_acc: 0.3914
Epoch 63/100
3200/3200 [==============================] - 7s - loss: 0.0144 - acc: 0.9991 - val_loss: 2.8189 - val_acc: 0.3946
Epoch 64/100
3200/3200 [==============================] - 7s - loss: 0.1646 - acc: 0.9572 - val_loss: 2.8573 - val_acc: 0.3919
Epoch 65/100
3200/3200 [==============================] - 7s - loss: 0.0177 - acc: 0.9994 - val_loss: 2.8017 - val_acc: 0.3891
Epoch 66/100
3200/3200 [==============================] - 7s - loss: 0.0104 - acc: 0.9994 - val_loss: 2.8127 - val_acc: 0.3805
Epoch 67/100
3200/3200 [==============================] - 7s - loss: 0.1102 - acc: 0.9744 - val_loss: 3.0060 - val_acc: 0.3814
Epoch 68/100
3200/3200 [==============================] - 7s - loss: 0.0127 - acc: 0.9997 - val_loss: 2.9351 - val_acc: 0.3851
Epoch 69/100
3200/3200 [==============================] - 7s - loss: 0.0080 - acc: 0.9994 - val_loss: 2.8897 - val_acc: 0.3842
Epoch 70/100
3200/3200 [==============================] - 7s - loss: 0.1338 - acc: 0.9694 - val_loss: 2.9361 - val_acc: 0.3792
Epoch 71/100
3200/3200 [==============================] - 7s - loss: 0.0121 - acc: 0.9991 - val_loss: 2.9441 - val_acc: 0.3914
Epoch 72/100
3200/3200 [==============================] - 7s - loss: 0.0057 - acc: 1.0000 - val_loss: 3.0731 - val_acc: 0.3742
Epoch 73/100
3200/3200 [==============================] - 7s - loss: 0.1114 - acc: 0.9709 - val_loss: 2.9618 - val_acc: 0.3860
Epoch 74/100
3200/3200 [==============================] - 7s - loss: 0.0096 - acc: 0.9991 - val_loss: 2.9474 - val_acc: 0.3783
Epoch 75/100
3200/3200 [==============================] - 7s - loss: 0.0049 - acc: 0.9997 - val_loss: 3.0044 - val_acc: 0.3896
Epoch 76/100
3200/3200 [==============================] - 7s - loss: 0.0087 - acc: 0.9997 - val_loss: 3.1179 - val_acc: 0.3787
Epoch 77/100
3200/3200 [==============================] - 7s - loss: 0.0633 - acc: 0.9847 - val_loss: 3.0595 - val_acc: 0.3805
Epoch 78/100
3200/3200 [==============================] - 7s - loss: 0.0060 - acc: 0.9994 - val_loss: 3.0769 - val_acc: 0.3719
Epoch 79/100
3200/3200 [==============================] - 7s - loss: 0.0069 - acc: 0.9991 - val_loss: 3.1592 - val_acc: 0.3941
Epoch 80/100
3200/3200 [==============================] - 7s - loss: 0.0036 - acc: 1.0000 - val_loss: 3.2115 - val_acc: 0.3923
Epoch 81/100
3200/3200 [==============================] - 7s - loss: 0.2330 - acc: 0.9537 - val_loss: 3.2622 - val_acc: 0.3751
Epoch 82/100
3200/3200 [==============================] - 7s - loss: 0.0081 - acc: 0.9988 - val_loss: 3.1765 - val_acc: 0.3756
Epoch 83/100
3200/3200 [==============================] - 7s - loss: 0.0040 - acc: 0.9997 - val_loss: 3.2127 - val_acc: 0.3810
Epoch 84/100
3200/3200 [==============================] - 7s - loss: 0.0050 - acc: 0.9997 - val_loss: 3.2501 - val_acc: 0.3697
Epoch 85/100
3200/3200 [==============================] - 7s - loss: 0.0058 - acc: 0.9991 - val_loss: 3.2639 - val_acc: 0.3819
Epoch 86/100
3200/3200 [==============================] - 7s - loss: 0.0074 - acc: 0.9975 - val_loss: 4.9147 - val_acc: 0.3498
Epoch 87/100
3200/3200 [==============================] - 7s - loss: 0.0766 - acc: 0.9831 - val_loss: 3.3420 - val_acc: 0.3846
Epoch 88/100
3200/3200 [==============================] - 7s - loss: 0.0023 - acc: 1.0000 - val_loss: 3.3558 - val_acc: 0.3851
Epoch 89/100
3200/3200 [==============================] - 7s - loss: 0.0033 - acc: 0.9994 - val_loss: 3.3681 - val_acc: 0.3796
Epoch 90/100
3200/3200 [==============================] - 7s - loss: 0.0053 - acc: 0.9991 - val_loss: 3.3069 - val_acc: 0.3828
Epoch 91/100
3200/3200 [==============================] - 7s - loss: 0.0029 - acc: 0.9997 - val_loss: 3.3691 - val_acc: 0.3778
Epoch 92/100
3200/3200 [==============================] - 7s - loss: 0.0022 - acc: 0.9997 - val_loss: 3.4003 - val_acc: 0.3769
Epoch 93/100
3200/3200 [==============================] - 7s - loss: 0.0042 - acc: 0.9991 - val_loss: 3.4217 - val_acc: 0.3905
Epoch 94/100
3200/3200 [==============================] - 7s - loss: 0.0027 - acc: 0.9997 - val_loss: 3.4271 - val_acc: 0.3878
Epoch 95/100
3200/3200 [==============================] - 7s - loss: 0.0046 - acc: 0.9988 - val_loss: 3.5735 - val_acc: 0.3878
Epoch 96/100
3200/3200 [==============================] - 7s - loss: 8.3503e-04 - acc: 1.0000 - val_loss: 3.5545 - val_acc: 0.3760
Epoch 97/100
3200/3200 [==============================] - 7s - loss: 0.0035 - acc: 0.9994 - val_loss: 3.5897 - val_acc: 0.3900
Epoch 98/100
3200/3200 [==============================] - 7s - loss: 0.0035 - acc: 0.9991 - val_loss: 3.5030 - val_acc: 0.3787
Epoch 99/100
3200/3200 [==============================] - 7s - loss: 0.0027 - acc: 0.9997 - val_loss: 3.5869 - val_acc: 0.3738
Epoch 100/100
3200/3200 [==============================] - 7s - loss: 0.0014 - acc: 0.9997 - val_loss: 3.5620 - val_acc: 0.3805
Out[17]:
<keras.callbacks.History at 0x7f3af9a4ac18>

cnn static 结果

time max_len batch_size max_features embedding_dims nb_filter filter_length dense1_hindden val_acc
2016-11-25 9:52 36 50 14526 100 各100 3,4,5 300 0.4253

In [ ]:

cnn non-static 模型


In [34]:
model.fit_generator(my_generator1(train_X_model,train_y_model),samples_per_epoch = 50*40,nb_epoch=100,verbose=1,validation_data=(test_index_X,test_y))


Epoch 1/100
2000/2000 [==============================] - 6s - loss: 1.6160 - acc: 0.2560 - val_loss: 1.5496 - val_acc: 0.3081
Epoch 2/100
2000/2000 [==============================] - 6s - loss: 1.5724 - acc: 0.2760 - val_loss: 1.5286 - val_acc: 0.3308
Epoch 3/100
2000/2000 [==============================] - 6s - loss: 1.5372 - acc: 0.3175 - val_loss: 1.5180 - val_acc: 0.3290
Epoch 4/100
2000/2000 [==============================] - 6s - loss: 1.5177 - acc: 0.3265 - val_loss: 1.4710 - val_acc: 0.3719
Epoch 5/100
2000/2000 [==============================] - 6s - loss: 1.4547 - acc: 0.3760 - val_loss: 1.4388 - val_acc: 0.3661
Epoch 6/100
2000/2000 [==============================] - 6s - loss: 1.4269 - acc: 0.3660 - val_loss: 1.4374 - val_acc: 0.3665
Epoch 7/100
2000/2000 [==============================] - 6s - loss: 1.4063 - acc: 0.3870 - val_loss: 1.3889 - val_acc: 0.3977
Epoch 8/100
2000/2000 [==============================] - 6s - loss: 1.4010 - acc: 0.3950 - val_loss: 1.3765 - val_acc: 0.3950
Epoch 9/100
2000/2000 [==============================] - 6s - loss: 1.3477 - acc: 0.4130 - val_loss: 1.3484 - val_acc: 0.4041
Epoch 10/100
2000/2000 [==============================] - 6s - loss: 1.3121 - acc: 0.4275 - val_loss: 1.3391 - val_acc: 0.4068
Epoch 11/100
2000/2000 [==============================] - 6s - loss: 1.3002 - acc: 0.4345 - val_loss: 1.3507 - val_acc: 0.3977
Epoch 12/100
2000/2000 [==============================] - 6s - loss: 1.3089 - acc: 0.4450 - val_loss: 1.4051 - val_acc: 0.3805
Epoch 13/100
2000/2000 [==============================] - 6s - loss: 1.2846 - acc: 0.4400 - val_loss: 1.3321 - val_acc: 0.4118
Epoch 14/100
2000/2000 [==============================] - 6s - loss: 1.2232 - acc: 0.4680 - val_loss: 1.3185 - val_acc: 0.4172
Epoch 15/100
2000/2000 [==============================] - 6s - loss: 1.2368 - acc: 0.4765 - val_loss: 1.3243 - val_acc: 0.4158
Epoch 16/100
2000/2000 [==============================] - 6s - loss: 1.2328 - acc: 0.4780 - val_loss: 1.3783 - val_acc: 0.3855
Epoch 17/100
2000/2000 [==============================] - 6s - loss: 1.2029 - acc: 0.4845 - val_loss: 1.3194 - val_acc: 0.3991
Epoch 18/100
2000/2000 [==============================] - 6s - loss: 1.1488 - acc: 0.5125 - val_loss: 1.3011 - val_acc: 0.4276
Epoch 19/100
2000/2000 [==============================] - 6s - loss: 1.1566 - acc: 0.5080 - val_loss: 1.2884 - val_acc: 0.4357
Epoch 20/100
2000/2000 [==============================] - 6s - loss: 1.1235 - acc: 0.5365 - val_loss: 1.3692 - val_acc: 0.3905
Epoch 21/100
2000/2000 [==============================] - 6s - loss: 1.1291 - acc: 0.5290 - val_loss: 1.3522 - val_acc: 0.3932
Epoch 22/100
2000/2000 [==============================] - 6s - loss: 1.0810 - acc: 0.5560 - val_loss: 1.3144 - val_acc: 0.4276
Epoch 23/100
2000/2000 [==============================] - 6s - loss: 1.0517 - acc: 0.5695 - val_loss: 1.3446 - val_acc: 0.3778
Epoch 24/100
2000/2000 [==============================] - 6s - loss: 1.0346 - acc: 0.5720 - val_loss: 1.3070 - val_acc: 0.4226
Epoch 25/100
2000/2000 [==============================] - 6s - loss: 1.0425 - acc: 0.5700 - val_loss: 1.3134 - val_acc: 0.4190
Epoch 26/100
2000/2000 [==============================] - 6s - loss: 0.9791 - acc: 0.6095 - val_loss: 1.3013 - val_acc: 0.4339
Epoch 27/100
2000/2000 [==============================] - 6s - loss: 0.9489 - acc: 0.6155 - val_loss: 1.3093 - val_acc: 0.4222
Epoch 28/100
2000/2000 [==============================] - 6s - loss: 0.9355 - acc: 0.6250 - val_loss: 1.3139 - val_acc: 0.4154
Epoch 29/100
2000/2000 [==============================] - 6s - loss: 0.9361 - acc: 0.6315 - val_loss: 1.3453 - val_acc: 0.4113
Epoch 30/100
2000/2000 [==============================] - 6s - loss: 0.8961 - acc: 0.6515 - val_loss: 1.3303 - val_acc: 0.4412
Epoch 31/100
2000/2000 [==============================] - 6s - loss: 0.8399 - acc: 0.6670 - val_loss: 1.3601 - val_acc: 0.4072
Epoch 32/100
2000/2000 [==============================] - 6s - loss: 0.8414 - acc: 0.6825 - val_loss: 1.3738 - val_acc: 0.3991
Epoch 33/100
2000/2000 [==============================] - 6s - loss: 0.8132 - acc: 0.6940 - val_loss: 1.4456 - val_acc: 0.3910
Epoch 34/100
2000/2000 [==============================] - 6s - loss: 0.7940 - acc: 0.6985 - val_loss: 1.3834 - val_acc: 0.3959
Epoch 35/100
2000/2000 [==============================] - 6s - loss: 0.7563 - acc: 0.7185 - val_loss: 1.3462 - val_acc: 0.4276
Epoch 36/100
2000/2000 [==============================] - 6s - loss: 0.7444 - acc: 0.7340 - val_loss: 1.3610 - val_acc: 0.4385
Epoch 37/100
2000/2000 [==============================] - 6s - loss: 0.7078 - acc: 0.7430 - val_loss: 1.6853 - val_acc: 0.3593
Epoch 38/100
2000/2000 [==============================] - 6s - loss: 0.6917 - acc: 0.7475 - val_loss: 1.4416 - val_acc: 0.4081
Epoch 39/100
2000/2000 [==============================] - 6s - loss: 0.6626 - acc: 0.7585 - val_loss: 1.4033 - val_acc: 0.3977
Epoch 40/100
2000/2000 [==============================] - 6s - loss: 0.6496 - acc: 0.7735 - val_loss: 1.4453 - val_acc: 0.3864
Epoch 41/100
2000/2000 [==============================] - 6s - loss: 0.5906 - acc: 0.7945 - val_loss: 1.4176 - val_acc: 0.4186
Epoch 42/100
2000/2000 [==============================] - 6s - loss: 0.6232 - acc: 0.7830 - val_loss: 1.4290 - val_acc: 0.4045
Epoch 43/100
2000/2000 [==============================] - 6s - loss: 0.5554 - acc: 0.8055 - val_loss: 1.4497 - val_acc: 0.4330
Epoch 44/100
2000/2000 [==============================] - 6s - loss: 0.5190 - acc: 0.8340 - val_loss: 1.5117 - val_acc: 0.4131
Epoch 45/100
2000/2000 [==============================] - 6s - loss: 0.5182 - acc: 0.8345 - val_loss: 1.5194 - val_acc: 0.3955
Epoch 46/100
2000/2000 [==============================] - 6s - loss: 0.5071 - acc: 0.8355 - val_loss: 1.5069 - val_acc: 0.4118
Epoch 47/100
2000/2000 [==============================] - 6s - loss: 0.4760 - acc: 0.8475 - val_loss: 1.4981 - val_acc: 0.4244
Epoch 48/100
2000/2000 [==============================] - 6s - loss: 0.4436 - acc: 0.8605 - val_loss: 1.5615 - val_acc: 0.4054
Epoch 49/100
2000/2000 [==============================] - 6s - loss: 0.4344 - acc: 0.8650 - val_loss: 1.5817 - val_acc: 0.3946
Epoch 50/100
2000/2000 [==============================] - 6s - loss: 0.3917 - acc: 0.8830 - val_loss: 1.6029 - val_acc: 0.3946
Epoch 51/100
2000/2000 [==============================] - 6s - loss: 0.4021 - acc: 0.8930 - val_loss: 1.6452 - val_acc: 0.3946
Epoch 52/100
2000/2000 [==============================] - 6s - loss: 0.3534 - acc: 0.9035 - val_loss: 1.5586 - val_acc: 0.4167
Epoch 53/100
2000/2000 [==============================] - 6s - loss: 0.3576 - acc: 0.9025 - val_loss: 1.5914 - val_acc: 0.4204
Epoch 54/100
2000/2000 [==============================] - 6s - loss: 0.3269 - acc: 0.9115 - val_loss: 1.7235 - val_acc: 0.3950
Epoch 55/100
2000/2000 [==============================] - 6s - loss: 0.3380 - acc: 0.9020 - val_loss: 1.6489 - val_acc: 0.4109
Epoch 56/100
2000/2000 [==============================] - 6s - loss: 0.2863 - acc: 0.9190 - val_loss: 1.6465 - val_acc: 0.4072
Epoch 57/100
2000/2000 [==============================] - 6s - loss: 0.2889 - acc: 0.9130 - val_loss: 1.6938 - val_acc: 0.3964
Epoch 58/100
2000/2000 [==============================] - 6s - loss: 0.2614 - acc: 0.9335 - val_loss: 1.7725 - val_acc: 0.3977
Epoch 59/100
2000/2000 [==============================] - 6s - loss: 0.2846 - acc: 0.9225 - val_loss: 1.7438 - val_acc: 0.3995
Epoch 60/100
2000/2000 [==============================] - 6s - loss: 0.2358 - acc: 0.9485 - val_loss: 1.7186 - val_acc: 0.4050
Epoch 61/100
2000/2000 [==============================] - 6s - loss: 0.2206 - acc: 0.9475 - val_loss: 2.1195 - val_acc: 0.3593
Epoch 62/100
2000/2000 [==============================] - 6s - loss: 0.2217 - acc: 0.9430 - val_loss: 1.7741 - val_acc: 0.4068
Epoch 63/100
2000/2000 [==============================] - 6s - loss: 0.2107 - acc: 0.9430 - val_loss: 1.8059 - val_acc: 0.4032
Epoch 64/100
2000/2000 [==============================] - 6s - loss: 0.1926 - acc: 0.9600 - val_loss: 1.8838 - val_acc: 0.4009
Epoch 65/100
2000/2000 [==============================] - 6s - loss: 0.1781 - acc: 0.9605 - val_loss: 1.9377 - val_acc: 0.3959
Epoch 66/100
2000/2000 [==============================] - 6s - loss: 0.1674 - acc: 0.9650 - val_loss: 1.9232 - val_acc: 0.3991
Epoch 67/100
2000/2000 [==============================] - 6s - loss: 0.1565 - acc: 0.9675 - val_loss: 2.1491 - val_acc: 0.3842
Epoch 68/100
2000/2000 [==============================] - 6s - loss: 0.1583 - acc: 0.9645 - val_loss: 2.1257 - val_acc: 0.3688
Epoch 69/100
2000/2000 [==============================] - 6s - loss: 0.1532 - acc: 0.9650 - val_loss: 2.1459 - val_acc: 0.4154
Epoch 70/100
2000/2000 [==============================] - 6s - loss: 0.1440 - acc: 0.9690 - val_loss: 1.9177 - val_acc: 0.4036
Epoch 71/100
2000/2000 [==============================] - 6s - loss: 0.1255 - acc: 0.9750 - val_loss: 2.0047 - val_acc: 0.4068
Epoch 72/100
2000/2000 [==============================] - 6s - loss: 0.1297 - acc: 0.9725 - val_loss: 2.0971 - val_acc: 0.3937
Epoch 73/100
2000/2000 [==============================] - 6s - loss: 0.1192 - acc: 0.9760 - val_loss: 2.0166 - val_acc: 0.3932
Epoch 74/100
2000/2000 [==============================] - 6s - loss: 0.1268 - acc: 0.9695 - val_loss: 2.0828 - val_acc: 0.3977
Epoch 75/100
2000/2000 [==============================] - 6s - loss: 0.1052 - acc: 0.9825 - val_loss: 2.1015 - val_acc: 0.4014
Epoch 76/100
2000/2000 [==============================] - 6s - loss: 0.1085 - acc: 0.9775 - val_loss: 2.2390 - val_acc: 0.3932
Epoch 77/100
2000/2000 [==============================] - 6s - loss: 0.0996 - acc: 0.9815 - val_loss: 2.0726 - val_acc: 0.3986
Epoch 78/100
2000/2000 [==============================] - 6s - loss: 0.0927 - acc: 0.9820 - val_loss: 2.1191 - val_acc: 0.4032
Epoch 79/100
2000/2000 [==============================] - 6s - loss: 0.0817 - acc: 0.9875 - val_loss: 2.1503 - val_acc: 0.4018
Epoch 80/100
2000/2000 [==============================] - 6s - loss: 0.0895 - acc: 0.9845 - val_loss: 2.1610 - val_acc: 0.4122
Epoch 81/100
2000/2000 [==============================] - 6s - loss: 0.0846 - acc: 0.9855 - val_loss: 2.3335 - val_acc: 0.3932
Epoch 82/100
2000/2000 [==============================] - 6s - loss: 0.0747 - acc: 0.9865 - val_loss: 2.2409 - val_acc: 0.3914
Epoch 83/100
2000/2000 [==============================] - 6s - loss: 0.0667 - acc: 0.9925 - val_loss: 2.2677 - val_acc: 0.4104
Epoch 84/100
2000/2000 [==============================] - 6s - loss: 0.0714 - acc: 0.9865 - val_loss: 2.4675 - val_acc: 0.3896
Epoch 85/100
2000/2000 [==============================] - 6s - loss: 0.0651 - acc: 0.9860 - val_loss: 2.3052 - val_acc: 0.3846
Epoch 86/100
2000/2000 [==============================] - 6s - loss: 0.0599 - acc: 0.9890 - val_loss: 2.2864 - val_acc: 0.4000
Epoch 87/100
2000/2000 [==============================] - 6s - loss: 0.0562 - acc: 0.9920 - val_loss: 2.2703 - val_acc: 0.4059
Epoch 88/100
2000/2000 [==============================] - 6s - loss: 0.0468 - acc: 0.9945 - val_loss: 2.3956 - val_acc: 0.4050
Epoch 89/100
2000/2000 [==============================] - 6s - loss: 0.0521 - acc: 0.9930 - val_loss: 2.3708 - val_acc: 0.3986
Epoch 90/100
2000/2000 [==============================] - 6s - loss: 0.0805 - acc: 0.9800 - val_loss: 2.4118 - val_acc: 0.3900
Epoch 91/100
2000/2000 [==============================] - 6s - loss: 0.0504 - acc: 0.9930 - val_loss: 2.3752 - val_acc: 0.4054
Epoch 92/100
2000/2000 [==============================] - 6s - loss: 0.0477 - acc: 0.9925 - val_loss: 2.3969 - val_acc: 0.3973
Epoch 93/100
2000/2000 [==============================] - 6s - loss: 0.0460 - acc: 0.9945 - val_loss: 2.4448 - val_acc: 0.3964
Epoch 94/100
2000/2000 [==============================] - 6s - loss: 0.0424 - acc: 0.9920 - val_loss: 2.4089 - val_acc: 0.4005
Epoch 95/100
2000/2000 [==============================] - 6s - loss: 0.0437 - acc: 0.9950 - val_loss: 2.5242 - val_acc: 0.3914
Epoch 96/100
2000/2000 [==============================] - 6s - loss: 0.0343 - acc: 0.9955 - val_loss: 2.4625 - val_acc: 0.3923
Epoch 97/100
2000/2000 [==============================] - 6s - loss: 0.0382 - acc: 0.9950 - val_loss: 2.5792 - val_acc: 0.3928
Epoch 98/100
2000/2000 [==============================] - 6s - loss: 0.0358 - acc: 0.9960 - val_loss: 2.5178 - val_acc: 0.3923
Epoch 99/100
2000/2000 [==============================] - 6s - loss: 0.0437 - acc: 0.9920 - val_loss: 2.5288 - val_acc: 0.4036
Epoch 100/100
2000/2000 [==============================] - 6s - loss: 0.0300 - acc: 0.9960 - val_loss: 2.6317 - val_acc: 0.4009
Out[34]:
<keras.callbacks.History at 0x7fb173847898>

cnn non-static 结果

time max_len batch_size max_features embedding_dims nb_filter filter_length dense1_hindden val_acc
2016-11-25 9:52 36 50 14526 100 各100 3,4,5 300 0.4204
2016-11-26 9:52 36 50 14526 100 各100 3,4,5 300 0.4471

In [ ]:


In [61]:
model.fit_generator(my_generator(train_X_model,train_sentiment_X_model,train_tag_X_model,train_y_model),samples_per_epoch = 32*100,nb_epoch=100,verbose=1,validation_data=([dev_X_model,dev_sentiment_X_model,dev_tag_X_model],dev_y_model))


Epoch 1/100
3200/3200 [==============================] - 334s - loss: 1.5742 - acc: 0.2650 - val_loss: 1.5627 - val_acc: 0.2598
Epoch 2/100
3200/3200 [==============================] - 334s - loss: 1.5498 - acc: 0.2972 - val_loss: 1.5499 - val_acc: 0.3124
Epoch 3/100
3200/3200 [==============================] - 334s - loss: 1.5352 - acc: 0.3137 - val_loss: 1.5339 - val_acc: 0.3152
Epoch 4/100
3200/3200 [==============================] - 334s - loss: 1.5132 - acc: 0.3184 - val_loss: 1.5171 - val_acc: 0.3233
Epoch 5/100
3200/3200 [==============================] - 334s - loss: 1.4971 - acc: 0.3325 - val_loss: 1.5003 - val_acc: 0.3261
Epoch 6/100
3200/3200 [==============================] - 334s - loss: 1.4634 - acc: 0.3466 - val_loss: 1.4700 - val_acc: 0.3397
Epoch 7/100
3200/3200 [==============================] - 333s - loss: 1.4384 - acc: 0.3625 - val_loss: 1.4501 - val_acc: 0.3433
Epoch 8/100
3200/3200 [==============================] - 334s - loss: 1.4194 - acc: 0.3750 - val_loss: 1.4297 - val_acc: 0.3724
Epoch 9/100
3200/3200 [==============================] - 334s - loss: 1.3717 - acc: 0.4003 - val_loss: 1.4161 - val_acc: 0.3824
Epoch 10/100
3200/3200 [==============================] - 334s - loss: 1.3493 - acc: 0.4156 - val_loss: 1.4090 - val_acc: 0.3778
Epoch 11/100
3200/3200 [==============================] - 334s - loss: 1.3084 - acc: 0.4322 - val_loss: 1.3852 - val_acc: 0.3933
Epoch 12/100
3200/3200 [==============================] - 334s - loss: 1.2695 - acc: 0.4575 - val_loss: 1.3687 - val_acc: 0.4078
Epoch 13/100
3200/3200 [==============================] - 333s - loss: 1.2212 - acc: 0.4756 - val_loss: 1.3693 - val_acc: 0.4033
Epoch 14/100
3200/3200 [==============================] - 334s - loss: 1.1907 - acc: 0.4997 - val_loss: 1.3653 - val_acc: 0.4051
Epoch 15/100
3200/3200 [==============================] - 334s - loss: 1.1544 - acc: 0.5112 - val_loss: 1.3740 - val_acc: 0.4024
Epoch 16/100
3200/3200 [==============================] - 334s - loss: 1.1122 - acc: 0.5319 - val_loss: 1.3699 - val_acc: 0.3933
Epoch 17/100
3200/3200 [==============================] - 334s - loss: 1.0589 - acc: 0.5634 - val_loss: 1.3651 - val_acc: 0.4096
Epoch 18/100
3200/3200 [==============================] - 355s - loss: 1.0406 - acc: 0.5616 - val_loss: 1.3860 - val_acc: 0.4096
Epoch 19/100
3200/3200 [==============================] - 362s - loss: 0.9869 - acc: 0.5944 - val_loss: 1.3892 - val_acc: 0.4078
Epoch 20/100
3200/3200 [==============================] - 361s - loss: 0.9381 - acc: 0.6294 - val_loss: 1.4016 - val_acc: 0.4078
Epoch 21/100
3200/3200 [==============================] - 362s - loss: 0.8978 - acc: 0.6384 - val_loss: 1.3973 - val_acc: 0.4142
Epoch 22/100
3200/3200 [==============================] - 361s - loss: 0.8590 - acc: 0.6684 - val_loss: 1.4242 - val_acc: 0.4114
Epoch 23/100
3200/3200 [==============================] - 362s - loss: 0.8235 - acc: 0.6912 - val_loss: 1.4325 - val_acc: 0.4096
Epoch 24/100
3200/3200 [==============================] - 355s - loss: 0.7772 - acc: 0.7069 - val_loss: 1.4488 - val_acc: 0.3942
Epoch 25/100
3200/3200 [==============================] - 334s - loss: 0.7088 - acc: 0.7497 - val_loss: 1.5132 - val_acc: 0.3924
Epoch 26/100
3200/3200 [==============================] - 334s - loss: 0.6923 - acc: 0.7562 - val_loss: 1.5027 - val_acc: 0.4078
Epoch 27/100
3200/3200 [==============================] - 334s - loss: 0.6366 - acc: 0.7819 - val_loss: 1.5491 - val_acc: 0.4005
Epoch 28/100
3200/3200 [==============================] - 334s - loss: 0.5884 - acc: 0.8041 - val_loss: 1.6189 - val_acc: 0.3869
Epoch 29/100
3200/3200 [==============================] - 334s - loss: 0.5473 - acc: 0.8178 - val_loss: 1.6239 - val_acc: 0.4015
Epoch 30/100
3200/3200 [==============================] - 334s - loss: 0.4964 - acc: 0.8531 - val_loss: 1.6480 - val_acc: 0.4105
Epoch 31/100
3200/3200 [==============================] - 334s - loss: 0.4754 - acc: 0.8559 - val_loss: 1.7564 - val_acc: 0.3833
Epoch 32/100
3200/3200 [==============================] - 334s - loss: 0.4300 - acc: 0.8666 - val_loss: 1.7143 - val_acc: 0.3906
Epoch 33/100
3200/3200 [==============================] - 334s - loss: 0.3721 - acc: 0.8991 - val_loss: 1.7923 - val_acc: 0.3887
Epoch 34/100
3200/3200 [==============================] - 334s - loss: 0.3556 - acc: 0.9056 - val_loss: 1.8931 - val_acc: 0.3887
Epoch 35/100
3200/3200 [==============================] - 334s - loss: 0.3123 - acc: 0.9237 - val_loss: 1.8345 - val_acc: 0.3833
Epoch 36/100
3200/3200 [==============================] - 334s - loss: 0.2782 - acc: 0.9325 - val_loss: 2.1587 - val_acc: 0.3815
Epoch 37/100
3200/3200 [==============================] - 334s - loss: 0.2434 - acc: 0.9428 - val_loss: 1.9437 - val_acc: 0.3660
Epoch 38/100
3200/3200 [==============================] - 334s - loss: 0.2108 - acc: 0.9531 - val_loss: 2.0303 - val_acc: 0.3942
Epoch 39/100
3200/3200 [==============================] - 334s - loss: 0.1989 - acc: 0.9594 - val_loss: 2.1072 - val_acc: 0.4015
Epoch 40/100
3200/3200 [==============================] - 334s - loss: 0.1743 - acc: 0.9641 - val_loss: 2.1421 - val_acc: 0.3987
Epoch 41/100
3200/3200 [==============================] - 334s - loss: 0.1458 - acc: 0.9750 - val_loss: 2.2109 - val_acc: 0.3842
Epoch 42/100
3200/3200 [==============================] - 334s - loss: 0.1319 - acc: 0.9788 - val_loss: 2.3470 - val_acc: 0.4005
Epoch 43/100
3200/3200 [==============================] - 334s - loss: 0.1117 - acc: 0.9813 - val_loss: 2.2882 - val_acc: 0.3824
Epoch 44/100
3200/3200 [==============================] - 334s - loss: 0.1006 - acc: 0.9825 - val_loss: 2.4145 - val_acc: 0.3878
Epoch 45/100
3200/3200 [==============================] - 334s - loss: 0.0807 - acc: 0.9906 - val_loss: 2.4151 - val_acc: 0.3724
Epoch 46/100
3200/3200 [==============================] - 334s - loss: 0.0696 - acc: 0.9878 - val_loss: 2.6104 - val_acc: 0.3878
Epoch 47/100
3200/3200 [==============================] - 334s - loss: 0.0651 - acc: 0.9903 - val_loss: 2.6191 - val_acc: 0.4033
Epoch 48/100
3200/3200 [==============================] - 334s - loss: 0.0549 - acc: 0.9916 - val_loss: 2.6325 - val_acc: 0.3951
Epoch 49/100
3200/3200 [==============================] - 334s - loss: 0.0436 - acc: 0.9950 - val_loss: 2.6991 - val_acc: 0.4033
Epoch 50/100
3200/3200 [==============================] - 334s - loss: 0.0360 - acc: 0.9963 - val_loss: 2.7265 - val_acc: 0.4024
Epoch 51/100
3200/3200 [==============================] - 334s - loss: 0.0320 - acc: 0.9959 - val_loss: 2.7209 - val_acc: 0.3915
Epoch 52/100
3200/3200 [==============================] - 334s - loss: 0.0276 - acc: 0.9988 - val_loss: 2.8044 - val_acc: 0.3969
Epoch 53/100
3200/3200 [==============================] - 334s - loss: 0.0228 - acc: 0.9978 - val_loss: 2.8656 - val_acc: 0.3833
Epoch 54/100
3200/3200 [==============================] - 334s - loss: 0.0197 - acc: 0.9988 - val_loss: 3.0113 - val_acc: 0.3978
Epoch 55/100
3200/3200 [==============================] - 334s - loss: 0.0178 - acc: 0.9988 - val_loss: 3.1037 - val_acc: 0.3942
Epoch 56/100
3200/3200 [==============================] - 334s - loss: 0.0155 - acc: 0.9991 - val_loss: 3.0073 - val_acc: 0.3978
Epoch 57/100
3200/3200 [==============================] - 334s - loss: 0.0133 - acc: 0.9991 - val_loss: 3.0668 - val_acc: 0.4042
Epoch 58/100
3200/3200 [==============================] - 334s - loss: 0.0090 - acc: 1.0000 - val_loss: 3.1204 - val_acc: 0.4069
Epoch 59/100
3200/3200 [==============================] - 334s - loss: 0.0098 - acc: 0.9991 - val_loss: 3.1919 - val_acc: 0.3978
Epoch 60/100
3200/3200 [==============================] - 334s - loss: 0.0094 - acc: 0.9994 - val_loss: 3.2241 - val_acc: 0.3969
Epoch 61/100
3200/3200 [==============================] - 334s - loss: 0.0067 - acc: 0.9997 - val_loss: 3.2815 - val_acc: 0.3951
Epoch 62/100
3200/3200 [==============================] - 334s - loss: 0.0067 - acc: 0.9994 - val_loss: 3.3798 - val_acc: 0.4033
Epoch 63/100
3200/3200 [==============================] - 334s - loss: 0.0052 - acc: 0.9997 - val_loss: 3.4693 - val_acc: 0.3860
Epoch 64/100
3200/3200 [==============================] - 334s - loss: 0.0049 - acc: 0.9994 - val_loss: 3.4607 - val_acc: 0.4069
Epoch 65/100
3200/3200 [==============================] - 334s - loss: 0.0054 - acc: 0.9994 - val_loss: 3.4302 - val_acc: 0.4060
Epoch 66/100
3200/3200 [==============================] - 334s - loss: 0.0026 - acc: 1.0000 - val_loss: 3.4694 - val_acc: 0.4033
Epoch 67/100
3200/3200 [==============================] - 334s - loss: 0.0028 - acc: 0.9997 - val_loss: 3.5276 - val_acc: 0.4005
Epoch 68/100
3200/3200 [==============================] - 334s - loss: 0.0043 - acc: 0.9997 - val_loss: 3.5841 - val_acc: 0.4060
Epoch 69/100
3200/3200 [==============================] - 334s - loss: 0.0022 - acc: 0.9997 - val_loss: 3.6341 - val_acc: 0.4042
Epoch 70/100
3200/3200 [==============================] - 334s - loss: 0.0041 - acc: 0.9997 - val_loss: 3.6940 - val_acc: 0.3951
Epoch 71/100
3200/3200 [==============================] - 334s - loss: 0.0015 - acc: 1.0000 - val_loss: 3.7466 - val_acc: 0.3860
Epoch 72/100
3200/3200 [==============================] - 334s - loss: 0.0022 - acc: 0.9994 - val_loss: 3.7167 - val_acc: 0.3996
Epoch 73/100
3200/3200 [==============================] - 334s - loss: 0.0038 - acc: 0.9997 - val_loss: 3.7340 - val_acc: 0.3969
Epoch 74/100
3200/3200 [==============================] - 334s - loss: 0.0011 - acc: 1.0000 - val_loss: 3.7821 - val_acc: 0.3960
Epoch 75/100
3200/3200 [==============================] - 334s - loss: 0.0017 - acc: 0.9997 - val_loss: 3.7606 - val_acc: 0.4015
Epoch 76/100
3200/3200 [==============================] - 334s - loss: 0.0027 - acc: 0.9997 - val_loss: 3.7919 - val_acc: 0.3996
Epoch 77/100
3200/3200 [==============================] - 334s - loss: 0.0011 - acc: 0.9997 - val_loss: 3.8756 - val_acc: 0.3987
Epoch 78/100
3200/3200 [==============================] - 334s - loss: 0.0034 - acc: 0.9997 - val_loss: 3.8536 - val_acc: 0.3987
Epoch 79/100
3200/3200 [==============================] - 334s - loss: 6.8420e-04 - acc: 1.0000 - val_loss: 3.9308 - val_acc: 0.3924
Epoch 80/100
3200/3200 [==============================] - 334s - loss: 9.4822e-04 - acc: 1.0000 - val_loss: 3.9247 - val_acc: 0.3978
Epoch 81/100
3200/3200 [==============================] - 334s - loss: 0.0031 - acc: 0.9997 - val_loss: 3.9384 - val_acc: 0.3987
Epoch 82/100
3200/3200 [==============================] - 334s - loss: 5.9501e-04 - acc: 1.0000 - val_loss: 3.9892 - val_acc: 0.4024
Epoch 83/100
3200/3200 [==============================] - 334s - loss: 7.8704e-04 - acc: 1.0000 - val_loss: 3.9495 - val_acc: 0.3969
Epoch 84/100
3200/3200 [==============================] - 334s - loss: 0.0029 - acc: 0.9997 - val_loss: 3.9762 - val_acc: 0.3996
Epoch 85/100
3200/3200 [==============================] - 334s - loss: 5.0231e-04 - acc: 1.0000 - val_loss: 4.0102 - val_acc: 0.3960
Epoch 86/100
3200/3200 [==============================] - 334s - loss: 0.0021 - acc: 0.9997 - val_loss: 4.0126 - val_acc: 0.4015
Epoch 87/100
3200/3200 [==============================] - 334s - loss: 4.8490e-04 - acc: 1.0000 - val_loss: 4.0832 - val_acc: 0.3969
Epoch 88/100
3200/3200 [==============================] - 334s - loss: 6.0297e-04 - acc: 1.0000 - val_loss: 4.0647 - val_acc: 0.3951
Epoch 89/100
3200/3200 [==============================] - 334s - loss: 0.0017 - acc: 0.9997 - val_loss: 4.1057 - val_acc: 0.3996
Epoch 90/100
3200/3200 [==============================] - 334s - loss: 3.9432e-04 - acc: 1.0000 - val_loss: 4.1044 - val_acc: 0.3969
Epoch 91/100
3200/3200 [==============================] - 334s - loss: 7.9876e-04 - acc: 0.9997 - val_loss: 4.1034 - val_acc: 0.3951
Epoch 92/100
3200/3200 [==============================] - 334s - loss: 0.0021 - acc: 0.9997 - val_loss: 4.1198 - val_acc: 0.3987
Epoch 93/100
3200/3200 [==============================] - 334s - loss: 0.0010 - acc: 0.9997 - val_loss: 4.1375 - val_acc: 0.4005
Epoch 94/100
3200/3200 [==============================] - 334s - loss: 0.0019 - acc: 0.9997 - val_loss: 4.1234 - val_acc: 0.3969
Epoch 95/100
3200/3200 [==============================] - 334s - loss: 3.2084e-04 - acc: 1.0000 - val_loss: 4.1963 - val_acc: 0.3924
Epoch 96/100
3200/3200 [==============================] - 334s - loss: 9.5798e-04 - acc: 0.9997 - val_loss: 4.1820 - val_acc: 0.3951
Epoch 97/100
3200/3200 [==============================] - 334s - loss: 0.0020 - acc: 0.9997 - val_loss: 4.2095 - val_acc: 0.3987
Epoch 98/100
3200/3200 [==============================] - 334s - loss: 2.7613e-04 - acc: 1.0000 - val_loss: 4.1888 - val_acc: 0.3960
Epoch 99/100
3200/3200 [==============================] - 334s - loss: 3.6846e-04 - acc: 1.0000 - val_loss: 4.2307 - val_acc: 0.3951
Epoch 100/100
3200/3200 [==============================] - 334s - loss: 0.0015 - acc: 0.9997 - val_loss: 4.1971 - val_acc: 0.3942
Out[61]:
<keras.callbacks.History at 0x7f5f318951d0>

实验结果

time max_len batch_size max_features embedding_dims nb_filter filter_length dense1_hindden val_acc
2016-11-23 14:20 36 32 14526 200 50 2 300 0.4015
2016-11-24 11:16 36 32 14526 200 150 2 300 0.4142

两通道实验结果,一个是用训练好的词向量初始化句子,另一个是用随机初始化的词向量初始化句子。

time max_len batch_size max_features embedding_dims nb_filter filter_length dense1_hindden val_acc
2016-11-25 9:52 36 32 14526 100 各100 3,4,5 300 0.4124

In [ ]: