In [1]:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn import metrics
plt.style.use("ggplot")
%matplotlib inline
In [2]:
plt.rcParams["figure.figsize"] = (12, 6)
Download stock price data from the NSE website. Here I use daily price data for the TCS stock.
In [3]:
df = pd.read_csv("https://raw.githubusercontent.com/abulbasar/data/master/tcs-stock.csv")
df.head()
Out[3]:
  Symbol Series         Date  Prev Close  Open Price  High Price  Low Price  Last Price  Close Price  Average Price  Total Traded Quantity      Turnover  No. of Trades  Deliverable Qty  % Dly Qt to Traded Qty
0    TCS     EQ  31-May-2018     3514.10      1734.0     1759.05    1726.10     1748.85      1741.05        1742.44                5049371  8.798250e+09         207998          3447026                   68.27
1    TCS     EQ  01-Jun-2018     1741.05      1754.0     1757.50    1716.30     1732.00      1732.45        1742.29                1603856  2.794386e+09          74272           924751                   57.66
2    TCS     EQ  04-Jun-2018     1732.45      1745.0     1749.00    1712.60     1742.00      1744.25        1730.25                1681483  2.909394e+09          96686           928452                   55.22
3    TCS     EQ  05-Jun-2018     1744.25      1744.0     1751.90    1711.15     1723.00      1721.60        1728.88                2449568  4.235001e+09         133067          1492228                   60.92
4    TCS     EQ  06-Jun-2018     1721.60      1723.0     1734.90    1715.20     1725.00      1725.75        1725.85                2236512  3.859875e+09          73770          1573892                   70.37
Check whether there is duplicate data by date. It looks like 11-Feb-2019, 18-Feb-2019, 25-Feb-2019 etc. appear twice. The Series column explains why: on each duplicated date there is a BL (block deal) row in addition to the regular EQ row.
In [4]:
df[df["Date"].duplicated(keep = False)]
Out[4]:
    Symbol Series         Date  Prev Close  Open Price  High Price  Low Price  Last Price  Close Price  Average Price  Total Traded Quantity      Turnover  No. of Trades  Deliverable Qty  % Dly Qt to Traded Qty
173    TCS     BL  11-Feb-2019     2604.20     2061.40     2061.40    2061.40     2061.40      2061.40        2061.40                 376308  7.757213e+08              1                0                    0.00
174    TCS     EQ  11-Feb-2019     2061.40     2060.00     2096.00    2048.90     2065.60      2065.90        2077.11                1778730  3.694622e+09          83603           901516                   50.68
179    TCS     BL  18-Feb-2019     2061.40     2029.70     2029.70    2029.70     2029.70      2029.70        2029.70                2474600  5.022696e+09              1                0                    0.00
180    TCS     EQ  18-Feb-2019     2029.70     2037.60     2041.95    1962.65     1969.60      1970.30        1988.87                2942184  5.851634e+09         100585          1668905                   56.72
185    TCS     BL  25-Feb-2019     2029.70     1925.65     1925.65    1925.65     1925.65      1925.65        1925.65                1057509  2.036392e+09              2                0                    0.00
186    TCS     EQ  25-Feb-2019     1925.65     1932.50     1990.00    1930.50     1987.00      1985.15        1961.61                2934880  5.757095e+09         140541          1341508                   45.71
203    TCS     BL  22-Mar-2019     1925.65     2015.05     2015.05    2015.05     2015.05      2015.05        2015.05                 264913  5.338129e+08              1           264913                  100.00
204    TCS     EQ  22-Mar-2019     2015.05     2015.00     2016.00    1983.30     2010.00      2005.65        1998.96                3148149  6.293032e+09         155770          1787595                   56.78
Drop the duplicates from the dataset. Note that duplicated() keeps the first occurrence by default, which for these dates is the BL row.
In [5]:
df = df[~df["Date"].duplicated()]
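An alternative that does not depend on row order is to filter on the Series column instead; a minimal sketch (not used in the rest of this notebook):

df = df[df["Series"] == "EQ"]  # keep only regular trades, dropping the block-deal (BL) rows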
Convert the dataset into a time series by setting the index to a DatetimeIndex.
In [6]:
df.index = pd.to_datetime(df["Date"])
df.head()
Out[6]:
           Symbol Series         Date  Prev Close  Open Price  High Price  Low Price  Last Price  Close Price  Average Price  Total Traded Quantity      Turnover  No. of Trades  Deliverable Qty  % Dly Qt to Traded Qty
Date
2018-05-31    TCS     EQ  31-May-2018     3514.10      1734.0     1759.05    1726.10     1748.85      1741.05        1742.44                5049371  8.798250e+09         207998          3447026                   68.27
2018-06-01    TCS     EQ  01-Jun-2018     1741.05      1754.0     1757.50    1716.30     1732.00      1732.45        1742.29                1603856  2.794386e+09          74272           924751                   57.66
2018-06-04    TCS     EQ  04-Jun-2018     1732.45      1745.0     1749.00    1712.60     1742.00      1744.25        1730.25                1681483  2.909394e+09          96686           928452                   55.22
2018-06-05    TCS     EQ  05-Jun-2018     1744.25      1744.0     1751.90    1711.15     1723.00      1721.60        1728.88                2449568  4.235001e+09         133067          1492228                   60.92
2018-06-06    TCS     EQ  06-Jun-2018     1721.60      1723.0     1734.90    1715.20     1725.00      1725.75        1725.85                2236512  3.859875e+09          73770          1573892                   70.37
In [7]:
df.index
Out[7]:
DatetimeIndex(['2018-05-31', '2018-06-01', '2018-06-04', '2018-06-05',
'2018-06-06', '2018-06-07', '2018-06-08', '2018-06-11',
'2018-06-12', '2018-06-13',
...
'2019-05-17', '2019-05-20', '2019-05-21', '2019-05-22',
'2019-05-23', '2019-05-24', '2019-05-27', '2019-05-28',
'2019-05-29', '2019-05-30'],
dtype='datetime64[ns]', name='Date', length=246, freq=None)
In [8]:
y = np.log(df["Close Price"])
y.plot()
Out[8]:
<matplotlib.axes._subplots.AxesSubplot at 0x1a1f105ed0>
In [9]:
y.index
Out[9]:
DatetimeIndex(['2018-05-31', '2018-06-01', '2018-06-04', '2018-06-05',
'2018-06-06', '2018-06-07', '2018-06-08', '2018-06-11',
'2018-06-12', '2018-06-13',
...
'2019-05-17', '2019-05-20', '2019-05-21', '2019-05-22',
'2019-05-23', '2019-05-24', '2019-05-27', '2019-05-28',
'2019-05-29', '2019-05-30'],
dtype='datetime64[ns]', name='Date', length=246, freq=None)
Let's see whether there are gaps in the date range. Trading does not happen on weekends or national holidays, so there is no data for those days.
In [10]:
list(y.index[:10])
Out[10]:
[Timestamp('2018-05-31 00:00:00'),
Timestamp('2018-06-01 00:00:00'),
Timestamp('2018-06-04 00:00:00'),
Timestamp('2018-06-05 00:00:00'),
Timestamp('2018-06-06 00:00:00'),
Timestamp('2018-06-07 00:00:00'),
Timestamp('2018-06-08 00:00:00'),
Timestamp('2018-06-11 00:00:00'),
Timestamp('2018-06-12 00:00:00'),
Timestamp('2018-06-13 00:00:00')]
In [11]:
y.asfreq("D")[:15]
Out[11]:
Date
2018-05-31 7.462244
2018-06-01 7.457292
2018-06-02 NaN
2018-06-03 NaN
2018-06-04 7.464080
2018-06-05 7.451009
2018-06-06 7.453417
2018-06-07 7.459080
2018-06-08 7.465713
2018-06-09 NaN
2018-06-10 NaN
2018-06-11 7.467200
2018-06-12 7.484930
2018-06-13 7.508842
2018-06-14 7.488601
Freq: D, Name: Close Price, dtype: float64
Let's impute the missing data by forward fill, using the method argument of asfreq.
In [12]:
y.asfreq("D", method="ffill")[:15]
Out[12]:
Date
2018-05-31 7.462244
2018-06-01 7.457292
2018-06-02 7.457292
2018-06-03 7.457292
2018-06-04 7.464080
2018-06-05 7.451009
2018-06-06 7.453417
2018-06-07 7.459080
2018-06-08 7.465713
2018-06-09 7.465713
2018-06-10 7.465713
2018-06-11 7.467200
2018-06-12 7.484930
2018-06-13 7.508842
2018-06-14 7.488601
Freq: D, Name: Close Price, dtype: float64
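Forward fill is one choice; pandas can also interpolate between the surrounding trading days. A sketch of a linear-interpolation alternative (not used below):

y.asfreq("D").interpolate(method="linear")[:15]  # straight-line values for the missing days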
Since these gaps are legitimate (the market is closed), we can instead set the frequency to business days ("B"); only exchange holidays then remain as missing values.
In [13]:
y = y.asfreq("B")
y.index
Out[13]:
DatetimeIndex(['2018-05-31', '2018-06-01', '2018-06-04', '2018-06-05',
'2018-06-06', '2018-06-07', '2018-06-08', '2018-06-11',
'2018-06-12', '2018-06-13',
...
'2019-05-17', '2019-05-20', '2019-05-21', '2019-05-22',
'2019-05-23', '2019-05-24', '2019-05-27', '2019-05-28',
'2019-05-29', '2019-05-30'],
dtype='datetime64[ns]', name='Date', length=261, freq='B')
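The index grew from 246 to 261 entries, so 15 business days (exchange holidays) now hold NaN. A quick sanity check:

print(y.isna().sum())  # should report the 15 holiday gaps introduced by the "B" frequency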
Calculate quarterly low, high, first, last, median, and mean values by resampling.
In [14]:
pd.DataFrame(dict(low=y.resample("Q").min(),
                  high=y.resample("Q").max(),
                  first=y.resample("Q").first(),
                  last=y.resample("Q").last(),
                  median=y.resample("Q").median(),
                  mean=y.resample("Q").mean()))
Out[14]:
                 low      high     first      last    median      mean
Date
2018-06-30  7.451009  7.527256  7.462244  7.521724  7.503576  7.491539
2018-09-30  7.523670  7.695508  7.523670  7.688776  7.602676  7.607829
2018-12-31  7.495042  7.721149  7.721149  7.545945  7.567837  7.572861
2019-03-31  7.502876  7.641516  7.551082  7.601727  7.593789  7.585512
2019-06-30  7.608126  7.723275  7.616604  7.671501  7.654088  7.655684
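The same summary can be computed in a single pass with agg; a sketch that produces equivalent columns (named min/max rather than low/high):

y.resample("Q").agg(["min", "max", "first", "last", "median", "mean"])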
Calculate the rolling mean and the exponentially weighted moving average (EWMA).
In [15]:
period = 3
y_rolling = pd.DataFrame(dict(actual=y,
                              rolling_mean=y.rolling(period).mean(),
                              ewm=y.ewm(span=period).mean()))
y_rolling.head(10)
Out[15]:
              actual  rolling_mean       ewm
Date
2018-05-31  7.462244           NaN  7.462244
2018-06-01  7.457292           NaN  7.458942
2018-06-04  7.464080      7.461205  7.461878
2018-06-05  7.451009      7.457460  7.456081
2018-06-06  7.453417      7.456169  7.454706
2018-06-07  7.459080      7.454502  7.456928
2018-06-08  7.465713      7.459403  7.461355
2018-06-11  7.467200      7.463997  7.464289
2018-06-12  7.484930      7.472614  7.474630
2018-06-13  7.508842      7.486991  7.491753
In [16]:
rolling_dropped = y_rolling.dropna()
print("rmse over rolling mean: ", metrics.mean_squared_error(rolling_dropped.actual, rolling_dropped.rolling_mean) ** 0.5)
print("rmse over ewma: ", metrics.mean_squared_error(rolling_dropped.actual, rolling_dropped["ewm"]) ** 0.5)
rmse over rolling mean: 0.011226801116974697
rmse over ewma: 0.008422829060485497
By visual inspection, we see the time series is not stationary. Next we will look at a more formal test: the augmented Dickey-Fuller test.
In [17]:
y_rolling.plot()
Out[17]:
<matplotlib.axes._subplots.AxesSubplot at 0x108b92b50>
A formal way of testing for stationarity. The null hypothesis of the augmented Dickey-Fuller test is that the series has a unit root, i.e. is non-stationary. Since the p-value is > 0.05, we cannot reject the null; loosely speaking, the series is non-stationary, and we need a further transformation to make it stationary. A popular method is to take the difference. After differencing with lag 1, the p-value of the Dickey-Fuller test is < 0.05, so we conclude that first differencing has made the data stationary.
In [18]:
from statsmodels.tsa.stattools import adfuller
from numpy import log
result = adfuller(y.dropna())
print('ADF Statistic: %f' % result[0])
print('p-value: %f' % result[1])
ADF Statistic: -2.645513
p-value: 0.083951
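adfuller also returns the critical values of the test statistic (the result tuple is: statistic, p-value, lags used, number of observations, critical values, information criterion). A small helper to print them as well, as a sketch:

def adf_report(series):
    stat, pval, _, _, crit, _ = adfuller(series.dropna())
    print('ADF Statistic: %f' % stat)
    print('p-value: %f' % pval)
    for level, value in crit.items():
        print('critical value (%s): %f' % (level, value))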
In [19]:
diff = pd.DataFrame({"actual": y, "diff1": y.diff(3)}).head(10)  # note: "diff1" here actually uses a lag of 3
diff["diff2"] = diff.diff1.diff(1)  # further difference with lag 1
diff
Out[19]:
              actual     diff1     diff2
Date
2018-05-31  7.462244       NaN       NaN
2018-06-01  7.457292       NaN       NaN
2018-06-04  7.464080       NaN       NaN
2018-06-05  7.451009 -0.011234       NaN
2018-06-06  7.453417 -0.003875  0.007359
2018-06-07  7.459080 -0.005000 -0.001125
2018-06-08  7.465713  0.014703  0.019703
2018-06-11  7.467200  0.013783 -0.000921
2018-06-12  7.484930  0.025851  0.012068
2018-06-13  7.508842  0.043129  0.017279
In [20]:
result = adfuller(y.diff(periods=1).dropna())
print('ADF Statistic: %f' % result[0])
print('p-value: %f' % result[1])
ADF Statistic: -16.354529
p-value: 0.000000
In [21]:
y.diff(periods=1).plot()
Out[21]:
<matplotlib.axes._subplots.AxesSubplot at 0x1a1f7fb390>
Use pandas's built-in autocorrelation plot to inspect serial correlation in the series.
In [22]:
from pandas.plotting import autocorrelation_plot
autocorrelation_plot(y)
Out[22]:
<matplotlib.axes._subplots.AxesSubplot at 0x1c237ec510>
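statsmodels also provides ACF and PACF plots with confidence bands, which are useful for choosing the AR order p used in the next cells; a sketch:

from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
plot_acf(y.dropna(), lags=40)   # a sharp cutoff here suggests the MA order q
plot_pacf(y.dropna(), lags=40)  # a sharp cutoff here suggests the AR order p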
In [23]:
y = y.dropna()
In [24]:
from statsmodels.tsa.arima_model import ARIMA
# fit model
model = ARIMA(y, order=(5,1,0))  # order = (p, d, q)
model_fit = model.fit(disp=0)
#print(model_fit.summary())
# plot residual errors
residuals = pd.DataFrame(model_fit.resid)
residuals.plot()
residuals.plot(kind='kde')
print(residuals.describe())
/anaconda3/lib/python3.7/site-packages/statsmodels/tsa/base/tsa_model.py:219: ValueWarning: A date index has been provided, but it has no associated frequency information and so will be ignored when e.g. forecasting.
' ignored when e.g. forecasting.', ValueWarning)
                0
count  245.000000
mean    -0.000001
std      0.015648
min     -0.064712
25%     -0.007517
50%      0.000321
75%      0.007957
max      0.060259
In [25]:
model_fit.summary()
Out[25]:
                             ARIMA Model Results
==============================================================================
Dep. Variable:          D.Close Price   No. Observations:                  245
Model:                 ARIMA(5, 1, 0)   Log Likelihood                 671.412
Method:                       css-mle   S.D. of innovations              0.016
Date:                Thu, 24 Oct 2019   AIC                          -1328.824
Time:                        15:49:50   BIC                          -1304.315
Sample:                             1   HQIC                         -1318.954
=====================================================================================
                        coef    std err          z      P>|z|      [0.025      0.975]
-------------------------------------------------------------------------------------
const                 0.0008      0.001      0.922      0.357      -0.001       0.003
ar.L1.D.Close Price  -0.0124      0.064     -0.193      0.847      -0.138       0.113
ar.L2.D.Close Price  -0.0835      0.064     -1.307      0.193      -0.209       0.042
ar.L3.D.Close Price  -0.0153      0.064     -0.239      0.812      -0.141       0.110
ar.L4.D.Close Price  -0.0336      0.064     -0.526      0.599      -0.158       0.091
ar.L5.D.Close Price   0.0545      0.064      0.857      0.393      -0.070       0.179
                                    Roots
=============================================================================
                  Real          Imaginary           Modulus         Frequency
-----------------------------------------------------------------------------
AR.1           -1.3265           -1.1035j            1.7255           -0.3896
AR.2           -1.3265           +1.1035j            1.7255            0.3896
AR.3            0.5898           -1.6129j            1.7173           -0.1942
AR.4            0.5898           +1.6129j            1.7173            0.1942
AR.5            2.0889           -0.0000j            2.0889           -0.0000
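The order (5, 1, 0) above is picked by hand. A simple, if slow, alternative is a small grid search over (p, d, q) by AIC; a sketch using the same ARIMA API as this notebook:

import itertools
best_order, best_aic = None, np.inf
for p, d, q in itertools.product(range(4), range(2), range(2)):
    try:
        aic = ARIMA(y, order=(p, d, q)).fit(disp=0).aic
    except Exception:
        continue  # some orders fail to converge; skip them
    if aic < best_aic:
        best_order, best_aic = (p, d, q), aic
print(best_order, best_aic)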
In [26]:
model = ARIMA(y.values, order=(5,1,0))
model_fit = model.fit(disp=0)
# forecast() returns the point forecasts, their standard errors, and confidence intervals
predictions, error, conf_interval = model_fit.forecast(150)
plt.plot(predictions)
plt.fill_between(range(len(predictions)), conf_interval[:,0], conf_interval[:,1], alpha = 0.3)
Out[26]:
<matplotlib.collections.PolyCollection at 0x1c23e328d0>
In [27]:
predictions[:10]
Out[27]:
array([7.67043094, 7.66948429, 7.67017584, 7.67145335, 7.67334407,
7.67409748, 7.67475654, 7.67557198, 7.67642212, 7.67733182])
In [28]:
error[:10]
Out[28]:
array([0.0156159 , 0.02194815, 0.02609877, 0.0295766 , 0.03251409,
0.03556232, 0.03839147, 0.04097907, 0.0434079 , 0.04569468])
In [29]:
conf_interval[:10]
Out[29]:
array([[7.63982434, 7.70103754],
[7.62646671, 7.71250187],
[7.6190232 , 7.72132849],
[7.61348428, 7.72942242],
[7.60961762, 7.73707051],
[7.60439662, 7.74379834],
[7.59951065, 7.75000244],
[7.59525447, 7.75588948],
[7.59134419, 7.76150005],
[7.5877719 , 7.76689174]])
In [30]:
model.endog  # note: with d=1 the model stores the differenced series
Out[30]:
array([-0.00495179, 0.00678807, -0.01307057, 0.00240765, 0.00566263,
0.0066329 , 0.00148707, 0.01773066, 0.02391171, -0.02024075,
0.02894678, -0.00602338, -0.00246339, -0.00112423, -0.00170246,
-0.00526295, 0.00520799, 0.01784181, 0.0032345 , -0.00826875,
0.0027368 , 0.00194642, 0.01218676, -0.00422438, 0.00755385,
0.01721521, -0.0157803 , -0.00420426, 0.05438435, -0.00432775,
0.00493366, 0.0040803 , 0.00648906, -0.00094943, -0.00946851,
0.00764207, 0.00452246, -0.00332114, -0.00879292, -0.00851457,
-0.01072046, 0.0004372 , -0.00226524, 0.01853654, -0.01431908,
0.01581051, -0.00197219, -0.00415938, 0.00337448, 0.00017729,
0.00945199, 0.00353096, 0.00169843, 0.00246757, 0.00211377,
-0.0013921 , 0.00382362, 0.00861149, 0.00387436, 0.00385941,
0.00888334, 0.00135227, 0.00500675, -0.00192271, -0.01234494,
0.02215997, -0.00940615, -0.00190153, 0.00219005, 0.00043263,
-0.01671956, -0.00112464, 0.0104397 , 0.00422752, -0.00164057,
0.00344691, 0.01207845, 0.04431635, -0.00696088, -0.01898169,
0.02156629, -0.00235561, 0.03237316, -0.04196705, -0.04639343,
0.01816508, -0.01200914, 0.00683562, -0.02331198, -0.03164137,
-0.03163221, 0.01613355, 0.00654433, -0.01690818, -0.00843184,
-0.00534564, -0.03168429, 0.0026272 , 0.00243145, -0.02951941,
0.03918651, 0.01264024, 0.02262062, -0.00123906, -0.01158695,
-0.01188143, 0.02189153, 0.00487762, -0.0171113 , 0.00295498,
0.0104422 , -0.02846402, -0.00415613, 0.00731493, 0.00836626,
-0.01280204, -0.03590928, 0.00030349, 0.01834061, 0.02363379,
0.04835551, -0.01183956, 0.00391978, 0.00716341, 0.01424929,
-0.00204102, -0.00702599, 0.00125379, -0.00977092, 0.0121738 ,
0.00836492, -0.01710298, 0.00359989, 0.00228411, -0.00323946,
-0.00980722, -0.00734229, -0.03026323, 0.01190272, -0.01539017,
0.01039989, -0.00678058, -0.00158349, 0.0051372 , 0.01071598,
-0.01221489, -0.01223273, 0.01115317, -0.00229464, -0.00349161,
0.00084757, -0.02465886, -0.01602967, 0.02964045, 0.00125737,
0.01283074, 0.00334656, 0.00422645, -0.003569 , -0.01392478,
0.01374073, 0.01007242, 0.01764843, 0.01409458, -0.0006811 ,
0.01636878, 0.00783872, 0.00995057, -0.00278402, 0.01468705,
0.00389639, -0.01037579, 0. , -0.00476539, 0.00670394,
-0.01012174, -0.00731417, 0. , -0.06351098, 0.00521005,
-0.00028729, 0.00596379, 0. , 0.05704878, 0.00947088,
-0.03694547, 0.00600678, -0.00366512, 0.00576775, 0.00682801,
0.00465809, -0.00391332, -0.00116705, -0.00595574, -0.0065699 ,
0.02609805, -0.00844261, 0. , -0.00383868, 0. ,
-0.015403 , -0.00080668, -0.00746735, 0.01633018, 0.00067467,
0.01487643, 0.02318303, 0. , -0.03166027, 0.01663916,
0.01090068, 0.00997065, -0.02480916, -0.01022239, -0.00247893,
0.04787968, 0.00871599, 0.0085244 , 0.0052882 , -0.00296537,
0.01309227, 0.00527585, 0.01964629, 0.00969133, -0.02008669,
-0.03837247, 0.01205185, -0.00273795, 0.00041814, 0.00910905,
-0.01706031, -0.00330633, -0.01724712, 0.00145663, 0.00635089,
-0.00632703, 0.02288159, -0.01608046, -0.01336057, -0.01339543,
-0.00294975, 0.00348513, 0.00900972, 0.01616757, 0.01821929])
In [31]:
(predictions + 1.96 * error)[:10]
Out[31]:
array([7.7010381 , 7.71250266, 7.72132943, 7.72942348, 7.73707169,
7.74379962, 7.75000382, 7.75589096, 7.76150161, 7.76689339])
In [32]:
(predictions - 1.96 * error)[:10]
Out[32]:
array([7.63982378, 7.62646592, 7.61902226, 7.61348321, 7.60961645,
7.60439534, 7.59950926, 7.59525299, 7.59134263, 7.58777026])
In [33]:
from sklearn.metrics import mean_squared_error
x = y.values
train_size = int(len(x) * 0.7)
train, test = x[0:train_size], x[train_size:]
history = [x for x in train]
predictions = []
# walk-forward validation: refit on all data seen so far, forecast one step ahead
for t in range(len(test)):
    model = ARIMA(history, order=(5,1,0))
    model_fit = model.fit(disp=0)
    output = model_fit.forecast()
    yhat = output[0][0]
    predictions.append(yhat)
    obs = test[t]
    history.append(obs)
    print('predicted=%f, actual=%f' % (yhat, obs))
error = mean_squared_error(test, predictions)
print('Test RMSE: %.3f' % np.sqrt(error))
plt.plot(test)
plt.plot(predictions, color='red')
predicted=7.642313, actual=7.631141
predicted=7.634232, actual=7.631141
predicted=7.632516, actual=7.626375
... (69 rows omitted; see the table in Out[34] below)
predicted=7.637006, actual=7.653281
predicted=7.652325, actual=7.671501
Test RMSE: 0.018
Out[33]:
[<matplotlib.lines.Line2D at 0x1c23fc6190>]
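To judge whether a test RMSE of 0.018 is good, compare it against a naive persistence baseline that predicts each day's value as the previous day's close; a sketch:

naive = x[train_size - 1:-1]  # yesterday's value as today's forecast
print('Naive RMSE: %.3f' % np.sqrt(mean_squared_error(test, naive)))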
In [34]:
result = pd.DataFrame(y)[train_size:]
result.columns = ["actual"]
result["prediction"] = predictions
result
Out[34]:
              actual  prediction
Date
2019-02-08  7.631141    7.642313
2019-02-11  7.631141    7.634232
2019-02-12  7.626375    7.632516
2019-02-13  7.633079    7.629072
2019-02-14  7.622957    7.634839
...              ...         ...
2019-05-24  7.624619    7.628996
2019-05-27  7.628104    7.628661
2019-05-28  7.637114    7.628498
2019-05-29  7.653281    7.637006
2019-05-30  7.671501    7.652325

74 rows × 2 columns
In [35]:
from numpy import array
import keras
import tensorflow as tf
from keras.preprocessing.sequence import TimeseriesGenerator
Using TensorFlow backend.
In [36]:
y_b = np.log(df["Close Price"])
y_b = y_b.asfreq("B").dropna() / 10.0  # scale so values are ~0.75; small inputs help the SGD training below
print("%s" % y_b[:10])
Date
2018-05-31 0.746224
2018-06-01 0.745729
2018-06-04 0.746408
2018-06-05 0.745101
2018-06-06 0.745342
2018-06-07 0.745908
2018-06-08 0.746571
2018-06-11 0.746720
2018-06-12 0.748493
2018-06-13 0.750884
Name: Close Price, dtype: float64
In [37]:
n_input = 5
train_size = int(0.7 * len(y_b))
y_b_train = y_b[:train_size]
y_b_test = y_b[train_size:]
generator_train = TimeseriesGenerator(y_b_train, y_b_train, length=n_input, batch_size=1)
generator_test = TimeseriesGenerator(y_b_test, y_b_test, length=n_input, batch_size=1)
print('Samples: %d' % len(generator_train))
for i in range(3):
    x, y = generator_train[i]
    print('%s => %s' % (x, y))
Samples: 167
[[0.74622437 0.74572919 0.74640799 0.74510094 0.7453417 ]] => [0.74590796]
[[0.74572919 0.74640799 0.74510094 0.7453417 0.74590796]] => [0.74657125]
[[0.74640799 0.74510094 0.7453417 0.74590796 0.74657125]] => [0.74671996]
In [38]:
tf.set_random_seed(1)
np.random.seed(1)
model = keras.Sequential([
keras.layers.InputLayer(input_shape = (n_input,)),
keras.layers.Dense(1, activation = None, kernel_initializer = keras.initializers.he_uniform(seed = 1.0))
])
model.compile(loss = "mse", optimizer = keras.optimizers.SGD(lr = 0.1))
model.summary()
model.fit_generator(generator_train, steps_per_epoch=1, epochs=500, verbose=0)
y_b_test_pred = model.predict_generator(generator_test, use_multiprocessing = True)
layer = model.layers[0]
print(layer.get_weights())
WARNING:tensorflow:From /anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_1 (Dense) (None, 1) 6
=================================================================
Total params: 6
Trainable params: 6
Non-trainable params: 0
_________________________________________________________________
WARNING:tensorflow:From /anaconda3/lib/python3.7/site-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
[array([[-0.3597389 ],
[-0.43928966],
[-0.32653952],
[ 0.1131153 ],
[ 0.94963855]], dtype=float32), array([0.804969], dtype=float32)]
In [39]:
y_b_test_pred.shape, y_b_test.shape
Out[39]:
((69, 1), (74,))
In [40]:
metrics.mean_squared_error(y_b_test[n_input:] * 10.0, y_b_test_pred * 10.0) ** 0.5
Out[40]:
0.07251101321580536
In [43]:
y_b = y_b.values
y_b = y_b.reshape((len(y_b), 1))
train_size = int(0.7 * len(y_b))
y_b_train = y_b[:train_size]
y_b_test = y_b[train_size:]
n_input = 5
generator_train = TimeseriesGenerator(y_b_train, y_b_train, length=n_input, batch_size=1, )
generator_test = TimeseriesGenerator(y_b_test, y_b_test, length=n_input, batch_size=1)
print('Samples: %d' % len(generator_train))
for i in range(3):
x, y = generator_train[i]
print('%s => %s' % (x, y))
Samples: 167
[[[0.74622437]
[0.74572919]
[0.74640799]
[0.74510094]
[0.7453417 ]]] => [[0.74590796]]
[[[0.74572919]
[0.74640799]
[0.74510094]
[0.7453417 ]
[0.74590796]]] => [[0.74657125]]
[[[0.74640799]
[0.74510094]
[0.7453417 ]
[0.74590796]
[0.74657125]]] => [[0.74671996]]
In [44]:
model = keras.Sequential([
    keras.layers.InputLayer(input_shape=(n_input, 1)),  # LSTM expects 3D input: (batch, timesteps, features)
    keras.layers.LSTM(units=100),
    keras.layers.Dense(units=1, activation=None)
])
model.compile(loss="mse", optimizer="adam")
model.summary()
model.fit_generator(generator_train, steps_per_epoch=1, epochs=500, verbose=1)
y_b_test_pred = model.predict_generator(generator_test)
layer = model.layers[0]
#print(layer.get_weights())
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm_1 (LSTM) (None, 100) 40800
_________________________________________________________________
dense_2 (Dense) (None, 1) 101
=================================================================
Total params: 40,901
Trainable params: 40,901
Non-trainable params: 0
_________________________________________________________________
Epoch 1/500
1/1 [==============================] - 1s 585ms/step - loss: 0.5183
Epoch 2/500
1/1 [==============================] - 0s 5ms/step - loss: 0.4663
Epoch 3/500
1/1 [==============================] - 0s 5ms/step - loss: 0.4234
... (epochs 4-431 omitted; the loss reaches the 1e-6 range by around epoch 50 and fluctuates between roughly 1e-8 and 1e-4 thereafter)
Epoch 432/500
1/1 [==============================] - 0s 5ms/step - loss: 3.3641e-06
Epoch 433/500
1/1 [==============================] - 0s 6ms/step - loss: 8.7844e-06
Epoch 434/500
1/1 [==============================] - 0s 6ms/step - loss: 7.4865e-06
Epoch 435/500
1/1 [==============================] - 0s 6ms/step - loss: 2.2340e-06
Epoch 436/500
1/1 [==============================] - 0s 6ms/step - loss: 8.6229e-06
Epoch 437/500
1/1 [==============================] - 0s 6ms/step - loss: 2.8293e-08
Epoch 438/500
1/1 [==============================] - 0s 6ms/step - loss: 9.3194e-06
Epoch 439/500
1/1 [==============================] - 0s 6ms/step - loss: 1.3669e-05
Epoch 440/500
1/1 [==============================] - 0s 7ms/step - loss: 2.0882e-06
Epoch 441/500
1/1 [==============================] - 0s 6ms/step - loss: 5.4562e-06
Epoch 442/500
1/1 [==============================] - 0s 6ms/step - loss: 1.2471e-06
Epoch 443/500
1/1 [==============================] - 0s 6ms/step - loss: 9.4015e-06
Epoch 444/500
1/1 [==============================] - 0s 6ms/step - loss: 3.6926e-07
Epoch 445/500
1/1 [==============================] - 0s 6ms/step - loss: 1.9620e-06
Epoch 446/500
1/1 [==============================] - 0s 6ms/step - loss: 1.0600e-06
Epoch 447/500
1/1 [==============================] - 0s 6ms/step - loss: 3.7123e-06
Epoch 448/500
1/1 [==============================] - 0s 6ms/step - loss: 1.8890e-06
Epoch 449/500
1/1 [==============================] - 0s 6ms/step - loss: 1.6656e-07
Epoch 450/500
1/1 [==============================] - 0s 7ms/step - loss: 2.5923e-05
Epoch 451/500
1/1 [==============================] - 0s 7ms/step - loss: 2.8344e-07
Epoch 452/500
1/1 [==============================] - 0s 6ms/step - loss: 8.7008e-06
Epoch 453/500
1/1 [==============================] - 0s 6ms/step - loss: 2.0124e-06
Epoch 454/500
1/1 [==============================] - 0s 6ms/step - loss: 1.2641e-05
Epoch 455/500
1/1 [==============================] - 0s 7ms/step - loss: 4.1562e-07
Epoch 456/500
1/1 [==============================] - 0s 6ms/step - loss: 4.0759e-07
Epoch 457/500
1/1 [==============================] - 0s 6ms/step - loss: 1.0755e-07
Epoch 458/500
1/1 [==============================] - 0s 6ms/step - loss: 2.0140e-05
Epoch 459/500
1/1 [==============================] - 0s 6ms/step - loss: 3.6060e-05
Epoch 460/500
1/1 [==============================] - 0s 6ms/step - loss: 1.2965e-05
Epoch 461/500
1/1 [==============================] - 0s 6ms/step - loss: 2.6633e-08
Epoch 462/500
1/1 [==============================] - 0s 6ms/step - loss: 2.3047e-08
Epoch 463/500
1/1 [==============================] - 0s 35ms/step - loss: 2.9354e-05
Epoch 464/500
1/1 [==============================] - 0s 7ms/step - loss: 6.9772e-06
Epoch 465/500
1/1 [==============================] - 0s 7ms/step - loss: 4.4993e-06
Epoch 466/500
1/1 [==============================] - 0s 7ms/step - loss: 2.6705e-09
Epoch 467/500
1/1 [==============================] - 0s 7ms/step - loss: 1.0548e-05
Epoch 468/500
1/1 [==============================] - 0s 6ms/step - loss: 1.4305e-06
Epoch 469/500
1/1 [==============================] - 0s 6ms/step - loss: 1.2167e-05
Epoch 470/500
1/1 [==============================] - 0s 7ms/step - loss: 2.9186e-06
Epoch 471/500
1/1 [==============================] - 0s 6ms/step - loss: 7.5564e-07
Epoch 472/500
1/1 [==============================] - 0s 6ms/step - loss: 2.4575e-05
Epoch 473/500
1/1 [==============================] - 0s 5ms/step - loss: 7.0291e-07
Epoch 474/500
1/1 [==============================] - 0s 6ms/step - loss: 1.8626e-05
Epoch 475/500
1/1 [==============================] - 0s 6ms/step - loss: 7.2234e-06
Epoch 476/500
1/1 [==============================] - 0s 6ms/step - loss: 1.2721e-05
Epoch 477/500
1/1 [==============================] - 0s 6ms/step - loss: 5.6278e-07
Epoch 478/500
1/1 [==============================] - 0s 6ms/step - loss: 2.9541e-06
Epoch 479/500
1/1 [==============================] - 0s 6ms/step - loss: 5.4962e-07
Epoch 480/500
1/1 [==============================] - 0s 6ms/step - loss: 5.1199e-06
Epoch 481/500
1/1 [==============================] - 0s 6ms/step - loss: 1.6929e-07
Epoch 482/500
1/1 [==============================] - 0s 6ms/step - loss: 4.5590e-07
Epoch 483/500
1/1 [==============================] - 0s 6ms/step - loss: 3.1773e-05
Epoch 484/500
1/1 [==============================] - 0s 7ms/step - loss: 5.4670e-06
Epoch 485/500
1/1 [==============================] - 0s 6ms/step - loss: 2.0676e-06
Epoch 486/500
1/1 [==============================] - 0s 6ms/step - loss: 3.1195e-06
Epoch 487/500
1/1 [==============================] - 0s 7ms/step - loss: 5.2242e-06
Epoch 488/500
1/1 [==============================] - 0s 6ms/step - loss: 4.4828e-07
Epoch 489/500
1/1 [==============================] - 0s 6ms/step - loss: 2.6785e-06
Epoch 490/500
1/1 [==============================] - 0s 6ms/step - loss: 2.5152e-07
Epoch 491/500
1/1 [==============================] - 0s 6ms/step - loss: 1.9737e-08
Epoch 492/500
1/1 [==============================] - 0s 6ms/step - loss: 2.7381e-07
Epoch 493/500
1/1 [==============================] - 0s 6ms/step - loss: 1.0329e-07
Epoch 494/500
1/1 [==============================] - 0s 6ms/step - loss: 1.9879e-05
Epoch 495/500
1/1 [==============================] - 0s 6ms/step - loss: 3.0817e-06
Epoch 496/500
1/1 [==============================] - 0s 6ms/step - loss: 1.2209e-05
Epoch 497/500
1/1 [==============================] - 0s 6ms/step - loss: 3.6378e-07
Epoch 498/500
1/1 [==============================] - 0s 6ms/step - loss: 2.3634e-05
Epoch 499/500
1/1 [==============================] - 0s 6ms/step - loss: 1.5354e-05
Epoch 500/500
1/1 [==============================] - 0s 5ms/step - loss: 1.4438e-07
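The verbose epoch-by-epoch log above adds little. A quieter alternative is to silence fit and plot the loss curve from the returned History object instead. The sketch below assumes the same model and training arrays used in the fit above; the variable names X_b_train and y_b_train are placeholders for whatever the earlier cell passed in.
In [ ]:
# Hypothetical quieter version of the fit above: suppress the per-epoch log
# and plot the training loss curve from the History object that fit returns.
# X_b_train / y_b_train are placeholder names for the earlier training data.
history = model.fit(X_b_train, y_b_train, epochs=500, verbose=0)
plt.plot(history.history["loss"])
plt.yscale("log")  # the losses span several orders of magnitude
plt.xlabel("Epoch")
plt.ylabel("Training loss (MSE)")
plt.show()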
In [45]:
y_b_test_pred.shape, y_b_test.shape
Out[45]:
((69, 1), (74, 1))
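The prediction array has 5 fewer rows than the test series because the first n_input observations are consumed as the look-back window before the first prediction can be made (74 - 69 = 5, so n_input is presumably 5 here). A quick sanity check:
In [ ]:
# Sanity check: predictions start only after the first n_input observations,
# so the prediction array should be exactly n_input rows shorter.
assert y_b_test_pred.shape[0] == y_b_test.shape[0] - n_input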
In [46]:
metrics.mean_squared_error(y_b_test[n_input:] * 10.0, y_b_test_pred * 10.0) ** 0.5
Out[46]:
0.02561094861964438
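Taking the square root of the mean squared error gives the RMSE; the factor of 10.0 presumably undoes a scaling applied when the series was prepared earlier, so the error is reported on the rescaled values. A natural follow-up, assuming y_b_test[n_input:] and y_b_test_pred are the aligned (69, 1) arrays checked above, is to overlay predictions on actuals:
In [ ]:
# Overlay predicted vs. actual values on the test window.
# Assumes y_b_test[n_input:] and y_b_test_pred are aligned (69, 1) arrays.
plt.plot(y_b_test[n_input:], label="actual")
plt.plot(y_b_test_pred, label="predicted")
plt.legend()
plt.show()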
Content source: abulbasar/machine-learning