Momentum

Still Relevant Today?

QuantCon Singapore 2016
Anthony Ng
anthony@gmail.com
11 Nov 2016

Disclaimer

  • None of the contents constitute an offer to sell, a solicitation to buy, or a recommendation or endorsement for any security or strategy, nor does it constitute an offer to provide investment advisory services
  • Past performance is not indicative of future results
  • Provided for informational purposes only
  • All investments involve risk, including loss of principal

Momentum

Approach and Data

In this notebook, we aim to replicate the Time Series Momentum (TSMOM) paper by Moskowitz, Ooi and Pedersen (MOP). We examine TSMOM performance using the same futures contracts and time period (1984 - 2009), and then extend the testing period through Oct 2016. As a benchmark, we also examine buy-and-hold strategies, with and without volatility scaling, following Kim, Tse and Wald's Time Series Momentum and Volatility Scaling (2016) paper. Data were sourced from Bloomberg:

  • 9 developed-market equity index futures
  • 24 commodity futures
  • 13 bond futures
  • 9 currency futures
  • Jul 1959 - Oct 2016

Volatility Scaling

In the MOP paper (section 2.4), the authors discuss using an ex ante volatility estimate to scale the capital allocated to each futures contract based on that contract's volatility. The basic idea is very similar to risk budgeting or risk parity. The following is their explanation:

...Since volatility varies dramatically across our assets, we scale the returns by their volatilities in order to make meaningful comparisons across assets. We estimate each instrument's ex ante volatility $\sigma_t$ at each point in time using an extremely simple model: the exponential weighted lagged squared daily returns (i.e., similar to a simple univariate GARCH model). Specifically, the ex ante annualized variance $\sigma^2_t$ for each instrument is calculated as follows: $$\sigma^2_t=261\sum^\infty_{i=0}(1-\delta)\delta^i(r_{t-1-i}-\bar{r}_t)^2$$

where the scalar 261 scales the variance to be annual, the weights $(1-\delta)\delta^i$ add up to one, and $\bar{r}_t$ is the exponentially weighted average return computed similarly. The parameter $\delta$ is chosen so that the center of mass of the weights is $\sum^{\infty}_{i=0}(1-\delta)\delta^ii=\delta/(1-\delta)=60$ days. The volatility model is the same for all assets at all times...
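In pandas, this estimate maps directly onto ewm: a 60-day center of mass means $\delta/(1-\delta)=60$, i.e. com=60, and the daily EWMA standard deviation is annualized with $\sqrt{261}$. A minimal sketch (the helper name est_vol and the synthetic series are ours, not from the paper):

import numpy as np
import pandas as pd

def est_vol(returns, com=60, trading_days=261):
    # exponentially weighted daily vol, annualized (MOP's ex ante estimate)
    return returns.ewm(com=com).std() * np.sqrt(trading_days)

rets = pd.Series(np.random.normal(0, 0.01, 500),
                 index=pd.bdate_range('2000-01-03', periods=500))
ann_vol = est_vol(rets)   # ends near 0.16 for 1% daily vol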


In [1]:
import numpy as np
import pandas as pd
import datetime
import pyfolio as pf
import matplotlib.pyplot as plt
import matplotlib
import seaborn as sns
import pytz

In [2]:
tolerance = 0.   # momentum signal threshold (sign of the trailing return)
look_back = 12   # look-back window in months

# Vol scaling
vol_flag = 1                  # Set flag to 1 for vol targeting
if vol_flag == 1:
    target_vol = 0.4
else:
    target_vol = 'no target vol'  # placeholder string, used only in plot titles

1. Trading Strategy - TSMOM with Volatility Scaling (1984 - 2009)

As per MOP paper covering 1984 - 2009

1a. Individual Futures Performance


In [3]:
res = local_csv("futures.csv")
# res = local_csv("futures_incl_2016.csv") # Uncomment this line to include 2016
res['Date'] = pd.to_datetime(res['Date'], format='%Y-%m-%d')
res.set_index('Date', inplace=True)

In [4]:
std_index = res.resample('BM').last().index
mth_index = pd.DataFrame(index=std_index)
mth_index_vol = pd.DataFrame(index=std_index)
summary_stats = pd.DataFrame(index=['Asset', 'Start', 'Mean', 'Std', \
                                    'Skew', 'Kurt', 'Sharpe Ratio'])

In [5]:
for oo in res.columns:
    returns = res[oo]
    returns.dropna(inplace=True)
    
    first_date = returns.index[0].strftime("%Y-%m-%d")   # store this to show when data series starts
    
    ret_index = (1 + returns).cumprod()
    ret_index.iloc[0] = 1   # start the return index at 1
    
    # equation (1) ex ante vol estimate
    day_vol = returns.ewm(ignore_na=False,
                          adjust=True,
                          com=60,   
                          min_periods=0).std(bias=False)
    vol = day_vol * np.sqrt(261) # annualise
    
    ret_index = pd.concat([ret_index, vol], axis=1)
    ret_index.columns = [oo, 'vol']
    
    # convert to monthly
    ret_m_index = ret_index.resample('BM').last().ffill()
    ret_m_index.iloc[0, 0] = 1   # column 0 is this contract's return index

    mth_index = pd.concat([mth_index, ret_m_index[oo]], axis=1)
    tmp = ret_m_index['vol']
    tmp.name = oo + "_Vol"
    mth_index_vol = pd.concat([mth_index_vol, tmp], axis=1)

    tmp_mean = ret_index[oo].pct_change().mean()*252
    tmp_std = ret_index[oo].pct_change().std()*np.sqrt(252)
    tmp_skew = ret_index[oo].pct_change().skew()
    tmp_kurt = ret_index[oo].pct_change().kurt()
    sr = tmp_mean / tmp_std

    stats = {'Asset': oo,
             'Start': first_date,
             'Mean': np.round(tmp_mean, 4),
             'Std':  np.round(tmp_std, 4),
             'Skew': np.round(tmp_skew, 4),
             'Kurt': np.round(tmp_kurt, 4),
             'Sharpe Ratio': np.round(sr, 4),
             }
    summary_stats[oo] = pd.Series(stats)

In [6]:
summary_stats = summary_stats.transpose()
futures_list = local_csv("futures_list.csv")
all_stats = summary_stats.reset_index().merge(futures_list)   # avoid shadowing the built-in all
all_stats.sort_values(by=["ASSET_CLASS", "FUTURES"], inplace=True)
del all_stats['Asset'], all_stats['index']

Individual Futures Contracts Performance

These are annualized performance statistics for each contract (mean and standard deviation annualized using 252 trading days).


In [7]:
all_stats.set_index(['ASSET_CLASS', 'FUTURES']).style.set_properties(**{'text-align': 'right'})


Out[7]:
Start Mean Std Skew Kurt Sharpe Ratio
ASSET_CLASS FUTURES
BOND AUSTRALIA 10-YEAR BOND 1987-09-22 -0.0244 0.0137 -0.3368 4.2927 -1.7838
BOND AUSTRALIA 3-YEAR BOND 1989-12-04 -0.0183 0.0137 -0.2034 3.8708 -1.3344
BOND CANADA 10-YEAR BOND 1989-09-19 0.0098 0.064 -0.2109 2.3455 0.1538
BOND EURO BOBL 1991-10-08 0.0016 0.0339 -0.2887 1.8962 0.0484
BOND EURO BUND 1990-11-27 0.0098 0.0529 -0.2032 1.7357 0.1861
BOND EURO BUXL 1998-10-06 0.0038 0.0997 -0.1955 2.0124 0.0382
BOND EURO SCHATZ 1997-03-11 -0.0134 0.0146 -0.3324 3.6816 -0.9156
BOND JAPAN 10-YEAR BOND 1985-10-22 0.0074 0.0542 -0.5901 8.9518 0.136
BOND LONG GILT 1984-01-03 -0.0127 0.0755 0.0872 3.7265 -0.1688
BOND US 10-YEAR NOTE 1984-01-03 0.0156 0.0689 0.076 3.2148 0.2269
BOND US 2-YEAR NOTE 1990-06-27 -0.0081 0.0185 -0.0891 4.0655 -0.4388
BOND US 5-YEAR NOTE 1988-05-24 0.0012 0.0431 -0.0979 2.7085 0.0284
BOND US LONG BOND 1984-01-03 0.0274 0.1034 -0.0237 1.9634 0.2646
COMMODITIES ALUMINIUM 1997-07-25 -0.0217 0.2131 -0.1772 2.471 -0.1016
COMMODITIES BRENT CRUDE 1988-06-27 0.1729 0.3557 -0.4146 11.5905 0.486
COMMODITIES COCOA 1984-01-03 -0.039 0.3035 0.1705 2.5969 -0.1285
COMMODITIES COFFEE 1984-01-03 -0.0273 0.3724 0.5869 9.6552 -0.0733
COMMODITIES COPPER 1997-07-23 0.1133 0.2807 0.0752 4.565 0.4035
COMMODITIES CORN 1984-01-03 -0.0908 0.2332 0.0768 2.9448 -0.3892
COMMODITIES COTTON 1984-01-03 -0.0294 0.2484 0.1078 2.5545 -0.1183
COMMODITIES GASOIL 1989-07-05 0.1447 0.3357 -0.3526 10.7393 0.431
COMMODITIES GOLD 1984-01-03 -0.0255 0.1615 0.1759 7.768 -0.1577
COMMODITIES HEATING OIL 1986-07-02 0.1641 0.3659 -0.3195 9.5853 0.4483
COMMODITIES LEAN HOGS 1986-04-03 -0.0217 0.2333 -0.0693 1.2697 -0.0929
COMMODITIES LIVE CATTLE 1984-01-03 0.0026 0.1446 -0.0978 1.3376 0.0178
COMMODITIES NATURAL GAS 1990-04-05 -0.0052 0.5312 0.5047 5.5002 -0.0097
COMMODITIES NICKEL 1997-07-25 0.1594 0.3935 0.1149 3.363 0.405
COMMODITIES PLATINUM 1984-01-30 0.0353 0.2315 -0.044 7.4779 0.1523
COMMODITIES RBOB GASOLINE 2005-10-05 0.0703 0.4422 -0.069 1.8821 0.1589
COMMODITIES SILVER 1984-01-03 -0.0217 0.279 -0.3456 6.2958 -0.0779
COMMODITIES SOY MEAL 1984-01-03 0.0533 0.2439 -0.031 2.5715 0.2186
COMMODITIES SOY OIL 1984-01-03 -0.0333 0.2397 0.2251 1.8114 -0.1388
COMMODITIES SOYBEANS 1984-01-03 -0.0084 0.2277 -0.1572 2.628 -0.0368
COMMODITIES SUGAR 1984-01-03 0.0181 0.3685 -0.0212 4.0372 0.0492
COMMODITIES WHEAT 1984-01-03 -0.0603 0.2573 0.1473 2.7786 -0.2345
COMMODITIES WTI CRUDE 1984-01-03 0.1346 0.3759 -0.3765 10.9173 0.358
COMMODITIES ZINC 1997-07-25 0.0092 0.3078 -0.086 3.3653 0.03
CURRENCIES AUSTRALIAN DOLLAR 1987-01-14 0.0146 0.1197 -0.6075 11.0146 0.1223
CURRENCIES CANADIAN DOLLAR 1986-04-07 -0.0082 0.0724 0.0903 8.6474 -0.1137
CURRENCIES EURO 1998-05-21 0.0021 0.1021 0.0266 1.3833 0.0201
CURRENCIES JAPANESE YEN 1986-05-28 -0.0301 0.1143 0.6129 6.7518 -0.2636
CURRENCIES NEW ZEALAND 1997-05-09 0.0167 0.1391 -0.2998 3.8182 0.12
CURRENCIES NORWAY 2002-05-20 0.0472 0.1346 -0.1743 2.9403 0.3508
CURRENCIES SWEDEN 2002-05-20 0.0367 0.1348 0.3729 5.9377 0.2724
CURRENCIES SWITZERLAND 1986-04-08 -0.0137 0.1186 0.1735 2.0546 -0.1157
CURRENCIES UK 1986-05-29 -0.0028 0.1015 -0.2066 3.498 -0.0278
EQUITY INDEXES AEX (NETHERLANDS) 1989-01-04 0.0085 0.25 -7.0447 238.363 0.034
EQUITY INDEXES DAX (GERMANY) 1990-11-27 0.0389 0.237 0.0231 6.0168 0.1642
EQUITY INDEXES FTSE 100 (UK) 1988-03-01 0.0093 0.1881 0.0019 5.7149 0.0497
EQUITY INDEXES FTSE/MIB (ITALY) 2004-03-24 0.001 0.2246 0.0756 8.9781 0.0044
EQUITY INDEXES IBEX 35 (SPAIN) 1992-07-02 0.0745 0.2373 -0.1767 4.3022 0.314
EQUITY INDEXES S&P 500 (US) 1997-09-11 -0.0035 0.2192 0.215 10.1467 -0.0159
EQUITY INDEXES SPI 200 (AUSTRALIA) 2000-05-04 0.0307 0.1674 -0.3216 6.2255 0.1834
EQUITY INDEXES TOPIX (JAPAN) 1990-05-21 -0.0553 0.2428 0.2691 9.5202 -0.2279

1b. Trading Strategy - TSMOM with Volatility Scaling (1984 - 2009)

Perform a Time Series Momentum strategy on each price series.

  • Long those with positive momentum
  • Short those with negative momentum
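The loop in In [9] below implements this rule one contract at a time. For intuition, the same unscaled signal can be written in vectorized form; this is a sketch using the mth_index frame built above, and it omits the volatility scaling and leverage bookkeeping the loop performs:

signal = np.sign(mth_index.pct_change(look_back))  # +1 long / -1 short, from the trailing 12-month return
monthly_ret = mth_index.pct_change()               # realized one-month returns
tsmom_pnl = signal.shift(1) * monthly_ret          # trade last month's signal over the current month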

In [8]:
pnl = pd.DataFrame(index=std_index)
leverage = pd.DataFrame(index=std_index)
strategy_cumm_rtns = pd.DataFrame(index=std_index)

In [9]:
for oo in mth_index:
    df = pd.concat([mth_index[oo], mth_index_vol[oo+"_Vol"]], axis=1)
    df['returns'] = df[oo].pct_change(look_back)
    
    df['pnl'] = 0.
    df['leverage'] = 0.
    try:
        for k, v in enumerate(df['returns']):
            if k <= look_back:
                # skip the warm-up period (no trailing 12-month return yet)
                continue
            if df['returns'].iloc[k-1] < tolerance:
                # negative returns, sell and hold for 1 mth, then close position
                if vol_flag == 1:
                    df['pnl'].iloc[k] = (df[oo].iloc[k - 1] / df[oo].iloc[k] - 1) * \
                                        target_vol / df[oo+"_Vol"].iloc[k - 1]
                    df['leverage'].iloc[k] = target_vol / df[oo+"_Vol"].iloc[k - 1]
                else:
                    df['pnl'].iloc[k] = (df[oo].iloc[k - 1] / df[oo].iloc[k] - 1)                   
                    df['leverage'].iloc[k] = 1.
            elif df['returns'].iloc[k-1] > tolerance:
                # positive returns, buy and hold for 1 mth, then close position
                if vol_flag == 1:
                    df['pnl'].iloc[k] = (df[oo].iloc[k] / df[oo].iloc[k - 1] - 1) * \
                                        target_vol / df[oo+"_Vol"].iloc[k - 1]
                    df['leverage'].iloc[k] = target_vol / df[oo+"_Vol"].iloc[k - 1]
                else:
                    df['pnl'].iloc[k] = (df[oo].iloc[k] / df[oo].iloc[k - 1] - 1) 
                    df['leverage'].iloc[k] = 1.
    except Exception:   # defensive: skip any contract that errors out
        pass
    # convert to cumulative index
    pnl = pd.concat([pnl, df['pnl']], axis=1)
    leverage = pd.concat([leverage, df['leverage']], axis=1)
    
    ret_index = (1 + df['pnl'].iloc[13:]).cumprod()
    ret_index.iloc[0] = 1
    strategy_cumm_rtns = pd.concat([strategy_cumm_rtns, ret_index], axis=1)

In [10]:
pnl.columns = res.columns
leverage.columns = res.columns
strategy_cumm_rtns.columns = res.columns
df = pnl
df['port_avg'] = df.mean(skipna=True, axis=1)   # equal-weighted average across contracts
Strategy = df['port_avg'].copy()
Strategy.name = "TSMOM with Vol"
dataport_index = (1 + df['port_avg']).cumprod()

In [11]:
print "Annualized Sharpe Ratio = ", pf.empyrical.sharpe_ratio(df['port_avg'], period='monthly')
print "Annualized Mean Returns = ", pf.empyrical.annual_return(df['port_avg'], period='monthly')
print "Annualized Standard Deviations = ", pf.empyrical.annual_volatility(df['port_avg'], period='monthly')


Annualized Sharpe Ratio =  1.55942862102
Annualized Mean Returns =  0.158205253154
Annualized Standard Deviations =  0.0977748556597
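A note on the arithmetic: empyrical's sharpe_ratio is the mean monthly return over the monthly standard deviation, scaled by $\sqrt{12}$, while annual_return compounds monthly returns; that is why the ratio of the two printed figures ($0.158/0.098 \approx 1.62$) does not exactly reproduce 1.56. By hand:

monthly = df['port_avg'].dropna()
print monthly.mean() / monthly.std() * np.sqrt(12)   # ~1.56, matching empyrical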

In [12]:
print "Max Drawdown = ", pf.empyrical.max_drawdown(df['port_avg'])
print "Calmar ratio = ", pf.empyrical.calmar_ratio(df['port_avg'], period='monthly')


Max Drawdown =  -0.140726202907
Calmar ratio =  1.12420608164
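As a sanity check, empyrical defines the Calmar ratio as the annualized return over the absolute maximum drawdown: $0.1582 / 0.1407 \approx 1.124$, matching the value above.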

Other Strategies

The performance of Time Series Momentum with volatility scaling (MOP strategy) up to and including Sep 2016 (not shown) is:

  • Annualized Sharpe Ratio = 1.32240147363
  • Annualized Mean Returns = 0.144648073277
  • Annualized Standard Deviations = 0.107003671777
  • Max Drawdown = -0.140726203354
  • Calmar ratio = 1.02786879649

The performance of Time Series Momentum without volatility scaling is:

  • Annualized Sharpe Ratio = 1.18250924383
  • Annualized Mean Returns = 0.0593476792819
  • Annualized Standard Deviations = 0.0499137056686
  • Max Drawdown = -0.125187506104
  • Calmar ratio = 0.474070305648

In [13]:
eastern = pytz.timezone('US/Eastern')
df['port_avg'].index = df['port_avg'].index.tz_localize(pytz.utc).tz_convert(eastern)   # pyfolio expects a tz-aware index
pf.create_full_tear_sheet(df['port_avg'])


Entire data start date: 1984-01-30
Entire data end date: 2009-12-30


Backtest Months: 14
Performance statistics Backtest
annual_return 20.85
annual_volatility 0.45
sharpe_ratio 7.15
calmar_ratio 148.18
stability_of_timeseries 0.96
max_drawdown -0.14
omega_ratio 3.37
sortino_ratio 15.12
skew 0.14
kurtosis 0.77
tail_ratio 1.75
common_sense_ratio 38.30
information_ratio 0.44
alpha 4.32
beta -0.09
Worst Drawdown Periods net drawdown in % peak date valley date recovery date duration
0 14.07 2003-05-29 2003-08-28 2003-12-30 153
1 13.54 2009-02-26 2009-07-30 NaT NaN
2 9.17 2008-06-29 2008-09-29 2008-10-30 89
3 7.82 2004-03-30 2004-07-29 2004-10-28 153
4 7.33 2007-06-28 2007-08-30 2007-10-30 89

Stress Events mean min max
Dotcom 2.44% -2.87% 5.45%
Lehmann -2.47% -3.74% -1.21%
9/11 9.52% 9.52% 9.52%
US Housing 5.28% 5.28% 5.28%
Aug07 -1.90% -1.90% -1.90%
Mar08 -1.04% -1.04% -1.04%
Sept08 -1.21% -1.21% -1.21%
2009Q1 2.17% 1.28% 3.07%
2009Q2 -4.35% -7.21% -1.88%
Low Volatility Bull Market 2.04% -5.54% 7.36%
GFC Crash 2.07% -4.99% 10.51%
Recovery -0.70% -7.21% 4.44%
(The interesting-times section of the tear sheet then aborts with KeyError: 'Dotcom', apparently because the benchmark series pyfolio fetches does not cover the stress-event windows; full traceback omitted.)

Further Analysis


In [14]:
pf.plot_drawdown_underwater(df['port_avg']);



In [15]:
ax = (1 + df['port_avg']).cumprod().plot(logy=True);
ax.set_title("Cummulative Excess Return, " + \
            "\ntarget vol = " + str(target_vol) + ", look back = " + \
             str(look_back) + " months");



In [16]:
tmp = df['port_avg'].reset_index()
tmp['Date'] = pd.to_datetime(tmp['Date'], format='%Y-%m-%d')
tmp = tmp.set_index('Date')
tmp['month'] = tmp.index.month
tmp['year'] = tmp.index.year
tmp = np.round(tmp, 3)
monthly_table = tmp.pivot('year', 'month', 'port_avg')   # new name so the raw futures frame res is not clobbered
monthly_table['total'] = np.sum(monthly_table, axis=1)

In [17]:
fig, ax = plt.subplots(figsize=(20,20));
sns.heatmap(monthly_table.fillna(0) * 100,
            annot=True,
            annot_kws={
                "size": 13},
            alpha=1.0,
            center=0.0,
            cbar=True,
            cmap=matplotlib.cm.PiYG,
            linewidths=.5,
            ax = ax); 
ax.set_ylabel('Year');
ax.set_xlabel('Month');
ax.set_title("Monthly Returns (%), " + \
            "\ntarget vol = " + str(target_vol) + ", look back = " + \
             str(look_back) + " months");
plt.show()


2. Buy-and-Hold Without Volatility Scaling Strategy (1984 - 2009)

This section benchmarks buy-and-hold strategies, with and without volatility scaling, following Kim, Tse and Wald's Time Series Momentum and Volatility Scaling (2016) paper.

The volatility-scaled version is not shown in this notebook. It can easily be reproduced by setting these two lines (see the sketch after this list):

  • vol_flag = 1
  • target_vol = 0.4
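Concretely, cell In [21] below would become (a sketch):

vol_flag = 1      # enable volatility scaling
target_vol = 0.4  # 40% annualized volatility target per contract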

With volatility scaling, the result is:

  • Annualized Sharpe Ratio = -0.0882482424146
  • Annualized Mean Returns = -0.022460936792
  • Annualized Standard Deviations = 0.142189626101
  • Max Drawdown = -0.793868253586
  • Calmar ratio = -0.0282930280818

In [18]:
res = local_csv("futures.csv")
res['Date'] = pd.to_datetime(res['Date'], format='%Y-%m-%d')
res.set_index('Date', inplace=True)
std_index = res.resample('BM').last().index
mth_index = pd.DataFrame(index=std_index)
mth_index_vol = pd.DataFrame(index=std_index)
summary_stats = pd.DataFrame(index=['Asset', 'Start', 'Mean', 'Std', 'Skew', 'Kurt', 'Sharpe Ratio'])

In [19]:
for oo in res.columns:
    returns = res[oo]
    returns.dropna(inplace=True)
    
    first_date = returns.index[0].strftime("%Y-%m-%d")   # store this to show when data series starts
    
    ret_index = (1 + returns).cumprod()
    ret_index.iloc[0] = 1   # start the return index at 1
    
    # equation (1) ex ante vol estimate
    day_vol = returns.ewm(ignore_na=False,
                          adjust=True,
                          com=60,   
                          min_periods=0).std(bias=False)
    vol = day_vol * np.sqrt(261) # annualise
    
    ret_index = pd.concat([ret_index, vol], axis=1)
    ret_index.columns = [oo, 'vol']
    
    # convert to monthly
    ret_m_index = ret_index.resample('BM').last().ffill()
    ret_m_index.iloc[0, 0] = 1   # column 0 is this contract's return index

    mth_index = pd.concat([mth_index, ret_m_index[oo]], axis=1)
    tmp = ret_m_index['vol']
    tmp.name = oo + "_Vol"
    mth_index_vol = pd.concat([mth_index_vol, tmp], axis=1)

    tmp_mean = ret_index[oo].pct_change().mean()*252
    tmp_std = ret_index[oo].pct_change().std()*np.sqrt(252)
    tmp_skew = ret_index[oo].pct_change().skew()
    tmp_kurt = ret_index[oo].pct_change().kurt()
    sr = tmp_mean / tmp_std

    stats = {'Asset': oo,
             'Start': first_date,
             'Mean': np.round(tmp_mean, 4),
             'Std':  np.round(tmp_std, 4),
             'Skew': np.round(tmp_skew, 4),
             'Kurt': np.round(tmp_kurt, 4),
             'Sharpe Ratio': np.round(sr, 4),
             }
    summary_stats[oo] = pd.Series(stats)

In [20]:
summary_stats = summary_stats.transpose()
futures_list = local_csv("futures_list.csv")
all_stats = summary_stats.reset_index().merge(futures_list)   # avoid shadowing the built-in all
all_stats.sort_values(by=["ASSET_CLASS", "FUTURES"], inplace=True)
del all_stats['Asset'], all_stats['index']
pnl = pd.DataFrame(index=std_index)
leverage = pd.DataFrame(index=std_index)
strategy_cumm_rtns = pd.DataFrame(index=std_index)

In [21]:
vol_flag = 0  # set the flag to 1 to volatility-scale the strategy
#target_vol = 0.4

In [22]:
for oo in mth_index:
    df = pd.concat([mth_index[oo], mth_index_vol[oo+"_Vol"]], axis=1)
    df['returns'] = df[oo].pct_change(look_back)
    
    df['pnl'] = 0.
    df['leverage'] = 0.
    try:
        for k, v in enumerate(df['returns']):
            if k <= look_back:
                # skip the warm-up period (no trailing 12-month return yet)
                continue
            if vol_flag == 1:
                df['pnl'].iloc[k] = (df[oo].iloc[k] / df[oo].iloc[k - 1] - 1) * \
                                    target_vol / df[oo+'_Vol'].iloc[k - 1]
                df['leverage'].iloc[k] = target_vol / df[oo+'_Vol'].iloc[k - 1]
            else:
                df['pnl'].iloc[k] = (df[oo].iloc[k] / df[oo].iloc[k - 1] - 1)                   
                df['leverage'].iloc[k] = 1.
    except Exception:   # defensive: skip any contract that errors out
        pass
    # convert to cumulative index
    pnl = pd.concat([pnl, df['pnl']], axis=1)
    leverage = pd.concat([leverage, df['leverage']], axis=1)
    
    ret_index = (1 + df['pnl'].iloc[13:]).cumprod()
    ret_index.iloc[0] = 1
    strategy_cumm_rtns = pd.concat([strategy_cumm_rtns, ret_index], axis=1)

In [23]:
pnl.columns = res.columns
leverage.columns = res.columns
strategy_cumm_rtns.columns = res.columns
df = pnl
df['port_avg'] = df.mean(skipna=True, axis=1)   # equal-weighted average across contracts

temp = df['port_avg'].copy()
temp.name = "Buy_Hold No Vol"
temp.index = temp.index.tz_localize(pytz.utc).tz_convert(eastern)
Strategy.index = Strategy.index.tz_localize(pytz.utc).tz_convert(eastern)
Strategy = pd.concat([Strategy, temp], axis=1)

dataport_index = (1 + df['port_avg']).cumprod()

In [24]:
print "Annualized Sharpe Ratio = ", pf.empyrical.sharpe_ratio(df['port_avg'], period='monthly')
print "Annualized Mean Returns = ", pf.empyrical.annual_return(df['port_avg'], period='monthly')
print "Annualized Standard Deviations = ", pf.empyrical.annual_volatility(df['port_avg'], period='monthly')


Annualized Sharpe Ratio =  0.196406771186
Annualized Mean Returns =  0.0115582210137
Annualized Standard Deviations =  0.0718082068149

In [25]:
print "Max Drawdown = ", pf.empyrical.max_drawdown(df['port_avg'])
print "Calmar ratio = ", pf.empyrical.calmar_ratio(df['port_avg'], period='monthly')


Max Drawdown =  -0.320576124342
Calmar ratio =  0.0360545285069

In [26]:
eastern = pytz.timezone('US/Eastern')
df['port_avg'].index = df['port_avg'].index.tz_localize(pytz.utc).tz_convert(eastern)
pf.create_full_tear_sheet(df['port_avg'])


Entire data start date: 1984-01-30
Entire data end date: 2009-12-30


Backtest Months: 14
Performance statistics Backtest
annual_return 0.27
annual_volatility 0.33
sharpe_ratio 0.90
calmar_ratio 0.85
stability_of_timeseries 0.40
max_drawdown -0.32
omega_ratio 1.17
sortino_ratio 1.25
skew -0.96
kurtosis 6.51
tail_ratio 1.09
common_sense_ratio 1.39
information_ratio 0.16
alpha 1.04
beta -0.36
Worst Drawdown Periods net drawdown in % peak date valley date recovery date duration
0 32.06 2008-06-29 2009-02-26 NaT NaN
1 3.94 2006-04-27 2006-09-28 2006-11-29 154
2 3.93 2003-02-27 2003-03-30 2003-05-29 66
3 3.83 2008-02-28 2008-03-30 2008-06-29 87
4 2.63 2006-11-29 2007-01-30 2007-04-29 108

Stress Events mean min max
Dotcom 0.47% -1.82% 2.90%
Lehmann -5.52% -7.75% -3.30%
9/11 -5.01% -5.01% -5.01%
US Housing 2.76% 2.76% 2.76%
Aug07 -1.99% -1.99% -1.99%
Mar08 -3.83% -3.83% -3.83%
Sept08 -7.75% -7.75% -7.75%
2009Q1 -3.00% -3.23% -2.76%
2009Q2 4.62% 2.90% 7.41%
Low Volatility Bull Market 0.41% -2.65% 4.00%
GFC Crash -1.13% -13.77% 5.88%
Recovery 2.34% -0.88% 7.41%
(As before, the interesting-times section aborts with KeyError: 'Dotcom'; full traceback omitted.)

Factor Analysis


In [27]:
import statsmodels.api as sm

TSMOM Volatility Scaled (1984 - 2009) [Same as MOP paper]


In [28]:
TSMOM = Strategy.reset_index()[["Date", "TSMOM with Vol"]]
TSMOM = TSMOM.set_index("Date").tz_convert(None)
TSMOM = TSMOM.reset_index()

In [29]:
df = local_csv("factors.csv")
df["Date"] = pd.to_datetime(df['Date'], format='%Y-%m-%d')
data = df.merge(TSMOM)

In [30]:
data = data[['Date', 'SMB', 'HML', 'Mom', 'bond_index', 'equity_index', \
             'commodity_index', 'TSMOM with Vol']].copy()
data.columns = ['Date', 'SMB', 'HML', 'MOM', 'BOND', 'EQUITY', 'COMMODITY', 'PORTFOLIO']
data = data.dropna()
data = data.set_index("Date")
data = data.reset_index()

In [31]:
X = data[['SMB', 'HML', 'MOM', 'BOND', 'EQUITY', 'COMMODITY']].copy()
X = sm.add_constant(X)
model = sm.OLS(data['PORTFOLIO'].astype(float), X).fit()
print(model.summary())


                            OLS Regression Results                            
==============================================================================
Dep. Variable:              PORTFOLIO   R-squared:                       0.040
Model:                            OLS   Adj. R-squared:                  0.022
Method:                 Least Squares   F-statistic:                     2.140
Date:                Tue, 22 Nov 2016   Prob (F-statistic):             0.0487
Time:                        10:50:11   Log-Likelihood:                 677.30
No. Observations:                 312   AIC:                            -1341.
Df Residuals:                     305   BIC:                            -1314.
Df Model:                           6                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
const          0.0120      0.002      6.797      0.000         0.009     0.016
SMB            0.0016      0.002      0.714      0.476        -0.003     0.006
HML            0.0031      0.003      0.968      0.334        -0.003     0.009
MOM            0.0045      0.003      1.571      0.117        -0.001     0.010
BOND           0.0806      0.125      0.646      0.519        -0.165     0.326
EQUITY        -0.1030      0.037     -2.755      0.006        -0.177    -0.029
COMMODITY      0.0220      0.029      0.756      0.450        -0.035     0.079
==============================================================================
Omnibus:                        3.675   Durbin-Watson:                   1.698
Prob(Omnibus):                  0.159   Jarque-Bera (JB):                4.230
Skew:                           0.073   Prob(JB):                        0.121
Kurtosis:                       3.551   Cond. No.                         82.5
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.

Buy-and-Hold without Vol (1984 - 2009)


In [32]:
BH_no_V = Strategy.reset_index()[["Date", "Buy_Hold No Vol"]]
BH_no_V = BH_no_V.set_index("Date").tz_convert(None)
BH_no_V = BH_no_V.reset_index()

In [33]:
data = df.merge(BH_no_V)
data = data[['Date', 'SMB', 'HML', 'Mom', 'bond_index', 'equity_index', \
             'commodity_index', 'Buy_Hold No Vol']].copy()
data.columns = ['Date', 'SMB', 'HML', 'MOM', 'BOND', 'EQUITY', 'COMMODITY', 'PORTFOLIO']
data = data.dropna()
data = data.set_index("Date")
data = data.reset_index()

In [34]:
X = data[['SMB', 'HML', 'MOM', 'BOND', 'EQUITY', 'COMMODITY']].copy()
X = sm.add_constant(X)
model = sm.OLS(data['PORTFOLIO'].astype(float), X).fit()
print(model.summary())


                            OLS Regression Results                            
==============================================================================
Dep. Variable:              PORTFOLIO   R-squared:                       0.670
Model:                            OLS   Adj. R-squared:                  0.663
Method:                 Least Squares   F-statistic:                     103.1
Date:                Tue, 22 Nov 2016   Prob (F-statistic):           2.14e-70
Time:                        10:50:11   Log-Likelihood:                 940.05
No. Observations:                 312   AIC:                            -1866.
Df Residuals:                     305   BIC:                            -1840.
Df Model:                           6                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
const         -0.0006      0.001     -0.763      0.446        -0.002     0.001
SMB        -4.451e-05      0.001     -0.046      0.963        -0.002     0.002
HML           -0.0008      0.001     -0.579      0.563        -0.004     0.002
MOM           -0.0003      0.001     -0.277      0.782        -0.003     0.002
BOND           0.1423      0.054      2.648      0.009         0.037     0.248
EQUITY         0.1713      0.016     10.630      0.000         0.140     0.203
COMMODITY      0.2440      0.013     19.437      0.000         0.219     0.269
==============================================================================
Omnibus:                        8.177   Durbin-Watson:                   1.862
Prob(Omnibus):                  0.017   Jarque-Bera (JB):               12.672
Skew:                           0.132   Prob(JB):                      0.00177
Kurtosis:                       3.951   Cond. No.                         82.5
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.

Summary

The buy-and-hold strategy without volatility scaling loads positively on the bond, equity and commodity indices, and the p-values show that these loadings are statistically significant.


Comparing TSMOM with S&P 500


In [35]:
SPX = df[["Date", "spx"]].copy()
SPX["Date"] = pd.to_datetime(SPX['Date'], format='%Y-%m-%d')

In [36]:
TSMOM = Strategy.reset_index()[["Date", "TSMOM with Vol"]].dropna()
TSMOM = TSMOM.set_index("Date").tz_convert(None)
TSMOM = TSMOM.reset_index()
#TSMOM["Date"] = pd.to_datetime(TSMOM['Date'], format='%Y-%m-%d')

In [38]:
comb = TSMOM.merge(SPX)
sns.regplot(x="spx", y="TSMOM with Vol", data=comb, order=2);



In [39]:
X = comb['spx'].copy()
X = sm.add_constant(X)
model = sm.OLS(comb['TSMOM with Vol'].astype(float), X).fit()
print(model.summary())


                            OLS Regression Results                            
==============================================================================
Dep. Variable:         TSMOM with Vol   R-squared:                       0.030
Model:                            OLS   Adj. R-squared:                  0.027
Method:                 Least Squares   F-statistic:                     9.535
Date:                Tue, 22 Nov 2016   Prob (F-statistic):            0.00220
Time:                        10:51:01   Log-Likelihood:                 675.59
No. Observations:                 312   AIC:                            -1347.
Df Residuals:                     310   BIC:                            -1340.
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
const          0.0132      0.002      8.319      0.000         0.010     0.016
spx           -0.1103      0.036     -3.088      0.002        -0.181    -0.040
==============================================================================
Omnibus:                        3.401   Durbin-Watson:                   1.718
Prob(Omnibus):                  0.183   Jarque-Bera (JB):                3.959
Skew:                           0.024   Prob(JB):                        0.138
Kurtosis:                       3.550   Cond. No.                         22.7
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.

Note the positive alpha (0.0132 per month, roughly $(1.0132)^{12}-1 \approx 17\%$ annualized) and the significant negative coefficient on the S&P 500.


In [40]:
X = comb['spx'].copy()
X = X ** 2                  # squared market return
Y = comb['TSMOM with Vol']
Y = Y ** 2                  # squared strategy return
model = sm.OLS(Y, X).fit()  # no constant: regression through the origin
print(model.summary())


                            OLS Regression Results                            
==============================================================================
Dep. Variable:         TSMOM with Vol   R-squared:                       0.165
Model:                            OLS   Adj. R-squared:                  0.163
Method:                 Least Squares   F-statistic:                     61.66
Date:                Tue, 22 Nov 2016   Prob (F-statistic):           6.67e-14
Time:                        10:51:02   Log-Likelihood:                 1553.1
No. Observations:                 312   AIC:                            -3104.
Df Residuals:                     311   BIC:                            -3101.
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
spx            0.1667      0.021      7.852      0.000         0.125     0.208
==============================================================================
Omnibus:                      151.806   Durbin-Watson:                   1.512
Prob(Omnibus):                  0.000   Jarque-Bera (JB):             1359.979
Skew:                           1.784   Prob(JB):                    4.83e-296
Kurtosis:                      12.586   Cond. No.                         1.00
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.

Straddle-like Payoff

In the MOP paper, the authors state that the returns to TSMOM are largest during the biggest up and down market movements. In addition, the coefficient on the market return squared is significantly positive, indicating that TSMOM delivers its highest profits during the most extreme market movements. The results above concur with their findings.

The takeaway is that TSMOM has a payoff profile similar to an option straddle on the market.
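For comparison, MOP's own test regresses the strategy return (not its square) on the market return and the squared market return; a sketch of that specification using the comb frame built above:

X = sm.add_constant(pd.DataFrame({'spx': comb['spx'],
                                  'spx_sq': comb['spx'] ** 2}))
model = sm.OLS(comb['TSMOM with Vol'].astype(float), X).fit()
print(model.summary())   # a significantly positive spx_sq coefficient is the straddle-like signature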

Conclusion

The benchmarks used for the factor analysis are:

  • EQUITY - MSCI World equity index
  • BOND - Barclay's Aggregate Bond Index
  • COMMODITY - S&P GSCI Index
  • SMB - Size
  • HML - Value vs growth
  • MOM - Cross sectional Momentum

The SMB, HML, MOM data are from Kenneth French's data library; the rest of the data are from Bloomberg, after adjusting for the risk-free rate.
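For reference, a sketch of pulling the French factors directly, assuming pandas_datareader is installed (dataset names per its famafrench reader; this notebook instead reads them pre-merged from factors.csv):

from pandas_datareader import data as web

ff3 = web.DataReader('F-F_Research_Data_Factors', 'famafrench')[0]  # monthly Mkt-RF, SMB, HML, RF
mom = web.DataReader('F-F_Momentum_Factor', 'famafrench')[0]        # monthly momentum factor
factors = ff3.join(mom) / 100.0                                     # percent -> decimal returns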

The performance of TSMOM has been quite impressive, with a Sharpe ratio of 1.56 for 1984-2009 and 1.32 over 1984-2016. Without volatility scaling, the Sharpe ratio drops to 1.18. Compare this to the buy-and-hold strategy without and with volatility scaling, which generated Sharpe ratios of 0.196 and -0.088 respectively. On other metrics such as maximum drawdown, TSMOM with and without volatility scaling also outperformed.

Looking at the factor analysis, TSMOM with volatility scaling is negatively correlated with the MSCI World Index and uncorrelated with the other factors. The alpha is 1.2% per month. However, one must take this with a grain of salt, as the adjusted R-squared is only 0.02.

This presentation is for informational purposes only and does not constitute an offer to sell, a solicitation to buy, or a recommendation for any security; nor does it constitute an offer to provide investment advisory or other services by Quantopian, Inc. ("Quantopian"). Nothing contained herein constitutes investment advice or offers any opinion with respect to the suitability of any security, and any views expressed herein should not be taken as advice to buy, sell, or hold any security or as an endorsement of any security or company. In preparing the information contained herein, Quantopian, Inc. has not taken into account the investment needs, objectives, and financial circumstances of any particular investor. Any views expressed and data illustrated herein were prepared based upon information, believed to be reliable, available to Quantopian, Inc. at the time of publication. Quantopian makes no guarantees as to their accuracy or completeness. All information is subject to change and may quickly become unreliable for various reasons, including changes in market conditions or economic circumstances.