Parameter estimation in GLMMs

The lme4 package for R provides the glmer function to define and fit generalized linear mixed models (GLMMs). The MixedModels package for Julia provides a similar function called glmm. Because I am more familiar with glmm than with glmer these days, I will explain the algorithms using glmm for illustration.

This is a Jupyter notebook of generalized linear mixed models (GLMMs) fit to data from a perception study using the MixedModels package for Julia.

The notebook can be run by downloading version 0.6.0 or later of Julia and installing the IJulia and MixedModels packages. The Julia function to install a package is Pkg.add(), e.g.

Pkg.add("MixedModels")

When IJulia is installed you can start a notebook server in your browser by starting Julia in the usual way and, from within Julia, running

using IJulia
notebook()

Navigate to the location of this notebook file and select it to start the notebook.

An alternative is to log in to https://juliabox.com using one of the login options. From the File tab, upload the data and this notebook. Returning to the Jupyter tab and selecting the notebook should start it.

Once the notebook is running, Ctrl-Enter runs the current cell and Shift-Enter runs the current cell then moves to the next one. See Help -> Keyboard Shortcuts for other key sequences.

Making packages available

The Julia equivalent of R's library() function is the using directive.


In [1]:
using DataFrames, MixedModels, RData



Consider the VerbAgg data from the lme4 package. This is one of the data sets available in the test/dat.rda file for the MixedModels package.


In [2]:
const dat = convert(Dict{Symbol,DataFrame},
    load(Pkg.dir("MixedModels", "test", "dat.rda")))


Out[2]:
Dict{Symbol,DataFrames.DataFrame} with 61 entries:
  :bs10          => 1104×6 DataFrames.DataFrame…
  :Genetics      => 60×5 DataFrames.DataFrame…
  :Contraception => 1934×6 DataFrames.DataFrame…
  :Mmmec         => 354×6 DataFrames.DataFrame…
  :kb07          => 1790×10 DataFrames.DataFrame…
  :Rail          => 18×2 DataFrames.DataFrame…
  :KKL           => 53765×24 DataFrames.DataFrame…
  :Bond          => 21×3 DataFrames.DataFrame…
  :VerbAgg       => 7584×9 DataFrames.DataFrame…
  :ergoStool     => 36×3 DataFrames.DataFrame…
  :s3bbx         => 2449×6 DataFrames.DataFrame…
  :cake          => 270×5 DataFrames.DataFrame…
  :Cultivation   => 24×4 DataFrames.DataFrame…
  :Pastes        => 60×4 DataFrames.DataFrame…
  :Exam          => 4059×5 DataFrames.DataFrame…
  :Socatt        => 1056×9 DataFrames.DataFrame…
  :WWheat        => 60×3 DataFrames.DataFrame…
  :Pixel         => 102×5 DataFrames.DataFrame…
  :Arabidopsis   => 625×8 DataFrames.DataFrame…
  :TeachingII    => 96×14 DataFrames.DataFrame…
  :AvgDailyGain  => 32×6 DataFrames.DataFrame…
  :InstEval      => 73421×7 DataFrames.DataFrame…
  :Poems         => 275996×7 DataFrames.DataFrame…
  :d3            => 130418×5 DataFrames.DataFrame…
  :Hsb82         => 7185×8 DataFrames.DataFrame…
  ⋮              => ⋮

In [3]:
const verbagg = dat[:VerbAgg]


Out[3]:
      a  g  item         resp     id  b      s      m     r2
1    20  M  S1WantCurse  no        1  curse  other  want  N
2    11  M  S1WantCurse  no        2  curse  other  want  N
3    17  F  S1WantCurse  perhaps   3  curse  other  want  Y
4    21  F  S1WantCurse  perhaps   4  curse  other  want  Y
5    17  F  S1WantCurse  perhaps   5  curse  other  want  Y
6    21  F  S1WantCurse  yes       6  curse  other  want  Y
7    39  F  S1WantCurse  yes       7  curse  other  want  Y
8    21  F  S1WantCurse  no        8  curse  other  want  N
9    24  F  S1WantCurse  no        9  curse  other  want  N
10   16  F  S1WantCurse  yes      10  curse  other  want  Y
11   15  F  S1WantCurse  perhaps  11  curse  other  want  Y
12   18  F  S1WantCurse  yes      12  curse  other  want  Y
13   36  M  S1WantCurse  yes      13  curse  other  want  Y
14   22  M  S1WantCurse  yes      14  curse  other  want  Y
15   16  F  S1WantCurse  perhaps  15  curse  other  want  Y
16   18  F  S1WantCurse  perhaps  16  curse  other  want  Y
17   23  F  S1WantCurse  perhaps  17  curse  other  want  Y
18   16  F  S1WantCurse  perhaps  18  curse  other  want  Y
19   21  F  S1WantCurse  no       19  curse  other  want  N
20   25  F  S1WantCurse  perhaps  20  curse  other  want  Y
21   22  F  S1WantCurse  perhaps  21  curse  other  want  Y
22   15  F  S1WantCurse  yes      22  curse  other  want  Y
23   26  F  S1WantCurse  yes      23  curse  other  want  Y
24   13  F  S1WantCurse  perhaps  24  curse  other  want  Y
25   33  F  S1WantCurse  no       25  curse  other  want  N
26   17  F  S1WantCurse  no       26  curse  other  want  N
27   17  F  S1WantCurse  yes      27  curse  other  want  Y
28   22  F  S1WantCurse  yes      28  curse  other  want  Y
29   21  M  S1WantCurse  perhaps  29  curse  other  want  Y
30   17  F  S1WantCurse  yes      30  curse  other  want  Y
⋮     ⋮  ⋮  ⋮            ⋮        ⋮  ⋮      ⋮      ⋮     ⋮

At present the response for a Bernoulli GLMM must be a 0/1 floating-point vector, so we convert the factor r2.


In [4]:
verbagg[:r201] = Float64[r == "Y" for r in verbagg[:r2]]


Out[4]:
7584-element Array{Float64,1}:
 0.0
 0.0
 1.0
 1.0
 1.0
 1.0
 1.0
 0.0
 0.0
 1.0
 1.0
 1.0
 1.0
 ⋮  
 0.0
 1.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0

Details of evaluating the objective function

The glmm function generates, but does not fit, a GeneralizedLinearMixedModel object.


In [5]:
mdl = glmm(@formula(r201 ~ 1 + a + g + b + s + (1|id) + (1|item)),
    verbagg, Bernoulli());
typeof(mdl)


Out[5]:
MixedModels.GeneralizedLinearMixedModel{Float64}

A separate call to fit! is required to fit the model. This involves optimizing an objective function, the Laplace approximation to the deviance, with respect to the parameters, which are $\beta$, the fixed-effects coefficients, and $\theta$, the covariance parameters. The starting estimate for $\beta$ is determined by fitting a GLM to the fixed-effects part of the formula.
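For comparison, this starting estimate can be reproduced (up to convergence tolerances) by fitting the fixed-effects part of the formula directly as a GLM. This is an illustrative sketch, not one of the notebook's original cells; it assumes the GLM package is installed and uses its glm function with a Binomial family and logit link.

using GLM
gm0 = glm(@formula(r201 ~ 1 + a + g + b + s), verbagg, Binomial(), LogitLink())
coef(gm0)  # should agree closely with mdl.β shown below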


In [6]:
mdl.β


Out[6]:
6-element Array{Float64,1}:
  0.206053 
  0.0399404
  0.231317 
 -0.794186 
 -1.53919  
 -0.776656 

and the starting estimate for $\theta$, which is a vector of the two standard deviations of the random effects, is chosen to be


In [7]:
mdl.θ


Out[7]:
2-element Array{Float64,1}:
 1.0
 1.0

The Laplace approximation to the deviance requires determining the conditional modes of the random effects. These are the values that maximize the conditional density of the random effects, given the model parameters and the data. This is done using Penalized Iteratively Reweighted Least Squares (PIRLS). In most cases PIRLS is fast and stable. It is simply a penalized version of the IRLS algorithm used in fitting GLMs.
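To be more specific, write the random effects as $\mathbf{b} = \Lambda_\theta\,\mathbf{u}$ where $\mathbf{u} \sim \mathcal{N}(\mathbf{0}, \mathbf{I})$ is a spherical Gaussian, as in the lme4 formulation (a notational assumption here; it is not defined elsewhere in this notebook). For fixed $\beta$ and $\theta$, PIRLS minimizes the penalized deviance

$$\tilde{d}(\mathbf{u}) = d\bigl(\mathbf{y}, \boldsymbol{\mu}(\mathbf{u})\bigr) + \|\mathbf{u}\|^2$$

with respect to $\mathbf{u}$, where $d(\mathbf{y}, \boldsymbol{\mu})$ is the usual GLM deviance. Each iteration solves a weighted least squares problem with the current IRLS weights plus the penalty term $\|\mathbf{u}\|^2$; minimizing this penalized deviance is equivalent to maximizing the conditional density of the random effects.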

The distinction between the "fast" and "slow" algorithms in the MixedModels package (nAGQ=0 or nAGQ=1 in lme4) is whether the fixed-effects parameters, $\beta$, are optimized in PIRLS or in the nonlinear optimizer. In a call to the pirls! function the first argument is a GeneralizedLinearMixedModel, which is modified during the function call. (By convention, the names of such mutating functions end in ! as a warning to the user that they can modify an argument, usually the first argument.) The second and third arguments are optional logical values indicating if $\beta$ is to be varied and if verbose output is to be printed.


In [8]:
pirls!(mdl, true, true)


varyβ = true
obj₀ = 10210.8534389054
β = [0.206053, 0.0399404, 0.231317, -0.794186, -1.53919, -0.776656]
iter = 1
obj = 8301.483049027265
iter = 2
obj = 8205.604285133919
iter = 3
obj = 8201.896597466888
iter = 4
obj = 8201.848598910707
iter = 5
obj = 8201.848559060703
Out[8]:
8201.848559060703

In [9]:
LaplaceDeviance(mdl)


Out[9]:
8201.848559060703

In [10]:
mdl.β


Out[10]:
6-element Array{Float64,1}:
  0.218535 
  0.0514385
  0.290225 
 -0.979124 
 -1.95402  
 -0.979493 

In [11]:
mdl.θ # current values of the standard deviations of the random effects


Out[11]:
2-element Array{Float64,1}:
 1.0
 1.0

If the optimization with respect to $\beta$ is performed within PIRLS then the nonlinear optimization of the Laplace approximation to the deviance requires optimization with respect to $\theta$ only. This is the "fast" algorithm. Given a value of $\theta$, PIRLS is used to determine the conditional estimate of $\beta$ and the conditional modes of the random effects, b.
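Schematically, the objective seen by the nonlinear optimizer in the fast algorithm is a function of $\theta$ alone. The sketch below conveys the idea only; setθ! is a hypothetical helper for installing a candidate $\theta$, and the actual package code differs in detail.

# hypothetical sketch of the fast objective; setθ! is an assumed helper,
# while pirls! and LaplaceDeviance are the functions used in this notebook
function fastobjective(m, θ)
    setθ!(m, θ)          # install the candidate covariance parameters (assumed helper)
    pirls!(m, true)      # determine the conditional estimate of β and the modes of b
    LaplaceDeviance(m)   # objective value returned to the optimizer
end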


In [12]:
mdl.b # conditional modes of b


Out[12]:
2-element Array{Array{Float64,2},1}:
 [-0.600772 -1.93227 … -0.144554 -0.575224]
 [-0.186364 0.180552 … 0.282092 -0.221974] 

In [13]:
fit!(mdl, fast=true, verbose=true);


f_1: 8201.84856 [1.0, 1.0]
f_2: 8190.11782 [1.75, 1.0]
f_3: 8224.45098 [1.0, 1.75]
f_4: 9026.00391 [0.25, 1.0]
f_5: 8205.79378 [1.0, 0.25]
f_6: 8157.04103 [1.38583, 0.736457]
f_7: 8367.72422 [1.33715, 0.0]
f_8: 8170.28883 [1.41365, 1.11042]
f_9: 8158.82932 [1.27225, 0.762811]
f_10: 8161.93341 [1.40936, 0.868084]
f_11: 8156.30098 [1.32694, 0.721015]
f_12: 8156.11668 [1.32365, 0.714275]
f_13: 8156.00207 [1.31847, 0.708856]
f_14: 8155.75359 [1.32072, 0.701702]
f_15: 8155.27522 [1.32636, 0.687802]
f_16: 8154.41 [1.33859, 0.660408]
f_17: 8153.39496 [1.37582, 0.613358]
f_18: 8152.74094 [1.39515, 0.563096]
f_19: 8151.76473 [1.36763, 0.509124]
f_20: 8152.80894 [1.26776, 0.475123]
f_21: 8152.86642 [1.4148, 0.471099]
f_22: 8151.76959 [1.32589, 0.527523]
f_23: 8151.73776 [1.36681, 0.498606]
f_24: 8151.58516 [1.33974, 0.493492]
f_25: 8151.60206 [1.33758, 0.486311]
f_26: 8151.6005 [1.34692, 0.491348]
f_27: 8151.58338 [1.33958, 0.497337]
f_28: 8151.58359 [1.33927, 0.49802]
f_29: 8151.58335 [1.33972, 0.496955]
f_30: 8151.58351 [1.34041, 0.497255]
f_31: 8151.5834 [1.33957, 0.49622]
f_32: 8151.58348 [1.34031, 0.496495]
f_33: 8151.58334 [1.33956, 0.496833]
f_34: 8151.58334 [1.33953, 0.496902]
f_35: 8151.58334 [1.33963, 0.496867]
f_36: 8151.58334 [1.33963, 0.496803]
f_37: 8151.58334 [1.33956, 0.496833]

The optimization process is summarized by


In [14]:
mdl.LMM.optsum


Out[14]:
Initial parameter vector: [1.0, 1.0]
Initial objective value:  8201.848559060627

Optimizer (from NLopt):   LN_BOBYQA
Lower bounds:             [0.0, 0.0]
ftol_rel:                 1.0e-12
ftol_abs:                 1.0e-8
xtol_rel:                 0.0
xtol_abs:                 [1.0e-10, 1.0e-10]
initial_step:             [0.75, 0.75]
maxfeval:                 -1

Function evaluations:     37
Final parameter vector:   [1.33956, 0.496833]
Final objective value:    8151.583340132135
Return code:              FTOL_REACHED

As one would hope, given the name of the option, this fit is fast.


In [15]:
@time(fit!(glmm(@formula(r201 ~ 1 + a + g + b + s + (1 | id) + (1 | item)), 
        verbagg, Bernoulli()), fast=true))


  0.475847 seconds (136.63 k allocations: 16.450 MiB)
Out[15]:
Generalized Linear Mixed Model fit by minimizing the Laplace approximation to the deviance
  Formula: r201 ~ 1 + a + g + b + s + (1 | id) + (1 | item)
  Distribution: Distributions.Bernoulli{Float64}
  Link: GLM.LogitLink()

  Deviance (Laplace approximation): 8151.5833

Variance components:
          Column    Variance   Std.Dev. 
 id   (Intercept)  1.79443144 1.3395639
 item (Intercept)  0.24684282 0.4968328

 Number of obs: 7584; levels of grouping factors: 316, 24

Fixed-effects parameters:
              Estimate Std.Error  z value P(>|z|)
(Intercept)   0.208273  0.405425 0.513715  0.6075
a            0.0543791 0.0167533  3.24587  0.0012
g: M          0.304089  0.191223  1.59023  0.1118
b: scold       -1.0165  0.257531 -3.94708   <1e-4
b: shout       -2.0218  0.259235 -7.79912  <1e-14
s: self       -1.01344  0.210888 -4.80559   <1e-5

The alternative algorithm is to use PIRLS to find the conditional modes of the random effects, given $\beta$ and $\theta$, and then use the general nonlinear optimizer to fit with respect to both $\beta$ and $\theta$. In this example that means optimizing over all 8 parameters (6 fixed-effects coefficients plus the 2 covariance parameters) rather than over the 2-dimensional $\theta$ alone, as can be seen in the iteration log below. Because it is slower to incorporate the $\beta$ parameters in the general nonlinear optimization, the fast fit is performed first and used to determine starting estimates for the more general optimization.
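In this algorithm each evaluation of the objective holds $\beta$ fixed and uses PIRLS only for the conditional modes. In terms of the pirls! signature described above, that corresponds to a call like

pirls!(mdl, false)   # varyβ = false: only the conditional modes b are optimized

(a sketch based on the argument description given earlier, not a cell from the original notebook).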


In [16]:
@time mdl1 = fit!(glmm(@formula(r201 ~ 1+a+g+b+s+(1|id)+(1|item)), 
        verbagg, Bernoulli()), verbose = true)


f_1: 8201.84856 [1.0, 1.0]
f_2: 8190.11782 [1.75, 1.0]
f_3: 8224.45098 [1.0, 1.75]
f_4: 9026.00391 [0.25, 1.0]
f_5: 8205.79378 [1.0, 0.25]
f_6: 8157.04103 [1.38583, 0.736457]
f_7: 8367.72422 [1.33715, 0.0]
f_8: 8170.28883 [1.41365, 1.11042]
f_9: 8158.82932 [1.27225, 0.762811]
f_10: 8161.93341 [1.40936, 0.868084]
f_11: 8156.30098 [1.32694, 0.721015]
f_12: 8156.11668 [1.32365, 0.714275]
f_13: 8156.00207 [1.31847, 0.708856]
f_14: 8155.75359 [1.32072, 0.701702]
f_15: 8155.27522 [1.32636, 0.687802]
f_16: 8154.41 [1.33859, 0.660408]
f_17: 8153.39496 [1.37582, 0.613358]
f_18: 8152.74094 [1.39515, 0.563096]
f_19: 8151.76473 [1.36763, 0.509124]
f_20: 8152.80894 [1.26776, 0.475123]
f_21: 8152.86642 [1.4148, 0.471099]
f_22: 8151.76959 [1.32589, 0.527523]
f_23: 8151.73776 [1.36681, 0.498606]
f_24: 8151.58516 [1.33974, 0.493492]
f_25: 8151.60206 [1.33758, 0.486311]
f_26: 8151.6005 [1.34692, 0.491348]
f_27: 8151.58338 [1.33958, 0.497337]
f_28: 8151.58359 [1.33927, 0.49802]
f_29: 8151.58335 [1.33972, 0.496955]
f_30: 8151.58351 [1.34041, 0.497255]
f_31: 8151.5834 [1.33957, 0.49622]
f_32: 8151.58348 [1.34031, 0.496495]
f_33: 8151.58334 [1.33956, 0.496833]
f_34: 8151.58334 [1.33953, 0.496902]
f_35: 8151.58334 [1.33963, 0.496867]
f_36: 8151.58334 [1.33963, 0.496803]
f_37: 8151.58334 [1.33956, 0.496833]
f_1: 8151.58334 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_2: 8161.67001 [0.613699, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_3: 8159.16562 [0.208273, 0.0711324, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_4: 8152.56923 [0.208273, 0.0543791, 0.495312, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_5: 8153.43623 [0.208273, 0.0543791, 0.304089, -0.758965, -2.0218, -1.01344, 1.33956, 0.496833]
f_6: 8154.04906 [0.208273, 0.0543791, 0.304089, -1.0165, -1.76257, -1.01344, 1.33956, 0.496833]
f_7: 8153.70955 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -0.802554, 1.33956, 0.496833]
f_8: 8152.13673 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.38956, 0.496833]
f_9: 8151.94237 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.546833]
f_10: 8160.99226 [-0.197152, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_11: 8159.31109 [0.208273, 0.0376258, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_12: 8152.8189 [0.208273, 0.0543791, 0.112866, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
f_13: 8153.27494 [0.208273, 0.0543791, 0.304089, -1.27403, -2.0218, -1.01344, 1.33956, 0.496833]
f_14: 8152.64297 [0.208273, 0.0543791, 0.304089, -1.0165, -2.28104, -1.01344, 1.33956, 0.496833]
f_15: 8152.75834 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.22433, 1.33956, 0.496833]
f_16: 8152.10285 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.28956, 0.496833]
f_17: 8152.03137 [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.446833]
f_18: 8151.52031 [0.201112, 0.0544089, 0.312427, -1.02248, -2.07447, -1.04342, 1.33922, 0.497771]
f_19: 8151.50778 [0.201793, 0.0544486, 0.313889, -1.02199, -2.05689, -1.04261, 1.33903, 0.498404]
f_20: 8151.52004 [0.201295, 0.0544603, 0.314764, -1.02204, -2.04793, -1.04969, 1.33867, 0.499945]
f_21: 8151.54284 [0.201124, 0.0544557, 0.295171, -1.02255, -2.06051, -1.04562, 1.33897, 0.498566]
f_22: 8151.49141 [0.202347, 0.0544886, 0.327263, -1.02106, -2.05959, -1.03243, 1.33933, 0.495848]
f_23: 8151.49549 [0.201621, 0.0544794, 0.329618, -1.02131, -2.05984, -1.0359, 1.3392, 0.49188]
f_24: 8151.50532 [0.204911, 0.0544951, 0.321367, -1.02002, -2.04051, -1.04048, 1.33948, 0.493551]
f_25: 8151.5078 [0.201757, 0.0544995, 0.32957, -0.995883, -2.06335, -1.03432, 1.33931, 0.49575]
f_26: 8151.5057 [0.241985, 0.0544995, 0.329563, -1.02151, -2.06334, -1.03431, 1.33931, 0.495751]
f_27: 8151.66543 [0.201764, 0.0528502, 0.329544, -1.02151, -2.06331, -1.0343, 1.33931, 0.495752]
f_28: 8151.47683 [0.218793, 0.0554083, 0.329603, -1.02092, -2.06582, -1.03314, 1.33914, 0.495879]
f_29: 8151.49011 [0.179357, 0.0553107, 0.327184, -1.0205, -2.06165, -1.03128, 1.33918, 0.495969]
f_30: 8151.47991 [0.219823, 0.055509, 0.3321, -1.02135, -2.07013, -1.03507, 1.34399, 0.495785]
f_31: 8151.47199 [0.206629, 0.0552795, 0.32745, -1.02038, -2.06012, -1.03041, 1.34236, 0.496658]
f_32: 8151.46291 [0.210257, 0.0552172, 0.331507, -1.0218, -2.06343, -1.03733, 1.34137, 0.492506]
f_33: 8151.48255 [0.21785, 0.0550646, 0.345279, -1.02486, -2.07393, -1.04724, 1.34224, 0.484367]
f_34: 8151.46757 [0.210071, 0.0551388, 0.328943, -1.0213, -2.05954, -1.03509, 1.33653, 0.492911]
f_35: 8151.47233 [0.207016, 0.0553218, 0.327765, -1.02188, -2.06765, -1.02945, 1.34212, 0.488138]
f_36: 8151.46595 [0.210061, 0.0551346, 0.328805, -1.04666, -2.05933, -1.03497, 1.3412, 0.492932]
f_37: 8151.45575 [0.210797, 0.0551754, 0.338288, -1.03419, -2.06481, -1.04655, 1.34194, 0.494195]
f_38: 8151.50048 [0.210545, 0.0567701, 0.33487, -1.03242, -2.06051, -1.04324, 1.3417, 0.494459]
f_39: 8151.46499 [0.211006, 0.055577, 0.338695, -1.03864, -2.05648, -1.05513, 1.34488, 0.494599]
f_40: 8151.45033 [0.210886, 0.0554897, 0.338642, -1.03392, -2.06293, -1.05083, 1.33994, 0.493767]
f_41: 8151.45094 [0.210362, 0.0555242, 0.339272, -1.03478, -2.06288, -1.05043, 1.33895, 0.49278]
f_42: 8151.44889 [0.210961, 0.0555218, 0.339638, -1.03442, -2.06412, -1.05191, 1.33995, 0.495119]
f_43: 8151.45211 [0.211043, 0.055465, 0.34131, -1.03465, -2.06328, -1.0533, 1.33925, 0.49621]
f_44: 8151.44889 [0.21093, 0.0555085, 0.339225, -1.03421, -2.06362, -1.05391, 1.33994, 0.495139]
f_45: 8151.44571 [0.210359, 0.0555389, 0.339572, -1.03505, -2.06643, -1.05248, 1.34003, 0.495076]
f_46: 8151.44117 [0.209704, 0.0555746, 0.339815, -1.03665, -2.07077, -1.05404, 1.34023, 0.494997]
f_47: 8151.43735 [0.209962, 0.0556568, 0.341491, -1.04039, -2.07808, -1.05822, 1.34073, 0.494983]
f_48: 8151.44555 [0.200638, 0.0557604, 0.340893, -1.03834, -2.08392, -1.06064, 1.34049, 0.495847]
f_49: 8151.44226 [0.209923, 0.0556274, 0.34503, -1.03984, -2.07678, -1.05719, 1.3407, 0.495026]
f_50: 8151.43344 [0.213129, 0.0556139, 0.339248, -1.04353, -2.07954, -1.05771, 1.34059, 0.494978]
f_51: 8151.43032 [0.216161, 0.0556102, 0.336588, -1.04514, -2.07997, -1.05935, 1.3406, 0.494632]
f_52: 8151.42791 [0.217863, 0.0556435, 0.331923, -1.04833, -2.08062, -1.06195, 1.34059, 0.493336]
f_53: 8151.42777 [0.219512, 0.0555958, 0.329635, -1.04471, -2.08089, -1.06253, 1.33998, 0.491696]
f_54: 8151.42913 [0.216556, 0.0556064, 0.327223, -1.04313, -2.08051, -1.05971, 1.33934, 0.49011]
f_55: 8151.42932 [0.2182, 0.0555144, 0.327174, -1.04313, -2.07943, -1.06133, 1.33991, 0.492187]
f_56: 8151.42835 [0.221872, 0.0556048, 0.331844, -1.04584, -2.08249, -1.06526, 1.34007, 0.491791]
f_57: 8151.42683 [0.219592, 0.0556214, 0.328882, -1.04566, -2.08122, -1.06327, 1.33987, 0.492049]
f_58: 8151.42312 [0.220771, 0.0557478, 0.328939, -1.04601, -2.08172, -1.06288, 1.33963, 0.492135]
f_59: 8151.42058 [0.223571, 0.0558914, 0.329402, -1.04682, -2.08358, -1.06224, 1.33895, 0.492341]
f_60: 8151.42165 [0.223552, 0.05613, 0.3286, -1.04683, -2.08274, -1.06492, 1.33908, 0.492296]
f_61: 8151.41967 [0.224329, 0.0559482, 0.329728, -1.04791, -2.08533, -1.06312, 1.33893, 0.492291]
f_62: 8151.41911 [0.225285, 0.0559918, 0.329499, -1.04822, -2.08673, -1.06221, 1.3392, 0.492155]
f_63: 8151.41738 [0.2244, 0.0560065, 0.329306, -1.05026, -2.08785, -1.06224, 1.33914, 0.491998]
f_64: 8151.41585 [0.223555, 0.0560536, 0.328536, -1.05462, -2.08913, -1.06207, 1.33896, 0.491648]
f_65: 8151.41372 [0.219811, 0.0561321, 0.328655, -1.05428, -2.09319, -1.06332, 1.33887, 0.491521]
f_66: 8151.41282 [0.221406, 0.056211, 0.329278, -1.05558, -2.09704, -1.0639, 1.33887, 0.491026]
f_67: 8151.4124 [0.220374, 0.0562921, 0.327746, -1.05792, -2.09972, -1.06533, 1.3385, 0.490832]
f_68: 8151.41313 [0.219956, 0.0563272, 0.329969, -1.05895, -2.10203, -1.06463, 1.33831, 0.490498]
f_69: 8151.41201 [0.218096, 0.0563912, 0.327746, -1.05842, -2.09895, -1.06526, 1.33869, 0.490699]
f_70: 8151.41166 [0.218548, 0.0563956, 0.326707, -1.05826, -2.09936, -1.0654, 1.33906, 0.490535]
f_71: 8151.4118 [0.219042, 0.0564406, 0.326001, -1.05854, -2.09931, -1.0665, 1.33936, 0.490352]
f_72: 8151.41108 [0.218574, 0.0564464, 0.325999, -1.05953, -2.10033, -1.06609, 1.33894, 0.49077]
f_73: 8151.40974 [0.218645, 0.0564887, 0.325754, -1.06066, -2.10116, -1.06546, 1.33908, 0.491108]
f_74: 8151.4096 [0.218163, 0.0564833, 0.326315, -1.06352, -2.10257, -1.06457, 1.33955, 0.491679]
f_75: 8151.4109 [0.218477, 0.0564715, 0.325962, -1.0648, -2.1043, -1.06545, 1.33946, 0.491564]
f_76: 8151.40712 [0.219176, 0.0565934, 0.325719, -1.06302, -2.1028, -1.06338, 1.33951, 0.491773]
f_77: 8151.40508 [0.218683, 0.0567021, 0.324246, -1.06401, -2.1041, -1.06074, 1.3393, 0.492227]
f_78: 8151.40416 [0.217101, 0.0567959, 0.323054, -1.06822, -2.10642, -1.05795, 1.33821, 0.493367]
f_79: 8151.40597 [0.216894, 0.0568122, 0.322573, -1.07065, -2.11114, -1.05632, 1.33655, 0.493448]
f_80: 8151.40518 [0.21875, 0.0568996, 0.321267, -1.07122, -2.10867, -1.05768, 1.33799, 0.493119]
f_81: 8151.40297 [0.214685, 0.0568348, 0.322432, -1.06778, -2.10677, -1.05743, 1.33882, 0.494058]
f_82: 8151.40288 [0.212125, 0.0569334, 0.32046, -1.06926, -2.10525, -1.05573, 1.34006, 0.495352]
f_83: 8151.40429 [0.212735, 0.0570069, 0.321477, -1.06926, -2.10358, -1.05217, 1.33988, 0.495195]
f_84: 8151.40247 [0.211583, 0.0569636, 0.321333, -1.06935, -2.10837, -1.05605, 1.33938, 0.49504]
f_85: 8151.40254 [0.212001, 0.0569765, 0.321436, -1.06973, -2.10932, -1.05639, 1.33913, 0.495014]
f_86: 8151.40222 [0.211252, 0.0569569, 0.322349, -1.06817, -2.10787, -1.05671, 1.33928, 0.495328]
f_87: 8151.4023 [0.209318, 0.0569244, 0.322033, -1.06661, -2.107, -1.05601, 1.33916, 0.495446]
f_88: 8151.40426 [0.208253, 0.056947, 0.322954, -1.06939, -2.10765, -1.05728, 1.33933, 0.495236]
f_89: 8151.40188 [0.213221, 0.0568949, 0.322456, -1.06698, -2.10895, -1.05593, 1.33939, 0.495448]
f_90: 8151.40139 [0.213529, 0.0569978, 0.32166, -1.06556, -2.10919, -1.05643, 1.33951, 0.495399]
f_91: 8151.40158 [0.213489, 0.057015, 0.320759, -1.06651, -2.10877, -1.05603, 1.33936, 0.495742]
f_92: 8151.40143 [0.213983, 0.056922, 0.322215, -1.06448, -2.10774, -1.05691, 1.33954, 0.495203]
f_93: 8151.40137 [0.213233, 0.0570287, 0.321586, -1.06561, -2.10898, -1.05679, 1.33992, 0.495145]
f_94: 8151.40144 [0.214111, 0.0569899, 0.321667, -1.06507, -2.1089, -1.0577, 1.33959, 0.494901]
f_95: 8151.40139 [0.212586, 0.057085, 0.321146, -1.06679, -2.11062, -1.05628, 1.33997, 0.495323]
f_96: 8151.40132 [0.212208, 0.0570541, 0.321786, -1.06614, -2.10944, -1.05575, 1.33996, 0.495524]
f_97: 8151.40136 [0.211707, 0.0570879, 0.321567, -1.06654, -2.10983, -1.05549, 1.3402, 0.495628]
f_98: 8151.40145 [0.211606, 0.05711, 0.321385, -1.06626, -2.10993, -1.05481, 1.34001, 0.495904]
f_99: 8151.40132 [0.212636, 0.0570292, 0.322066, -1.06562, -2.10917, -1.05603, 1.33981, 0.495367]
f_100: 8151.40131 [0.212629, 0.0570371, 0.322028, -1.06549, -2.10925, -1.05596, 1.33978, 0.495353]
f_101: 8151.4013 [0.212574, 0.0570326, 0.322018, -1.06558, -2.10932, -1.05589, 1.33975, 0.495345]
f_102: 8151.40129 [0.21242, 0.057031, 0.322021, -1.06569, -2.10948, -1.0557, 1.33967, 0.495326]
f_103: 8151.40133 [0.212485, 0.0570303, 0.322057, -1.06574, -2.1097, -1.05553, 1.33948, 0.495317]
f_104: 8151.40123 [0.2124, 0.0570326, 0.322055, -1.06547, -2.10985, -1.05592, 1.33966, 0.495334]
f_105: 8151.40112 [0.212086, 0.0570145, 0.321893, -1.06522, -2.11002, -1.0559, 1.33967, 0.495335]
f_106: 8151.401 [0.211505, 0.0569975, 0.321756, -1.06446, -2.11038, -1.05586, 1.33974, 0.495352]
f_107: 8151.40096 [0.210711, 0.0569916, 0.322023, -1.06348, -2.1105, -1.05554, 1.34003, 0.49548]
f_108: 8151.40095 [0.210638, 0.0569966, 0.321914, -1.06364, -2.11048, -1.05552, 1.34002, 0.495514]
f_109: 8151.40094 [0.209794, 0.0570155, 0.321635, -1.06378, -2.11041, -1.0552, 1.34014, 0.495766]
f_110: 8151.40093 [0.210065, 0.0570163, 0.321521, -1.06391, -2.11057, -1.05516, 1.34008, 0.495664]
f_111: 8151.40094 [0.209996, 0.0570166, 0.321522, -1.06377, -2.11075, -1.05524, 1.34008, 0.495677]
f_112: 8151.40094 [0.210034, 0.057011, 0.321496, -1.06395, -2.11039, -1.05506, 1.34009, 0.495652]
f_113: 8151.40095 [0.21001, 0.0570142, 0.321564, -1.06407, -2.11048, -1.05522, 1.34005, 0.495651]
f_114: 8151.40091 [0.210312, 0.0570228, 0.321587, -1.06383, -2.11057, -1.05506, 1.34007, 0.495674]
f_115: 8151.40088 [0.210498, 0.0570324, 0.321549, -1.06379, -2.11051, -1.05512, 1.3401, 0.495669]
f_116: 8151.40089 [0.21073, 0.0570389, 0.321505, -1.06373, -2.11049, -1.05523, 1.34018, 0.495662]
f_117: 8151.40089 [0.21055, 0.0570292, 0.321515, -1.06396, -2.11034, -1.05514, 1.34009, 0.495659]
f_118: 8151.40084 [0.210443, 0.0570383, 0.321482, -1.06363, -2.11038, -1.05516, 1.34008, 0.495671]
f_119: 8151.40076 [0.210198, 0.0570464, 0.32135, -1.0633, -2.11014, -1.05523, 1.34006, 0.495679]
f_120: 8151.40061 [0.209473, 0.0570592, 0.321162, -1.06257, -2.10972, -1.05529, 1.34007, 0.495719]
f_121: 8151.40043 [0.207543, 0.0570942, 0.321136, -1.06147, -2.10904, -1.05504, 1.34022, 0.49578]
f_122: 8151.4004 [0.207, 0.0570794, 0.320749, -1.06014, -2.10806, -1.05521, 1.34025, 0.495974]
f_123: 8151.40038 [0.205495, 0.0571455, 0.320866, -1.05983, -2.10851, -1.05475, 1.3405, 0.496015]
f_124: 8151.40029 [0.205354, 0.0571345, 0.32052, -1.05948, -2.10747, -1.05559, 1.34053, 0.495759]
f_125: 8151.40022 [0.205476, 0.0571549, 0.320486, -1.05936, -2.10748, -1.05546, 1.3405, 0.495749]
f_126: 8151.40019 [0.205355, 0.0571646, 0.320419, -1.05914, -2.10748, -1.05538, 1.34045, 0.495725]
f_127: 8151.40017 [0.2053, 0.0571763, 0.320357, -1.05896, -2.1072, -1.0553, 1.34043, 0.495729]
f_128: 8151.40015 [0.204924, 0.0571829, 0.320172, -1.05842, -2.1067, -1.05527, 1.34042, 0.495754]
f_129: 8151.40017 [0.204412, 0.057192, 0.320497, -1.05792, -2.1066, -1.05519, 1.34048, 0.495757]
f_130: 8151.40014 [0.205075, 0.057174, 0.320236, -1.0587, -2.10671, -1.0553, 1.34038, 0.495743]
f_131: 8151.4001 [0.204829, 0.057185, 0.320267, -1.05876, -2.1067, -1.05552, 1.34035, 0.495768]
f_132: 8151.40008 [0.204587, 0.0571856, 0.320247, -1.05875, -2.1065, -1.05566, 1.3403, 0.495793]
f_133: 8151.40007 [0.20453, 0.0571795, 0.320147, -1.05845, -2.10604, -1.05594, 1.34021, 0.495822]
f_134: 8151.40007 [0.203787, 0.0571888, 0.320001, -1.05841, -2.1059, -1.05564, 1.3402, 0.495917]
f_135: 8151.40005 [0.204491, 0.0571832, 0.31997, -1.05873, -2.10595, -1.05593, 1.34019, 0.495803]
f_136: 8151.40005 [0.204276, 0.057186, 0.319796, -1.05886, -2.10589, -1.05596, 1.34019, 0.49575]
f_137: 8151.40005 [0.204423, 0.0571918, 0.319734, -1.0591, -2.10566, -1.05602, 1.34021, 0.495772]
f_138: 8151.40004 [0.204548, 0.0571876, 0.319676, -1.05918, -2.10582, -1.05602, 1.34015, 0.495777]
f_139: 8151.40004 [0.204464, 0.057187, 0.319764, -1.05899, -2.10564, -1.05601, 1.34014, 0.495774]
f_140: 8151.40004 [0.20449, 0.0571728, 0.320003, -1.05915, -2.10541, -1.05582, 1.3401, 0.495746]
f_141: 8151.40004 [0.204276, 0.0571907, 0.31971, -1.05883, -2.10554, -1.05602, 1.34016, 0.495784]
f_142: 8151.40001 [0.204346, 0.057197, 0.319838, -1.05895, -2.10558, -1.05598, 1.34012, 0.495753]
f_143: 8151.39996 [0.203938, 0.057223, 0.31993, -1.05893, -2.10565, -1.05594, 1.34011, 0.495732]
f_144: 8151.3999 [0.202922, 0.0572697, 0.319948, -1.05885, -2.10588, -1.05596, 1.34015, 0.495707]
f_145: 8151.39991 [0.201254, 0.0573483, 0.319619, -1.05853, -2.10625, -1.05611, 1.34035, 0.495632]
f_146: 8151.3999 [0.20196, 0.0572999, 0.31975, -1.05836, -2.10572, -1.05588, 1.34021, 0.495751]
f_147: 8151.3999 [0.201256, 0.0573196, 0.319513, -1.05843, -2.10514, -1.05627, 1.34014, 0.495761]
f_148: 8151.39996 [0.201709, 0.0572848, 0.319812, -1.05814, -2.10585, -1.05612, 1.34023, 0.49578]
f_149: 8151.39987 [0.201814, 0.0573123, 0.319706, -1.05867, -2.10564, -1.05573, 1.34019, 0.4957]
f_150: 8151.39986 [0.201309, 0.05734, 0.319368, -1.05918, -2.10566, -1.05559, 1.34018, 0.495598]
f_151: 8151.39984 [0.201033, 0.0573565, 0.319868, -1.05921, -2.10608, -1.05546, 1.34028, 0.495568]
f_152: 8151.39991 [0.201524, 0.0573613, 0.32074, -1.05942, -2.10734, -1.05535, 1.34044, 0.495436]
f_153: 8151.39983 [0.200436, 0.0573842, 0.31978, -1.05912, -2.10588, -1.05548, 1.34029, 0.495528]
f_154: 8151.39982 [0.200129, 0.057382, 0.320023, -1.05862, -2.10514, -1.05561, 1.34028, 0.495467]
f_155: 8151.39982 [0.199374, 0.0574085, 0.319784, -1.05851, -2.10499, -1.05554, 1.34024, 0.495508]
f_156: 8151.39983 [0.199669, 0.0574186, 0.319664, -1.05826, -2.10508, -1.05566, 1.34024, 0.49545]
f_157: 8151.39982 [0.199801, 0.0573997, 0.319788, -1.05866, -2.10514, -1.05562, 1.34026, 0.495478]
f_158: 8151.39982 [0.19913, 0.057419, 0.31975, -1.05854, -2.10492, -1.05556, 1.34023, 0.495496]
f_159: 8151.3998 [0.199185, 0.0574229, 0.319868, -1.0585, -2.10502, -1.0556, 1.3402, 0.495491]
f_160: 8151.39978 [0.199175, 0.0574247, 0.320107, -1.05837, -2.10521, -1.05569, 1.34015, 0.495476]
f_161: 8151.39977 [0.198799, 0.0574291, 0.320559, -1.05782, -2.10519, -1.05579, 1.34009, 0.495385]
f_162: 8151.39977 [0.198977, 0.0574256, 0.320442, -1.05801, -2.10529, -1.05579, 1.34009, 0.495416]
f_163: 8151.39977 [0.199064, 0.0574221, 0.320394, -1.05807, -2.10537, -1.05579, 1.34008, 0.495423]
f_164: 8151.39976 [0.199036, 0.0574185, 0.320301, -1.0581, -2.1053, -1.05577, 1.34004, 0.495418]
f_165: 8151.39976 [0.198886, 0.0574185, 0.320183, -1.05812, -2.10506, -1.05572, 1.33996, 0.49543]
f_166: 8151.39976 [0.198769, 0.0574126, 0.320157, -1.05797, -2.10476, -1.05572, 1.33983, 0.495493]
f_167: 8151.39975 [0.199003, 0.0574465, 0.320228, -1.05822, -2.10509, -1.05589, 1.33994, 0.495442]
f_168: 8151.39975 [0.198611, 0.0574472, 0.320195, -1.05816, -2.10511, -1.05593, 1.33988, 0.495387]
f_169: 8151.39974 [0.198519, 0.0574469, 0.320417, -1.05796, -2.10497, -1.05581, 1.33983, 0.495357]
f_170: 8151.39974 [0.198166, 0.05746, 0.320463, -1.05785, -2.10493, -1.05579, 1.33977, 0.495309]
f_171: 8151.39975 [0.198044, 0.0574687, 0.32045, -1.05786, -2.10493, -1.05581, 1.3398, 0.495326]
f_172: 8151.39974 [0.198794, 0.0574303, 0.320608, -1.058, -2.10486, -1.05589, 1.33979, 0.495316]
f_173: 8151.39974 [0.199034, 0.0574233, 0.320693, -1.05794, -2.10496, -1.05588, 1.3398, 0.49533]
f_174: 8151.39973 [0.198937, 0.0574267, 0.320615, -1.05807, -2.10499, -1.05586, 1.33983, 0.495322]
f_175: 8151.39973 [0.198923, 0.0574286, 0.320609, -1.05824, -2.10501, -1.05573, 1.33982, 0.495308]
f_176: 8151.39973 [0.198804, 0.0574295, 0.320529, -1.05842, -2.10494, -1.05564, 1.33973, 0.49532]
f_177: 8151.39973 [0.198731, 0.0574321, 0.320498, -1.05844, -2.10492, -1.05565, 1.33971, 0.495313]
f_178: 8151.39972 [0.198811, 0.0574329, 0.320604, -1.05851, -2.10498, -1.05562, 1.33971, 0.495285]
f_179: 8151.39973 [0.198689, 0.0574314, 0.320742, -1.0585, -2.10513, -1.05563, 1.3397, 0.495288]
f_180: 8151.39973 [0.198627, 0.0574319, 0.320686, -1.05847, -2.10481, -1.05554, 1.33972, 0.495286]
f_181: 8151.39972 [0.198993, 0.0574241, 0.320691, -1.05854, -2.10507, -1.05556, 1.3397, 0.495278]
f_182: 8151.39972 [0.199193, 0.0574171, 0.320764, -1.05864, -2.10513, -1.05553, 1.33969, 0.4953]
f_183: 8151.39973 [0.199118, 0.0574075, 0.32064, -1.05874, -2.10518, -1.05557, 1.33969, 0.495302]
f_184: 8151.39972 [0.19905, 0.0574236, 0.320832, -1.05851, -2.10511, -1.05549, 1.33967, 0.495288]
f_185: 8151.39972 [0.199263, 0.0574145, 0.320708, -1.05864, -2.10513, -1.05553, 1.33973, 0.495312]
f_186: 8151.39972 [0.199145, 0.0574214, 0.320803, -1.05862, -2.1051, -1.05554, 1.33971, 0.495279]
f_187: 8151.39972 [0.199275, 0.057412, 0.320796, -1.05864, -2.10503, -1.05569, 1.33968, 0.49529]
f_188: 8151.39972 [0.199179, 0.0574208, 0.320712, -1.05859, -2.10515, -1.05552, 1.33972, 0.495315]
f_189: 8151.39972 [0.19929, 0.0574134, 0.320741, -1.0586, -2.10507, -1.05548, 1.33972, 0.495326]
f_190: 8151.39972 [0.199237, 0.0574217, 0.320695, -1.05864, -2.10523, -1.05556, 1.33971, 0.495306]
f_191: 8151.39972 [0.199207, 0.0574221, 0.320734, -1.05867, -2.10526, -1.05554, 1.3397, 0.495291]
f_192: 8151.39972 [0.199284, 0.0574207, 0.320769, -1.05874, -2.10538, -1.05551, 1.33969, 0.495282]
f_193: 8151.39972 [0.199197, 0.057424, 0.320768, -1.05873, -2.10533, -1.05553, 1.33967, 0.495275]
f_194: 8151.39972 [0.199267, 0.057421, 0.320761, -1.0587, -2.10537, -1.05552, 1.33971, 0.495286]
f_195: 8151.39972 [0.199263, 0.0574212, 0.32075, -1.05872, -2.10535, -1.05552, 1.33971, 0.495288]
f_196: 8151.39972 [0.199247, 0.057422, 0.320741, -1.05875, -2.10537, -1.05549, 1.33972, 0.495286]
f_197: 8151.39972 [0.199269, 0.0574224, 0.32073, -1.0588, -2.10542, -1.05545, 1.33974, 0.495285]
f_198: 8151.39972 [0.199283, 0.0574229, 0.320739, -1.05878, -2.10536, -1.05553, 1.33972, 0.495281]
f_199: 8151.39972 [0.199181, 0.0574243, 0.320746, -1.05877, -2.10539, -1.05548, 1.33971, 0.495287]
f_200: 8151.39972 [0.199087, 0.0574288, 0.320753, -1.05881, -2.10543, -1.05544, 1.3397, 0.495278]
f_201: 8151.39972 [0.199149, 0.0574261, 0.32077, -1.05882, -2.10543, -1.05546, 1.3397, 0.495284]
f_202: 8151.39972 [0.199093, 0.0574315, 0.320784, -1.05878, -2.10542, -1.05544, 1.33971, 0.49528]
f_203: 8151.39972 [0.199096, 0.0574277, 0.320734, -1.05879, -2.1054, -1.05546, 1.3397, 0.49528]
f_204: 8151.39972 [0.199102, 0.0574277, 0.320722, -1.05881, -2.10538, -1.05544, 1.3397, 0.495283]
f_205: 8151.39972 [0.199074, 0.0574279, 0.320741, -1.05879, -2.10538, -1.05546, 1.3397, 0.495281]
  2.717180 seconds (847.98 k allocations: 66.549 MiB, 1.18% gc time)
Out[16]:
Generalized Linear Mixed Model fit by minimizing the Laplace approximation to the deviance
  Formula: r201 ~ 1 + a + g + b + s + (1 | id) + (1 | item)
  Distribution: Distributions.Bernoulli{Float64}
  Link: GLM.LogitLink()

  Deviance (Laplace approximation): 8151.3997

Variance components:
          Column    Variance   Std.Dev.  
 id   (Intercept)  1.79480455 1.33970316
 item (Intercept)  0.24530373 0.49528146

 Number of obs: 7584; levels of grouping factors: 316, 24

Fixed-effects parameters:
              Estimate Std.Error  z value P(>|z|)
(Intercept)   0.199074  0.405173 0.491332  0.6232
a            0.0574279 0.0167572  3.42705  0.0006
g: M          0.320741  0.191258  1.67701  0.0935
b: scold      -1.05879  0.256794 -4.12312   <1e-4
b: shout      -2.10538  0.258518 -8.14406  <1e-15
s: self       -1.05546  0.210293   -5.019   <1e-6

This fit produces a slightly lower objective (a Laplace approximation to the deviance of 8151.400 versus 8151.583) but takes about six times as long (2.72 versus 0.48 seconds). That is not terribly important when the times involved are a few seconds, but it can matter when the fit requires many hours or days of computing time.

Because the slow fit starts from the fast fit's solution, the optimization summary after the slow fit provides a direct comparison of the two.


In [17]:
mdl1.LMM.optsum


Out[17]:
Initial parameter vector: [0.208273, 0.0543791, 0.304089, -1.0165, -2.0218, -1.01344, 1.33956, 0.496833]
Initial objective value:  8151.583340132033

Optimizer (from NLopt):   LN_BOBYQA
Lower bounds:             [-Inf, -Inf, -Inf, -Inf, -Inf, -Inf, 0.0, 0.0]
ftol_rel:                 1.0e-12
ftol_abs:                 1.0e-8
xtol_rel:                 0.0
xtol_abs:                 [1.0e-10, 1.0e-10]
initial_step:             [0.405425, 0.0167533, 0.191223, 0.257531, 0.259235, 0.210888, 0.05, 0.05]
maxfeval:                 -1

Function evaluations:     205
Final parameter vector:   [0.199074, 0.0574279, 0.320741, -1.05879, -2.10538, -1.05546, 1.3397, 0.495281]
Final objective value:    8151.399719870473
Return code:              FTOL_REACHED