Test of a Generalized Metropolis-Hastings MCMC to explore the parameter space of the FitzHugh-Nagumo model: http://www.scholarpedia.org/article/FitzHugh-Nagumo_model

If the number of proposals per iteration equals 1, this runner behaves as the standard Metropolis-Hastings algorithm; for a number of proposals greater than 1, it behaves as the Generalized Metropolis-Hastings algorithm.
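
For reference, the single-proposal case reduces to the familiar accept/reject loop sketched below. This is a minimal, self-contained illustration on a toy one-dimensional target, written independently of the GMH package (the function names and step size are purely illustrative); generalized M-H instead draws nproposals candidates per iteration, so that their likelihood evaluations can be distributed over parallel processes.

In [ ]:
###Minimal random-walk Metropolis-Hastings sketch on a toy 1-D standard-normal target
###(illustrative only; not part of the GMH workflow below)
logtarget(x) = -0.5*x^2                    #log-density of the toy target, up to a constant
function simplemh(niterations,stepsize)
    x = 0.0
    chain = zeros(niterations)
    for i = 1:niterations
        xprop = x + stepsize*randn()       #a single Gaussian proposal per iteration
        if log(rand()) < logtarget(xprop) - logtarget(x)
            x = xprop                      #accept; otherwise keep the current sample
        end
        chain[i] = x
    end
    chain
end
simplemh(1000,0.5)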

OPERATION:

  • Run a cell by pressing the black triangle in the toolbar above.
  • Note that the execution of a cell may take a while, and will be confirmed by a printout.
  • If a cell prints output in a pink box, re-run it and see if the message disappears. If not, close and re-open the notebook, or select "Kernel/Restart" at the top.
  • To remove all printed output and figures, select "Cell/All Output/Clear" at the top.

In [ ]:
###Load the PyPlot package (only on the main process)
import PyPlot
println("PyPlot package loaded successfully")

In the following cell, you can specify the number of parallel processes to run the MCMC with. The way to do this differs between running the notebook on a single computer and running it on a cluster of machines (for more information on clusters, see Preparing an AWS Cluster).

  1. To run the MCMC not in parallel (in a single Julia process), set RUNPARALLEL=false.

  2. To run the MCMC in parallel on a single machine, set RUNPARALLEL=true and RUNONCLUSTER=false. Set the total number of Julia processes with the NPROCS variable (the cell below adds worker processes until nprocs() equals NPROCS). It is recommended not to make NPROCS larger than the number of CPU cores on your machine (given by the Julia global variable Sys.CPU_CORES).

  3. When running this notebook on a cluster, set RUNPARALLEL=true and RUNONCLUSTER=true. Set the xxx.xxx.xxx.xxx values to the private IP addresses of the slave machines you have started (add as many slaveip entries to machvec as required).


In [ ]:
RUNPARALLEL = true
RUNONCLUSTER = false

if RUNPARALLEL
    println("Starting additional Julia processes")
    NPROCS = min(3,Sys.CPU_CORES) #do not make larger than CPU_CORES
    if nprocs() < NPROCS
        addprocs(NPROCS-nprocs(),topology=:master_slave)
    end
    println("Number of Julia processes: ",nprocs())

    if RUNONCLUSTER 
        println("Starting additional Julia processes on the cluster")
        slaveip1 = "ubuntu@xxx.xxx.xxx.xxx"
        slaveip2 = "ubuntu@xxx.xxx.xxx.xxx"
        machvec = [(slaveip1,:auto),(slaveip2,:auto)]
        addprocs(machvec,topology=:master_slave)
        println("Total number of Julia processes in cluster: ",nprocs())
    end
end

In [ ]:
###Now load the GMH packages on all processes
import GeneralizedMetropolisHastings
import GMHExamples

@everywhere using GeneralizedMetropolisHastings
@everywhere using GMHExamples
println("GMH modules loaded successfully")

In [ ]:
#Standard M-H for nproposals == 1
#Generalized M-H for nproposals > 1
nproposals = 30

#MCMC iteration specifications
nburnin = 1000
niterations = 1000
ntunerperiod = 100

###Initial conditions for the FitzHugh-Nagumo ODE (membrane potential and refractory variable)
initial = [-1.0,1.0]

###Default values of the parameters (a,b,c) and prior boundaries
defaults = [0.3,0.3,2.0]
lows = zeros(3)
highs = [5.0,5.0,5.0]

###The variance of the noise on the input data
variance = [0.02,0.005]

println("==========================================")
println("Simulation parameters defined successfully")
println("==========================================")

Create the model object from the parameters defined above, using the fitzhughnagumomodel helper function.
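
For context, one common parameterization of the FitzHugh-Nagumo equations is sketched below, with membrane potential v, refractory variable r, and parameters (a,b,c); the exact form (and any input current) used internally by fitzhughnagumomodel may differ.

In [ ]:
###One common parameterization of the FitzHugh-Nagumo ODE (illustrative sketch only;
###the form used internally by fitzhughnagumomodel may differ)
function fitzhughnagumo_ode(y,a,b,c)
    v,r = y                           #membrane potential and refractory variable
    dv = c*(v - v^3/3 + r)            #fast (voltage) equation
    dr = -(v - a + b*r)/c             #slow (recovery) equation
    [dv,dr]
end
fitzhughnagumo_ode([-1.0,1.0],0.3,0.3,2.0)  #evaluate at the initial conditions and default parameters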


In [ ]:
###Create a FitzHugh-Nagumo model with measurement data, ODE function and parameters with default values and priors
m = fitzhughnagumomodel(initial,variance,lows,highs,defaults)

###Show the model
println("==========================")
println("Model defined successfully")
println("==========================")
show(m)

Now specify the sampler (a Metropolis-Hastings sampler with a Gaussian proposal density) and the runner to run a Generalized Metropolis-Hastings algorithm (remember that the choice between standard and generalized M-H is made by setting nproposals either to 1 or to a value greater than 1).


In [ ]:
###Create a Metropolis sampler with a Normal proposal density
s = sampler(:mh,:normal,0.1,eye(3))
println("============================")
println("Sampler defined successfully")
println("============================")
show(s)

###Create a tuner that scales the proposal density
t = tuner(:scale,ntunerperiod,0.5,:erf)
println("==========================")
println("Tuner defined successfully")
println("==========================")
show(t)

###Create a Generalized Metropolis-Hastings runner (which will default to Standard MH when nproposals=1)
p = policy(:mh,nproposals;initialize=:default)
r = runner(p,niterations,nproposals;numburnin = nburnin)
println("===========================")
println("Runner defined successfully")
println("===========================")
show(r)

Now run the simulation using the runner, the model and the sampler specified above. A printout will appear when the simulation is finished.


In [ ]:
###Run the MCMC (can take quite a bit of time)
println("=======================")
println("Run the MCMC simulation")
println("=======================")
@time c = run!(r,m,s,t)
println("==========================")
println("Completed MCMC simulation")
println("=========================")

In [ ]:
###Show the results of the simulations
show(c)

nparas = numparas(m)
meanparamvals = mean(samples(c),2)
stdparamvals = std(samples(c),2)

println("Results of the MCMC simulation:")
for i=1:nparas
    println(" parameter $(parameters(m)[i].key):  mean = ",meanparamvals[i]," std = ",stdparamvals[i])
end
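
Beyond the posterior mean and standard deviation, it can be useful to gauge how strongly the samples are autocorrelated. The cell below is a rough sketch of an effective-sample-size estimate per parameter, cutting off the autocorrelation sum at the first non-positive lag; it only assumes the samples(c) layout used above (parameters in rows, samples in columns) and is not part of the GMH package.

In [ ]:
###Rough effective-sample-size estimate per parameter (illustrative sketch, not part of GMH)
function effectivesamples(x)
    n = length(x)
    xc = x .- mean(x)
    v = sum(xc.^2)/n                        #sample variance (1/n normalization)
    tau = 1.0                               #integrated autocorrelation time estimate
    for lag = 1:(n-1)
        rho = sum(xc[1:n-lag].*xc[1+lag:n])/(n*v)
        rho <= 0.0 && break                 #stop at the first non-positive autocorrelation
        tau += 2.0*rho
    end
    n/tau
end
for i=1:nparas
    println(" parameter $(parameters(m)[i].key):  effective sample size ~ ",round(effectivesamples(vec(getindex(samples(c),i,:)))))
end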

In [ ]:
###Plot the measurement data (simulated data + noise) together with the model output at the mean parameter values
PyPlot.figure("FitzHughNagumo1")
modeldata = evaluate!(m,vec(meanparamvals))
PyPlot.plot(dataindex(m),measurements(m)[:,1];label="membrane potential")
PyPlot.plot(dataindex(m),measurements(m)[:,2];label="refractory variable")
PyPlot.plot(dataindex(m),modeldata[:,1])
PyPlot.plot(dataindex(m),modeldata[:,2])
PyPlot.xlabel("Time")
PyPlot.ylabel("Amplitude")
PyPlot.title("FitzHugh-Nagumo model data")
PyPlot.grid("on")
PyPlot.legend(loc="lower right",fancybox="true")

In [ ]:
###Plot the histograms of a,b,c values
for i=1:nparas
    PyPlot.subplot(310 + i)
    h = PyPlot.plt[:hist](vec(getindex(samples(c),i,:)),20)
    PyPlot.grid("on")
    PyPlot.ylabel("Parameter $(parameters(m)[i].key)")
end

In [ ]:
###Plot the log-posterior values across samples
###After an initial few low values, this should remain relatively high
PyPlot.plot(1:numsamples(c),logposterior(c,m))
PyPlot.title("Log-Posterior values across samples")
PyPlot.xlabel("Samples")
PyPlot.ylabel("Log-Posterior")

In [ ]:
println("Number of processes running: ",nprocs())
println("Number of workers running: ",nworkers())
println("Process IDs: ",procs())

In [ ]:
###Only run this box if you want to shut down all worker processes
println("Pre processes running: ",procs())
if nprocs() > 1
    rmprocs(workers())
    sleep(1.0)
end
println("Post processes running: ",procs())
