Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity, PLOS Comp Biol, 2013, Bernhard Nessler, Michael Pfeiffer, Lars Buesing, Wolfgang Maass

"synaptic weights can be understood as conditional probabilities, and the ensemble of all weights of a neuron as a generative model for high-dimensional inputs that - after learning - causes it to fire with a probability that depends on how well its current input agrees with this generative model."

Inputs $y_i$ project to outputs $z_k$ with weights $w_{ki}$, giving the membrane potential
$u_k = w_{k0}+\sum_i w_{ki}y_i$,
where $w_{k0}$ is the intrinsic excitability.
The firing probability is then $p(z_k\text{ fires at time }t) \propto \exp(u_k(t)-I(t))$,
where $I(t)$ is a global inhibition term, making this a soft winner-take-all (WTA) circuit.
The instantaneous Poisson firing rate can thus be taken as $r_k(t) = \exp(u_k(t)-I(t))$.
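A minimal numerical sketch of this firing model (the variable names, network sizes, weight values, and the constant placeholder for $I(t)$ are my own choices, not the paper's; the Bernoulli draw per timestep is only an approximation of Poisson spiking):

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_outputs = 20, 5
y = rng.integers(0, 2, size=n_inputs)                   # binary afferent activity y_i
W = rng.normal(-1.0, 0.5, size=(n_outputs, n_inputs))   # synaptic weights w_ki
w0 = rng.normal(0.0, 0.1, size=n_outputs)               # intrinsic excitabilities w_k0

u = w0 + W @ y                                          # membrane potentials u_k

I = 2.0                                                 # placeholder global inhibition I(t)
r = np.exp(u - I)                                       # instantaneous Poisson rates r_k(t)

dt = 1e-3                                               # 1 ms simulation step
spikes = rng.random(n_outputs) < r * dt                 # Bernoulli approximation of Poisson spiking in dt
```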

[In Kappel et al. 2014, $I(t) = \log\big(\sum_k \exp(u_k(t))\big)$; the exact form of $I(t)$ is not stated here.]
The conditional probability that a spike at time $t$ came from $z_k$ is
$q_k(t)= \frac{r_k(t)\,dt}{R(t)\,dt} = \frac{\exp(u_k(t)-I(t))}{\sum_j \exp(u_j(t)-I(t))} = \frac{\exp(u_k(t))}{\sum_j \exp(u_j(t))}$,
where $R(t)=\sum_j r_j(t)$ is the total rate of the WTA circuit, so the inhibition $I(t)$ cancels.
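A short check of this cancellation, assuming the Kappel-style normalizing inhibition from the note above (the function name and test values are illustrative):

```python
import numpy as np

def winner_probabilities(u, I=0.0):
    """q_k = r_k / sum_j r_j with r_k = exp(u_k - I); equals softmax(u) for any I."""
    r = np.exp(u - I)
    return r / r.sum()

u = np.array([1.0, 0.5, -0.3])
q_no_inh = winner_probabilities(u, I=0.0)
q_kappel = winner_probabilities(u, I=np.log(np.exp(u).sum()))  # Kappel et al. 2014 choice of I(t)
assert np.allclose(q_no_inh, q_kappel)   # I(t) cancels in the conditional q_k(t)
print(q_no_inh)                          # softmax over the membrane potentials u_k
```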

