The partial information decomposition is a method of decomposing the mutual information between a set of sources $X_0, X_1, \ldots$ and a target $Y$: $I\left[X_0, X_1, \ldots : Y\right]$. In the case of two sources, the decomposition takes a particularly intuitive form: there is redundant information shared by both sources, unique information from $X_0$, unique information from $X_1$, and synergistic information available only from the two sources considered jointly.
There are many proposals for how to quantify this decomposition. Here, we explore the behavior of many of these proposals on a wide variety of distributions. This survey helps clarify the commonalities and differences among the proposed approaches.
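As a concrete illustration of these four components, the following cell is a minimal sketch (not part of the survey below) that computes a single decomposition for the XOR distribution, where $Y = X_0 \oplus X_1$. The choice of the Williams & Beer measure `PID_WB` and of XOR as the example distribution are ours; for XOR, the full bit of $I\left[X_0, X_1 : Y\right]$ should appear as synergy.
In [ ]:
import dit
from dit.pid import PID_WB

# XOR distribution: outcomes are (x0, x1, y) strings, each with probability 1/4.
xor = dit.Distribution(['000', '011', '101', '110'], [1/4] * 4)

# By default, the last variable is treated as the target; printing the result
# displays the redundancy lattice with the partial information at each node.
print(PID_WB(xor))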
In [ ]:
import dit
from dit.pid.helpers import compare_measures
from dit.pid.distributions import bivariates, trivariates
dit.ditParams['print.exact'] = dit.ditParams['repr.print'] = True
dit.ditParams['text.font'] = 'linechar'
If colorama is installed, measure names that are green indicate that some partial information values are negative, measure names that are blue indicate that the lattice could not be completely computed, and measure names that are red indicate that the measure resulted in an inconsistency in the lattice.
In [ ]:
for name, dist in bivariates.items():
    compare_measures(dist, name=name)
In [ ]:
for name, dist in trivariates.items():
    compare_measures(dist, name=name)