Information Metrics And Bounds

This notebook contains some notes from thinking about how to solve the following problem:

"Given P(A,B|C) and P(B|C), what can we say about P(A|C)?"
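If P(A,B|C) and P(B|C) are read as single probabilities (rather than full distributions), elementary Fréchet-style bounds already follow from P(A|C) = P(A,B|C) + P(A,¬B|C), since 0 ≤ P(A,¬B|C) ≤ 1 − P(B|C). A minimal sketch, with made-up numbers:

```python
def bounds_on_p_a_given_c(p_ab_given_c: float, p_b_given_c: float) -> tuple[float, float]:
    """Fréchet-style bounds on P(A|C) given P(A,B|C) and P(B|C).

    P(A|C) = P(A,B|C) + P(A,not-B|C), and 0 <= P(A,not-B|C) <= 1 - P(B|C),
    so  P(A,B|C) <= P(A|C) <= P(A,B|C) + (1 - P(B|C)).
    """
    lower = p_ab_given_c
    upper = min(1.0, p_ab_given_c + (1.0 - p_b_given_c))
    return lower, upper

# Hypothetical numbers: P(A,B|C) = 0.4, P(B|C) = 0.9
lo, hi = bounds_on_p_a_given_c(0.4, 0.9)
print(lo, hi)  # lower bound 0.4, upper bound ~0.5
```

Note the lower bound is tight exactly when A implies B given C, and the interval collapses as P(B|C) → 1.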

While a lower bound on P(A|C) would be ideal, even a triangle-inequality-like bound would be a good starting point. From searching the web, I have a few items that could be useful in this endeavor:

  • Hellinger distance, which is a true metric on probability distributions.
  • Mutual information, which can be turned into a metric (the variation of information).
  • Even symmetrized KL divergence (Jeffreys divergence) need not satisfy the triangle inequality.
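Both metric candidates above are straightforward to compute for discrete distributions. A quick sketch (NumPy-based; the distributions are made-up examples, not from the problem above):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions: a true metric in [0, 1]."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def variation_of_information(joint):
    """Variation of information VI(X;Y) = H(X|Y) + H(Y|X), a metric on random variables.

    `joint` is a 2-D array of joint probabilities P(X=i, Y=j).
    """
    joint = np.asarray(joint, float)

    def H(p):  # Shannon entropy in bits, ignoring zero cells
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # VI = H(X|Y) + H(Y|X) = 2*H(X,Y) - H(X) - H(Y)
    return 2 * H(joint.ravel()) - H(joint.sum(axis=1)) - H(joint.sum(axis=0))

# Made-up example distributions
p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(hellinger(p, q))

joint = np.array([[0.25, 0.25], [0.25, 0.25]])  # two independent uniform bits
print(variation_of_information(joint))  # VI = 1 + 1 = 2 bits
```

For the independent-uniform-bits example, H(X,Y) = 2 and H(X) = H(Y) = 1 bit, so VI = 2·2 − 1 − 1 = 2 bits, the maximum for two binary variables.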