SoftMax Distributions for Human-Robot Interaction

This IPython Notebook is a living document detailing the development and use of SoftMax distributions -- a powerful tool for probabilistically decomposing state spaces.

DISCLAIMER: This document is currently under development. That means code may be broken, text may have typos, and math may be missing or wrong. Treat this as ongoing research notes until this disclaimer is removed.

NOTE: While you can read this document as-is using nbviewer, the best way to read it is by downloading the .ipynb files and associated code from the project's GitHub repository, then running ipython notebook locally.

Chapter 1 - Introduction to SoftMax Distributions

We look at why you'd want to use SoftMax distributions, how to construct them, and what their basic properties are.
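As a quick preview of the model itself (using notation we assume throughout: a weight vector $\mathbf{w}_i$ and bias $b_i$ for each of $M$ classes), the SoftMax probability of class $i$ at a state $\mathbf{x}$ is:

$$P(L = i \mid \mathbf{x}) = \frac{e^{\mathbf{w}_i^T \mathbf{x} + b_i}}{\sum_{j=1}^{M} e^{\mathbf{w}_j^T \mathbf{x} + b_j}}$$

Below is a minimal NumPy sketch of this definition. The function name and example weights are illustrative only, not the notebook's actual API:

import numpy as np

def softmax_probs(x, weights, biases):
    # One linear activation per class: w_i^T x + b_i
    activations = weights.dot(x) + biases
    # Subtract the max before exponentiating for numerical stability
    exp_act = np.exp(activations - activations.max())
    return exp_act / exp_act.sum()

# Three classes over a 2D state space
weights = np.array([[1.0, 0.0],
                    [-0.5, 0.5],
                    [-0.5, -0.5]])
biases = np.zeros(3)
print(softmax_probs(np.array([1.0, 2.0]), weights, biases))  # sums to 1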

Chapter 2 - Using Normals Instead of Weights

Instead of defining weights by hand, we show another way to create SoftMax distributions: using normal vectors.
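The connection this relies on (standard for any linear SoftMax model) is that the boundary between two classes $i$ and $j$ is the hyperplane where their probabilities are equal:

$$(\mathbf{w}_i - \mathbf{w}_j)^T \mathbf{x} + (b_i - b_j) = 0$$

So the difference of weight vectors is exactly the normal vector of the dividing hyperplane between those classes. Specifying normals for the desired class boundaries therefore determines the weights up to a common offset, since adding the same vector to every $\mathbf{w}_i$ leaves all probabilities unchanged.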

Chapter 3 - Building SoftMax Distributions from Templates

We abstract SoftMax distribution creation even further by using templates.

Chapter 4 - Subclassing and Superclassing: Multimodal SoftMax

SoftMax distributions can be overly simplistic for some models, so we investigate ways to modify them to handle general, non-symmetric cases.
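One construction this chapter builds toward (sketched here under our assumed notation; the details come later) is to assign several SoftMax subclasses to a single semantic superclass $s$ and sum their probabilities:

$$P(L = s \mid \mathbf{x}) = \sum_{i \in s} P(L = i \mid \mathbf{x})$$

This keeps every subclass boundary linear while letting the superclass region be non-convex or even disconnected. Reusing softmax_probs from the sketch above (the label-to-subclass map is hypothetical):

probs = softmax_probs(np.array([1.0, 2.0]), weights, biases)
superclasses = {'near': [0], 'far': [1, 2]}  # hypothetical label-to-subclass map
merged = {label: probs[idx].sum() for label, idx in superclasses.items()}
print(merged)  # superclass probabilities still sum to 1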

Chapter 5 - Learning SoftMax Distributions from Data

TODO

Chapter 6 - Shaping with Class Boundary Priors

TODO

Chapter 7 - Using Symmetry

TODO

Chapter 8 - N-Dimensional SoftMax

TODO


In [1]:
from IPython.core.display import HTML

# Borrowed style from Probabilistic Programming and Bayesian Methods for Hackers
def css_styling():
    """Load and apply the notebook's custom CSS stylesheet."""
    with open("../styles/custom.css", "r") as f:
        styles = f.read()
    return HTML(styles)
css_styling()


Out[1]: