Let's look at the least-squares problem we constructed above.
In [ ]:
import cgt
cgt.modify_config(backend="python")
X_nk = cgt.matrix("X")
y_n = cgt.vector("y")
w_k = cgt.vector("w")
b = cgt.scalar("b")
ypred_n = X_nk.dot(w_k) + b
loss = cgt.sum(cgt.square(ypred_n - y_n))
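As a sanity check, the same loss can be computed directly with NumPy on concrete values (hypothetical small arrays, independent of the CGT graph):

```python
import numpy as np

# Hypothetical example data: 3 samples, 2 features.
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = np.array([1.0, 2.0, 3.0])
w = np.array([0.5, -0.5])
b = 0.1

ypred = X.dot(w) + b             # predictions, shape (3,)
loss = np.sum((ypred - y) ** 2)  # sum of squared residuals
```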
Internally, CGT represents the loss function, like any other expression, as a directed acyclic graph called the expression graph, in which each node corresponds to an intermediate result together with the operation that was performed to obtain it.
The graph is made up of two kinds of nodes: Input and Result.
Input nodes correspond to values that are set externally, while Result nodes correspond to intermediate values in the computation, each computed from zero or more preceding nodes.
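The two node kinds can be sketched with plain Python classes (a simplified illustration, not CGT's actual implementation):

```python
class Input:
    """A value set externally (e.g. a matrix, vector, or scalar)."""
    def __init__(self, name):
        self.name = name
        self.parents = []  # Input nodes have no predecessors


class Result:
    """An intermediate value, computed from zero or more parent nodes."""
    def __init__(self, op, parents):
        self.op = op            # the operation performed, e.g. "dot" or "+"
        self.parents = parents  # preceding nodes this value depends on


# Building X.dot(w) + b as a tiny DAG:
X, w, b = Input("X"), Input("w"), Input("b")
ypred = Result("+", [Result("dot", [X, w]), b])
```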
Below is a representation of the expression graph above. (TODO: we should show a DAG plot here instead of the tree, maybe using dot for layout.)
In [ ]:
cgt.display.print_text(loss);
In [ ]:
# We can inspect the Python objects involved.
print loss
print "loss:", loss, loss.parents
print "loss.parents[0]:", loss.parents[0], loss.parents[0].parents
In [ ]:
grads = cgt.grad(loss, [w_k,b])
cgt.display.print_text(grads)
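For this loss, the gradients that `cgt.grad` derives symbolically have a simple closed form: the gradient with respect to `w` is `2 * X.T.dot(ypred - y)` and with respect to `b` is `2 * sum(ypred - y)`. A numerical sketch with hypothetical data, checked against a finite difference:

```python
import numpy as np

# Hypothetical example data.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, 0.0])
w = np.array([0.5, -0.5])
b = 0.2

resid = X.dot(w) + b - y       # ypred - y
grad_w = 2.0 * X.T.dot(resid)  # d(loss)/dw
grad_b = 2.0 * resid.sum()     # d(loss)/db

# Finite-difference check on b.
eps = 1e-6
loss = lambda bb: np.sum((X.dot(w) + bb - y) ** 2)
fd_b = (loss(b + eps) - loss(b - eps)) / (2 * eps)
```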
In [ ]:
grads = cgt.simplify(grads)
cgt.display.print_text(grads);
In [ ]:
inputs = [X_nk, y_n, w_k, b]
outputs = [loss]
interpreter = cgt.execution.run_compilation_pipeline(inputs, outputs, [], [])
eg = interpreter.eg
import pprint
pprint.pprint(eg.to_json())
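Once the graph is in a serializable form like this, it can be evaluated by walking the nodes in topological order, so every node's parents are computed before the node itself. A minimal interpreter sketch over a hypothetical dict-based node format (not CGT's actual execution code):

```python
def topo_order(outputs):
    """Return all nodes reachable from outputs, parents before children."""
    order, seen = [], set()
    def visit(node):
        if id(node) in seen:
            return
        seen.add(id(node))
        for p in node["parents"]:
            visit(p)
        order.append(node)
    for out in outputs:
        visit(out)
    return order

# Hypothetical operation table; real graphs would have many more ops.
ops = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def evaluate(node, inputs):
    """Evaluate `node` given a dict of input-name -> value."""
    cache = {}
    for n in topo_order([node]):
        if n["op"] == "input":
            cache[id(n)] = inputs[n["name"]]
        else:
            args = [cache[id(p)] for p in n["parents"]]
            cache[id(n)] = ops[n["op"]](*args)
    return cache[id(node)]

# x*y + x as a small expression graph:
x = {"op": "input", "name": "x", "parents": []}
y = {"op": "input", "name": "y", "parents": []}
expr = {"op": "add", "parents": [{"op": "mul", "parents": [x, y]}, x]}
```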