What is the Computational Graph?
"
What is the Computational Graph?
"
In TensorFlow, machine learning algorithms are represented as computational graphs. A computational graph is a directed graph in which nodes represent operations and edges represent the data (tensors) flowing between those operations.
Before we dig deeper, let's learn the building blocks of these graphs.
ref: TensorFlow 1.0 vs 2.0, Part 1: Computational Graphs | by Yusup | AI³ | Theory, Practice, Business | Medium
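To make this concrete, here is a minimal sketch (assuming TensorFlow 2.x, where tf.function traces Python code into a graph): each operation becomes a node, and the tensors passed between operations become the edges. The function name add_and_scale is made up for illustration.

```python
import tensorflow as tf

@tf.function
def add_and_scale(a, b):
    s = tf.add(a, b)             # node: Add, producing a tensor (an edge)
    return tf.multiply(s, 2.0)   # node: Mul, fed by the tensor from Add

# Trace the function to obtain a concrete graph, then list its operation nodes.
concrete = add_and_scale.get_concrete_function(
    tf.TensorSpec(shape=(), dtype=tf.float32),
    tf.TensorSpec(shape=(), dtype=tf.float32),
)
for op in concrete.graph.get_operations():
    print(op.type)  # e.g. Placeholder, Placeholder, AddV2, Mul, Identity
```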
The computation graph is the pathway of operations that takes you from the inputs (which could be a feature vector of your data) to the loss function (since you connect the output to a loss function that you then optimize).
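As a hedged illustration of that input-to-loss pathway, here is a tiny graph from a feature vector to a scalar loss; the names features, weights, and target are invented for the example, not taken from the text.

```python
import tensorflow as tf

features = tf.constant([[1.0, 2.0, 3.0]])        # input feature vector
weights = tf.Variable(tf.random.normal((3, 1)))  # parameters to optimize
target = tf.constant([[1.0]])

prediction = tf.matmul(features, weights)               # affine transform node
loss = tf.reduce_mean(tf.square(prediction - target))   # scalar loss node
print(loss.numpy())
```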
A feedforward network could be called a computation chain: you start from some input vector, transform it through an affine transform (a matrix multiplication with another node, the weight matrix), run the result through a nonlinearity (e.g., an element-wise function such as the logistic sigmoid), and repeat until you reach the output node. You then run the output through a function that returns a scalar error value, reaching the end of your directed graph. This graph of operations, or computations, is a useful formalism when you want to do something like reverse-mode differentiation (back-propagation of errors) =]
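Below is a minimal sketch of that computation chain, using tf.GradientTape for reverse-mode differentiation (TensorFlow 2.x assumed; the layer sizes and variable names are illustrative, not from the original text).

```python
import tensorflow as tf

x = tf.constant([[0.5, -1.0]])            # input vector
W1 = tf.Variable(tf.random.normal((2, 4)))
b1 = tf.Variable(tf.zeros((4,)))
W2 = tf.Variable(tf.random.normal((4, 1)))
b2 = tf.Variable(tf.zeros((1,)))
y_true = tf.constant([[1.0]])

with tf.GradientTape() as tape:
    h = tf.sigmoid(tf.matmul(x, W1) + b1)               # affine transform + nonlinearity
    y_pred = tf.matmul(h, W2) + b2                      # output node
    loss = tf.reduce_mean(tf.square(y_pred - y_true))   # scalar error value

# Back-propagation of errors: walk the recorded graph in reverse
# to get gradients of the loss with respect to each weight.
grads = tape.gradient(loss, [W1, b1, W2, b2])
```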