Likewise, I don't find much value in visualizing the network structure in itself. What I would really like to see is something that visualizes the geometric representation of the outputs of such a neural network.
When I was in school, I learned that a perceptron is an inequality represented by a hyperplane in R^n (for n inputs to the cell). The numeric output of the cell can be interpreted as the signed length of a vector perpendicular to that hyperplane.
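For concreteness, here's a minimal sketch of that hyperplane view in Python; the weights, the bias, and the signed_distance helper are all made up for illustration:

```python
import numpy as np

w = np.array([2.0, -1.0])   # weights: the normal vector of the hyperplane
b = 0.5                     # bias: offsets the hyperplane from the origin

def signed_distance(x):
    """Signed distance from x to the hyperplane w.x + b = 0, along w.

    The perceptron's raw output w.x + b is this distance scaled by
    ||w||, so thresholding it at 0 is exactly the hyperplane inequality.
    """
    return (w @ x + b) / np.linalg.norm(w)

x = np.array([1.0, 1.0])
print(signed_distance(x))    # how far x sits from the boundary
print(int(w @ x + b > 0))    # the perceptron's binary decision
```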
At the time, I tried to take the next step for a backpropagation network: the output of each cell in a given layer is a dimension in the input space of the next layer, so I could project each hyperplane back through the previous layer to find the nonlinear geometric objects that result. To make that drawable, the network structure had to be constrained to 2-2-2 (so every hyperplane would map to a line in 2-D).
I lacked the math background to make it work properly at the time, but if someone is interested in picking up the ball, I think it'd be an interesting idea.
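For anyone picking this up, here's a rough numerical sketch of the 2-2-2 pullback idea. The weights are invented and the sigmoid first layer is my assumption; it just contours the zero level set of each output cell over the input plane:

```python
import numpy as np
import matplotlib.pyplot as plt

# Each second-layer cell defines a straight line in the hidden space
# (h1, h2); pulled back through the first layer's nonlinearity it
# becomes a curve in the 2-D input space. All weights are made up
# just to have something to draw.
W1 = np.array([[1.5, -1.0], [0.8, 1.2]]); b1 = np.array([0.2, -0.3])
W2 = np.array([[1.0, -1.3], [-0.7, 0.9]]); b2 = np.array([0.1, 0.0])

def hidden(x):
    # First layer: R^2 -> R^2 with a sigmoid nonlinearity.
    return 1.0 / (1.0 + np.exp(-(x @ W1.T + b1)))

# Evaluate each second-layer pre-activation over a grid of inputs.
xs = np.linspace(-4, 4, 400)
X, Y = np.meshgrid(xs, xs)
pts = np.stack([X.ravel(), Y.ravel()], axis=1)
Z = hidden(pts) @ W2.T + b2          # shape (N, 2): one value per output cell

for k in range(2):
    # Zero level set of output cell k: a line in hidden space,
    # but a curved boundary once pulled back into input space.
    plt.contour(X, Y, Z[:, k].reshape(X.shape), levels=[0.0])
plt.xlabel("x1"); plt.ylabel("x2")
plt.title("Second-layer decision boundaries pulled back to input space")
plt.show()
```

The curves this draws are exactly the kind of nonlinear geometric objects I meant: straight decision lines in hidden space that bend once you view them in input coordinates.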