Wow, this is space-age stuff. Combining hard logical inference with probability is very exciting. We could codify background knowledge, like physical laws, to speed up learning. At the moment NNs need tons of training data just to pin down truisms we already know; they learn from a blank slate every time, which fails to exploit the background knowledge we already have. Convolutional neural nets do kind of encode translational invariance in their architecture, but we should be able to be more general, and this is the path to that!
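To make the CNN point concrete, here's a tiny sketch of my own (not from the comment above) showing that convolution is translation-equivariant: shift the input, and the feature map shifts by the same amount, so the network never has to re-learn that fact from data.

```python
import numpy as np

# An arbitrary filter standing in for a learned kernel.
kernel = np.array([1.0, -1.0, 0.5])

# A spike at position 2, and the same spike shifted by 3 positions.
signal = np.zeros(10)
signal[2] = 1.0
shifted = np.roll(signal, 3)

out_a = np.convolve(signal, kernel, mode='full')
out_b = np.convolve(shifted, kernel, mode='full')

# The response to the shifted input is just the shifted response:
# the translation structure is baked into the operation itself.
print(np.allclose(np.roll(out_a, 3), out_b))  # True
```

The hope expressed above is that hybrid logic-plus-probability systems could bake in far more general prior knowledge than this one architectural trick.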