It's just a demo of an upcoming open-source API that lets you run deep neural network models in the browser.
Steps (disclaimer: I'm not affiliated with the creators, so this is just my understanding of what it does):
1.- You upload your image
2.- Select an image to serve as the source of the style
3.- Downloading Model: downloads a deep neural net trained for style transfer
4.- Colorful artifacts: the model is applied to your image. The artifacts are probably a visualization of the network weights being converted into WebGL shaders, or simply a visualization of the intermediate steps of the transformation
5.- You get your image with the style applied (a rough code sketch of this pipeline is below)
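Since the demo's API isn't released, here's a minimal sketch of what steps 3-5 could look like using TensorFlow.js instead. The model URL is hypothetical, and I'm assuming one pretrained network per style (which is what the "Downloading Model" step suggests); a real model may expect different preprocessing.

```ts
import * as tf from '@tensorflow/tfjs';

// Sketch only: not the demo's actual API. Model URL and I/O format are assumptions.
async function stylize(contentImg: HTMLImageElement, canvas: HTMLCanvasElement) {
  // Step 3: download a pretrained style-transfer network (hypothetical URL).
  const model = await tf.loadGraphModel('https://example.com/style-transfer/model.json');

  // Step 4: run the model on the uploaded image. On WebGL-capable browsers,
  // TensorFlow.js executes the ops as shaders on the GPU.
  const output = tf.tidy(() => {
    const input = tf.browser.fromPixels(contentImg) // HxWx3 uint8 tensor
      .toFloat()
      .div(255)                                     // normalize to [0, 1]
      .expandDims(0);                               // add batch dimension
    return (model.predict(input) as tf.Tensor).squeeze() as tf.Tensor3D;
  });

  // Step 5: draw the stylized result back onto a canvas.
  await tf.browser.toPixels(output, canvas);
  output.dispose();
}
```

That GPU-via-WebGL execution is presumably why the whole thing can run client-side with no server round-trip after the model download.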