Oh boy. I'm the second author. The official announcement will be on April 1st at the SIGBOVIK conference (http://sigbovik.org/2016), but I guess someone broke the press embargo!
Very funny. A ton of people, I suspect, will take this seriously.
If I may, let me suggest you add a note about your future plans to add backpropagation using Excel/Libreoffice macros, which "might allow users to train large-scale models from scratch within their lifetime."
I'm really glad someone found this early, because I try and stay off the WWW on April 1st. Every site tries to be "funny" and pretty much just ends up being annoying, and DeepExcel would have just drowned in the noise then.
This though: hearty chuckles were produced. (The paper was even typeset in LaTeX!)
I can see that it is working. Thanks for the amazing product. Would you be willing to share the model by which you are planning to monetise the product? Thanks once again! ;)
Haven't tried the actual file yet, but the idea sounds interesting.
Not sure how academic papers are written normally, but do such papers normally refer to open source licenses as "Commie"? And does anyone know what the paper/project was created for?
I suspect the project was created in preparation for the SIGBOVIK conference[1] or similar.
It's sort of a satirical or mock conference; these guys have created a number of ingenious projects over the years[2], but this one might be their masterpiece.
Certainly it's going to give my own entry[3] a run for its money.
3. A sort of analog neural net using a biological substrate to perform communication and learning. Basically, I've trained some wild dogs I found in the alley behind my apartment for function approximation. Originally I had intended to use a pretty standard hill-climbing approach, but they kept wandering away when I ran out of treats. My new method arranges them in a series of layers for PAC (provably all canine) learning, with the gradients transmitted by bark-propagation. I've not figured out how to implement convolutional operations in this framework yet, so I might hold off on publishing anything this year.
Wow! I didn't realize it was satirical and wondered whether the author was a con artist or crazy.
Then I headed here and it finally dawned on me. I'm pretty sure the majority of people who only read about ML in the press will take it seriously.
For those who don't know: this is a satirical post.
It's kind of funny, I guess.
But I didn't assume it was a farce. I was genuinely excited to see the implementation, because there is something unique about the transparency and interaction you can have with an Excel workbook. It's almost like a GUI for a Lisp REPL. Even if I am too dumb to understand the actual C code, I can see the relevant numbers being crunched, how they relate, and more.
So... "good one. you got me."
But even as I sit here as the butt of the joke, I find myself wishing your link had been sincere.
Would people who understand deep learning well enough to apply it properly ever use Excel?
Every time I've used Excel for a complex problem I've regretted it. Even simple things like joins using VLOOKUPs are incredibly clunky and error prone.
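For anyone who hasn't hit this: the join ends up as a per-cell formula like =VLOOKUP(A2, Sheet2!$A$2:$B$500, 2, FALSE) copied down a column (the ranges here are made up), and leaving off the FALSE silently falls back to approximate matching. For contrast, here's a rough sketch of the same join as one explicit call in Python/pandas, using toy tables I invented:

    import pandas as pd

    # Two hypothetical tables standing in for two worksheets.
    customers = pd.DataFrame({"customer_id": [1, 2, 3],
                              "name": ["Ada", "Bob", "Cy"]})
    orders = pd.DataFrame({"customer_id": [1, 1, 3],
                           "total": [9.99, 24.50, 3.20]})

    # One explicit, exact-match join, instead of a formula copied down a column.
    joined = orders.merge(customers, on="customer_id", how="left")
    print(joined)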
Don't want to be an asshole, but it really can't be called Deep Learning if it doesn't do backprop. Deep inference, at best.
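To be concrete about the distinction, here's a toy NumPy sketch with made-up weights (nothing from the actual workbook): the forward pass is what the spreadsheet's formulas amount to; the gradient and update steps below are the part that would make it learning.

    import numpy as np

    # Toy 2-2-1 network with made-up weights. The forward pass below is all a
    # formula-only spreadsheet does: inference, no learning.
    W1 = np.array([[0.5, -0.3], [0.8, 0.2]])
    W2 = np.array([[1.0], [-0.7]])
    x = np.array([[0.1, 0.9]])
    h = np.maximum(0.0, x @ W1)         # ReLU hidden layer
    y_hat = h @ W2                      # prediction

    # Learning is the extra step: backprop the loss gradient and update weights.
    y = np.array([[1.0]])
    grad_y = 2.0 * (y_hat - y)          # d(MSE)/d(y_hat)
    grad_W2 = h.T @ grad_y
    grad_h = (grad_y @ W2.T) * (h > 0)  # chain rule through the ReLU
    grad_W1 = x.T @ grad_h
    W1 -= 0.01 * grad_W1                # the update the spreadsheet never does
    W2 -= 0.01 * grad_W2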
The paper reads like a bad joke.
I think that the authors of such excellent publications as "Visually Identifying Rank", "A Spectral Method for Ghost Detection" and "N-dimensional Polytope Schemes"[1] deserve a little more credit than that.
By the way: It works, try it ;)