
Isn't it rather the other way around, that regression methods informed the development of NNs?

I suppose the finding isn't novel. The contribution here might still be didactic: demystifying NNs by restating a fundamental notion in familiar terms. For example, exposing implicitly (that is, leaving the insight as a discovery for the learner) that "OTOH you have to design the features yourself [for the polyreg, because ...]". Although, since "feature engineering" is also a buzzword in NN development, I still don't understand the difference. Indeed, the paper implies there is none, except for the approach and the terminology.
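As a minimal sketch of that feature-engineering difference (my own illustration in scikit-learn, not taken from the paper; the toy data and parameter choices are arbitrary): the polynomial regression has its features spelled out by hand through a degree choice, while the NN is left to learn its own internal representation.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Toy data (hypothetical): y depends on an interaction term and a square.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(500, 2))
    y = 3 * X[:, 0] * X[:, 1] + X[:, 0] ** 2 + rng.normal(0, 0.05, size=500)

    # Polynomial regression: the features (x1^2, x1*x2, ...) are designed by hand
    # through an explicit degree choice -- the "feature engineering" step.
    polyreg = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    polyreg.fit(X, y)

    # Neural network: a hidden layer is left to learn its own feature map instead.
    nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    nn.fit(X, y)

    print("polyreg R^2:", polyreg.score(X, y))
    print("NN R^2:", nn.score(X, y))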

Essentially, they argue that the traditional terminology should not be discarded in favour of the new, but rather understood in context. The understanding part is left to the reader, of course.

Relatedly, in a linear algebra course, Terry Tao's one-pixel camera was credited with the subsequent ML resurgence, while we were otherwise talking about convolutions, Fourier synthesis, wavelets and the like. It's no secret that linear algebra is a cornerstone of ... just a cornerstone, and that abstract algebra, topology and the like lie very near. At least this lecturer made it a main concern of the course to get that across, which fit in nicely with a logic course on universal algebra I took in parallel, while others taught rather monotonously towards Taylor series and Fourier transforms. At any rate, pretty much all professional researchers in the field stress that the mathematical basis needs to be understood.

PS: The paper is written with an undergrad, whatever that means; no offence, I really don't know the slang, much less how much study vs. research it implies. The publishing co-author slash blog host shows some resentment against newfangled, fanciful terminology in the blog's About section, which might explain the scope of the paper, as well as its intended extent, in line with the impressions I outlined above.

Your criticism is not quite correct, insofar as the blog post states the paper's conclusions explicitly, which seem to amount to explaining common pitfalls in statistical terms.

PPS: Many observers lament that the results of NNs are intractable, nigh impossible to verify. That stands in strong contrast to mathematical rigor. Hoping for the traditional methodology to catch up is in principle justified. I'm sure your remark about design by hand being intractable holds as well, I'm just not sure to what extent. Showing that it can be done reasonably in some cases is a start, and chronicling that endeavor is par for the course, though perhaps not yet enough, I guess, as another comment asks for benchmarks.



