Twenty-odd years ago I worked on someone else's helpdesk doing application and PC support. Sometimes the calls that came in were a bit more interesting than "Extra Term is blinking at me" or "Sophos ate my Word doc".
A call (ticket) came in along the lines of "Need help with a neural network I am developing in Excel". I bit. I showed the end user (who was orders of magnitude cleverer than me) a bit about VBA: some reasonably good habits like Option Explicit, making sure all eventualities are covered in If clauses, keeping your inputs, working data, and outputs separate, and summing north/south and east/west and comparing. Document the bloody thing! If I recall correctly, it was a Hopfield thingie, so I'm pretty off topic here.
The next call was for help with an Access database with fifty-odd tables all linked to each other in a way that must surely have been designed to invoke Cthulhu. I deleted it and we started again! My call notes were odd enough to get a mention at the next ops meeting. The product is still flying, so all good.
Multi-head self-attention seems to be the new trendy architectural primitive.
I don't know how feasible it would be - I guess you could take a set of base operations (matrix multiplication, softmax, etc.), randomly generate feature transformations from them, and check whether any of them yield good features (stick a linear readout on the end and test the performance on some downstream tasks).
That would be an unguided search - I guess you could steer it with something like a genetic algorithm. Also, it uses neural network training as an inner-loop step, so it would probably be too expensive. Better would be if you could somehow get the gradient w.r.t. the tentative operation.
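As a very rough sketch of what that unguided search could look like (the primitive set, the digits dataset, and the scikit-learn readout are my illustrative choices here, not anything from the thread):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def random_op(dim):
    # Sample one primitive: a random linear map or a softmax nonlinearity.
    if rng.random() < 0.5:
        W = rng.normal(scale=dim ** -0.5, size=(dim, dim))
        return lambda X: X @ W      # matrix multiplication
    return softmax                  # softmax across features

def random_transform(dim, depth=3):
    # Compose a few randomly sampled primitives into a feature transformation.
    ops = [random_op(dim) for _ in range(depth)]
    def apply(X):
        for op in ops:
            X = op(X)
        return X
    return apply

X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

best = 0.0
for _ in range(20):
    f = random_transform(X.shape[1])
    # Linear readout on top of the candidate features, scored downstream.
    clf = LogisticRegression(max_iter=1000).fit(f(Xtr), ytr)
    best = max(best, clf.score(f(Xte), yte))
print("best downstream accuracy:", round(best, 3))
```

Each trial is cheap because the readout is linear; as soon as you train a full NN per candidate instead, the cost blows up, which is the expense problem mentioned above.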
The problem is that training NNs is nontrivial, and you might need things like BatchNorm and residual connections to make things stable, so you'd somehow have to search for a good surrounding architecture for each candidate operation as well.
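To make the stabilizers concrete, here is a minimal sketch (PyTorch is my choice, not the commenter's) of wrapping a candidate operation in a residual connection plus BatchNorm, so the surrounding network stays trainable even if the op misbehaves:

```python
import torch
import torch.nn as nn

class ResidualWrapper(nn.Module):
    """Wrap a candidate operation with a skip connection and BatchNorm."""
    def __init__(self, op: nn.Module, dim: int):
        super().__init__()
        self.op = op                     # the tentative operation under test
        self.norm = nn.BatchNorm1d(dim)  # keeps activation scales in check

    def forward(self, x):
        # Identity skip path: even a poorly behaved op degrades towards a
        # no-op instead of destroying the signal flowing through the network.
        return x + self.norm(self.op(x))

block = ResidualWrapper(nn.Linear(64, 64), dim=64)
print(block(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```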
I think NAS is a bit higher level than what the OP had in mind - NAS isn't usually used to search for fundamental operations like self-attention or convolution. But I guess you could probably adapt it quite easily.
At least there is work on greatly generalizing convolutions. In neural networks, they're much more broadly applicable to very differently structured data than their standard form suggests. (The "in neural networks" qualifier is there because quite a bit of this has been understood about convolution as a mathematical operation for a long, long time.)
Another "beginner" intro that starts with describing FCs and neurons, and doesn't tell why we need NNs in the first place.
Although deep NNs are not very interpretable, there are good intuitions behind the designs. These kinds of articles will only make deep learning more mysterious.
I was thinking the same; there are so many articles explaining the basics.
For me it would be more helpful to start off with a real-life scenario where the method in question can be applied and might even excel compared to other methods; bonus points if you also explain what properties of the method make it so well suited to that specific scenario.
There are so many methods in data science / machine learning, and from what I remember of my university days, one of the difficult tasks was knowing when to use which method, depending on the properties of your data and on what you want to achieve. Additionally, you sometimes also need to optimize the method's hyperparameters, and that's almost a whole separate discipline in itself.
Nonetheless, the posted article contains a lot of valuable information for a beginner, so it's definitely a good start.
Yes. On most problems I face, I find gradient boosting is at least as good as, if not better than, any neural network, and much easier to implement and explain.
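A quick sketch of that comparison on tabular data, using scikit-learn (the dataset and both models are my illustrative assumptions, not from the comment):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Gradient boosting works on raw tabular features out of the box.
gb = GradientBoostingClassifier(random_state=0)
# A small NN typically needs feature scaling (and more tuning) to compete.
nn = make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))

for name, model in [("gradient boosting", gb), ("neural network", nn)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```

The "easier to explain" part shows up in practice too: tree ensembles come with feature importances for free, while the NN needs extra tooling.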
1. Some companies have found a way to promote this type of blog article past a certain threshold so it stays on the front page long enough for... profits?
2. The demographics of HN have changed substantially in recent times, so copypasta articles that add nothing new over existing, better sources are actually valuable to them.
I'm inclined to agree. I've been meaning to take the time to implement a CNN from scratch, so I went straight to this article hoping it would have some code, but no, it's just the same content over again.
I didn't implement a CNN from scratch, but a few years ago I wrote a blog post on CNNs [1] because, like the other commenters, I could find almost no decent blog content on what exactly a CNN was. Maybe it will help in your efforts.
This seems really good. I always like to read code because it gives a better idea of how things work under the hood. When I first wrote a toy neural network, I read everything I could on the topic, plus I was taking an ML class where the teacher was a big fan of implementing things from scratch.
If you want to implement a CNN from scratch, wouldn't you prefer an article that describes how it works rather than one that just gives you the code? Otherwise it's more a process of copying from a source than implementing from scratch.
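For what "from scratch" boils down to, here is a minimal, deliberately naive sketch of a single convolution's forward pass in NumPy (no padding, stride 1, single channel; the example image and kernel are made up for illustration):

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" cross-correlation (what CNN layers actually compute):
    # slide the kernel over the image, one dot product per output pixel.
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Elementwise product of the kernel with one image patch, summed.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

img = np.arange(25.0).reshape(5, 5)
edge = np.array([[1.0, -1.0]])   # a tiny horizontal edge detector
print(conv2d(img, edge))         # every entry is -1: constant-gradient image
```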
Convoluted ... what?