If you’re genuinely a novice programmer, or have a weak background in linear algebra, AI should be the last thing on your mind. Any attempt at a shortcut will only make it harder to learn AI and to code anything beyond simple examples. (Which is why I’m annoyed by many of the ML MOOCs that are targeted at novice programmers.)
I disagree. There is nothing magic or hard about basic ML. You can do real work with only some basic linear algebra and programming skills. Sure, you won't be doing novel deep learning on 100-terabyte datasets, but most problems aren't like that anyway.
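For example, here's a minimal sketch with scikit-learn (assuming it's installed; the dataset and model are just illustrative choices, not a recommendation) showing how little code a useful baseline takes:

```python
# Minimal sketch: a plain logistic regression baseline with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a small built-in dataset and hold out a test split.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Fit a basic classifier -- no deep learning required.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

Nothing in there goes beyond basic programming plus a rough idea of what logistic regression does.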
Yeah, and to be honest, while it's always important to understand the math behind what you're doing, you can easily get started by just understanding what these algorithms do and why, and then expand your knowledge of the math over time.
Why does this annoy you? Some of the MOOCs are amazing and they certainly don't shortcut. I had many hard-learned lessons and realizations when going through the MOOCs I have taken. They also help in "knowing what you don't know"...which is essential in breaking out of ignorance.
I learned computer science in college, so I do have _some_ linear algebra background. I'm by no means an expert coder, but maybe 'intermediate' would have been a better description than 'novice'. :-)
I say it for the opposite reason of elitism (and I’ve spoken out many times against elitism in ML/AI).
As other posts note, there are many resources available for teaching the concepts. But they don’t teach the limits of AI, and the rise of MOOCs is setting novice programmers up for a shock when they encounter real-world data that is not as nice as the Titanic dataset: data that requires smart decisions about how to handle it, and handling it in a way that does not invalidate the results.
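To make that concrete, here’s a toy sketch of the kind of mistake I mean (my own example with scikit-learn, not from any particular course): imputing or scaling on the full dataset before splitting leaks test-set information into the model, whereas fitting those steps only on the training split keeps the evaluation honest.

```python
# Sketch: handling messy real-world data without invalidating the results.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a messy real-world table (hypothetical data).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X[rng.random(X.shape) < 0.1] = np.nan  # punch holes to mimic missing values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Imputation and scaling are fit on the training split only; the test split is
# transformed with those same statistics, so no information leaks from it.
model = make_pipeline(
    SimpleImputer(strategy="median"),
    StandardScaler(),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

A Kaggle-style tutorial rarely forces you to think about where the imputation statistics come from; real projects punish you for getting it wrong.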
Many romanticize ML/AI as something that can solve any problem, which is a dangerous approach.