Do you have any reference for this claim, or are you guessing? Investigators who gained access to the code reported that the algorithm was a Gradient Boosting Machine.
Most comments here are in one of two camps: 1) you don't need to know any of this stuff, you can make AI systems without this knowledge, or 2) you need this foundational knowledge to really understand what's going on.
Both perspectives are correct. The field is bifurcating into two different skill sets: ML engineer and ML scientist (or researcher).
It's great to have both types on a team. The scientists will be too slow; the engineers will bound ahead trying out various APIs and open-source models. But when they hit a roadblock or need to adapt an algorithm, many engineers will stumble. They need an R&D mindset that is quite alien to many of them.
> But when they hit a roadblock or need to adapt an algorithm, many engineers will stumble.
My experience is the other way around.
People underestimate how powerful building systems is, and how most of the problems worth solving are boring and require only off-the-shelf techniques.
During the last decade, I was on several teams and I noticed the same pattern: the company has some extra budget and "believes" that its problem is exceptional.
It then goes and hires some PhD data scientists with a few publications who only know R and are fresh out of a Python bootcamp.
After 3 months with this new team, not much was done: tons of Jupyter notebooks lying around but no code in production, and some of them did not even have an environment for experimentation.
The business problem is still not solved. The company realizes that having a lot of Data Scientists but not many Data/ML Engineers means that they are (a) blocked from pushing anything to production or (b) building a death star of data pipelines + algorithms + infra (spending 70% more resources due to lack of Python knowledge).
The project gets delayed. Some people become impatient.
Now you have a solid USD 2.5 million/year team that is not capable of delivering a proof of concept, because nobody can serve the model via batch or via a REST API.
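To be concrete about what "serving via REST API" means here, a minimal sketch, assuming a scikit-learn model saved with joblib and FastAPI for the endpoint (the model path, file name, and feature shape are illustrative, not from the story):

    # Minimal model-serving sketch; "model.joblib" and the feature list
    # are hypothetical stand-ins for whatever the DS team produced.
    import joblib
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    model = joblib.load("model.joblib")  # load the trained model once at startup

    class Features(BaseModel):
        values: list[float]  # one row of numeric features

    @app.post("/predict")
    def predict(features: Features):
        # scikit-learn expects a 2D array: one inner list per row
        prediction = model.predict([features.values])
        return {"prediction": prediction.tolist()}

Run it with `uvicorn app:app` (assuming the file is saved as app.py) and POST a JSON body like {"values": [1.0, 2.0]} to /predict. The point is that this is a day of engineering work, not a research problem.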
The company lost momentum while competitors moved fast. They released an imperfect solution, but they released it first, they have users on it, and they are improving it.
Frustration kicks in, and PMs and Eng Managers fight about accountability. The VP of Product and Engineering wants heads on a silver platter.
Some PhDs get fired and go off to teach at some local university.
I think because it's a relatively young field, there is a bit more need to know the foundations in AI than in programming. You hit the limits a bit more often and need to do a bit of research to modify or create a model.
Whereas it's unlikely in most programming jobs you would need to do any research into programming language design.
I agree with you. Being a corporate department head, I've led exactly one project that's had me digging through my DS&A textbook. But it's much more common to need to go beyond the limits of an off-the-shelf deep learning algorithm. Plus many of the cutting-edge deep learning advancements have been fairly simple to implement but required serious effort to create, and being able to understand an arXiv paper can have a direct impact on the job you're currently working on, whereas being able to read all of TAOCP will make you a better coder, but in a more abstract way.
This sounds like a don't-buy pitch for an AI engineer...
The point the commenter is making is that both schools of thought in the comments are valuable, and unless you perform both roles yourself (i.e., as an engineer who is familiar with the scientific foundations), the two are symbiotic and not in contention.
I guess this message is delivered by an AI scientist, sure.
It's almost self-explanatory that when you hit a roadblock in practice you go back to foundations, and good people should aim to do both. In that case I don't see where the ML engineer/scientist bifurcation comes from, except as a way for some to feel good about themselves.
Not at all. It's something I've seen in practice over many years. Neither skill set is 'better' than the other, just different.
There is a need for people who are able to build using available tools, but who don't have an interest in the theory or foundations of the field. It's a valuable mindset and nothing in my original comment suggested otherwise.
It's also pretty clear that many comments on this post divide into the two mindsets I've described.
Thank you for that. I wonder if you'd be able to just post a link to the paper, or give its title? I didn't have a Meetup account, tried to create one, spent 4 minutes dicking with it, and still can't get it to let me in.
The deeper problem here is that review sites don't work well for things as personal as books. I've read many books based on excellent reviews on Amazon, and hated many of them. Most people don't have the same taste as me. Likewise, many people hate the books I love and give them one-star reviews.
Have a good circle of friends, or be part of a serious book club. You form personal connections with people of similar taste. You get good recommendations from such people.
There are books mentioned inside books and movies that you like. Try those.
Search for a book on a particular topic and append "reddit". Go to the Reddit thread, and try the books that appear in many comments or are the most upvoted.
Search "books" on HN. You will get ~10 years worth of reading in about ~30 mins browsing.
You must have people that you look up to, professionally, among relatives, etc. There must be some among them who are bookish? Take recommendations from them.
You must have a niche area of interest? Follow forums/groups in that niche. What books come up most often? Read those.
There is a lot more advice like this that I could continue with. Probably a whole blog post. But that's for another time.
All of the advice above is practised by me. None of it is theoretical.
This is to misunderstand Miles Davis. The point is that he treated the trumpet player with respect, as if he were talking to a trumpet player, not a child. He was engaging with a group of peers. The evidence for this is that he went on to hire the keyboard player, who was 16 years old, to play in his band a year later.
I hear what you're saying. :) People need to hear home truths, that's for sure. It's the only way to improve. However, a bit of humour and warmth wouldn't go amiss, would it? When I was a kid I played in a wind band. One weekend we had an ultra-sarcastic military guy conduct us. We reacted at first with: WTF? This guy is really rude. It soon became apparent, though, that he had a heart of gold and could make us play really well. He cared about us, and his brusque manner was only skin deep and/or put on, an "act". It all turned out to be pretty funny. "You do read music, do you?" was a memorable quip. We were like "can't believe he said that", but I think the recipient of that remark rather liked him in the end. It seemed to me that Miles appeared not to know that people need that warmth. Which makes me wonder about autism. I must say, Herbie Hancock, who seems like a nice person and full of the warmth that Miles seems to lack, speaks warmly of Miles. So maybe I should go away and re-read what Herbie had to say about Miles. ;)
This new law is irrelevant to these points. Vertebrate animals were already considered sentient in UK law, so sharks, dolphins, turtles, fish, etc. were already covered.
The new recognition of cephalopods and decapods is relevant to these points because they can be and are killed by these human-driven factors, and they experience pain and distress.
The UK's recognition that fish can suffer doesn't seem to have forced the elimination of the UK's fishing industry. That industry intentionally catches and kills large numbers of fish, but it also incidentally kills plenty of other kinds of animals, including cephalopods and decapods, through its extremely untargeted practices.
My understanding is that bottom-trawling is still permitted in UK waters and, as of a couple of years ago, was even still allowed in most marine protected areas.
Sorry for not being clear enough. The point is that this £100M is very likely to be spent on the Turing Institute since it seems to suck up all AI funding in the UK. It will therefore likely be wasted.
Here's the recording: https://youtu.be/FYYZZVV5vlY?si=ReoygVJMgY9oje3p