> "Machines are taking control of investing... Funds run by computers that follow rules set by humans"
One of the points that the article makes is that this statement is changing, that computers are increasingly creating their own rules. The literal next sentence:
"New artificial-intelligence programs are also writing their own investing rules, in ways their human masters only partly understand."
You're right though, that the responsibility still falls on the humans. Anybody running algos that they don't understand should ensure that they are covered from a legal/ethical perspective.
Actually, the problem is not the human masters' partial understanding; it's their outright ignorance. Some models are so involved that no one knows how they'll behave under certain market conditions, as we saw in the perfect storm of 2008 with credit default swaps, CMOs, and CBOs. That's the danger, no matter how smart and clever the creator believes they are.
What I meant was the machine learning systems that have been developed in the decade since the 2008 crash. Many of these are black boxes with thousands to millions of parameters, and it's very difficult to understand exactly why an ML algorithm made the decision it did.
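To make the point concrete, here's a toy sketch (not from the article, and deliberately simplified): an "ensemble" of many randomly generated linear rules voting on a buy/sell signal. All names (`signal`, `rules`, `market_state`) are made up for illustration. The final decision is fully deterministic, yet attributing it to any one weight by inspection is hopeless, which is a miniature version of the black-box problem.

```python
import random

random.seed(42)

N_RULES, N_FEATURES = 1000, 50

# Each "rule" is just a random weight vector, standing in for whatever
# an opaque training process produced.
rules = [[random.uniform(-1, 1) for _ in range(N_FEATURES)]
         for _ in range(N_RULES)]

def signal(features):
    """Majority vote across all rules: +1 = buy, -1 = sell."""
    votes = sum(1 if sum(w * x for w, x in zip(rule, features)) > 0 else -1
                for rule in rules)
    return 1 if votes > 0 else -1

market_state = [random.gauss(0, 1) for _ in range(N_FEATURES)]
decision = signal(market_state)

# The decision is deterministic, but explaining it means reasoning about
# 50,000 individual weights interacting with 50 inputs.
print("buy" if decision == 1 else "sell")
```

Real systems are far worse: the "rules" are learned, nonlinear, and interact, so even the people who trained the model can only partly reconstruct why a particular trade was made.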
The former. The LTCM blowup in the late '90s is a perfect example: a generally correct trade, but the market moved against them long enough to force insolvency.
Apparently, none of the Nobel prize winners thought to model out that particular adverse condition.
Bunch of smart people + one missed market condition = failure (eventually).