Or, more realistically: 'The probability of Mr Smith making his flight has dropped below x%, based on his body-fat estimate, number of drinks, and distance from the gate. Sell his seat.'
"Would you like another drink, sir?"
Why alert him when they can sell his seat, sell him a few more drinks, a massage, and some entertainment?
And there won't even be any "whistleblowers", since everyone will be able to honestly say they had no idea. The programmers will just plug all the information they have into a generic machine learning system, and ask it to predict the probability of a missed flight for each flyer, based on opaque correlations.
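To make that concrete, here is a minimal sketch of what such a system might look like. It uses scikit-learn, synthetic data, and entirely hypothetical feature names and thresholds; the point is only that nobody has to write an explicit "oversell drunk passengers" rule anywhere.

```python
# Hypothetical sketch: a generic classifier learns P(missed flight) from
# whatever per-passenger signals the airline already has. All feature
# names, data, and the cutoff below are made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical features the airline might already collect.
drinks_ordered   = rng.poisson(1.5, n)
distance_to_gate = rng.uniform(50, 1500, n)   # metres
minutes_to_board = rng.uniform(5, 120, n)

# Synthetic "missed flight" labels; in practice these come from history.
logit = 0.4 * drinks_ordered + 0.002 * distance_to_gate - 0.05 * minutes_to_board
missed = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([drinks_ordered, distance_to_gate, minutes_to_board])
model = LogisticRegression().fit(X, missed)

# At the gate: if Mr Smith's predicted miss probability crosses the
# (hypothetical) revenue threshold, his seat quietly goes back on sale.
THRESHOLD = 0.8
mr_smith = np.array([[4, 1200.0, 20.0]])
p_miss = model.predict_proba(mr_smith)[0, 1]
if p_miss > THRESHOLD:
    print(f"P(missed flight) = {p_miss:.2f} -> release seat for resale")
```

No individual step here looks sinister, which is exactly the point: the behavior emerges from the threshold and the training data, not from any decision a person would recognize as one.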
This is a legitimate reason to worry about AIs. The "world domination" angle seems overblown, but machine learning has the potential to evolve some immoral behaviors like this without anybody even knowing.
"Would you like another drink sir?"
Why alert him when they can sell his seat, sell him a few more drinks, a massage, and some entertainment?