This cuts both ways, of course - in some ways, I think we want automated systems to be able to discriminate. A good example is speed cameras - in days past, a man speeding to get his wife to the maternity ward could be pulled over and then hurriedly wished good luck, maybe even getting an escort. These days, he gets a fine in the post three weeks later.
In theory, the same way as a policeman - assessing the situation and the context, and then making a decision not to issue the fine.
But in practice, I don't think it can, without having total surveillance capabilities - not to mention sci-fi levels of AI advancement.
There should always be a human in the loop when justice is meted out. There's no way to write a law or an algorithm that covers every circumstance. Unthinking "by the book" enforcement is tyranny.
(Like those wretched "zero tolerance" laws, where everyone knows they're unjust but feels compelled to interpret the law literally.)