The higher an animal's intelligence, the easier it is for us to empathize with it. Most humans don't like the idea of being eaten alive.
Expanding this idea to machines is very easy. Is it ok to treat a machine like a slave if it is only capable of moving an object from point A to point B? How about a machine that has a self-aware AI?
Well, self-awareness doesn't have a direct correlation with intelligence. I agree that it generally comes down to what/who we can easily empathize with, but I would argue that sentience matters more in that respect (as well as providing a more logical basis for which entities we should actually be concerned about).