I don't really understand this notion -- I usually get this kind of question from Americans: that if you consent to something, it makes it OK, regardless of the circumstances under which you gave your consent. I think this idea usually comes from people playing life in "easy mode" (white, well educated, middle or upper middle class), who have never been forced to consent to humiliating or exploitative demands of the kind billions of people often face.
If I burn down your house (and kill your cousin and your Jewish neighbors), and you or your kid, starving, agrees to come work for my company, and I treat you like shit on top of that, it doesn't make it OK in my book.