There was probably a more tactful way to shift labor from passionate volunteers to soulless AI.
I too would be upset if an organization threw out a decade of translation work without any warning or discussion, in favor of a robot pretending to understand my language and failing.
I agree there could have been more tact, and I would probably be upset too if a bunch of my work got replaced by automation. I would even feel that way if the automation was better. I've criticized Mozilla extensively in the past, and I would see a clumsy rollout as consistent with my concerns about them.
The difficulty with a post like this is it brings out a primal anti-technology impulse in all of us. But once you clear away the piling-on and emotionally charged hot takes there isn't much here to talk about.
This post, aside from a statement of intent to quit, is a bug report that was filed about the bot. Mozilla extended an invitation to address the concerns. All of that seems normal: rollout with a mistake -> bug report -> attempt to understand what went wrong. But the bug report contains unrealistic demands that read as almost rhetorical, and the attempt to figure out what went wrong is being met with scorn, as in the top comment of this HN discussion.
> passionate volunteers to soulless AI.
Humans don't have souls either as I'm sure you know :-P. To post this comment, you're using a soulless computer that took jobs away from human computers. You probably listen to music made with soulless synthesizers that took jobs from musicians. You no doubt take photos with soulless cameras that took jobs from painters.
I think we have to be clear with ourselves that, although the transition to automation will be painful, nobody is going to prevent technology from advancing. So we have to find a way to use it to build the future we want, rather than try to tear it down as soulless or evil.