Agreed. I think the larger problem highlighted here is that "YouTube" in this case is just a faceless algorithm making these decisions, and getting the judgement of a real human being is nearly impossible for the average end user.
They quite likely do have humans verify the content - however, the criteria are most likely quite straightforward (depictions of deaths violate their guidelines).
They don't provide the judgement because that would only invite attempts at explanation and negotiation. They don't want to spend time on a careful review of every contested video; they want to make a usually accurate final judgement with the minimum time investment possible (e.g. 5 seconds per video), and they don't want to spend resources reading all kinds of reasoning and appeals, so they don't. And that is their right - they can choose, completely arbitrarily, which videos to host on their site and which not.
Sure, but returning to the original point - by that token, why should YouTube have any obligation, or even the legal ability, to retain videos like this on "the back end"?
It is indeed YouTube's right to vaporize any bits at any time. But when they are the leading video platform on the entire web by a huge margin, they should at least clearly communicate the reality of their content guidelines to users in countries like Syria, who are probably not focused on researching alternative video hosts while trying to document chemical weapons attacks.