There's an interesting first comment that offers an argument for the decision:
> Under common law (aka Anglo-American jurisprudence), Rulings of Law must follow precedent. Additionally, novel rulings establish precedent. In other words, to be fair to Alice, we must use the same criteria for judgement that we use for Bob. Furthermore, it is established law that someone is "innocent until proven guilty".
>
> Using these two axioms, we can see that while refusing to allow Bayesian statistics in this case may seem unenlightened, it is nevertheless legally essential that courts do so, as holding someone legally responsible for an action that they did not commit is reprehensible under our legal system, whereas letting someone off the hook for something that they did is merely unfortunate.
>
> In our system of law, it is desirable that investigators utilize all available tools, including math, to discover likely culprits and that they use that information to refine and direct their search for actual evidence. It is also desirable that courts refuse to allow the tools to be used as evidence and require that substantial proof be provided.
That reasoning seems to apply to ruling out Bayesian priors, and I think it's perfectly reasonable for society to disallow certain forms of evidence, perhaps including Bayesian priors, in the interest of ensuring fairness. But this article points out that the ruling goes beyond that, into rejecting the entire idea of epistemic uncertainty:
> You cannot properly say that there is a 25 per cent chance that something has happened: Hotson v East Berkshire Health Authority [1987] AC 750. Either it has or it has not.
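To be concrete about what "a 25 per cent chance that something has happened" means, here is a minimal sketch with made-up numbers (the event, evidence, and probabilities are hypothetical, not taken from the case): the event either occurred or it did not, but an observer's degree of belief, conditioned on the evidence available to them, can still land strictly between 0 and 1.

```latex
% Hypothetical illustration only. H = "the event happened", E = the evidence observed.
\[
P(H \mid E)
  = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
  = \frac{0.5 \times 0.2}{0.5 \times 0.2 + 0.375 \times 0.8}
  = 0.25 .
\]
```

The fact in the world is binary, but the 25 per cent attaches to the observer's state of knowledge given the evidence. That is exactly the notion of epistemic uncertainty the quoted passage rules out.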